Sample records for qc sampling sample

  1. QA/QC requirements for physical properties sampling and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Innis, B.E.

    1993-07-21

    This report presents results of an assessment of the available information concerning US Environmental Protection Agency (EPA) quality assurance/quality control (QA/QC) requirements and guidance applicable to sampling, handling, and analyzing physical parameter samples at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) investigation sites. Geotechnical testing laboratories measure the following physical properties of soil and sediment samples collected during CERCLA remedial investigations (RI) at the Hanford Site: moisture content, grain size by sieve, grain size by hydrometer, specific gravity, bulk density/porosity, saturated hydraulic conductivity, moisture retention, unsaturated hydraulic conductivity, and permeability of rocks by flowing air. Geotechnical testing laboratories also measure the following chemical parameters of soil and sediment samples collected during Hanford Site CERCLA RI: calcium carbonate and saturated column leach testing. Physical parameter data are used for (1) characterization of vadose and saturated zone geology and hydrogeology, (2) selection of monitoring well screen sizes, (3) support of modeling and analysis of the vadose and saturated zones, and (4) engineering design. The objective of this report is to determine the QA/QC levels accepted in EPA Region 10 for the sampling, handling, and analysis of soil samples for physical parameters during CERCLA RI.

  2. Comparison of Different Matrices as Potential Quality Control Samples for Neurochemical Dementia Diagnostics.

    PubMed

    Lelental, Natalia; Brandner, Sebastian; Kofanova, Olga; Blennow, Kaj; Zetterberg, Henrik; Andreasson, Ulf; Engelborghs, Sebastiaan; Mroczko, Barbara; Gabryelewicz, Tomasz; Teunissen, Charlotte; Mollenhauer, Brit; Parnetti, Lucilla; Chiasserini, Davide; Molinuevo, Jose Luis; Perret-Liaudet, Armand; Verbeek, Marcel M; Andreasen, Niels; Brosseron, Frederic; Bahl, Justyna M C; Herukka, Sanna-Kaisa; Hausner, Lucrezia; Frölich, Lutz; Labonte, Anne; Poirier, Judes; Miller, Anne-Marie; Zilka, Norbert; Kovacech, Branislav; Urbani, Andrea; Suardi, Silvia; Oliveira, Catarina; Baldeiras, Ines; Dubois, Bruno; Rot, Uros; Lehmann, Sylvain; Skinningsrud, Anders; Betsou, Fay; Wiltfang, Jens; Gkatzima, Olymbia; Winblad, Bengt; Buchfelder, Michael; Kornhuber, Johannes; Lewczuk, Piotr

    2016-03-01

    Assay-vendor independent quality control (QC) samples for neurochemical dementia diagnostics (NDD) biomarkers are so far commercially unavailable. This requires that NDD laboratories prepare their own QC samples, for example by pooling leftover cerebrospinal fluid (CSF) samples. To prepare and test alternative matrices for QC samples that could facilitate intra- and inter-laboratory QC of the NDD biomarkers. Three matrices were validated in this study: (A) human pooled CSF, (B) Aβ peptides spiked into human prediluted plasma, and (C) Aβ peptides spiked into a solution of bovine serum albumin in phosphate-buffered saline. All matrices were also tested after supplementation with an antibacterial agent (sodium azide). We analyzed short- and long-term stability of the biomarkers with ELISA and chemiluminescence (Fujirebio Europe, MSD, IBL International), and performed an inter-laboratory variability study. NDD biomarkers turned out to be stable in almost all samples stored at the tested conditions for up to 14 days as well as in samples stored deep-frozen (at -80°C) for up to one year. Sodium azide did not influence biomarker stability. Inter-center variability of the samples sent at room temperature (pooled CSF, freeze-dried CSF, and four artificial matrices) was comparable to the results obtained on deep-frozen samples in other large-scale projects. Our results suggest that it is possible to replace self-made, CSF-based QC samples with large-scale volumes of QC materials prepared with artificial peptides and matrices. This would greatly facilitate intra- and inter-laboratory QC schedules for NDD measurements.

  3. PCB Analysis Plan for Tank Archive Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NGUYEN, D.M.

    2001-03-22

    This analysis plan specifies laboratory analysis, quality assurance/quality control (QA/QC), and data reporting requirements for analyzing polychlorinated biphenyls (PCB) concentrations in archive samples. Tank waste archive samples that are planned for PCB analysis are identified in Nguyen 2001. The tanks and samples are summarized in Table 1-1. The analytical data will be used to establish a PCB baseline inventory in Hanford tanks.

  4. Quality assurance and quality control for thermal/optical analysis of aerosol samples for organic and elemental carbon.

    PubMed

    Chow, Judith C; Watson, John G; Robles, Jerome; Wang, Xiaoliang; Chen, L-W Antony; Trimble, Dana L; Kohl, Steven D; Tropp, Richard J; Fung, Kochy K

    2011-12-01

    Accurate, precise, and valid organic and elemental carbon (OC and EC, respectively) measurements require more effort than the routine analysis of ambient aerosol and source samples. This paper documents the quality assurance (QA) and quality control (QC) procedures that should be implemented to ensure consistency of OC and EC measurements. Prior to field sampling, the appropriate filter substrate must be selected and tested for sampling effectiveness. Unexposed filters are pre-fired to remove contaminants and acceptance tested. After sampling, filters must be stored in the laboratory in clean, labeled containers under refrigeration (<4 °C) to minimize loss of semi-volatile OC. QA activities include participation in laboratory accreditation programs, external system audits, and interlaboratory comparisons. For thermal/optical carbon analyses, periodic QC tests include calibration of the flame ionization detector with different types of carbon standards, thermogram inspection, replicate analyses, quantification of trace oxygen concentrations (<100 ppmv) in the helium atmosphere, and calibration of the sample temperature sensor. These established QA/QC procedures are applicable to aerosol sampling and analysis for carbon and other chemical components.
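
    A minimal Python sketch of one routine QC check of the kind listed above: comparing replicate OC/EC analyses with a relative percent difference (RPD) statistic. The 10% acceptance threshold and the example filter values are illustrative assumptions, not criteria stated in this abstract.

```python
# Hedged sketch: replicate OC/EC analyses screened with a relative percent
# difference (RPD) statistic. The 10% acceptance threshold and the example
# filter values are assumptions for illustration, not criteria from the abstract.

def rpd(original: float, replicate: float) -> float:
    """Relative percent difference between an original and a replicate result."""
    mean = (original + replicate) / 2.0
    return abs(original - replicate) / mean * 100.0 if mean else 0.0

replicate_pairs = [
    # (filter_id, analyte, original ug/cm^2, replicate ug/cm^2) -- illustrative
    ("Q-001", "OC", 12.4, 12.9),
    ("Q-002", "EC", 3.1, 3.6),
]

for filter_id, analyte, first, second in replicate_pairs:
    value = rpd(first, second)
    status = "acceptable" if value <= 10.0 else "reanalyze"
    print(f"{filter_id} {analyte}: RPD = {value:.1f}% -> {status}")
```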

  5. Theory of sampling: four critical success factors before analysis.

    PubMed

    Wagner, Claas; Esbensen, Kim H

    2015-01-01

    Food and feed materials characterization, risk assessment, and safety evaluations can only be ensured if QC measures are based on valid analytical data, stemming from representative samples. The Theory of Sampling (TOS) is the only comprehensive theoretical framework that fully defines all requirements to ensure sampling correctness and representativity, and to provide the guiding principles for sampling in practice. TOS also defines the concept of material heterogeneity and its impact on the sampling process, including the effects from all potential sampling errors. TOS's primary task is to eliminate bias-generating errors and to minimize sampling variability. Quantitative measures are provided to characterize material heterogeneity, on which an optimal sampling strategy should be based. Four critical success factors preceding analysis to ensure a representative sampling process are presented here.

  6. Quality Control in Clinical Laboratory Samples

    DTIC Science & Technology

    2015-01-01

    is able to find and correct flaws in the analytical processes of a lab before potentially incorrect patient results are released. According to... verifies that the results produced are accurate and precise. Clinical labs use management of documentation as well as incorporation of a continuous... improvement process to streamline the overall quality control process. QC samples are expected to be identical and tested identically to patient

  7. Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.

    PubMed

    Heredia, Nicholas J

    2018-01-01

    Digital PCR is a valuable tool to quantify next-generation sequencing (NGS) libraries precisely and accurately. Accurately quantifying NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing under- and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification in addition to size quality assessment, enabling users to QC their sequencing libraries with confidence.
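
    The Poisson arithmetic underlying droplet digital PCR quantification can be illustrated briefly. The sketch below is a generic hedged example, not the vendor's assay software; the nominal droplet volume, droplet counts, and dilution factor are assumed values.

```python
import math

# Minimal sketch of the Poisson statistics behind droplet digital PCR
# quantification. The nominal droplet volume, droplet counts and dilution
# factor below are illustrative assumptions, not values from the cited assay.

def ddpcr_copies_per_ul(positive: int, total: int, droplet_vol_nl: float = 0.85) -> float:
    """Estimate target copies per microlitre of reaction from droplet counts."""
    p = positive / total                  # fraction of positive droplets
    lam = -math.log(1.0 - p)              # mean copies per droplet (Poisson)
    return lam / (droplet_vol_nl * 1e-3)  # copies per microlitre of reaction

# Example: 4,500 positive droplets out of 15,000 accepted, library diluted 1:10,000.
reaction_conc = ddpcr_copies_per_ul(4500, 15000)
library_conc = reaction_conc * 10_000
print(f"{reaction_conc:.0f} copies/uL in the reaction, "
      f"~{library_conc:.3g} copies/uL in the undiluted library")
```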

  8. Pilot studies for the North American Soil Geochemical Landscapes Project - Site selection, sampling protocols, analytical methods, and quality control protocols

    USGS Publications Warehouse

    Smith, D.B.; Woodruff, L.G.; O'Leary, R. M.; Cannon, W.F.; Garrett, R.G.; Kilburn, J.E.; Goldhaber, M.B.

    2009-01-01

    In 2004, the US Geological Survey (USGS) and the Geological Survey of Canada sampled and chemically analyzed soils along two transects across Canada and the USA in preparation for a planned soil geochemical survey of North America. This effort was a pilot study to test and refine sampling protocols, analytical methods, quality control protocols, and field logistics for the continental survey. A total of 220 sample sites were selected at approximately 40-km intervals along the two transects. The ideal sampling protocol at each site called for a sample from a depth of 0-5 cm and a composite of each of the O, A, and C horizons. The <2-mm fraction of each sample was analyzed for Al, Ca, Fe, K, Mg, Na, S, Ti, Ag, As, Ba, Be, Bi, Cd, Ce, Co, Cr, Cs, Cu, Ga, In, La, Li, Mn, Mo, Nb, Ni, P, Pb, Rb, Sb, Sc, Sn, Sr, Te, Th, Tl, U, V, W, Y, and Zn by inductively coupled plasma-mass spectrometry and inductively coupled plasma-atomic emission spectrometry following a near-total digestion in a mixture of HCl, HNO3, HClO4, and HF. Separate methods were used for Hg, Se, total C, and carbonate-C on this same size fraction. Only Ag, In, and Te had a large percentage of concentrations below the detection limit. Quality control (QC) of the analyses was monitored at three levels: the laboratory performing the analysis, the USGS QC officer, and the principal investigator for the study. This level of review resulted in an average of one QC sample for every 20 field samples, which proved to be minimally adequate for such a large-scale survey. Additional QC samples should be added to monitor within-batch quality to the extent that no more than 10 samples are analyzed between a QC sample. Only Cr (77%), Y (82%), and Sb (80%) fell outside the acceptable limits of accuracy (% recovery between 85 and 115%) because of likely residence in mineral phases resistant to the acid digestion. A separate sample of 0-5-cm material was collected at each site for determination of organic compounds. A subset
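
    Two of the QC checks described above lend themselves to a short worked example: screening percent recoveries against the 85-115% acceptance window (the Cr, Y, and Sb values are the recoveries quoted in the abstract; Cu is invented) and interleaving a QC sample so that no more than 10 field samples run between QC samples. The Python sketch below is illustrative only.

```python
# Illustrative sketch of two QC checks described above: recovery screening
# against the 85-115% window and QC-sample insertion every 10 field samples.

ACCEPT_LOW, ACCEPT_HIGH = 85.0, 115.0

recoveries = {"Cr": 77.0, "Y": 82.0, "Sb": 80.0, "Cu": 98.5}   # % recovery
for element, recovery in recoveries.items():
    ok = ACCEPT_LOW <= recovery <= ACCEPT_HIGH
    print(f"{element}: {recovery:.0f}% recovery -> {'pass' if ok else 'outside limits'}")

def insert_qc(field_samples, every=10):
    """Interleave a QC placeholder so no more than `every` field samples run between QC samples."""
    batch = []
    for i, sample in enumerate(field_samples, start=1):
        batch.append(sample)
        if i % every == 0:
            batch.append("QC")
    return batch

print(insert_qc([f"S{i:03d}" for i in range(1, 25)]))
```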

  9. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the NHEXAS data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by t...
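
    As a generic illustration of the idea behind sampling weights (not a reproduction of the SOP): a design weight is commonly the reciprocal of a unit's selection probability, and weighted statistics are then computed with those weights. All numbers in the sketch below are invented.

```python
# Generic illustration only (not the SOP): design weights as reciprocals of
# selection probabilities, then a weighted mean. All values are invented.

selection_prob = [0.02, 0.05, 0.01, 0.04]    # probability each unit was sampled
measurement = [1.8, 0.9, 3.2, 1.1]           # observed value for each sampled unit

weights = [1.0 / p for p in selection_prob]  # sampling (design) weights
weighted_mean = sum(w * x for w, x in zip(weights, measurement)) / sum(weights)
print(f"weighted mean: {weighted_mean:.2f}")
```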

  10. QC-ART: A tool for real-time quality control assessment of mass spectrometry-based proteomics data.

    PubMed

    Stanfill, Bryan A; Nakayasu, Ernesto S; Bramer, Lisa M; Thompson, Allison M; Ansong, Charles K; Clauss, Therese; Gritsenko, Marina A; Monroe, Matthew E; Moore, Ronald J; Orton, Daniel J; Piehowski, Paul D; Schepmoes, Athena A; Smith, Richard D; Webb-Robertson, Bobbie-Jo; Metz, Thomas O

    2018-04-17

    Liquid chromatography-mass spectrometry (LC-MS)-based proteomics studies of large sample cohorts can easily require from months to years to complete. Acquiring consistent, high-quality data in such large-scale studies is challenging because of normal variations in instrumentation performance over time, as well as artifacts introduced by the samples themselves, such as those due to collection, storage and processing. Existing quality control methods for proteomics data primarily focus on post-hoc analysis to remove low-quality data that would degrade downstream statistics; they are not designed to evaluate the data in near real-time, which would allow for interventions as soon as deviations in data quality are detected. In addition to flagging analyses that demonstrate outlier behavior, evaluating how the data structure changes over time can aid in understanding typical instrument performance or in identifying issues such as a degradation in data quality due to the need for instrument cleaning and/or re-calibration. To address this gap for proteomics, we developed Quality Control Analysis in Real-Time (QC-ART), a tool for evaluating data as they are acquired in order to dynamically flag potential issues with instrument performance or sample quality. QC-ART has similar accuracy as standard post-hoc analysis methods with the additional benefit of real-time analysis. We demonstrate the utility and performance of QC-ART in identifying deviations in data quality due to both instrument and sample issues in near real-time for LC-MS-based plasma proteomics analyses of a sample subset of The Environmental Determinants of Diabetes in the Young cohort. We also present a case where QC-ART facilitated the identification of oxidative modifications, which are often underappreciated in proteomic experiments. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
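
    A conceptual sketch of the real-time flagging idea follows. It is not the QC-ART algorithm; it simply compares each new run's QC metric against a rolling baseline with an assumed z-score limit, using invented values.

```python
import statistics

# Conceptual sketch of near real-time QC flagging as runs are acquired; this is
# NOT the QC-ART algorithm, only a rolling z-score illustration of the idea.
# The metric (e.g. peptide identifications per run), window, z limit and values
# are assumptions.

def flag_new_run(history, new_value, window=20, z_limit=3.0):
    """Flag the newest run if its metric deviates strongly from the recent baseline."""
    baseline = history[-window:]
    if len(baseline) < 5:                 # not enough history to judge yet
        return False
    mu = statistics.mean(baseline)
    sd = statistics.stdev(baseline)
    return sd > 0 and abs(new_value - mu) / sd > z_limit

metric_per_run = [41000, 40500, 41200, 39800, 40900, 40300, 41100, 30500]
history = []
for run, value in enumerate(metric_per_run, start=1):
    if flag_new_run(history, value):
        print(f"run {run}: metric {value} flagged for review")
    history.append(value)
```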

  11. UK audit of glomerular filtration rate measurement from plasma sampling in 2013.

    PubMed

    Murray, Anthony W; Lawson, Richard S; Cade, Sarah C; Hall, David O; Kenny, Bob; O'Shaughnessy, Emma; Taylor, Jon; Towey, David; White, Duncan; Carson, Kathryn

    2014-11-01

    An audit was carried out into UK glomerular filtration rate (GFR) calculation. The results were compared with those of an identical 2001 audit. Participants used their routine method to calculate GFR for 20 data sets (four plasma samples) in millilitres per minute and also the GFR normalized for body surface area. Some unsound data sets were included to analyse the applied quality control (QC) methods. Variability between centres was assessed for each data set, compared with the national median and a reference value calculated using the method recommended in the British Nuclear Medicine Society guidelines. The influence of the number of samples on variability was studied. Supplementary data were requested on workload and methodology. The 59 returns showed widespread standardization. The applied early exponential clearance correction was the main contributor to the observed variability. These corrections were applied by 97% of centres (50% in 2001), with 80% using the recommended averaged Brøchner-Mortensen correction. Approximately 75% applied the recommended Haycock body surface area formula for adults (78% for children). The effect of the number of samples used was not significant. There was wide variability in the applied QC techniques, especially in terms of the use of the volume of distribution. The widespread adoption of the guidelines has harmonized national GFR calculation compared with the previous audit. Additional standardization could further reduce variability. This audit has highlighted the need to address the national standardization of QC methods. Radionuclide techniques are confirmed as the preferred method for GFR measurement when an unequivocal result is required.
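
    A worked sketch of a four-sample slope-intercept GFR calculation of the kind audited above is given below. The Brøchner-Mortensen coefficients and the Haycock body surface area formula are the commonly published adult forms and may differ in detail from the averaged correction recommended in the guidelines; the injected activity, sample times, plasma concentrations, and patient data are invented.

```python
import numpy as np

# Worked sketch, not the audit's reference implementation: slope-intercept GFR
# from four plasma samples, Brøchner-Mortensen correction (published adult
# form), and Haycock BSA normalisation. All input values are invented.

injected_kbq = 40_000.0                               # administered activity
times_min = np.array([120.0, 180.0, 240.0, 300.0])    # four plasma samples
conc_kbq_per_ml = np.array([1.05, 0.80, 0.61, 0.465]) # plasma concentrations

slope, intercept = np.polyfit(times_min, np.log(conc_kbq_per_ml), 1)
k, c0 = -slope, np.exp(intercept)
gfr_uncorrected = injected_kbq * k / c0               # mL/min, single exponential

# Brøchner-Mortensen correction for the missed early exponential (adult form)
gfr_bm = 0.990778 * gfr_uncorrected - 0.001218 * gfr_uncorrected**2

# Haycock body surface area and normalisation to 1.73 m^2
weight_kg, height_cm = 75.0, 176.0
bsa = 0.024265 * weight_kg**0.5378 * height_cm**0.3964
gfr_normalised = gfr_bm * 1.73 / bsa

print(f"GFR: {gfr_bm:.1f} mL/min ({gfr_normalised:.1f} mL/min/1.73 m^2)")
```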

  12. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.

  13. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  14. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool

    DOE PAGES

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data.

  15. USGS Blind Sample Project: monitoring and evaluating laboratory analytical quality

    USGS Publications Warehouse

    Ludtke, Amy S.; Woodworth, Mark T.

    1997-01-01

    The U.S. Geological Survey (USGS) collects and disseminates information about the Nation's water resources. Surface- and ground-water samples are collected and sent to USGS laboratories for chemical analyses. The laboratories identify and quantify the constituents in the water samples. Random and systematic errors occur during sample handling, chemical analysis, and data processing. Although all errors cannot be eliminated from measurements, the magnitude of their uncertainty can be estimated and tracked over time. Since 1981, the USGS has operated an independent, external, quality-assurance project called the Blind Sample Project (BSP). The purpose of the BSP is to monitor and evaluate the quality of laboratory analytical results through the use of double-blind quality-control (QC) samples. The information provided by the BSP assists the laboratories in detecting and correcting problems in the analytical procedures. The information also can aid laboratory users in estimating the extent that laboratory errors contribute to the overall errors in their environmental data.

  16. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR SAMPLING WEIGHT CALCULATION (IIT-A-9.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the procedures undertaken to calculate sampling weights. The sampling weights are needed to obtain weighted statistics of the study data. This SOP uses data that have been properly coded and certified with appropriate QA/QC procedures by th...

  17. Characterizing a Quantum Cascade Tunable Infrared Laser Differential Absorption Spectrometer (QC-TILDAS) for Measurements of Atmospheric Ammonia

    NASA Astrophysics Data System (ADS)

    Ellis, R.; Murphy, J. G.; van Haarlem, R.; Pattey, E.; O'Brien, J.

    2009-05-01

    A compact, fast-response Quantum Cascade Tunable Infrared Laser Differential Absorption Spectrometer (QC-TILDAS) for measurements of ammonia has been evaluated under both laboratory and field conditions. Absorption of radiation from a pulsed, thermoelectrically cooled QC laser occurs at reduced pressure in a 76 m path length, 0.5 L volume multiple-pass absorption cell. Detection is achieved using a thermoelectrically cooled HgCdTe infrared detector. A novel sampling technique was used, consisting of a short, heated, quartz inlet with a hydrophobic coating to minimize the adsorption of ammonia to surfaces. The inlet contains a critical orifice that reduces the pressure, a virtual impactor for separation of particles, and additional ports for delivering ammonia-free background air and calibration gas standards. This instrument has been found to have a detection limit of 0.3 ppb with a time resolution of 1 s. The sampling technique has been compared to the results of a conventional lead-salt Tunable Diode Laser (TDL) absorption spectrometer during a laboratory intercomparison. Various lengths and types of sample inlet tubing material, heated and unheated, under dry and ambient humidity conditions with ammonia concentrations ranging from 10 to 1000 ppb were investigated. Preliminary analysis suggests the time response improves with the use of short PFA tubing sampling lines. No significant improvement was observed when using a heated sampling line, and humidity was seen to play an important role in the bi-exponential decay of ammonia. A field intercomparison of the QC-TILDAS with a modified Thermo 42C chemiluminescence-based analyzer was also performed at Environment Canada's Centre for Atmospheric Research Experiments (CARE) in the rural town of Egbert, ON, between May and July 2008. Background tests and calibrations using two different permeation tube sources and an ammonia gas cylinder were regularly carried out throughout the study. Results indicate a very good correlation
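
    The bi-exponential inlet decay mentioned above can be illustrated with a simple curve fit; the synthetic step-response signal and the time constants below are assumptions for illustration, not measurements from this study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative sketch of fitting a bi-exponential decay to an inlet
# step-response test. Synthetic data; amplitudes and time constants are assumed.

def biexp(t, a1, tau1, a2, tau2):
    """Fast (gas-phase flush-out) plus slow (surface desorption) decay terms."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.linspace(0, 120, 121)                          # seconds after NH3 removal
signal = biexp(t, 80.0, 2.0, 20.0, 35.0)
signal = signal + np.random.default_rng(0).normal(0.0, 0.5, t.size)

popt, _ = curve_fit(biexp, t, signal, p0=(50.0, 1.0, 10.0, 20.0))
a1, tau1, a2, tau2 = popt
print(f"fast component tau1 = {tau1:.1f} s, slow component tau2 = {tau2:.1f} s")
```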

  18. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.

  19. AN EVALUATION OF SAMPLE DISPERSION MEDIAS USED WITH ACCELERATED SOLVENT EXTRACTION FOR THE EXTRACTION AND RECOVERY OF ARSENICALS FROM LFB AND DORM-2

    EPA Science Inventory

    An accelerated solvent extraction (ASE) device was evaluated as a semi-automated means for extracting arsenicals from quality control (QC) samples and DORM-2 [standard reference material (SRM)]. Unlike conventional extraction procedures, the ASE requires that the sample be dispe...

  20. Microbial Groundwater Sampling Protocol for Fecal-Rich Environments

    PubMed Central

    Harter, Thomas; Watanabe, Naoko; Li, Xunde; Atwill, Edward R; Samuels, William

    2014-01-01

    Inherently, confined animal farming operations (CAFOs) and other intense fecal-rich environments are potential sources of groundwater contamination by enteric pathogens. The ubiquity of microbial matter poses unique technical challenges in addition to economic constraints when sampling wells in such environments. In this paper, we evaluate a groundwater sampling protocol that relies on extended purging with a portable submersible stainless steel pump and Teflon® tubing as an alternative to equipment sterilization. The protocol allows for collecting a large number of samples quickly, relatively inexpensively, and under field conditions with limited access to capacity for sterilizing equipment. The protocol is tested on CAFO monitoring wells and considers three cross-contamination sources: equipment, wellbore, and ambient air. For the assessment, we use Enterococcus, a ubiquitous fecal indicator bacterium (FIB), in laboratory and field tests with spiked and blank samples, and in an extensive, multi-year field sampling campaign on 17 wells within 2 CAFOs. The assessment shows that extended purging can successfully control for equipment cross-contamination, but also controls for significant contamination of the well-head, within the well casing and within the immediate aquifer vicinity of the well-screen. Importantly, our tests further indicate that Enterococcus is frequently entrained in water samples when exposed to ambient air at a CAFO during sample collection. Wellbore and air contamination pose separate challenges in the design of groundwater monitoring strategies on CAFOs that are not addressed by equipment sterilization, but require adequate QA/QC procedures and can be addressed by the proposed sampling strategy. PMID:24903186

  1. QA/QC in the laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.

  2. Stability of hepatitis C virus RNA and anti-HCV antibody in air-dried and freeze-dried human plasma samples.

    PubMed

    Poe, Amanda; Duong, Ngocvien Thi; Bedi, Kanwar; Kodani, Maja

    2018-03-01

    Diagnosis of hepatitis C virus (HCV) infection is based on testing for antibodies to HCV (anti-HCV), hepatitis C core antigen (HCV cAg) and HCV RNA. To ensure quality control (QC) and quality assurance (QA), proficiency panels are provided by reference laboratories and various international organizations, requiring costly dry ice shipments to maintain specimen integrity. Alternative methods of specimen preservation and transport can save on shipping and handling and help in improving diagnostics by facilitating QA/QC of various laboratories especially in resource limited countries. Plasma samples positive for anti-HCV and HCV RNA were either dried using dried tube specimens (DTS) method or lyophilization for varying durations of time and temperature. Preservation of samples using DTS method resulted in loss of anti-HCV reactivity for low-positive samples and did not generate enough volume for HCV RNA testing. Lyophilized samples tested positive for anti-HCV even after storage at 4 °C and 25 °C for 12 weeks. Further, HCV RNA was detectable in 5 of 5 (100%) samples over the course of 12 week storage at 4, 25, 37 and 45 °C. In conclusion, lyophilization of specimens maintains integrity of plasma samples for testing for markers of HCV infection and can be a potent mode of sharing proficiency samples without incurring huge shipping costs and avoids challenges with dry ice shipments between donor and recipient laboratories. Copyright © 2017. Published by Elsevier B.V.

  3. Countably QC-Approximating Posets

    PubMed Central

    Mao, Xuxin; Xu, Luoshan

    2014-01-01

    As a generalization of countably C-approximating posets, the concept of countably QC-approximating posets is introduced. With the countably QC-approximating property, some characterizations of generalized completely distributive lattices and generalized countably approximating posets are given. The main results are as follows: (1) a complete lattice is generalized completely distributive if and only if it is countably QC-approximating and weakly generalized countably approximating; (2) a poset L having countably directed joins is generalized countably approximating if and only if the lattice σc(L)^op of all σ-Scott-closed subsets of L is weakly generalized countably approximating. PMID:25165730

  4. MARS: bringing the automation of small-molecule bioanalytical sample preparations to a new frontier.

    PubMed

    Li, Ming; Chou, Judy; Jing, Jing; Xu, Hui; Costa, Aldo; Caputo, Robin; Mikkilineni, Rajesh; Flannelly-King, Shane; Rohde, Ellen; Gan, Lawrence; Klunk, Lewis; Yang, Liyu

    2012-06-01

    In recent years, there has been a growing interest in automating small-molecule bioanalytical sample preparations, specifically using the Hamilton MicroLab® STAR liquid-handling platform. In the most extensive work reported thus far, multiple small-molecule sample preparation assay types (protein precipitation extraction, SPE and liquid-liquid extraction) have been integrated into a suite that is composed of graphical user interfaces and Hamilton scripts. Using that suite, bioanalytical scientists have been able to automate various sample preparation methods to a great extent. However, there are still areas that could benefit from further automation, specifically, the full integration of analytical standard and QC sample preparation with study sample extraction in one continuous run, real-time 2D barcode scanning on the Hamilton deck and direct Laboratory Information Management System database connectivity. We developed a new small-molecule sample-preparation automation system that improves in all of the aforementioned areas. The improved system presented herein further streamlines the bioanalytical workflow, simplifies batch run design, reduces analyst intervention and eliminates sample-handling error.

  5. NGSCheckMate: software for validating sample identity in next-generation sequencing studies within and across data types.

    PubMed

    Lee, Sejoon; Lee, Soohyun; Ouellette, Scott; Park, Woong-Yang; Lee, Eunjung A; Park, Peter J

    2017-06-20

    In many next-generation sequencing (NGS) studies, multiple samples or data types are profiled for each individual. An important quality control (QC) step in these studies is to ensure that datasets from the same subject are properly paired. Given the heterogeneity of data types, file types and sequencing depths in a multi-dimensional study, a robust program that provides a standardized metric for genotype comparisons would be useful. Here, we describe NGSCheckMate, a user-friendly software package for verifying sample identities from FASTQ, BAM or VCF files. This tool uses a model-based method to compare allele read fractions at known single-nucleotide polymorphisms, considering depth-dependent behavior of similarity metrics for identical and unrelated samples. Our evaluation shows that NGSCheckMate is effective for a variety of data types, including exome sequencing, whole-genome sequencing, RNA-seq, ChIP-seq, targeted sequencing and single-cell whole-genome sequencing, with a minimal requirement for sequencing depth (>0.5X). An alignment-free module can be run directly on FASTQ files for a quick initial check. We recommend using this software as a QC step in NGS studies. https://github.com/parklab/NGSCheckMate. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
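
    The underlying idea of genotype-based sample matching can be sketched simply: compare allele read fractions at a shared set of SNPs and call a match when they are highly correlated. The sketch below is conceptual only; it is not NGSCheckMate's depth-dependent model, and the SNP identifiers, fractions, and 0.8 threshold are invented.

```python
import statistics

# Conceptual sketch only (not NGSCheckMate's depth-dependent model): correlate
# allele read fractions at shared SNPs between two datasets from (possibly) the
# same individual. SNP names, fractions and the 0.8 threshold are invented.

def pearson(x, y):
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

vaf_exome  = {"rs1": 0.51, "rs2": 0.02, "rs3": 0.98, "rs4": 0.47, "rs5": 0.00}
vaf_rnaseq = {"rs1": 0.55, "rs2": 0.05, "rs3": 0.95, "rs4": 0.40, "rs5": 0.03}

shared = sorted(set(vaf_exome) & set(vaf_rnaseq))
r = pearson([vaf_exome[s] for s in shared], [vaf_rnaseq[s] for s in shared])
print(f"correlation over {len(shared)} SNPs: r = {r:.2f} -> "
      f"{'same individual' if r > 0.8 else 'possible sample swap'}")
```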

  6. Use of Enterococcus faecalis and Bacillus atrophaeus as surrogates to establish and maintain laboratory proficiency for concentration of water samples using ultrafiltration.

    PubMed

    Mapp, Latisha; Klonicki, Patricia; Takundwa, Prisca; Hill, Vincent R; Schneeberger, Chandra; Knee, Jackie; Raynor, Malik; Hwang, Nina; Chambers, Yildiz; Miller, Kenneth; Pope, Misty

    2015-11-01

    The U.S. Environmental Protection Agency's (EPA) Water Laboratory Alliance (WLA) currently uses ultrafiltration (UF) for concentration of biosafety level 3 (BSL-3) agents from large volumes (up to 100 L) of drinking water prior to analysis. Most UF procedures require comprehensive training and practice to achieve and maintain proficiency. As a result, there was a critical need to develop quality control (QC) criteria. Because select agents are difficult to work with and pose a significant safety hazard, QC criteria were developed using surrogates, including Enterococcus faecalis and Bacillus atrophaeus. This article presents the results from the QC criteria development study and results from a subsequent demonstration exercise in which E. faecalis was used to evaluate proficiency in using UF to concentrate large-volume drinking water samples. Based on preliminary testing, EPA Method 1600 and Standard Methods 9218, for E. faecalis and B. atrophaeus respectively, were selected for use during the QC criteria development study. The QC criteria established for Method 1600 were used to assess laboratory performance during the demonstration exercise. Based on the results of the QC criteria study, E. faecalis and B. atrophaeus can be used effectively to demonstrate and maintain proficiency in using ultrafiltration. Published by Elsevier B.V.

  7. FQC Dashboard: integrates FastQC results into a web-based, interactive, and extensible FASTQ quality control tool.

    PubMed

    Brown, Joseph; Pirrung, Meg; McCue, Lee Ann

    2017-06-09

    FQC is software that facilitates quality control of FASTQ files by carrying out a QC protocol using FastQC, parsing results, and aggregating quality metrics into an interactive dashboard designed to richly summarize individual sequencing runs. The dashboard groups samples in dropdowns for navigation among the data sets, utilizes human-readable configuration files to manipulate the pages and tabs, and is extensible with CSV data. FQC is implemented in Python 3 and Javascript, and is maintained under an MIT license. Documentation and source code is available at: https://github.com/pnnl/fqc . joseph.brown@pnnl.gov. © The Author(s) 2017. Published by Oxford University Press.

  8. Development and application of a validated HPLC method for the analysis of dissolution samples of levothyroxine sodium drug products

    PubMed Central

    Collier, J.W.; Shah, R.B.; Bryant, A.R.; Habib, M.J.; Khan, M.A.; Faustino, P.J.

    2011-01-01

    A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (L-T4) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250 mm × 3.9 mm) using a 0.01 M phosphate buffer (pH 3.0)–methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and a detection UV wavelength of 225 nm. The injection volume was 800 µL and the column temperature was maintained at 28 °C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r² > 0.99) over the analytical range of 0.08–0.8 µg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for L-T4 over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. PMID:20947276
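
    The acceptance checks quoted above (linearity with r² > 0.99 over 0.08–0.8 µg/mL, accuracy within the stated windows, precision <2% RSD) can be expressed as a short calculation; the peak areas and replicate QC results below are invented for illustration.

```python
import numpy as np

# Short calculation expressing the acceptance checks quoted above: calibration
# linearity, accuracy as percent of nominal at each QC level, and precision as
# %RSD. Peak areas and replicate QC results are invented.

nominal = np.array([0.08, 0.2, 0.4, 0.6, 0.8])          # standards, ug/mL
peak_area = np.array([10.1, 25.3, 50.2, 75.9, 100.8])   # detector response

slope, intercept = np.polyfit(nominal, peak_area, 1)
predicted = slope * nominal + intercept
r2 = 1 - np.sum((peak_area - predicted) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)
print(f"calibration r^2 = {r2:.4f} ({'acceptable' if r2 > 0.99 else 'reject'})")

qc_levels = {"low QC (0.10 ug/mL)": (0.10, [0.093, 0.095, 0.094]),
             "high QC (0.70 ug/mL)": (0.70, [0.695, 0.702, 0.689])}
for name, (target, reps) in qc_levels.items():
    accuracy = np.mean(reps) / target * 100
    rsd = np.std(reps, ddof=1) / np.mean(reps) * 100
    print(f"{name}: accuracy {accuracy:.1f}% of nominal, precision {rsd:.2f}% RSD")
```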

  9. High-throughput sample processing and sample management; the functional evolution of classical cytogenetic assay towards automation.

    PubMed

    Ramakumar, Adarsh; Subramanian, Uma; Prasanna, Pataje G S

    2015-11-01

    High-throughput individual diagnostic dose assessment is essential for medical management of radiation-exposed subjects after a mass casualty. Cytogenetic assays such as the Dicentric Chromosome Assay (DCA) are recognized as the gold standard by international regulatory authorities. DCA is a multi-step and multi-day bioassay. DCA, as described in the IAEA manual, can be used to assess dose up to 4-6 weeks post-exposure quite accurately, but throughput is still a major issue and automation is essential. The throughput is limited both in terms of sample preparation and analysis of chromosome aberrations. Thus, there is a need to design and develop novel solutions that could utilize extensive laboratory automation for sample preparation, and bioinformatics approaches for chromosome-aberration analysis to overcome throughput issues. We have transitioned the bench-based cytogenetic DCA to a coherent process performing high-throughput automated biodosimetry for individual dose assessment, ensuring quality control (QC) and quality assurance (QA) aspects in accordance with international harmonized protocols. A Laboratory Information Management System (LIMS) is designed, implemented and adapted to manage increased sample processing capacity, develop and maintain standard operating procedures (SOP) for robotic instruments, avoid data transcription errors during processing, and automate analysis of chromosome aberrations using an image analysis platform. Our efforts described in this paper intend to bridge the current technological gaps and enhance the potential application of DCA for a dose-based stratification of subjects following a mass casualty. This paper describes one such potential integrated automated laboratory system and the functional evolution of the classical DCA towards increasing critically needed throughput. Published by Elsevier B.V.

  10. Determination of the anionic surfactant di(ethylhexyl) sodium sulfosuccinate in water samples collected from Gulf of Mexico coastal waters before and after landfall of oil from the Deepwater Horizon oil spill, May to October, 2010

    USGS Publications Warehouse

    Gray, James L.; Kanagy, Leslie K.; Furlong, Edward T.; McCoy, Jeff W.; Kanagy, Chris J.

    2011-01-01

    On April 22, 2010, the explosion on and subsequent sinking of the Deepwater Horizon oil drilling platform resulted in the release of crude oil into the Gulf of Mexico. At least 4.4 million barrels had been released into the Gulf of Mexico through July 15, 2010, 10 to 29 percent of which was chemically dispersed, primarily using two dispersant formulations. Initially, the dispersant Corexit 9527 was used, and when existing stocks of that formulation were exhausted, Corexit 9500 was used. Over 1.8 million gallons of the two dispersants were applied in the first 3 months after the spill. This report presents the development of an analytical method to analyze one of the primary surfactant components of both Corexit formulations, di(ethylhexyl) sodium sulfosuccinate (DOSS), the preliminary results, and the associated quality assurance/quality control (QA/QC) from samples collected from various points on the Gulf Coast between Texas and Florida. Seventy water samples and 8 field QC samples were collected before the predicted landfall of oil (pre-landfall) on the Gulf Coast, and 51 water samples and 10 field QC samples after the oil made landfall (post-landfall). Samples were collected in Teflon® bottles and stored at -20 °C until analysis. Extraction of whole-water samples used sorption onto a polytetrafluoroethylene (PTFE) filter to isolate DOSS, with subsequent 50 percent methanol/water elution of the combined dissolved and particulate DOSS fractions. High-performance liquid chromatography/tandem mass spectrometry (LC/MS/MS) was used to identify and quantify DOSS by the isotope dilution method, using a custom-synthesized ¹³C₄-DOSS labeled standard. Because of the ubiquitous presence of DOSS in laboratory reagent water, a chromatographic column was installed in the LC/MS/MS between the system pumps and the sample injector that separated this ambient background DOSS contamination from the sample DOSS, minimizing one source of blank contamination.
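
    A minimal sketch of isotope-dilution quantification of the kind described above: the native concentration follows from the native-to-labeled peak-area ratio and the known spike of the labeled standard. The unit response factor and all numeric values are assumptions, not data from this report.

```python
# Minimal sketch of isotope-dilution quantification: native concentration from
# the native/labelled peak-area ratio and the known labelled spike. The unit
# response factor and all numeric values are assumptions, not data from this report.

spiked_label_ng = 50.0        # 13C4-DOSS added to each sample before extraction
sample_volume_l = 0.5         # whole-water sample volume
response_factor = 1.0         # native vs. labelled response, from calibration

area_native = 18_400.0        # LC/MS/MS peak area, native DOSS transition
area_labelled = 21_100.0      # peak area, labelled DOSS transition

native_ng = (area_native / area_labelled) * spiked_label_ng / response_factor
concentration_ug_per_l = native_ng / 1000.0 / sample_volume_l
print(f"DOSS: {concentration_ug_per_l:.3f} ug/L")
```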

  11. QA/QC in the laboratory. Session F

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hood, F.C.

    1992-05-01

    Quality assurance and quality control (QA/QC) of analytical chemistry laboratory activities are essential to the validity and usefulness of resultant data. However, in themselves, conventional QA/QC measures will not always ensure that fraudulent data are not generated. Conventional QA/QC measures are based on the assumption that work will be done in good faith; to assure against fraudulent practices, QA/QC measures must be tailored to specific analysis protocols in anticipation of intentional misapplication of those protocols. Application of specific QA/QC measures to ensure against fraudulent practices results in an increased administrative burden being placed on the analytical process; accordingly, in keeping with graded QA philosophy, data quality objectives must be used to identify specific points of concern for special control to minimize the administrative impact.

  12. Field assessment of dried Plasmodium falciparum samples for malaria rapid diagnostic test quality control and proficiency testing in Ethiopia.

    PubMed

    Tamiru, Afework; Boulanger, Lucy; Chang, Michelle A; Malone, Joseph L; Aidoo, Michael

    2015-01-21

    Rapid diagnostic tests (RDTs) are now widely used for laboratory confirmation of suspected malaria cases to comply with the World Health Organization recommendation for universal testing before treatment. However, many malaria programmes lack quality control (QC) processes to assess RDT use under field conditions. Prior research showed the feasibility of using the dried tube specimen (DTS) method for preserving Plasmodium falciparum parasites for use as QC samples for RDTs. This study focused on the use of DTS for RDT QC and proficiency testing under field conditions. DTS were prepared using cultured P. falciparum at densities of 500 and 1,000 parasites/μL; 50 μL aliquots of these, along with parasite-negative human blood controls (0 parasites/μL), were air-dried in specimen tubes and reactivity verified after rehydration. The DTS were used in a field study in the Oromia Region of Ethiopia. Replicate DTS samples containing 0, 500 and 1,000 parasites/μL were stored at 4°C at a reference laboratory and at ambient temperatures at two nearby health facilities. At weeks 0, 4, 8, 12, 16, 20, and 24, the DTS were rehydrated and tested on RDTs stored under manufacturer-recommended temperatures at the reference laboratory and on RDTs stored under site-specific conditions at the two health facilities. Reactivity of DTS stored at 4°C at the reference laboratory on RDTs stored at the reference laboratory was considered the gold standard for assessing DTS stability. A proficiency-testing panel consisting of one negative and three positive samples, monitored with a checklist, was administered at weeks 12 and 24. At all seven time points, DTS stored at both the reference laboratory and health facility were reactive on RDTs stored under the recommended temperature and under field conditions, and the DTS without malaria parasites were negative. At the reference laboratory and one health facility, a 500 parasites/μL DTS from the proficiency panel was falsely reported as negative at week 24

  13. Development and application of a validated HPLC method for the analysis of dissolution samples of levothyroxine sodium drug products.

    PubMed

    Collier, J W; Shah, R B; Bryant, A R; Habib, M J; Khan, M A; Faustino, P J

    2011-02-20

    A rapid, selective, and sensitive gradient HPLC method was developed for the analysis of dissolution samples of levothyroxine sodium tablets. Current USP methodology for levothyroxine (L-T4) was not adequate to resolve co-elutants from a variety of levothyroxine drug product formulations. The USP method for analyzing dissolution samples of the drug product has shown significant intra- and inter-day variability. The sources of method variability include chromatographic interferences introduced by the dissolution media and the formulation excipients. In the present work, chromatographic separation of levothyroxine was achieved on an Agilent 1100 Series HPLC with a Waters Nova-pak column (250 mm × 3.9 mm) using a 0.01 M phosphate buffer (pH 3.0)-methanol (55:45, v/v) in a gradient elution mobile phase at a flow rate of 1.0 mL/min and a detection UV wavelength of 225 nm. The injection volume was 800 μL and the column temperature was maintained at 28°C. The method was validated according to USP Category I requirements. The validation characteristics included accuracy, precision, specificity, linearity, and analytical range. The standard curve was found to have a linear relationship (r² > 0.99) over the analytical range of 0.08-0.8 μg/mL. Accuracy ranged from 90 to 110% for low quality control (QC) standards and 95 to 105% for medium and high QC standards. Precision was <2% at all QC levels. The method was found to be accurate, precise, selective, and linear for L-T4 over the analytical range. The HPLC method was successfully applied to the analysis of dissolution samples of marketed levothyroxine sodium tablets. Published by Elsevier B.V.

  14. Protocols for the analytical characterization of therapeutic monoclonal antibodies. II - Enzymatic and chemical sample preparation.

    PubMed

    Bobaly, Balazs; D'Atri, Valentina; Goyon, Alexandre; Colas, Olivier; Beck, Alain; Fekete, Szabolcs; Guillarme, Davy

    2017-08-15

    The analytical characterization of therapeutic monoclonal antibodies and related proteins usually incorporates various sample preparation methodologies. Indeed, quantitative and qualitative information can be enhanced by simplifying the sample, thanks to the removal of sources of heterogeneity (e.g. N-glycans) and/or by decreasing the molecular size of the tested protein by enzymatic or chemical fragmentation. These approaches make the sample more suitable for chromatographic and mass spectrometric analysis. Structural elucidation and quality control (QC) analysis of biopharmaceutics are usually performed at intact, subunit and peptide levels. In this paper, general sample preparation approaches used to attain peptide, subunit and glycan level analysis are overviewed. Protocols are described to perform tryptic proteolysis, IdeS and papain digestion, reduction as well as deglycosylation by PNGase F and EndoS2 enzymes. Both historical and modern sample preparation methods were compared and evaluated using rituximab and trastuzumab, two reference therapeutic mAb products approved by Food and Drug Administration (FDA) and European Medicines Agency (EMA). The described protocols may help analysts to develop sample preparation methods in the field of therapeutic protein analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Application of clinical assay quality control (QC) to multivariate proteomics data: a workflow exemplified by 2-DE QC.

    PubMed

    Jackson, David; Bramwell, David

    2013-12-16

    Proteomics technologies can be effective for the discovery and assay of protein forms altered with disease. However, few examples of successful biomarker discovery yet exist. Critical to addressing this is the widespread implementation of appropriate QC (quality control) methodology. Such QC should combine the rigour of clinical laboratory assays with a suitable treatment of the complexity of the proteome by targeting separate assignable causes of variation. We demonstrate an approach, metric and example workflow for users to develop such targeted QC rules systematically and objectively, using a publicly available plasma DIGE data set. Hierarchical clustering analysis of standard channels is first used to discover correlated groups of features corresponding to specific assignable sources of technical variation. These effects are then quantified using a statistical distance metric, and followed on control charts. This allows measurement of process drift and the detection of runs that outlie for any given effect. A known technical issue on originally rejected gels was detected validating this approach, and relevant novel effects were also detected and classified effectively. Our approach was effective for 2-DE QC. Whilst we demonstrated this in a retrospective DIGE experiment, the principles would apply to ongoing QC and other proteomic technologies. This work asserts that properly carried out QC is essential to proteomics discovery experiments. Its significance is that it provides one possible novel framework for applying such methods, with a particular consideration of how to handle the complexity of the proteome. It not only focusses on 2DE-based methodology but also demonstrates general principles. A combination of results and discussion based upon a publicly available data set is used to illustrate the approach and allows a structured discussion of factors that experimenters may wish to bear in mind in other situations. The demonstration is on retrospective data
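
    A simple Shewhart-style control chart conveys the monitoring idea described above; it is not the authors' statistical distance metric, and the per-gel values, baseline choice, and 3-sigma limit are invented for illustration.

```python
import statistics

# Simple Shewhart-style control chart, in the spirit of (but not identical to)
# the statistical distance metric described above. Per-gel values, baseline and
# 3-sigma limit are invented for illustration.

distances = [1.1, 0.9, 1.3, 1.0, 1.2, 0.8, 1.1, 1.0, 3.4, 1.2]   # one QC value per gel/run

baseline = distances[:8]                    # early runs used to set control limits
centre = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
upper_limit = centre + 3 * sigma

for run, d in enumerate(distances, start=1):
    if d > upper_limit:
        print(f"gel {run}: distance {d:.1f} exceeds UCL {upper_limit:.2f} -> investigate")
```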

  16. Potential contamination of shipboard air samples by diffusive emissions of PCBs and other organic pollutants: implications and solutions.

    PubMed

    Lohmann, Rainer; Jaward, Foday M; Durham, Louise; Barber, Jonathan L; Ockenden, Wendy; Jones, Kevin C; Bruhn, Regina; Lakaschus, Soenke; Dachs, Jordi; Booij, Kees

    2004-07-15

    Air samples were taken onboard the RRS Bransfield on an Atlantic cruise from the United Kingdom to Halley, Antarctica, from October to December 1998, with the aim of establishing PCB oceanic background air concentrations and assessing their latitudinal distribution. Great care was taken to minimize pre- and post-collection contamination of the samples, which was validated through stringent QA/QC procedures. However, there is evidence that onboard contamination of the air samples occurred, following insidious, diffusive emissions on the ship. Other data (for PCBs and other persistent organic pollutants (POPs)) and examples of shipboard contamination are presented. The implications of these findings for past and future studies of global POPs distribution are discussed. Recommendations are made to help critically appraise and minimize the problems of insidious/diffusive shipboard contamination.

  17. Sampling hazelnuts for aflatoxin: uncertainty associated with sampling, sample preparation, and analysis.

    PubMed

    Ozay, Guner; Seyhan, Ferda; Yilmaz, Aysun; Whitaker, Thomas B; Slate, Andrew B; Giesbrecht, Francis

    2006-01-01

    The variability associated with the aflatoxin test procedure used to estimate aflatoxin levels in bulk shipments of hazelnuts was investigated. Sixteen 10 kg samples of shelled hazelnuts were taken from each of 20 lots that were suspected of aflatoxin contamination. The total variance associated with testing shelled hazelnuts was estimated and partitioned into sampling, sample preparation, and analytical variance components. Each variance component increased as aflatoxin concentration (either B1 or total) increased. With the use of regression analysis, mathematical expressions were developed to model the relationship between aflatoxin concentration and the total, sampling, sample preparation, and analytical variances. The expressions for these relationships were used to estimate the variance for any sample size, subsample size, and number of analyses for a specific aflatoxin concentration. The sampling, sample preparation, and analytical variances associated with estimating aflatoxin in a hazelnut lot at a total aflatoxin level of 10 ng/g and using a 10 kg sample, a 50 g subsample, dry comminution with a Robot Coupe mill, and a high-performance liquid chromatographic analytical method are 174.40, 0.74, and 0.27, respectively. The sampling, sample preparation, and analytical steps of the aflatoxin test procedure accounted for 99.4, 0.4, and 0.2% of the total variability, respectively.
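
    A hedged worked example of using the reported variance components (174.40, 0.74, and 0.27 at a 10 kg sample, 50 g subsample, and one analysis) to compare test plans is shown below. The inverse-proportional scaling with sample size, subsample size, and number of analyses is a common assumption in such sampling studies and is stated here as an assumption rather than taken verbatim from the paper.

```python
import math

# Hedged worked example scaling the reported variance components to other test
# plans. The inverse-proportional scaling with sample size, subsample size and
# number of analyses is an assumption, not a formula quoted from this paper.

VAR_SAMPLING, VAR_PREP, VAR_ANALYSIS = 174.40, 0.74, 0.27   # at 10 kg / 50 g / 1 analysis

def total_variance(sample_kg, subsample_g, n_analyses):
    return (VAR_SAMPLING * 10.0 / sample_kg
            + VAR_PREP * 50.0 / subsample_g
            + VAR_ANALYSIS / n_analyses)

for plan in [(10, 50, 1), (20, 100, 2)]:
    var = total_variance(*plan)
    cv = math.sqrt(var) / 10.0 * 100.0      # CV at the 10 ng/g level
    print(f"{plan[0]} kg sample, {plan[1]} g subsample, {plan[2]} analyses: "
          f"variance {var:.1f}, CV {cv:.0f}%")
```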

  18. ChronQC: a quality control monitoring system for clinical next generation sequencing.

    PubMed

    Tawari, Nilesh R; Seow, Justine Jia Wen; Perumal, Dharuman; Ow, Jack L; Ang, Shimin; Devasia, Arun George; Ng, Pauline C

    2018-05-15

    ChronQC is a quality control (QC) tracking system for clinical implementation of next-generation sequencing (NGS). ChronQC generates time series plots for various QC metrics to allow comparison of current runs to historical runs. ChronQC has multiple features for tracking QC data including Westgard rules for clinical validity, laboratory-defined thresholds and historical observations within a specified time period. Users can record their notes and corrective actions directly onto the plots for long-term recordkeeping. ChronQC facilitates regular monitoring of clinical NGS to enable adherence to high quality clinical standards. ChronQC is freely available on GitHub (https://github.com/nilesh-tawari/ChronQC), Docker (https://hub.docker.com/r/nileshtawari/chronqc/) and the Python Package Index. ChronQC is implemented in Python and runs on all common operating systems (Windows, Linux and Mac OS X). tawari.nilesh@gmail.com or pauline.c.ng@gmail.com. Supplementary data are available at Bioinformatics online.
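
    Two classic Westgard rules (1-3s and 2-2s) can be sketched in a few lines to show the kind of rule-based monitoring described above; the target, SD, and per-run metric values are invented, and this is not ChronQC's implementation.

```python
# Sketch of two classic Westgard rules (1-3s and 2-2s) applied to a QC metric
# tracked across sequencing runs; not ChronQC's implementation, and the target,
# SD and per-run values are invented.

target, sd = 100.0, 4.0                        # established mean and SD of the metric
runs = [101, 97, 104, 99, 109, 110, 96, 87]    # metric value for each run

z = [(x - target) / sd for x in runs]

for i, zi in enumerate(z, start=1):
    if abs(zi) > 3:                            # 1-3s: one point beyond 3 SD
        print(f"run {i}: 1-3s violation (z = {zi:+.1f})")
    if (i >= 2 and abs(z[i - 1]) > 2 and abs(z[i - 2]) > 2
            and (z[i - 1] > 0) == (z[i - 2] > 0)):   # 2-2s: two consecutive beyond 2 SD, same side
        print(f"runs {i - 1}-{i}: 2-2s violation")
```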

  19. A Mars Sample Return Sample Handling System

    NASA Technical Reports Server (NTRS)

    Wilson, David; Stroker, Carol

    2013-01-01

    We present a sample handling system, a subsystem of the proposed Dragon landed Mars Sample Return (MSR) mission [1], that can return to Earth orbit a significant mass of frozen Mars samples potentially consisting of rock cores, subsurface drilled rock and ice cuttings, pebble-sized rocks, and soil scoops. The sample collection, storage, retrieval and packaging assumptions and concepts in this study are applicable to NASA's MPPG MSR mission architecture options [2]. Our study assumes a predecessor rover mission collects samples for return to Earth to address questions on past life, climate change, water history, age dating, understanding Mars interior evolution [3], and human safety and in-situ resource utilization. Hence the rover will have "integrated priorities for rock sampling" [3] that cover collection of subaqueous or hydrothermal sediments, low-temperature fluid-altered rocks, unaltered igneous rocks, regolith and atmosphere samples. Samples could include: drilled rock cores, alluvial and fluvial deposits, subsurface ice and soils, clays, sulfates, salts including perchlorates, aeolian deposits, and concretions. Thus samples will have a broad range of bulk densities, and require for Earth-based analysis where practical: in-situ characterization, management of degradation such as perchlorate deliquescence and volatile release, and contamination management. We propose to adopt a sample container with a set of cups, each with a sample from a specific location. We considered two sample cup sizes: (1) a small cup sized for samples matching those submitted to in-situ characterization instruments, and (2) a larger cup for 100 mm rock cores [4] and pebble-sized rocks, thus providing diverse samples and optimizing the MSR sample mass payload fraction for a given payload volume. We minimize sample degradation by keeping the samples frozen in the MSR payload sample canister using Peltier chip cooling. The cups are sealed by interference fitted heat activated memory

  20. WE-AB-206-00: Diagnostic QA/QC Hands-On Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    The involvement of medical physicists in diagnostic ultrasound imaging service is increasing due to QC and accreditation requirements. The goal of this ultrasound hands-on workshop is to demonstrate quality control (QC) testing in diagnostic ultrasound and to provide updates on ACR ultrasound accreditation requirements. The first half of this workshop will include two presentations reviewing diagnostic ultrasound QA/QC and ACR ultrasound accreditation requirements. The second half of the workshop will include live demonstrations of basic QC tests. An array of ultrasound testing phantoms and ultrasound scanners will be available for attendees to learn diagnostic ultrasound QC in a hands-on environment with live demonstrations and on-site instructors. The targeted attendees are medical physicists in diagnostic imaging. Learning Objectives: (1) Gain familiarity with common elements of a QA/QC program for diagnostic ultrasound imaging; (2) Identify QC tools available for testing diagnostic ultrasound systems and learn how to use these tools; (3) Learn ACR ultrasound accreditation requirements. Jennifer Walter is an employee of American College of Radiology on Ultrasound Accreditation.

  1. [A comparison of convenience sampling and purposive sampling].

    PubMed

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

    Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probabilistic sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, opportunity to participate is not equal for all qualified individuals in the target population and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on study purpose with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable and sample size is determined by data saturation, not by statistical power analysis.

  2. Empirical insights and considerations for the OBT inter-laboratory comparison of environmental samples.

    PubMed

    Kim, Sang-Bog; Roche, Jennifer

    2013-08-01

    Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world. There are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of the OBT measurement, is not possible. Alternatively, quality control (QC), or verification of the precision of the OBT measurement, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories. A possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to the analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory practices are reviewed and the findings are described. Based on our experiences, a few considerations were suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  3. Information sampling behavior with explicit sampling costs

    PubMed Central

    Juni, Mordechai Z.; Gureckis, Todd M.; Maloney, Laurence T.

    2015-01-01

    The decision to gather information should take into account both the value of information and its accrual costs in time, energy and money. Here we explore how people balance the monetary costs and benefits of gathering additional information in a perceptual-motor estimation task. Participants were rewarded for touching a hidden circular target on a touch-screen display. The target’s center coincided with the mean of a circular Gaussian distribution from which participants could sample repeatedly. Each “cue” — sampled one at a time — was plotted as a dot on the display. Participants had to repeatedly decide, after sampling each cue, whether to stop sampling and attempt to touch the hidden target or continue sampling. Each additional cue increased the participants’ probability of successfully touching the hidden target but reduced their potential reward. Two experimental conditions differed in the initial reward associated with touching the hidden target and the fixed cost per cue. For each condition we computed the optimal number of cues that participants should sample, before taking action, to maximize expected gain. Contrary to recent claims that people gather less information than they objectively should before taking action, we found that participants over-sampled in one experimental condition, and did not significantly under- or over-sample in the other. Additionally, while the ideal observer model ignores the current sample dispersion, we found that participants used it to decide whether to stop sampling and take action or continue sampling, a possible consequence of imperfect learning of the underlying population dispersion across trials. PMID:27429991
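
    A minimal sketch of the expected-gain calculation described above, assuming the probability of hitting a circular target when aiming at the mean of n cues drawn from an isotropic 2-D Gaussian (and ignoring motor noise); all parameter values are illustrative, not the study's.

```python
import numpy as np

def expected_gain(n, reward0, cost_per_cue, sigma, target_radius):
    """Expected gain after sampling n cues: hit probability for the sample
    mean of n draws from an isotropic 2-D Gaussian (per-cue dispersion
    `sigma`) times the reward remaining after n fixed cue costs."""
    p_hit = 1.0 - np.exp(-n * target_radius**2 / (2.0 * sigma**2))
    return p_hit * (reward0 - cost_per_cue * n)

n_values = np.arange(1, 31)
gains = [expected_gain(n, reward0=100.0, cost_per_cue=2.0,
                       sigma=30.0, target_radius=20.0) for n in n_values]
print("optimal number of cues:", n_values[int(np.argmax(gains))])
```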

  4. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    PubMed

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
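
    As an illustration of the third step (PQN), the sketch below applies probabilistic quotient normalization to a samples-by-features intensity matrix; it is a generic PQN implementation under standard assumptions, not the authors' code.

```python
import numpy as np

def pqn_normalize(X, reference=None):
    """Probabilistic quotient normalization of a samples-by-features matrix.

    Each sample is divided by the median of its feature-wise quotients
    against a reference profile (by default the median across samples)."""
    X = np.asarray(X, dtype=float)
    if reference is None:
        reference = np.median(X, axis=0)
    quotients = X / reference                 # feature-wise quotients per sample
    factors = np.median(quotients, axis=1)    # one dilution factor per sample
    return X / factors[:, None], factors

X = np.array([[1.0, 2.0, 4.0],
              [2.0, 4.0, 8.0]])               # second sample is twice as concentrated
X_norm, factors = pqn_normalize(X)
print(factors)                                # ~[0.67, 1.33]; normalized rows become identical
```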

  5. Lunar Sample Quarantine & Sample Curation

    NASA Technical Reports Server (NTRS)

    Allton, Judith H.

    2000-01-01

    The main goal of this presentation is to discuss the responsibilities of the lunar sample quarantine project: flying the mission safely and on schedule, protecting the Earth from biohazards, and preserving the scientific integrity of the samples.

  6. Quality Control (QC) System Development for the Pell Grant Program: A Conceptual Framework.

    ERIC Educational Resources Information Center

    Advanced Technology, Inc., Reston, VA.

    The objectives of the Pell Grant quality control (QC) system and the general definition of QC are considered. Attention is also directed to: the objectives of the Stage II Pell Grant QC system design and testing project, the approach used to develop the QC system, and the interface of the QC system and the Pell Grant delivery system. The…

  7. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, Cyril V.

    1991-01-01

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds.

  8. Bioanalytical high-throughput selected reaction monitoring-LC/MS determination of selected estrogen receptor modulators in human plasma: 2000 samples/day.

    PubMed

    Zweigenbaum, J; Henion, J

    2000-06-01

    The high-throughput determination of small molecules in biological matrixes has become an important part of drug discovery. This work shows that increased throughput LC/MS/MS techniques can be used for the analysis of selected estrogen receptor modulators in human plasma where more than 2000 samples may be analyzed in a 24-h period. The compounds used to demonstrate the high-throughput methodology include tamoxifen, raloxifene, 4-hydroxytamoxifen, nafoxidine, and idoxifene. Tamoxifen and raloxifene are used in both breast cancer therapy and osteoporosis and have shown prophylactic potential for the reduction of the risk of breast cancer. The described strategy provides LC/MS/MS separation and quantitation for each of the five test articles in control human plasma. The method includes sample preparation employing liquid-liquid extraction in the 96-well format, an LC separation of the five compounds in less than 30 s, and selected reaction monitoring detection from low nanogram to microgram per milliliter levels. Precision and accuracy are determined where each 96-well plate is considered a typical "tray" having calibration standards and quality control (QC) samples dispersed through each plate. A concept is introduced where 24 96-well plates analyzed in 1 day is considered a "grand tray", and the method is cross-validated with standards placed only at the beginning of the first plate and the end of the last plate. Using idoxifene-d5 as an internal standard, the results obtained for idoxifene and tamoxifen satisfy current bioanalytical method validation criteria on two separate days where 2112 and 2304 samples were run, respectively. Method validation included 24-h autosampler stability and one freeze-thaw cycle stability for the extracts. Idoxifene showed acceptable results with accuracy ranging from 0.3% for the high QC to 15.4% for the low QC and precision of 3.6%-13.9% relative standard deviation. Tamoxifen showed accuracy ranging from 1.6% to 13

  9. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, C.V.

    1991-02-05

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds. 11 figures.

  10. Sample Manipulation System for Sample Analysis at Mars

    NASA Technical Reports Server (NTRS)

    Mumm, Erik; Kennedy, Tom; Carlson, Lee; Roberts, Dustyn

    2008-01-01

    The Sample Analysis at Mars (SAM) instrument will analyze Martian samples collected by the Mars Science Laboratory Rover with a suite of spectrometers. This paper discusses the driving requirements, design, and lessons learned in the development of the Sample Manipulation System (SMS) within SAM. The SMS stores and manipulates 74 sample cups to be used for solid sample pyrolysis experiments. Focus is given to the unique mechanism architecture developed to deliver a high packing density of sample cups in a reliable, fault tolerant manner while minimizing system mass and control complexity. Lessons learned are presented on contamination control, launch restraint mechanisms for fragile sample cups, and mechanism test data.

  11. INCORPORATING PRIOR KNOWLEDGE IN ENVIRONMENTAL SAMPLING: RANKED SET SAMPLING AND OTHER DOUBLE SAMPLING PROCEDURES

    EPA Science Inventory

    Environmental sampling can be difficult and expensive to carry out. Those taking the samples would like to integrate their knowledge of the system of study or their judgment about the system into the sample selection process to decrease the number of necessary samples. However,...

  12. Sampling for area estimation: A comparison of full-frame sampling with the sample segment approach

    NASA Technical Reports Server (NTRS)

    Hixson, M.; Bauer, M. E.; Davis, B. J. (Principal Investigator)

    1979-01-01

    The author has identified the following significant results. Full-frame classifications of wheat and non-wheat for eighty counties in Kansas were repetitively sampled to simulate alternative sampling plans. Evaluation of four sampling schemes involving different numbers of samples and different sizes of sampling units shows that the precision of the wheat estimates increased as the segment size decreased and the number of segments was increased. Although the average bias associated with the various sampling schemes was not significantly different, the maximum absolute bias was directly related to sampling unit size.

  13. Trends in analytical methodologies for the determination of alkylphenols and bisphenol A in water samples.

    PubMed

    Salgueiro-González, N; Muniategui-Lorenzo, S; López-Mahía, P; Prada-Rodríguez, D

    2017-04-15

    In the last decade, the impact of alkylphenols and bisphenol A in the aquatic environment has been widely evaluated because of their extensive use in industrial and household applications as well as their toxicological effects. These compounds are well-known endocrine disrupting compounds (EDCs) which can affect the hormonal system of humans and wildlife, even at low concentrations. Because these pollutants enter the environment mainly through water, the most affected compartment, analytical methods which allow the determination of these compounds in aqueous samples at low levels are mandatory. In this review, an overview of the most significant advances in the analytical methodologies for the determination of alkylphenols and bisphenol A in waters is presented (from 2002 to the present). Sample handling and instrumental detection strategies are critically discussed, including analytical parameters related to quality assurance and quality control (QA/QC). Special attention is paid to miniaturized sample preparation methodologies and approaches proposed to reduce time and reagent consumption according to Green Chemistry principles, which have increased in the last five years. Finally, relevant applications of these methods to the analysis of water samples are examined, with wastewater and surface water being the most investigated matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Sample introducing apparatus and sample modules for mass spectrometer

    DOEpatents

    Thompson, Cyril V.; Wise, Marcus B.

    1993-01-01

    An apparatus for introducing gaseous samples from a wide range of environmental matrices into a mass spectrometer for analysis of the samples is described. Several sample preparing modules including a real-time air monitoring module, a soil/liquid purge module, and a thermal desorption module are individually and rapidly attachable to the sample introducing apparatus for supplying gaseous samples to the mass spectrometer. The sample-introducing apparatus uses a capillary column for conveying the gaseous samples into the mass spectrometer and is provided with an open/split interface in communication with the capillary and a sample archiving port through which at least about 90 percent of the gaseous sample in a mixture with an inert gas that was introduced into the sample introducing apparatus is separated from a minor portion of the mixture entering the capillary discharged from the sample introducing apparatus.

  15. Sampling--how big a sample?

    PubMed

    Aitken, C G

    1999-07-01

    It is thought that, in a consignment of discrete units, a certain proportion of the units contain illegal material. A sample of the consignment is to be inspected. Various methods for the determination of the sample size are compared. The consignment will be considered as a random sample from some super-population of units, a certain proportion of which contain drugs. For large consignments, a probability distribution, known as the beta distribution, for the proportion of the consignment which contains illegal material is obtained. This distribution is based on prior beliefs about the proportion. Under certain specific conditions the beta distribution gives the same numerical results as an approach based on the binomial distribution. The binomial distribution provides a probability for the number of units in a sample which contain illegal material, conditional on knowing the proportion of the consignment which contains illegal material. This is in contrast to the beta distribution which provides probabilities for the proportion of a consignment which contains illegal material, conditional on knowing the number of units in the sample which contain illegal material. The interpretation when the beta distribution is used is much more intuitively satisfactory. It is also much more flexible in its ability to cater for prior beliefs which may vary given the different circumstances of different crimes. For small consignments, a distribution, known as the beta-binomial distribution, for the number of units in the consignment which are found to contain illegal material, is obtained, based on prior beliefs about the number of units in the consignment which are thought to contain illegal material. As with the beta and binomial distributions for large samples, it is shown that, in certain specific conditions, the beta-binomial and hypergeometric distributions give the same numerical results. However, the beta-binomial distribution, as with the beta distribution, has a more
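
    A minimal sketch of the beta-distribution approach for large consignments: with a Beta prior on the proportion containing illegal material, the posterior after inspecting n units (all found positive) is again a Beta distribution, from which a lower probability bound on the proportion follows. The function name, prior and numbers are illustrative, not the paper's worked examples.

```python
from scipy.stats import beta

def proportion_lower_bound(n_positive, n_sampled,
                           prior_a=1.0, prior_b=1.0, prob=0.95):
    """Lower bound (with probability `prob`) on the proportion of a large
    consignment containing illegal material, given a Beta(prior_a, prior_b)
    prior and n_positive positives out of n_sampled inspected units."""
    a = prior_a + n_positive
    b = prior_b + (n_sampled - n_positive)
    return beta.ppf(1.0 - prob, a, b)

# With a uniform prior and 4 inspected units, all positive, one can state with
# 95% probability that more than ~55% of the consignment contains illegal material:
print(proportion_lower_bound(4, 4))   # ~0.55
```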

  16. Sample introducing apparatus and sample modules for mass spectrometer

    DOEpatents

    Thompson, C.V.; Wise, M.B.

    1993-12-21

    An apparatus for introducing gaseous samples from a wide range of environmental matrices into a mass spectrometer for analysis of the samples is described. Several sample preparing modules including a real-time air monitoring module, a soil/liquid purge module, and a thermal desorption module are individually and rapidly attachable to the sample introducing apparatus for supplying gaseous samples to the mass spectrometer. The sample-introducing apparatus uses a capillary column for conveying the gaseous samples into the mass spectrometer and is provided with an open/split interface in communication with the capillary and a sample archiving port through which at least about 90 percent of the gaseous sample in a mixture with an inert gas that was introduced into the sample introducing apparatus is separated from a minor portion of the mixture entering the capillary discharged from the sample introducing apparatus. 5 figures.

  17. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  18. Estimating regression coefficients from clustered samples: Sampling errors and optimum sample allocation

    NASA Technical Reports Server (NTRS)

    Kalton, G.

    1983-01-01

    A number of surveys were conducted to study the relationship between the level of aircraft or traffic noise exposure experienced by people living in a particular area and their annoyance with it. These surveys generally employ a clustered sample design which affects the precision of the survey estimates. Regression analysis of annoyance on noise measures and other variables is often an important component of the survey analysis. Formulae are presented for estimating the standard errors of regression coefficients and ratio of regression coefficients that are applicable with a two- or three-stage clustered sample design. Using a simple cost function, they also determine the optimum allocation of the sample across the stages of the sample design for the estimation of a regression coefficient.
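
    The report's exact formulae are not reproduced in the record; as a hedged illustration, a common approximation inflates the simple-random-sampling standard error of a coefficient by the square root of the design effect deff = 1 + (b - 1)·rho, where b is the cluster sample size and rho the intraclass correlation.

```python
import math

def clustered_se(se_srs, cluster_size, intraclass_corr):
    """Approximate standard error of a regression coefficient under a
    two-stage clustered design, using the design effect
    deff = 1 + (b - 1) * rho (a common approximation; the report's
    exact formulae may differ)."""
    deff = 1.0 + (cluster_size - 1.0) * intraclass_corr
    return se_srs * math.sqrt(deff)

print(clustered_se(se_srs=0.10, cluster_size=20, intraclass_corr=0.05))  # ~0.14
```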

  19. Sampling for area estimation: A comparison of full-frame sampling with the sample segment approach. [Kansas

    NASA Technical Reports Server (NTRS)

    Hixson, M. M.; Bauer, M. E.; Davis, B. J.

    1979-01-01

    The effect of sampling on the accuracy (precision and bias) of crop area estimates made from classifications of LANDSAT MSS data was investigated. Full-frame classifications of wheat and non-wheat for eighty counties in Kansas were repetitively sampled to simulate alternative sampling plans. Four sampling schemes involving different numbers of samples and different size sampling units were evaluated. The precision of the wheat area estimates increased as the segment size decreased and the number of segments was increased. Although the average bias associated with the various sampling schemes was not significantly different, the maximum absolute bias was directly related to sampling unit size.

  20. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  1. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  2. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  3. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  4. 40 CFR 98.44 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.44 Section 98.44 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electricity Generation § 98.44 Monitoring and QA/QC...

  5. 40 CFR 98.64 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...

  6. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  7. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  8. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  9. 40 CFR 98.334 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Zinc Production § 98.334 Monitoring and QA/QC requirements. If..., belt weigh feeders, weighed purchased quantities in shipments or containers, combination of bulk...

  10. 40 CFR 98.64 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.64 Section 98.64 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Aluminum Production § 98.64 Monitoring and QA/QC requirements...

  11. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98.84 Section 98.84 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements...

  12. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  13. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  14. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC requirements. (a) For calendar year 2011 monitoring, you may follow the provisions in paragraphs (a)(1) through...

  15. 40 CFR 98.94 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Electronics Manufacturing § 98.94 Monitoring and QA/QC...-specific heel factors for each container type for each gas used, according to the procedures in paragraphs...

  16. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  17. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  18. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 21 2011-07-01 2011-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  19. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... determine quantity in accordance with this paragraph. (i) Reporters that supply CO2 in containers using...

  20. Two-stage sequential sampling: A neighborhood-free adaptive sampling procedure

    USGS Publications Warehouse

    Salehi, M.; Smith, D.R.

    2005-01-01

    Designing an efficient sampling scheme for a rare and clustered population is a challenging area of research. Adaptive cluster sampling, which has been shown to be viable for such a population, is based on sampling a neighborhood of units around a unit that meets a specified condition. However, the edge units produced by sampling neighborhoods have proven to limit the efficiency and applicability of adaptive cluster sampling. We propose a sampling design that is adaptive in the sense that the final sample depends on observed values, but it avoids the use of neighborhoods and the sampling of edge units. Unbiased estimators of population total and its variance are derived using Murthy's estimator. The modified two-stage sampling design is easy to implement and can be applied to a wider range of populations than adaptive cluster sampling. We evaluate the proposed sampling design by simulating sampling of two real biological populations and an artificial population for which the variable of interest took the value either 0 or 1 (e.g., indicating presence and absence of a rare event). We show that the proposed sampling design is more efficient than conventional sampling in nearly all cases. The approach used to derive estimators (Murthy's estimator) opens the door for unbiased estimators to be found for similar sequential sampling designs. © 2005 American Statistical Association and the International Biometric Society.

  1. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  2. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  3. 40 CFR 98.474 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Injection of Carbon Dioxide § 98.474 Monitoring and QA/QC.... (2) You must determine the quarterly mass or volume of contents in all containers if you receive CO2...

  4. 40 CFR 98.424 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Carbon Dioxide § 98.424 Monitoring and QA/QC... containers shall measure the mass in each CO2 container using weigh bills, scales, or load cells and sum the...

  5. Enhanced conformational sampling using enveloping distribution sampling.

    PubMed

    Lin, Zhixiong; van Gunsteren, Wilfred F

    2013-10-14

    Lessening the problem of insufficient conformational sampling in biomolecular simulations is still a major challenge in computational biochemistry. In this article, an application of the method of enveloping distribution sampling (EDS) is proposed that addresses this challenge, and its sampling efficiency is demonstrated in simulations of a hexa-β-peptide whose conformational equilibrium encompasses two different helical folds, i.e., a right-handed 2.7(10∕12)-helix and a left-handed 3(14)-helix, separated by a high energy barrier. Standard MD simulations of this peptide using the GROMOS 53A6 force field did not reach convergence of the free enthalpy difference between the two helices even after 500 ns of simulation time. The use of soft-core non-bonded interactions in the centre of the peptide did enhance the number of transitions between the helices, but at the same time led to neglect of relevant helical configurations. In the simulations of a two-state EDS reference Hamiltonian that envelops both the physical peptide and the soft-core peptide, sampling of the conformational space of the physical peptide ensures that physically relevant conformations can be visited, and sampling of the conformational space of the soft-core peptide helps to enhance the transitions between the two helices. The EDS simulations sampled many more transitions between the two helices and showed much faster convergence of the relative free enthalpy of the two helices compared with the standard MD simulations, with only a slightly larger computational effort to determine optimized EDS parameters. Combined with various methods to smoothen the potential energy surface, the proposed EDS application will be a powerful technique to enhance the sampling efficiency in biomolecular simulations.
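
    For reference, the two-state EDS reference Hamiltonian described above is conventionally written as V_R = -(1/(beta*s)) ln[ sum_i exp(-beta*s*(V_i - E_i^R)) ]; the sketch below evaluates that expression for a single configuration. The smoothness parameter s, energy offsets E_i^R and energies are illustrative placeholders, not the paper's optimized values.

```python
import numpy as np

def eds_reference_energy(energies, offsets, s, beta=1.0 / 2.494):
    """Multi-state EDS reference-state energy
        V_R = -1/(beta*s) * ln( sum_i exp(-beta*s*(V_i - E_i^R)) ).
    `beta` defaults to 1/kT at ~300 K in kJ/mol units."""
    energies = np.asarray(energies, dtype=float)
    offsets = np.asarray(offsets, dtype=float)
    return -np.log(np.sum(np.exp(-beta * s * (energies - offsets)))) / (beta * s)

# One configuration: energies of the physical and the soft-core peptide Hamiltonians
# (placeholder numbers), with placeholder energy offsets and smoothness.
print(eds_reference_energy(energies=[-1250.0, -1230.0], offsets=[0.0, 15.0], s=0.05))
```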

  6. The Internet of Samples in the Earth Sciences (iSamples)

    NASA Astrophysics Data System (ADS)

    Carter, M. R.; Lehnert, K. A.

    2015-12-01

    Across most Earth Science disciplines, research depends on the availability of samples collected above, at, and beneath Earth's surface, on the moon and in space, or generated in experiments. Many domains in the Earth Sciences have recently expressed the need for better discovery, access, and sharing of scientific samples and collections (EarthCube End-User Domain workshops, 2012 and 2013, http://earthcube.org/info/about/end-user-workshops), as has the US government (OSTP Memo, March 2014). The Internet of Samples in the Earth Sciences (iSamples) is an initiative funded as a Research Coordination Network (RCN) within the EarthCube program to address this need. iSamples aims to advance the use of innovative cyberinfrastructure to connect physical samples and sample collections across the Earth Sciences with digital data infrastructures to revolutionize their utility for science. iSamples strives to build, grow, and foster a new community of practice, in which domain scientists, curators of sample repositories and collections, computer and information scientists, software developers and technology innovators engage in and collaborate on defining, articulating, and addressing the needs and challenges of physical samples as a critical component of digital data infrastructure. A primary goal of iSamples is to deliver a community-endorsed set of best practices and standards for the registration, description, identification, and citation of physical specimens and define an actionable plan for implementation. iSamples conducted a broad community survey about sample sharing and has created 5 different working groups to address the different challenges of developing the internet of samples - from metadata schemas and unique identifiers to an architecture of a shared cyberinfrastructure for collections, to digitization of existing collections, to education, and ultimately to establishing the physical infrastructure that will ensure preservation and access of the physical

  7. Joint design of QC-LDPC codes for coded cooperation system with joint iterative decoding

    NASA Astrophysics Data System (ADS)

    Zhang, Shunwai; Yang, Fengfan; Tang, Lei; Ejaz, Saqib; Luo, Lin; Maharaj, B. T.

    2016-03-01

    In this paper, we investigate the joint design of quasi-cyclic low-density parity-check (QC-LDPC) codes for a coded cooperation system with joint iterative decoding at the destination. First, QC-LDPC codes based on the base matrix and exponent matrix are introduced, and then we describe two types of girth-4 cycles in the QC-LDPC codes employed by the source and relay. In the equivalent parity-check matrix corresponding to the jointly designed QC-LDPC codes employed by the source and relay, all girth-4 cycles, including both type I and type II, are cancelled. Theoretical analysis and numerical simulations show that the jointly designed QC-LDPC coded cooperation effectively combines the cooperation gain and the channel coding gain, and outperforms coded non-cooperation under the same conditions. Furthermore, the bit error rate performance of the coded cooperation employing jointly designed QC-LDPC codes is better than that of random LDPC codes and separately designed QC-LDPC codes over AWGN channels.
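
    The type I/type II cancellation itself is specific to the paper's joint construction, but the basic check for girth-4 cycles in any parity-check matrix is simple: two rows sharing ones in two or more columns form a length-4 cycle. A small sketch of that check (illustrative, not the authors' design procedure):

```python
import numpy as np

def has_girth4_cycle(H):
    """Return True if parity-check matrix H contains a length-4 cycle,
    i.e. two rows that both have ones in two or more common columns."""
    H = np.asarray(H)
    overlap = H @ H.T            # overlap[i, j] = number of shared one-columns
    m = H.shape[0]
    return any(overlap[i, j] >= 2 for i in range(m) for j in range(i + 1, m))

H = np.array([[1, 1, 0, 1],
              [1, 1, 1, 0],      # rows 0 and 1 share columns 0 and 1 -> 4-cycle
              [0, 0, 1, 1]])
print(has_girth4_cycle(H))       # True
```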

  8. Future Lunar Sampling Missions: Big Returns on Small Samples

    NASA Astrophysics Data System (ADS)

    Shearer, C. K.; Borg, L.

    2002-01-01

    The next sampling missions to the Moon will result in the return of sample mass (100g to 1 kg) substantially smaller than those returned by the Apollo missions (380 kg). Lunar samples to be returned by these missions are vital for: (1) calibrating the late impact history of the inner solar system that can then be extended to other planetary surfaces; (2) deciphering the effects of catastrophic impacts on a planetary body (i.e. Aitken crater); (3) understanding the very late-stage thermal and magmatic evolution of a cooling planet; (4) exploring the interior of a planet; and (5) examining volatile reservoirs and transport on an airless planetary body. Can small lunar samples be used to answer these and other pressing questions concerning important solar system processes? Two potential problems with small, robotically collected samples are placing them in a geologic context and extracting robust planetary information. Although geologic context will always be a potential problem with any planetary sample, new lunar samples can be placed within the context of the important Apollo - Luna collections and the burgeoning planet-scale data sets for the lunar surface and interior. Here we illustrate the usefulness of applying both new or refined analytical approaches in deciphering information locked in small lunar samples.

  9. Development and validation of a sensitive LC-MS/MS method for the determination of fenoterol in human plasma and urine samples.

    PubMed

    Sanghvi, M; Ramamoorthy, A; Strait, J; Wainer, I W; Moaddel, R

    2013-08-15

    Due to the lack of sensitivity in current methods for the determination of fenoterol (Fen), a rapid LC-MS/MS method was developed for the determination of (R,R')-Fen and (R,R';S,S')-Fen in plasma and urine. The method was fully validated and was linear from 50pg/ml to 2000pg/ml for plasma and from 2.500ng/ml to 160ng/ml for urine with a lower limit of quantitation of 52.8pg/ml in plasma. The coefficient of variation was <15% for the high QC standards and <10% for the low QC standards in plasma and was <15% for the high and low QC standards in urine. The relative concentrations of (R,R')-Fen and (S,S')-Fen were determined using a chirobiotic T chiral stationary phase. The method was used to determine the concentration of (R,R')-Fen in plasma and urine samples obtained in an oral cross-over study of (R,R')-Fen and (R,R';S,S')-Fen formulations. The results demonstrated a potential pre-systemic enantioselective interaction in which the (S,S')-Fen reduces the sulfation of the active (R,R')-Fen. The data suggest that a non-racemic mixture of the Fen enantiomers may provide better bioavailability of the active (R,R')-Fen for use in the treatment of cardiovascular disease. Published by Elsevier B.V.

  10. Development and Validation of a Sensitive LC-MS/MS Method for the Determination of Fenoterol in Human Plasma and Urine Samples

    PubMed Central

    Sanghvi, M.; Ramamoorthy, A.; Strait, J.; Wainer, I. W.; Moaddel, R.

    2013-01-01

    Due to the lack of sensitivity in current methods for the determination of fenoterol (Fen), a rapid LC-MS/MS method was developed for the determination of (R,R′)-Fen and (R,R′;S,S′)-Fen in plasma and urine. The method was fully validated and was linear from 50 pg/ml to 2000 pg/ml for plasma and from 2.500 ng/ml to 160 ng/ml for urine with a lower limit of quantitation of 52.8 pg/ml in plasma. The coefficient of variation was <15% for the high QC standards and <10% for the low QC standards in plasma and was <15% for the high and low QC standards in urine. The relative concentrations of (R,R′)-Fen and (S,S′)-Fen were determined using a chirobiotic T chiral stationary phase. The method was used to determine the concentration of (R,R′)-Fen in plasma and urine samples obtained in an oral cross-over study of (R,R′)-Fen and (R,R′;S,S′)-Fen formulations. The results demonstrated a potential pre-systemic enantioselective interaction in which the (S,S′)-Fen reduces the sulfation of the active (R,R′)-Fen. The data suggest that a non-racemic mixture of the Fen enantiomers may provide better bioavailability of the active (R,R′)-Fen for use in the treatment of cardiovascular disease. PMID:23872161

  11. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004

  12. Soil Gas Sample Handling: Evaluation of Water Removal and Sample Ganging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fritz, Brad G.; Abrecht, David G.; Hayes, James C.

    2016-10-31

    Soil gas sampling is currently conducted in support of Nuclear Test Ban treaty verification. Soil gas samples are collected and analyzed for isotopes of interest. Some issues that can impact sampling and analysis of these samples are excess moisture and sample processing time. Here we discuss three potential improvements to the current sampling protocol: a desiccant for water removal, use of molecular sieve to remove CO2 from the sample during collection, and a ganging manifold to allow composite analysis of multiple samples.

  13. Sample processing approach for detection of ricin in surface samples.

    PubMed

    Kane, Staci; Shah, Sanjiv; Erler, Anne Marie; Alfaro, Teneile

    2017-12-01

    With several ricin contamination incidents reported over the past decade, rapid and accurate methods are needed for environmental sample analysis, especially after decontamination. A sample processing method was developed for common surface sampling devices to improve the limit of detection and avoid false negative/positive results for ricin analysis. Potential assay interferents from the sample matrix (bleach residue, sample material, wetting buffer), including reference dust, were tested using a Time-Resolved Fluorescence (TRF) immunoassay. Test results suggested that the sample matrix did not cause the elevated background fluorescence sometimes observed when analyzing post-bleach decontamination samples from ricin incidents. Furthermore, sample particulates (80mg/mL Arizona Test Dust) did not enhance background fluorescence or interfere with ricin detection by TRF. These results suggested that high background fluorescence in this immunoassay could be due to labeled antibody quality and/or quantity issues. Centrifugal ultrafiltration devices were evaluated for ricin concentration as a part of sample processing. Up to 30-fold concentration of ricin was observed by the devices, which serve to remove soluble interferents and could function as the front-end sample processing step to other ricin analytical methods. The procedure has the potential to be used with a broader range of environmental sample types and with other potential interferences and to be followed by other ricin analytical methods, although additional verification studies would be required. Published by Elsevier B.V.

  14. Study of sample drilling techniques for Mars sample return missions

    NASA Technical Reports Server (NTRS)

    Mitchell, D. C.; Harris, P. T.

    1980-01-01

    To demonstrate the feasibility of acquiring various surface samples for a Mars sample return mission, the following tasks were performed: (1) design of a Mars rover-mounted drill system capable of acquiring crystalline rock cores; prediction of performance, mass, and power requirements for various size systems; and the generation of engineering drawings; (2) performance of simulated permafrost coring tests using a residual Apollo lunar surface drill; (3) design of a rock breaker system which can be used to produce small samples of rock chips from rocks which are too large to return to Earth, but too small to be cored with the Rover-mounted drill; (4) design of sample containers for the selected regolith cores, rock cores, and small particulate or rock samples; and (5) design of sample handling and transfer techniques which will be required through all phases of sample acquisition, processing, and stowage on-board the Earth return vehicle. A preliminary design of a light-weight Rover-mounted sampling scoop was also developed.

  15. How Sample Size Affects a Sampling Distribution

    ERIC Educational Resources Information Center

    Mulekar, Madhuri S.; Siegel, Murray H.

    2009-01-01

    If students are to understand inferential statistics successfully, they must have a profound understanding of the nature of the sampling distribution. Specifically, they must comprehend the determination of the expected value and standard error of a sampling distribution as well as the meaning of the central limit theorem. Many students in a high…
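
    The point about the expected value, standard error and central limit theorem can be made concrete with a short simulation (illustrative, not from the article): sample means from a skewed population concentrate around the population mean, with spread shrinking as 1/sqrt(n).

```python
import numpy as np

rng = np.random.default_rng(42)

# Sampling distribution of the mean for two sample sizes, drawn from a
# skewed (exponential, mean 1) population; theoretical SE is 1/sqrt(n).
n_reps = 10_000
for n in (5, 50):
    means = rng.exponential(1.0, size=(n_reps, n)).mean(axis=1)
    print(f"n={n:3d}: mean of sample means = {means.mean():.3f}, "
          f"standard error = {means.std(ddof=1):.3f} (theory: {1.0 / np.sqrt(n):.3f})")
```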

  16. Are most samples of animals systematically biased? Consistent individual trait differences bias samples despite random sampling.

    PubMed

    Biro, Peter A

    2013-02-01

    Sampling animals from the wild for study is something nearly every biologist has done, but despite our best efforts to obtain random samples of animals, 'hidden' trait biases may still exist. For example, consistent behavioral traits can affect trappability/catchability, independent of obvious factors such as size and gender, and these traits are often correlated with other repeatable physiological and/or life history traits. If so, systematic sampling bias may exist for any of these traits. The extent to which this is a problem, of course, depends on the magnitude of bias, which is presently unknown because the underlying trait distributions in populations are usually unknown, or unknowable. Indeed, our present knowledge about sampling bias comes from samples (not complete population censuses), which can possess bias to begin with. I had the unique opportunity to create naturalized populations of fish by seeding each of four small fishless lakes with equal densities of slow-, intermediate-, and fast-growing fish. Using sampling methods that are not size-selective, I observed that fast-growing fish were up to two-times more likely to be sampled than slower-growing fish. This indicates substantial and systematic bias with respect to an important life history trait (growth rate). If correlations between behavioral, physiological and life-history traits are as widespread as the literature suggests, then many animal samples may be systematically biased with respect to these traits (e.g., when collecting animals for laboratory use), and affect our inferences about population structure and abundance. I conclude with a discussion on ways to minimize sampling bias for particular physiological/behavioral/life-history types within animal populations.
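
    A toy simulation of the mechanism described above (the numbers are illustrative, not the study's data): when catchability increases with an individual trait such as growth rate, even sampling that is random with respect to the trapping process yields a biased sample mean.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population of individual growth rates; catchability increases with growth.
growth = rng.normal(1.0, 0.2, size=10_000)
catch_prob = 0.1 + 0.4 * (growth - growth.min()) / np.ptp(growth)

caught = rng.random(growth.size) < catch_prob
print(f"population mean growth: {growth.mean():.3f}")
print(f"sampled mean growth:    {growth[caught].mean():.3f}  (biased upward)")
```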

  17. Development of QC Procedures for Ocean Data Obtained by National Research Projects of Korea

    NASA Astrophysics Data System (ADS)

    Kim, S. D.; Park, H. M.

    2017-12-01

    To establish a data management system for ocean data obtained by national research projects of the Ministry of Oceans and Fisheries of Korea, KIOST conducted standardization and development of QC procedures. After reviewing and analyzing the existing international and domestic ocean-data standards and QC procedures, draft versions of the standards and QC procedures were prepared. The proposed standards and QC procedures were reviewed and revised by experts in the field of oceanography and academic societies several times. A technical report was produced on the standards for 25 data items and on 12 QC procedures for physical, chemical, biological and geological data items. The QC procedure for temperature and salinity data was set up by referring to the manuals published by GTSPP, ARGO and IOOS QARTOD. It consists of 16 QC tests applicable to vertical profile data and time series data obtained in real-time mode and delayed mode. Three regional range tests to inspect annual, seasonal and monthly variations were included in the procedure. Three programs were developed to calculate and provide upper and lower limits of temperature and salinity at depths from 0 to 1550 m. TS data of the World Ocean Database, ARGO, GTSPP and in-house data of KIOST were analysed statistically to calculate regional limits for the Northwest Pacific area. Based on statistical analysis, the programs calculate regional ranges using the mean and standard deviation on three kinds of grid systems (3° grid, 1° grid and 0.5° grid) and provide a recommendation. The QC procedures for 12 data items were set up during the 1st phase of the national program for data management (2012-2015) and are being applied to national research projects in the 2nd phase (2016-2019). The QC procedures will be revised by reviewing the results of QC application when the 2nd phase of the data management program is completed.
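
    A minimal sketch of a regional range test of the kind described above: per-grid-cell limits derived from the mean and standard deviation of historical data, and a flag for values falling outside them. The k-sigma multiplier, flag codes and data are illustrative assumptions, not the procedure's actual settings.

```python
import numpy as np

def regional_limits(historical_values, k=3.0):
    """Upper/lower limits for one grid cell and depth level as mean +/- k SD."""
    x = np.asarray(historical_values, dtype=float)
    mu, sigma = x.mean(), x.std(ddof=1)
    return mu - k * sigma, mu + k * sigma

def regional_range_test(value, lower, upper):
    """Simple QC flag: 1 = pass, 4 = fail."""
    return 1 if lower <= value <= upper else 4

lo, hi = regional_limits([12.1, 12.4, 11.9, 12.6, 12.2])  # e.g. temperatures at one depth
print(lo, hi, regional_range_test(15.0, lo, hi))          # 15.0 falls outside -> flag 4
```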

  18. New prior sampling methods for nested sampling - Development and testing

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
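    A minimal sketch of the "central problem" described above: drawing a new point from the prior subject to L(theta) > L*, here by simple rejection sampling. This is the naive baseline, not either of the new prior-sampling methods proposed in the paper; the toy prior and likelihood are invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_above_threshold(log_likelihood, prior_draw, log_L_star, max_tries=100_000):
    """Draw from the prior until the log-likelihood exceeds the current threshold."""
    for _ in range(max_tries):
        theta = prior_draw(rng)
        if log_likelihood(theta) > log_L_star:
            return theta
    raise RuntimeError("rejection sampling failed; constrained region too small")

# Toy example: flat prior on [-5, 5]^2, isotropic Gaussian log-likelihood.
prior_draw = lambda rng: rng.uniform(-5.0, 5.0, size=2)
log_like = lambda th: -0.5 * np.sum(th**2)
print(sample_above_threshold(log_like, prior_draw, log_L_star=-2.0))
```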

  19. AfterQC: automatic filtering, trimming, error removing and quality control for fastq data.

    PubMed

    Chen, Shifu; Huang, Tanxiao; Zhou, Yanqing; Han, Yue; Xu, Mingyan; Gu, Jia

    2017-03-14

    Some applications, especially clinical applications requiring high accuracy of sequencing data, must contend with the problems caused by unavoidable sequencing errors. Several tools have been proposed to profile sequencing quality, but few of them can quantify or correct the sequencing errors. This unmet requirement motivated us to develop AfterQC, a tool with functions to profile sequencing errors and correct most of them, plus highly automated quality control and data filtering features. Unlike most tools, AfterQC analyses the overlapping of paired sequences for pair-end sequencing data. Based on overlapping analysis, AfterQC can detect and cut adapters, and furthermore it provides a novel function to correct erroneous bases in the overlapping regions. Another new feature is the detection and visualisation of sequencing bubbles, which are commonly found on flowcell lanes and may cause sequencing errors. Besides normal per-cycle quality and base content plotting, AfterQC also provides features such as polyX filtering (a polyX is a long run of the same base X), automatic trimming, and k-mer-based strand bias profiling. For each single FastQ file or pair of FastQ files, AfterQC filters out bad reads, detects and eliminates the sequencer's bubble effects, trims reads at the front and tail, detects sequencing errors and corrects part of them, and finally outputs clean data and generates HTML reports with interactive figures. AfterQC can run in batch mode with multiprocess support; it can run on a single FastQ file, a single pair of FastQ files (for pair-end sequencing), or a folder in which all included FastQ files are processed automatically. Based on overlapping analysis, AfterQC can estimate the sequencing error rate and profile the error transform distribution. The results of our error profiling tests show that the error distribution is highly platform dependent. Much more than just another quality control (QC) tool, AfterQC is able to perform quality control, data filtering, error profiling, and base correction.
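    A minimal sketch of the idea behind overlap-based correction for pair-end reads: align the reverse complement of read 2 against read 1 and, at mismatching positions in the overlap, keep the base with the higher quality score. This illustrates the concept only; it is not AfterQC's actual algorithm, scoring, or thresholds.

```python
COMP = str.maketrans("ACGTN", "TGCAN")

def revcomp(seq):
    return seq.translate(COMP)[::-1]

def best_overlap(r1, r2rc, min_len=10, max_mismatch_frac=0.2):
    """Longest overlap (suffix of r1 vs prefix of r2rc) within the mismatch budget."""
    for L in range(min(len(r1), len(r2rc)), min_len - 1, -1):
        mism = sum(a != b for a, b in zip(r1[-L:], r2rc[:L]))
        if mism <= max_mismatch_frac * L:
            return L, mism
    return 0, 0

def correct_overlap(r1, q1, r2rc, q2rc, L):
    """At mismatching overlap positions, keep the base with the higher quality score.
    q2rc is the quality string of read 2, reversed to match r2rc."""
    r1, r2rc = list(r1), list(r2rc)
    off = len(r1) - L
    for i in range(L):
        if r1[off + i] != r2rc[i]:
            if ord(q1[off + i]) >= ord(q2rc[i]):
                r2rc[i] = r1[off + i]
            else:
                r1[off + i] = r2rc[i]
    return "".join(r1), "".join(r2rc)

r1 = "ACGTACGTTTGACCAGT"
r2rc = "TTTGACCAGTAACCGG"      # reverse-complemented mate; overlaps the last 10 bases of r1
print(best_overlap(r1, r2rc))  # (10, 0)
```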

  20. Sample Transport for a European Sample Curation Facility

    NASA Astrophysics Data System (ADS)

    Berthoud, L.; Vrublevskis, J. B.; Bennett, A.; Pottage, T.; Bridges, J. C.; Holt, J. M. C.; Dirri, F.; Longobardo, A.; Palomba, E.; Russell, S.; Smith, C.

    2018-04-01

    This work has looked at the recovery of a Mars Sample Return capsule once it arrives on Earth. It covers possible landing sites, planetary protection requirements, and transportation from the landing site to a European Sample Curation Facility.

  1. Mars sample return: Site selection and sample acquisition study

    NASA Technical Reports Server (NTRS)

    Nickle, N. (Editor)

    1980-01-01

    Various vehicle and mission options were investigated for the continued exploration of Mars; the cost of a minimum sample return mission was estimated; options and concepts were synthesized into program possibilities; and recommendations for the next Mars mission were made to the Planetary Program office. Specific sites and all relevant spacecraft and ground-based data were studied in order to determine: (1) the adequacy of presently available data for identifying landing sites for a sample return mission that would assure the acquisition of material from the most important geologic provinces of Mars; (2) the degree of surface mobility required to assure sample acquisition for these sites; (3) techniques to be used in the selection and drilling of rock samples; and (4) the degree of mobility required at the two Viking sites to acquire these samples.

  2. Development of Sample Verification System for Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Trebi-Ollennu, Ashitey; Manohara, Harish

    2011-01-01

    This paper describes the development of a proof-of-concept sample verification system (SVS) for in-situ mass measurement of planetary rock and soil samples in future robotic sample return missions. Our proof-of-concept SVS device contains a 10 cm diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample. The membrane is positioned in proximity to an opposing substrate with a narrow gap. Deformation of the membrane narrows the gap, resulting in increased capacitance between the two nearly parallel plates. Capacitance readout circuitry on a nearby printed circuit board (PCB) transmits data via a low-voltage differential signaling (LVDS) interface. The fabricated SVS proof-of-concept device has successfully demonstrated a capacitance change of approximately 1 pF/gram.
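    A small illustration of how a capacitive readout like the one described above can be converted to sample mass. The 1 pF/gram sensitivity comes from the abstract; the baseline capacitance, ideal parallel-plate geometry, and assumed linearity are illustrative assumptions made only for this sketch.

```python
EPS0 = 8.854e-12          # vacuum permittivity, F/m
AREA = 3.1416 * 0.05**2   # 10 cm diameter membrane -> ~7.85e-3 m^2

def parallel_plate_capacitance(gap_m):
    """Ideal parallel-plate capacitance for the membrane/substrate pair, in farads."""
    return EPS0 * AREA / gap_m

def mass_from_capacitance(c_loaded_pF, c_empty_pF, sensitivity_pF_per_g=1.0):
    """Estimate accumulated sample mass (grams) from the measured capacitance change."""
    return (c_loaded_pF - c_empty_pF) / sensitivity_pF_per_g

# e.g. a 0.5 mm nominal gap gives ~139 pF baseline; a +12 pF shift reads as ~12 g.
print(parallel_plate_capacitance(0.5e-3) * 1e12, mass_from_capacitance(151.0, 139.0))
```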

  3. A simple vibrating sample magnetometer for macroscopic samples

    NASA Astrophysics Data System (ADS)

    Lopez-Dominguez, V.; Quesada, A.; Guzmán-Mínguez, J. C.; Moreno, L.; Lere, M.; Spottorno, J.; Giacomone, F.; Fernández, J. F.; Hernando, A.; García, M. A.

    2018-03-01

    We present a simple vibrating sample magnetometer (VSM) design. The system records magnetization curves at room temperature with a resolution on the order of 0.01 emu and is appropriate for macroscopic samples. The setup can be mounted in different configurations depending on the requirements of the sample to be measured (mass, saturation magnetization, saturation field, etc.). We also include examples of curves obtained with our setup and comparison curves measured with a standard commercial VSM, which confirm the reliability of our device.

  4. Guidelines and sample protocol for sampling forest gaps.

    Treesearch

    J.R. Runkle

    1992-01-01

    A protocol for sampling forest canopy gaps is presented. Methods used in published gap studies are reviewed. The sample protocol will be useful in developing a broader understanding of forest structure and dynamics through comparative studies across different forest ecosystems.

  5. A Sample Handling System for Mars Sample Return - Design and Status

    NASA Astrophysics Data System (ADS)

    Allouis, E.; Renouf, I.; Deridder, M.; Vrancken, D.; Gelmi, R.; Re, E.

    2009-04-01

    A mission to return atmosphere and soil samples from Mars is highly desired by planetary scientists from around the world, and space agencies are starting preparation for the launch of a sample return mission in the 2020 timeframe. Such a mission would return approximately 500 grams of atmosphere, rock, and soil samples to Earth by 2025. Development of a wide range of new technology will be critical to the successful implementation of such a challenging mission. Technical developments required to realise the mission include guided atmospheric entry, soft landing, sample handling robotics, biological sealing, Mars atmospheric ascent, sample rendezvous and capture, and Earth return. The European Space Agency has been performing system definition studies along with numerous technology development studies under the framework of the Aurora programme. Within the scope of these activities, Astrium has been responsible for defining an overall sample handling architecture in collaboration with European partners (sample acquisition and sample capture, Galileo Avionica; sample containment and automated bio-sealing, Verhaert). Our work has focused on the definition and development of the robotic systems required to move the sample through the transfer chain. This paper presents the Astrium team's high-level design for the surface transfer system and the orbiter transfer system. The surface transfer system is envisaged to use two robotic arms of different sizes to allow flexible operations and to enable sample transfer over relatively large distances (~2 to 3 metres): the first to deploy/retract the Drill Assembly used for sample collection, and the second to transfer the Sample Container (the vessel containing all the collected samples) from the Drill Assembly to the Mars Ascent Vehicle (MAV). The sample transfer actuator also features a complex end-effector for handling the Sample Container. The orbiter transfer system will transfer the Sample Container from the capture

  6. Rain sampling device

    DOEpatents

    Nelson, Danny A.; Tomich, Stanley D.; Glover, Donald W.; Allen, Errol V.; Hales, Jeremy M.; Dana, Marshall T.

    1991-01-01

    The present invention constitutes a rain sampling device adapted for independent operation at locations remote from the user which allows rainfall to be sampled in accordance with any schedule desired by the user. The rain sampling device includes a mechanism for directing wet precipitation into a chamber, a chamber for temporarily holding the precipitation during the process of collection, a valve mechanism for controllably releasing samples of said precipitation from said chamber, a means for distributing the samples released from the holding chamber into vessels adapted for permanently retaining these samples, and an electrical mechanism for regulating the operation of the device.

  7. 46. VIEW OF SAMPLING ROOM FROM SOUTHEAST. TO LEFT, SAMPLING ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    46. VIEW OF SAMPLING ROOM FROM SOUTHEAST. TO LEFT, SAMPLING ELEVATOR AND IN CENTER, SAMPLE BINS WITH DISCHARGE CHUTE AND THREE LABELS. - Bald Mountain Gold Mill, Nevada Gulch at head of False Bottom Creek, Lead, Lawrence County, SD

  8. Global Unique Identification of Geoscience Samples: The International Geo Sample Number (IGSN) and the System for Earth Sample Registration (SESAR)

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Goldstein, S. L.; Vinayagamoorthy, S.; Lenhardt, W. C.

    2005-12-01

    Data on samples represent a primary foundation of Geoscience research across disciplines, ranging from the study of climate change, to biogeochemical cycles, to mantle and continental dynamics and are key to our knowledge of the Earth's dynamical systems and evolution. Different data types are generated for individual samples by different research groups, published in different papers, and stored in different databases on a global scale. The utility of these data is critically dependent on their integration. Such integration can be achieved within a Geoscience Cyberinfrastructure, but requires unambiguous identification of samples. Currently, naming of samples is arbitrary and inconsistent and therefore severely limits our ability to share, link, and integrate sample-based data. Major problems include name duplication, and changing of names as a sample is passed along over many years to different investigators. SESAR, the System for Earth Sample Registration (http://www.geosamples.org), addresses this problem by building a registry that generates and administers globally unique identifiers for Geoscience samples: the International Geo Sample Number (IGSN). Implementation of the IGSN in data publication and digital data management will dramatically advance interoperability among information systems for sample-based data, opening an extensive range of new opportunities for discovery and for interdisciplinary approaches in research. The IGSN will also facilitate the ability of investigators to build on previously collected data on samples as new measurements are made or new techniques are developed. With potentially broad application to all types of Geoscience samples, SESAR is global in scope. It is a web-based system that can be easily accessed by individual users through an interactive web interface and by distributed client systems via standard web services. Samples can be registered individually or in batches and at various levels of granularity from entire cores

  9. QA/QC Guidance for Sampling and Analysis of Sediments, Water, and Tissues for Dredged Material Evaluations: Chemical Evaluations

    DTIC Science & Technology

    1995-04-01

    J. Biochem. Physiol. 37:911-917. Bloom, N.S., E.A. Crecelius, and S. Berman. 1983. Determination of mercury in seawater at sub-nanogram per liter... procedure for determination of trace metals in seawater by atomic absorption spectrometry with electrothermal atomization. Anal. Chim. Acta 98:47-55... Nakashima, S., R.E. Sturgeon, S.N. Willie, and S.S. Berman. 1988. Acid digestion of marine samples for trace element analysis using microwave heating

  10. Sparsely sampling the sky: Regular vs. random sampling

    NASA Astrophysics Data System (ADS)

    Paykari, P.; Pires, S.; Starck, J.-L.; Jaffe, A. H.

    2015-09-01

    Aims: The next generation of galaxy surveys, aiming to observe millions of galaxies, are expensive both in time and money. This raises questions regarding the optimal investment of this time and money for future surveys. In a previous work, we have shown that a sparse sampling strategy could be a powerful substitute for the - usually favoured - contiguous observation of the sky. In our previous paper, regular sparse sampling was investigated, where the sparse observed patches were regularly distributed on the sky. The regularity of the mask introduces a periodic pattern in the window function, which induces periodic correlations at specific scales. Methods: In this paper, we use a Bayesian experimental design to investigate a "random" sparse sampling approach, where the observed patches are randomly distributed over the total sparsely sampled area. Results: We find that in this setting, the induced correlation is evenly distributed amongst all scales as there is no preferred scale in the window function. Conclusions: This is desirable when we are interested in any specific scale in the galaxy power spectrum, such as the matter-radiation equality scale. As the figure of merit shows, however, there is no preference between regular and random sampling to constrain the overall galaxy power spectrum and the cosmological parameters.

  11. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today. Nonetheless, the engineers and

  12. Rain sampling device

    DOEpatents

    Nelson, D.A.; Tomich, S.D.; Glover, D.W.; Allen, E.V.; Hales, J.M.; Dana, M.T.

    1991-05-14

    The present invention constitutes a rain sampling device adapted for independent operation at locations remote from the user which allows rainfall to be sampled in accordance with any schedule desired by the user. The rain sampling device includes a mechanism for directing wet precipitation into a chamber, a chamber for temporarily holding the precipitation during the process of collection, a valve mechanism for controllably releasing samples of the precipitation from the chamber, a means for distributing the samples released from the holding chamber into vessels adapted for permanently retaining these samples, and an electrical mechanism for regulating the operation of the device. 11 figures.

  13. Applied Survey Sampling

    ERIC Educational Resources Information Center

    Blair, Edward; Blair, Johnny

    2015-01-01

    Written for students and researchers who wish to understand the conceptual and practical aspects of sampling, this book is designed to be accessible without requiring advanced statistical training. It covers a wide range of topics, from the basics of sampling to special topics such as sampling rare populations, sampling organizational populations,…

  14. Lunar Sample Compendium

    NASA Technical Reports Server (NTRS)

    Meyer, C.

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of what has been learned from the study of Apollo and Luna samples of the Moon. Basic information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. Information presented is carefully attributed to the original source publication, thus the Compendium also serves as ready access to the now vast scientific literature pertaining to lunar samples. The Lunar Sample Compendium is a work in progress (and may always be). Future plans include: adding sections on additional samples, adding new thin section photomicrographs, replacing the faded photographs with newly digitized photos from the original negatives, attempting to correct the age data using modern decay constants, adding references to each section, and adding an internal search engine.

  15. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem: A Description of the Sampling Functionality

    NASA Astrophysics Data System (ADS)

    Jandura, L.; Burke, K.; Kennedy, B.; Melko, J.; Okon, A.; Sunshine, D.

    2009-12-01

    The Sample Acquisition/Sample Processing and Handling (SA/SPaH) subsystem for the Mars Science Laboratory (MSL) is a rover-based sampling system scheduled to launch in 2011. The SA/SPaH consists of a powdering drill and a scooping, sieving, and portioning device mounted on a turret at the end of a robotic arm. Also on the turret are a dust removal tool for clearing the surface of scientific targets, and two science instruments mounted on vibration isolators. The SA/SPaH can acquire powder from rocks at depths of 20 to 50 mm and can also pick up loose regolith with its scoop. The acquired sample is sieved, portioned, and delivered to one of two instruments inside the rover for analysis. The functionality of the system will be described along with the targets the system can acquire and the sample that can be delivered. (Figure: top view of the SA/SPaH on the rover.)

  16. Comparison of chain sampling plans with single and double sampling plans

    NASA Technical Reports Server (NTRS)

    Stephens, K. S.; Dodge, H. F.

    1976-01-01

    The efficiency of chain sampling is examined through matching of operating characteristics (OC) curves of chain sampling plans (ChSP) with single and double sampling plans. In particular, the operating characteristics of some ChSP-0, 3 and 1, 3 as well as ChSP-0, 4 and 1, 4 are presented, where the number pairs represent the first and the second cumulative acceptance numbers. The fact that the ChSP procedure uses cumulative results from two or more samples and that the parameters can be varied to produce a wide variety of operating characteristics raises the question whether it may be possible for such plans to provide a given protection with less inspection than with single or double sampling plans. The operating ratio values reported illustrate the possibilities of matching single and double sampling plans with ChSP. It is shown that chain sampling plans provide improved efficiency over single and double sampling plans having substantially the same operating characteristics.
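    A brief sketch of an operating-characteristic (OC) comparison in the spirit of the study above, using the classic single-stage ChSP-1 chain sampling plan alongside a single sampling plan. The two-stage ChSP-0,k / 1,k plans in the abstract are more elaborate; the n, c, and i values below are illustrative parameters, not those of the paper.

```python
from math import comb

def binom_pmf(d, n, p):
    return comb(n, d) * p**d * (1.0 - p)**(n - d)

def oc_single(p, n, c):
    """P(accept) for a single sampling plan (n, c) at process fraction defective p."""
    return sum(binom_pmf(d, n, p) for d in range(c + 1))

def oc_chsp1(p, n, i):
    """P(accept) for ChSP-1: accept on 0 defectives, or on 1 defective provided the
    preceding i samples each had 0 defectives."""
    p0 = binom_pmf(0, n, p)
    p1 = binom_pmf(1, n, p)
    return p0 + p1 * p0**i

for p in (0.01, 0.02, 0.05, 0.10):
    print(f"p={p:.2f}  single(n=20,c=0): {oc_single(p, 20, 0):.3f}"
          f"  ChSP-1(n=20,i=2): {oc_chsp1(p, 20, 2):.3f}")
```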

  17. Compressive sampling of polynomial chaos expansions: Convergence analysis and sampling strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hampton, Jerrad; Doostan, Alireza, E-mail: alireza.doostan@colorado.edu

    2015-01-01

    Sampling orthogonal polynomial bases via Monte Carlo is of interest for uncertainty quantification of models with random inputs, using Polynomial Chaos (PC) expansions. It is known that bounding a probabilistic parameter, referred to as coherence, yields a bound on the number of samples necessary to identify coefficients in a sparse PC expansion via solution to an ℓ{sub 1}-minimization problem. Utilizing results for orthogonal polynomials, we bound the coherence parameter for polynomials of Hermite and Legendre type under their respective natural sampling distribution. In both polynomial bases we identify an importance sampling distribution which yields a bound with weaker dependence on the order of the approximation. For more general orthonormal bases, we propose the coherence-optimal sampling: a Markov Chain Monte Carlo sampling, which directly uses the basis functions under consideration to achieve a statistical optimality among all sampling schemes with identical support. We demonstrate these different sampling strategies numerically in both high-order and high-dimensional, manufactured PC expansions. In addition, the quality of each sampling method is compared in the identification of solutions to two differential equations, one with a high-dimensional random input and the other with a high-order PC expansion. In both cases, the coherence-optimal sampling scheme leads to similar or considerably improved accuracy.
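    A minimal sketch of the basic setting above: recovering a sparse Hermite polynomial chaos (PC) expansion from Monte Carlo samples via ℓ1-regularized regression. Lasso stands in for the ℓ1-minimization solver, and the natural (standard normal) sampling is used; the coherence-optimal MCMC sampling of the paper is not reproduced here, and the dimensions and sparsity are illustrative.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
order, n_samples = 10, 60

# Orthonormal probabilists' Hermite basis evaluated at standard-normal sample points.
x = rng.standard_normal(n_samples)
Psi = np.column_stack([
    hermeval(x, np.eye(order + 1)[j]) / np.sqrt(factorial(j)) for j in range(order + 1)
])

# A sparse "true" expansion plus a little noise.
c_true = np.zeros(order + 1)
c_true[[0, 2, 7]] = [1.0, 0.5, -0.8]
y = Psi @ c_true + 0.01 * rng.standard_normal(n_samples)

c_hat = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50_000).fit(Psi, y).coef_
print(np.round(c_hat, 3))
```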

  18. Fluid sampling tool

    DOEpatents

    Garcia, Anthony R.; Johnston, Roger G.; Martinez, Ronald K.

    2000-01-01

    A fluid-sampling tool for obtaining a fluid sample from a container. When used in combination with a rotatable drill, the tool bores a hole into a container wall, withdraws a fluid sample from the container, and seals the borehole. The tool collects fluid sample without exposing the operator or the environment to the fluid or to wall shavings from the container.

  19. Sampling Designs in Qualitative Research: Making the Sampling Process More Public

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Leech, Nancy L.

    2007-01-01

    The purpose of this paper is to provide a typology of sampling designs for qualitative researchers. We introduce the following sampling strategies: (a) parallel sampling designs, which represent a body of sampling strategies that facilitate credible comparisons of two or more different subgroups that are extracted from the same levels of study;…

  20. Sample size of the reference sample in a case-augmented study.

    PubMed

    Ghosh, Palash; Dewanji, Anup

    2017-05-01

    The case-augmented study, in which a case sample is augmented with a reference (random) sample from the source population with only covariates information known, is becoming popular in different areas of applied science such as pharmacovigilance, ecology, and econometrics. In general, the case sample is available from some source (for example, hospital database, case registry, etc.); however, the reference sample is required to be drawn from the corresponding source population. The required minimum size of the reference sample is an important issue in this regard. In this work, we address the minimum sample size calculation and discuss related issues. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Drilling, sampling, and sample-handling system for China's asteroid exploration mission

    NASA Astrophysics Data System (ADS)

    Zhang, Tao; Zhang, Wenming; Wang, Kang; Gao, Sheng; Hou, Liang; Ji, Jianghui; Ding, Xilun

    2017-08-01

    Asteroid exploration is of significant importance in advancing our understanding of the solar system and the origin of life on Earth. A unique opportunity to study the near-Earth asteroid 99942 Apophis will occur in 2029, when it makes its closest approach to Earth. In the current work, a drilling, sampling, and sample-handling system (DSSHS) is proposed to penetrate the asteroid regolith, collect regolith samples at different depths, and distribute the samples to different scientific instruments for in situ analysis. The system employs a rotary-drilling method for penetration and an inner sampling tube to collect and discharge the regolith samples. The sampling tube can deliver samples of up to 84 mm³ in volume, from a maximum penetration depth of 300 mm, to 17 different ovens. To release volatile substances, the samples are heated to 600 °C in the ovens, and the released substances are analyzed by scientific instruments such as a mass spectrometer, an isotopic analyzer, and micro-cameras, among others. The DSSHS is capable of penetrating rock with a hardness value of six, and it can be used for China's asteroid exploration mission in the foreseeable future.

  2. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... rates or total flow sampled into a batch sampling system over a test interval. You may use the... rates or total raw exhaust flow over a test interval. (b) Component requirements. We recommend that you... averaging Pitot tube, or a hot-wire anemometer. Note that your overall system for measuring sample flow must...

  3. Replicating studies in which samples of participants respond to samples of stimuli.

    PubMed

    Westfall, Jacob; Judd, Charles M; Kenny, David A

    2015-05-01

    In a direct replication, the typical goal is to reproduce a prior experimental result with a new but comparable sample of participants in a high-powered replication study. Often in psychology, the research to be replicated involves a sample of participants responding to a sample of stimuli. In replicating such studies, we argue that the same criteria should be used in sampling stimuli as are used in sampling participants. Namely, a new but comparable sample of stimuli should be used to ensure that the original results are not due to idiosyncrasies of the original stimulus sample, and the stimulus sample must often be enlarged to ensure high statistical power. In support of the latter point, we discuss the fact that in experiments involving samples of stimuli, statistical power typically does not approach 1 as the number of participants goes to infinity. As an example of the importance of sampling new stimuli, we discuss the bygone literature on the risky shift phenomenon, which was almost entirely based on a single stimulus sample that was later discovered to be highly unrepresentative. We discuss the use of both resampled and expanded stimulus sets, that is, stimulus samples that include the original stimuli plus new stimuli. © The Author(s) 2015.
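    A small illustration of the power point made above: when stimuli are treated as a random sample, the stimulus-variance term in the standard error does not shrink as the number of participants grows, so power plateaus below 1. The approximate variance decomposition, variance components, and effect size below are invented for the illustration and are not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

def power_crossed(d, n_participants, m_stimuli, var_p=1.0, var_s=0.5, var_e=1.0,
                  alpha=0.05):
    """Approximate power for a condition difference d when both participants and
    stimuli are sampled (all variance components on the same arbitrary scale)."""
    se = np.sqrt(2 * var_p / n_participants
                 + 2 * var_s / m_stimuli
                 + 2 * var_e / (n_participants * m_stimuli))
    z = norm.ppf(1 - alpha / 2)
    return 1 - norm.cdf(z - d / se) + norm.cdf(-z - d / se)

# Power saturates well below 1 as participants increase with only 8 stimuli:
for n in (20, 100, 1000, 100000):
    print(n, round(power_crossed(d=0.5, n_participants=n, m_stimuli=8), 3))
```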

  4. Adaptive web sampling.

    PubMed

    Thompson, Steven K

    2006-12-01

    A flexible class of adaptive sampling designs is introduced for sampling in network and spatial settings. In the designs, selections are made sequentially with a mixture distribution based on an active set that changes as the sampling progresses, using network or spatial relationships as well as sample values. The new designs have certain advantages compared with previously existing adaptive and link-tracing designs, including control over sample sizes and of the proportion of effort allocated to adaptive selections. Efficient inference involves averaging over sample paths consistent with the minimal sufficient statistic. A Markov chain resampling method makes the inference computationally feasible. The designs are evaluated in network and spatial settings using two empirical populations: a hidden human population at high risk for HIV/AIDS and an unevenly distributed bird population.

  5. Point-Sampling and Line-Sampling Probability Theory, Geometric Implications, Synthesis

    Treesearch

    L.R. Grosenbaugh

    1958-01-01

    Foresters concerned with measuring tree populations on definite areas have long employed two well-known methods of representative sampling. In list or enumerative sampling the entire tree population is tallied with a known proportion being randomly selected and measured for volume or other variables. In area sampling all trees on randomly located plots or strips...

  6. Electrophoretic sample insertion. [device for uniformly distributing samples in flow path

    NASA Technical Reports Server (NTRS)

    Mccreight, L. R. (Inventor)

    1974-01-01

    Two conductive screens located in the flow path of an electrophoresis sample separation apparatus are charged electrically. The sample is introduced between the screens, and the charge is sufficient to disperse and hold the samples across the screens. When the charge is terminated, the samples are uniformly distributed in the flow path. Additionally, a first separation by charged properties has been accomplished.

  7. Handbook for Sampling and Sample Preservation of Water and Wastewater

    DTIC Science & Technology

    1992-05-01

    integers from 1 to N. Σ(i=1 to N) Xi = X1 + X2 + X3 + ... + XN. In the above example (from Table 4.1), X1 = 35.8, X2 = 33.0, ..., XN = X52 = 32.4. ... X(1) = X1 = 35.8, X(2) = X2 = 33.0, ..., X(52) = X52 = 32.4. X(t) is the linear trend. X'(t) is the random component. In this case, the trend can be

  8. Validation of Statistical Sampling Algorithms in Visual Sample Plan (VSP): Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nuffer, Lisa L; Sego, Landon H.; Wilson, John E.

    2009-02-18

    The U.S. Department of Homeland Security, Office of Technology Development (OTD) contracted with a set of U.S. Department of Energy national laboratories, including the Pacific Northwest National Laboratory (PNNL), to write a Remediation Guidance for Major Airports After a Chemical Attack. The report identifies key activities and issues that should be considered by a typical major airport following an incident involving release of a toxic chemical agent. Four experimental tasks were identified that would require further research in order to supplement the Remediation Guidance. One of the tasks, Task 4, OTD Chemical Remediation Statistical Sampling Design Validation, dealt with statistical sampling algorithm validation. This report documents the results of the sampling design validation conducted for Task 4. In 2005, the Government Accountability Office (GAO) performed a review of the past U.S. responses to Anthrax terrorist cases. Part of the motivation for this PNNL report was a major GAO finding that there was a lack of validated sampling strategies in the U.S. response to Anthrax cases. The report (GAO 2005) recommended that probability-based methods be used for sampling design in order to address confidence in the results, particularly when all sample results showed no remaining contamination. The GAO also expressed a desire that the methods be validated, which is the main purpose of this PNNL report. The objective of this study was to validate probability-based statistical sampling designs and the algorithms pertinent to within-building sampling that allow the user to prescribe or evaluate confidence levels of conclusions based on data collected as guided by the statistical sampling designs. Specifically, the designs found in the Visual Sample Plan (VSP) software were evaluated. VSP was used to calculate the number of samples and the sample location for a variety of sampling plans applied to an actual release site. Most of the sampling designs
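    A minimal sketch of one common probability-based calculation of the kind VSP supports: how many clean samples are needed to state, with confidence C, that no more than a fraction p of a decision area is contaminated when every sample comes back clean. This is a generic acceptance-sampling formula, not necessarily the exact algorithm validated in the report; the confidence and fraction values are illustrative.

```python
from math import ceil, log

def n_for_clean_sweep(confidence=0.95, contaminated_fraction=0.01):
    """Smallest n with (1 - p)^n <= 1 - C, i.e. n >= ln(1 - C) / ln(1 - p)."""
    return ceil(log(1.0 - confidence) / log(1.0 - contaminated_fraction))

print(n_for_clean_sweep(0.95, 0.01))   # 299 samples
print(n_for_clean_sweep(0.99, 0.05))   # 90 samples
```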

  9. Sampling Development

    ERIC Educational Resources Information Center

    Adolph, Karen E.; Robinson, Scott R.

    2011-01-01

    Research in developmental psychology requires sampling at different time points. Accurate depictions of developmental change provide a foundation for further empirical studies and theories about developmental mechanisms. However, overreliance on widely spaced sampling intervals in cross-sectional and longitudinal designs threatens the validity of…

  10. Evaluation of the Biological Sampling Kit (BiSKit) for Large-Area Surface Sampling

    PubMed Central

    Buttner, Mark P.; Cruz, Patricia; Stetzenbach, Linda D.; Klima-Comba, Amy K.; Stevens, Vanessa L.; Emanuel, Peter A.

    2004-01-01

    Current surface sampling methods for microbial contaminants are designed to sample small areas and utilize culture analysis. The total number of microbes recovered is low because a small area is sampled, making detection of a potential pathogen more difficult. Furthermore, sampling of small areas requires a greater number of samples to be collected, which delays the reporting of results, taxes laboratory resources and staffing, and increases analysis costs. A new biological surface sampling method, the Biological Sampling Kit (BiSKit), designed to sample large areas and to be compatible with testing with a variety of technologies, including PCR and immunoassay, was evaluated and compared to other surface sampling strategies. In experimental room trials, wood laminate and metal surfaces were contaminated by aerosolization of Bacillus atrophaeus spores, a simulant for Bacillus anthracis, into the room, followed by settling of the spores onto the test surfaces. The surfaces were sampled with the BiSKit, a cotton-based swab, and a foam-based swab. Samples were analyzed by culturing, quantitative PCR, and immunological assays. The results showed that the large surface area (1 m2) sampled with the BiSKit resulted in concentrations of B. atrophaeus in samples that were up to 10-fold higher than the concentrations obtained with the other methods tested. A comparison of wet and dry sampling with the BiSKit indicated that dry sampling was more efficient (efficiency, 18.4%) than wet sampling (efficiency, 11.3%). The sensitivities of detection of B. atrophaeus on metal surfaces were 42 ± 5.8 CFU/m2 for wet sampling and 100.5 ± 10.2 CFU/m2 for dry sampling. These results demonstrate that the use of a sampling device capable of sampling larger areas results in higher sensitivity than that obtained with currently available methods and has the advantage of sampling larger areas, thus requiring collection of fewer samples per site. PMID:15574898

  11. Coring Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.

    2012-01-01

    A sample acquisition tool (SAT) has been developed that can be used autonomously to drill and capture rock core samples. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample in the transfer process.

  12. An integrated and accessible sample data library for Mars sample return science

    NASA Astrophysics Data System (ADS)

    Tuite, M. L., Jr.; Williford, K. H.

    2015-12-01

    Over the course of the next decade or more, many thousands of geological samples will be collected and analyzed in a variety of ways by researchers at the Jet Propulsion Laboratory (California Institute of Technology) in order to facilitate discovery and contextualize observations made of Mars rocks both in situ and here on Earth if samples are eventually returned. Integration of data from multiple analyses of samples including petrography, thin section and SEM imaging, isotope and organic geochemistry, XRF, XRD, and Raman spectrometry is a challenge and a potential obstacle to discoveries that require supporting lines of evidence. We report the development of a web-accessible repository, the Sample Data Library (SDL), for the sample-based data that are generated by the laboratories and instruments that comprise JPL's Center for Analysis of Returned Samples (CARS) in order to facilitate collaborative interpretation of potential biosignatures in Mars-analog geological samples. The SDL is constructed using low-cost, open-standards-based Amazon Web Services (AWS), including web-accessible storage, relational database services, and a virtual web server. The data structure is sample-centered with a shared registry for assigning unique identifiers to all samples, including International Geo Sample Numbers. Both raw and derived data produced by instruments and post-processing workflows are automatically uploaded to online storage and linked via the unique identifiers. Through the web interface, users are able to find all the analyses associated with a single sample or search across features shared by multiple samples, sample localities, and analysis types. Planned features include more sophisticated search and analytical interfaces as well as data discoverability through NSF's EarthCube program.

  13. Sample Return: What Happens to the Samples on Earth?

    NASA Technical Reports Server (NTRS)

    McNamara, Karen

    2010-01-01

    As space agencies throughout the world turn their attention toward human exploration of the Moon, Mars, and the solar system beyond, there has been an increase in the number of robotic sample return missions proposed as precursors to these human endeavors. In reality, however, we, as a global community, have very little experience with robotic sample return missions: 3 of the Russian Luna Missions successfully returned lunar material in the 1970s; 28 years later, in 2004, NASA's Genesis Mission returned material from the solar wind; and in 2006, NASA's Stardust Mission returned material from Comet Wild 2. [Note: The Japanese Hayabusa mission continues in space with the hope of returning material from the asteroid 25143 Itokawa.] We launch many spacecraft to LEO and return them to Earth. We also launch spacecraft beyond LEO to explore the planets, our solar system, and beyond. Some even land on these bodies. But these do not return. So as we begin to contemplate the sample return missions of the future, some common questions arise: "What really happens when the capsule returns?" "Where does it land?" "Who retrieves it and just how do they do that?" "Where does it go after that?" "How do the scientists get the samples?" "Do they keep them?" "Who is in charge?" The questions are nearly endless. The goal of this paper/presentation is to uncover many of the mysteries of the post-return phase of a mission - from the time the return body enters the atmosphere until the mission ends and the samples become part of a long-term collection. The discussion will be based largely on the author's own experience with both the Genesis and Stardust missions. Of course, these two missions have a great deal in common, being funded by the same NASA Program (Discovery) and having similar team composition. The intent, however, is to use these missions as examples in order to highlight the general requirements and the challenges in defining and meeting those requirements for the final

  14. Gaussian Boson Sampling.

    PubMed

    Hamilton, Craig S; Kruse, Regina; Sansoni, Linda; Barkhofen, Sonja; Silberhorn, Christine; Jex, Igor

    2017-10-27

    Boson sampling has emerged as a tool to explore the advantages of quantum over classical computers as it does not require universal control over the quantum system, which favors current photonic experimental platforms. Here, we introduce Gaussian Boson sampling, a classically hard-to-solve problem that uses squeezed states as a nonclassical resource. We relate the probability to measure specific photon patterns from a general Gaussian state in the Fock basis to a matrix function called the Hafnian, which answers the last remaining question of sampling from Gaussian states. Based on this result, we design Gaussian Boson sampling, a #P hard problem, using squeezed states. This demonstrates that Boson sampling from Gaussian states is possible, with significant advantages in the photon generation probability, compared to existing protocols.
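    A brief illustration of the Hafnian mentioned above, the matrix function that links Gaussian states to photon-pattern probabilities: haf(A) is the sum over all perfect matchings of the product of the matched entries. This is a brute-force recursion suitable only for small symmetric matrices of even dimension; it is not a Gaussian Boson sampling simulator.

```python
import numpy as np

def hafnian(A):
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if n == 0:
        return 1.0
    if n % 2:
        return 0.0
    total = 0.0
    # Pair index 0 with each j, then recurse on the matrix with rows/cols 0 and j removed.
    for j in range(1, n):
        keep = [k for k in range(n) if k not in (0, j)]
        total += A[0, j] * hafnian(A[np.ix_(keep, keep)])
    return total

# For a 4x4 all-ones matrix there are three perfect matchings, so haf = 3.
print(hafnian(np.ones((4, 4))))
```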

  15. Novel Sample-handling Approach for XRD Analysis with Minimal Sample Preparation

    NASA Technical Reports Server (NTRS)

    Sarrazin, P.; Chipera, S.; Bish, D.; Blake, D.; Feldman, S.; Vaniman, D.; Bryson, C.

    2004-01-01

    Sample preparation and sample handling are among the most critical operations associated with X-ray diffraction (XRD) analysis. These operations require attention in a laboratory environment, but they become a major constraint in the deployment of XRD instruments for robotic planetary exploration. We are developing a novel sample handling system that dramatically relaxes the constraints on sample preparation by allowing characterization of coarse-grained material that would normally be impossible to analyze with conventional powder-XRD techniques.

  16. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Tonsina area, Valdez Quadrangle, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 128 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Tonsina area in the Chugach Mountains, Valdez quadrangle, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies

  17. Suitability of selected free-gas and dissolved-gas sampling containers for carbon isotopic analysis.

    PubMed

    Eby, P; Gibson, J J; Yi, Y

    2015-07-15

    Storage trials were conducted for 2 to 3 months using a hydrocarbon and carbon dioxide gas mixture with known carbon isotopic composition to simulate typical hold times for gas samples prior to isotopic analysis. A range of containers (both pierced and unpierced) was periodically sampled to test for δ(13)C isotopic fractionation. Seventeen containers were tested for free-gas storage (20°C, 1 atm pressure) and 7 containers were tested for dissolved-gas storage, the latter prepared by bubbling free gas through tap water until saturated (20°C, 1 atm) and then preserved to avoid biological activity by acidifying to pH 2 with phosphoric acid and stored in the dark at 5°C. Samples were extracted using valves or by piercing septa, and then introduced into an isotope ratio mass spectrometer for compound-specific δ(13)C measurements. For free gas, stainless steel canisters and crimp-top glass serum bottles with butyl septa were most effective at preventing isotopic fractionation (pierced and unpierced), whereas silicone and PTFE-butyl septa allowed significant isotopic fractionation. FlexFoil and Tedlar bags were found to be effective only for storage of up to 1 month. For dissolved gas, crimp-top glass serum bottles with butyl septa were again effective, whereas silicone and PTFE-butyl were not. FlexFoil bags were reliable for up to 2 months. Our results suggest a range of preferred containers as well as several that did not perform very well for isotopic analysis. Overall, the results help establish better QA/QC procedures to avoid isotopic fractionation when storing environmental gas samples. Recommended containers for air transportation include steel canisters and glass serum bottles with butyl septa (pierced and unpierced). Copyright © 2015 John Wiley & Sons, Ltd.

  18. Introducing sampling entropy in repository based adaptive umbrella sampling

    NASA Astrophysics Data System (ADS)

    Zheng, Han; Zhang, Yingkai

    2009-12-01

    Determining free energy surfaces along chosen reaction coordinates is a common and important task in simulating complex systems. Due to the complexity of energy landscapes and the existence of high barriers, one widely pursued objective to develop efficient simulation methods is to achieve uniform sampling among thermodynamic states of interest. In this work, we have demonstrated sampling entropy (SE) as an excellent indicator for uniform sampling as well as for the convergence of free energy simulations. By introducing SE and the concentration theorem into the biasing-potential-updating scheme, we have further improved the adaptivity, robustness, and applicability of our recently developed repository based adaptive umbrella sampling (RBAUS) approach [H. Zheng and Y. Zhang, J. Chem. Phys. 128, 204106 (2008)]. Besides simulations of one dimensional free energy profiles for various systems, the generality and efficiency of this new RBAUS-SE approach have been further demonstrated by determining two dimensional free energy surfaces for the alanine dipeptide in gas phase as well as in water.
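    A minimal sketch of a sampling-entropy-style uniformity indicator as described above: the Shannon entropy of the histogram of visited states along the reaction coordinate, normalized so that 1.0 corresponds to perfectly uniform sampling. The binning and normalization are illustrative choices, not necessarily those of the RBAUS-SE paper.

```python
import numpy as np

def sampling_entropy(samples, bins=50, range_=None):
    counts, _ = np.histogram(samples, bins=bins, range=range_)
    p = counts / counts.sum()
    p = p[p > 0]                           # treat 0 * log(0) as 0
    return -np.sum(p * np.log(p)) / np.log(bins)

rng = np.random.default_rng(2)
print(sampling_entropy(rng.uniform(0, 1, 10000), range_=(0, 1)))   # close to 1: near-uniform
print(sampling_entropy(rng.beta(8, 2, 10000), range_=(0, 1)))      # noticeably lower
```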

  19. Variance Estimation, Design Effects, and Sample Size Calculations for Respondent-Driven Sampling

    PubMed Central

    2006-01-01

    Hidden populations, such as injection drug users and sex workers, are central to a number of public health problems. However, because of the nature of these groups, it is difficult to collect accurate information about them, and this difficulty complicates disease prevention efforts. A recently developed statistical approach called respondent-driven sampling improves our ability to study hidden populations by allowing researchers to make unbiased estimates of the prevalence of certain traits in these populations. Yet, not enough is known about the sample-to-sample variability of these prevalence estimates. In this paper, we present a bootstrap method for constructing confidence intervals around respondent-driven sampling estimates and demonstrate in simulations that it outperforms the naive method currently in use. We also use simulations and real data to estimate the design effects for respondent-driven sampling in a number of situations. We conclude with practical advice about the power calculations that are needed to determine the appropriate sample size for a study using respondent-driven sampling. In general, we recommend a sample size twice as large as would be needed under simple random sampling. PMID:16937083
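    A short sketch of the arithmetic behind the closing recommendation above: inflate a simple-random-sampling size by a design effect, taken here as 2 for respondent-driven sampling. The prevalence, margin of error, and design effect values are illustrative inputs, not results from the paper.

```python
from math import ceil
from scipy.stats import norm

def n_srs(p=0.5, margin=0.05, confidence=0.95):
    """Sample size for estimating a proportion under simple random sampling."""
    z = norm.ppf(1 - (1 - confidence) / 2)
    return ceil(z**2 * p * (1 - p) / margin**2)

def n_rds(p=0.5, margin=0.05, confidence=0.95, design_effect=2.0):
    return ceil(design_effect * n_srs(p, margin, confidence))

print(n_srs())   # 385 under simple random sampling
print(n_rds())   # 770 with a design effect of 2
```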

  20. IAEA Sampling Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, William H.

    2017-09-15

    The objectives for this presentation are to describe the method that the IAEA uses to determine a sampling plan for nuclear material measurements; describe the terms detection probability and significant quantity; list the three nuclear materials measurement types; describe the sampling method applied to an item facility; and describe multiple method sampling.
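    A hedged sketch of the sample-size logic behind an item-facility sampling plan of the kind described above: choose n so that, with detection probability P, at least one of M falsified items among N items is sampled. The closed form below is a standard safeguards approximation; the numbers are illustrative and this is not presented as the IAEA's actual procedure.

```python
from math import ceil

def items_to_sample(N, M, detection_probability=0.95):
    """Approximate n such that P(detect >= 1 of M defectives among N items) >= P."""
    return ceil(N * (1.0 - (1.0 - detection_probability) ** (1.0 / M)))

# e.g. 100 items, a significant quantity spread over 5 items, 95% detection goal:
print(items_to_sample(N=100, M=5, detection_probability=0.95))   # 46 items
```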

  1. Lunar Samples: Apollo Collection Tools, Curation Handling, Surveyor III and Soviet Luna Samples

    NASA Technical Reports Server (NTRS)

    Allton, J.H.

    2009-01-01

    The 6 Apollo missions that landed on the lunar surface returned 2196 samples comprising 382 kg. The 58 samples weighing 21.5 kg collected on Apollo 11 expanded to 741 samples weighing 110.5 kg by the time of Apollo 17. The main goal on Apollo 11 was to obtain some material and return it safely to Earth. As we gained experience, the sampling tools and a more specific sampling strategy evolved. A summary of the sample types returned is shown in Table 1. By year 1989, some statistics on allocation by sample type were compiled [2]. The "scientific interest index" is based on the assumption that the more allocations per gram of sample, the higher the scientific interest. It is basically a reflection of the amount of diversity within a given sample type. Samples were also set aside for biohazard testing. The samples set aside and used for biohazard testing were representative, as opposed to diverse. They tended to be larger and to be comprised of less scientifically valuable material, such as dust and debris in the bottom of sample containers.

  2. Effects of within-Class Differences in Sample Responding on Acquired Sample Equivalence

    ERIC Educational Resources Information Center

    Urcuioli, Peter J.; Vasconcelos, Maarco

    2008-01-01

    Two experiments examined whether acquired sample equivalence in many-to-one matching was affected by variation in sample-response requirements. In each experiment, pigeons responded on either identical or different response schedules to the sample stimuli that occasioned the same reinforced comparison choice (i.e., to the within-class samples).…

  3. Sampling bee communities using pan traps: alternative methods increase sample size

    USDA-ARS?s Scientific Manuscript database

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  4. Fast QC-LDPC code for free space optical communication

    NASA Astrophysics Data System (ADS)

    Wang, Jin; Zhang, Qi; Udeh, Chinonso Paschal; Wu, Rangzhong

    2017-02-01

    Free Space Optical (FSO) communication systems use the atmosphere as the propagation medium, so atmospheric turbulence leads to multiplicative noise tied to the signal intensity. In order to suppress the signal fading induced by this multiplicative noise, we propose a fast Quasi-Cyclic (QC) Low-Density Parity-Check (LDPC) code for FSO communication systems. As linear block codes based on sparse matrices, LDPC codes perform extremely close to the Shannon limit. Current studies of LDPC codes in FSO communications focus mainly on the Gaussian channel and the Rayleigh channel; the code design in this study targets the atmospheric turbulence channel, which is neither Gaussian nor Rayleigh and is therefore closer to the practical situation. Based on the characteristics of the atmospheric channel, modeled with log-normal and K-distributions, we designed a special QC-LDPC code and derived the log-likelihood ratio (LLR). An irregular QC-LDPC code for fast encoding, with variable rates, is proposed in this paper. The proposed code achieves the excellent performance of LDPC codes, with high efficiency at low rates, stable behavior at high rates, and a reduced number of decoding iterations. The results of belief propagation (BP) decoding show that the bit error rate (BER) decreases markedly as the signal-to-noise ratio (SNR) increases. Therefore, LDPC channel coding can effectively improve the performance of FSO systems. Moreover, the post-decoding BER continues to fall as the SNR increases, with no error-floor effect.
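    A minimal sketch of the "quasi-cyclic" structure underlying a QC-LDPC code: the parity-check matrix H is a block matrix of Z x Z circulant permutation matrices (cyclically shifted identities) specified by a small exponent matrix. The exponent matrix and lifting size below are arbitrary toy values, not the code designed in the paper.

```python
import numpy as np

def circulant_permutation(Z, shift):
    """Z x Z identity matrix cyclically shifted right by `shift` columns."""
    return np.roll(np.eye(Z, dtype=np.uint8), shift, axis=1)

def expand_qc_ldpc(exponents, Z):
    """Expand an exponent matrix into a binary parity-check matrix H.
    Entry -1 means an all-zero block; entry s >= 0 means a shift-by-s permutation."""
    rows = []
    for row in exponents:
        blocks = [np.zeros((Z, Z), dtype=np.uint8) if s < 0 else circulant_permutation(Z, s)
                  for s in row]
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

exponents = [[0, 1, -1, 2],
             [2, -1, 0, 1]]
H = expand_qc_ldpc(exponents, Z=4)
print(H.shape)   # (8, 16): a rate-1/2 toy code
```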

  5. Decision by Sampling

    ERIC Educational Resources Information Center

    Stewart, Neil; Chater, Nick; Brown, Gordon D. A.

    2006-01-01

    We present a theory of decision by sampling (DbS) in which, in contrast with traditional models, there are no underlying psychoeconomic scales. Instead, we assume that an attribute's subjective value is constructed from a series of binary, ordinal comparisons to a sample of attribute values drawn from memory and is its rank within the sample. We…
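    A toy version of the DbS idea described above: an attribute's subjective value is its relative rank among a sample of comparison values retrieved from memory. The memory sample here is invented purely for the illustration.

```python
def subjective_value(target, memory_sample):
    """Fraction of binary comparisons the target wins against the sampled values."""
    wins = sum(target > m for m in memory_sample)
    return wins / len(memory_sample)

recent_prices_paid = [2.50, 3.00, 3.20, 4.10, 5.00, 7.50]
print(subjective_value(4.50, recent_prices_paid))   # about 0.67: beats two-thirds of the sample
```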

  6. Randomized branch sampling

    Treesearch

    Harry T. Valentine

    2002-01-01

    Randomized branch sampling (RBS) is a special application of multistage probability sampling (see Sampling, environmental), which was developed originally by Jessen [3] to estimate fruit counts on individual orchard trees. In general, the method can be used to obtain estimates of many different attributes of trees or other branched plants. The usual objective of RBS is...

  7. Biological sample collector

    DOEpatents

    Murphy, Gloria A [French Camp, CA

    2010-09-07

    A biological sample collector is adapted to a collect several biological samples in a plurality of filter wells. A biological sample collector may comprise a manifold plate for mounting a filter plate thereon, the filter plate having a plurality of filter wells therein; a hollow slider for engaging and positioning a tube that slides therethrough; and a slide case within which the hollow slider travels to allow the tube to be aligned with a selected filter well of the plurality of filter wells, wherein when the tube is aligned with the selected filter well, the tube is pushed through the hollow slider and into the selected filter well to sealingly engage the selected filter well and to allow the tube to deposit a biological sample onto a filter in the bottom of the selected filter well. The biological sample collector may be portable.

  8. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus are also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  9. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  10. A Method for Choosing the Best Samples for Mars Sample Return

    PubMed Central

    Gordon, Peter R.

    2018-01-01

    Abstract Success of a future Mars Sample Return mission will depend on the correct choice of samples. Pyrolysis-FTIR can be employed as a triage instrument for Mars Sample Return. The technique can thermally dissociate minerals and organic matter for detection. Identification of certain mineral types can determine the habitability of the depositional environment, past or present, while detection of organic matter may suggest past or present habitation. In Mars' history, the Theiikian era represents an attractive target for life search missions and the acquisition of samples. The acidic and increasingly dry Theiikian may have been habitable and followed a lengthy neutral and wet period in Mars' history during which life could have originated and proliferated to achieve relatively abundant levels of biomass with a wide distribution. Moreover, the sulfate minerals produced in the Theiikian are also known to be good preservers of organic matter. We have used pyrolysis-FTIR and samples from a Mars analog ferrous acid stream with a thriving ecosystem to test the triage concept. Pyrolysis-FTIR identified those samples with the greatest probability of habitability and habitation. A three-tier scoring system was developed based on the detection of (i) organic signals, (ii) carbon dioxide and water, and (iii) sulfur dioxide. The presence of each component was given a score of A, B, or C depending on whether the substance had been detected, tentatively detected, or not detected, respectively. Single-step (for greatest possible sensitivity) or multistep (for more diagnostic data) pyrolysis-FTIR methods informed the assignments. The system allowed the highest-priority samples to be categorized as AAA (or A*AA if the organic signal was complex), while the lowest-priority samples could be categorized as CCC. Our methods provide a mechanism with which to rank samples and identify those that should take the highest priority for return to Earth during a Mars Sample Return mission

  11. A Method for Choosing the Best Samples for Mars Sample Return.

    PubMed

    Gordon, Peter R; Sephton, Mark A

    2018-05-01

    Success of a future Mars Sample Return mission will depend on the correct choice of samples. Pyrolysis-FTIR can be employed as a triage instrument for Mars Sample Return. The technique can thermally dissociate minerals and organic matter for detection. Identification of certain mineral types can determine the habitability of the depositional environment, past or present, while detection of organic matter may suggest past or present habitation. In Mars' history, the Theiikian era represents an attractive target for life search missions and the acquisition of samples. The acidic and increasingly dry Theiikian may have been habitable and followed a lengthy neutral and wet period in Mars' history during which life could have originated and proliferated to achieve relatively abundant levels of biomass with a wide distribution. Moreover, the sulfate minerals produced in the Theiikian are also known to be good preservers of organic matter. We have used pyrolysis-FTIR and samples from a Mars analog ferrous acid stream with a thriving ecosystem to test the triage concept. Pyrolysis-FTIR identified those samples with the greatest probability of habitability and habitation. A three-tier scoring system was developed based on the detection of (i) organic signals, (ii) carbon dioxide and water, and (iii) sulfur dioxide. The presence of each component was given a score of A, B, or C depending on whether the substance had been detected, tentatively detected, or not detected, respectively. Single-step (for greatest possible sensitivity) or multistep (for more diagnostic data) pyrolysis-FTIR methods informed the assignments. The system allowed the highest-priority samples to be categorized as AAA (or A*AA if the organic signal was complex), while the lowest-priority samples could be categorized as CCC. Our methods provide a mechanism with which to rank samples and identify those that should take the highest priority for return to Earth during a Mars Sample Return mission. Key Words

  12. FPGA implementation of high-performance QC-LDPC decoder for optical communications

    NASA Astrophysics Data System (ADS)

    Zou, Ding; Djordjevic, Ivan B.

    2015-01-01

    Forward error correction is one of the key technologies enabling next-generation high-speed fiber-optic communications. Quasi-cyclic (QC) low-density parity-check (LDPC) codes have been considered one of the promising candidates due to their large coding gain and low implementation complexity. In this paper, we present our designed QC-LDPC code with girth 10 and 25% overhead based on pairwise balanced design. By FPGA-based emulation, we demonstrate that the 5-bit soft-decision LDPC decoder can achieve 11.8 dB net coding gain with no error floor at a BER of 10^-15, without using any outer code or post-processing method. We believe that the proposed single QC-LDPC code is a promising solution for 400 Gb/s optical communication systems and beyond.
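
    The quasi-cyclic structure that makes such decoders hardware-friendly is easy to show generically: the parity-check matrix H is assembled from circulant permutation blocks defined by a small base matrix of shift values. The sketch below is not the paper's girth-10, pairwise-balanced-design construction; the base matrix and the lifting size z are arbitrary placeholders chosen for illustration.

      import numpy as np

      def expand_qc_ldpc(base, z):
          # Expand a QC-LDPC base matrix of circulant shifts into the binary H.
          # Entry -1 means an all-zero z-by-z block; entry s >= 0 means the
          # z-by-z identity cyclically shifted right by s columns.
          rows, cols = base.shape
          H = np.zeros((rows * z, cols * z), dtype=np.uint8)
          I = np.eye(z, dtype=np.uint8)
          for r in range(rows):
              for c in range(cols):
                  s = base[r, c]
                  if s >= 0:
                      H[r*z:(r+1)*z, c*z:(c+1)*z] = np.roll(I, s, axis=1)
          return H

      # hypothetical 2x4 base matrix with circulant size z = 8
      base = np.array([[0, 1, -1, 3],
                       [2, -1, 5, 0]])
      H = expand_qc_ldpc(base, z=8)   # 16 x 32 sparse parity-check matrix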

  13. Network Sampling with Memory: A proposal for more efficient sampling from social networks.

    PubMed

    Mouw, Ted; Verdery, Ashton M

    2012-08-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)-the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a "List" mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a "Search" mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS.
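
    One simple Monte Carlo rendering of the design effect referred to above, assuming a binary trait and a simulation setting in which the true population proportion is known; this is a generic definition, not the authors' exact estimation procedure, and the argument names are invented.

      import numpy as np

      def design_effect(replicate_estimates, true_proportion, n_per_sample):
          # DE = variance of the design's estimator across replicate samples drawn
          # with that design, divided by the variance of a simple-random-sample
          # proportion of the same size (binary outcome assumed).
          var_design = np.var(replicate_estimates, ddof=1)
          var_srs = true_proportion * (1 - true_proportion) / n_per_sample
          return var_design / var_srs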

  14. Stardust Sample: Investigator's Guidebook

    NASA Technical Reports Server (NTRS)

    Allen, Carl

    2006-01-01

    In January 2006, the Stardust spacecraft returned the first in situ collection of samples from a comet, and the first samples of contemporary interstellar dust. Stardust is the first US sample return mission from a planetary body since Apollo, and the first ever from beyond the moon. This handbook is a basic reference source for allocation procedures and policies for Stardust samples. These samples consist of particles and particle residues in aerogel collectors, in aluminum foil, and in spacecraft components. Contamination control samples and unflown collection media are also available for allocation.

  15. SAMPLING OSCILLOSCOPE

    DOEpatents

    Sugarman, R.M.

    1960-08-30

    An oscilloscope is designed for displaying transient signal waveforms having random time and amplitude distributions. The oscilloscopc is a sampling device that selects for display a portion of only those waveforms having a particular range of amplitudes. For this purpose a pulse-height analyzer is provided to screen the pulses. A variable voltage-level shifter and a time-scale rampvoltage generator take the pulse height relative to the start of the waveform. The variable voltage shifter produces a voltage level raised one step for each sequential signal waveform to be sampled and this results in an unsmeared record of input signal waveforms. Appropriate delay devices permit each sample waveform to pass its peak amplitude before the circuit selects it for display.

  16. Improved DESI-MS Performance using Edge Sampling and aRotational Sample Stage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos; Van Berkel, Gary J

    2008-01-01

    The position of the surface to be analyzed relative to the sampling orifice or capillary into the mass spectrometer has been known to dramatically affect the observed signal levels in desorption electrospray ionization mass spectrometry (DESI-MS). In analyses of sample spots on planar surfaces, DESI-MS signal intensities as much as five times greater were routinely observed when the bottom of the sampling capillary was appropriately positioned beneath the surface plane ("edge sampling") compared to when the capillary just touched the surface. To take advantage of the optimum "edge sampling" geometry and to maximize the number of samples that could be analyzed in this configuration, a rotational sample stage was integrated into a typical DESI-MS setup. The rapid quantitative determination of caffeine in two diet sport drinks (Diet Turbo Tea, Speed Stack Grape) spiked with an isotopically labeled internal standard demonstrated the utility of this approach.

  17. Lunar Sample Compendium

    NASA Technical Reports Server (NTRS)

    Meyer, Charles

    2005-01-01

    The purpose of the Lunar Sample Compendium will be to inform scientists, astronauts and the public about the various lunar samples that have been returned from the Moon. This Compendium will be organized rock by rock in the manner of a catalog, but will not be as comprehensive, nor as complete, as the various lunar sample catalogs that are available. Likewise, this Compendium will not duplicate the various excellent books and reviews on the subject of lunar samples (Cadogen 1981, Heiken et al. 1991, Papike et al. 1998, Warren 2003, Eugster 2003). However, it is thought that an online Compendium, such as this, will prove useful to scientists proposing to study individual lunar samples and should help provide backup information for lunar sample displays. This Compendium will allow easy access to the scientific literature by briefly summarizing the significant findings of each rock along with the documentation of where the detailed scientific data are to be found. In general, discussion and interpretation of the results is left to the formal reviews found in the scientific literature. An advantage of this Compendium will be that it can be updated, expanded and corrected as need be.

  18. Quality control in urodynamics and the role of software support in the QC procedure.

    PubMed

    Hogan, S; Jarvis, P; Gammie, A; Abrams, P

    2011-11-01

    This article aims to identify quality control (QC) best practice, to review published QC audits in order to identify how closely good practice is followed, and to carry out a market survey of the software features that support QC offered by urodynamics machines available in the UK. All UK distributors of urodynamic systems were contacted and asked to provide information on the software features relating to data quality of the products they supply. The results of the market survey show that the features offered by manufacturers differ greatly. Automated features, which can be turned off in most cases, include: cough recognition, detrusor contraction detection, and high pressure alerts. There are currently no systems that assess data quality based on published guidelines. A literature review of current QC guidelines for urodynamics was carried out; QC audits were included in the literature review to see how closely guidelines were being followed. This review highlights the fact that basic QC is not being carried out effectively by urodynamicists. Based on the software features currently available and the results of the literature review there is both the need and capacity for a greater degree of automation in relation to urodynamic data quality and accuracy assessment. Some progress has been made in this area and certain manufacturers have already developed automated cough detection. Copyright © 2011 Wiley Periodicals, Inc.

  19. Sampling free energy surfaces as slices by combining umbrella sampling and metadynamics.

    PubMed

    Awasthi, Shalini; Kapil, Venkat; Nair, Nisanth N

    2016-06-15

    Metadynamics (MTD) is a very powerful technique to sample high-dimensional free energy landscapes, and due to its self-guiding property, the method has been successful in studying complex reactions and conformational changes. MTD sampling is based on filling the free energy basins by biasing potentials and thus for cases with flat, broad, and unbound free energy wells, the computational time to sample them becomes very large. To alleviate this problem, we combine the standard Umbrella Sampling (US) technique with MTD to sample orthogonal collective variables (CVs) in a simultaneous way. Within this scheme, we construct the equilibrium distribution of CVs from biased distributions obtained from independent MTD simulations with umbrella potentials. Reweighting is carried out by a procedure that combines US reweighting and Tiwary-Parrinello MTD reweighting within the Weighted Histogram Analysis Method (WHAM). The approach is ideal for a controlled sampling of a CV in a MTD simulation, making it computationally efficient in sampling flat, broad, and unbound free energy surfaces. This technique also allows for a distributed sampling of a high-dimensional free energy surface, further increasing the computational efficiency in sampling. We demonstrate the application of this technique in sampling high-dimensional surface for various chemical reactions using ab initio and QM/MM hybrid molecular dynamics simulations. Further, to carry out MTD bias reweighting for computing forward reaction barriers in ab initio or QM/MM simulations, we propose a computationally affordable approach that does not require recrossing trajectories. © 2016 Wiley Periodicals, Inc.
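
    For the umbrella-sampling side of the reweighting, the standard WHAM self-consistency loop looks roughly like the sketch below. This is plain US-WHAM only; the combined Tiwary-Parrinello metadynamics reweighting described in the abstract is not reproduced here, and the array names are placeholders.

      import numpy as np

      def wham(histograms, bias, n_samples, beta, n_iter=2000, tol=1e-8):
          # Self-consistent WHAM for umbrella-sampled windows.
          # histograms: (W, B) counts per window per bin of the collective variable
          # bias:       (W, B) umbrella bias energy of window i at each bin center
          # n_samples:  (W,)   number of samples per window
          W, B = histograms.shape
          f = np.zeros(W)                                   # window free-energy shifts
          for _ in range(n_iter):
              denom = np.sum(n_samples[:, None] * np.exp(-beta * (bias - f[:, None])), axis=0)
              p = histograms.sum(axis=0) / denom            # unbiased bin probabilities
              p /= p.sum()
              f_new = -np.log(np.sum(p[None, :] * np.exp(-beta * bias), axis=1)) / beta
              if np.max(np.abs(f_new - f)) < tol:
                  f = f_new
                  break
              f = f_new
          return p, f                                       # free energy: -ln(p)/beta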

  20. The Effects of Sampling Probe Design and Sampling Techniques on Aerosol Measurements

    DTIC Science & Technology

    1975-05-01

    Fragmentary snippets only (no continuous abstract is available): "Schematic of Extraction and Sampling System"; "Filter Housing"; "Theoretical Isokinetic Flow Requirements of the EPA Sampling..."; "...from the flow parameters based on a zero-error assumption at isokinetic sampling conditions. Isokinetic, or equal-velocity, sampling was...prior to testing the probes. It was also used to measure the flow field adjacent to the probe inlets to determine the isokinetic condition of the..."

  1. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    PubMed

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. However, we try to prove its validity with populations that are not within a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The presentation of the study shows the utility of this type of sampling when the population is accessible but there is a difficulty deriving from the lack of a sampling frame. However, the sample obtained is not a random representative one in statistical terms of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates to the target population but is not identical to it.
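
    For context, the estimator most commonly paired with RDS data is the Volz-Heckathorn (RDS-II) estimator, which weights each respondent by the inverse of the personal network size they report; the abstract does not state which estimator the authors used, and the numbers below are invented.

      def rds_ii_proportion(has_trait, degrees):
          # RDS-II: weight each respondent by 1 / reported degree, then take the
          # weighted proportion with the trait of interest.
          weights = [1.0 / d for d in degrees]
          weighted_trait = sum(w for w, t in zip(weights, has_trait) if t)
          return weighted_trait / sum(weights)

      # hypothetical recruits: binary trait and self-reported network sizes
      print(rds_ii_proportion([1, 0, 1, 1, 0, 0], [12, 3, 8, 20, 5, 10]))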

  2. Developing a Hypothetical Learning Trajectory for the Sampling Distribution of the Sample Means

    NASA Astrophysics Data System (ADS)

    Syafriandi

    2018-04-01

    Sampling distributions are special types of probability distributions that are important in hypothesis testing. The concept of a sampling distribution may well be the key concept in understanding how inferential procedures work. In this paper, we will design a hypothetical learning trajectory (HLT) for the sampling distribution of the sample mean, and we will discuss how the sampling distribution is used in hypothesis testing.
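
    A classroom-style simulation of the kind such a learning trajectory typically builds on (the population, sample size, and number of replicates here are arbitrary choices for illustration):

      import numpy as np

      rng = np.random.default_rng(1)
      population = rng.exponential(scale=2.0, size=100_000)   # a skewed population
      n = 30
      means = [rng.choice(population, size=n, replace=False).mean() for _ in range(5000)]
      # By the central limit theorem the simulated sampling distribution of the mean
      # is approximately normal, centred near the population mean, with spread close
      # to (population standard deviation) / sqrt(n).
      print(np.mean(means), np.std(means), population.std() / np.sqrt(n))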

  3. Network Sampling with Memory: A proposal for more efficient sampling from social networks

    PubMed Central

    Mouw, Ted; Verdery, Ashton M.

    2013-01-01

    Techniques for sampling from networks have grown into an important area of research across several fields. For sociologists, the possibility of sampling from a network is appealing for two reasons: (1) A network sample can yield substantively interesting data about network structures and social interactions, and (2) it is useful in situations where study populations are difficult or impossible to survey with traditional sampling approaches because of the lack of a sampling frame. Despite its appeal, methodological concerns about the precision and accuracy of network-based sampling methods remain. In particular, recent research has shown that sampling from a network using a random walk based approach such as Respondent Driven Sampling (RDS) can result in high design effects (DE)—the ratio of the sampling variance to the sampling variance of simple random sampling (SRS). A high design effect means that more cases must be collected to achieve the same level of precision as SRS. In this paper we propose an alternative strategy, Network Sampling with Memory (NSM), which collects network data from respondents in order to reduce design effects and, correspondingly, the number of interviews needed to achieve a given level of statistical power. NSM combines a “List” mode, where all individuals on the revealed network list are sampled with the same cumulative probability, with a “Search” mode, which gives priority to bridge nodes connecting the current sample to unexplored parts of the network. We test the relative efficiency of NSM compared to RDS and SRS on 162 school and university networks from Add Health and Facebook that range in size from 110 to 16,278 nodes. The results show that the average design effect for NSM on these 162 networks is 1.16, which is very close to the efficiency of a simple random sample (DE=1), and 98.5% lower than the average DE we observed for RDS. PMID:24159246

  4. Curation of Frozen Samples

    NASA Technical Reports Server (NTRS)

    Fletcher, L. A.; Allen, C. C.; Bastien, R.

    2008-01-01

    NASA's Johnson Space Center (JSC) and the Astromaterials Curator are charged by NPD 7100.10D with the curation of all of NASA's extraterrestrial samples, including those from future missions. This responsibility includes the development of new sample handling and preparation techniques; therefore, the Astromaterials Curator must begin developing procedures to preserve, prepare and ship samples at sub-freezing temperatures in order to enable future sample return missions. Such missions might include the return of future frozen samples from permanently-shadowed lunar craters, the nuclei of comets, the surface of Mars, etc. We are demonstrating the ability to curate samples under cold conditions by designing, installing and testing a cold curation glovebox. This glovebox will allow us to store, document, manipulate and subdivide frozen samples while quantifying and minimizing contamination throughout the curation process.

  5. Apollo 14 rock samples

    NASA Technical Reports Server (NTRS)

    Carlson, I. C.

    1978-01-01

    Petrographic descriptions of all Apollo 14 samples larger than 1 cm in any dimension are presented. The sample description format consists of: (1) an introductory section, which includes information on lunar sample location, orientation, and return containers; (2) a section on physical characteristics, which contains the sample mass, dimensions, and a brief description; (3) surface features, including zap pits, cavities, and fractures as seen in binocular view; (4) a petrographic description, consisting of a binocular description and, if possible, a thin section description; and (5) a discussion of literature relevant to sample petrology for samples that have previously been examined by the scientific community.

  6. Liquid sampling system

    DOEpatents

    Larson, L.L.

    1984-09-17

    A conduit extends from a reservoir through a sampling station and back to the reservoir in a closed loop. A jet ejector in the conduit establishes suction for withdrawing liquid from the reservoir. The conduit has a self-healing septum therein upstream of the jet ejector for receiving one end of a double-ended cannula, the other end of which is received in a serum bottle for sample collection. Gas is introduced into the conduit at a gas bleed between the sample collection bottle and the reservoir. The jet ejector evacuates gas from the conduit and the bottle and aspirates a column of liquid from the reservoir at a high rate. When the withdrawn liquid reaches the jet ejector the rate of flow therethrough reduces substantially and the gas bleed increases the pressure in the conduit for driving liquid into the sample bottle, the gas bleed forming a column of gas behind the withdrawn liquid column and interrupting the withdrawal of liquid from the reservoir. In the case of hazardous and toxic liquids, the sample bottle and the jet ejector may be isolated from the reservoir and may be further isolated from a control station containing remote manipulation means for the sample bottle and control valves for the jet ejector and gas bleed. 5 figs.

  7. Liquid sampling system

    DOEpatents

    Larson, Loren L.

    1987-01-01

    A conduit extends from a reservoir through a sampling station and back to the reservoir in a closed loop. A jet ejector in the conduit establishes suction for withdrawing liquid from the reservoir. The conduit has a self-healing septum therein upstream of the jet ejector for receiving one end of a double-ended cannula, the other end of which is received in a serum bottle for sample collection. Gas is introduced into the conduit at a gas bleed between the sample collection bottle and the reservoir. The jet ejector evacuates gas from the conduit and the bottle and aspirates a column of liquid from the reservoir at a high rate. When the withdrawn liquid reaches the jet ejector the rate of flow therethrough reduces substantially and the gas bleed increases the pressure in the conduit for driving liquid into the sample bottle, the gas bleed forming a column of gas behind the withdrawn liquid column and interrupting the withdrawal of liquid from the reservoir. In the case of hazardous and toxic liquids, the sample bottle and the jet ejector may be isolated from the reservoir and may be further isolated from a control station containing remote manipulation means for the sample bottle and control valves for the jet ejector and gas bleed.

  8. National accident sampling system sample design, phases 2 and 3 : executive summary

    DOT National Transportation Integrated Search

    1979-11-01

    This report describes the Phase 2 and 3 sample design for the : National Accident Sampling System (NASS). It recommends a procedure : for the first-stage selection of Primary Sampling Units (PSU's) and : the second-stage design for the selection of a...

  9. A novel construction scheme of QC-LDPC codes based on the RU algorithm for optical transmission systems

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-03-01

    A novel lower-complexity construction scheme of quasi-cyclic low-density parity-check (QC-LDPC) codes for optical transmission systems is proposed based on the structure of the parity-check matrix for the Richardson-Urbanke (RU) algorithm. Furthermore, a novel irregular QC-LDPC(4 288, 4 020) code with a high code rate of 0.937 is constructed by this novel construction scheme. The simulation analyses show that the net coding gain (NCG) of the novel irregular QC-LDPC(4 288, 4 020) code is respectively 2.08 dB, 1.25 dB and 0.29 dB more than those of the classic RS(255, 239) code, the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code at a bit error rate (BER) of 10^-6. The irregular QC-LDPC(4 288, 4 020) code has lower encoding/decoding complexity compared with the LDPC(32 640, 30 592) code and the irregular QC-LDPC(3 843, 3 603) code. The proposed novel QC-LDPC(4 288, 4 020) code can be more suitable for the increasing development requirements of high-speed optical transmission systems.
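
    Net coding gain is defined from the Q-factors at the reference output BER, with a rate penalty of 10*log10(R); a sketch of that arithmetic is below. The post-FEC reference BER of 10^-6 matches the abstract, but the pre-FEC input BER of 2e-3 is an invented example, so the printed number is not the paper's 2.08 dB figure.

      import numpy as np
      from scipy.special import erfcinv

      def net_coding_gain(ber_out, ber_in, rate):
          # NCG in dB: Q needed uncoded to reach the post-FEC target, minus the Q
          # needed at the pre-FEC threshold, minus the code-rate penalty.
          q_out = np.sqrt(2) * erfcinv(2 * ber_out)
          q_in = np.sqrt(2) * erfcinv(2 * ber_in)
          return 20 * np.log10(q_out) - 20 * np.log10(q_in) + 10 * np.log10(rate)

      # hypothetical: a rate-0.937 code that turns a 2e-3 pre-FEC BER into 1e-6
      print(net_coding_gain(1e-6, 2e-3, 0.937))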

  10. Phylogenetic effective sample size.

    PubMed

    Bartoszek, Krzysztof

    2016-10-21

    In this paper I address the question: how large is a phylogenetic sample? I propose a definition of a phylogenetic effective sample size for Brownian motion and Ornstein-Uhlenbeck processes: the regression effective sample size. I discuss how mutual information can be used to define an effective sample size in the non-normal process case and compare these two definitions to an already present concept of effective sample size (the mean effective sample size). Through a simulation study I find that the AICc is robust if one corrects for the number of species or effective number of species. Lastly I discuss how the concept of the phylogenetic effective sample size can be useful for biodiversity quantification, identification of interesting clades and deciding on the importance of phylogenetic correlations. Copyright © 2016 Elsevier Ltd. All rights reserved.
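
    The "mean effective sample size" that the abstract contrasts against has a simple closed form, 1' R^{-1} 1 for the among-species correlation matrix R implied by the phylogeny and the evolutionary model; the paper's regression effective sample size is defined differently and is not reproduced here. The correlation matrix below is invented for illustration.

      import numpy as np

      def mean_effective_sample_size(corr):
          # mESS = 1' R^{-1} 1 for the trait correlation matrix R across species.
          ones = np.ones(corr.shape[0])
          return float(ones @ np.linalg.solve(corr, ones))

      # hypothetical 3-species correlation matrix from shared branch lengths
      R = np.array([[1.0, 0.6, 0.2],
                    [0.6, 1.0, 0.2],
                    [0.2, 0.2, 1.0]])
      print(mean_effective_sample_size(R))   # between 1 (identical tips) and 3 (independent tips)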

  11. Implications of sampling design and sample size for national carbon accounting systems.

    PubMed

    Köhl, Michael; Lister, Andrew; Scott, Charles T; Baldauf, Thomas; Plugge, Daniel

    2011-11-08

    Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests, the information is generally obtained by sample-based surveys. Most operational sampling approaches utilize a combination of earth-observation data and in-situ field assessments as data sources. We compared the cost-efficiency of four different sampling design alternatives (simple random sampling, regression estimators, stratified sampling, 2-phase sampling with regression estimators) that have been proposed in the scope of REDD. Three of the design alternatives provide for a combination of in-situ and earth-observation data. Under different settings of remote sensing coverage, cost per field plot, cost of remote sensing imagery, correlation between attributes quantified in remote sensing and field data, and population variability, the percent standard error over total survey cost was calculated. The cost-efficiency of forest carbon stock assessments is driven by the sampling design chosen. Our results indicate that the cost of remote sensing imagery is decisive for the cost-efficiency of a sampling design. The variability of the sample population impairs cost-efficiency, but does not reverse the pattern of cost-efficiency of the individual design alternatives. Our results clearly indicate that it is important to consider cost-efficiency in the development of forest carbon stock assessments and the selection of remote sensing techniques. The development of MRV systems for REDD needs to be based on a sound optimization process that compares different data sources and sampling designs with respect to their cost-efficiency. This helps to reduce the uncertainties related to the quantification of carbon stocks and to increase the financial benefits from adopting a REDD regime.
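
    A deliberately stylized cost model shows the kind of trade-off the comparison turns on: imagery adds a fixed cost but, through a regression estimator, shrinks the residual standard deviation by sqrt(1 - rho^2). The numbers and the cost structure below are invented and do not reproduce the paper's four design alternatives or its variance formulas.

      import math

      def se_srs(budget, cost_per_plot, sd_y):
          # Standard error of the mean under simple random sampling, given a budget.
          n = int(budget // cost_per_plot)
          return sd_y / math.sqrt(n)

      def se_regression(budget, cost_per_plot, cost_imagery, sd_y, rho):
          # Regression estimator with wall-to-wall remote sensing: imagery is a
          # fixed cost, residual spread shrinks with the field-imagery correlation.
          n = int((budget - cost_imagery) // cost_per_plot)
          return sd_y * math.sqrt(1 - rho**2) / math.sqrt(n)

      # hypothetical numbers: whether the imagery pays for itself depends on its
      # cost and on how strongly it correlates with the field measurements
      print(se_srs(100_000, 500, sd_y=40.0))
      print(se_regression(100_000, 500, cost_imagery=30_000, sd_y=40.0, rho=0.8))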

  12. Evaluation of respondent-driven sampling.

    PubMed

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling method, and caution is required

  13. Evaluation of Respondent-Driven Sampling

    PubMed Central

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling

  14. Sample Curation in Support of the OSIRIS-REx Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Righter, Kevin; Nakamura-Messenger, Keiko

    2017-01-01

    The OSIRIS-REx asteroid sample return mission launched to asteroid Bennu Sept. 8, 2016. The spacecraft will arrive at Bennu in late 2019, orbit and map the asteroid, and perform a touch and go (TAG) sampling maneuver in July 2020. After the sample is stowed and confirmed, the spacecraft will return to Earth, and the sample return capsule (SRC) will land in Utah in September 2023. Samples will be recovered from Utah [2] and then transported and stored in a new sample cleanroom at NASA Johnson Space Center in Houston [3]. The materials curated for the mission are described here. a) Materials Archive and Witness Plate Collection: The SRC and TAGSAM were built between March 2014 and Summer of 2015, and instruments (OTES, OVIRS, OLA, OCAMS, REXIS) were integrated from Summer 2015 until May 2016. A total of 395 items were received for the materials archive at NASA-JSC, with archiving finishing 30 days after launch (with the final archived items being related to launch operations) [4]. The materials fall into several general categories including metals (stainless steel, aluminum, titanium alloys, brass and BeCu alloy), epoxies, paints, polymers, lubricants, non-volatile-residue samples (NVR), sapphire, and various miscellaneous materials. All through the ATLO process (from March 2015 until late August 2016) contamination knowledge witness plates (Si wafer and Al foil) were deployed in the various cleanrooms in Denver and KSC to provide an additional record of particle counts and volatiles that is archived for current and future scientific studies. These plates were deployed in roughly monthly increments with each unit containing 4 Si wafers and 4 Al foils. We archived 128 individual witness plates (64 Si wafers and 64 Al foils); one of each witness plate (Si and Al) was analyzed immediately by the science team after archiving, while the remaining 3 of each are archived indefinitely. Information about each material archived is stored in an extensive database at NASA-JSC, and key

  15. Sampling functions for geophysics

    NASA Technical Reports Server (NTRS)

    Giacaglia, G. E. O.; Lunquist, C. A.

    1972-01-01

    A set of spherical sampling functions is defined such that they are related to spherical-harmonic functions in the same way that the sampling functions of information theory are related to sine and cosine functions. An orderly distribution of (N + 1) squared sampling points on a sphere is given, for which the (N + 1) squared spherical sampling functions span the same linear manifold as do the spherical-harmonic functions through degree N. The transformations between the spherical sampling functions and the spherical-harmonic functions are given by recurrence relations. The spherical sampling functions of two arguments are extended to three arguments and to nonspherical reference surfaces. Typical applications of this formalism to geophysical topics are sketched.

  16. Sample holder with optical features

    DOEpatents

    Milas, Mirko; Zhu, Yimei; Rameau, Jonathan David

    2013-07-30

    A sample holder for holding a sample to be observed for research purposes, particularly in a transmission electron microscope (TEM), generally includes an external alignment part for directing a light beam in a predetermined beam direction, a sample holder body in optical communication with the external alignment part and a sample support member disposed at a distal end of the sample holder body opposite the external alignment part for holding a sample to be analyzed. The sample holder body defines an internal conduit for the light beam and the sample support member includes a light beam positioner for directing the light beam between the sample holder body and the sample held by the sample support member.

  17. Microgravity Testing of a Surface Sampling System for Sample Return from Small Solar System Bodies

    NASA Technical Reports Server (NTRS)

    Franzen, M. A.; Preble, J.; Schoenoff, M.; Halona, K.; Long, T. E.; Park, T.; Sears, D. W. G.

    2004-01-01

    The return of samples from solar system bodies is becoming an essential element of solar system exploration. The recent National Research Council Solar System Exploration Decadal Survey identified six sample return missions as high-priority missions: South Pole-Aitken Basin Sample Return, Comet Surface Sample Return, Comet Surface Sample Return-sample from selected surface sites, Asteroid Lander/Rover/Sample Return, Comet Nucleus Sample Return-cold samples from depth, and Mars Sample Return [1], and the NASA Roadmap also includes sample return missions [2]. Sample collection methods that have been flown on robotic spacecraft to date return subgram quantities, but many scientific issues (like bulk composition, particle size distributions, petrology, chronology) require tens to hundreds of grams of sample. Many complex sample collection devices have been proposed; however, small robotic missions require simplicity. We present here the results of experiments done with a simple but innovative collection system for sample return from small solar system bodies.

  18. Biofouling development on plasma treated samples versus layers coated samples

    NASA Astrophysics Data System (ADS)

    Hnatiuc, B.; Exnar, P.; Sabau, A.; Spatenka, P.; Dumitrache, C. L.; Hnatiuc, M.; Ghita, S.

    2016-12-01

    Biofouling is the most important cause of naval corrosion. In order to reduce biofouling development on naval materials such as steel or resin, different new methods have been tested. These methods could help meet the new IMO environmental regulations and could replace a few classic operations performed before the painting of small ships. Replacing these operations means a reduction in maintenance costs. Their action must influence especially the first two steps of biofouling development, called microfouling, which take about 24 hours. This work presents comparative results of biofouling development on two different classic naval materials, steel and resin, for three treated samples immersed in sea water. Non-thermal plasma, produced by GlidArc technology, is applied to the first sample, called GD; the plasma treatment was set to 10 minutes. The last two samples, called AE9 and AE10, are covered by hydrophobic layers prepared from a special organic-inorganic sol synthesized by the sol-gel method. In principle, because of the hydrophobic properties, biofouling formation should be delayed for AE9 and AE10. The biofouling development on each treated sample was compared with a non-treated witness sample. The microbiological analyses were carried out for 24 hours by epifluorescence microscopy, available for one single layer.

  19. Vibronic Boson Sampling: Generalized Gaussian Boson Sampling for Molecular Vibronic Spectra at Finite Temperature.

    PubMed

    Huh, Joonsuk; Yung, Man-Hong

    2017-08-07

    Molecular vibronic spectroscopy, where the transitions involve non-trivial bosonic correlation due to the Duschinsky rotation, is strongly believed to be in a similar complexity class as Boson Sampling. At finite temperature, the problem is represented as a Boson Sampling experiment with correlated Gaussian input states. This molecular problem with temperature effects is intimately related to the various versions of Boson Sampling sharing similar computational complexity. Here we provide a full description of this relation in the context of Gaussian Boson Sampling. We find a hierarchical structure, which illustrates the relationship among various Boson Sampling schemes. Specifically, we show that every instance of Gaussian Boson Sampling with an initial correlation can be simulated by an instance of Gaussian Boson Sampling without initial correlation, with only a polynomial overhead. Since every Gaussian state is associated with a thermal state, our result implies that every sampling problem in molecular vibronic transitions, at any temperature, can be simulated by Gaussian Boson Sampling associated with a product of vacuum modes. We refer to such a generalized Gaussian Boson Sampling, motivated by the molecular sampling problem, as Vibronic Boson Sampling.

  20. The Lunar Sample Compendium

    NASA Technical Reports Server (NTRS)

    Meyer, Charles

    2009-01-01

    The Lunar Sample Compendium is a succinct summary of the data obtained from 40 years of study of Apollo and Luna samples of the Moon. Basic petrographic, chemical and age information is compiled, sample-by-sample, in the form of an advanced catalog in order to provide a basic description of each sample. The LSC can be found online using Google. The initial allocation of lunar samples was done sparingly, because it was realized that scientific techniques would improve over the years and new questions would be formulated. The LSC is important because it enables scientists to select samples within the context of the work that has already been done and facilitates better review of proposed allocations. It also provides back up material for public displays, captures information found only in abstracts, grey literature and curatorial databases and serves as a ready access to the now-vast scientific literature.

  1. A simulative comparison of respondent driven sampling with incentivized snowball sampling--the "strudel effect".

    PubMed

    Gyarmathy, V Anna; Johnston, Lisa G; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A

    2014-02-01

    Respondent driven sampling (RDS) and incentivized snowball sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania ("original sample") to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1-12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called "strudel effect" is discussed in the paper. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. Serum samples can be substituted by plasma samples for the diagnosis of paratuberculosis.

    PubMed

    Goodridge, Amador; Correa, Ricardo; Castro, Paul; Escobar, Cecilia; de Waard, Jacobus H

    2013-10-01

    Employing plasma samples rather than serum samples for serological paratuberculosis diagnosis is practical, especially when bovine TB is assessed in the same cattle herd with the gamma interferon bovine avian (IFN-γ BA) test. We demonstrate that antibody titers in serum and plasma samples, utilizing the PARACHECK(®) ELISA kit, are highly comparable (Cohen's kappa test, k=0.955). We conclude that serum can be replaced with plasma in this commercially available antibody detection assay resulting in working hour savings for sampling and blood sample work-up and cost reductions for materials and sample storage. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Fluid sampling system

    DOEpatents

    Houck, Edward D.

    1994-01-01

    A fluid sampling system allows sampling of radioactive liquid without spillage. A feed tank is connected to a liquid transfer jet powered by a pumping chamber pressurized by compressed air. The liquid is pumped upwardly into a sampling jet of a venturi design having a lumen with an inlet, an outlet, a constricted middle portion, and a port located above the constricted middle portion. The liquid is passed under pressure through the constricted portion causing its velocity to increase and its pressure to decrease, thereby preventing liquid from escaping. A septum sealing the port can be pierced by a two pointed hollow needle leading into a sample bottle also sealed by a pierceable septum affixed to one end. The bottle is evacuated by flow through the sample jet, cyclic variation in the sampler jet pressure periodically leaves the evacuated bottle with lower pressure than that of the port, thus causing solution to pass into the bottle. The remaining solution in the system is returned to the feed tank via a holding tank.

  4. Fluid sampling system

    DOEpatents

    Houck, E.D.

    1994-10-11

    A fluid sampling system allows sampling of radioactive liquid without spillage. A feed tank is connected to a liquid transfer jet powered by a pumping chamber pressurized by compressed air. The liquid is pumped upwardly into a sampling jet of a venturi design having a lumen with an inlet, an outlet, a constricted middle portion, and a port located above the constricted middle portion. The liquid is passed under pressure through the constricted portion causing its velocity to increase and its pressure to be decreased, thereby preventing liquid from escaping. A septum sealing the port can be pierced by a two pointed hollow needle leading into a sample bottle also sealed by a pierceable septum affixed to one end. The bottle is evacuated by flow through the sample jet, cyclic variation in the sampler jet pressure periodically leaves the evacuated bottle with lower pressure than that of the port, thus causing solution to pass into the bottle. The remaining solution in the system is returned to the feed tank via a holding tank. 4 figs.

  5. Adaptive sampling in behavioral surveys.

    PubMed

    Thompson, S K

    1997-01-01

    Studies of populations such as drug users encounter difficulties because the members of the populations are rare, hidden, or hard to reach. Conventionally designed large-scale surveys detect relatively few members of the populations so that estimates of population characteristics have high uncertainty. Ethnographic studies, on the other hand, reach suitable numbers of individuals only through the use of link-tracing, chain referral, or snowball sampling procedures that often leave the investigators unable to make inferences from their sample to the hidden population as a whole. In adaptive sampling, the procedure for selecting people or other units to be in the sample depends on variables of interest observed during the survey, so the design adapts to the population as encountered. For example, when self-reported drug use is found among members of the sample, sampling effort may be increased in nearby areas. Types of adaptive sampling designs include ordinary sequential sampling, adaptive allocation in stratified sampling, adaptive cluster sampling, and optimal model-based designs. Graph sampling refers to situations with nodes (for example, people) connected by edges (such as social links or geographic proximity). An initial sample of nodes or edges is selected and edges are subsequently followed to bring other nodes into the sample. Graph sampling designs include network sampling, snowball sampling, link-tracing, chain referral, and adaptive cluster sampling. A graph sampling design is adaptive if the decision to include linked nodes depends on variables of interest observed on nodes already in the sample. Adjustment methods for nonsampling errors such as imperfect detection of drug users in the sample apply to adaptive as well as conventional designs.
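
    A minimal sketch of the adaptive cluster sampling variant mentioned above, on a hypothetical grid of counts: whenever a sampled cell meets the condition, its neighbours are added and the expansion continues. Unbiased estimation from the resulting sample requires the modified Hansen-Hurwitz or Horvitz-Thompson weights described by Thompson, which are not shown; the grid, threshold, and neighbourhood rule are assumptions for illustration.

      import numpy as np

      def adaptive_cluster_sample(grid, n_initial, threshold, rng=None):
          # Adaptive cluster sampling on a 2-D grid: start with a simple random
          # sample of cells; whenever a sampled cell's count exceeds the threshold,
          # add its four rook neighbours and keep expanding.
          rng = np.random.default_rng() if rng is None else rng
          rows, cols = grid.shape
          start = rng.choice(rows * cols, size=n_initial, replace=False)
          frontier = [(i // cols, i % cols) for i in start]
          sampled = set()
          while frontier:
              r, c = frontier.pop()
              if (r, c) in sampled or not (0 <= r < rows and 0 <= c < cols):
                  continue
              sampled.add((r, c))
              if grid[r, c] > threshold:        # condition triggers expansion
                  frontier += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
          return sampled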

  6. Rockballer Sample Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Giersch, Louis R.; Cook, Brant T.

    2013-01-01

    It would be desirable to acquire rock and/or ice samples that extend below the surface of the parent rock or ice in extraterrestrial environments such as the Moon, Mars, comets, and asteroids. Such samples would allow measurements to be made further back into the geologic history of the rock, providing critical insight into the history of the local environment and the solar system. Such samples could also be necessary for sample return mission architectures that would acquire samples from extraterrestrial environments for return to Earth for more detailed scientific investigation.

  7. DIY Tomography sample holder

    NASA Astrophysics Data System (ADS)

    Lari, L.; Wright, I.; Boyes, E. D.

    2015-10-01

    A very simple tomography sample holder was developed in-house at minimal cost. The holder is based on a JEOL single-tilt fast-exchange sample holder whose exchangeable tip was modified to allow a high tilt angle. The shape of the tip was designed to retain mechanical stability while minimising the lateral size of the tip. The sample can be mounted on a standard 3 mm Cu grid as well as on semi-circular grids from FIB sample preparation. Applications of the holder to different sample systems are shown.

  8. 21 CFR 203.38 - Sample lot or control numbers; labeling of sample units.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 4 2010-04-01 2010-04-01 false Sample lot or control numbers; labeling of sample units. 203.38 Section 203.38 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) DRUGS: GENERAL PRESCRIPTION DRUG MARKETING Samples § 203.38 Sample lot or control...

  9. Sampling in Qualitative Research

    PubMed Central

    LUBORSKY, MARK R.; RUBINSTEIN, ROBERT L.

    2011-01-01

    In gerontology the most recognized and elaborate discourse about sampling is generally thought to be in quantitative research associated with survey research and medical research. But sampling has long been a central concern in social and humanistic inquiry, albeit in a different guise suited to different goals. There is a need for more explicit discussion of qualitative sampling issues. This article will outline the guiding principles and rationales, features, and practices of sampling in qualitative research. It then describes common questions about sampling in qualitative research. In conclusion, it proposes the concept of qualitative clarity as a set of principles (analogous to statistical power) to guide assessments of qualitative sampling in a particular study or proposal. PMID:22058580

  10. Fluid sample collection and distribution system. [qualitative analysis of aqueous samples from several points

    NASA Technical Reports Server (NTRS)

    Brooks, R. L. (Inventor)

    1979-01-01

    A multipoint fluid sample collection and distribution system is provided wherein the sample inputs are made through one or more of a number of sampling valves to a progressive cavity pump which is not susceptible to damage by large unfiltered particles. The pump output is through a filter unit that can provide a filtered multipoint sample. An unfiltered multipoint sample is also provided. An effluent sample can be taken and applied to a second progressive cavity pump for pumping to a filter unit that can provide one or more filtered effluent samples. The second pump can also provide an unfiltered effluent sample. Means are provided to periodically back flush each filter unit without shutting off the whole system.

  11. Fluid sampling device

    NASA Technical Reports Server (NTRS)

    Studenick, D. K. (Inventor)

    1977-01-01

    An inlet leak is described for sampling gases, more specifically, for selectively sampling multiple fluids. This fluid sampling device includes a support frame. A plurality of fluid inlet devices extend through the support frame and each of the fluid inlet devices include a longitudinal aperture. An opening device that is responsive to a control signal selectively opens the aperture to allow fluid passage. A closing device that is responsive to another control signal selectively closes the aperture for terminating further fluid flow.

  12. Sampling effort and estimates of species richness based on prepositioned area electrofisher samples

    USGS Publications Warehouse

    Bowen, Z.H.; Freeman, Mary C.

    1998-01-01

    Estimates of species richness based on electrofishing data are commonly used to describe the structure of fish communities. One electrofishing method for sampling riverine fishes that has become popular in the last decade is the prepositioned area electrofisher (PAE). We investigated the relationship between sampling effort and fish species richness at seven sites in the Tallapoosa River system, USA based on 1,400 PAE samples collected during 1994 and 1995. First, we estimated species richness at each site using the first-order jackknife and compared observed values for species richness and jackknife estimates of species richness to estimates based on historical collection data. Second, we used a permutation procedure and nonlinear regression to examine rates of species accumulation. Third, we used regression to predict the number of PAE samples required to collect the jackknife estimate of species richness at each site during 1994 and 1995. We found that jackknife estimates of species richness generally were less than or equal to estimates based on historical collection data. The relationship between PAE electrofishing effort and species richness in the Tallapoosa River was described by a positive asymptotic curve as found in other studies using different electrofishing gears in wadable streams. Results from nonlinear regression analyses indicated that rates of species accumulation were variable among sites and between years. Across sites and years, predictions of sampling effort required to collect jackknife estimates of species richness suggested that doubling sampling effort (to 200 PAEs) would typically increase observed species richness by not more than six species. However, sampling effort beyond about 60 PAE samples typically increased observed species richness by < 10%. We recommend using historical collection data in conjunction with a preliminary sample size of at least 70 PAE samples to evaluate estimates of species richness in medium-sized rivers
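
    For reference, the first-order jackknife richness estimator mentioned above has a simple closed form, S_jack = S_obs + f1·(n-1)/n, where f1 is the number of species detected in exactly one of the n sample units. The Python sketch below illustrates this textbook formula on hypothetical presence/absence data; it is not code or data from the study.

    ```python
    import numpy as np

    def first_order_jackknife(presence):
        """First-order jackknife estimate of species richness.

        presence: 2-D array (n_samples x n_species) of 0/1 presence records,
                  e.g. one row per PAE sample (hypothetical data below).
        """
        presence = np.asarray(presence, dtype=bool)
        n = presence.shape[0]                        # number of sample units
        s_obs = int(presence.any(axis=0).sum())      # observed species richness
        f1 = int((presence.sum(axis=0) == 1).sum())  # species seen in exactly one sample
        return s_obs + f1 * (n - 1) / n

    # Hypothetical toy data: 5 samples x 4 species
    counts = [[1, 0, 0, 1],
              [1, 1, 0, 0],
              [0, 1, 0, 0],
              [1, 0, 0, 0],
              [1, 1, 0, 0]]
    print(first_order_jackknife(counts))  # 3 observed + 1 singleton * 4/5 = 3.8
    ```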

  13. Sampling device for withdrawing a representative sample from single and multi-phase flows

    DOEpatents

    Apley, Walter J.; Cliff, William C.; Creer, James M.

    1984-01-01

    A fluid stream sampling device has been developed for the purpose of obtaining a representative sample from a single or multi-phase fluid flow. This objective is carried out by means of a probe which may be inserted into the fluid stream. Individual samples are withdrawn from the fluid flow by sampling ports with particular spacings, and the sampling ports are coupled to various analytical systems for characterization of the physical, thermal, and chemical properties of the fluid flow as a whole and also individually.

  14. Using Candy Samples to Learn about Sampling Techniques and Statistical Data Evaluation

    ERIC Educational Resources Information Center

    Canaes, Larissa S.; Brancalion, Marcel L.; Rossi, Adriana V.; Rath, Susanne

    2008-01-01

    A classroom exercise for undergraduate and beginning graduate students that takes about one class period is proposed and discussed. It is an easy, interesting exercise that demonstrates important aspects of sampling techniques (sample amount, particle size, and the representativeness of the sample in relation to the bulk material). The exercise…

  15. Rapid Active Sampling Package

    NASA Technical Reports Server (NTRS)

    Peters, Gregory

    2010-01-01

    A field-deployable, battery-powered Rapid Active Sampling Package (RASP), originally designed for sampling strong materials during lunar and planetary missions, shows strong utility for terrestrial geological use. The technology is proving to be simple and effective for sampling and processing materials of strength. Although this originally was intended for planetary and lunar applications, the RASP is very useful as a powered hand tool for geologists and the mining industry to quickly sample and process rocks in the field on Earth. The RASP allows geologists to surgically acquire samples of rock for later laboratory analysis. This tool, roughly the size of a wrench, allows the user to cut away swaths of weathering rinds, revealing pristine rock surfaces for observation and subsequent sampling with the same tool. RASPing deeper (~3.5 cm) exposes single rock strata in-situ. Where a geologist's hammer can only expose unweathered layers of rock, the RASP can do the same, and then has the added ability to capture and process samples into powder with particle sizes less than 150 microns, making it easier for XRD/XRF (x-ray diffraction/x-ray fluorescence). The tool uses a rotating rasp bit (or two counter-rotating bits) that resides inside or above the catch container. The container has an open slot to allow the bit to extend outside the container and to allow cuttings to enter and be caught. When the slot and rasp bit are in contact with a substrate, the bit is plunged into it in a matter of seconds to reach pristine rock. A user in the field may sample a rock multiple times at multiple depths in minutes, instead of having to cut out huge, heavy rock samples for transport back to a lab for analysis. Because of the speed and accuracy of the RASP, hundreds of samples can be taken in one day. RASP-acquired samples are small and easily carried. A user can characterize more area in less time than by using conventional methods. The field-deployable RASP used a Ni

  16. Two means of sampling sexual minority women: how different are the samples of women?

    PubMed

    Boehmer, Ulrike; Clark, Melissa; Timm, Alison; Ozonoff, Al

    2008-01-01

    We compared 2 sampling approaches of sexual minority women in 1 limited geographic area to better understand the implications of these 2 sampling approaches. Sexual minority women identified through the Census did not differ on average age or the prevalence of raising children from those sampled using nonrandomized methods. Women in the convenience sample were better educated and lived in smaller households. Modeling the likelihood of disability in this population resulted in contradictory parameter estimates by sampling approach. The degree of variation observed both between sampling approaches and between different parameters suggests that the total population of sexual minority women is still unmeasured. Thoroughly constructed convenience samples will continue to be a useful sampling strategy to further research on this population.

  17. Drug sampling in dermatology.

    PubMed

    Reid, Erika E; Alikhan, Ali; Brodell, Robert T

    2012-01-01

    The use of drug samples in a dermatology clinic is controversial. Drug samples are associated with influencing physician prescribing patterns often toward costlier drugs, increasing health care costs, increasing waste, inducing potential conflicts of interest, and decreasing the quality of patient education. On the other hand, they have the potential to help those in financial need, to improve adherence and convenience, and to expose patients to better drugs. Although some academic centers have banned drug samples altogether, many academic and private practices continue to distribute drug samples. Given the controversy of the topic, physicians who wish to distribute drug samples must do so in an ethical manner. We believe, when handled properly, drug sampling can be used in an ethical manner. Copyright © 2012 Elsevier Inc. All rights reserved.

  18. 7 CFR 275.11 - Sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false Sampling. 275.11 Section 275.11 Agriculture... § 275.11 Sampling. (a) Sampling plan. Each State agency shall develop a quality control sampling plan which demonstrates the integrity of its sampling procedures. (1) Content. The sampling plan shall...

  19. 7 CFR 275.11 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 2011-01-01 false Sampling. 275.11 Section 275.11 Agriculture... § 275.11 Sampling. (a) Sampling plan. Each State agency shall develop a quality control sampling plan which demonstrates the integrity of its sampling procedures. (1) Content. The sampling plan shall...

  20. How iSamples (Internet of Samples in the Earth Sciences) Improves Sample and Data Stewardship in the Next Generation of Geoscientists

    NASA Astrophysics Data System (ADS)

    Hallett, B. W.; Dere, A. L. D.; Lehnert, K.; Carter, M.

    2016-12-01

    Vast numbers of physical samples are routinely collected by geoscientists to probe key scientific questions related to global climate change, biogeochemical cycles, magmatic processes, mantle dynamics, etc. Despite their value as irreplaceable records of nature the majority of these samples remain undiscoverable by the broader scientific community because they lack a digital presence or are not well-documented enough to facilitate their discovery and reuse for future scientific and educational use. The NSF EarthCube iSamples Research Coordination Network seeks to develop a unified approach across all Earth Science disciplines for the registration, description, identification, and citation of physical specimens in order to take advantage of the new opportunities that cyberinfrastructure offers. Even as consensus around best practices begins to emerge, such as the use of the International Geo Sample Number (IGSN), more work is needed to communicate these practices to investigators to encourage widespread adoption. Recognizing the importance of students and early career scientists in particular to transforming data and sample management practices, the iSamples Education and Training Working Group is developing training modules for sample collection, documentation, and management workflows. These training materials are made available to educators/research supervisors online at http://earthcube.org/group/isamples and can be modularized for supervisors to create a customized research workflow. This study details the design and development of several sample management tutorials, created by early career scientists and documented in collaboration with undergraduate research students in field and lab settings. Modules under development focus on rock outcrops, rock cores, soil cores, and coral samples, with an emphasis on sample management throughout the collection, analysis and archiving process. We invite others to share their sample management/registration workflows and to

  1. Catch me if you can: Comparing ballast water sampling skids to traditional net sampling

    NASA Astrophysics Data System (ADS)

    Bradie, Johanna; Gianoli, Claudio; Linley, Robert Dallas; Schillak, Lothar; Schneider, Gerd; Stehouwer, Peter; Bailey, Sarah

    2018-03-01

    With the recent ratification of the International Convention for the Control and Management of Ships' Ballast Water and Sediments, 2004, it will soon be necessary to assess ships for compliance with ballast water discharge standards. Sampling skids that allow the efficient collection of ballast water samples in a compact space have been developed for this purpose. We ran 22 trials on board the RV Meteor from June 4-15, 2015 to evaluate the performance of three ballast water sampling devices (traditional plankton net, Triton sampling skid, SGS sampling skid) for three organism size classes: ≥ 50 μm, ≥ 10 μm to < 50 μm, and < 10 μm. Natural sea water was run through the ballast water system and untreated samples were collected using paired sampling devices. Collected samples were analyzed in parallel by multiple analysts using several different analytic methods to quantify organism concentrations. To determine whether there were differences in the number of viable organisms collected across sampling devices, results were standardized and statistically treated to filter out other sources of variability, resulting in an outcome variable representing the mean difference in measurements that can be attributed to sampling devices. These results were tested for significance using pairwise Tukey contrasts. Differences in organism concentrations were found in 50% of comparisons between sampling skids and the plankton net for ≥ 50 μm, and ≥ 10 μm to < 50 μm size classes, with net samples containing either higher or lower densities. There were no differences for < 10 μm organisms. Future work will be required to explicitly examine the potential effects of flow velocity, sampling duration, sampled volume, and organism concentrations on sampling device performance.
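
    As a hedged illustration of the final comparison step described above (pairwise Tukey contrasts between sampling devices), the sketch below applies the Tukey HSD procedure from statsmodels to hypothetical standardized organism concentrations; it omits the paper's standardization and variance-filtering steps, and all values and group labels are invented for illustration.

    ```python
    import numpy as np
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    # Hypothetical standardized organism concentrations per trial and device
    values = np.array([3.2, 2.9, 3.5, 3.1,    # plankton net
                       2.6, 2.8, 2.5, 2.7,    # Triton skid
                       2.9, 3.0, 2.7, 3.1])   # SGS skid
    devices = np.array(["net"] * 4 + ["triton"] * 4 + ["sgs"] * 4)

    # Pairwise Tukey HSD contrasts between the three sampling devices
    result = pairwise_tukeyhsd(endog=values, groups=devices, alpha=0.05)
    print(result)   # table of mean differences, confidence intervals, reject flags
    ```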

  2. The Effect of Asymmetrical Sample Training on Retention Functions for Hedonic Samples in Rats

    ERIC Educational Resources Information Center

    Simmons, Sabrina; Santi, Angelo

    2012-01-01

    Rats were trained in a symbolic delayed matching-to-sample task to discriminate sample stimuli that consisted of the presence of food or the absence of food. Asymmetrical sample training was provided in which one group was initially trained with only the food sample and the other group was initially trained with only the no-food sample. In…

  3. Recommended protocols for sampling macrofungi

    Treesearch

    Gregory M. Mueller; John Paul Schmit; Sabine M. Hubndorf Leif Ryvarden; Thomas E. O' Dell; D. Jean Lodge; Patrick R. Leacock; Milagro Mata; Loengrin Umania; Qiuxin (Florence) Wu; Daniel L. Czederpiltz

    2004-01-01

    This chapter discusses several issues regarding recommended protocols for sampling macrofungi: opportunistic sampling of macrofungi, sampling conspicuous macrofungi using fixed-size plots, sampling small Ascomycetes using microplots, and sampling a fixed number of downed logs.

  4. Toward cost-efficient sampling methods

    NASA Astrophysics Data System (ADS)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small fraction of vertices with high node degree can carry most of the structural information of a complex network. The two proposed sampling methods are efficient at sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods on three commonly used simulated networks (scale-free, random, and small-world networks) as well as on two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing sampling methods in terms of recovering the true network structure characteristics reflected by the clustering coefficient, Bonacich centrality, and average path length, especially when the sampling rate is low.
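
    As a hedged illustration of the core idea (growing a snowball-style sample while always expanding the highest-degree frontier node first), the sketch below uses networkx on a synthetic scale-free graph. It is not the SRS- or SBS-based method proposed in the paper; the graph, seed choice, and sampling fraction are hypothetical.

    ```python
    import random
    import networkx as nx

    def degree_biased_snowball(G, seed, sample_frac=0.1):
        """Snowball-style sampling that always expands the highest-degree
        frontier node, biasing the sample toward hubs (illustrative only)."""
        target = max(1, int(sample_frac * G.number_of_nodes()))
        sampled = {seed}
        frontier = set(G.neighbors(seed))
        while frontier and len(sampled) < target:
            node = max(frontier, key=G.degree)   # pick the highest-degree neighbour
            frontier.remove(node)
            sampled.add(node)
            frontier |= set(G.neighbors(node)) - sampled
        return G.subgraph(sampled).copy()

    # Hypothetical demo on a scale-free graph
    G = nx.barabasi_albert_graph(1000, 3, seed=1)
    S = degree_biased_snowball(G, seed=random.choice(list(G.nodes)), sample_frac=0.05)
    print(S.number_of_nodes(), nx.average_clustering(S), nx.average_clustering(G))
    ```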

  5. High-Grading Lunar Samples

    NASA Technical Reports Server (NTRS)

    Allen, Carlton; Sellar, Glenn; Nunez, Jorge; Mosie, Andrea; Schwarz, Carol; Parker, Terry; Winterhalter, Daniel; Farmer, Jack

    2009-01-01

    Astronauts on long-duration lunar missions will need the capability to high-grade their samples to select the highest value samples for transport to Earth and to leave others on the Moon. We are supporting studies to define the necessary and sufficient measurements and techniques for high-grading samples at a lunar outpost. A glovebox, dedicated to testing instruments and techniques for high-grading samples, is in operation at the JSC Lunar Experiment Laboratory. A reference suite of lunar rocks and soils, spanning the full compositional range found in the Apollo collection, is available for testing in this laboratory. Thin sections of these samples are available for direct comparison. The Lunar Sample Compendium, on-line at http://www-curator.jsc.nasa.gov/lunar/compendium.cfm, summarizes previous analyses of these samples. The laboratory, sample suite, and Compendium are available to the lunar research and exploration community. In the first test of possible instruments for lunar sample high-grading, we imaged 18 lunar rocks and four soils from the reference suite using the Multispectral Microscopic Imager (MMI) developed by Arizona State University and JPL (see Farmer et. al. abstract). The MMI is a fixed-focus digital imaging system with a resolution of 62.5 microns/pixel, a field size of 40 x 32 mm, and a depth-of-field of approximately 5 mm. Samples are illuminated sequentially by 21 light emitting diodes in discrete wavelengths spanning the visible to shortwave infrared. Measurements of reflectance standards and background allow calibration to absolute reflectance. ENVI-based software is used to produce spectra for specific minerals as well as multi-spectral images of rock textures.

  6. Collecting cometary soil samples? Development of the ROSETTA sample acquisition system

    NASA Technical Reports Server (NTRS)

    Coste, P. A.; Fenzi, M.; Eiden, Michael

    1993-01-01

    In the reference scenario of the ROSETTA CNRS mission, the Sample Acquisition System is mounted on the Comet Lander. Its tasks are to acquire three kinds of cometary samples and to transfer them to the Earth Return Capsule. Operations are to be performed in vacuum and microgravity, on a probably rough and dusty surface, in a largely unknown material, at temperatures in the order of 100 K. The concept and operation of the Sample Acquisition System are presented. The design of the prototype corer and surface sampling tool, and of the equipment for testing them at cryogenic temperatures in ambient conditions and in vacuum in various materials representing cometary soil, are described. Results of recent preliminary tests performed in low temperature thermal vacuum in a cometary analog ice-dust mixture are provided.

  7. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Zane Hills, Hughes and Shungnak quadrangles, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 105 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Zane Hills area in the Hughes and Shungnak quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical databases of both agencies.

  8. Visualizing the Sample Standard Deviation

    ERIC Educational Resources Information Center

    Sarkar, Jyotirmoy; Rashid, Mamunur

    2017-01-01

    The standard deviation (SD) of a random sample is defined as the square-root of the sample variance, which is the "mean" squared deviation of the sample observations from the sample mean. Here, we interpret the sample SD as the square-root of twice the mean square of all pairwise half deviations between any two sample observations. This…
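
    The verbal description above corresponds to a standard algebraic identity. For observations x_1, …, x_n with sample mean x̄, the sample variance can be rewritten over all pairs of observations:

    ```latex
    s^2 = \frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x})^2
        = \frac{1}{n(n-1)}\sum_{i<j}(x_i-x_j)^2
        = 2 \cdot \underbrace{\frac{1}{\binom{n}{2}} \sum_{i<j} \left(\frac{x_i-x_j}{2}\right)^2}_{\text{mean squared half-deviation over all pairs}},
    \qquad
    s = \sqrt{\,2\,\overline{h^2}\,}.
    ```

    So the sample SD is the square root of twice the mean squared half-deviation taken over all pairs of observations, exactly as the abstract states.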

  9. Deterministic multidimensional nonuniform gap sampling.

    PubMed

    Worley, Bradley; Powers, Robert

    2015-12-01

    Born from empirical observations in nonuniformly sampled multidimensional NMR data relating to gaps between sampled points, the Poisson-gap sampling method has enjoyed widespread use in biomolecular NMR. While the majority of nonuniform sampling schemes are fully randomly drawn from probability densities that vary over a Nyquist grid, the Poisson-gap scheme employs constrained random deviates to minimize the gaps between sampled grid points. We describe a deterministic gap sampling method, based on the average behavior of Poisson-gap sampling, which performs comparably to its random counterpart with the additional benefit of completely deterministic behavior. We also introduce a general algorithm for multidimensional nonuniform sampling based on a gap equation, and apply it to yield a deterministic sampling scheme that combines burst-mode sampling features with those of Poisson-gap schemes. Finally, we derive a relationship between stochastic gap equations and the expectation value of their sampling probability densities. Copyright © 2015 Elsevier Inc. All rights reserved.
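
    A minimal sketch of the general idea, assuming a 1-D Nyquist grid and a sinusoidal weighting in which gaps grow toward the end of the grid (one common Poisson-gap weighting): each gap is set to its expected size rather than drawn as a random Poisson deviate, and the weight is tuned by bisection to hit the requested number of samples. This is an illustration of replacing random deviates with their average behavior, not the specific gap equation or burst-mode scheme of the paper; all parameters are hypothetical.

    ```python
    import math

    def deterministic_gap_schedule(grid_size, n_samples):
        """Deterministic 1-D NUS schedule with sinusoidally weighted gaps."""
        def schedule(lam):
            points, pos = [], 0.0
            while pos < grid_size:
                points.append(int(pos))
                # expected gap ramps from ~1 (start) to ~1 + lam (end of grid)
                pos += 1.0 + lam * math.sin(0.5 * math.pi * pos / grid_size)
            return sorted(set(points))

        lo, hi = 0.0, float(grid_size)
        for _ in range(60):                      # bisection on the gap weight
            mid = 0.5 * (lo + hi)
            if len(schedule(mid)) > n_samples:
                lo = mid                         # gaps too small -> too many points
            else:
                hi = mid                         # gaps too large -> too few points
        pts = schedule(hi)
        return pts if len(pts) == n_samples else schedule(lo)[:n_samples]

    print(deterministic_gap_schedule(grid_size=128, n_samples=32))
    ```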

  10. Urine sample (image)

    MedlinePlus

    A "clean-catch" urine sample is performed by collecting the sample of urine in midstream. Men or boys should wipe clean the head ... water and rinse well. A small amount of urine should initially fall into the toilet bowl before ...

  11. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization.

    PubMed

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the common sense hypothesis that the first six hours comprise the period of peak night activity for several species, thereby resulting in a representative sample for the whole night. To this end, we combined re-sampling techniques, species accumulation curves, threshold analysis, and community concordance of species compositional data, and applied them to datasets of three different Neotropical biomes (Amazonia, Atlantic Forest and Cerrado). We show that the strategy of restricting sampling to only six hours of the night frequently results in incomplete sampling representation of the entire bat community investigated. From a quantitative standpoint, results corroborated the existence of a major Sample Area effect in all datasets, although for the Amazonia dataset the six-hour strategy was significantly less species-rich after extrapolation, and for the Cerrado dataset it was more efficient. From the qualitative standpoint, however, results demonstrated that, for all three datasets, the identity of species that are effectively sampled will be inherently impacted by choices of sub-sampling schedule. We also propose an alternative six-hour sampling strategy (at the beginning and the end of a sample night) which performed better when resampling Amazonian and Atlantic Forest datasets on bat assemblages. Given the observed magnitude of our results, we propose that sample representativeness has to be carefully weighed against study objectives, and recommend that the trade-off between

  12. Enhancing sampling design in mist-net bat surveys by accounting for sample size optimization

    PubMed Central

    Trevelin, Leonardo Carreira; Novaes, Roberto Leonan Morim; Colas-Rosas, Paul François; Benathar, Thayse Cristhina Melo; Peres, Carlos A.

    2017-01-01

    The advantages of mist-netting, the main technique used in Neotropical bat community studies to date, include logistical implementation, standardization and sampling representativeness. Nonetheless, study designs still have to deal with issues of detectability related to how different species behave and use the environment. Yet there is considerable sampling heterogeneity across available studies in the literature. Here, we approach the problem of sample size optimization. We evaluated the common sense hypothesis that the first six hours comprise the period of peak night activity for several species, thereby resulting in a representative sample for the whole night. To this end, we combined re-sampling techniques, species accumulation curves, threshold analysis, and community concordance of species compositional data, and applied them to datasets of three different Neotropical biomes (Amazonia, Atlantic Forest and Cerrado). We show that the strategy of restricting sampling to only six hours of the night frequently results in incomplete sampling representation of the entire bat community investigated. From a quantitative standpoint, results corroborated the existence of a major Sample Area effect in all datasets, although for the Amazonia dataset the six-hour strategy was significantly less species-rich after extrapolation, and for the Cerrado dataset it was more efficient. From the qualitative standpoint, however, results demonstrated that, for all three datasets, the identity of species that are effectively sampled will be inherently impacted by choices of sub-sampling schedule. We also propose an alternative six-hour sampling strategy (at the beginning and the end of a sample night) which performed better when resampling Amazonian and Atlantic Forest datasets on bat assemblages. Given the observed magnitude of our results, we propose that sample representativeness has to be carefully weighed against study objectives, and recommend that the trade-off between
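
    A species accumulation curve of the kind used above to compare sub-sampling schedules can be built by repeatedly permuting the sampling units and counting cumulative species. The generic resampling sketch below uses hypothetical presence/absence data, not the Amazonia, Atlantic Forest, or Cerrado datasets analysed in the paper.

    ```python
    import numpy as np

    def accumulation_curve(presence, n_permutations=200, seed=0):
        """Mean species accumulation curve by random permutation of sampling units.

        presence: (n_units x n_species) 0/1 matrix, e.g. one row per netting night.
        Returns the expected cumulative species count after 1..n_units units.
        """
        rng = np.random.default_rng(seed)
        presence = np.asarray(presence, dtype=bool)
        n_units = presence.shape[0]
        curves = np.zeros((n_permutations, n_units))
        for r in range(n_permutations):
            order = rng.permutation(n_units)
            seen = np.cumsum(presence[order], axis=0) > 0   # species seen so far
            curves[r] = seen.sum(axis=1)
        return curves.mean(axis=0)

    # Hypothetical data: 6 sampling nights x 5 bat species
    nights = [[1, 0, 0, 1, 0],
              [1, 1, 0, 0, 0],
              [0, 1, 0, 0, 1],
              [1, 0, 0, 0, 0],
              [0, 0, 1, 0, 0],
              [1, 1, 0, 1, 0]]
    print(accumulation_curve(nights))
    ```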

  13. Statistical validation of reagent lot change in the clinical chemistry laboratory can confer insights on good clinical laboratory practice.

    PubMed

    Cho, Min-Chul; Kim, So Young; Jeong, Tae-Dong; Lee, Woochang; Chun, Sail; Min, Won-Ki

    2014-11-01

    Verification of new lot reagent's suitability is necessary to ensure that results for patients' samples are consistent before and after reagent lot changes. A typical procedure is to measure results of some patients' samples along with quality control (QC) materials. In this study, the results of patients' samples and QC materials in reagent lot changes were analysed. In addition, the opinion regarding QC target range adjustment along with reagent lot changes was proposed. Patients' sample and QC material results of 360 reagent lot change events involving 61 analytes and eight instrument platforms were analysed. The between-lot differences for the patients' samples (ΔP) and the QC materials (ΔQC) were tested by Mann-Whitney U tests. The size of the between-lot differences in the QC data was calculated as multiples of standard deviation (SD). The ΔP and ΔQC values only differed significantly in 7.8% of the reagent lot change events. This frequency was not affected by the assay principle or the QC material source. One SD was proposed for the cutoff for maintaining pre-existing target range after reagent lot change. While non-commutable QC material results were infrequent in the present study, our data confirmed that QC materials have limited usefulness when assessing new reagent lots. Also a 1 SD standard for establishing a new QC target range after reagent lot change event was proposed. © The Author(s) 2014 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav.
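
    A hedged sketch of the kind of check described above: a two-sided Mann-Whitney U test on patient-sample results measured with the outgoing and incoming reagent lots, plus expressing the QC shift in multiples of the established SD (the proposed 1 SD cutoff). The scipy call is standard; all numeric values are hypothetical.

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    # Hypothetical results for the same patient samples measured with the
    # outgoing (old) and incoming (new) reagent lot, plus QC-material data.
    old_lot = np.array([4.1, 5.3, 6.8, 3.9, 5.0, 7.2, 4.6, 5.8])
    new_lot = np.array([4.3, 5.2, 7.0, 4.0, 5.1, 7.5, 4.8, 6.0])

    # Between-lot difference for patient samples (two-sided Mann-Whitney U).
    u_stat, p_value = mannwhitneyu(old_lot, new_lot, alternative="two-sided")
    print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_value:.3f}")

    # Size of the QC shift expressed in multiples of the established SD:
    # a shift below 1 SD would support keeping the pre-existing target range.
    qc_target_mean, qc_target_sd = 5.0, 0.25   # established with the old lot
    qc_new_lot_mean = 5.15                     # mean of QC runs on the new lot
    shift_in_sd = abs(qc_new_lot_mean - qc_target_mean) / qc_target_sd
    print(f"QC shift = {shift_in_sd:.2f} SD -> "
          f"{'keep' if shift_in_sd < 1 else 'recalculate'} target range")
    ```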

  14. Orbiting Sample Capture and Orientation Technologies for Potential Mars Sample Return

    NASA Astrophysics Data System (ADS)

    Younse, P.; Adajian, R.; Dolci, M.; Ohta, P.; Olds, E.; Lalla, K.; Strahle, J. W.

    2018-04-01

    Technologies applicable to a potential Mars Sample Return Orbiter for orbiting sample container capture and orientation are presented, as well as an integrated MArs CApture and ReOrientation for a potential NExt Mars Orbiter (MACARONE) concept.

  15. A novel construction method of QC-LDPC codes based on CRT for optical communications

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Liang, Meng-qi; Wang, Yong; Lin, Jin-zhao; Pang, Yu

    2016-05-01

    A novel construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the Chinese remainder theorem (CRT). The method can not only increase the code length without reducing the girth, but also greatly enhance the code rate, so it is easy to construct a high-rate code. The simulation results show that at a bit error rate (BER) of 10^-7, the net coding gain (NCG) of the regular QC-LDPC(4 851, 4 546) code is 2.06 dB, 1.36 dB, 0.53 dB and 0.31 dB higher than those of the classic RS(255, 239) code in ITU-T G.975, the LDPC(32 640, 30 592) code in ITU-T G.975.1, the QC-LDPC(3 664, 3 436) code constructed by the improved combining construction method based on CRT, and the irregular QC-LDPC(3 843, 3 603) code constructed by the construction method based on the Galois field (GF(q)) multiplicative group, respectively. Furthermore, all five codes have the same code rate of 0.937. Therefore, the regular QC-LDPC(4 851, 4 546) code constructed by the proposed construction method has excellent error-correction performance and is well suited to optical transmission systems.
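
    For context, every QC-LDPC code shares the same quasi-cyclic structure: the parity-check matrix is tiled from b x b circulant permutation matrices (cyclically shifted identities) selected by an exponent matrix. The sketch below shows only this generic structure, not the CRT-based construction of the paper; the exponent matrix and circulant size are hypothetical.

    ```python
    import numpy as np

    def circulant_permutation(b, shift):
        """b x b identity matrix cyclically shifted right by `shift` columns."""
        return np.roll(np.eye(b, dtype=np.uint8), shift, axis=1)

    def qc_ldpc_parity_check(exponents, b):
        """Assemble a QC-LDPC parity-check matrix H from an exponent matrix.

        exponents[i][j] >= 0 selects a circulant permutation with that shift;
        -1 selects the all-zero block. Generic quasi-cyclic structure only.
        """
        blocks = [[np.zeros((b, b), dtype=np.uint8) if e < 0
                   else circulant_permutation(b, e) for e in row]
                  for row in exponents]
        return np.block(blocks)

    # Hypothetical 2 x 4 exponent matrix with circulant size b = 5:
    E = [[0, 1, 2, -1],
         [3, -1, 0, 4]]
    H = qc_ldpc_parity_check(E, b=5)
    print(H.shape)            # (10, 20): (rows*b, cols*b)
    print(H.sum(axis=0))      # column weights of the sparse parity-check matrix
    ```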

  16. Comparison of initial stream urine samples and cervical samples for detection of human papillomavirus.

    PubMed

    Hagihara, Mao; Yamagishi, Yuka; Izumi, Koji; Miyazaki, Narimi; Suzuki, Takayoshi; Kato, Hideo; Nishiyama, Naoya; Koizumi, Yusuke; Suematsu, Hiroyuki; Mikamo, Hiroshige

    2016-08-01

    Uterine cervical cancer is a treatable and preventable cancer. Medical efforts to reduce rates of cervical cancer focus on the promotion of human papillomavirus (HPV) vaccination and of routine cervical cancer screening by cervical cytology and cervical HPV testing. Urine-based HPV testing would be a simple and noninvasive approach to screen for cervical cancer. Two biospecimens (a clinician-taken cervical sample and an initial stream urine sample) were provided for HPV testing by each of 240 healthy women attending for cancer screening. We assessed the HPV detection rates in cervical samples and pellet fractions of urine samples using an HPV test (Anyplex™ II HPV28 Detection kit, Seegene, Korea). Among the 240 samples screened, HPV prevalence was 42.9% in pellet fractions of urine samples. The agreement between the two kinds of samples was 98.4%, k = 0.792. Discordant results were observed in 27 cases; 5 were positive only in urine samples and 22 were positive only in smear samples. Sensitivity and specificity for all HPV DNA in pellet fractions of urine, using cervical samples as reference, were 68.4% and 99.9%, respectively. Comparing sample-collection methodologies for HPV detection, the two specimen types showed high agreement for almost all genotypes between cervical samples and pellet fractions of urine samples. These results suggest that urine could be a good noninvasive tool to monitor HPV infection in women. Additional research in a larger and general screening population would be needed. Copyright © 2016 Japanese Society of Chemotherapy and The Japanese Association for Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
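
    The agreement statistics quoted above (overall agreement, Cohen's kappa, and sensitivity/specificity against the cervical sample as the reference) all derive from a 2 x 2 concordance table. The generic sketch below shows how they are computed; the counts are clearly hypothetical and are not the study's data.

    ```python
    def paired_test_agreement(both_pos, urine_only, cervix_only, both_neg):
        """Agreement statistics for urine vs. cervical HPV results
        summarised in a 2 x 2 table (hypothetical counts, for illustration)."""
        n = both_pos + urine_only + cervix_only + both_neg
        po = (both_pos + both_neg) / n                      # observed agreement
        p_urine_pos = (both_pos + urine_only) / n
        p_cervix_pos = (both_pos + cervix_only) / n
        pe = (p_urine_pos * p_cervix_pos
              + (1 - p_urine_pos) * (1 - p_cervix_pos))     # chance agreement
        kappa = (po - pe) / (1 - pe)
        # Treating the cervical sample as the reference standard:
        sensitivity = both_pos / (both_pos + cervix_only)
        specificity = both_neg / (both_neg + urine_only)
        return po, kappa, sensitivity, specificity

    print(paired_test_agreement(both_pos=40, urine_only=3, cervix_only=7, both_neg=150))
    ```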

  17. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Treesearch

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  18. RnaSeqSampleSize: real data based sample size estimation for RNA sequencing.

    PubMed

    Zhao, Shilin; Li, Chung-I; Guo, Yan; Sheng, Quanhu; Shyr, Yu

    2018-05-30

    One of the most important and often neglected components of a successful RNA sequencing (RNA-Seq) experiment is sample size estimation. A few negative binomial model-based methods have been developed to estimate sample size based on the parameters of a single gene. However, thousands of genes are quantified and tested for differential expression simultaneously in RNA-Seq experiments. Thus, additional issues should be carefully addressed, including the false discovery rate for multiple statistic tests, widely distributed read counts and dispersions for different genes. To solve these issues, we developed a sample size and power estimation method named RnaSeqSampleSize, based on the distributions of gene average read counts and dispersions estimated from real RNA-seq data. Datasets from previous, similar experiments such as the Cancer Genome Atlas (TCGA) can be used as a point of reference. Read counts and their dispersions were estimated from the reference's distribution; using that information, we estimated and summarized the power and sample size. RnaSeqSampleSize is implemented in R language and can be installed from Bioconductor website. A user friendly web graphic interface is provided at http://cqs.mc.vanderbilt.edu/shiny/RnaSeqSampleSize/ . RnaSeqSampleSize provides a convenient and powerful way for power and sample size estimation for an RNAseq experiment. It is also equipped with several unique features, including estimation for interested genes or pathway, power curve visualization, and parameter optimization.

  19. Mars Sample Handling Functionality

    NASA Astrophysics Data System (ADS)

    Meyer, M. A.; Mattingly, R. L.

    2018-04-01

    The final leg of a Mars Sample Return campaign would be an entity that we have referred to as Mars Returned Sample Handling (MRSH). This talk will address our current view of the functional requirements on MRSH, focused on the Sample Receiving Facility (SRF).

  20. SAMPLING OF CONTAMINATED SITES

    EPA Science Inventory

    A critical aspect of characterization of the amount and species of contamination of a hazardous waste site is the sampling plan developed for that site. If the sampling plan is not thoroughly conceptualized before sampling takes place, then certain critical aspects of the limits o...

  1. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    PubMed

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of the sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimen from porcine tissues is presented. The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical
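
    One standard design-based route from systematic sections to organ volume is the Cavalieri point-counting estimator, V̂ = t · (a/p) · ΣP. The sketch below illustrates that textbook formula only; it is not necessarily the exact procedure of the published guidelines, and all counts are hypothetical.

    ```python
    def cavalieri_volume(points_per_section, section_thickness_mm, area_per_point_mm2):
        """Cavalieri point-counting estimate of organ volume.

        V_hat = t * (a/p) * sum(P_i), where t is the distance between
        systematic sections, a/p the area associated with one grid point,
        and P_i the number of points hitting the tissue on section i.
        (Textbook stereology formula; values below are hypothetical.)
        """
        return section_thickness_mm * area_per_point_mm2 * sum(points_per_section)

    # Hypothetical counts from 8 systematic sections, 2 mm apart,
    # with a grid in which each point represents 4 mm^2:
    print(cavalieri_volume([12, 18, 25, 31, 28, 22, 15, 9], 2.0, 4.0), "mm^3")
    ```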

  2. Mars Science Laboratory Sample Acquisition, Sample Processing and Handling: Subsystem Design and Test Challenges

    NASA Technical Reports Server (NTRS)

    Jandura, Louise

    2010-01-01

    The Sample Acquisition/Sample Processing and Handling subsystem for the Mars Science Laboratory is a highly-mechanized, Rover-based sampling system that acquires powdered rock and regolith samples from the Martian surface, sorts the samples into fine particles through sieving, and delivers small portions of the powder into two science instruments inside the Rover. SA/SPaH utilizes 17 actuated degrees-of-freedom to perform the functions needed to produce 5 sample pathways in support of the scientific investigation on Mars. Both hardware redundancy and functional redundancy are employed in configuring this sampling system so some functionality is retained even with the loss of a degree-of-freedom. Intentional dynamic environments are created to move sample while vibration isolators attenuate this environment at the sensitive instruments located near the dynamic sources. In addition to the typical flight hardware qualification test program, two additional types of testing are essential for this kind of sampling system: characterization of the intentionally-created dynamic environment and testing of the sample acquisition and processing hardware functions using Mars analog materials in a low pressure environment. The overall subsystem design and configuration are discussed along with some of the challenges, tradeoffs, and lessons learned in the areas of fault tolerance, intentional dynamic environments, and special testing

  3. Some connections between importance sampling and enhanced sampling methods in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Lie, H. C.; Quer, J.

    2017-11-01

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.

  4. Some connections between importance sampling and enhanced sampling methods in molecular dynamics.

    PubMed

    Lie, H C; Quer, J

    2017-11-21

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
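
    The importance-sampling idea referred to above can be stated in a few lines: expectations under a target density p are estimated from draws from a proposal q by reweighting with w = p/q. The self-normalized sketch below uses simple 1-D Gaussians as a stand-in for the molecular-dynamics setting; it illustrates the general identity only, not the Hartmann-Schütte or Valsson-Parrinello schemes discussed in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def target_logpdf(x):                 # unnormalized target: N(2, 0.5^2)
        return -0.5 * ((x - 2.0) / 0.5) ** 2

    def proposal_logpdf(x):               # broad proposal: N(0, 2^2)
        return -0.5 * (x / 2.0) ** 2

    # Draw from the proposal and reweight with w = p(x)/q(x)
    x = rng.normal(0.0, 2.0, size=100_000)
    log_w = target_logpdf(x) - proposal_logpdf(x)
    w = np.exp(log_w - log_w.max())       # stabilize before normalizing

    f = x ** 2                            # observable whose mean we want under p
    estimate = np.sum(w * f) / np.sum(w)  # self-normalized importance sampling
    print(estimate)                       # approaches E_p[x^2] = 2^2 + 0.5^2 = 4.25
    ```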

  5. GTKDynamo: a PyMOL plug-in for QC/MM hybrid potential simulations

    PubMed Central

    Bachega, José Fernando R.; Timmers, Luís Fernando S.M.; Assirati, Lucas; Bachega, Leonardo R.; Field, Martin J.; Wymore, Troy

    2014-01-01

    Hybrid quantum chemical (QC)/molecular mechanical (MM) potentials are very powerful tools for molecular simulation. They are especially useful for studying processes in condensed phase systems, such as chemical reactions, that involve a relatively localized change in electronic structure and where the surrounding environment contributes to these changes but can be represented with more computationally efficient functional forms. Despite their utility, however, these potentials are not always straightforward to apply since the extent of significant electronic structure changes occurring in the condensed phase process may not be intuitively obvious. To facilitate their use we have developed an open-source graphical plug-in, GTKDynamo, that links the PyMOL visualization program and the pDynamo QC/MM simulation library. This article describes the implementation of GTKDynamo and its capabilities and illustrates its application to QC/MM simulations. PMID:24137667

  6. Sampling properties of directed networks

    NASA Astrophysics Data System (ADS)

    Son, S.-W.; Christensen, C.; Bizhani, G.; Foster, D. V.; Grassberger, P.; Paczuski, M.

    2012-10-01

    For many real-world networks only a small “sampled” version of the original network may be investigated; those results are then used to draw conclusions about the actual system. Variants of breadth-first search (BFS) sampling, which are based on epidemic processes, are widely used. Although it is well established that BFS sampling fails, in most cases, to capture the IN component(s) of directed networks, a description of the effects of BFS sampling on other topological properties is all but absent from the literature. To systematically study the effects of sampling biases on directed networks, we compare BFS sampling to random sampling on complete large-scale directed networks. We present new results and a thorough analysis of the topological properties of seven complete directed networks (prior to sampling), including three versions of Wikipedia, three different sources of sampled World Wide Web data, and an Internet-based social network. We detail the differences that sampling method and coverage can make to the structural properties of sampled versions of these seven networks. Most notably, we find that sampling method and coverage affect both the bow-tie structure and the number and structure of strongly connected components in sampled networks. In addition, at a low sampling coverage (i.e., less than 40%), the values of average degree, variance of out-degree, degree autocorrelation, and link reciprocity are overestimated by 30% or more in BFS-sampled networks and only attain values within 10% of the corresponding values in the complete networks when sampling coverage is in excess of 65%. These results may cause us to rethink what we know about the structure, function, and evolution of real-world directed networks.
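
    As a minimal illustration of the comparison described above (BFS sampling versus uniform random node sampling on a directed network, judged here by link reciprocity of the induced subgraphs), the networkx sketch below uses a synthetic graph and a hypothetical coverage level; it is not the paper's full analysis of the seven complete networks.

    ```python
    import random
    from collections import deque
    import networkx as nx

    def bfs_sample(G, source, n_nodes):
        """Collect n_nodes by breadth-first search along out-edges."""
        seen, queue = {source}, deque([source])
        while queue and len(seen) < n_nodes:
            u = queue.popleft()
            for v in G.successors(u):
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
                    if len(seen) >= n_nodes:
                        break
        return G.subgraph(seen).copy()

    def reciprocity(G):
        """Fraction of directed edges whose reverse edge is also present."""
        m = G.number_of_edges()
        return sum(G.has_edge(v, u) for u, v in G.edges()) / m if m else 0.0

    # Hypothetical directed test graph and 30% coverage
    random.seed(0)
    G = nx.DiGraph(nx.scale_free_graph(2000, seed=0))   # drop parallel edges
    n = int(0.3 * G.number_of_nodes())
    bfs_sub = bfs_sample(G, source=0, n_nodes=n)
    rnd_sub = G.subgraph(random.sample(list(G.nodes), n)).copy()
    print(f"full: {reciprocity(G):.3f}  BFS: {reciprocity(bfs_sub):.3f}  "
          f"random: {reciprocity(rnd_sub):.3f}")
    ```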

  7. Metadata, Identifiers, and Physical Samples

    NASA Astrophysics Data System (ADS)

    Arctur, D. K.; Lenhardt, W. C.; Hills, D. J.; Jenkyns, R.; Stroker, K. J.; Todd, N. S.; Dassie, E. P.; Bowring, J. F.

    2016-12-01

    Physical samples are integral to much of the research conducted by geoscientists. The samples used in this research are often obtained at significant cost and represent an important investment for future research. However, making information about samples - whether considered data or metadata - available for researchers to enable discovery is difficult: a number of key elements related to samples are difficult to characterize in common ways, such as classification, location, sample type, sampling method, repository information, subsample distribution, and instrumentation, because these differ from one domain to the next. Unifying these elements or developing metadata crosswalks is needed. The iSamples (Internet of Samples) NSF-funded Research Coordination Network (RCN) is investigating ways to develop these types of interoperability and crosswalks. Within the iSamples RCN, one of its working groups, WG1, has focused on the metadata related to physical samples. This includes identifying existing metadata standards and systems, and how they might interoperate with the International Geo Sample Number (IGSN) schema (schema.igsn.org) in order to help inform leading practices for metadata. For example, we are examining lifecycle metadata beyond the IGSN `birth certificate.' As a first step, this working group is developing a list of relevant standards and comparing their various attributes. In addition, the working group is looking toward technical solutions to facilitate developing a linked set of registries to build the web of samples. Finally, the group is also developing a comparison of sample identifiers and locators. This paper will provide an overview and comparison of the standards identified thus far, as well as an update on the technical solutions examined for integration. We will discuss how various sample identifiers might work in complementary fashion with the IGSN to more completely describe samples, facilitate retrieval of contextual information, and

  8. Sample push-out fixture

    DOEpatents

    Biernat, John L.

    2002-11-05

    This invention generally relates to the remote removal of pelletized samples from cylindrical containment capsules. V-blocks are used to receive the samples and provide guidance to push out rods. Stainless steel liners fit into the v-channels on the v-blocks which permits them to be remotely removed and replaced or cleaned to prevent cross contamination between capsules and samples. A capsule holder securely holds the capsule while allowing manual up/down and in/out movement to align each sample hole with the v-blocks. Both end sections contain identical v-blocks; one that guides the drive out screw and rods or manual push out rods and the other to receive the samples as they are driven out of the capsule.

  9. Automated sampling assessment for molecular simulations using the effective sample size

    PubMed Central

    Zhang, Xin; Bhatt, Divesh; Zuckerman, Daniel M.

    2010-01-01

    To quantify the progress in the development of algorithms and forcefields used in molecular simulations, a general method for the assessment of the sampling quality is needed. Statistical mechanics principles suggest the populations of physical states characterize equilibrium sampling in a fundamental way. We therefore develop an approach for analyzing the variances in state populations, which quantifies the degree of sampling in terms of the effective sample size (ESS). The ESS estimates the number of statistically independent configurations contained in a simulated ensemble. The method is applicable to both traditional dynamics simulations as well as more modern (e.g., multi-canonical) approaches. Our procedure is tested in a variety of systems from toy models to atomistic protein simulations. We also introduce a simple automated procedure to obtain approximate physical states from dynamic trajectories: this allows sample-size estimation in systems for which physical states are not known in advance. PMID:21221418
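
    A minimal sketch of the underlying idea, assuming several independent trajectories: if each run effectively contained N_eff independent configurations, the run-to-run variance of a state's population p would be roughly p(1-p)/N_eff, so the observed variance yields an ESS estimate. This is an illustration, not the authors' exact estimator; the population values below are hypothetical.

    ```python
    import numpy as np

    def effective_sample_size(pop_fractions):
        """Estimate ESS from the populations of one physical state observed
        in several independent simulations (illustrative only).

        With N_eff independent configurations per run, the run-to-run variance
        of the state population p is about p * (1 - p) / N_eff, hence
        N_eff ~= p * (1 - p) / observed variance.
        """
        p = np.asarray(pop_fractions, dtype=float)
        mean_p = p.mean()
        var_p = p.var(ddof=1)                # observed run-to-run variance
        return mean_p * (1.0 - mean_p) / var_p

    # Hypothetical state-A populations from 10 independent trajectories:
    populations = [0.61, 0.55, 0.66, 0.58, 0.63, 0.52, 0.60, 0.65, 0.57, 0.62]
    print(f"ESS per trajectory ~ {effective_sample_size(populations):.0f}")
    ```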

  10. An evaluation of soil sampling for 137Cs using various field-sampling volumes.

    PubMed

    Nyhan, J W; White, G C; Schofield, T G; Trujillo, G

    1983-05-01

    The sediments from a liquid effluent receiving area at the Los Alamos National Laboratory and soils from an intensive study area in the fallout pathway of Trinity were sampled for 137Cs using 25-, 500-, 2500- and 12,500-cm3 field sampling volumes. A highly replicated sampling program was used to determine mean concentrations and inventories of 137Cs at each site, as well as estimates of spatial, aliquoting, and counting variance components of the radionuclide data. The sampling methods were also analyzed as a function of soil size fractions collected in each field sampling volume and of the total cost of the program for a given variation in the radionuclide survey results. Coefficients of variation (CV) of 137Cs inventory estimates ranged from 0.063 to 0.14 for Mortandad Canyon sediments, whereas CV values for Trinity soils were observed from 0.38 to 0.57. Spatial variance components of 137Cs concentration data were usually found to be larger than either the aliquoting or counting variance estimates and were inversely related to field sampling volume at the Trinity intensive site. Subsequent optimization studies of the sampling schemes demonstrated that each aliquot should be counted once, and that only 2-4 aliquots out of as many as 30 collected need be assayed for 137Cs. The optimization studies showed that as sample costs increased to 45 man-hours of labor per sample, the variance of the mean 137Cs concentration decreased dramatically, but decreased very little with additional labor.
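
    The optimization described above trades spatial variance against aliquoting/counting variance under a cost constraint. A textbook two-stage sampling formulation (an assumption here, not necessarily the exact objective used in the study) is:

    ```latex
    \operatorname{Var}(\bar{x}) = \frac{\sigma_s^{2}}{n} + \frac{\sigma_a^{2}}{n\,m},
    \qquad
    C = n\,(c_s + m\,c_a)
    \quad\Longrightarrow\quad
    m_{\text{opt}} = \sqrt{\frac{c_s\,\sigma_a^{2}}{c_a\,\sigma_s^{2}}},
    ```

    where σ_s² is the between-location (spatial) variance component, σ_a² the aliquoting-plus-counting component, c_s the cost of collecting one field sample, c_a the cost of preparing and assaying one aliquot, and n field samples each contribute m aliquots. When spatial variance dominates, m_opt is small, which is consistent with the finding that only a few of the collected aliquots need be assayed.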

  11. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling site in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparing it with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis.

  12. Developing Water Sampling Standards

    ERIC Educational Resources Information Center

    Environmental Science and Technology, 1974

    1974-01-01

    Participants in the D-19 symposium on aquatic sampling and measurement for water pollution assessment were informed that determining the extent of waste water stream pollution is not a cut and dry procedure. Topics discussed include field sampling, representative sampling from storm sewers, suggested sampler features and application of improved…

  13. Hayabusa Recovery, Curation and Preliminary Sample Analysis: Lessons Learned from Recent Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Zolensky, Michael E.

    2011-01-01

    I describe lessons learned from my participation on the Hayabusa Mission, which returned regolith grains from asteroid Itokawa in 2010 [1], comparing this with the recently returned Stardust spacecraft, which sampled the Jupiter Family comet Wild 2. Spacecraft Recovery Operations: The mission Science and Curation teams must actively participate in planning, testing and implementing spacecraft recovery operations. The crash of the Genesis spacecraft underscored the importance of thinking through multiple contingency scenarios and practicing field recovery for these potential circumstances. Having the contingency supplies on hand was critical, and at least one full year of planning for Stardust and Hayabusa recovery operations was necessary. Care must be taken to coordinate recovery operations with local organizations and inform relevant government bodies well in advance. Recovery plans for both Stardust and Hayabusa had to be adjusted for unexpectedly wet landing site conditions. Documentation of every step of spacecraft recovery and deintegration was necessary, and collection and analysis of launch and landing site soils was critical. We found the operation of the Woomera Test Range (South Australia) to be excellent in the case of Hayabusa, and in many respects this site is superior to the Utah Test and Training Range (used for Stardust) in the USA. Recovery operations for all recovered spacecraft suffered from the lack of a hermetic seal for the samples. Mission engineers should be pushed to provide hermetic seals for returned samples. Sample Curation Issues: More than two full years were required to prepare curation facilities for Stardust and Hayabusa. Despite this seemingly adequate lead time, major changes to curation procedures were required once the actual state of the returned samples became apparent. Sample databases must be fully implemented before sample return; for Stardust we did not adequately think through all of the possible subsampling and

  14. Sample analysis at Mars

    NASA Astrophysics Data System (ADS)

    Coll, P.; Cabane, M.; Mahaffy, P. R.; Brinckerhoff, W. B.; Sam Team

    The next landed missions to Mars, such as the planned Mars Science Laboratory and ExoMars, will require sample analysis capabilities refined well beyond what has been flown to date. A key science objective driving this requirement is the determination of the carbon inventory of Mars, and particularly the detection of organic compounds. The Sample Analysis at Mars (SAM) suite consists of a group of tightly-integrated experiments that would analyze samples delivered directly from a coring drill or by a facility sample processing and delivery (SPAD) mechanism. SAM consists of an advanced GC/MS system and a laser desorption mass spectrometer (LDMS). The combined capabilities of these techniques can address Mars science objectives with much improved sensitivity, resolution, and analytical breadth over what has been previously possible in situ. The GC/MS system analyzes the bulk composition (both molecular and isotopic) of solid-phase and atmospheric samples. Solid samples are introduced with a highly flexible chemical derivatization/pyrolysis subsystem (Pyr/GC/MS) that is significantly more capable than the mass spectrometers on Viking. The LDMS analyzes local elemental and molecular composition in solid samples vaporized and ionized with a pulsed laser. We will describe how each of these capabilities has particular strengths that can achieve key measurement objectives at Mars. In addition, the close codevelopment of the GC/MS and LDMS along with a sample manipulation system enables the sharing of resources, the correlation of results, and the utilization of certain approaches that would not be possible with separate instruments. For instance, the same samples could be analyzed with more than one technique, increasing efficiency and providing cross-checks for quantification. There is also the possibility of combining methods, such as by permitting TOF-MS analyses of evolved gas (Pyr/EI-TOF-MS) or GC/MS analyses of laser evaporated gas (LD-GC/MS).

  15. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis

    PubMed Central

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-01-01

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world. PMID:28060297

  16. Protocol for Microplastics Sampling on the Sea Surface and Sample Analysis.

    PubMed

    Kovač Viršek, Manca; Palatinus, Andreja; Koren, Špela; Peterlin, Monika; Horvat, Petra; Kržan, Andrej

    2016-12-16

    Microplastic pollution in the marine environment is a scientific topic that has received increasing attention over the last decade. The majority of scientific publications address microplastic pollution of the sea surface. The protocol below describes the methodology for sampling, sample preparation, separation and chemical identification of microplastic particles. A manta net fixed on an »A frame« attached to the side of the vessel was used for sampling. Microplastic particles caught in the cod end of the net were separated from samples by visual identification and use of stereomicroscopes. Particles were analyzed for their size using an image analysis program and for their chemical structure using ATR-FTIR and micro FTIR spectroscopy. The described protocol is in line with recommendations for microplastics monitoring published by the Marine Strategy Framework Directive (MSFD) Technical Subgroup on Marine Litter. This written protocol with video guide will support the work of researchers that deal with microplastics monitoring all over the world.

  17. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    PubMed

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
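
    The space/time trade-off described above can be made concrete with a small sketch (not from the cited study) that extracts k-mers from a sequence under both strategies; the values of k, the step, and the window size are illustrative parameters only.

    ```python
    # Sketch of fixed vs. minimizer k-mer sampling; k, step, and window size are
    # illustrative parameters, not the settings used in the cited study.

    def fixed_sampling(seq, k, step):
        """Sample every `step`-th k-mer start position."""
        return {(i, seq[i:i + k]) for i in range(0, len(seq) - k + 1, step)}

    def minimizer_sampling(seq, k, w):
        """For each window of w consecutive k-mer positions, keep the
        lexicographically smallest k-mer (ties broken by earliest position)."""
        selected = set()
        n_kmers = len(seq) - k + 1
        for start in range(n_kmers - w + 1):
            kmer, pos = min((seq[i:i + k], i) for i in range(start, start + w))
            selected.add((pos, kmer))
        return selected

    seq = "ACGTACGTGACCTGAAACGTTTAGC"
    print("fixed sample:    ", sorted(fixed_sampling(seq, k=5, step=3)))
    print("minimizer sample:", sorted(minimizer_sampling(seq, k=5, w=3)))
    ```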

  18. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    PubMed Central

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  19. Influence of population versus convenience sampling on sample characteristics in studies of cognitive aging.

    PubMed

    Brodaty, Henry; Mothakunnel, Annu; de Vel-Palumbo, Melissa; Ames, David; Ellis, Kathryn A; Reppermund, Simone; Kochan, Nicole A; Savage, Greg; Trollor, Julian N; Crawford, John; Sachdev, Perminder S

    2014-01-01

    We examined whether differences in findings of studies examining mild cognitive impairment (MCI) were associated with recruitment methods by comparing sample characteristics in two contemporaneous Australian studies, using population-based and convenience sampling. The Sydney Memory and Aging Study invited participants randomly from the electoral roll in defined geographic areas in Sydney. The Australian Imaging, Biomarkers and Lifestyle Study of Ageing recruited cognitively normal (CN) individuals via media appeals and MCI participants via referrals from clinicians in Melbourne and Perth. Demographic and cognitive variables were harmonized, and similar diagnostic criteria were applied to both samples retrospectively. CN participants recruited via convenience sampling were younger, better educated, more likely to be married and have a family history of dementia, and performed better cognitively than those recruited via population-based sampling. MCI participants recruited via population-based sampling had better memory performance and were less likely to carry the apolipoprotein E ε4 allele than clinically referred participants but did not differ on other demographic variables. A convenience sample of normal controls is likely to be younger and better functioning and that of an MCI group likely to perform worse than a purportedly random sample. Sampling bias should be considered when interpreting findings. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. Sample-Based Surface Coloring

    PubMed Central

    Bürger, Kai; Krüger, Jens; Westermann, Rüdiger

    2011-01-01

    In this paper, we present a sample-based approach for surface coloring, which is independent of the original surface resolution and representation. To achieve this, we introduce the Orthogonal Fragment Buffer (OFB)—an extension of the Layered Depth Cube—as a high-resolution view-independent surface representation. The OFB is a data structure that stores surface samples at a nearly uniform distribution over the surface, and it is specifically designed to support efficient random read/write access to these samples. The data access operations have a complexity that is logarithmic in the depth complexity of the surface. Thus, compared to data access operations in tree data structures like octrees, data-dependent memory access patterns are greatly reduced. Due to the particular sampling strategy that is employed to generate an OFB, it also maintains sample coherence, and thus, exhibits very good spatial access locality. Therefore, OFB-based surface coloring performs significantly faster than sample-based approaches using tree structures. In addition, since in an OFB, the surface samples are internally stored in uniform 2D grids, OFB-based surface coloring can efficiently be realized on the GPU to enable interactive coloring of high-resolution surfaces. On the OFB, we introduce novel algorithms for color painting using volumetric and surface-aligned brushes, and we present new approaches for particle-based color advection along surfaces in real time. Due to the intermediate surface representation we choose, our method can be used to color polygonal surfaces as well as any other type of surface that can be sampled. PMID:20616392

  1. Why sampling scheme matters: the effect of sampling scheme on landscape genetic results

    Treesearch

    Michael K. Schwartz; Kevin S. McKelvey

    2008-01-01

    There has been a recent trend in genetic studies of wild populations where researchers have changed their sampling schemes from sampling pre-defined populations to sampling individuals uniformly across landscapes. This reflects the fact that many species under study are continuously distributed rather than clumped into obvious "populations". Once individual...

  2. Sample Size Determination for One- and Two-Sample Trimmed Mean Tests

    ERIC Educational Resources Information Center

    Luh, Wei-Ming; Olejnik, Stephen; Guo, Jiin-Huarng

    2008-01-01

    Formulas to determine the necessary sample sizes for parametric tests of group comparisons are available from several sources and appropriate when population distributions are normal. However, in the context of nonnormal population distributions, researchers recommend Yuen's trimmed mean test, but formulas to determine sample sizes have not been…

  3. System for Earth Sample Registration SESAR: Services for IGSN Registration and Sample Metadata Management

    NASA Astrophysics Data System (ADS)

    Chan, S.; Lehnert, K. A.; Coleman, R. J.

    2011-12-01

    SESAR, the System for Earth Sample Registration, is an online registry for physical samples collected for Earth and environmental studies. SESAR generates and administers the International Geo Sample Number IGSN, a unique identifier for samples that is dramatically advancing interoperability amongst information systems for sample-based data. SESAR was developed to provide the complete range of registry services, including definition of IGSN syntax and metadata profiles, registration and validation of name spaces requested by users, tools for users to submit and manage sample metadata, validation of submitted metadata, generation and validation of the unique identifiers, archiving of sample metadata, and public or private access to the sample metadata catalog. With the development of SESAR v3, we placed particular emphasis on creating enhanced tools that make metadata submission easier and more efficient for users, and that provide superior functionality for users to manage metadata of their samples in their private workspace MySESAR. For example, SESAR v3 includes a module where users can generate custom spreadsheet templates to enter metadata for their samples, then upload these templates online for sample registration. Once the content of the template is uploaded, it is displayed online in an editable grid format. Validation rules are executed in real-time on the grid data to ensure data integrity. Other new features of SESAR v3 include the capability to transfer ownership of samples to other SESAR users, the ability to upload and store images and other files in a sample metadata profile, and the tracking of changes to sample metadata profiles. In the next version of SESAR (v3.5), we will further improve the discovery, sharing, registration of samples. For example, we are developing a more comprehensive suite of web services that will allow discovery and registration access to SESAR from external systems. Both batch and individual registrations will be possible
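
    The real-time grid validation described above can be illustrated with a small sketch; the field names and rules below are assumptions chosen for illustration and are not the SESAR schema or API.

    ```python
    # Hypothetical pre-registration checks on sample-metadata rows.
    # Field names and rules are illustrative assumptions, not the SESAR schema.

    REQUIRED_FIELDS = ("sample_name", "material", "latitude", "longitude")

    def validate_row(row):
        """Return a list of human-readable problems found in one metadata row."""
        problems = [f"missing required field: {f}"
                    for f in REQUIRED_FIELDS if not str(row.get(f, "")).strip()]
        try:
            lat, lon = float(row["latitude"]), float(row["longitude"])
            if not -90 <= lat <= 90:
                problems.append("latitude out of range")
            if not -180 <= lon <= 180:
                problems.append("longitude out of range")
        except (KeyError, ValueError):
            problems.append("latitude/longitude must be numeric")
        return problems

    rows = [
        {"sample_name": "XYZ-001", "material": "Rock", "latitude": "34.2", "longitude": "-118.1"},
        {"sample_name": "", "material": "Sediment", "latitude": "95.0", "longitude": "20.0"},
    ]
    for i, row in enumerate(rows, start=1):
        print(f"row {i}:", validate_row(row) or "ok")
    ```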

  4. Methodology Series Module 5: Sampling Strategies.

    PubMed

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'Sampling Method'. There are essentially two types of sampling methods: 1) probability sampling - based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling - based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term 'random sample' when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of these results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
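
    To make the distinction concrete, here is a minimal sketch (with a made-up sampling frame) contrasting two probability methods with a convenience sample; it is illustrative only and not tied to any particular study.

    ```python
    import random

    random.seed(2016)

    # Illustrative sampling frame: 1,000 hypothetical patient identifiers.
    frame = [f"patient_{i:04d}" for i in range(1000)]

    # Probability sampling: every unit has a known, non-zero chance of selection.
    simple_random = random.sample(frame, k=50)

    # Stratified random sampling: draw separately within predefined strata
    # (the strata here are artificial, based on the parity of the identifier).
    strata = {"even": frame[0::2], "odd": frame[1::2]}
    stratified = [unit for group in strata.values() for unit in random.sample(group, k=25)]

    # Non-probability (convenience) sampling: take whoever is easiest to reach,
    # e.g. the first 50 on the list; selection probabilities are unknown.
    convenience = frame[:50]

    print(len(simple_random), len(stratified), len(convenience))
    ```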

  5. Interferometrically stable, enclosed, spinning sample cell for spectroscopic experiments on air-sensitive samples

    NASA Astrophysics Data System (ADS)

    Baranov, Dmitry; Hill, Robert J.; Ryu, Jisu; Park, Samuel D.; Huerta-Viga, Adriana; Carollo, Alexa R.; Jonas, David M.

    2017-01-01

    In experiments with high photon flux, it is necessary to rapidly remove the sample from the beam and to delay re-excitation until the sample has returned to equilibrium. Rapid and complete sample exchange has been a challenge for air-sensitive samples and for vibration-sensitive experiments. Here, a compact spinning sample cell for air and moisture sensitive liquid and thin film samples is described. The principal parts of the cell are a copper gasket sealed enclosure, a 2.5 in. hard disk drive motor, and a reusable, chemically inert glass sandwich cell. The enclosure provides an oxygen and water free environment at the 1 ppm level, as demonstrated by multi-day tests with sodium benzophenone ketyl radical. Inside the enclosure, the glass sandwich cell spins at ≈70 Hz to generate tangential speeds of 7-12 m/s that enable complete sample exchange at 100 kHz repetition rates. The spinning cell is acoustically silent and compatible with a ±1 nm rms displacement stability interferometer. In order to enable the use of the spinning cell, we discuss centrifugation and how to prevent it, introduce the cycle-averaged resampling rate to characterize repetitive excitation, and develop a figure of merit for a long-lived photoproduct buildup.
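
    The quoted tangential speeds follow from v = 2πrf at the stated spin rate of about 70 Hz; the radii used below (roughly 16 mm to 27 mm) are assumptions chosen to reproduce the 7-12 m/s range and are not given in the abstract.

    ```python
    import math

    spin_rate_hz = 70.0                       # rotation frequency quoted in the abstract

    # Radial positions on the cell are assumptions for illustration only.
    for radius_m in (0.016, 0.022, 0.027):
        v = 2.0 * math.pi * radius_m * spin_rate_hz    # tangential speed v = 2*pi*r*f
        print(f"r = {radius_m * 1000:.0f} mm -> v = {v:.1f} m/s")
    ```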

  6. Interferometrically stable, enclosed, spinning sample cell for spectroscopic experiments on air-sensitive samples.

    PubMed

    Baranov, Dmitry; Hill, Robert J; Ryu, Jisu; Park, Samuel D; Huerta-Viga, Adriana; Carollo, Alexa R; Jonas, David M

    2017-01-01

    In experiments with high photon flux, it is necessary to rapidly remove the sample from the beam and to delay re-excitation until the sample has returned to equilibrium. Rapid and complete sample exchange has been a challenge for air-sensitive samples and for vibration-sensitive experiments. Here, a compact spinning sample cell for air and moisture sensitive liquid and thin film samples is described. The principal parts of the cell are a copper gasket sealed enclosure, a 2.5 in. hard disk drive motor, and a reusable, chemically inert glass sandwich cell. The enclosure provides an oxygen and water free environment at the 1 ppm level, as demonstrated by multi-day tests with sodium benzophenone ketyl radical. Inside the enclosure, the glass sandwich cell spins at ≈70 Hz to generate tangential speeds of 7-12 m/s that enable complete sample exchange at 100 kHz repetition rates. The spinning cell is acoustically silent and compatible with a ±1 nm rms displacement stability interferometer. In order to enable the use of the spinning cell, we discuss centrifugation and how to prevent it, introduce the cycle-averaged resampling rate to characterize repetitive excitation, and develop a figure of merit for a long-lived photoproduct buildup.

  7. Driven Boson Sampling.

    PubMed

    Barkhofen, Sonja; Bartley, Tim J; Sansoni, Linda; Kruse, Regina; Hamilton, Craig S; Jex, Igor; Silberhorn, Christine

    2017-01-13

    Sampling the distribution of bosons that have undergone a random unitary evolution is strongly believed to be a computationally hard problem. Key to outperforming classical simulations of this task is to increase both the number of input photons and the size of the network. We propose driven boson sampling, in which photons are input within the network itself, as a means to approach this goal. We show that the mean number of photons entering a boson sampling experiment can exceed one photon per input mode, while maintaining the required complexity, potentially leading to less stringent requirements on the input states for such experiments. When using heralded single-photon sources based on parametric down-conversion, this approach offers an ∼e-fold enhancement in the input state generation rate over scattershot boson sampling, reaching the scaling limit for such sources. This approach also offers a dramatic increase in the signal-to-noise ratio with respect to higher-order photon generation from such probabilistic sources, which removes the need for photon number resolution during the heralding process as the size of the system increases.

  8. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    PubMed

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km(2) area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  9. Evaluation of Common Methods for Sampling Invertebrate Pollinator Assemblages: Net Sampling Out-Perform Pan Traps

    PubMed Central

    Popic, Tony J.; Davila, Yvonne C.; Wardle, Glenda M.

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km2 area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service. PMID:23799127

  10. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    Molecular dynamics (MD) simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules but are limited by the timescale barrier, i.e., we may be unable to efficiently obtain properties because we need to run microseconds or longer simulations using femtosecond time steps. While there are several existing methods to overcome this timescale barrier and efficiently sample thermodynamic and/or kinetic properties, problems remain in regard to being able to sample unknown systems, deal with high-dimensional space of collective variables, and focus the computational effort on slow timescales. Hence, a new sampling method, called the “Concurrent Adaptive Sampling (CAS) algorithm,” has been developed to tackle these three issues and efficiently obtain conformations and pathways. The method is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and uses macrostates (a partition of the collective variable space) to enhance the sampling. The exploration is done by running a large number of short simulations, and a clustering technique is used to accelerate the sampling. In this paper, we introduce the new methodology and show results from two-dimensional models and bio-molecules, such as penta-alanine and triazine polymer.

  11. Sample Size for Measuring Grammaticality in Preschool Children from Picture-Elicited Language Samples

    ERIC Educational Resources Information Center

    Eisenberg, Sarita L.; Guo, Ling-Yu

    2015-01-01

    Purpose: The purpose of this study was to investigate whether a shorter language sample elicited with fewer pictures (i.e., 7) would yield a percent grammatical utterances (PGU) score similar to that computed from a longer language sample elicited with 15 pictures for 3-year-old children. Method: Language samples were elicited by asking forty…

  12. Clean and Cold Sample Curation

    NASA Technical Reports Server (NTRS)

    Allen, C. C.; Agee, C. B.; Beer, R.; Cooper, B. L.

    2000-01-01

    Curation of Mars samples includes both samples that are returned to Earth, and samples that are collected, examined, and archived on Mars. Both kinds of curation operations will require careful planning to ensure that the samples are not contaminated by the instruments that are used to collect and contain them. In both cases, sample examination and subdivision must take place in an environment that is organically, inorganically, and biologically clean. Some samples will need to be prepared for analysis under ultra-clean or cryogenic conditions. Inorganic and biological cleanliness are achievable separately by cleanroom and biosafety lab techniques. Organic cleanliness to the <50 ng/sq cm level requires material control and sorbent removal - techniques being applied in our Class 10 cleanrooms and sample processing gloveboxes.

  13. Documentation of Apollo 15 samples

    NASA Technical Reports Server (NTRS)

    Sutton, R. L.; Hait, M. H.; Larson, K. B.; Swann, G. A.; Reed, V. S.; Schaber, G. G.

    1972-01-01

    A catalog is presented of the documentation of Apollo 15 samples using photographs and verbal descriptions returned from the lunar surface. Almost all of the Apollo 15 samples were correlated with lunar surface photographs, descriptions, and traverse locations. Where possible, the lunar orientations of rock samples were reconstructed in the lunar receiving laboratory, using a collimated light source to reproduce illumination and shadow characteristics of the same samples shown in lunar photographs. In several cases, samples were not recognized in lunar surface photographs, and their approximate locations are known only by association with numbered sample bags used during their collection. Tables, photographs, and maps included in this report are designed to aid in the understanding of the lunar setting of the Apollo 15 samples.

  14. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    PubMed

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period where the actual data collection was carried out. In the cross-sectional study samples were taken from 681 Danish pig farms, during five weeks from February to March 2015. The evaluation showed that the sampling

  15. An empirical comparison of respondent-driven sampling, time location sampling, and snowball sampling for behavioral surveillance in men who have sex with men, Fortaleza, Brazil.

    PubMed

    Kendall, Carl; Kerr, Ligia R F S; Gondim, Rogerio C; Werneck, Guilherme L; Macena, Raimunda Hermelinda Maia; Pontes, Marta Kerr; Johnston, Lisa G; Sabin, Keith; McFarland, Willi

    2008-07-01

    Obtaining samples of populations at risk for HIV challenges surveillance, prevention planning, and evaluation. Methods used include snowball sampling, time location sampling (TLS), and respondent-driven sampling (RDS). Few studies have made side-by-side comparisons to assess their relative advantages. We compared snowball, TLS, and RDS surveys of men who have sex with men (MSM) in Fortaleza, Brazil, with a focus on the socio-economic status (SES) and risk behaviors of the samples to each other, to known AIDS cases and to the general population. RDS produced a sample with wider inclusion of lower SES than snowball sampling or TLS - a finding of health significance given that the majority of AIDS cases reported among MSM in the state were low SES. RDS also achieved the sample size faster and at lower cost. For reasons of inclusion and cost-efficiency, RDS is the sampling methodology of choice for HIV surveillance of MSM in Fortaleza.

  16. Sample processor for the automatic extraction of families of compounds from liquid samples and/or homogenized solid samples suspended in a liquid

    NASA Technical Reports Server (NTRS)

    Jahnsen, Vilhelm J. (Inventor); Campen, Jr., Charles F. (Inventor)

    1980-01-01

    A sample processor and method for the automatic extraction of families of compounds, known as extracts, from liquid and/or homogenized solid samples are disclosed. The sample processor includes a tube support structure which supports a plurality of extraction tubes, each containing a sample from which families of compounds are to be extracted. The support structure is moveable automatically with respect to one or more extraction stations, so that as each tube is at each station a solvent system, consisting of a solvent and reagents, is introduced therein. As a result an extract is automatically extracted from the tube. The sample processor includes an arrangement for directing the different extracts from each tube to different containers, or to direct similar extracts from different tubes to the same utilization device.

  17. A sampling plan for conduit-flow karst springs: Minimizing sampling cost and maximizing statistical utility

    USGS Publications Warehouse

    Currens, J.C.

    1999-01-01

    Analytical data for nitrate and triazines from 566 samples collected over a 3-year period at Pleasant Grove Spring, Logan County, KY, were statistically analyzed to determine the minimum data set needed to calculate meaningful yearly averages for a conduit-flow karst spring. Results indicate that a biweekly sampling schedule augmented with bihourly samples from high-flow events will provide meaningful suspended-constituent and dissolved-constituent statistics. Unless collected over an extensive period of time, daily samples may not be representative and may also be autocorrelated. All high-flow events resulting in a significant deflection of a constituent from base-line concentrations should be sampled. Either the geometric mean or the flow-weighted average of the suspended constituents should be used. If automatic samplers are used, then they may be programmed to collect storm samples as frequently as every few minutes to provide details on the arrival time of constituents of interest. However, only samples collected bihourly should be used to calculate averages. By adopting a biweekly sampling schedule augmented with high-flow samples, the need to continuously monitor discharge, or to search for and analyze existing data to develop a statistically valid monitoring plan, is lessened.
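
    As a small illustration of the two summary statistics recommended above for suspended constituents, the sketch below computes a flow-weighted average and a geometric mean from made-up concentration/discharge pairs (these numbers are not the Pleasant Grove Spring data).

    ```python
    import math

    # Hypothetical paired observations: concentration (mg/L) and discharge (L/s).
    concentrations = [2.1, 3.4, 8.9, 15.2, 4.0, 2.5]
    discharges     = [120, 150, 900, 1500, 300, 130]

    # Flow-weighted average: weights each sample by the discharge at collection time.
    flow_weighted = (sum(c * q for c, q in zip(concentrations, discharges))
                     / sum(discharges))

    # Geometric mean: often preferred for right-skewed concentration data.
    geometric_mean = math.exp(sum(math.log(c) for c in concentrations)
                              / len(concentrations))

    print(f"flow-weighted average: {flow_weighted:.2f} mg/L")
    print(f"geometric mean:        {geometric_mean:.2f} mg/L")
    ```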

  18. 40 CFR 1065.1107 - Sample media and sample system preparation; sample system assembly.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) For capturing PM, we recommend using pure quartz filters with no binder. Select the filter diameter to minimize filter change intervals, accounting for the expected PM emission rate, sample flow rate, and... filter without replacing the sorbent or otherwise disassembling the batch sampler. In those cases...

  19. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f) [Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  20. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f)[Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  1. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f)[Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  2. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Ammonia Manufacturing § 98.74 Monitoring and QA/QC... (c)(8) of this section. (f) [Reserved] (g) If CO2 from ammonia production is used to produce urea at...

  3. Preservation of Liquid Biological Samples

    NASA Technical Reports Server (NTRS)

    Putcha, Lakshmi (Inventor); Nimmagudda, Ramalingeshwara (Inventor)

    2004-01-01

    The present invention relates to the preservation of a liquid biological sample. The biological sample is exposed to a preservative containing at least about 0.15 g of sodium benzoate and at least about 0.025 g of citric acid per 100 ml of sample. The biological sample may be collected in a vessel or an absorbent mass. The biological sample may also be exposed to a substrate and/or a vehicle.

  4. Sampling Key Populations for HIV Surveillance: Results From Eight Cross-Sectional Studies Using Respondent-Driven Sampling and Venue-Based Snowball Sampling.

    PubMed

    Rao, Amrita; Stahlman, Shauna; Hargreaves, James; Weir, Sharon; Edwards, Jessie; Rice, Brian; Kochelani, Duncan; Mavimbela, Mpumelelo; Baral, Stefan

    2017-10-20

    In using regularly collected or existing surveillance data to characterize engagement in human immunodeficiency virus (HIV) services among marginalized populations, differences in sampling methods may produce different pictures of the target population and may therefore result in different priorities for response. The objective of this study was to use existing data to evaluate the sample distribution of eight studies of female sex workers (FSW) and men who have sex with men (MSM), who were recruited using different sampling approaches in two locations within Sub-Saharan Africa: Manzini, Swaziland and Yaoundé, Cameroon. MSM and FSW participants were recruited using either respondent-driven sampling (RDS) or venue-based snowball sampling. Recruitment took place between 2011 and 2016. Participants at each study site were administered a face-to-face survey to assess sociodemographics, along with the prevalence of self-reported HIV status, frequency of HIV testing, stigma, and other HIV-related characteristics. Crude and RDS-adjusted prevalence estimates were calculated. Crude prevalence estimates from the venue-based snowball samples were compared with the overlap of the RDS-adjusted prevalence estimates, between both FSW and MSM in Cameroon and Swaziland. RDS samples tended to be younger (MSM aged 18-21 years in Swaziland: 47.6% [139/310] in RDS vs 24.3% [42/173] in Snowball, in Cameroon: 47.9% [99/306] in RDS vs 20.1% [52/259] in Snowball; FSW aged 18-21 years in Swaziland 42.5% [82/325] in RDS vs 8.0% [20/249] in Snowball; in Cameroon 15.6% [75/576] in RDS vs 8.1% [25/306] in Snowball). They were less educated (MSM: primary school completed or less in Swaziland 42.6% [109/310] in RDS vs 4.0% [7/173] in Snowball, in Cameroon 46.2% [138/306] in RDS vs 14.3% [37/259] in Snowball; FSW: primary school completed or less in Swaziland 86.6% [281/325] in RDS vs 23.9% [59/247] in Snowball, in Cameroon 87.4% [520/576] in RDS vs 77.5% [238/307] in Snowball) than the snowball

  5. Sampling Key Populations for HIV Surveillance: Results From Eight Cross-Sectional Studies Using Respondent-Driven Sampling and Venue-Based Snowball Sampling

    PubMed Central

    Stahlman, Shauna; Hargreaves, James; Weir, Sharon; Edwards, Jessie; Rice, Brian; Kochelani, Duncan; Mavimbela, Mpumelelo; Baral, Stefan

    2017-01-01

    Background In using regularly collected or existing surveillance data to characterize engagement in human immunodeficiency virus (HIV) services among marginalized populations, differences in sampling methods may produce different pictures of the target population and may therefore result in different priorities for response. Objective The objective of this study was to use existing data to evaluate the sample distribution of eight studies of female sex workers (FSW) and men who have sex with men (MSM), who were recruited using different sampling approaches in two locations within Sub-Saharan Africa: Manzini, Swaziland and Yaoundé, Cameroon. Methods MSM and FSW participants were recruited using either respondent-driven sampling (RDS) or venue-based snowball sampling. Recruitment took place between 2011 and 2016. Participants at each study site were administered a face-to-face survey to assess sociodemographics, along with the prevalence of self-reported HIV status, frequency of HIV testing, stigma, and other HIV-related characteristics. Crude and RDS-adjusted prevalence estimates were calculated. Crude prevalence estimates from the venue-based snowball samples were compared with the overlap of the RDS-adjusted prevalence estimates, between both FSW and MSM in Cameroon and Swaziland. Results RDS samples tended to be younger (MSM aged 18-21 years in Swaziland: 47.6% [139/310] in RDS vs 24.3% [42/173] in Snowball, in Cameroon: 47.9% [99/306] in RDS vs 20.1% [52/259] in Snowball; FSW aged 18-21 years in Swaziland 42.5% [82/325] in RDS vs 8.0% [20/249] in Snowball; in Cameroon 15.6% [75/576] in RDS vs 8.1% [25/306] in Snowball). They were less educated (MSM: primary school completed or less in Swaziland 42.6% [109/310] in RDS vs 4.0% [7/173] in Snowball, in Cameroon 46.2% [138/306] in RDS vs 14.3% [37/259] in Snowball; FSW: primary school completed or less in Swaziland 86.6% [281/325] in RDS vs 23.9% [59/247] in Snowball, in Cameroon 87.4% [520/576] in RDS vs 77.5% [238

  6. Offline solid phase microextraction sampling system

    DOEpatents

    Harvey, Chris A.

    2008-12-16

    An offline solid phase microextraction (SPME) sampling apparatus for enabling SPME samples to be taken a number of times from a previously collected fluid sample (e.g. sample atmosphere) stored in a fused silica lined bottle which keeps volatile organics in the fluid sample stable for weeks at a time. The offline SPME sampling apparatus has a hollow body surrounding a sampling chamber, with multiple ports through which a portion of a previously collected fluid sample may be (a) released into the sampling chamber, (b) SPME sampled to collect analytes for subsequent GC analysis, and (c) flushed/purged using a fluidically connected vacuum source and purging fluid source to prepare the sampling chamber for additional SPME samplings of the same original fluid sample, such as may have been collected in situ from a headspace.

  7. Methodology Series Module 5: Sampling Strategies

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

    Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘Sampling Method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers, flipping a coin etc.); and 2) non-probability sampling – based on the researcher's choice or on the population that is accessible and available. Some of the non-probability sampling methods are: purposive sampling, convenience sampling, or quota sampling. A random sampling method (such as simple random sample or stratified random sample) is a form of probability sampling. It is important to understand the different sampling methods used in clinical studies and mention this method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (such as using the term ‘random sample’ when the researcher has used a convenience sample). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of these results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  8. LUNAR SAMPLES - APOLLO 11

    NASA Image and Video Library

    1969-08-03

    S69-40749 (July 1969) --- Dr. Grant Heikan, MSC and a Lunar Sample Preliminary Examination Team member, examines lunar material in a sieve from the bulk sample container which was opened in the Biopreparation Laboratory of the Lunar Receiving Laboratory. The samples were collected by astronauts Neil A. Armstrong and Edwin E. Aldrin Jr. during their lunar surface extravehicular activity on July 20, 1969.

  9. Systematic sampling for suspended sediment

    Treesearch

    Robert B. Thomas

    1991-01-01

    Abstract - Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...

  10. Distribution of the two-sample t-test statistic following blinded sample size re-estimation.

    PubMed

    Lu, Kaifeng

    2016-05-01

    We consider the blinded sample size re-estimation based on the simple one-sample variance estimator at an interim analysis. We characterize the exact distribution of the standard two-sample t-test statistic at the final analysis. We describe a simulation algorithm for the evaluation of the probability of rejecting the null hypothesis at given treatment effect. We compare the blinded sample size re-estimation method with two unblinded methods with respect to the empirical type I error, the empirical power, and the empirical distribution of the standard deviation estimator and final sample size. We characterize the type I error inflation across the range of standardized non-inferiority margin for non-inferiority trials, and derive the adjusted significance level to ensure type I error control for given sample size of the internal pilot study. We show that the adjusted significance level increases as the sample size of the internal pilot study increases. Copyright © 2016 John Wiley & Sons, Ltd.
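
    A hedged sketch of the blinded step described above: the interim data are pooled without treatment labels, a simple one-sample standard deviation is computed, and the per-group sample size is re-estimated from a standard two-sample z-approximation. The true effect, variance, α, and power below are assumptions for illustration, not values from the cited paper.

    ```python
    import math
    import random
    import statistics

    random.seed(1)

    # Simulated interim data: analysts remain blinded, so the two arms are pooled.
    # The generating parameters are assumptions for illustration only.
    interim = ([random.gauss(0.0, 1.2) for _ in range(40)]      # control arm
               + [random.gauss(0.4, 1.2) for _ in range(40)])   # treatment arm

    # Simple one-sample (blinded) variance estimator from the pooled data; note it
    # absorbs part of the treatment effect, which is why type I error needs checking.
    blinded_sd = statistics.stdev(interim)

    # Re-estimated per-group size for a two-sample comparison (z-approximation):
    # n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sigma^2 / delta^2
    alpha, power, delta = 0.05, 0.80, 0.4
    z = statistics.NormalDist().inv_cdf
    n_per_group = 2 * (z(1 - alpha / 2) + z(power)) ** 2 * blinded_sd ** 2 / delta ** 2

    print(f"blinded SD estimate:      {blinded_sd:.2f}")
    print(f"re-estimated n per group: {math.ceil(n_per_group)}")
    ```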

  11. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    NASA Astrophysics Data System (ADS)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-01

    Molecular dynamics simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules, but they are limited by the time scale barrier. That is, we may not obtain properties efficiently because we need to run microseconds or longer simulations using femtosecond time steps. To overcome this time scale barrier, we can use the weighted ensemble (WE) method, a powerful enhanced sampling method that efficiently samples thermodynamic and kinetic properties. However, the WE method requires an appropriate partitioning of phase space into discrete macrostates, which can be problematic when we have a high-dimensional collective space or when little is known a priori about the molecular system. Hence, we developed a new WE-based method, called the "Concurrent Adaptive Sampling (CAS) algorithm," to tackle these issues. The CAS algorithm is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and adaptive macrostates to enhance the sampling in the high-dimensional space. This is especially useful for systems in which we do not know what the right reaction coordinates are, in which case we can use many collective variables to sample conformations and pathways. In addition, a clustering technique based on the committor function is used to accelerate sampling the slowest process in the molecular system. In this paper, we introduce the new method and show results from two-dimensional models and bio-molecules, specifically penta-alanine and a triazine trimer.
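
    The bin-and-resample bookkeeping that weighted-ensemble methods such as CAS build on can be sketched as follows. This is a schematic toy (one-dimensional collective variable, fixed bins, random-walk "dynamics"), not the CAS implementation, and all parameters are assumptions.

    ```python
    import random

    random.seed(0)

    TARGET_PER_BIN = 4          # walkers kept per occupied bin (assumption)
    BIN_WIDTH = 0.5             # fixed bins over a 1-D collective variable (assumption)

    def bin_index(x):
        return int(x // BIN_WIDTH)

    def resample(walkers):
        """walkers: list of (position, weight). Return a new list in which every
        occupied bin holds TARGET_PER_BIN walkers and keeps its total weight
        (high-weight walkers are split, low-weight walkers are merged)."""
        bins = {}
        for pos, w in walkers:
            bins.setdefault(bin_index(pos), []).append((pos, w))
        new_walkers = []
        for members in bins.values():
            total = sum(w for _, w in members)
            positions = [p for p, _ in members]
            weights = [w for _, w in members]
            chosen = random.choices(positions, weights=weights, k=TARGET_PER_BIN)
            new_walkers.extend((p, total / TARGET_PER_BIN) for p in chosen)
        return new_walkers

    # Initialize 20 walkers with equal weight, then iterate propagate + resample.
    walkers = [(random.uniform(0.0, 3.0), 1.0 / 20) for _ in range(20)]
    for _ in range(5):
        # Stand-in for short MD segments: a small random displacement per walker.
        walkers = [(p + random.gauss(0.0, 0.1), w) for p, w in walkers]
        walkers = resample(walkers)

    print(len(walkers), "walkers; total weight =", round(sum(w for _, w in walkers), 10))
    ```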

  12. Sample Design, Sample Augmentation, and Estimation for Wave 2 of the NSHAP

    PubMed Central

    English, Ned; Pedlow, Steven; Kwok, Peter K.

    2014-01-01

    Objectives. The sample for the second wave (2010) of National Social Life, Health, and Aging Project (NSHAP) was designed to increase the scientific value of the Wave 1 (2005) data set by revisiting sample members 5 years after their initial interviews and augmenting this sample where possible. Method. There were 2 important innovations. First, the scope of the study was expanded by collecting data from coresident spouses or romantic partners. Second, to maximize the representativeness of the Wave 2 data, nonrespondents from Wave 1 were again approached for interview in the Wave 2 sample. Results. The overall unconditional response rate for the Wave 2 panel was 74%; the conditional response rate of Wave 1 respondents was 89%; the conditional response rate of partners was 84%; and the conversion rate for Wave 1 nonrespondents was 26%. Discussion. The inclusion of coresident partners enhanced the study by allowing the examination of how intimate, household relationships are related to health trajectories and by augmenting the NSHAP sample size for this and future waves. The uncommon strategy of returning to Wave 1 nonrespondents reduced potential bias by ensuring that to the extent possible the whole of the original sample forms the basis for the field effort. NSHAP Wave 2 achieved its field objectives of consolidating the panel, recruiting their resident spouses or romantic partners, and converting a significant proportion of Wave 1 nonrespondents. PMID:25360016

  13. Ensemble Sampling vs. Time Sampling in Molecular Dynamics Simulations of Thermal Conductivity

    DOE PAGES

    Gordiz, Kiarash; Singh, David J.; Henry, Asegun

    2015-01-29

    In this report we compare time sampling and ensemble averaging as two different methods available for phase space sampling. For the comparison, we calculate thermal conductivities of solid argon and silicon structures, using equilibrium molecular dynamics. We introduce two different schemes for the ensemble averaging approach, and show that both can reduce the total simulation time as compared to time averaging. It is also found that velocity rescaling is an efficient mechanism for phase space exploration. Although our methodology is tested using classical molecular dynamics, the ensemble generation approaches may find their greatest utility in computationally expensive simulations such as first principles molecular dynamics. For such simulations, where each time step is costly, time sampling can require long simulation times because each time step must be evaluated sequentially and therefore phase space averaging is achieved through sequential operations. On the other hand, with ensemble averaging, phase space sampling can be achieved through parallel operations, since each ensemble is independent. For this reason, particularly when using massively parallel architectures, ensemble sampling can result in much shorter simulation times and exhibits similar overall computational effort.
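
    A toy sketch of the comparison above: the same observable is averaged either along one long trajectory (time sampling) or across many short, independently initialized trajectories (ensemble sampling, which is trivially parallel). The random-walk "dynamics" and the observable are stand-ins, not a thermal-conductivity calculation.

    ```python
    import random

    random.seed(0)

    def propagate(x, n_steps):
        """Stand-in for MD: a damped random walk; returns the visited states."""
        traj = []
        for _ in range(n_steps):
            x = 0.9 * x + random.gauss(0.0, 0.3)
            traj.append(x)
        return traj

    def observable(x):
        return x * x            # toy observable averaged under both schemes

    # Time sampling: one long trajectory, states generated sequentially.
    long_traj = propagate(0.0, n_steps=50_000)
    time_average = sum(map(observable, long_traj)) / len(long_traj)

    # Ensemble sampling: many short, independent trajectories; in an MD code these
    # could be spawned in parallel, e.g. after velocity rescaling.
    ensemble = [propagate(0.0, n_steps=500) for _ in range(100)]
    ensemble_average = (sum(observable(x) for traj in ensemble for x in traj)
                        / sum(len(traj) for traj in ensemble))

    print(f"time average:     {time_average:.3f}")
    print(f"ensemble average: {ensemble_average:.3f}")
    ```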

  14. Immune Blood Sample Draw

    NASA Image and Video Library

    2012-04-26

    ISS030-E-257690 (26 April 2012) --- European Space Agency astronaut Andre Kuipers, Expedition 30 flight engineer, prepares for IMMUNE venous blood sample draws in the Columbus laboratory of the International Space Station. Following the blood draws, the samples were temporarily stowed in the Minus Eighty Laboratory Freezer for ISS 1 (MELFI-1) and later packed together with saliva samples on the Soyuz TMA-22 for return to Earth for analysis.

  15. 45 CFR 1356.84 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 4 2011-10-01 2011-10-01 false Sampling. 1356.84 Section 1356.84 Public Welfare....84 Sampling. (a) The State agency may collect and report the information required in section 1356.83(e) of this part on a sample of the baseline population consistent with the sampling requirements...

  16. 45 CFR 1356.84 - Sampling.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Sampling. 1356.84 Section 1356.84 Public Welfare....84 Sampling. (a) The State agency may collect and report the information required in section 1356.83(e) of this part on a sample of the baseline population consistent with the sampling requirements...

  17. 27 CFR 19.749 - Samples.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Samples. The following rules apply to the testing and analysis of samples of spirits and fuel alcohol for purposes of this subpart: (a) A proprietor may take samples of spirits and fuel alcohol for on-site testing and analysis at the proprietor's alcohol fuel plant; (b) A proprietor may not remove samples of...

  18. Estimating the Expected Value of Sample Information Using the Probabilistic Sensitivity Analysis Sample

    PubMed Central

    Oakley, Jeremy E.; Brennan, Alan; Breeze, Penny

    2015-01-01

    Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method. PMID:25810269
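
    A stylized sketch of the regression idea described above, using a toy two-option decision model with a single uncertain parameter and a crude moving-average smoother in place of the flexible regression used in practice; all numbers and the data-generating model are assumptions, not the case study from the paper.

    ```python
    import random
    import statistics

    random.seed(42)

    # Toy decision model: two options whose net benefits depend on one uncertain
    # parameter theta. All numbers are illustrative assumptions.
    N = 5000
    theta = [random.gauss(0.5, 0.2) for _ in range(N)]            # PSA draws
    nb = [[15000 * t - 4000 for t in theta],                       # option 0 (comparator)
          [20000 * t - 8000 for t in theta]]                       # option 1

    # Proposed study: n_study noisy observations of theta; the sample mean is the
    # data summary used as the regression covariate.
    n_study, sd_obs = 50, 0.4
    summary = [statistics.mean(random.gauss(t, sd_obs) for _ in range(n_study))
               for t in theta]

    def smooth(y, x, half_window=200):
        """Crude nonparametric regression of y on x: moving average over x-sorted data."""
        order = sorted(range(len(x)), key=lambda i: x[i])
        fitted = [0.0] * len(x)
        for rank, i in enumerate(order):
            lo, hi = max(0, rank - half_window), min(len(x), rank + half_window + 1)
            fitted[i] = statistics.fmean(y[order[j]] for j in range(lo, hi))
        return fitted

    fits = [smooth(option_nb, summary) for option_nb in nb]

    # EVSI = E_data[ max_d E(NB_d | data) ] - max_d E(NB_d), estimated on the PSA sample.
    evsi = (statistics.fmean(max(f[i] for f in fits) for i in range(N))
            - max(statistics.fmean(option_nb) for option_nb in nb))
    print(f"estimated per-patient EVSI: {evsi:.0f}")
    ```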

  19. Surface sampling concentration and reaction probe with controller to adjust sampling position

    DOEpatents

    Van Berkel, Gary J.; ElNaggar, Mariam S.

    2016-07-19

    A method of analyzing a chemical composition of a specimen is described. The method can include providing a probe comprising an outer capillary tube and an inner capillary tube disposed co-axially within the outer capillary tube, where the inner and outer capillary tubes define a solvent capillary and a sampling capillary in fluid communication with one another at a distal end of the probe; contacting a target site on a surface of a specimen with a solvent in fluid communication with the probe; maintaining a plug volume proximate a solvent-specimen interface, wherein the plug volume is in fluid communication with the probe; draining plug sampling fluid from the plug volume through the sampling capillary; and analyzing a chemical composition of the plug sampling fluid with an analytical instrument. A system for performing the method is also described.

  20. Construction method of QC-LDPC codes based on multiplicative group of finite field in optical communication

    NASA Astrophysics Data System (ADS)

    Huang, Sheng; Ao, Xiang; Li, Yuan-yuan; Zhang, Rui

    2016-09-01

    In order to meet the needs of the high-speed development of optical communication systems, a construction method of quasi-cyclic low-density parity-check (QC-LDPC) codes based on the multiplicative group of a finite field is proposed. The Tanner graph of the parity-check matrix of the code constructed by this method has no cycle of length 4, which ensures that the obtained code has a good distance property. Simulation results show that when the bit error rate (BER) is 10⁻⁶, in the same simulation environment, the net coding gain (NCG) of the proposed QC-LDPC(3780, 3540) code with the code rate of 93.7% in this paper is improved by 2.18 dB and 1.6 dB respectively compared with those of the RS(255, 239) code in ITU-T G.975 and the LDPC(32640, 30592) code in ITU-T G.975.1. In addition, the NCG of the proposed QC-LDPC(3780, 3540) code is respectively 0.2 dB and 0.4 dB higher compared with those of the SG-QC-LDPC(3780, 3540) code based on two different subgroups of a finite field and the AS-QC-LDPC(3780, 3540) code based on two arbitrary sets of a finite field. Thus, the proposed QC-LDPC(3780, 3540) code in this paper can be well applied in optical communication systems.
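
    As background, the sketch below shows the generic quasi-cyclic step: a base matrix of shift exponents is expanded into a parity-check matrix by replacing each entry with a circulant permutation matrix (or an all-zero block for -1). The base matrix and circulant size are toy values; the paper's method derives the exponents from a multiplicative group of a finite field so that no length-4 cycles appear.

    ```python
    import numpy as np

    def circulant_permutation(size, shift):
        """size x size identity matrix with its columns cyclically shifted by `shift`."""
        return np.roll(np.eye(size, dtype=int), shift, axis=1)

    def expand_base_matrix(base, size):
        """Expand a base matrix of shift exponents into a QC-LDPC parity-check matrix.
        An entry of -1 denotes the all-zero block; any other entry e becomes the
        circulant permutation matrix with shift e."""
        block_rows = []
        for base_row in base:
            blocks = [np.zeros((size, size), dtype=int) if e < 0
                      else circulant_permutation(size, e) for e in base_row]
            block_rows.append(np.hstack(blocks))
        return np.vstack(block_rows)

    # Toy base matrix of shift exponents (a finite-field construction would derive
    # these from powers of a primitive element of GF(q)).
    base = [[0, 1, 2, -1],
            [2, -1, 0, 1],
            [-1, 2, 1, 0]]
    H = expand_base_matrix(base, size=5)
    print(H.shape)                          # (15, 20) parity-check matrix
    print(H.sum(axis=0))                    # column weights of the expanded matrix
    ```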

  1. Sample Length Affects the Reliability of Language Sample Measures in 3-Year-Olds: Evidence from Parent-Elicited Conversational Samples

    ERIC Educational Resources Information Center

    Guo, Ling-Yu; Eisenberg, Sarita

    2015-01-01

    Purpose: The goal of this study was to investigate the extent to which sample length affected the reliability of total number of words (TNW), number of different words (NDW), and mean length of C-units in morphemes (MLCUm) in parent-elicited conversational samples for 3-year-olds. Method: Participants were sixty 3-year-olds. A 22-min language…

  2. Reweighting of the primary sampling units in the National Automotive Sampling System

    DOT National Transportation Integrated Search

    1997-09-01

The original design of the National Automotive Sampling System - formerly the National Accident Sampling System - called for 75 PSUs randomly selected from PSUs which were grouped into various strata across the U.S. The implementation of the PSU samp...

  3. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    PubMed

    Jung, Minsoo

    2015-01-01

When there is no sampling frame within a certain group, or the group is concerned that making its population public would bring social stigma, we say the population is hidden. It is difficult to approach this kind of population with standard survey methodology because the response rate is low and members are not fully honest in their responses when probability sampling is used. The only alternative known to address the problems caused by previous methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results, the dependence on the chain-referral structure of RDS tends to diminish as the sample gets bigger, and the estimates stabilize as the waves progress. Therefore, the final sample can be essentially independent of the initial seeds once a sufficient sample size is secured, even if the seeds were selected through convenience sampling. Thus, RDS can be considered an alternative that improves upon both key-informant sampling and ethnographic surveys, and it deserves to be utilized for a wider range of cases domestically as well.
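
    As a hedged illustration of chain-referral recruitment of the kind RDS formalizes (not the simulation used in the study), the sketch below recruits through a synthetic small-world network wave by wave, with a fixed number of coupons per recruit; the graph model and parameters are invented.

```python
import random
import networkx as nx   # assumed available; any graph library would do

random.seed(0)
G = nx.watts_strogatz_graph(n=2000, k=8, p=0.1)   # synthetic "hidden population" network

def rds_sample(G, seeds, coupons=3, target=500):
    """Chain-referral recruitment: each recruit passes up to `coupons`
    coupons to unrecruited neighbours, wave by wave (Markov-chain style)."""
    recruited, frontier = set(seeds), list(seeds)
    while frontier and len(recruited) < target:
        nxt = []
        for node in frontier:
            eligible = [v for v in G.neighbors(node) if v not in recruited]
            for v in random.sample(eligible, min(coupons, len(eligible))):
                if len(recruited) >= target:
                    break
                recruited.add(v)
                nxt.append(v)
        frontier = nxt
    return recruited

sample = rds_sample(G, seeds=random.sample(list(G.nodes), 5))
print(len(sample), "recruits")
```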

  4. Improved Sampling Method Reduces Isokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Karels, Gale G.

    The particulate sampling system currently in use by the Bay Area Air Pollution Control District, San Francisco, California is described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. The method represents a practical, inexpensive tool that can…

  5. jqcML: an open-source java API for mass spectrometry quality control data in the qcML format.

    PubMed

    Bittremieux, Wout; Kelchtermans, Pieter; Valkenborg, Dirk; Martens, Lennart; Laukens, Kris

    2014-07-03

    The awareness that systematic quality control is an essential factor to enable the growth of proteomics into a mature analytical discipline has increased over the past few years. To this aim, a controlled vocabulary and document structure have recently been proposed by Walzer et al. to store and disseminate quality-control metrics for mass-spectrometry-based proteomics experiments, called qcML. To facilitate the adoption of this standardized quality control routine, we introduce jqcML, a Java application programming interface (API) for the qcML data format. First, jqcML provides a complete object model to represent qcML data. Second, jqcML provides the ability to read, write, and work in a uniform manner with qcML data from different sources, including the XML-based qcML file format and the relational database qcDB. Interaction with the XML-based file format is obtained through the Java Architecture for XML Binding (JAXB), while generic database functionality is obtained by the Java Persistence API (JPA). jqcML is released as open-source software under the permissive Apache 2.0 license and can be downloaded from https://bitbucket.org/proteinspector/jqcml .

  6. Self-sampling with HPV mRNA analyses from vagina and urine compared with cervical samples.

    PubMed

    Asciutto, Katrin Christine; Ernstson, Avalon; Forslund, Ola; Borgfeldt, Christer

    2018-04-01

    In order to increase coverage in the organized cervical screening program, self-sampling with HPV analyses has been suggested. The aim was to compare human papillomavirus (HPV) mRNA detection in vaginal and urine self-collected samples with clinician-taken cervical samples and the corresponding clinician-taken histological specimens. Self-collected vaginal, urine and clinician-taken cervical samples were analyzed from 209 women with the Aptima mRNA assay (Hologic Inc, MA, USA). Cervical cytology, colposcopy, biopsy and/or the loop electrosurgical excision procedure (LEEP) were performed in every examination. The sensitivity of the HPV mRNA test in detecting high-grade squamous intraepithelial lesions (HSIL)/adenocarcinoma in situ (AIS)/cancer cases was as follows: for the vaginal self-samples 85.5% (95% CI; 75.0-92.8), the urinary samples 44.8% (95% CI; 32.6-57.4), and for routine cytology 81.7% (95% CI; 70.7-89.9). For the clinician-taken cervical HPV samples the sensitivity of the HPV mRNA test in detecting HSIL/AIS/cancer was 100.0% (95% CI; 94.9-100.0). The specificity of the HPV mRNA was similar for the clinician-taken cervical HPV samples and the self-samples: 49.0% vs. 48.1%. The urinary HPV samples had a specificity of 61.9% and cytology had a specificity of 93.3%. The sensitivity of the Aptima HPV mRNA test in detecting HSIL/AIS/cancer from vaginal self-samples was similar to that of routine cytology. The Aptima HPV mRNA vaginal self-sampling analysis may serve as a complement in screening programs. Copyright © 2018 Elsevier B.V. All rights reserved.

  7. Dimensionality and the sample unit

    Treesearch

    Francis A. Roesch

    2009-01-01

    The sample unit and its implications for the Forest Service, U.S. Department of Agriculture's Forest Inventory and Analysis program are discussed in light of a generalized three-dimensional concept of continuous forest inventories. The concept views the sampled population as a spatial-temporal cube and the sample as a finite partitioning of the cube. The sample...

  8. Preservation of Liquid Biological Samples

    NASA Technical Reports Server (NTRS)

    Putcha, Lakshmi (Inventor); Nimmagudda, Ramalingeshwara R. (Inventor)

    2000-01-01

The present invention provides a method of preserving a liquid biological sample, comprising the step of: contacting said liquid biological sample with a preservative comprising sodium benzoate in an amount of at least about 0.15% of the sample (weight/volume) and citric acid in an amount of at least about 0.025% of the sample (weight/volume).

  9. Sampling Operations on Big Data

    DTIC Science & Technology

    2015-11-29

These include edge sampling methods, where edges are selected by a predetermined criterion, and snowball sampling methods, where algorithms start… Authors: Vijay Gadepally, Taylor Herr, Luke Johnson, Lauren Milechin, Maja Milosavljevic, Benjamin A. Miller (Lincoln…). The report addresses the need to process and disseminate information for discovery and exploration under real-time constraints, and common signal processing operations such as sampling and…

  10. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Contained in Pre-Charged Equipment or Closed-Cell Foams § 98.434 Monitoring and QA/QC requirements. (a) For... equipment or closed-cell foam in the correct quantities (metric tons) and units (kg per piece of equipment...

  11. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Contained in Pre-Charged Equipment or Closed-Cell Foams § 98.434 Monitoring and QA/QC requirements. (a) For... equipment or closed-cell foam in the correct quantities (metric tons) and units (kg per piece of equipment...

  12. Estimation of plant sampling uncertainty: an example based on chemical analysis of moss samples.

    PubMed

    Dołęgowska, Sabina

    2016-11-01

In order to estimate the level of uncertainty arising from sampling, 54 samples (primary and duplicate) of the moss species Pleurozium schreberi (Brid.) Mitt. were collected within three forested areas (Wierna Rzeka, Piaski, Posłowice Range) in the Holy Cross Mountains (south-central Poland). During the fieldwork, each primary sample composed of 8 to 10 increments (subsamples) was taken over an area of 10 m², whereas duplicate samples were collected in the same way at a distance of 1-2 m. Subsequently, all samples were triple rinsed with deionized water, dried, milled, and digested (8 mL HNO3 (1:1) + 1 mL 30% H2O2) in a closed microwave system Multiwave 3000. The prepared solutions were analyzed twice for Cu, Fe, Mn, and Zn using FAAS and GFAAS techniques. All datasets were checked for normality. For the normally distributed elements (Cu from Piaski, Zn from Posłowice, and Fe and Zn from Wierna Rzeka), the sampling uncertainty was computed with (i) classical ANOVA, (ii) classical RANOVA, (iii) modified RANOVA, and (iv) range statistics; for the remaining elements, it was calculated with traditional and/or modified RANOVA (if the number of outliers did not exceed 10%) or with classical ANOVA after Box-Cox transformation (if the number of outliers exceeded 10%). The highest concentrations of all elements were found in moss samples from Piaski, whereas the sampling uncertainty calculated with the different statistical methods ranged from 4.1 to 22%.
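
    A minimal sketch of a duplicate-based sampling uncertainty estimate is given below; it implements only the generic classical-ANOVA and range-statistics estimates for primary/duplicate pairs, not the RANOVA variants used in the study, and the concentrations are invented.

```python
import numpy as np

def sampling_uncertainty(primary, duplicate):
    """Estimate the (sampling + analytical) standard deviation from
    primary/duplicate pairs taken at the same targets.

    Returns estimates from classical one-way ANOVA (within-target mean
    square for pairs) and from range statistics (mean absolute difference
    divided by 1.128, the d2 factor for n = 2)."""
    primary, duplicate = np.asarray(primary, float), np.asarray(duplicate, float)
    diff = primary - duplicate
    s_anova = np.sqrt(np.mean(diff**2) / 2.0)     # within-pair mean square for n = 2
    s_range = np.mean(np.abs(diff)) / 1.128
    return s_anova, s_range

# Illustrative concentrations (mg/kg) at 10 sampling targets
primary   = [12.1, 10.8, 11.5, 13.0, 12.4, 11.9, 10.5, 12.8, 11.2, 12.0]
duplicate = [11.6, 11.3, 11.9, 12.2, 12.9, 11.4, 10.9, 12.1, 11.8, 12.5]
print(sampling_uncertainty(primary, duplicate))
```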

  13. Approximate sample size formulas for the two-sample trimmed mean test with unequal variances.

    PubMed

    Luh, Wei-Ming; Guo, Jiin-Huarng

    2007-05-01

Yuen's two-sample trimmed mean test statistic is one of the most robust methods to apply when variances are heterogeneous. The present study develops formulas for the sample size required for the test. The formulas are applicable for the cases of unequal variances, non-normality and unequal sample sizes. Given a specified alpha and power (1-beta), the minimum sample size needed by the proposed formulas under various conditions is smaller than that given by the conventional formulas. Moreover, given a sample size calculated by the proposed formulas, simulation results show that Yuen's test can achieve statistical power that is generally superior to that of the approximate t test. A numerical example is provided.
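
    For reference, a minimal Python sketch of Yuen's two-sample trimmed-mean statistic (the quantity the proposed sample-size formulas are built around) is given below; the trimming proportion and data are illustrative, and this is not the authors' code.

```python
import numpy as np

def yuen_test(x, y, trim=0.2):
    """Yuen's two-sample trimmed-mean test statistic. Returns (t, df)."""
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))

    def pieces(a):
        n = len(a)
        g = int(np.floor(trim * n))
        h = n - 2 * g                      # effective (trimmed) sample size
        tmean = a[g:n - g].mean()          # trimmed mean
        w = a.copy()                       # winsorized sample
        w[:g], w[n - g:] = a[g], a[n - g - 1]
        swin2 = w.var(ddof=1)              # winsorized variance
        d = (n - 1) * swin2 / (h * (h - 1))
        return tmean, d, h

    m1, d1, h1 = pieces(x)
    m2, d2, h2 = pieces(y)
    t = (m1 - m2) / np.sqrt(d1 + d2)
    df = (d1 + d2) ** 2 / (d1 ** 2 / (h1 - 1) + d2 ** 2 / (h2 - 1))
    return t, df

rng = np.random.default_rng(0)
print(yuen_test(rng.normal(0, 1, 30), rng.normal(0.8, 3, 40)))
```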

  14. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Kougarok area, Bendeleben and Teller quadrangles, Seward Peninsula, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 302 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Kougarok River drainage as well as smaller adjacent drainages in the Bendeleben and Teller quadrangles, Seward Peninsula, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated

  15. On incomplete sampling under birth-death models and connections to the sampling-based coalescent.

    PubMed

    Stadler, Tanja

    2009-11-07

The constant rate birth-death process is used as a stochastic model for many biological systems, for example phylogenies or disease transmission. As the biological data are usually not fully available, it is crucial to understand the effect of incomplete sampling. In this paper, we analyze the constant rate birth-death process with incomplete sampling. We derive the density of the bifurcation events for trees on n leaves which evolved under this birth-death-sampling process. This density is used for calculating prior distributions in Bayesian inference programs and for efficiently simulating trees. We show that the birth-death-sampling process can be interpreted as a birth-death process with reduced rates and complete sampling. This shows that joint inference of birth rate, death rate and sampling probability is not possible. The birth-death-sampling process is compared to the sampling-based population genetics model, the coalescent. It is shown that despite many similarities between these two models, the distribution of bifurcation times remains different even in the case of very large population sizes. We illustrate these findings on a Hepatitis C virus dataset from Egypt. We show that the transmission time estimates are significantly different; the widely used Gamma statistic even changes its sign from negative to positive when switching from the coalescent to the birth-death process.
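
    As a hedged illustration of the birth-death-sampling setup (it does not reproduce the bifurcation-time density derived in the paper), the sketch below runs a Gillespie simulation of a constant-rate birth-death process and then applies incomplete sampling by retaining each surviving lineage with probability rho; all rates are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def birth_death_sampling(lam=1.0, mu=0.5, rho=0.6, t_max=5.0):
    """Gillespie simulation of a constant-rate birth-death process run for
    time t_max, followed by binomial sampling of the surviving lineages
    with probability rho (incomplete sampling)."""
    n, t = 1, 0.0
    while n > 0:
        rate = n * (lam + mu)
        t += rng.exponential(1.0 / rate)
        if t >= t_max:
            break
        n += 1 if rng.random() < lam / (lam + mu) else -1
    return rng.binomial(n, rho)           # number of sampled tips

tips = [birth_death_sampling() for _ in range(2000)]
print("mean number of sampled tips:", np.mean(tips))
```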

  16. Sample-Clock Phase-Control Feedback

    NASA Technical Reports Server (NTRS)

    Quirk, Kevin J.; Gin, Jonathan W.; Nguyen, Danh H.; Nguyen, Huy

    2012-01-01

To demodulate a communication signal, a receiver must recover and synchronize to the symbol timing of a received waveform. In a system that utilizes digital sampling, the fidelity of synchronization is limited by the time between the symbol boundary and the closest sample time location. To reduce this error, one typically uses a sample clock in excess of the symbol rate in order to provide multiple samples per symbol, thereby lowering the error limit to a fraction of a symbol time. For systems with a large modulation bandwidth, the required sample clock rate is prohibitive due to current technological barriers and processing complexity. With precise control of the phase of the sample clock, one can sample the received signal at times arbitrarily close to the symbol boundary, thus obviating the need, from a synchronization perspective, for multiple samples per symbol. Sample-clock phase-control feedback was developed for use in the demodulation of an optical communication signal, where multi-GHz modulation bandwidths would require prohibitively large sample clock frequencies for rates in excess of the symbol rate. A custom mixed-signal (RF/digital) offset phase-locked loop circuit was developed to control the phase of the 6.4-GHz clock that samples the photon-counting detector output. The offset phase-locked loop is driven by a feedback mechanism that continuously corrects for variation in the symbol time due to motion between the transmitter and receiver as well as oscillator instability. This innovation will allow significant improvements in receiver throughput; for example, the throughput of a pulse-position modulation (PPM) scheme with 16 slots can increase from 188 Mb/s to 1.5 Gb/s.

  17. 40 CFR 761.283 - Determination of the number of samples to collect and sample collection locations.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sampling points after the recleaning, but select three new pairs of sampling coordinates. (i) Beginning in the southwest corner (lower left when facing magnetic north) of the area to be sampled, measure in... new pair of sampling coordinates. Continue to select pairs of sampling coordinates until three are...

  18. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, David R.

    1998-01-01

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis.

  19. A simulative comparison of respondent driven sampling with incentivized snowball sampling – the “strudel effect”

    PubMed Central

    Gyarmathy, V. Anna; Johnston, Lisa G.; Caplinskiene, Irma; Caplinskas, Saulius; Latkin, Carl A.

    2014-01-01

    Background Respondent driven sampling (RDS) and Incentivized Snowball Sampling (ISS) are two sampling methods that are commonly used to reach people who inject drugs (PWID). Methods We generated a set of simulated RDS samples on an actual sociometric ISS sample of PWID in Vilnius, Lithuania (“original sample”) to assess if the simulated RDS estimates were statistically significantly different from the original ISS sample prevalences for HIV (9.8%), Hepatitis A (43.6%), Hepatitis B (Anti-HBc 43.9% and HBsAg 3.4%), Hepatitis C (87.5%), syphilis (6.8%) and Chlamydia (8.8%) infections and for selected behavioral risk characteristics. Results The original sample consisted of a large component of 249 people (83% of the sample) and 13 smaller components with 1 to 12 individuals. Generally, as long as all seeds were recruited from the large component of the original sample, the simulation samples simply recreated the large component. There were no significant differences between the large component and the entire original sample for the characteristics of interest. Altogether 99.2% of 360 simulation sample point estimates were within the confidence interval of the original prevalence values for the characteristics of interest. Conclusions When population characteristics are reflected in large network components that dominate the population, RDS and ISS may produce samples that have statistically non-different prevalence values, even though some isolated network components may be under-sampled and/or statistically significantly different from the main groups. This so-called “strudel effect” is discussed in the paper. PMID:24360650

  20. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Contained in Pre-Charged Equipment or Closed-Cell Foams § 98.434 Monitoring and QA/QC requirements. (a) For... equipment or closed-cell foam in the correct quantities and units. [74 FR 56374, Oct. 30, 2009, as amended...

  1. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the Haines area, Juneau and Skagway quadrangles, southeast Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 212 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the Chilkat, Klehini, Tsirku, and Takhin river drainages, as well as smaller drainages flowing into Chilkat and Chilkoot Inlets near Haines, Skagway Quadrangle, Southeast Alaska. Additionally some samples were also chosen from the Juneau gold belt, Juneau Quadrangle, Southeast Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical

  2. BACTERIOLOGICAL ANALYSIS WITH SAMPLING AND SAMPLE PRESERVATION SPECIFICS

    EPA Science Inventory

    Current federal regulations (40CFR 503) specify that under certain conditions treated municipal biosolids must be analyzed for fecal coliform or salmonellae. The regulations state that representative samples of biosolids must be collected and analyzed using standard methods. Th...

  3. Reconstructing genealogies of serial samples under the assumption of a molecular clock using serial-sample UPGMA.

    PubMed

    Drummond, A; Rodrigo, A G

    2000-12-01

    Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
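
    sUPGMA modifies UPGMA to let tips sampled at different times terminate at different levels; as a hedged illustration of only the standard clustering step it builds on, the sketch below runs plain UPGMA (average linkage) on an invented pairwise distance matrix with SciPy.

```python
import numpy as np
from scipy.cluster.hierarchy import average
from scipy.spatial.distance import squareform

# Illustrative pairwise genetic distances between 4 sequences (symmetric, zero diagonal)
D = np.array([[0.00, 0.10, 0.30, 0.32],
              [0.10, 0.00, 0.28, 0.30],
              [0.30, 0.28, 0.00, 0.08],
              [0.32, 0.30, 0.08, 0.00]])

# UPGMA == average linkage on the condensed distance matrix
Z = average(squareform(D))
print(Z)   # linkage matrix: (cluster_i, cluster_j, merge height, cluster size)
```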

  4. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, D.R.

    1998-02-03

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis. 3 figs.

  5. Sample Handling Considerations for a Europa Sample Return Mission: An Overview

    NASA Technical Reports Server (NTRS)

    Fries, M. D.; Calaway, M. L.; Evans, C. A.; McCubbin, F. M.

    2015-01-01

    The intent of this abstract is to provide a basic overview of mission requirements for a generic Europan plume sample return mission, based on NASA Curation experience in NASA sample return missions ranging from Apollo to OSIRIS-REx. This should be useful for mission conception and early stage planning. We will break the mission down into Outbound and Return legs and discuss them separately.

  6. MEPAG Recommendations for a 2018 Mars Sample Return Caching Lander - Sample Types, Number, and Sizes

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.

    2011-01-01

The return to Earth of geological and atmospheric samples from the surface of Mars is among the highest priority objectives of planetary science. The MEPAG Mars Sample Return (MSR) End-to-End International Science Analysis Group (MEPAG E2E-iSAG) was chartered to propose scientific objectives and priorities for returned sample science, and to map out the implications of these priorities, including for the proposed joint ESA-NASA 2018 mission that would be tasked with the crucial job of collecting and caching the samples. The E2E-iSAG identified four overarching scientific aims that relate to understanding: (A) the potential for life and its pre-biotic context, (B) the geologic processes that have affected the martian surface, (C) the planetary evolution of Mars and its atmosphere, and (D) the potential for future human exploration. The types of samples deemed most likely to achieve the science objectives are, in priority order: (1A) subaqueous or hydrothermal sediments; (1B) hydrothermally altered rocks or low-temperature fluid-altered rocks (equal priority); (2) unaltered igneous rocks; (3) regolith, including airfall dust; and (4) present-day atmosphere and samples of sedimentary-igneous rocks containing ancient trapped atmosphere. Collection of geologically well-characterized sample suites would add considerable value to interpretations of all collected rocks. To achieve this, the total number of rock samples should be about 30-40. In order to evaluate the size of individual samples required to meet the science objectives, the E2E-iSAG reviewed the analytical methods that would likely be applied to the returned samples by preliminary examination teams, for planetary protection (i.e., life detection, biohazard assessment) and, after distribution, by individual investigators. It was concluded that sample size should be sufficient to perform all high-priority analyses in triplicate. In keeping with long-established curatorial practice of extraterrestrial material, at least 40% by

  7. Optimal approaches for inline sampling of organisms in ballast water: L-shaped vs. Straight sample probes

    NASA Astrophysics Data System (ADS)

    Wier, Timothy P.; Moser, Cameron S.; Grant, Jonathan F.; Riley, Scott C.; Robbins-Wamsley, Stephanie H.; First, Matthew R.; Drake, Lisa A.

    2017-10-01

    Both L-shaped ("L") and straight ("Straight") sample probes have been used to collect water samples from a main ballast line in land-based or shipboard verification testing of ballast water management systems (BWMS). A series of experiments was conducted to quantify and compare the sampling efficiencies of L and Straight sample probes. The findings from this research-that both L and Straight probes sample organisms with similar efficiencies-permit increased flexibility for positioning sample probes aboard ships.

  8. Chiral liquid chromatography-mass spectrometry (LC-MS/MS) method development for the detection of salbutamol in urine samples.

    PubMed

    Chan, Sue Hay; Lee, Warren; Asmawi, Mohd Zaini; Tan, Soo Choon

    2016-07-01

A sequential solid-phase extraction (SPE) method was developed and validated using liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS) for the detection and quantification of salbutamol enantiomers in porcine urine. Porcine urine samples were hydrolysed with β-glucuronidase/arylsulfatase from Helix pomatia and then subjected to a double solid-phase extraction (SPE), first using the Abs-Elut Nexus SPE and then the Bond Elut Phenylboronic Acid (PBA) SPE. The salbutamol enantiomers were separated using the Astec CHIROBIOTIC™ T HPLC column (3.0 mm × 100 mm; 5 μm) maintained at 15°C with a 15 min isocratic run at a flow rate of 0.4 mL/min. The mobile phase consisted of 5 mM ammonium formate in methanol. Salbutamol and salbutamol-tert-butyl-d9 (internal standard, IS) were monitored and quantified in the multiple reaction monitoring (MRM) mode. The method showed good linearity for the range of 0.1-10 ng/mL with a limit of quantification of 0.3 ng/mL. Analysis of the QC samples showed intra- and inter-assay precisions to be less than 5.04%, and recovery ranging from 83.82 to 102.33%. Copyright © 2016 Elsevier B.V. All rights reserved.

  9. Comet coma sample return instrument

    NASA Technical Reports Server (NTRS)

    Albee, A. L.; Brownlee, Don E.; Burnett, Donald S.; Tsou, Peter; Uesugi, K. T.

    1994-01-01

The sample collection technology and instrument concept for the Sample of Comet Coma Earth Return Mission (SOCCER) are described. The scientific goals of this Flyby Sample Return are to return coma dust and volatile samples from a known comet source, which will permit accurate elemental and isotopic measurements for thousands of individual solid particles and volatiles, detailed analysis of the dust structure, morphology, and mineralogy of the intact samples, and identification of the biogenic elements or compounds in the solid and volatile samples. With these intact samples, morphologic, petrographic, and phase-structural features can be determined. Information on dust particle size, shape, and density can be ascertained by analyzing penetration holes and tracks in the capture medium. Time and spatial data of dust capture will provide understanding of the flux dynamics of the coma and the jets. Additional information will include the identification of cosmic ray tracks in the cometary grains, which can provide a particle's process history and perhaps even the age of the comet. The measurements will be made with the same equipment used for studying micrometeorites for decades past; hence, the results can be directly compared without extrapolation or modification. The data will provide a powerful and direct technique for comparing the cometary samples with all known types of meteorites and interplanetary dust. This sample collection system will provide the first sample return from a specifically identified primitive body and will allow, for the first time, a direct method of matching meteoritic materials captured on Earth with known parent bodies.

  10. Curation of Samples from Mars

    NASA Astrophysics Data System (ADS)

    Lindstrom, D.; Allen, C.

One of the strong scientific reasons for returning samples from Mars is to search for evidence of current or past life in the samples. Because of the remote possibility that the samples may contain life forms that are hazardous to the terrestrial biosphere, the National Research Council has recommended that all samples returned from Mars be kept under strict biological containment until tests show that they can safely be released to other laboratories. It is possible that Mars samples may contain only scarce or subtle traces of life or prebiotic chemistry that could readily be overwhelmed by terrestrial contamination. Thus, the facilities used to contain, process, and analyze samples from Mars must have a combination of high-level biocontainment and organic/inorganic chemical cleanliness that is unprecedented. We have been conducting feasibility studies and developing designs for a facility that would be at least as capable as current maximum containment BSL-4 (BioSafety Level 4) laboratories, while simultaneously maintaining cleanliness levels exceeding those of the cleanest electronics manufacturing labs. Unique requirements for the processing of Mars samples have inspired a program to develop handling techniques that are much more precise and reliable than the approach (currently used for lunar samples) of employing gloved human hands in nitrogen-filled gloveboxes. Individual samples from Mars are expected to be much smaller than lunar samples, the total mass of samples returned by each mission being 0.5-1 kg, compared with many tens of kg of lunar samples returned by each of the six Apollo missions. Smaller samples require much more of the processing to be done under microscopic observation. In addition, the requirements for cleanliness and high-level containment would be difficult to satisfy while using traditional gloveboxes. JSC has constructed a laboratory to test concepts and technologies important to future sample curation. The Advanced Curation

  11. Nonuniform sampling by quantiles

    NASA Astrophysics Data System (ADS)

    Craft, D. Levi; Sonstrom, Reilly E.; Rovnyak, Virginia G.; Rovnyak, David

    2018-03-01

    A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic, however higher dimensional schedules are similar within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license.
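
    A minimal sketch of the 1D quantile idea (not the QSched program itself) is given below: an assumed exponentially decaying weighting function on a Nyquist grid is split into regions of equal probability, and one sample is taken per region, snapped to the grid.

```python
import numpy as np

def quantile_schedule(n_grid, n_samples, decay=2.0):
    """Pick n_samples points from a Nyquist grid of n_grid points so that
    they follow an exponentially decaying weight: split the cumulative
    weight into n_samples regions of equal probability and take the
    mid-quantile of each region, snapped to the grid."""
    t = np.arange(n_grid)
    w = np.exp(-decay * t / n_grid)                        # assumed weighting function
    cdf = np.cumsum(w) / w.sum()
    probs = (np.arange(n_samples) + 0.5) / n_samples       # mid-quantiles of each region
    idx = np.searchsorted(cdf, probs)
    return np.unique(np.clip(idx, 0, n_grid - 1))

print(quantile_schedule(n_grid=256, n_samples=64))
```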

  12. Apparatus and method for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling

    DOEpatents

    Felix, Larry Gordon; Farthing, William Earl; Irvin, James Hodges; Snyder, Todd Robert

    2010-05-11

    A dilution apparatus for diluting a gas sample. The apparatus includes a sample gas conduit having a sample gas inlet end and a diluted sample gas outlet end, and a sample gas flow restricting orifice disposed proximate the sample gas inlet end connected with the sample gas conduit and providing fluid communication between the exterior and the interior of the sample gas conduit. A diluted sample gas conduit is provided within the sample gas conduit having a mixing end with a mixing space inlet opening disposed proximate the sample gas inlet end, thereby forming an annular space between the sample gas conduit and the diluted sample gas conduit. The mixing end of the diluted sample gas conduit is disposed at a distance from the sample gas flow restricting orifice. A dilution gas source connected with the sample gas inlet end of the sample gas conduit is provided for introducing a dilution gas into the annular space, and a filter is provided for filtering the sample gas. The apparatus is particularly suited for diluting heated sample gases containing one or more condensable components.

  13. Sample selection and preservation techniques for the Mars sample return mission

    NASA Technical Reports Server (NTRS)

    Tsay, Fun-Dow

    1988-01-01

    It is proposed that a miniaturized electron spin resonance (ESR) spectrometer be developed as an effective, nondestructivew sample selection and characterization instrument for the Mars Rover Sample Return mission. The ESR instrument can meet rover science payload requirements and yet has the capability and versatility to perform the following in situ Martian sample analyses: (1) detection of active oxygen species, and characterization of Martian surface chemistry and photocatalytic oxidation processes; (2) determination of paramagnetic Fe(3+) in clay silicate minerals, Mn(2+) in carbonates, and ferromagnetic centers of magnetite, maghemite and hematite; (3) search for organic compounds in the form of free radicals in subsoil, and detection of Martian fossil organic matter likely to be associated with carbonate and other sedimentary deposits. The proposed instrument is further detailed.

  14. 40 CFR 761.292 - Chemical extraction and analysis of individual samples and composite samples.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... individual samples and composite samples. 761.292 Section 761.292 Protection of Environment ENVIRONMENTAL... Cleanup and On-Site Disposal of Bulk PCB Remediation Waste and Porous Surfaces in Accordance With § 761... individual and composite samples of PCB remediation waste. Use Method 8082 from SW-846, or a method validated...

  15. 40 CFR 761.292 - Chemical extraction and analysis of individual samples and composite samples.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... individual samples and composite samples. 761.292 Section 761.292 Protection of Environment ENVIRONMENTAL... Cleanup and On-Site Disposal of Bulk PCB Remediation Waste and Porous Surfaces in Accordance With § 761... individual and composite samples of PCB remediation waste. Use Method 8082 from SW-846, or a method validated...

  16. 40 CFR 761.292 - Chemical extraction and analysis of individual samples and composite samples.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... individual samples and composite samples. 761.292 Section 761.292 Protection of Environment ENVIRONMENTAL... Cleanup and On-Site Disposal of Bulk PCB Remediation Waste and Porous Surfaces in Accordance With § 761... individual and composite samples of PCB remediation waste. Use Method 8082 from SW-846, or a method validated...

  17. 40 CFR 761.292 - Chemical extraction and analysis of individual samples and composite samples.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... individual samples and composite samples. 761.292 Section 761.292 Protection of Environment ENVIRONMENTAL... Cleanup and On-Site Disposal of Bulk PCB Remediation Waste and Porous Surfaces in Accordance With § 761... individual and composite samples of PCB remediation waste. Use Method 8082 from SW-846, or a method validated...

  18. Sample positioning in microgravity

    NASA Technical Reports Server (NTRS)

    Sridharan, Govind (Inventor)

    1991-01-01

    Repulsion forces arising from laser beams are provided to produce mild positioning forces on a sample in microgravity vacuum environments. The system of the preferred embodiment positions samples using a plurality of pulsed lasers providing opposing repulsion forces. The lasers are positioned around the periphery of a confinement area and expanded to create a confinement zone. The grouped laser configuration, in coordination with position sensing devices, creates a feedback servo whereby stable position control of a sample within microgravity environment can be achieved.

  19. Sample positioning in microgravity

    NASA Technical Reports Server (NTRS)

    Sridharan, Govind (Inventor)

    1993-01-01

    Repulsion forces arising from laser beams are provided to produce mild positioning forces on a sample in microgravity vacuum environments. The system of the preferred embodiment positions samples using a plurality of pulsed lasers providing opposing repulsion forces. The lasers are positioned around the periphery of a confinement area and expanded to create a confinement zone. The grouped laser configuration, in coordination with position sensing devices, creates a feedback servo whereby stable position control of a sample within microgravity environment can be achieved.

  20. Planetary Sample Caching System Design Options

    NASA Technical Reports Server (NTRS)

    Collins, Curtis; Younse, Paulo; Backes, Paul

    2009-01-01

Potential Mars Sample Return missions would aspire to collect small core and regolith samples using a rover with a sample acquisition tool and sample caching system. Samples would need to be stored in individual sealed tubes in a canister that could be transferred to a Mars ascent vehicle and returned to Earth. A sample handling, encapsulation and containerization system (SHEC) has been developed as part of an integrated system for acquiring and storing core samples for application to future potential MSR and other potential sample return missions. Requirements and design options for the SHEC system were studied and a recommended design concept developed. Two families of solutions were explored: 1) transfer of a raw sample from the tool to the SHEC subsystem and 2) transfer of a tube containing the sample to the SHEC subsystem. The recommended design utilizes sample tool bit change-out as the mechanism for transferring tubes to, and samples in tubes from, the tool. The SHEC subsystem design, called the Bit Changeout Caching (BiCC) design, is intended for operations on a MER-class rover.

  1. Sampling Strategy

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Three locations to the right of the test dig area are identified for the first samples to be delivered to the Thermal and Evolved Gas Analyzer (TEGA), the Wet Chemistry Lab (WCL), and the Optical Microscope (OM) on NASA's Phoenix Mars Lander. These sampling areas are informally labeled 'Baby Bear', 'Mama Bear', and 'Papa Bear' respectively. This image was taken on the seventh day of the Mars mission, or Sol 7 (June 1, 2008) by the Surface Stereo Imager aboard NASA's Phoenix Mars Lander.

    The Phoenix Mission is led by the University of Arizona, Tucson, on behalf of NASA. Project management of the mission is by NASA's Jet Propulsion Laboratory, Pasadena, Calif. Spacecraft development is by Lockheed Martin Space Systems, Denver.

  2. Method and apparatus for data sampling

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
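
    As a hedged sketch of the idea described above (not the patented implementation), the fragment below simulates a detector output, quantizes it to 12-bit digital values at a high sample rate, and then digitally discriminates the samples so that only those attributable to detected events are retained; the pulse shape, rates and threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Simulated detector output: baseline noise plus a few pulse-shaped events
fs = 1_000_000                                   # samples per second
t = np.arange(5000) / fs
signal = rng.normal(0.0, 0.02, t.size)
for t0 in (0.8e-3, 2.1e-3, 3.9e-3):
    signal += 1.0 * np.exp(-(t - t0) / 50e-6) * (t >= t0)

# "A/D conversion": quantize to 12-bit digital values
digital = np.clip(np.round((signal + 1.0) / 2.0 * 4095), 0, 4095).astype(np.int16)

# Digital discrimination: keep only samples that exceed a noise threshold
threshold = np.round((0.1 + 1.0) / 2.0 * 4095)
event_samples = np.flatnonzero(digital > threshold)
print(f"{event_samples.size} of {digital.size} samples attributed to events")
```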

  3. Apparatus for sectioning demountable semiconductor samples

    DOEpatents

    Sopori, Bhushan L.; Wolf, Abraham

    1984-01-01

    Apparatus for use during polishing and sectioning operations of a ribbon sample is described. The sample holder includes a cylinder having an axially extending sample cavity terminated in a first funnel-shaped opening and a second slot-like opening. A spring-loaded pressure plunger is located adjacent the second opening of the sample cavity for frictional engagement of the sample prior to introduction of a molding medium in the sample cavity. A heat softenable molding medium is inserted in the funnel-shaped opening, to surround the sample. After polishing, the heater is energized to allow draining of the molding medium from the sample cavity. During manual polishing, the second end of the sample holder is inserted in a support ring which provides mechanical support as well as alignment of the sample holder during polishing. A gauge block for measuring the protrusion of a sample beyond the second wall of the holder is also disclosed.

  4. Improving Lab Sample Management - POS/MCEARD

    EPA Science Inventory

    "Scientists face increasing challenges in managing their laboratory samples, including long-term storage of legacy samples, tracking multiple aliquots of samples for many experiments, and linking metadata to these samples. Other factors complicating sample management include the...

  5. Analysis of the Touch-And-Go Surface Sampling Concept for Comet Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Mandic, Milan; Acikmese, Behcet; Bayard, David S.; Blackmore, Lars

    2012-01-01

    This paper studies the Touch-and-Go (TAG) concept for enabling a spacecraft to take a sample from the surface of a small primitive body, such as an asteroid or comet. The idea behind the TAG concept is to let the spacecraft descend to the surface, make contact with the surface for several seconds, and then ascend to a safe location. Sampling would be accomplished by an end-effector that is active during the few seconds of surface contact. The TAG event is one of the most critical events in a primitive body sample-return mission. The purpose of this study is to evaluate the dynamic behavior of a representative spacecraft during the TAG event, i.e., immediately prior, during, and after surface contact of the sampler. The study evaluates the sample-collection performance of the proposed sampling end-effector, in this case a brushwheel sampler, while acquiring material from the surface during the contact. A main result of the study is a guidance and control (G&C) validation of the overall TAG concept, in addition to specific contributions to demonstrating the effectiveness of using nonlinear clutch mechanisms in the sampling arm joints, and increasing the length of the sampling arms to improve robustness.

  6. Recommendations for representative ballast water sampling

    NASA Astrophysics Data System (ADS)

    Gollasch, Stephan; David, Matej

    2017-05-01

    Until now, the purpose of ballast water sampling studies was predominantly limited to general scientific interest to determine the variety of species arriving in ballast water in a recipient port. Knowing the variety of species arriving in ballast water also contributes to the assessment of relative species introduction vector importance. Further, some sampling campaigns addressed awareness raising or the determination of organism numbers per water volume to evaluate the species introduction risk by analysing the propagule pressure of species. A new aspect of ballast water sampling, which this contribution addresses, is compliance monitoring and enforcement of ballast water management standards as set by, e.g., the IMO Ballast Water Management Convention. To achieve this, sampling methods which result in representative ballast water samples are essential. We recommend such methods based on practical tests conducted on two commercial vessels also considering results from our previous studies. The results show that different sampling approaches influence the results regarding viable organism concentrations in ballast water samples. It was observed that the sampling duration (i.e., length of the sampling process), timing (i.e., in which point in time of the discharge the sample is taken), the number of samples and the sampled water quantity are the main factors influencing the concentrations of viable organisms in a ballast water sample. Based on our findings we provide recommendations for representative ballast water sampling.

  7. GEOTHERMAL EFFLUENT SAMPLING WORKSHOP

    EPA Science Inventory

    This report outlines the major recommendations resulting from a workshop to identify gaps in existing geothermal effluent sampling methodologies, define needed research to fill those gaps, and recommend strategies to lead to a standardized sampling methodology.

  8. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    PubMed Central

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

Proteins regulate many cellular functions, and analyzing the presence and abundance of proteins in biological samples is a central focus of proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of protein to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample sizes (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  9. Apparatus and method for handheld sampling

    DOEpatents

    Staab, Torsten A.

    2005-09-20

    The present invention includes an apparatus, and corresponding method, for taking a sample. The apparatus is built around a frame designed to be held in at least one hand. A sample media is used to secure the sample. A sample media adapter for securing the sample media is operated by a trigger mechanism connectively attached within the frame to the sample media adapter.

  10. Synchronizing data from irregularly sampled sensors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Uluyol, Onder

    A system and method include receiving a set of sampled measurements for each of multiple sensors, wherein the sampled measurements are at irregular intervals or different rates, re-sampling the sampled measurements of each of the multiple sensors at a higher rate than one of the sensor's set of sampled measurements, and synchronizing the sampled measurements of each of the multiple sensors.
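
    A minimal sketch of the described re-sampling step, under the assumption that linear interpolation onto a common grid is acceptable, is shown below; the sensor streams and target rate are invented.

```python
import numpy as np

def synchronize(sensors, rate):
    """Re-sample each (timestamps, values) stream onto a common time grid at
    `rate` Hz (chosen higher than any sensor's own rate) using linear
    interpolation, so all streams share the same sample instants."""
    t_start = max(ts[0] for ts, _ in sensors)
    t_end = min(ts[-1] for ts, _ in sensors)
    grid = np.arange(t_start, t_end, 1.0 / rate)
    return grid, [np.interp(grid, ts, vals) for ts, vals in sensors]

# Two sensors with irregular, unaligned timestamps
rng = np.random.default_rng(4)
s1 = (np.sort(rng.uniform(0, 10, 40)), rng.normal(size=40))
s2 = (np.sort(rng.uniform(0, 10, 25)), rng.normal(size=25))
grid, resampled = synchronize([s1, s2], rate=20.0)
print(grid.shape, resampled[0].shape, resampled[1].shape)
```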

  11. Adaptive Peer Sampling with Newscast

    NASA Astrophysics Data System (ADS)

    Tölgyesi, Norbert; Jelasity, Márk

The peer sampling service is a middleware service that provides random samples from a large decentralized network to support gossip-based applications such as multicast, data aggregation and overlay topology management. Lightweight gossip-based implementations of the peer sampling service have been shown to provide good-quality random sampling while also being extremely robust to many failure scenarios, including node churn and catastrophic failure. We identify two problems with these approaches. The first problem is related to message drop failures: if a node experiences a higher-than-average message drop rate, then the probability of sampling this node in the network will decrease. The second problem is that the application layer at different nodes might request random samples at very different rates, which can result in very poor random sampling, especially at nodes with high request rates. We propose solutions for both problems. We focus on Newscast, a robust implementation of the peer sampling service. Our solution is based on simple extensions of the protocol and an adaptive self-control mechanism for its parameters: without involving failure detectors, nodes passively monitor local protocol events and use them as feedback in a local control loop that self-tunes the protocol parameters. The proposed solution is evaluated by simulation experiments.
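
    As a hedged, simplified sketch of a Newscast-style peer sampling exchange (not the authors' adaptive extension), the fragment below keeps a fixed-size cache of (peer, timestamp) descriptors per node, merges caches on each gossip round, retains the freshest entries, and serves peer-sampling requests from the cache; the cache size, network size and bootstrap list are invented.

```python
import random

random.seed(1)
CACHE_SIZE = 8

class Node:
    def __init__(self, node_id, bootstrap):
        self.id = node_id
        self.cache = {peer: 0 for peer in bootstrap}     # peer id -> timestamp

    def gossip(self, other, now):
        """One Newscast-style exchange: merge both caches plus fresh
        self-descriptors, then each side keeps the freshest entries
        (excluding its own descriptor)."""
        merged = {**self.cache, **other.cache, self.id: now, other.id: now}
        freshest = sorted(merged.items(), key=lambda kv: -kv[1])
        self.cache = dict([kv for kv in freshest if kv[0] != self.id][:CACHE_SIZE])
        other.cache = dict([kv for kv in freshest if kv[0] != other.id][:CACHE_SIZE])

    def sample_peer(self):
        return random.choice(list(self.cache))           # peer-sampling service call

nodes = {i: Node(i, bootstrap=[0]) for i in range(50)}
for rnd in range(30):
    for n in nodes.values():
        partner = nodes[n.sample_peer()]
        if partner is not n:
            n.gossip(partner, rnd)
print(nodes[7].sample_peer())
```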

  12. Method and apparatus for data sampling

    DOEpatents

    Odell, D.M.C.

    1994-04-19

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples is described. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.

  13. Solvent Hold Tank Sample Results for MCU-16-991-992-993: July 2016 Monthly sample and MCU-16-1033-1034-1035: July 2016 Superwashed Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fondeur, F.; Jones, D.

SRNL received one set of SHT samples (MCU-16-991, MCU-16-992 and MCU-16-993), pulled on 07/13/2016, and another set of SHT samples (MCU-16-1033, MCU-16-1034, and MCU-16-1035), pulled on 07/24/2016 after the solvent was superwashed with 300 mM sodium hydroxide, for analysis. Samples MCU-16-991, MCU-16-992, and MCU-16-993 were combined into one sample (MCU-16-991-992-993), and samples MCU-16-1033, MCU-16-1034, and MCU-16-1035 were combined into one sample (MCU-16-1033-1034-1035). Of the two composite samples, MCU-16-1033-1034-1035 represents the current chemical state of the solvent at MCU. All analytical conclusions are based on the chemical analysis of MCU-16-1033-1034-1035. There were no chemical differences between MCU-16-991-992-993 and the superwashed MCU-16-1033-1034-1035.

  14. A comprehensive quality control workflow for paired tumor-normal NGS experiments.

    PubMed

    Schroeder, Christopher M; Hilke, Franz J; Löffler, Markus W; Bitzer, Michael; Lenz, Florian; Sturm, Marc

    2017-06-01

Quality control (QC) is an important part of all NGS data analysis stages. Many available tools calculate QC metrics from different analysis steps of single-sample experiments (raw reads, mapped reads and variant lists). Multi-sample experiments, such as sequencing of tumor-normal pairs, require additional QC metrics to ensure the validity of results. These multi-sample QC metrics still lack standardization. We therefore suggest a new workflow for QC of DNA sequencing of tumor-normal pairs. With this workflow, well-known single-sample QC metrics and additional metrics specific to tumor-normal pairs can be calculated. The segmentation into different tools offers high flexibility and allows reuse for other purposes. All tools produce qcML, a generic XML format for QC of -omics experiments. qcML uses quality metrics defined in an ontology, which was adapted for NGS. All QC tools are implemented in C++ and run both under Linux and Windows. Plotting requires python 2.7 and matplotlib. The software is available under the 'GNU General Public License version 2' as part of the ngs-bits project: https://github.com/imgag/ngs-bits. christopher.schroeder@med.uni-tuebingen.de. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  15. Mars Sample Quarantine Protocol Workshop

    NASA Technical Reports Server (NTRS)

    DeVincenzi, Donald L. (Editor); Bagby, John (Editor); Race, Margaret (Editor); Rummel, John (Editor)

    1999-01-01

    The Mars Sample Quarantine Protocol (QP) Workshop was convened to deal with three specific aspects of the initial handling of a returned Mars sample: 1) biocontainment, to prevent uncontrolled release of sample material into the terrestrial environment; 2) life detection, to examine the sample for evidence of live organisms; and 3) biohazard testing, to determine if the sample poses any threat to terrestrial life forms and the Earth's biosphere. During the first part of the Workshop, several tutorials were presented on topics related to the workshop in order to give all participants a common basis in the technical areas necessary to achieve the objectives of the Workshop.

  16. A novel QC-LDPC code based on the finite field multiplicative group for optical communications

    NASA Astrophysics Data System (ADS)

    Yuan, Jian-guo; Xu, Liang; Tong, Qing-zhen

    2013-09-01

    A novel construction method for quasi-cyclic low-density parity-check (QC-LDPC) codes is proposed based on the finite field multiplicative group; it offers simpler construction, more flexible adjustment of code length and code rate, and lower encoding/decoding complexity. Moreover, a regular QC-LDPC(5334,4962) code is constructed. The simulation results show that the constructed QC-LDPC(5334,4962) code achieves better error-correction performance over the additive white Gaussian noise (AWGN) channel with iterative sum-product algorithm (SPA) decoding. At a bit error rate (BER) of 10^-6, the net coding gain (NCG) of the constructed QC-LDPC(5334,4962) code is 1.8 dB, 0.9 dB and 0.2 dB higher than that of the classic RS(255,239) code in ITU-T G.975, the LDPC(32640,30592) code in ITU-T G.975.1 and the SCG-LDPC(3969,3720) code constructed by the random method, respectively, making it more suitable for optical communication systems.
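
    The defining feature of a QC-LDPC code is that its parity-check matrix is assembled from circulant permutation matrices whose shift values come from a base (exponent) matrix; in the paper above those shifts are derived from a finite field multiplicative group. The sketch below shows only the generic expansion step from an arbitrary exponent matrix to H, with a small made-up exponent matrix; it does not reproduce the authors' finite-field construction or the (5334,4962) code.

```python
import numpy as np

def circulant_permutation(shift, size):
    """Return the size x size identity matrix cyclically shifted by 'shift' columns."""
    return np.roll(np.eye(size, dtype=np.uint8), shift, axis=1)

def expand_exponent_matrix(exponents, size):
    """Expand a base exponent matrix into a binary QC-LDPC parity-check matrix H.

    Each entry e >= 0 becomes a size x size circulant permutation matrix with
    shift e; the conventional entry -1 becomes an all-zero block.
    """
    rows = []
    for row in exponents:
        blocks = [np.zeros((size, size), dtype=np.uint8) if e < 0
                  else circulant_permutation(e, size) for e in row]
        rows.append(np.hstack(blocks))
    return np.vstack(rows)

# Toy example (shift values chosen arbitrarily for illustration only).
E = [[0, 1, 2, -1],
     [2, 0, -1, 1],
     [-1, 2, 1, 0]]
H = expand_exponent_matrix(E, size=5)
print(H.shape)        # (15, 20) for this toy example
print(H.sum(axis=0))  # column weights of the resulting sparse matrix
```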

  17. Where Will All Your Samples Go?

    NASA Astrophysics Data System (ADS)

    Lehnert, K.

    2017-12-01

    Even in the digital age, physical samples remain an essential component of Earth and space science research. Geoscientists collect samples, sometimes locally, often in remote locations during expensive field expeditions, or at sample repositories and museums. They take these samples to their labs to describe and analyze them. When the analyses are completed and the results are published, the samples get stored away in sheds, basements, or desk drawers, where they remain unknown and inaccessible to the broad science community. In some cases, they will get re-analyzed or shared with other researchers, who know of their existence through personal connections. The sad end comes when the researcher retires: There are many stories of samples and entire collections being discarded to free up space for new samples or other purposes, even though these samples may be unique and irreplaceable. Institutions do not feel obligated and do not have the resources to store samples in perpetuity. Only samples collected in large sampling campaigns such as the Ocean Discovery Program or cores taken on ships find a home in repositories that curate and preserve them for reuse in future science endeavors. In the era of open, transparent, and reproducible science, preservation and persistent access to samples must be considered a mandate. Policies need to be developed that guide investigators, institutions, and funding agencies to plan and implement solutions for reliably and persistently curating and providing access to samples. Registration of samples in online catalogs and use of persistent identifiers such as the International Geo Sample Number are first steps to ensure discovery and access of samples. But digital discovery and access loses its value if the physical objects are not preserved and accessible. It is unreasonable to expect that every sample ever collected can be archived. Selection of those samples that are worth preserving requires guidelines and policies. We also need to

  18. GROUND WATER SAMPLING ISSUES

    EPA Science Inventory

    Obtaining representative ground water samples is important for site assessment and
    remedial performance monitoring objectives. Issues which must be considered prior to initiating a ground-water monitoring program include defining monitoring goals and objectives, sampling point...

  19. Molecular-beam gas-sampling system

    NASA Technical Reports Server (NTRS)

    Young, W. S.; Knuth, E. L.

    1972-01-01

    A molecular beam mass spectrometer system for rocket motor combustion chamber sampling is described. The history of the sampling system is reviewed. The problems associated with rocket motor combustion chamber sampling are reported. Several design equations are presented. The results of the experiments include the effects of cooling water flow rates, the optimum separation gap between the end plate and sampling nozzle, and preliminary data on compositions in a rocket motor combustion chamber.

  20. Apparatus for sectioning demountable semiconductor samples

    DOEpatents

    Sopori, B.L.; Wolf, A.

    1984-01-01

    Apparatus for use during polishing and sectioning operations of a ribbon sample is described. The sample holder includes a cylinder having an axially extending sample cavity terminated in a first funnel-shaped opening and a second slot-like opening. A spring-loaded pressure plunger is located adjacent the second opening of the sample cavity for frictional engagement of the sample cavity. A heat softenable molding medium is inserted in the funnel-shaped opening, to surround the sample. After polishing, the heater is energized to allow draining of the molding medium from the sample cavity. During manual polishing, the second end of the sample holder is inserted in a support ring which provides mechanical support as well as alignment of the sample holder during polishing. A gauge block for measuring the protrusion of a sample beyond the second wall of the holder is also disclosed.

  1. Systematic sampling of discrete and continuous populations: sample selection and the choice of estimator

    Treesearch

    Harry T. Valentine; David L. R. Affleck; Timothy G. Gregoire

    2009-01-01

    Systematic sampling is easy, efficient, and widely used, though it is not generally recognized that a systematic sample may be drawn from the population of interest with or without restrictions on randomization. The restrictions or the lack of them determine which estimators are unbiased, when using the sampling design as the basis for inference. We describe the...

  2. Nonuniform sampling by quantiles.

    PubMed

    Craft, D Levi; Sonstrom, Reilly E; Rovnyak, Virginia G; Rovnyak, David

    2018-03-01

    A flexible strategy for choosing samples nonuniformly from a Nyquist grid using the concept of statistical quantiles is presented for broad classes of NMR experimentation. Quantile-directed scheduling is intuitive and flexible for any weighting function, promotes reproducibility and seed independence, and is generalizable to multiple dimensions. In brief, weighting functions are divided into regions of equal probability, which define the samples to be acquired. Quantile scheduling therefore achieves close adherence to a probability distribution function, thereby minimizing gaps for any given degree of subsampling of the Nyquist grid. A characteristic of quantile scheduling is that one-dimensional, weighted NUS schedules are deterministic, whereas higher-dimensional schedules are similar to one another within a user-specified jittering parameter. To develop unweighted sampling, we investigated the minimum jitter needed to disrupt subharmonic tracts, and show that this criterion can be met in many cases by jittering within 25-50% of the subharmonic gap. For nD-NUS, three supplemental components to choosing samples by quantiles are proposed in this work: (i) forcing the corner samples to ensure sampling to specified maximum values in indirect evolution times, (ii) providing an option to triangular backfill sampling schedules to promote dense/uniform tracts at the beginning of signal evolution periods, and (iii) providing an option to force the edges of nD-NUS schedules to be identical to the 1D quantiles. Quantile-directed scheduling meets the diverse needs of current NUS experimentation, but can also be used for future NUS implementations such as off-grid NUS and more. A computer program implementing these principles (a.k.a. QSched) in 1D- and 2D-NUS is available under the general public license.
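
    The core idea described above, splitting a weighting function into regions of equal probability and taking one sample per region, can be sketched in a few lines. The exponential weighting function, grid size, and sample count below are illustrative assumptions and not the QSched implementation itself.

```python
import numpy as np

def quantile_schedule(weights, n_samples):
    """Pick n_samples grid points from a 1-D weighting function by quantiles.

    The weighting function is normalized to a probability distribution, its
    cumulative distribution is formed, and one sample is taken at the midpoint
    of each of n_samples equal-probability regions (deterministic in 1-D).
    """
    pdf = np.asarray(weights, dtype=float)
    pdf = pdf / pdf.sum()
    cdf = np.cumsum(pdf)
    # Midpoints of the n_samples equal-probability bins.
    targets = (np.arange(n_samples) + 0.5) / n_samples
    # First grid index whose cumulative probability reaches each target.
    picks = np.searchsorted(cdf, targets)
    return np.unique(picks)

# Example: exponentially decaying weights over a 128-point Nyquist grid,
# subsampled to 32 increments (both numbers are arbitrary choices).
grid = np.arange(128)
schedule = quantile_schedule(np.exp(-grid / 40.0), 32)
print(schedule)
```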

  3. Core sample extractor

    NASA Technical Reports Server (NTRS)

    Akins, James; Cobb, Billy; Hart, Steve; Leaptrotte, Jeff; Milhollin, James; Pernik, Mark

    1989-01-01

    The problem of retrieving and storing core samples from a hole drilled on the lunar surface is addressed. The total depth of the hole in question is 50 meters with a maximum diameter of 100 millimeters. The core sample itself has a diameter of 60 millimeters and will be two meters in length. It is therefore necessary to retrieve and store 25 core samples per hole. The design utilizes a control system that will stop the mechanism at a certain depth, a cam-linkage system that will fracture the core, and a storage system that will save and catalogue the cores to be extracted. The Rod Changer and Storage Design Group will provide the necessary tooling to get into the hole as well as to the core. The mechanical design for the cam-linkage system as well as the conceptual design of the storage device are described.

  4. Lunar Samples - Apollo 17

    NASA Image and Video Library

    1972-12-27

    S72-56362 (27 Dec. 1972) --- Scientist-astronaut Harrison H. "Jack" Schmitt (facing camera), Apollo 17 lunar module pilot, was one of the first to look at the sample of "orange" soil which was brought back from the Taurus-Littrow landing site by the Apollo 17 crewmen. Schmitt discovered the material at Shorty Crater during the second Apollo 17 extravehicular activity (EVA). The "orange" sample, which was opened Wednesday, Dec. 27, 1972, is in the bag on a weighing platform in the sealed nitrogen cabinet in the upstairs processing line in the Lunar Receiving Laboratory at the Manned Spacecraft Center. Just before, the sample was removed from one of the bolt-top cans visible to the left in the cabinet. The first reaction of Schmitt was "It doesn't look the same." Most of the geologists and staff viewing the sample agreed that it was more tan and brown than orange. Closer comparison with color charts showed that the sample had a definite orange cast, according to MSC geology branch chief William Phinney. After closer investigation and sieving, it was discovered that the orange color was caused by very fine spheres and fragments of orange glass in the midst of darker colored, larger grain material. Earlier in the day the "orange" soil was taken from the Apollo Lunar Sample Return Container No. 2 and placed in the bolt-top can (as was all the material in the ALSRC "rock box").

  5. 40 CFR 1065.150 - Continuous sampling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Continuous sampling. 1065.150 Section... ENGINE-TESTING PROCEDURES Equipment Specifications § 1065.150 Continuous sampling. You may use continuous sampling techniques for measurements that involve raw or dilute sampling. Make sure continuous sampling...

  6. 40 CFR 1065.150 - Continuous sampling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Continuous sampling. 1065.150 Section... ENGINE-TESTING PROCEDURES Equipment Specifications § 1065.150 Continuous sampling. You may use continuous sampling techniques for measurements that involve raw or dilute sampling. Make sure continuous sampling...

  7. SAMPLING SYSTEM

    DOEpatents

    Hannaford, B.A.; Rosenberg, R.; Segaser, C.L.; Terry, C.L.

    1961-01-17

    An apparatus is given for the batch sampling of radioactive liquids such as slurries from a system by remote control, while providing shielding for protection of operating personnel from the harmful effects of radiation.

  8. Mars Sample Return Architecture Overview

    NASA Astrophysics Data System (ADS)

    Edwards, C. D.; Vijendran, S.

    2018-04-01

    NASA and ESA are exploring potential concepts for a Sample Retrieval Lander and Earth Return Orbiter that could return samples planned to be collected and cached by the Mars 2020 rover mission. We provide an overview of the Mars Sample Return architecture.

  9. 40 CFR 98.434 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98.434 Section 98.434 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Importers and Exporters of Fluorinated Greenhouse Gases...

  10. Monitoring of occupational exposure to methylene chloride: sampling protocol and stability of urine samples.

    PubMed

    Hoffer, Erica; Tabak, Arek; Shcherb, Inna; Wiener, Avi; Bentur, Yedidia

    2005-01-01

    A sampling protocol for biomonitoring of the volatile solvent methylene chloride (MeCl2) by analysis of urine from exposed workers was established. The effects of storage temperature, sample volume in the headspace vial (HSV), and time to sealing the HSV on the determination of MeCl2 in urine were evaluated. MeCl2 was analyzed by a solid-phase microextraction technique combined with gas chromatography. The volume of urine in the HSV had no effect on MeCl2 analysis. Delays of 30 and 60 min from collection of urine until sealing the HSV caused 14.47 +/- 6.98% and 26.17 +/- 9.57% decreases from the baseline concentration, respectively. The MeCl2 concentration in spiked urine samples stored in sealed HSVs decreased on day 2 and then remained stable for 2 weeks. Refrigeration did not improve recovery, although it appeared to be associated with less variability. MeCl2 in urine samples of seven exposed workers was in the range of 0.02-0.06 mg/L. Sampling of MeCl2-containing urine should include collection of urine in closed plastic bottles, transfer to the HSV within 15 min, sealing and clamping of the HSV within 15 s, and refrigerated storage of the HSV until analysis, but for no longer than 2 weeks. Standard samples should be prepared on the day of test sample collection and handled under the same conditions.

  11. Mold Testing or Sampling

    EPA Pesticide Factsheets

    In most cases, if visible mold growth is present, sampling is unnecessary. Since no EPA or other federal limits have been set for mold or mold spores, sampling cannot be used to check a building's compliance with federal mold standards.

  12. Micro-organism distribution sampling for bioassays

    NASA Technical Reports Server (NTRS)

    Nelson, B. A.

    1975-01-01

    The purpose of the sampling distribution is to characterize sample-to-sample variation so that statistical tests may be applied, to estimate error due to sampling (confidence limits), and to evaluate observed differences between samples. The distribution could be used for bioassays taken in hospitals, breweries, food-processing plants, and pharmaceutical plants.

  13. Field Exploration and Life Detection Sampling Through Planetary Analogue Sampling (FELDSPAR).

    NASA Technical Reports Server (NTRS)

    Stockton, A.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Gentry, D. M.; Kirby, J.; Jacobsen, M.; hide

    2017-01-01

    Exploration missions to Mars rely on rovers to perform analyses over small sampling areas; however, landing sites for these missions are selected based on large-scale, low-resolution remote data. The use of Earth analogue environments to estimate the multi-scale spatial distributions of key signatures of habitability can help ensure mission science goals are met. A main goal of FELDSPAR is to conduct field operations analogous to Mars sample return in science, operations, and technology, from landing site selection to in-field sampling location selection, remote or stand-off analysis, in situ analysis, and home laboratory analysis. Lava fields and volcanic regions are relevant analogues to Martian landscapes due to desiccation, low nutrient availability, and temperature extremes. Operationally, many Icelandic lava fields are remote enough that field expeditions must address several sampling constraints that are experienced in robotic exploration, including in situ and sample return missions. The Fimmvörðuháls lava field was formed by a basaltic effusive eruption associated with the 2010 Eyjafjallajökull eruption. Mælifellssandur is a recently deglaciated plain to the north of the Mýrdalsjökull glacier. Holuhraun was formed by 2014 fissure eruptions just north of the large Vatnajökull glacier. Dyngjusandur is an alluvial plain apparently kept barren by repeated mechanical weathering. Informed by our 2013 expedition, we collected samples in nested triangular grids at every decade of scale from 10 cm to 1 km (as permitted by the size of the site). Satellite imagery is available for older sites, and for Mælifellssandur, Holuhraun, and Dyngjusandur we obtained overhead imagery at 1 m to 200 m elevation. PanCam-style photographs were taken in the field by sampling personnel. In-field reflectance spectroscopy was also obtained with an ASD spectrometer at Dyngjusandur. All sites chosen were 'homogeneous' in apparent color, morphology, moisture, grain size, and

  14. Implications of sampling design and sample size for national carbon accounting systems

    Treesearch

    Michael Köhl; Andrew Lister; Charles T. Scott; Thomas Baldauf; Daniel Plugge

    2011-01-01

    Countries willing to adopt a REDD regime need to establish a national Measurement, Reporting and Verification (MRV) system that provides information on forest carbon stocks and carbon stock changes. Due to the extensive areas covered by forests the information is generally obtained by sample based surveys. Most operational sampling approaches utilize a combination of...

  15. Ball assisted device for analytical surface sampling

    DOEpatents

    ElNaggar, Mariam S; Van Berkel, Gary J; Covey, Thomas R

    2015-11-03

    A system for sampling a surface includes a sampling probe having a housing and a socket, and a rolling sampling sphere within the socket. The housing has a sampling fluid supply conduit and a sampling fluid exhaust conduit. The sampling fluid supply conduit supplies sampling fluid to the sampling sphere. The sampling fluid exhaust conduit has an inlet opening for receiving sampling fluid carried from the surface by the sampling sphere. A surface sampling probe and a method for sampling a surface are also disclosed.

  16. 7 CFR 75.18 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Sampling. 75.18 Section 75.18 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections... CERTIFICATION OF QUALITY OF AGRICULTURAL AND VEGETABLE SEEDS Inspection § 75.18 Sampling. Sampling, when...

  17. 7 CFR 75.18 - Sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Sampling. 75.18 Section 75.18 Agriculture Regulations of the Department of Agriculture (Continued) AGRICULTURAL MARKETING SERVICE (Standards, Inspections... CERTIFICATION OF QUALITY OF AGRICULTURAL AND VEGETABLE SEEDS Inspection § 75.18 Sampling. Sampling, when...

  18. 7 CFR 28.906 - Sampling arrangements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Sampling arrangements. 28.906 Section 28.906... Producers Sampling § 28.906 Sampling arrangements. (a) Cotton must be sampled by a gin or warehouse that... an authorized representative may direct that sampling be performed by employees of the Department of...

  19. 7 CFR 28.906 - Sampling arrangements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Sampling arrangements. 28.906 Section 28.906... Producers Sampling § 28.906 Sampling arrangements. (a) Cotton must be sampled by a gin or warehouse that... an authorized representative may direct that sampling be performed by employees of the Department of...

  20. A Survey of Current Literature on Sampling, Sample Handling, and Long Term Storage for Environmental Materials.

    ERIC Educational Resources Information Center

    Maienthal, E. J.; Becker, D. A.

    This report presents the results of an extensive literature survey undertaken to establish optimum sampling, sample handling and long-term storage techniques for a wide variety of environmental samples to retain sample integrity. The components of interest are trace elements, organics, pesticides, radionuclides and microbiologicals. A bibliography…

  1. CHEMICAL TIME-SERIES SAMPLING

    EPA Science Inventory

    The rationale for chemical time-series sampling has its roots in the same fundamental relationships as govern well hydraulics. Samples of ground water are collected as a function of increasing time of pumpage. The most efficient pattern of collection consists of logarithmically s...

  2. 7 CFR 28.908 - Samples.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., TESTING, AND STANDARDS Cotton Classification and Market News Service for Producers Sampling § 28.908... submitted for classification under this subpart. This does not prohibit the submission of an additional sample from a bale for review classification if the producer so desires. (b) Drawing of samples manual...

  3. 7 CFR 28.908 - Samples.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., TESTING, AND STANDARDS Cotton Classification and Market News Service for Producers Sampling § 28.908... submitted for classification under this subpart. This does not prohibit the submission of an additional sample from a bale for review classification if the producer so desires. (b) Drawing of samples manual...

  4. 7 CFR 28.908 - Samples.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., TESTING, AND STANDARDS Cotton Classification and Market News Service for Producers Sampling § 28.908... submitted for classification under this subpart. This does not prohibit the submission of an additional sample from a bale for review classification if the producer so desires. (b) Drawing of samples manual...

  5. 7 CFR 28.908 - Samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., TESTING, AND STANDARDS Cotton Classification and Market News Service for Producers Sampling § 28.908... submitted for classification under this subpart. This does not prohibit the submission of an additional sample from a bale for review classification if the producer so desires. (b) Drawing of samples manual...

  6. Do Research Findings Apply to My Students? Examining Study Samples and Sampling

    ERIC Educational Resources Information Center

    Cook, Bryan G.; Cook, Lysandra

    2017-01-01

    Special educators are urged to use research findings to inform their instruction in order to improve student outcomes. However, it can be difficult to tell whether and how research findings apply to one's own students. In this article, we discuss how special educators can consider the samples and the sampling methods in studies to examine the…

  7. High-efficiency multiphoton boson sampling

    NASA Astrophysics Data System (ADS)

    Wang, Hui; He, Yu; Li, Yu-Huai; Su, Zu-En; Li, Bo; Huang, He-Liang; Ding, Xing; Chen, Ming-Cheng; Liu, Chang; Qin, Jian; Li, Jin-Peng; He, Yu-Ming; Schneider, Christian; Kamp, Martin; Peng, Cheng-Zhi; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei

    2017-06-01

    Boson sampling is considered as a strong candidate to demonstrate 'quantum computational supremacy' over classical computers. However, previous proof-of-principle experiments suffered from small photon number and low sampling rates owing to the inefficiencies of the single-photon sources and multiport optical interferometers. Here, we develop two central components for high-performance boson sampling: robust multiphoton interferometers with 99% transmission rate and actively demultiplexed single-photon sources based on a quantum dot-micropillar with simultaneously high efficiency, purity and indistinguishability. We implement and validate three-, four- and five-photon boson sampling, and achieve sampling rates of 4.96 kHz, 151 Hz and 4 Hz, respectively, which are over 24,000 times faster than previous experiments. Our architecture can be scaled up for a larger number of photons and with higher sampling rates to compete with classical computers, and might provide experimental evidence against the extended Church-Turing thesis.

  8. Chemical analyses of provided samples

    NASA Technical Reports Server (NTRS)

    Becker, Christopher H.

    1993-01-01

    Two batches of samples were received, and chemical analysis of the surface and near-surface regions of the samples was performed by the surface analysis by laser ionization (SALI) method. The samples included four one-inch optics and several paint samples. The analyses emphasized surface contamination or modification. In these studies, pulsed sputtering by 7 keV Ar+ and primarily single-photon ionization (SPI) by coherent 118 nm radiation (at approximately 5 x 10^5 W/cm^2) were used. For two of the samples, multiphoton ionization (MPI) at 266 nm (approximately 5 x 10^11 W/cm^2) was also used. Most notable among the results was the silicone contamination on Mg2 mirror 28-92, and that the Long Duration Exposure Facility (LDEF) paint sample had been enriched in K and Na and depleted in Zn, Si, B, and organic compounds relative to the control paint.

  9. Returning Samples from Enceladus

    NASA Astrophysics Data System (ADS)

    Tsou, P.; Kanik, I.; Brownlee, D.; McKay, C.; Anbar, A.; Glavin, D.; Yano, H.

    2012-12-01

    From the first half century of space exploration, we have obtained samples only from the Moon, comet Wild 2, the solar wind and the asteroid Itokawa. The in-depth analyses of these samples in terrestrial laboratories have yielded profound knowledge that could not have been obtained without the returned samples. While obtaining samples from Solar System bodies is crucial science, it is rarely done due to cost and complexity. Cassini's discovery of geysers and organic materials on Enceladus indicates that there is an exceptional opportunity and science rationale to do a low-cost flyby sample return mission, similar to what was done by Stardust. The earliest possible low-cost flight opportunity is the next Discovery Mission [Tsou et al 2012]. Enceladus Plume Discovery - While Voyager provided evidence for young surfaces on Enceladus, the existence of the Enceladus plumes was discovered by Cassini. Enceladus and comets are the only known solar system bodies that have jets enabling sample collection without landing or surface contact. Cassini in situ Findings - Cassini made many discoveries at Saturn, including the breakup of large organics in the plumes of Enceladus. Four prime criteria for habitability are liquid water, a heat source, organics and nitrogen [McKay et al. 2008, Waite et al. 2009, Postberg et al. 2011]. Of all the NASA-designated habitability targets, Enceladus is the single body that presents evidence for all four criteria. Significant advancement in the exploration of the biological potential of Enceladus can be made on returned samples in terrestrial laboratories, where the full power of state-of-the-art laboratory instrumentation and procedures can be used. Without serious limits on power, mass or even cost, terrestrial laboratories provide the ultimate in analytical capability, adaptability, reproducibility and reliability. What Questions can Samples Address? - Samples collected from the Enceladus plume will enable a thorough and replicated

  10. Development of Sample Handling and Analytical Expertise For the Stardust Comet Sample Return

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradley, J; Bajt, S; Brennan, S

    NASA's Stardust mission returned to Earth in January 2006 with "fresh" cometary particles from a young Jupiter family comet. The cometary particles were sampled during the spacecraft flyby of comet 81P/Wild-2 in January 2004, when they impacted low-density silica aerogel tiles and aluminum foils on the sample tray assembly at approximately 6.1 km/s. This LDRD project has developed extraction and sample recovery methodologies to maximize the scientific information that can be obtained from the analysis of natural and man-made nano-materials of relevance to the LLNL programs.

  11. 10 CFR 431.328 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Sampling. 431.328 Section 431.328 Energy DEPARTMENT OF... Metal Halide Lamp Ballasts and Fixtures Energy Conservation Standards § 431.328 Sampling. For purposes... energy conservation standard shall be based upon the testing and sampling procedures, and other...

  12. 10 CFR 431.372 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Sampling. 431.372 Section 431.372 Energy DEPARTMENT OF... Certification and Enforcement § 431.372 Sampling. For purposes of a certification of compliance, the... standard shall be based upon the testing and sampling procedures, and other applicable rating procedures...

  13. Patient identification in blood sampling.

    PubMed

    Davidson, Anne; Bolton-Maggs, Paula

    The majority of adverse reports relating to blood transfusions result from human error, including misidentification of patients and incorrect labelling of samples. This article outlines best practice in blood sampling for transfusion (but is recommended for all pathology samples) and the role of patient empowerment in improving safety.

  14. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... by Gas Chromatography (incorporated by reference see § 98.7). All gas composition monitors shall be...-90 (Reapproved 2006) Standard Practice for Analysis of Reformed Gas by Gas Chromatography... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements...

  15. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... by Gas Chromatography (incorporated by reference see § 98.7). All gas composition monitors shall be...-90 (Reapproved 2006) Standard Practice for Analysis of Reformed Gas by Gas Chromatography... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements...

  16. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Hydrogen Production § 98.164 Monitoring and QA/QC requirements. The GHG emissions data for hydrogen production process units must be quality-assured as specified in... Instrumental Determination of Carbon, Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated...

  17. Direct Electrospray Ionization Mass Spectrometric Profiling of Real-World Samples via a Solid Sampling Probe

    NASA Astrophysics Data System (ADS)

    Yu, Zhan; Chen, Lee Chuin; Mandal, Mridul Kanti; Yoshimura, Kentaro; Takeda, Sen; Hiraoka, Kenzo

    2013-10-01

    This study presents a novel direct analysis strategy for rapid mass spectrometric profiling of biochemicals in real-world samples via a direct sampling probe (DSP) without sample pretreatments. Chemical modification is applied to a disposable stainless steel acupuncture needle to enhance its surface area and hydrophilicity. After insertion into real-world samples, biofluid can be attached on the DSP surface. With the presence of a high DC voltage and solvent vapor condensing on the tip of the DSP, analyte can be dissolved and electrosprayed. The simplicity in design, versatility in application aspects, and other advantages such as low cost and disposability make this new method a competitive tool for direct analysis of real-world samples.

  18. Integrated Hamiltonian sampling: a simple and versatile method for free energy simulations and conformational sampling.

    PubMed

    Mori, Toshifumi; Hamers, Robert J; Pedersen, Joel A; Cui, Qiang

    2014-07-17

    Motivated by specific applications and the recent work of Gao and co-workers on integrated tempering sampling (ITS), we have developed a novel sampling approach referred to as integrated Hamiltonian sampling (IHS). IHS is straightforward to implement and complementary to existing methods for free energy simulation and enhanced configurational sampling. The method carries out sampling using an effective Hamiltonian constructed by integrating the Boltzmann distributions of a series of Hamiltonians. By judiciously selecting the weights of the different Hamiltonians, one achieves rapid transitions among the energy landscapes that underlie different Hamiltonians and therefore an efficient sampling of important regions of the conformational space. Along this line, IHS shares similar motivations as the enveloping distribution sampling (EDS) approach of van Gunsteren and co-workers, although the ways that distributions of different Hamiltonians are integrated are rather different in IHS and EDS. Specifically, we report efficient ways for determining the weights using a combination of histogram flattening and weighted histogram analysis approaches, which make it straightforward to include many end-state and intermediate Hamiltonians in IHS so as to enhance its flexibility. Using several relatively simple condensed phase examples, we illustrate the implementation and application of IHS as well as potential developments for the near future. The relation of IHS to several related sampling methods such as Hamiltonian replica exchange molecular dynamics and λ-dynamics is also briefly discussed.
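
    The construction described above, an effective Hamiltonian obtained by integrating the Boltzmann distributions of a series of Hamiltonians, can be written compactly. The notation below is a hedged paraphrase of that idea in the ITS/IHS style, not the exact expression from the paper.

```latex
% Effective potential formed by integrating the Boltzmann factors of K
% Hamiltonians H_1,...,H_K with weights w_k (sketch; constants absorbed
% into the normalization).
\[
  e^{-\beta\,H_{\mathrm{eff}}(\mathbf{x})} \;\propto\; \sum_{k=1}^{K} w_k\, e^{-\beta\,H_k(\mathbf{x})}
  \quad\Longleftrightarrow\quad
  H_{\mathrm{eff}}(\mathbf{x}) \;=\; -\frac{1}{\beta}\,\ln \sum_{k=1}^{K} w_k\, e^{-\beta\,H_k(\mathbf{x})}.
\]
```

    Sampling on such an effective Hamiltonian visits the low-energy regions of every contributing Hamiltonian, and the weights w_k are tuned (e.g., by the histogram-flattening and weighted-histogram procedures mentioned above) so that each Hamiltonian contributes comparably.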

  19. Sampling benthic macroinvertebrates in a large flood-plain river: Considerations of study design, sample size, and cost

    USGS Publications Warehouse

    Bartsch, L.A.; Richardson, W.B.; Naimo, T.J.

    1998-01-01

    Estimation of benthic macroinvertebrate populations over large spatial scales is difficult due to the high variability in abundance and the cost of sample processing and taxonomic analysis. To determine a cost-effective, statistically powerful sample design, we conducted an exploratory study of the spatial variation of benthic macroinvertebrates in a 37 km reach of the Upper Mississippi River. We sampled benthos at 36 sites within each of two strata, contiguous backwater and channel border. Three standard ponar (525 cm^2) grab samples were obtained at each site ('Original Design'). Analysis of variance and sampling cost of strata-wide estimates for abundance of Oligochaeta, Chironomidae, and total invertebrates showed that only one ponar sample per site ('Reduced Design') yielded essentially the same abundance estimates as the Original Design, while reducing the overall cost by 63%. A posteriori statistical power analysis (alpha = 0.05, beta = 0.20) on the Reduced Design estimated that at least 18 sites per stratum were needed to detect differences in mean abundance between contiguous backwater and channel border areas for Oligochaeta, Chironomidae, and total invertebrates. Statistical power was nearly identical for the three taxonomic groups. The abundances of several taxa of concern (e.g., Hexagenia mayflies and Musculium fingernail clams) were too spatially variable to estimate power with our method. Resampling simulations indicated that to achieve adequate sampling precision for Oligochaeta, at least 36 sample sites per stratum would be required, whereas a sampling precision of 0.2 would not be attained with any sample size for Hexagenia in channel border areas, or for Chironomidae and Musculium in both strata, given the variance structure of the original samples. Community-wide diversity indices (Brillouin and 1-Simpson's) increased as sample area per site increased. The backwater area had higher diversity than the channel border area. The number of sampling sites
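
    For readers who want to reproduce this kind of a priori calculation, the sketch below uses the standard normal-approximation formula for the number of sites per stratum needed to detect a given difference in means with a two-sided, two-sample test. The effect size and variance are placeholders, and the authors' actual power analysis may have used a different procedure.

```python
from statistics import NormalDist
from math import ceil

def sites_per_stratum(delta, sigma, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for a two-sample comparison.

    delta : smallest difference in mean abundance worth detecting
    sigma : common standard deviation of abundance among sites
    Returns the number of sites needed in each stratum (rounded up).
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    n = 2 * ((z_alpha + z_beta) * sigma / delta) ** 2
    return ceil(n)

# Placeholder numbers: detect a difference of one standard deviation
# (delta = sigma) at alpha = 0.05 with 80% power.
print(sites_per_stratum(delta=1.0, sigma=1.0))   # -> 16 sites per stratum
```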

  20. 10 CFR 430.63 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 3 2011-01-01 2011-01-01 false Sampling. 430.63 Section 430.63 Energy DEPARTMENT OF... Enforcement § 430.63 Sampling. (a) For purposes of a certification of compliance, the determination that a... the case of faucets, showerheads, water closets, and urinals) shall be based upon the sampling...

  1. 10 CFR 430.63 - Sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 3 2010-01-01 2010-01-01 false Sampling. 430.63 Section 430.63 Energy DEPARTMENT OF... Enforcement § 430.63 Sampling. (a) For purposes of a certification of compliance, the determination that a... the case of faucets, showerheads, water closets, and urinals) shall be based upon the sampling...

  2. 19 CFR 151.10 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 2 2011-04-01 2011-04-01 false Sampling. 151.10 Section 151.10 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE General § 151.10 Sampling. When necessary, the port director...

  3. 19 CFR 151.10 - Sampling.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sampling. 151.10 Section 151.10 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE General § 151.10 Sampling. When necessary, the port director...

  4. Does sampling using random digit dialling really cost more than sampling from telephone directories: Debunking the myths

    PubMed Central

    Yang, Baohui; Eyeson-Annan, Margo

    2006-01-01

    Background: Computer assisted telephone interviewing (CATI) is widely used for health surveys. The advantages of CATI over face-to-face interviewing are timeliness and cost reduction to achieve the same sample size and geographical coverage. Two major CATI sampling procedures are used: sampling directly from the electronic white pages (EWP) telephone directory and list-assisted random digit dialling (LA-RDD) sampling. EWP sampling covers telephone numbers of households listed in the printed white pages. LA-RDD sampling has better coverage of households than EWP sampling but is considered more expensive because interviewers dial more out-of-scope numbers. Methods: This study compared an EWP sample and an LA-RDD sample from the New South Wales Population Health Survey in 2003 on demographic profiles, health estimates, coefficients of variation in weights, design effects on estimates, and cost effectiveness, on the basis of achieving the same level of precision of estimates. Results: The LA-RDD sample better represented the population than the EWP sample, with a coefficient of variation of weights of 1.03 for LA-RDD compared with 1.21 for EWP, and average design effects of 2.00 for LA-RDD compared with 2.38 for EWP. Also, an LA-RDD sample can save up to 14.2% in cost compared to an EWP sample to achieve the same precision for health estimates. Conclusion: An LA-RDD sample better represents the population, which potentially leads to reduced bias in health estimates, and rather than costing more than EWP it actually costs less. PMID:16504117
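
    The relationship between the reported coefficients of variation of the weights and the average design effects is roughly what Kish's approximation for the design effect due to unequal weighting predicts. The check below is an illustration using that approximation; the small remaining gap is presumably due to clustering and other design features not captured by the weights alone.

```latex
% Kish's approximate design effect from unequal weighting, evaluated with
% the coefficients of variation of the weights reported above.
\[
\begin{aligned}
  \mathrm{deff}_w &\approx 1 + \mathrm{cv}^2(w),\\
  \text{LA-RDD:}\quad 1 + 1.03^2 &\approx 2.06 \quad (\text{reported average deff } 2.00),\\
  \text{EWP:}\quad 1 + 1.21^2 &\approx 2.46 \quad (\text{reported average deff } 2.38).
\end{aligned}
\]
```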

  5. National Sample Assessment Protocols

    ERIC Educational Resources Information Center

    Ministerial Council on Education, Employment, Training and Youth Affairs (NJ1), 2012

    2012-01-01

    These protocols represent a working guide for planning and implementing national sample assessments in connection with the national Key Performance Measures (KPMs). The protocols are intended for agencies involved in planning or conducting national sample assessments and personnel responsible for administering associated tenders or contracts,…

  6. The Viking X ray fluorescence experiment - Sampling strategies and laboratory simulations. [Mars soil sampling

    NASA Technical Reports Server (NTRS)

    Baird, A. K.; Castro, A. J.; Clark, B. C.; Toulmin, P., III; Rose, H., Jr.; Keil, K.; Gooding, J. L.

    1977-01-01

    Ten samples of Mars regolith material (six on Viking Lander 1 and four on Viking Lander 2) have been delivered to the X ray fluorescence spectrometers as of March 31, 1977. An additional six samples at least are planned for acquisition in the remaining Extended Mission (to January 1979) for each lander. All samples acquired are Martian fines from the near surface (less than 6-cm depth) of the landing sites except the latest on Viking Lander 1, which is fine material from the bottom of a trench dug to a depth of 25 cm. Several attempts on each lander to acquire fresh rock material (in pebble sizes) for analysis have yielded only cemented surface crustal material (duricrust). Laboratory simulation and experimentation are required both for mission planning of sampling and for interpretation of data returned from Mars. This paper is concerned with the rationale for sample site selections, surface sampler operations, and the supportive laboratory studies needed to interpret X ray results from Mars.

  7. Mars sample collection and preservation

    NASA Technical Reports Server (NTRS)

    Blanchard, Douglas P.

    1988-01-01

    The intensive exploration of Mars is a major step in the systematic exploration of the solar system. Mars, earth, and Venus provide valuable contrasts in planetary evolution. Mars exploration has progressed through the stages of exploration and is now ready for a sample-return mission. About 5 kg of intelligently selected samples will be returned from Mars. A variety of samples are wanted. This requires accurate landing in areas of high interest, surface mobility and analytical capability, a variety of sampling tools, and stringent preservation and isolation measures.

  8. Open port sampling interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Berkel, Gary J

    A system for sampling a sample material includes a probe which can have an outer probe housing with an open end. A liquid supply conduit within the housing has an outlet positioned to deliver liquid to the open end of the housing. The liquid supply conduit can be connectable to a liquid supply for delivering liquid at a first volumetric flow rate to the open end of the housing. A liquid exhaust conduit within the housing is provided for removing liquid from the open end of the housing. A liquid exhaust system can be provided for removing liquid from the liquid exhaust conduit at a second volumetric flow rate, the first volumetric flow rate exceeding the second volumetric flow rate, wherein liquid at the open end will receive sample, liquid containing sample material will be drawn into and through the liquid exhaust conduit, and liquid will overflow from the open end.

  9. Duplex sampling apparatus and method

    DOEpatents

    Brown, Paul E.; Lloyd, Robert

    1992-01-01

    An improved apparatus is provided for sampling a gaseous mixture and for measuring mixture components. The apparatus includes two sampling containers connected in series serving as a duplex sampling apparatus. The apparatus is adapted to independently determine the amounts of condensable and noncondensable gases in admixture from a single sample. More specifically, a first container includes a first port capable of selectively connecting to and disconnecting from a sample source and a second port capable of selectively connecting to and disconnecting from a second container. A second container also includes a first port capable of selectively connecting to and disconnecting from the second port of the first container and a second port capable of either selectively connecting to and disconnecting from a differential pressure source. By cooling a mixture sample in the first container, the condensable vapors form a liquid, leaving noncondensable gases either as free gases or dissolved in the liquid. The condensed liquid is heated to drive out dissolved noncondensable gases, and all the noncondensable gases are transferred to the second container. Then the first and second containers are separated from one another in order to separately determine the amount of noncondensable gases and the amount of condensable gases in the sample.

  10. 27 CFR 19.1007 - Samples.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... fuel alcohol for testing and analysis. Samples of spirits may not be removed from the premises of the alcohol fuel plant. Samples of fuel alcohol may be removed from the premises of the alcohol fuel plant to... that the spirits or fuel alcohol contained therein is a sample. The proprietor shall account for...

  11. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature, and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts.
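
    As one concrete example of the kind of sample normalization the review discusses, the sketch below rescales each sample's feature intensities by its total signal, a common and deliberately simple choice. It is an illustration only, not a method endorsed by the review, and the data layout is assumed.

```python
import numpy as np

def total_signal_normalize(intensity_matrix):
    """Scale each sample so its summed metabolite intensity equals the cohort median.

    intensity_matrix : 2-D array, rows = samples, columns = metabolite features
                       (layout is an assumption for this illustration).
    """
    X = np.asarray(intensity_matrix, dtype=float)
    totals = X.sum(axis=1)                     # total signal per sample
    target = np.median(totals)                 # bring every sample to this total
    return X * (target / totals)[:, np.newaxis]

# Example: three samples whose total amounts differ by up to 2x.
X = np.array([[10.0, 20.0, 30.0],
              [20.0, 40.0, 60.0],
              [15.0, 30.0, 45.0]])
print(total_signal_normalize(X).sum(axis=1))   # all row sums equal after normalization
```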

  12. QC operator’s nonneutral posture against musculoskeletal disorder’s (MSDs) risks

    NASA Astrophysics Data System (ADS)

    Kautsar, F.; Gustopo, D.; Achmadi, F.

    2018-04-01

    Musculoskeletal disorders (MSDs) refer to a gamut of inflammatory and degenerative disorders aggravated largely by the performance of work. They are a major cause of pain, disability, absenteeism and reduced productivity among workers worldwide. Although not fatal, MSDs have the potential to develop into serious injuries of the musculoskeletal system if ignored. QC operators work in nonneutral body postures. This cross-sectional study was conducted in order to investigate the correlation between the risk assessment results of QEC and the body posture calculations of Mannequin Pro. Statistical analysis was conducted using SPSS version 16.0. A validity test, a reliability test and regression analysis were conducted to compare the risk assessment output of the applied method with the nonneutral body posture simulation. All of the QEC indicators were classified as valid and reliable. The results of the simple regression analysis are: back (0.326 < 4.32), shoulder/arm (8.489 > 4.32), wrist/hand (4.86 > 4.32) and neck (1.298 < 4.32). The results of this study show that the nonneutral body posture of the QC operator during work influences the risk of musculoskeletal disorders. The potential risk of musculoskeletal disorders is in the shoulder/arm and wrist/hand of the QC operator, whereas the back and neck are not affected.
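
    The comparisons against 4.32 above read as F tests for a single predictor: the regression F statistic for each body region is compared with the tabulated critical value, and the association is called significant only when the statistic exceeds it. The worked form below restates that decision rule; the degrees of freedom behind the 4.32 critical value are an inference from the abstract, not stated in it.

```latex
% Decision rule implied by the reported comparisons (simple linear regression).
\[
  F \;=\; \frac{\mathrm{MS}_{\mathrm{regression}}}{\mathrm{MS}_{\mathrm{residual}}},
  \qquad
  \text{reject } H_0 \text{ (no influence) when } F > F_{\mathrm{crit}} \approx 4.32,
\]
```

    so shoulder/arm (8.489) and wrist/hand (4.86) exceed the critical value, while back (0.326) and neck (1.298) do not, matching the conclusions stated above.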

  13. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  14. Extraterrestrial Samples at JSC

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.

    2007-01-01

    A viewgraph presentation on the curation of extraterrestrial samples at NASA Johnson Space Center is shown. The topics include: 1) Apollo lunar samples; 2) Meteorites from Antarctica; 3) Cosmic dust from the stratosphere; 4) Genesis solar wind ions; 5) Stardust comet and interstellar grains; and 6) Space-Exposed Hardware.

  15. Correction of Anisokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Nelson, William G.

    Gas flow patterns at a sampling nozzle are described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. Three situations for sampling velocity are illustrated and analyzed, where the flow upstream of a sampling probe is: (1) equal to free stream…

  16. 40 CFR 98.164 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Hydrogen Production § 98.164 Monitoring and QA/QC requirements. The GHG emissions data for hydrogen production process units must be quality-assured as specified in..., Hydrogen, and Nitrogen in Petroleum Products and Lubricants (incorporated by reference, see § 98.7). (xi...

  17. Sampling characteristics of satellite orbits

    NASA Technical Reports Server (NTRS)

    Wunsch, Carl

    1989-01-01

    The irregular space-time sampling of any finite region by an orbiting satellite raises difficult questions as to which frequencies and wavenumbers can be determined and which will alias into others. Conventional sampling theorems must be extended to account for both irregular data distributions and observational noise - the sampling irregularity making the system much more susceptible to noise than in regularly sampled cases. The problem is formulated here in terms of least-squares and applied to spacecraft in 10-day and 17-day repeating orbits. The 'diamond-pattern' laid down spatially in such repeating orbits means that either repeat period adequately samples the spatial variables, but the slow overall temporal coverage in the 17-day pattern leads to much greater uncertainty than in the shorter repeat cycle. The result is not definitive and it is not concluded that a 10-day orbit repeat is the most appropriate one. A major conclusion, however, is that different orbital choices have potentially quite different sampling characteristics which need to be analyzed in terms of the spectral characteristics of the moving sea surface.
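
    The least-squares formulation mentioned above can be made concrete as fitting a set of candidate sinusoids to the irregularly spaced observations; aliasing and noise sensitivity then show up as ill-conditioning of the normal equations. The model below is a generic statement of that approach, not the specific parameterization used in the paper.

```latex
% Generic least-squares model for irregular space-time samples y(x_i, t_i),
% with noise n_i; G holds the sampled sinusoids and c the fitted coefficients.
\[
\begin{aligned}
  y(x_i, t_i) &= \sum_{k}\bigl[a_k\cos(\kappa_k x_i - \omega_k t_i)
                + b_k\sin(\kappa_k x_i - \omega_k t_i)\bigr] + n_i,\\
  \hat{\mathbf{c}} &= (\mathbf{G}^{\mathsf T}\mathbf{G})^{-1}\mathbf{G}^{\mathsf T}\mathbf{y},
  \qquad \mathbf{c} = (a_1, b_1, a_2, b_2, \ldots)^{\mathsf T}.
\end{aligned}
\]
```

    Near-singular normal equations flag wavenumber-frequency pairs that a given orbit repeat pattern cannot distinguish, which is how the different sampling characteristics of the 10-day and 17-day orbits manifest themselves.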

  18. LUNAR SAMPLES - APOLLO XI

    NASA Image and Video Library

    1969-07-27

    S69-45002 (26 July 1969) --- A close-up view of the lunar rocks contained in the first Apollo 11 sample return container. The rock box was opened for the first time in the Vacuum Laboratory of the Manned Spacecraft Center’s Lunar Receiving Laboratory, Building 37, at 3:55 p.m. (CDT), Saturday, July 26, 1969. The gloved hand gives an indication of size. This box also contained the Solar Wind Composition experiment (not shown) and two core tubes for subsurface samples (not shown). These lunar samples were collected by astronauts Neil A. Armstrong and Edwin E. Aldrin Jr. during their lunar surface extravehicular activity on July 20, 1969.

  19. An antithetic variate to facilitate upper-stem height measurements for critical height sampling with importance sampling

    Treesearch

    Thomas B. Lynch; Jeffrey H. Gove

    2013-01-01

    Critical height sampling (CHS) estimates cubic volume per unit area by multiplying the sum of critical heights measured on trees tallied in a horizontal point sample (HPS) by the HPS basal area factor. One of the barriers to practical application of CHS is the fact that trees near the field location of the point-sampling sample point have critical heights that occur...
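
    Written out, the estimator described in the first sentence above is simply the basal area factor times the sum of critical heights over the tallied trees:

```latex
% Critical height sampling estimator of cubic volume per unit area, as
% described above: F is the HPS basal area factor, m the number of trees
% tallied at the point, and h_{c,j} the critical height of tree j.
\[
  \hat{V} \;=\; F \sum_{j=1}^{m} h_{c,j},
\]
```

    so the field burden lies entirely in measuring each critical height, which is the measurement the antithetic variate with importance sampling proposed here is intended to facilitate for trees close to the sample point.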

  20. 16 CFR 305.6 - Sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Sampling. 305.6 Section 305.6 Commercial... ENERGY POLICY AND CONSERVATION ACT ("APPLIANCE LABELING RULE") Testing § 305.6 Sampling. (a) For any... based upon the sampling procedures set forth in § 430.24 of 10 CFR part 430, subpart B. (b) For any...

  1. 16 CFR 305.6 - Sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 16 Commercial Practices 1 2011-01-01 2011-01-01 false Sampling. 305.6 Section 305.6 Commercial... ENERGY POLICY AND CONSERVATION ACT ("APPLIANCE LABELING RULE") Testing § 305.6 Sampling. Link to an... consumption incorporated into § 305.5 shall be based upon the sampling procedures set forth in § 430.24 of 10...

  2. Downselection for Sample Return — Defining Sampling Strategies Using Lessons from Terrestrial Field Analogues

    NASA Astrophysics Data System (ADS)

    Stevens, A. H.; Gentry, D.; Amador, E.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Jacobsen, M.; Kirby, J.; McCaig, H.; Murukesan, G.; Rader, E.; Rennie, V.; Schwieterman, E.; Sutton, S.; Tan, G.; Yin, C.; Cullen, D.; Geppert, W.; Stockton, A.

    2018-04-01

    We detail multi-year field investigations in Icelandic Mars analogue environments that have yielded results that can help inform strategies for sample selection and downselection for Mars Sample Return.

  3. Nanopipettes: probes for local sample analysis.

    PubMed

    Saha-Shah, Anumita; Weber, Anna E; Karty, Jonathan A; Ray, Steven J; Hieftje, Gary M; Baker, Lane A

    2015-06-01

    Nanopipettes (pipettes with diameters <1 μm) were explored as pressure-driven fluid manipulation tools for sampling nanoliter volumes of fluids. The fundamental behavior of fluids confined in the narrow channels of the nanopipette shank was studied to optimize sampling volume and probe geometry. This method was utilized to collect nanoliter volumes (<10 nL) of sample from single Allium cepa cells and live Drosophila melanogaster first instar larvae. Matrix assisted laser desorption/ionization-mass spectrometry (MALDI-MS) was utilized to characterize the collected sample. The use of nanopipettes for surface sampling of mouse brain tissue sections was also explored. Lipid analyses were performed on mouse brain tissues with spatial resolution of sampling as small as 50 μm. Nanopipettes were shown to be a versatile tool that will find further application in studies of sample heterogeneity and population analysis for a wide range of samples.

  4. Sampling Large Graphs for Anticipatory Analytics

    DTIC Science & Technology

    2015-05-15

    low. C. Random Area Sampling Random area sampling [8] is a “ snowball ” sampling method in which a set of random seed vertices are selected and areas... Sampling Large Graphs for Anticipatory Analytics Lauren Edwards, Luke Johnson, Maja Milosavljevic, Vijay Gadepally, Benjamin A. Miller Lincoln...systems, greater human-in-the-loop involvement, or through complex algorithms. We are investigating the use of sampling to mitigate these challenges

  5. Sampling and data handling methods for inhalable particulate sampling. Final report nov 78-dec 80

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, W.B.; Cushing, K.M.; Johnson, J.W.

    1982-05-01

    The report reviews the objectives of a research program on sampling and measuring particles in the inhalable particulate (IP) size range in emissions from stationary sources, and describes the methods and equipment required. A computer technique was developed to analyze data on particle-size distributions of samples taken with cascade impactors from industrial process streams. Research on sampling systems for IP matter included concepts for maintaining isokinetic sampling conditions, necessary for representative sampling of the larger particles, while flowrates in the particle-sizing device were held constant. Laboratory studies were conducted to develop suitable IP sampling systems with overall cut diameters of 15 micrometers and conforming to a specified collection efficiency curve. Collection efficiencies were similarly measured for a horizontal elutriator. Design parameters were calculated for horizontal elutriators to be used with impactors, the EPA SASS train, and the EPA FAS train. Two cyclone systems were designed and evaluated. Tests on an Andersen Size Selective Inlet, a 15-micrometer precollector for high-volume samplers, showed its performance to be within the proposed limits for IP samplers. A stack sampling system was designed in which the aerosol is diluted in flow patterns and with mixing times simulating those in stack plumes.
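
    For context on the isokinetic condition mentioned above: representative sampling of the larger particles requires the gas velocity entering the nozzle to match the local stack gas velocity, which fixes the required sample flow for a given nozzle. The relation below is the standard textbook statement of that condition, not a formula quoted from the report.

```latex
% Isokinetic sampling condition: nozzle inlet velocity equals stack gas velocity.
\[
  \frac{Q_{\mathrm{sample}}}{A_{\mathrm{nozzle}}} \;=\; v_{\mathrm{nozzle}} \;=\; v_{\mathrm{stack}}
  \quad\Longrightarrow\quad
  Q_{\mathrm{sample}} \;=\; v_{\mathrm{stack}}\, A_{\mathrm{nozzle}},
\]
```

    which is why the designs discussed in the report must decouple the variable isokinetic extraction flow from the constant flowrate required by the particle-sizing device.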

  6. Sampling in the Wild: How Attention to Variation Supports Middle School Students' Sampling Practice

    ERIC Educational Resources Information Center

    Forsythe, Michelle E.

    2018-01-01

    Sampling is a fundamental practice of many scientific disciplines. However, K-12 students are rarely asked to think critically about sampling decisions. Because of this, open questions remain about how best to support students in this practice. This study explores the emergent sampling practice of two classes of sixth-grade students as they…

  7. Effects of coupling between sample and electrode on the electrical resistivity measurements of conductive samples

    NASA Astrophysics Data System (ADS)

    Lee, T. J.; Lee, S. K.

    2015-12-01

    A resistivity measurement system for conductive core samples has been set up using a high-resolution nanovoltmeter. Using this system, various coupling effects between the electrodes and the samples are discussed, including contact resistance, lead resistance, temperature dependence, and heat produced within the samples by the applied current. The lead resistance was over 10 times higher than the resistance of conductive samples such as graphite or nichrome, even though the electrodes and lead lines were made of silver. Furthermore, the lead resistance itself showed very strong temperature dependence, so it is essential to subtract the lead resistance from the measured values at the corresponding temperature. Minimizing contact resistance is also important; the axial load should therefore be as large as possible, provided it does not deform the sample.
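
    To make the lead-resistance correction described above concrete, the sketch below interpolates a separately measured lead-resistance-versus-temperature calibration and subtracts it from each raw reading at the corresponding temperature; the calibration values, linear interpolation, and example reading are assumed purely for illustration.

    ```python
    import numpy as np

    # Illustrative calibration: lead resistance measured at several temperatures (assumed values).
    T_cal = np.array([20.0, 50.0, 100.0, 150.0, 200.0])         # deg C
    R_lead_cal = np.array([0.052, 0.058, 0.068, 0.078, 0.089])  # ohm

    def corrected_sample_resistance(R_measured, T):
        """Subtract the temperature-dependent lead resistance from the
        measured (sample + leads) resistance taken at the same temperature."""
        R_lead = np.interp(T, T_cal, R_lead_cal)
        return R_measured - R_lead

    # Example: a raw reading of 0.075 ohm taken at 120 deg C.
    print(corrected_sample_resistance(0.075, 120.0))
    ```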

  8. Optimizing cord blood sample cryopreservation.

    PubMed

    Harris, David T

    2012-03-01

    Cord blood (CB) banking is becoming more and more commonplace throughout the medical community, both in the USA and elsewhere. It is now generally recognized that storage of CB samples in multiple aliquots is the preferred approach to banking because it allows the greatest number of uses of the sample. However, it is unclear which are the best methodologies for cryopreservation and storage of the sample aliquots. In the current study we analyzed variables that could affect these processes. CB were processed into mononuclear cells (MNC) and frozen in commercially available human serum albumin (HSA) or autologous CB plasma using cryovials of various sizes and cryobags. The bacteriophage phiX174 was used as a model virus to test for cross-contamination. We observed that cryopreservation of CB in HSA, undiluted autologous human plasma and 50% diluted plasma was equivalent in terms of cell recovery and cell viability. We also found that cryopreservation of CB samples in either cryovials or cryobags displayed equivalent thermal characteristics. Finally, we demonstrated that overwrapping the CB storage container in an impermeable plastic sheathing was sufficient to prevent cross-sample viral contamination during prolonged storage in the liquid phase of liquid nitrogen dewar storage. CB may be cryopreserved in either vials or bags without concern for temperature stability. Sample overwrapping is sufficient to prevent microbiologic contamination of the samples while in liquid-phase liquid nitrogen storage.

  9. Geochemical reanalysis of historical U.S. Geological Survey sediment samples from the northeastern Alaska Range, Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska

    USGS Publications Warehouse

    Werdon, Melanie B.; Granitto, Matthew; Azain, Jaime S.

    2015-01-01

    The State of Alaska’s Strategic and Critical Minerals (SCM) Assessment project, a State-funded Capital Improvement Project (CIP), is designed to evaluate Alaska’s statewide potential for SCM resources. The SCM Assessment is being implemented by the Alaska Division of Geological & Geophysical Surveys (DGGS), and involves obtaining new airborne-geophysical, geological, and geochemical data. As part of the SCM Assessment, thousands of historical geochemical samples from DGGS, U.S. Geological Survey (USGS), and U.S. Bureau of Mines archives are being reanalyzed by DGGS using modern, quantitative, geochemical-analytical methods. The objective is to update the statewide geochemical database to more clearly identify areas in Alaska with SCM potential. The USGS is also undertaking SCM-related geologic studies in Alaska through the federally funded Alaska Critical Minerals cooperative project. DGGS and USGS share the goal of evaluating Alaska’s strategic and critical minerals potential and together created a Letter of Agreement (signed December 2012) and a supplementary Technical Assistance Agreement (#14CMTAA143458) to facilitate the two agencies’ cooperative work. Under these agreements, DGGS contracted the USGS in Denver to reanalyze historical USGS sediment samples from Alaska. For this report, DGGS funded reanalysis of 670 historical USGS sediment samples from the statewide Alaska Geochemical Database Version 2.0 (AGDB2; Granitto and others, 2013). Samples were chosen from the northeastern Alaska Range, in the Healy, Mount Hayes, Nabesna, and Tanacross quadrangles, Alaska (fig. 1). The USGS was responsible for sample retrieval from the National Geochemical Sample Archive (NGSA) in Denver, Colorado through the final quality assurance/quality control (QA/QC) of the geochemical analyses obtained through the USGS contract lab. The new geochemical data are published in this report as a coauthored DGGS report, and will be incorporated into the statewide geochemical

  10. Automated storm water sampling on small watersheds

    USGS Publications Warehouse

    Harmel, R.D.; King, K.W.; Slade, R.M.

    2003-01-01

    Few guidelines are currently available to assist in designing appropriate automated storm water sampling strategies for small watersheds. Therefore, guidance is needed to develop strategies that achieve an appropriate balance between accurate characterization of storm water quality and loads and limitations of budget, equipment, and personnel. In this article, we explore the important sampling strategy components (minimum flow threshold, sampling interval, and discrete versus composite sampling) and project-specific considerations (sampling goal, sampling and analysis resources, and watershed characteristics) based on personal experiences and pertinent field and analytical studies. These components and considerations are important in achieving the balance between sampling goals and limitations because they determine how and when samples are taken and the potential sampling error. Several general recommendations are made, including: setting low minimum flow thresholds, using flow-interval or variable time-interval sampling, and using composite sampling to limit the number of samples collected. Guidelines are presented to aid in selection of an appropriate sampling strategy based on user's project-specific considerations. Our experiences suggest these recommendations should allow implementation of a successful sampling strategy for most small watershed sampling projects with common sampling goals.
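
    As one way to picture the flow-interval strategy recommended above, the following sketch triggers a discrete aliquot for a composite sample each time a fixed volume of flow has accumulated, and only while discharge exceeds a minimum flow threshold; the threshold, pacing volume, timestep, and synthetic hydrograph are assumptions for demonstration, not values from the article.

    ```python
    def flow_interval_sampler(flow_series, dt_s, min_flow=0.01, pacing_volume=50.0):
        """Return the time indices at which aliquots would be pulled.

        flow_series   -- discharge readings (m^3/s) at a fixed timestep
        dt_s          -- timestep in seconds
        min_flow      -- minimum flow threshold below which no sampling occurs
        pacing_volume -- cumulative flow volume (m^3) between successive aliquots
        """
        triggers, accumulated = [], 0.0
        for i, q in enumerate(flow_series):
            if q < min_flow:
                continue
            accumulated += q * dt_s
            if accumulated >= pacing_volume:
                triggers.append(i)
                accumulated -= pacing_volume
        return triggers

    # Synthetic storm hydrograph sampled every 60 s (assumed values).
    hydrograph = [0.0, 0.02, 0.08, 0.20, 0.35, 0.30, 0.18, 0.09, 0.03, 0.0]
    print(flow_interval_sampler(hydrograph, dt_s=60))
    ```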

  11. 19 CFR 151.52 - Sampling procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 2 2011-04-01 2011-04-01 false Sampling procedures. 151.52 Section 151.52 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Metal-Bearing Ores and Other Metal-Bearing Materials § 151.52 Sampling procedures. (a) Commercial samples taken under Customs supervision...

  12. 19 CFR 151.52 - Sampling procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Sampling procedures. 151.52 Section 151.52 Customs... (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Metal-Bearing Ores and Other Metal-Bearing Materials § 151.52 Sampling procedures. (a) Commercial samples taken under Customs supervision...

  13. 40 CFR 761.323 - Sample preparation.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 761.323 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL... Remediation Waste Samples § 761.323 Sample preparation. (a) The comparison study requires analysis of a minimum of 10 samples weighing at least 300 grams each. Samples of PCB remediation waste used in the...

  14. 40 CFR 761.323 - Sample preparation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 761.323 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL... Remediation Waste Samples § 761.323 Sample preparation. (a) The comparison study requires analysis of a minimum of 10 samples weighing at least 300 grams each. Samples of PCB remediation waste used in the...

  15. 40 CFR 761.323 - Sample preparation.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 761.323 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) TOXIC SUBSTANCES CONTROL... Remediation Waste Samples § 761.323 Sample preparation. (a) The comparison study requires analysis of a minimum of 10 samples weighing at least 300 grams each. Samples of PCB remediation waste used in the...

  16. 19 CFR 151.52 - Sampling procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    .... Representative commercial moisture and assay samples shall be taken under Customs supervision for testing by the Customs laboratory. The samples used for the moisture test shall be representative of the shipment at the... verified commercial moisture sample and prepared assay sample certified to be representative of the...

  17. Computer Graphics Simulations of Sampling Distributions.

    ERIC Educational Resources Information Center

    Gordon, Florence S.; Gordon, Sheldon P.

    1989-01-01

    Describes the use of computer graphics simulations to enhance student understanding of sampling distributions that arise in introductory statistics. Highlights include the distribution of sample proportions, the distribution of the difference of sample means, the distribution of the difference of sample proportions, and the distribution of sample…

  18. Design-based Sample and Probability Law-Assumed Sample: Their Role in Scientific Investigation.

    ERIC Educational Resources Information Center

    Ojeda, Mario Miguel; Sahai, Hardeo

    2002-01-01

    Discusses some key statistical concepts in probabilistic and non-probabilistic sampling to provide an overview for understanding the inference process. Suggests a statistical model constituting the basis of statistical inference and provides a brief review of the finite population descriptive inference and a quota sampling inferential theory.…

  19. Sampling in research on interpersonal aggression.

    PubMed

    Nielsen, Morten Birkeland; Einarsen, Ståle

    2008-01-01

    The aim of this study was to investigate the usefulness of convenience samples in research on interpersonal aggression among adults. It was hypothesised that convenience-sampled targets of aggression differ from targets in general with regard to both demographic characteristics and the degree of aggression they are exposed to. A convenience sample comprising support-seeking targets of workplace bullying was compared with a representative sample of Norwegian targets of bullying. The results showed that the two samples differed significantly on all demographic variables investigated, except gender. A far higher percentage of the convenience sample had blown the whistle on illegal, immoral, or illegitimate practices at their workplace, and they also reported significantly more frequent and more intense exposure to aggression. The findings confirm that convenience samples have low external validity when generalising to the general population. Such samples should therefore mainly be used to investigate tendencies in, and the phenomenology of, interpersonal aggression, in studies where generalisability is not the principal objective. Copyright 2007 Wiley-Liss, Inc.

  20. Subrandom methods for multidimensional nonuniform sampling.

    PubMed

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
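
    A minimal sketch of the contrast the abstract draws, seeded pseudorandom versus seed-free subrandom selection from a weighted Nyquist grid, is shown below for one dimension; the grid size, exponential weighting, and golden-ratio additive-recurrence sequence are illustrative assumptions rather than the specific schemes compared in the paper.

    ```python
    import numpy as np

    def weighted_schedule(n_grid=256, n_samples=64, decay=2.0, mode="subrandom", seed=0):
        """Select up to n_samples grid indices with probability ~ exp(-decay * t/T)."""
        t = np.arange(n_grid)
        weights = np.exp(-decay * t / n_grid)
        cdf = np.cumsum(weights) / weights.sum()

        if mode == "pseudorandom":
            u = np.random.default_rng(seed).random(n_samples)   # seed-dependent draw
        else:  # subrandom: additive recurrence with the golden ratio, no seed needed
            phi = (np.sqrt(5.0) - 1.0) / 2.0
            u = np.mod(phi * np.arange(1, n_samples + 1), 1.0)

        idx = np.searchsorted(cdf, u)   # inverse-CDF mapping onto the grid
        return np.unique(idx)           # duplicates collapse, so the schedule may shrink slightly

    print(len(weighted_schedule(mode="subrandom")), len(weighted_schedule(mode="pseudorandom")))
    ```

    Rerunning the pseudorandom branch with a different seed gives a different schedule, whereas the subrandom branch is fully determined by the grid and weighting, which is the seed-independence the abstract emphasizes.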

  1. Integrated Sampling Strategy (ISS) Guide

    Treesearch

    Robert E. Keane; Duncan C. Lutes

    2006-01-01

    What is an Integrated Sampling Strategy? Simply put, it is the strategy that guides how plots are put on the landscape. FIREMON’s Integrated Sampling Strategy assists fire managers as they design their fire monitoring project by answering questions such as: What statistical approach is appropriate for my sample design? How many plots can I afford? How many plots do I...

  2. 27 CFR 6.91 - Samples.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Samples. 6.91 Section 6.91 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND TRADE BUREAU, DEPARTMENT OF THE TREASURY LIQUORS “TIED-HOUSE” Exceptions § 6.91 Samples. The act by an industry member of furnishing or giving a sample of distilled spirits, wine, o...

  3. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements..., shale, iron oxide, and alumina). Facilities that opt to use the default total organic carbon factor... quantity of each category of raw materials consumed by the facility (e.g., limestone, sand, shale, iron...

  4. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements..., shale, iron oxide, and alumina). Facilities that opt to use the default total organic carbon factor... quantity of each category of raw materials consumed by the facility (e.g., limestone, sand, shale, iron...

  5. 40 CFR 98.84 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Cement Production § 98.84 Monitoring and QA/QC requirements..., shale, iron oxide, and alumina). Facilities that opt to use the default total organic carbon factor... quantity of each category of raw materials consumed by the facility (e.g., limestone, sand, shale, iron...

  6. 40 CFR 98.144 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Glass Production § 98.144 Monitoring and QA/QC requirements. (a) You must measure annual amounts of carbonate-based raw materials charged to each continuous glass... calibrated scales or weigh hoppers. Total annual mass charged to glass melting furnaces at the facility shall...

  7. 40 CFR 98.74 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Relative Molecular Mass of Petroleum Oils from Viscosity Measurements (incorporated by reference, see § 98... Weight) of Hydrocarbons by Thermoelectric Measurement of Vapor Pressure (incorporated by reference, see... measurements according to the monitoring and QA/QC requirements for the Tier 3 methodology in § 98.34(b). (e...

  8. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...

  9. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...

  10. 40 CFR 98.364 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... or operator shall document the procedures used to ensure the accuracy of gas flow rate, gas... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Manure Management § 98.364 Monitoring and QA/QC requirements... fraction of total manure managed in each system component. (c) The CH4 concentration of gas from digesters...

  11. 40 CFR 98.414 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Industrial Greenhouse Gases § 98.414 Monitoring... or better. If the mass in paragraph (a) of this section is measured by weighing containers that...

  12. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids... product or natural gas liquid on any day of each calendar month of the reporting year in which the...

  13. Spot sputum samples are at least as good as early morning samples for identifying Mycobacterium tuberculosis.

    PubMed

    Murphy, Michael E; Phillips, Patrick P J; Mendel, Carl M; Bongard, Emily; Bateson, Anna L C; Hunt, Robert; Murthy, Saraswathi; Singh, Kasha P; Brown, Michael; Crook, Angela M; Nunn, Andrew J; Meredith, Sarah K; Lipman, Marc; McHugh, Timothy D; Gillespie, Stephen H

    2017-10-27

    The use of early morning sputum samples (EMS) to diagnose tuberculosis (TB) can result in treatment delay given the need for the patient to return to the clinic with the EMS, increasing the chance of patients being lost during their diagnostic workup. However, there is little evidence to support the superiority of EMS over spot sputum samples. In this new analysis of the REMoxTB study, we compare the diagnostic accuracy of EMS with spot samples for identifying Mycobacterium tuberculosis pre- and post-treatment. Patients who were smear positive at screening were enrolled into the study. Paired sputum samples (one EMS and one spot) were collected at each trial visit pre- and post-treatment. Microscopy and culture on solid LJ and liquid MGIT media were performed on all samples; those missing corresponding paired results were excluded from the analyses. Data from 1115 pre- and 2995 post-treatment paired samples from 1931 patients enrolled in the REMoxTB study were analysed. Patients were recruited from South Africa (47%), East Africa (21%), India (20%), Asia (11%), and North America (1%); 70% were male, median age 31 years (IQR 24-41), 139 (7%) co-infected with HIV with a median CD4 cell count of 399 cells/μL (IQR 318-535). Pre-treatment spot samples had a higher yield of positive Ziehl-Neelsen smears (98% vs. 97%, P = 0.02) and LJ cultures (87% vs. 82%, P = 0.006) than EMS, but there was no difference for positivity by MGIT (93% vs. 95%, P = 0.18). Contaminated and false-positive MGIT were found more often with EMS rather than spot samples. Surprisingly, pre-treatment EMS had a higher smear grading and shorter time-to-positivity, by 1 day, than spot samples in MGIT culture (4.5 vs. 5.5 days, P < 0.001). There were no differences in time to positivity in pre-treatment LJ culture, or in post-treatment MGIT or LJ cultures. Comparing EMS and spot samples in those with unfavourable outcomes, there were no differences in smear or culture results, and

  14. Sampling Methodologies for Epidemiologic Surveillance of Men Who Have Sex with Men and Transgender Women in Latin America: An Empiric Comparison of Convenience Sampling, Time Space Sampling, and Respondent Driven Sampling

    PubMed Central

    Clark, J. L.; Konda, K. A.; Silva-Santisteban, A.; Peinado, J.; Lama, J. R.; Kusunoki, L.; Perez-Brumer, A.; Pun, M.; Cabello, R.; Sebastian, J. L.; Suarez-Ognio, L.; Sanchez, J.

    2014-01-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June–August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants’ self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods. PMID:24362754

  15. Sampling methodologies for epidemiologic surveillance of men who have sex with men and transgender women in Latin America: an empiric comparison of convenience sampling, time space sampling, and respondent driven sampling.

    PubMed

    Clark, J L; Konda, K A; Silva-Santisteban, A; Peinado, J; Lama, J R; Kusunoki, L; Perez-Brumer, A; Pun, M; Cabello, R; Sebastian, J L; Suarez-Ognio, L; Sanchez, J

    2014-12-01

    Alternatives to convenience sampling (CS) are needed for HIV/STI surveillance of most-at-risk populations in Latin America. We compared CS, time space sampling (TSS), and respondent driven sampling (RDS) for recruitment of men who have sex with men (MSM) and transgender women (TW) in Lima, Peru. During concurrent 60-day periods from June-August, 2011, we recruited MSM/TW for epidemiologic surveillance using CS, TSS, and RDS. A total of 748 participants were recruited through CS, 233 through TSS, and 127 through RDS. The TSS sample included the largest proportion of TW (30.7 %) and the lowest percentage of subjects who had previously participated in HIV/STI research (14.9 %). The prevalence of newly diagnosed HIV infection, according to participants' self-reported previous HIV diagnosis, was highest among TSS recruits (17.9 %) compared with RDS (12.6 %) and CS (10.2 %). TSS identified diverse populations of MSM/TW with higher prevalences of HIV/STIs not accessed by other methods.

  16. 40 CFR 1065.805 - Sampling system.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Sampling system. 1065.805 Section 1065... ENGINE-TESTING PROCEDURES Testing With Oxygenated Fuels § 1065.805 Sampling system. (a) Dilute engine exhaust, and use batch sampling to collect proportional flow-weighted dilute samples of the applicable...

  17. 30 CFR 90.207 - Compliance sampling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Compliance sampling. 90.207 Section 90.207... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.207 Compliance sampling. (a) The operator shall take five valid respirable dust samples for...

  18. 30 CFR 90.208 - Bimonthly sampling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 30 Mineral Resources 1 2011-07-01 2011-07-01 false Bimonthly sampling. 90.208 Section 90.208... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.208 Bimonthly sampling. (a) Each operator shall take one valid respirable dust sample for...

  19. 40 CFR 1065.805 - Sampling system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Sampling system. 1065.805 Section 1065... ENGINE-TESTING PROCEDURES Testing With Oxygenated Fuels § 1065.805 Sampling system. (a) Dilute engine exhaust, and use batch sampling to collect proportional flow-weighted dilute samples of the applicable...

  20. 7 CFR 58.227 - Sampling device.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Sampling device. 58.227 Section 58.227 Agriculture....227 Sampling device. If automatic sampling devices are used, they shall be constructed in such a.... The type of sampler and the sampling procedure shall be as approved by the Administrator. ...

  1. 7 CFR 58.227 - Sampling device.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Sampling device. 58.227 Section 58.227 Agriculture....227 Sampling device. If automatic sampling devices are used, they shall be constructed in such a.... The type of sampler and the sampling procedure shall be as approved by the Administrator. ...

  2. 30 CFR 90.207 - Compliance sampling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Compliance sampling. 90.207 Section 90.207... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.207 Compliance sampling. (a) The operator shall take five valid respirable dust samples for...

  3. 30 CFR 90.208 - Bimonthly sampling.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Bimonthly sampling. 90.208 Section 90.208... MANDATORY HEALTH STANDARDS-COAL MINERS WHO HAVE EVIDENCE OF THE DEVELOPMENT OF PNEUMOCONIOSIS Sampling Procedures § 90.208 Bimonthly sampling. (a) Each operator shall take one valid respirable dust sample for...

  4. Simple street tree sampling

    Treesearch

    David J. Nowak; Jeffrey T. Walton; James Baldwin; Jerry Bond

    2015-01-01

    Information on street trees is critical for management of this important resource. Sampling of street tree populations provides an efficient means to obtain street tree population information. Long-term repeat measures of street tree samples supply additional information on street tree changes and can be used to report damages from catastrophic events. Analyses of...

  5. Rational Learning and Information Sampling: On the "Naivety" Assumption in Sampling Explanations of Judgment Biases

    ERIC Educational Resources Information Center

    Le Mens, Gael; Denrell, Jerker

    2011-01-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them.…

  6. Effects of sample handling methods on substance P concentrations and immunoreactivity in bovine blood samples.

    PubMed

    Mosher, Ruby A; Coetzee, Johann F; Allen, Portia S; Havel, James A; Griffith, Gary R; Wang, Chong

    2014-02-01

    To determine the effects of protease inhibitors and holding times and temperatures before processing on the stability of substance P in bovine blood samples. Blood samples obtained from a healthy 6-month-old calf. Blood samples were dispensed into tubes containing exogenous substance P and 1 of 6 degradative enzyme inhibitor treatments: heparin, EDTA, EDTA with 1 of 2 concentrations of aprotinin, or EDTA with 1 of 2 concentrations of a commercially available protease inhibitor cocktail. Plasma was harvested immediately following collection or after 1, 3, 6, 12, or 24 hours of holding at ambient (20.3° to 25.4°C) or ice bath temperatures. Total substance P immunoreactivity was determined with an ELISA; concentrations of the substance P parent molecule, a metabolite composed of the 9 terminal amino acids, and a metabolite composed of the 5 terminal amino acids were determined with liquid chromatography-tandem mass spectrometry. Regarding blood samples processed immediately, no significant differences in substance P concentrations or immunoreactivity were detected among enzyme inhibitor treatments. In blood samples processed at 1 hour of holding, substance P parent molecule concentration was significantly lower for ambient temperature versus ice bath temperature holding conditions; aprotinin was the most effective inhibitor of substance P degradation at the ice bath temperature. The ELISA substance P immunoreactivity was typically lower for blood samples with heparin versus samples with other inhibitors processed at 1 hour of holding in either temperature condition. Results suggested that blood samples should be chilled and plasma harvested within 1 hour after collection to prevent substance P degradation.

  7. Sample Preparation Report of the Fourth OPCW Confidence Building Exercise on Biomedical Sample Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Udey, R. N.; Corzett, T. H.; Alcaraz, A.

    Following the successful completion of the 3rd biomedical confidence building exercise (February 2013 – March 2013), which included the analysis of plasma and urine samples spiked at low ppb levels as part of the exercise scenario, another confidence building exercise was targeted to be conducted in 2014. In this 4th exercise, it was desired to focus specifically on the analysis of plasma samples. The scenario was designed as an investigation of an alleged use of chemical weapons where plasma samples were collected, as plasma has been reported to contain CWA adducts which remain present in the human body for several weeks (Solano et al. 2008). In the 3rd exercise most participants used the fluoride regeneration method to analyze for the presence of nerve agents in plasma samples. For the 4th biomedical exercise it was decided to evaluate the analysis of human plasma samples for the presence/absence of the VX adducts and aged adducts to blood proteins (e.g., VX-butyrylcholinesterase (BuChE) and aged BuChE adducts using a pepsin digest technique to yield nonapeptides; or equivalent). As the aging of VX-BuChE adducts is relatively slow (t1/2 = 77 hr at 37 °C [Aurbek et al. 2009]), soman (GD), which ages much more quickly (t1/2 = 9 min at 37 °C [Masson et al. 2010]), was used to simulate an aged VX sample. Additional objectives of this exercise included having laboratories assess novel OP-adducted plasma sample preparation techniques and analytical instrumentation methodologies, as well as refining/designating the reporting formats for these new techniques.

  8. Decreasing Errors in Reading-Related Matching to Sample Using a Delayed-Sample Procedure

    ERIC Educational Resources Information Center

    Doughty, Adam H.; Saunders, Kathryn J.

    2009-01-01

    Two men with intellectual disabilities initially demonstrated intermediate accuracy in two-choice matching-to-sample (MTS) procedures. A printed-letter identity MTS procedure was used with 1 participant, and a spoken-to-printed-word MTS procedure was used with the other participant. Errors decreased substantially under a delayed-sample procedure,…

  9. A comparison of four-sample slope-intercept and single-sample 51Cr-EDTA glomerular filtration rate measurements.

    PubMed

    Porter, Charlotte A; Bradley, Kevin M; McGowan, Daniel R

    2018-05-01

    The aim of this study was to verify, with a large dataset of 1394 51Cr-EDTA glomerular filtration rate (GFR) studies, the equivalence of slope-intercept and single-sample GFR. Raw data from 1394 patient studies were used to calculate four-sample slope-intercept GFR in addition to four individual single-sample GFR values (blood samples taken at 90, 150, 210 and 270 min after injection). The percentage differences between the four-sample slope-intercept and each of the single-sample GFR values were calculated, to identify the optimum single-sample time point. Having identified the optimum time point, the percentage difference between the slope-intercept and optimal single-sample GFR was calculated across a range of GFR values to investigate whether there was a GFR value below which the two methodologies cannot be considered equivalent. It was found that the lowest percentage difference between slope-intercept and single-sample GFR was for the third blood sample, taken at 210 min after injection. The median percentage difference was 2.5% and only 6.9% of patient studies had a percentage difference greater than 10%. Above a GFR value of 30 ml/min/1.73 m², the median percentage difference between the slope-intercept and optimal single-sample GFR values was below 10%, and so it was concluded that, above this value, the two techniques are sufficiently equivalent. This study supports the recommendation of performing single-sample GFR measurements for GFRs greater than 30 ml/min/1.73 m².
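
    For readers unfamiliar with the slope-intercept calculation being compared, here is a rough sketch of an uncorrected slope-intercept clearance (dose divided by the area under a fitted mono-exponential) together with the percentage-difference metric used in the comparison; the sample times, concentrations, injected dose, and single-sample value are hypothetical, and the usual early-distribution correction and body-surface-area normalisation are omitted.

    ```python
    import numpy as np

    def slope_intercept_clearance(times_min, conc, dose):
        """Fit ln(C) = ln(C0) - k*t to the late samples; for a mono-exponential,
        AUC = C0/k, so plasma clearance = dose / AUC = dose * k / C0.
        (Early-distribution correction and BSA normalisation are omitted here.)"""
        slope, intercept = np.polyfit(times_min, np.log(conc), 1)
        k, C0 = -slope, np.exp(intercept)
        return dose * k / C0   # same volume units as conc denominator, per minute

    def percent_difference(gfr_slope_intercept, gfr_single_sample):
        return 100.0 * (gfr_single_sample - gfr_slope_intercept) / gfr_slope_intercept

    # Hypothetical four-sample study: times in minutes, plasma counts per ml, injected counts.
    t = np.array([90.0, 150.0, 210.0, 270.0])
    c = np.array([124.0, 90.0, 65.0, 47.0])
    cl = slope_intercept_clearance(t, c, dose=3.0e6)
    print(round(cl, 1), "ml/min (uncorrected)")
    print(round(percent_difference(cl, 82.5), 1), "% vs. a hypothetical single-sample value")
    ```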

  10. Ground Water Sampling at ISCO Sites - Residual Oxidant Impact on Sample Quality and Sample Preservation Guideline

    EPA Science Inventory

    In-situ chemical oxidation (ISCO) involves the delivery of a chemical oxidant into the subsurface where oxidative reactions transform ground water contaminants into less toxic or harmless byproducts. Due to oxidant persistence, ground water samples collected at hazardous waste si...

  11. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 2 2011-10-01 2011-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling study...

  12. 42 CFR 402.109 - Statistical sampling.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 2 2010-10-01 2010-10-01 false Statistical sampling. 402.109 Section 402.109... Statistical sampling. (a) Purpose. CMS or OIG may introduce the results of a statistical sampling study to... or caused to be presented. (b) Prima facie evidence. The results of the statistical sampling study...

  13. Optimum sample size allocation to minimize cost or maximize power for the two-sample trimmed mean test.

    PubMed

    Guo, Jiin-Huarng; Luh, Wei-Ming

    2009-05-01

    When planning a study, sample size determination is one of the most important tasks facing the researcher. The size will depend on the purpose of the study, the cost limitations, and the nature of the data. By specifying the standard deviation ratio and/or the sample size ratio, the present study considers the problem of heterogeneous variances and non-normality for Yuen's two-group test and develops sample size formulas to minimize the total cost or maximize the power of the test. For a given power, the sample size allocation ratio can be manipulated so that the proposed formulas can minimize the total cost, the total sample size, or the sum of total sample size and total cost. On the other hand, for a given total cost, the optimum sample size allocation ratio can maximize the statistical power of the test. After the sample size is determined, the present simulation applies Yuen's test to the sample generated, and then the procedure is validated in terms of Type I errors and power. Simulation results show that the proposed formulas can control Type I errors and achieve the desired power under the various conditions specified. Finally, the implications for determining sample sizes in experimental studies and future research are discussed.
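
    The sample-size formulas themselves are not given in the abstract, but the validation step it describes, checking Type I error and power of Yuen's trimmed-mean test by simulation for a chosen allocation, can be sketched as below; the effect size, variance ratio, trim proportion, and 1:2 allocation are arbitrary illustrative choices (SciPy 1.7+ exposes the trimmed Yuen-Welch test through the `trim` argument of `ttest_ind`).

    ```python
    import numpy as np
    from scipy import stats

    def simulated_power(n1, n2, delta, sd1=1.0, sd2=2.0, trim=0.2,
                        alpha=0.05, n_sims=2000, seed=1):
        """Monte Carlo rejection rate of Yuen's trimmed-mean test (Welch form)
        for two normal groups with unequal variances and unequal sample sizes."""
        rng = np.random.default_rng(seed)
        rejections = 0
        for _ in range(n_sims):
            x = rng.normal(0.0, sd1, n1)
            y = rng.normal(delta, sd2, n2)
            p = stats.ttest_ind(x, y, equal_var=False, trim=trim).pvalue
            rejections += p < alpha
        return rejections / n_sims

    # Rejection rate under the null (should sit near alpha) and power under delta = 1.0,
    # with a 1:2 allocation that places more observations in the noisier group.
    print(simulated_power(30, 60, delta=0.0))
    print(simulated_power(30, 60, delta=1.0))
    ```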

  14. 40 CFR 61.34 - Air sampling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 8 2011-07-01 2011-07-01 false Air sampling. 61.34 Section 61.34... sampling. (a) Stationary sources subject to § 61.32(b) shall locate air sampling sites in accordance with a... concentrations calculated within 30 days after filters are collected. Records of concentrations at all sampling...

  15. 7 CFR 51.17 - Official sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Official sampling. 51.17 Section 51.17 Agriculture... Inspection Service § 51.17 Official sampling. Samples may be officially drawn by any duly authorized... time and place of the sampling and the brands or other identifying marks of the containers from which...

  16. 7 CFR 51.17 - Official sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Official sampling. 51.17 Section 51.17 Agriculture... Inspection Service § 51.17 Official sampling. Samples may be officially drawn by any duly authorized... time and place of the sampling and the brands or other identifying marks of the containers from which...

  17. 40 CFR 761.323 - Sample preparation.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Remediation Waste Samples § 761.323 Sample preparation. (a) The comparison study requires analysis of a minimum of 10 samples weighing at least 300 grams each. Samples of PCB remediation waste used in the... PCB remediation waste at the cleanup site, or must be the same kind of material as that waste. For...

  18. Non-destructive sampling of a comet

    NASA Astrophysics Data System (ADS)

    Jessberger, H. L.; Kotthaus, M.

    1991-04-01

    Various conditions which must be met for the development of a nondestructive sampling and acquisition system are outlined and the development of a new robotic sampling system suited for use on a cometary surface is briefly discussed. The Rosetta mission of ESA will take samples of a comet nucleus and return both core and volatile samples to earth. Various considerations which must be taken into account for such a project are examined including the identification of design parameters for sample quality; the identification of the most probable site conditions; the development of a sample acquisition system with respect to these conditions; the production of model materials and model conditions; and the investigation of the relevant material properties. An adequate sampling system should also be designed and built, including various tools, and the system should be tested under simulated cometary conditions.

  19. Using random telephone sampling to recruit generalizable samples for family violence studies.

    PubMed

    Slep, Amy M Smith; Heyman, Richard E; Williams, Mathew C; Van Dyke, Cheryl E; O'Leary, Susan G

    2006-12-01

    Convenience sampling methods predominate in recruiting for laboratory-based studies within clinical and family psychology. The authors used random digit dialing (RDD) to determine whether they could feasibly recruit generalizable samples for 2 studies (a parenting study and an intimate partner violence study). RDD screen response rate was 42-45%; demographics matched those in the 2000 U.S. Census, with small- to medium-sized differences on race, age, and income variables. RDD respondents who qualified for, but did not participate in, the laboratory study of parents showed small differences on income, couple conflicts, and corporal punishment. Time and cost are detailed, suggesting that RDD may be a feasible, effective method by which to recruit more generalizable samples for in-laboratory studies of family violence when those studies have sufficient resources. (c) 2006 APA, all rights reserved.

  20. Sample design effects in landscape genetics

    USGS Publications Warehouse

    Oyler-McCance, Sara J.; Fedy, Bradley C.; Landguth, Erin L.

    2012-01-01

    An important research gap in landscape genetics is the impact of different field sampling designs on the ability to detect the effects of landscape pattern on gene flow. We evaluated how five different sampling regimes (random, linear, systematic, cluster, and single study site) affected the probability of correctly identifying the generating landscape process of population structure. Sampling regimes were chosen to represent a suite of designs common in field studies. We used genetic data generated from a spatially-explicit, individual-based program and simulated gene flow in a continuous population across a landscape with gradual spatial changes in resistance to movement. Additionally, we evaluated the sampling regimes using realistic and obtainable numbers of loci (10 and 20), alleles per locus (5 and 10), individuals sampled (10-300), and generations after the landscape was introduced (20 and 400). For a simulated continuously distributed species, we found that the random, linear, and systematic sampling regimes performed well with high sample sizes (>200), levels of polymorphism (10 alleles per locus), and numbers of molecular markers (20). The cluster and single study site sampling regimes were not able to correctly identify the generating process under any conditions and thus are not advisable strategies for scenarios similar to our simulations. Our research emphasizes the importance of sampling data at ecologically appropriate spatial and temporal scales and suggests careful consideration for sampling near landscape components that are likely to most influence the genetic structure of the species. In addition, simulating sampling designs a priori could help guide field data collection efforts.
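
    To make the five regimes concrete, the sketch below generates illustrative point layouts for each of them on a square landscape; the extent, sample size, cluster count, and dispersion values are arbitrary assumptions, not the settings used in the simulations.

    ```python
    import numpy as np

    def sampling_coordinates(regime, n=100, extent=100.0, rng=None):
        """Illustrative spatial layouts for the five sampling regimes compared
        in the study (all parameters here are arbitrary demonstration choices)."""
        rng = rng or np.random.default_rng(0)
        if regime == "random":
            return rng.uniform(0, extent, size=(n, 2))
        if regime == "linear":                      # single transect across the landscape
            x = np.linspace(0, extent, n)
            return np.column_stack([x, np.full(n, extent / 2)])
        if regime == "systematic":                  # regular grid
            side = int(np.ceil(np.sqrt(n)))
            g = np.linspace(0, extent, side)
            xx, yy = np.meshgrid(g, g)
            return np.column_stack([xx.ravel(), yy.ravel()])[:n]
        if regime == "cluster":                     # a few tight clusters
            centers = rng.uniform(0, extent, size=(5, 2))
            return centers[rng.integers(0, 5, n)] + rng.normal(0, 2.0, size=(n, 2))
        if regime == "single":                      # one study site
            center = rng.uniform(0, extent, size=2)
            return center + rng.normal(0, 2.0, size=(n, 2))
        raise ValueError(regime)

    for r in ["random", "linear", "systematic", "cluster", "single"]:
        print(r, sampling_coordinates(r).shape)
    ```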

  1. Double-cross hydrostatic pressure sample injection for chip CE: variable sample plug volume and minimum number of electrodes.

    PubMed

    Luo, Yong; Wu, Dapeng; Zeng, Shaojiang; Gai, Hongwei; Long, Zhicheng; Shen, Zheng; Dai, Zhongpeng; Qin, Jianhua; Lin, Bingcheng

    2006-09-01

    A novel sample injection method for chip CE was presented. This injection method uses hydrostatic pressure, generated by emptying the sample waste reservoir, for sample loading and electrokinetic force for dispensing. The injection was performed on a double-cross microchip. One cross, created by the sample and separation channels, is used for formation of a sample plug. Another cross, formed by the sample and controlling channels, is used for plug control. By varying the electric field in the controlling channel, the sample plug volume can be linearly adjusted. Hydrostatic pressure takes advantage of its ease of generation on a microfluidic chip, without any electrode or external pressure pump, thus allowing a sample injection with a minimum number of electrodes. The potential of this injection method was demonstrated by a four-separation-channel chip CE system. In this system, parallel sample separation can be achieved with only two electrodes, which is otherwise impossible with conventional injection methods. Hydrostatic pressure maintains the sample composition during the sample loading, allowing the injection to be free of injection bias.

  2. Sample rotating turntable kit for infrared spectrometers

    DOEpatents

    Eckels, Joel Del [Livermore, CA; Klunder, Gregory L [Oakland, CA

    2008-03-04

    An infrared spectrometer sample rotating turntable kit has a rotatable sample cup containing the sample. The infrared spectrometer has an infrared spectrometer probe for analyzing the sample and the rotatable sample cup is adapted to receive the infrared spectrometer probe. A reflectance standard is located in the rotatable sample cup. A sleeve is positioned proximate the sample cup and adapted to receive the probe. A rotator rotates the rotatable sample cup. A battery is connected to the rotator.

  3. 40 CFR 98.444 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 22 2012-07-01 2012-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Geologic Sequestration of Carbon Dioxide § 98.444 Monitoring... volume of contents in all containers if you receive CO2 in containers by following the most appropriate...

  4. 40 CFR 98.444 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 21 2014-07-01 2014-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Geologic Sequestration of Carbon Dioxide § 98.444 Monitoring... volume of contents in all containers if you receive CO2 in containers by following the most appropriate...

  5. 40 CFR 98.444 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 22 2013-07-01 2013-07-01 false Monitoring and QA/QC requirements. 98... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Geologic Sequestration of Carbon Dioxide § 98.444 Monitoring... volume of contents in all containers if you receive CO2 in containers by following the most appropriate...

  6. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... each petroleum product or natural gas liquid on any day of each calendar month of the reporting year in...

  7. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... each petroleum product or natural gas liquid on any day of each calendar month of the reporting year in...

  8. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... or natural gas liquid on any day of each calendar month of the reporting year in which the quantity...

  9. 40 CFR 98.394 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (CONTINUED) MANDATORY GREENHOUSE GAS REPORTING Suppliers of Petroleum Products § 98.394 Monitoring and QA/QC requirements. (a) Determination of quantity. (1) The quantity of petroleum products, natural gas liquids, and... each petroleum product or natural gas liquid on any day of each calendar month of the reporting year in...

  10. 40 CFR 761.130 - Sampling requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sampling scheme and the guidance document are available on EPA's PCB Web site at http://www.epa.gov/pcb, or... § 761.125(c) (2) through (4). Using its best engineering judgment, EPA may sample a statistically valid random or grid sampling technique, or both. When using engineering judgment or random “grab” samples, EPA...

  11. 40 CFR 761.130 - Sampling requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... sampling scheme and the guidance document are available on EPA's PCB Web site at http://www.epa.gov/pcb, or... § 761.125(c) (2) through (4). Using its best engineering judgment, EPA may sample a statistically valid random or grid sampling technique, or both. When using engineering judgment or random “grab” samples, EPA...

  12. Sampling design optimization for spatial functions

    USGS Publications Warehouse

    Olea, R.A.

    1984-01-01

    A new procedure is presented for minimizing the sampling requirements necessary to estimate a mappable spatial function at a specified level of accuracy. The technique is based on universal kriging, an estimation method within the theory of regionalized variables. Neither actual implementation of the sampling nor universal kriging estimations are necessary to make an optimal design. The average standard error and maximum standard error of estimation over the sampling domain are used as global indices of sampling efficiency. The procedure optimally selects those parameters controlling the magnitude of the indices, including the density and spatial pattern of the sample elements and the number of nearest sample elements used in the estimation. As an illustration, the network of observation wells used to monitor the water table in the Equus Beds of Kansas is analyzed and an improved sampling pattern suggested. This example demonstrates the practical utility of the procedure, which can be applied equally well to other spatial sampling problems, as the procedure is not limited by the nature of the spatial function. © 1984 Plenum Publishing Corporation.
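
    A rough sketch of the two efficiency indices, average and maximum kriging standard error over the domain, evaluated for a candidate sampling design is given below. It substitutes ordinary kriging from the pykrige package with an assumed spherical variogram for the paper's universal kriging, and the well layout and synthetic observations are hypothetical.

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging   # ordinary kriging as a simple stand-in

    def design_indices(xy, z, extent=100.0, n_grid=50):
        """Average and maximum kriging standard error over the domain for a
        candidate sampling design, used here as global efficiency indices."""
        ok = OrdinaryKriging(xy[:, 0], xy[:, 1], z, variogram_model="spherical")
        grid = np.linspace(0.0, extent, n_grid)
        _, variance = ok.execute("grid", grid, grid)   # estimation variance on the grid
        se = np.sqrt(variance)
        return float(se.mean()), float(se.max())

    # Hypothetical design: 64 wells on a regular grid with synthetic observations.
    g = np.linspace(5.0, 95.0, 8)
    xx, yy = np.meshgrid(g, g)
    xy = np.column_stack([xx.ravel(), yy.ravel()])
    z = np.sin(xy[:, 0] / 20.0) + 0.1 * np.random.default_rng(0).normal(size=len(xy))
    print(design_indices(xy, z))
    ```

    Comparing these two indices across alternative well layouts, rather than mapping the function itself, is the kind of design-level screening the procedure automates.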

  13. qcML: An Exchange Format for Quality Control Metrics from Mass Spectrometry Experiments*

    PubMed Central

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W. P.; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A.; Kelstrup, Christian D.; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S.; Olsen, Jesper V.; Heck, Albert J. R.; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-01-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. PMID:24760958

  14. qcML: an exchange format for quality control metrics from mass spectrometry experiments.

    PubMed

    Walzer, Mathias; Pernas, Lucia Espona; Nasso, Sara; Bittremieux, Wout; Nahnsen, Sven; Kelchtermans, Pieter; Pichler, Peter; van den Toorn, Henk W P; Staes, An; Vandenbussche, Jonathan; Mazanek, Michael; Taus, Thomas; Scheltema, Richard A; Kelstrup, Christian D; Gatto, Laurent; van Breukelen, Bas; Aiche, Stephan; Valkenborg, Dirk; Laukens, Kris; Lilley, Kathryn S; Olsen, Jesper V; Heck, Albert J R; Mechtler, Karl; Aebersold, Ruedi; Gevaert, Kris; Vizcaíno, Juan Antonio; Hermjakob, Henning; Kohlbacher, Oliver; Martens, Lennart

    2014-08-01

    Quality control is increasingly recognized as a crucial aspect of mass spectrometry based proteomics. Several recent papers discuss relevant parameters for quality control and present applications to extract these from the instrumental raw data. What has been missing, however, is a standard data exchange format for reporting these performance metrics. We therefore developed the qcML format, an XML-based standard that follows the design principles of the related mzML, mzIdentML, mzQuantML, and TraML standards from the HUPO-PSI (Proteomics Standards Initiative). In addition to the XML format, we also provide tools for the calculation of a wide range of quality metrics as well as a database format and interconversion tools, so that existing LIMS systems can easily add relational storage of the quality control data to their existing schema. We here describe the qcML specification, along with possible use cases and an illustrative example of the subsequent analysis possibilities. All information about qcML is available at http://code.google.com/p/qcml. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  15. Sample Curation at a Lunar Outpost

    NASA Technical Reports Server (NTRS)

    Allen, Carlton C.; Lofgren, Gary E.; Treiman, A. H.; Lindstrom, Marilyn L.

    2007-01-01

    The six Apollo surface missions returned 2,196 individual rock and soil samples, with a total mass of 381.6 kg. Samples were collected based on visual examination by the astronauts and consultation with geologists in the science back room in Houston. The samples were photographed during collection, packaged in uniquely-identified containers, and transported to the Lunar Module. All samples collected on the Moon were returned to Earth. NASA's upcoming return to the Moon will be different. Astronauts will have extended stays at an outpost and will collect more samples than they will return. They will need curation and analysis facilities on the Moon in order to carefully select samples for return to Earth.

  16. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Information about the sources of supply, trafficking routes, distribution patterns, and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used for clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount seized throughout the country was very small. Therefore, finding links between samples is more important there than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method used gas chromatography with flame ionization detection (GC-FID), a DB-5 column, and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collected the results into an Excel file and then corrected the retention time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
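
    The similarity step can be illustrated with a short sketch: peak areas from each chromatogram are mapped onto a shared target peak list by nearest retention time, and the Pearson correlation of the aligned vectors is computed. The target retention times, tolerance, and peak areas are hypothetical, and the study's retention-time and response corrections are reduced here to this simple alignment.

    ```python
    import numpy as np

    def align_to_targets(rt, area, target_rt, tol=0.05):
        """Map detected peaks onto a shared target peak list by nearest
        retention time (minutes); unmatched targets get zero area."""
        vec = np.zeros(len(target_rt))
        for t_idx, t in enumerate(target_rt):
            j = np.argmin(np.abs(rt - t))
            if abs(rt[j] - t) <= tol:
                vec[t_idx] = area[j]
        return vec

    def similarity(vec_a, vec_b):
        """Pearson correlation between two aligned impurity profiles."""
        return float(np.corrcoef(vec_a, vec_b)[0, 1])

    # Hypothetical impurity profiles from two seizures (retention times in minutes).
    targets = np.array([4.21, 5.07, 6.33, 7.90, 9.12])
    a = align_to_targets(np.array([4.20, 5.08, 6.35, 9.10]),
                         np.array([120.0, 45.0, 300.0, 80.0]), targets)
    b = align_to_targets(np.array([4.22, 5.05, 6.30, 7.92, 9.13]),
                         np.array([118.0, 50.0, 310.0, 5.0, 82.0]), targets)
    print(round(similarity(a, b), 3))
    ```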

  17. NASA Sample Return Missions: Recovery Operations

    NASA Technical Reports Server (NTRS)

    Pace, L. F.; Cannon, R. E.

    2017-01-01

    The Utah Test and Training Range (UTTR), southwest of Salt Lake City, Utah, is the site of all of NASA's unmanned sample return missions. To date, these missions include the Genesis solar wind samples (2004) and the Stardust cometary and interstellar dust samples (2006). NASA's OSIRIS-REx mission will return its first asteroid sample at UTTR in 2023.

  18. Equilibrium Molecular Thermodynamics from Kirkwood Sampling

    PubMed Central

    2015-01-01

    We present two methods for barrierless equilibrium sampling of molecular systems based on the recently proposed Kirkwood method (J. Chem. Phys. 2009, 130, 134102). Kirkwood sampling employs low-order correlations among internal coordinates of a molecule for random (or non-Markovian) sampling of the high-dimensional conformational space. This is a geometrical sampling method independent of the potential energy surface. The first method is a variant of biased Monte Carlo, where Kirkwood sampling is used for generating trial Monte Carlo moves. Using this method, equilibrium distributions corresponding to different temperatures and potential energy functions can be generated from a given set of low-order correlations. Since Kirkwood samples are generated independently, this method is ideally suited for massively parallel distributed computing. The second approach is a variant of reservoir replica exchange, where Kirkwood sampling is used to construct a reservoir of conformations, which exchanges conformations with the replicas performing equilibrium sampling corresponding to different thermodynamic states. Coupling with the Kirkwood reservoir enhances sampling by facilitating global jumps in the conformational space. The efficiency of both methods depends on the overlap of the Kirkwood distribution with the target equilibrium distribution. We present proof-of-concept results for a model nine-atom linear molecule and alanine dipeptide. PMID:25915525
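
    As a rough illustration of the first method, the sketch below implements a generic independence-sampler Monte Carlo step: trial configurations are drawn from a fixed proposal distribution (standing in for Kirkwood sampling) and accepted with the Metropolis-Hastings ratio, which weights both the target Boltzmann factor and the proposal density. The one-dimensional double-well "molecule", the Gaussian proposal, and the temperature are assumptions for demonstration, not the published method.

      # Minimal sketch of biased Monte Carlo with independently generated trial moves
      # (an independence sampler). A Gaussian proposal stands in for Kirkwood sampling;
      # the 1-D double-well potential is a toy target, not a molecular force field.
      import numpy as np

      rng = np.random.default_rng(0)
      beta = 1.0                                  # inverse temperature (assumed units)

      def potential(x):                           # toy double-well potential
          return (x**2 - 1.0)**2

      def proposal_sample():                      # stands in for a Kirkwood-generated trial
          return rng.normal(0.0, 1.5)

      def proposal_logpdf(x):                     # proposal log-density (constant omitted)
          return -0.5 * (x / 1.5)**2

      x = proposal_sample()
      samples = []
      for _ in range(50_000):
          x_new = proposal_sample()
          # Independence-sampler acceptance: min(1, pi(x') q(x) / (pi(x) q(x'))) in log form.
          log_acc = (-beta * potential(x_new) + proposal_logpdf(x)) \
                    - (-beta * potential(x) + proposal_logpdf(x_new))
          if np.log(rng.random()) < log_acc:
              x = x_new
          samples.append(x)

      print("mean position:", np.mean(samples))   # ~0 by symmetry of the double well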

  19. Rational learning and information sampling: on the "naivety" assumption in sampling explanations of judgment biases.

    PubMed

    Le Mens, Gaël; Denrell, Jerker

    2011-04-01

    Recent research has argued that several well-known judgment biases may be due to biases in the available information sample rather than to biased information processing. Most of these sample-based explanations assume that decision makers are "naive": They are not aware of the biases in the available information sample and do not correct for them. Here, we show that this "naivety" assumption is not necessary. Systematically biased judgments can emerge even when decision makers process available information perfectly and are also aware of how the information sample has been generated. Specifically, we develop a rational analysis of Denrell's (2005) experience sampling model, and we prove that when information search is interested rather than disinterested, even rational information sampling and processing can give rise to systematic patterns of errors in judgments. Our results illustrate that a tendency to favor alternatives for which outcome information is more accessible can be consistent with rational behavior. The model offers a rational explanation for behaviors that had previously been attributed to cognitive and motivational biases, such as the in-group bias or the tendency to prefer popular alternatives. 2011 APA, all rights reserved
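
    The mechanism can be made concrete with a small simulation in the spirit of the experience-sampling model discussed above; the payoff distribution, the sample-again rule, and all parameter values below are illustrative assumptions, not the authors' specification. When an agent resamples an alternative only while its running estimate looks good, the population of final estimates ends up systematically below the true mean, even though every individual observation is an unbiased draw.

      # Toy experience-sampling simulation: an agent keeps sampling an alternative only
      # while its running-average estimate stays above a threshold. Averaged over many
      # agents, the final estimate is biased downward although each draw is unbiased.
      # Parameters are illustrative, not taken from the paper.
      import numpy as np

      rng = np.random.default_rng(1)
      true_mean, noise_sd, threshold = 0.0, 1.0, 0.0
      n_agents, max_periods = 20_000, 50

      final_estimates = []
      for _ in range(n_agents):
          obs = [rng.normal(true_mean, noise_sd)]          # forced first sample
          for _ in range(max_periods - 1):
              if np.mean(obs) < threshold:                 # looks bad -> stop sampling
                  break
              obs.append(rng.normal(true_mean, noise_sd))  # looks good -> sample again
          final_estimates.append(np.mean(obs))

      print(f"true mean = {true_mean:+.2f}, "
            f"average final estimate = {np.mean(final_estimates):+.3f}")  # clearly negative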

  20. Accounting for Diversity in Suicide Research: Sampling and Sample Reporting Practices in the United States.

    PubMed

    Cha, Christine B; Tezanos, Katherine M; Peros, Olivia M; Ng, Mei Yi; Ribeiro, Jessica D; Nock, Matthew K; Franklin, Joseph C

    2018-04-01

    Research on suicidal thoughts and behaviors (STB) has identified many risk factors, but whether these findings generalize to diverse populations remains unclear. We review longitudinal studies on STB risk factors over the past 50 years in the United States and evaluate the methodological practices of sampling and reporting sample characteristics. We found that articles frequently reported participant age and sex, less frequently reported participant race and ethnicity, and rarely reported participant veteran status or lesbian, gay, bisexual, and transgender status. Sample reporting practices modestly and inconsistently improved over time. Finally, articles predominantly featured White, non-Hispanic, young adult samples. © 2017 The American Association of Suicidology.

  1. LUNAR SAMPLES - APOLLO XVI - JSC

    NASA Image and Video Library

    1975-03-18

    S75-23543 (April 1972) --- This Apollo 16 lunar sample (moon rock) was collected by astronaut John W. Young, commander of the mission, about 15 meters southwest of the landing site. The rock weighed 128 grams when returned to Earth. The sample is a polymict breccia. This rock, like all lunar highland breccias, is very old, about 3,900,000,000 years, and is older than 99.99% of all Earth surface rocks, according to scientists. Scientific research is being conducted on the balance of this sample at NASA's Johnson Space Center and at other research centers in the United States and certain foreign nations under a continuing program of investigation involving lunar samples collected during the Apollo program.

  2. Sample Size Calculations for Population Size Estimation Studies Using Multiplier Methods With Respondent-Driven Sampling Surveys.

    PubMed

    Fearon, Elizabeth; Chabata, Sungai T; Thompson, Jennifer A; Cowan, Frances M; Hargreaves, James R

    2017-09-14

    While guidance exists for obtaining population size estimates using multiplier methods with respondent-driven sampling surveys, we lack specific guidance for making sample size decisions. Our aim was to guide the design of multiplier-method population size estimation studies using respondent-driven sampling surveys so as to reduce the random error around the estimate obtained. The population size estimate is obtained by dividing the number of individuals receiving a service or the number of unique objects distributed (M) by the proportion of individuals in a representative survey who report receipt of the service or object (P). We have developed an approach to sample size calculation, interpreting methods to estimate the variance around estimates obtained using multiplier methods in conjunction with research into design effects and respondent-driven sampling. We describe an application to estimate the number of female sex workers in Harare, Zimbabwe. There is high variance in estimates. Random error around the size estimate reflects uncertainty from M and P, particularly when the estimate of P in the respondent-driven sampling survey is low. As expected, sample size requirements are higher when the design effect of the survey is assumed to be greater. We suggest a method for investigating the effects of sample size on the precision of a population size estimate obtained using multiplier methods and respondent-driven sampling. Uncertainty in the size estimate is high, particularly when P is small, so balancing against other potential sources of bias, we advise researchers to consider longer service attendance reference periods and to distribute more unique objects, which is likely to result in a higher estimate of P in the respondent-driven sampling survey. ©Elizabeth Fearon, Sungai T Chabata, Jennifer A Thompson, Frances M Cowan, James R Hargreaves. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 14.09.2017.
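
    As a schematic of the arithmetic involved (not the authors' exact variance formulation), the sketch below computes the multiplier estimate N = M / P and an approximate confidence interval by combining a design-effect-inflated binomial variance for P with the delta method. The design effect, the coverage level, and the example numbers are assumptions chosen for illustration.

      # Sketch of a multiplier-method size estimate with a delta-method confidence
      # interval. The variance model (design-effect-inflated binomial for P) and the
      # example numbers are illustrative assumptions, not the paper's exact approach.
      import math

      def multiplier_estimate(M, p_hat, n, design_effect=2.0, z=1.96):
          """N = M / P with an approximate (delta-method) confidence interval."""
          N_hat = M / p_hat
          var_p = design_effect * p_hat * (1.0 - p_hat) / n   # inflated binomial variance
          se_N = M * math.sqrt(var_p) / p_hat**2              # delta method: dN/dP = -M/P^2
          return N_hat, (N_hat - z * se_N, N_hat + z * se_N)

      # Hypothetical example: 1,200 unique objects distributed; 18% of a 500-person
      # respondent-driven sampling survey report having received one.
      N, (lo, hi) = multiplier_estimate(M=1200, p_hat=0.18, n=500)
      print(f"estimated population size: {N:.0f} (approx. 95% CI {lo:.0f}-{hi:.0f})")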

  3. 40 CFR 91.327 - Sampling system requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Sampling system requirements. 91.327....327 Sampling system requirements. (a) Sample component surface temperature. For sampling systems which..., sample line section, filters, and so forth) in the heated portion of the sampling system that has a...

  4. 40 CFR 91.327 - Sampling system requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Sampling system requirements. 91.327....327 Sampling system requirements. (a) Sample component surface temperature. For sampling systems which..., sample line section, filters, and so forth) in the heated portion of the sampling system that has a...

  5. Viscous-sludge sample collector

    DOEpatents

    Not Available

    1979-01-01

    A vertical core sample collection system for viscous sludge is disclosed. A sample tube's upper end has a flange and is attached to a piston. The tube and piston are located in the upper end of a bore in a housing. The bore's lower end leads outside the housing and has an inwardly extending rim. Compressed gas, from a storage cylinder, is quickly introduced into the bore's upper end to rapidly accelerate the piston and tube down the bore. The lower end of the tube has a high sludge entering velocity to obtain a full-length sludge sample without disturbing strata detail. The tube's downward motion is stopped when its upper end flange impacts against the bore's lower end inwardly extending rim.

  6. Viscous sludge sample collector

    DOEpatents

    Beitel, George A [Richland, WA

    1983-01-01

    A vertical core sample collection system for viscous sludge. A sample tube's upper end has a flange and is attached to a piston. The tube and piston are located in the upper end of a bore in a housing. The bore's lower end leads outside the housing and has an inwardly extending rim. Compressed gas, from a storage cylinder, is quickly introduced into the bore's upper end to rapidly accelerate the piston and tube down the bore. The lower end of the tube has a high sludge entering velocity to obtain a full-length sludge sample without disturbing strata detail. The tube's downward motion is stopped when its upper end flange impacts against the bore's lower end inwardly extending rim.

  7. Solvent Hold Tank Sample Results for MCU-16-701-702-703: May 2016 Monthly Sample and MCU-16-710-711-712: May 2016 Superwashed Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fondeur, F. F.; Jones, D. H.

    The Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-701, MCU-16-702, and MCU-16-703), pulled on 05/23/2016, and another set of SHT samples (MCU-16-710, MCU-16-711, and MCU-16-712), pulled on 05/28/2016 after the solvent was superwashed with 300 mM sodium hydroxide, for analysis. Samples MCU-16-701, MCU-16-702, and MCU-16-703 were combined into one sample (MCU-16-701-702-703), and samples MCU-16-710, MCU-16-711, and MCU-16-712 were combined into one sample (MCU-16-710-711-712). Of the two composite samples, MCU-16-710-711-712 represents the current chemical state of the solvent at MCU, and all analytical conclusions are based on its chemical analysis. There were no chemical differences between MCU-16-701-702-703 and the superwashed MCU-16-710-711-712. Analysis of the composite sample MCU-16-710-711-712 indicated the Isopar™L concentration is above its nominal level (102%). The modifier (CS-7SB) is 16% below its nominal concentration, while the TiDG and MaxCalix concentrations are at and above their nominal concentrations, respectively. The TiDG level has begun to decrease, and it is 7% below its nominal level as of May 28, 2016. Based on this current analysis, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended.

  8. Confidence intervals for the population mean tailored to small sample sizes, with applications to survey sampling.

    PubMed

    Rosenblum, Michael A; Laan, Mark J van der

    2009-01-07

    The validity of standard confidence intervals constructed in survey sampling is based on the central limit theorem. For small sample sizes, the central limit theorem may give a poor approximation, resulting in confidence intervals that are misleading. We discuss this issue and propose methods for constructing confidence intervals for the population mean tailored to small sample sizes. We present a simple approach for constructing confidence intervals for the population mean based on tail bounds for the sample mean that are correct for all sample sizes. Bernstein's inequality provides one such tail bound. The resulting confidence intervals have guaranteed coverage probability under much weaker assumptions than are required for standard methods. A drawback of this approach, as we show, is that these confidence intervals are often quite wide. In response to this, we present a method for constructing much narrower confidence intervals, which are better suited for practical applications, and that are still more robust than confidence intervals based on standard methods, when dealing with small sample sizes. We show how to extend our approaches to much more general estimation problems than estimating the sample mean. We describe how these methods can be used to obtain more reliable confidence intervals in survey sampling. As a concrete example, we construct confidence intervals using our methods for the number of violent deaths between March 2003 and July 2006 in Iraq, based on data from the study "Mortality after the 2003 invasion of Iraq: A cross sectional cluster sample survey," by Burnham et al. (2006).
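
    To make the flavor of a tail-bound interval concrete, the sketch below inverts the classical two-sided Bernstein inequality for a sample mean, assuming a known bound b on |X_i - mu| and a known variance bound sigma2; both assumptions, the example data, and the coverage level are illustrative, and the paper's actual construction (and its extension to survey estimators) may differ in detail. The resulting interval is deliberately conservative and, as the abstract notes for such bounds, can be quite wide.

      # Sketch: a finite-sample confidence interval for a mean from Bernstein's
      # inequality, assuming known bounds b >= |X_i - mu| and sigma2 >= Var(X_i).
      # This illustrates the tail-bound idea only; the paper's method may differ.
      import math
      import statistics

      def bernstein_halfwidth(n, sigma2, b, alpha=0.05):
          """Smallest t with 2*exp(-n*t^2 / (2*sigma2 + (2/3)*b*t)) <= alpha."""
          L = math.log(2.0 / alpha)
          # Positive root of n*t^2 - (2/3)*b*L*t - 2*sigma2*L = 0.
          return ((2.0 * b * L / 3.0)
                  + math.sqrt((2.0 * b * L / 3.0) ** 2 + 8.0 * n * sigma2 * L)) / (2.0 * n)

      # Example: 10 observations assumed to lie within +/- 1 of the mean,
      # with variance assumed to be at most 0.25.
      data = [0.1, -0.3, 0.4, 0.0, 0.2, -0.1, 0.3, -0.2, 0.1, 0.0]
      t = bernstein_halfwidth(n=len(data), sigma2=0.25, b=1.0)
      m = statistics.mean(data)
      print(f"mean = {m:.3f}, conservative 95% interval = ({m - t:.3f}, {m + t:.3f})")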

  9. Comparing respondent-driven sampling and targeted sampling methods of recruiting injection drug users in San Francisco.

    PubMed

    Kral, Alex H; Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N; Lorvick, Jennifer; McFarland, Willi; Raymond, H Fisher

    2010-09-01

    The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one of which used targeted sampling (TS) and the other respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization. Prevalence estimates and 95% confidence intervals (CIs) were calculated to assess whether there were differences in these variables by sampling method. There was overlap in the 95% CIs for all demographic variables except African American race (TS: 45%, 53%; RDS: 29%, 44%). Maps showed that the distribution of IDUs across zip codes was similar for the TS and RDS samples, with the exception of a single zip code that was more represented in the TS sample. This zip code includes an isolated, predominantly African American neighborhood where only the TS study had a field site. Risk behavior estimates were similar for both TS and RDS samples, although self-reported hepatitis C infection was lower in the RDS sample. In terms of service utilization, more IDUs in the RDS sample reported no recent use of drug treatment and syringe exchange program services. Our study suggests that perhaps a hybrid sampling plan is best suited for recruiting IDUs in San Francisco, whereby the more intensive ethnographic and secondary analysis components of TS would aid in the planning of seed placement and field locations for RDS.

  10. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Measurement Instruments Flow-Related Measurements § 1065.245... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  11. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Measurement Instruments Flow-Related Measurements § 1065.245... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  12. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Measurement Instruments Flow-Related Measurements § 1065.245... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  13. 40 CFR 1065.245 - Sample flow meter for batch sampling.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Measurement Instruments Flow-Related Measurements § 1065.245... difference between a diluted exhaust sample flow meter and a dilution air meter to calculate raw exhaust flow...

  14. Procedures for sampling radium-contaminated soils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fleischhauer, H.L.

    Two procedures for sampling the surface layer (0 to 15 centimeters) of radium-contaminated soil are recommended for use in remedial action projects. Both procedures adhere to the philosophy that soil samples should have constant geometry and constant volume in order to ensure uniformity. In the first procedure, a "cookie cutter" fashioned from pipe or steel plate is driven to the desired depth by means of a slide hammer, and the sample is extracted as a core or plug. The second procedure requires use of a template to outline the sampling area, from which the sample is obtained using a trowel or spoon; sampling to the desired depth must then be performed incrementally. Selection of one procedure over the other is governed primarily by soil conditions, the cookie cutter being effective in nongravelly soils and the template procedure appropriate for use in both gravelly and nongravelly soils. In any event, a minimum sample volume of 1000 cubic centimeters is recommended. The step-by-step procedures are accompanied by a description of the minimum requirements for sample documentation. Transport of the soil samples from the field is then addressed in a discussion of the federal regulations for shipping radioactive materials. Interpretation of those regulations, particularly in light of their application to remedial action soil-sampling programs, is provided in the form of guidance and suggested procedures. Due to the complex nature of the regulations, however, there is no guarantee that our interpretations of them are complete or entirely accurate. Preparation of soil samples for radium-226 analysis by means of gamma-ray spectroscopy is described.

  15. Knowledge-based nonuniform sampling in multidimensional NMR.

    PubMed

    Schuyler, Adam D; Maciejewski, Mark W; Arthanari, Haribabu; Hoch, Jeffrey C

    2011-07-01

    The full resolution afforded by high-field magnets is rarely realized in the indirect dimensions of multidimensional NMR experiments because of the time cost of uniformly sampling to long evolution times. Emerging methods utilizing nonuniform sampling (NUS) enable high resolution along indirect dimensions by sampling long evolution times without sampling at every multiple of the Nyquist sampling interval. While the earliest NUS approaches matched the decay of sampling density to the decay of the signal envelope, recent approaches based on coupled evolution times attempt to optimize sampling by choosing projection angles that increase the likelihood of resolving closely-spaced resonances. These approaches employ knowledge about chemical shifts to predict optimal projection angles, whereas prior applications of tailored sampling employed only knowledge of the decay rate. In this work we adapt the matched filter approach as a general strategy for knowledge-based nonuniform sampling that can exploit prior knowledge about chemical shifts and is not restricted to sampling projections. Based on several measures of performance, we find that exponentially weighted random sampling (envelope matched sampling) performs better than shift-based sampling (beat matched sampling). While shift-based sampling can yield small advantages in sensitivity, the gains are generally outweighed by diminished robustness. Our observation that more robust sampling schemes are only slightly less sensitive than schemes highly optimized using prior knowledge about chemical shifts has broad implications for any multidimensional NMR study employing NUS. The results derived from simulated data are demonstrated with a sample application to PfPMT, the phosphoethanolamine methyltransferase of the human malaria parasite Plasmodium falciparum.
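
    As a concrete sketch of the envelope-matched idea, the code below draws a nonuniform sampling schedule from a Nyquist grid with selection probabilities proportional to an assumed exponential signal envelope exp(-t/T2). The grid size, dwell time, decay constant, and sampling fraction are illustrative choices, not parameters taken from the study.

      # Sketch of an exponentially weighted (envelope-matched) nonuniform sampling
      # schedule for one indirect dimension. Grid size, dwell, T2, and the sampled
      # fraction are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(42)

      def envelope_matched_schedule(n_grid, n_samples, dwell, t2):
          """Pick n_samples distinct grid points with probability ~ exp(-t / T2)."""
          t = np.arange(n_grid) * dwell
          weights = np.exp(-t / t2)
          points = rng.choice(n_grid, size=n_samples, replace=False,
                              p=weights / weights.sum())
          return np.sort(points)

      # 25% sampling of a 256-point grid, 100 us dwell time, T2 = 10 ms.
      schedule = envelope_matched_schedule(n_grid=256, n_samples=64,
                                           dwell=100e-6, t2=10e-3)
      print(schedule)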

  16. Harpoon-based sample Acquisition System

    NASA Astrophysics Data System (ADS)

    Bernal, Javier; Nuth, Joseph; Wegel, Donald

    2012-02-01

    Acquiring information about the composition of comets, asteroids, and other near Earth objects is very important because they may contain the primordial ooze of the solar system and the origins of life on Earth. Sending a spacecraft is the obvious answer, but once it gets there it needs to collect and analyze samples. Conceptually, a drill or a shovel would work, but both require something extra to anchor it to the comet, adding to the cost and complexity of the spacecraft. Since comets and asteroids are very low gravity objects, drilling becomes a problem. If you do not provide a grappling mechanism, the drill would push the spacecraft off the surface. Harpoons have been proposed as grappling mechanisms in the past and are currently flying on missions such as ROSETTA. We propose to use a hollow, core sampling harpoon, to act as the anchoring mechanism as well as the sample collecting device. By combining these two functions, mass is reduced, more samples can be collected and the spacecraft can carry more propellant. Although challenging, returning the collected samples to Earth allows them to be analyzed in laboratories with much greater detail than possible on a spacecraft. Also, bringing the samples back to Earth allows future generations to study them.

  17. 7 CFR 27.89 - Expenses; inspection; sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Expenses; inspection; sampling. 27.89 Section 27.89... Micronaire § 27.89 Expenses; inspection; sampling. Expense of inspection and sampling, the preparation of the... Office, the expense of inspection, sampling, preparation of samples, and delivery of the samples to the...

  18. 7 CFR 27.89 - Expenses; inspection; sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Expenses; inspection; sampling. 27.89 Section 27.89... Micronaire § 27.89 Expenses; inspection; sampling. Expense of inspection and sampling, the preparation of the... Office, the expense of inspection, sampling, preparation of samples, and delivery of the samples to the...

  19. Solvent Hold Tank Sample Results for MCU-16-934-935-936: June 2016 Monthly Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fondeur, F. F.; Jones, D. H.

    2016-08-30

    Savannah River National Laboratory (SRNL) received one set of Solvent Hold Tank (SHT) samples (MCU-16-934-935-936), pulled on 07/01/2016, for analysis. The samples were combined and analyzed for composition. Analysis of the composite sample MCU-16-934-935-936 indicated the Isopar™L concentration is above its nominal level (101%). The modifier (CS-7SB) and the TiDG concentrations are 8% and 29% below their nominal concentrations, respectively. This analysis confirms the solvent may require the addition of TiDG, and possibly of modifier. Based on the current monthly sample, the levels of TiDG, Isopar™L, MaxCalix, and modifier are sufficient for continuing operation but are expected to decrease with time. Periodic characterization and trimming additions to the solvent are recommended. No impurities above the 1000 ppm level were found in this solvent by the Semi-Volatile Organic Analysis (SVOA), and no impurities were observed in the hydrogen nuclear magnetic resonance (HNMR) spectrum. However, up to 21.1 ± 4 micrograms of mercury per gram of solvent (or 17.5 μg/mL) was detected in this sample, as determined by the XRF method on the undigested sample. The current gamma level (1.41E5 dpm/mL) confirmed that the gamma concentration has returned to the levels observed in the late 2015 samples, when the process operated normally and as expected.

  20. [Assessment comparison between area sampling and personal sampling noise measurement in new thermal power plant].

    PubMed

    Zhang, Hua; Chen, Qing-song; Li, Nan; Hua, Yan; Zeng, Lin; Xu, Guo-yang; Tao, Li-yuan; Zhao, Yi-ming

    2013-05-01

    The aim was to compare the results of noise hazard evaluations based on area sampling and personal sampling in a new thermal power plant and to analyze the similarities and differences between the two measurement methods. According to Measurement of Physical Agents in the Workplace, Part 8: Noise (GBZ/T 189.8-2007), area sampling was performed at various operating points for noise measurement, while workers in different job types wore noise dosimeters for personal noise exposure measurement. The two measurement methods were used to evaluate the level of noise hazards in the enterprise according to the corresponding occupational health standards, and the evaluation results were compared. Area sampling was performed at 99 operating points; the mean noise level was 88.9 ± 11.1 dB(A) (range, 51.3-107.0 dB(A)), with an over-standard rate of 75.8%. Personal sampling was performed (73 person-times), and the mean noise level was 79.3 ± 6.3 dB(A), with an over-standard rate of 6.6% (16/241). There was a statistically significant difference in the over-standard rate between the evaluation results of the two measurement methods (x² = 53.869, P < 0.001). Because of the characteristics of the work in new thermal power plants, noise hazard evaluation based on area sampling cannot be used instead of personal noise exposure measurement among workers. Personal sampling should be used for noise measurement in new thermal power plants.
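
    For readers who want to reproduce the style of comparison, the sketch below runs a chi-square test on the two over-standard proportions with scipy. The 2x2 counts are read off the abstract at face value (75 of 99 area measurements versus 16 of 241 personal measurements), so the resulting statistic is only illustrative and need not match the published value exactly.

      # Illustrative chi-square comparison of the two over-standard rates reported
      # above; counts are taken from the abstract and may not reproduce the published
      # statistic exactly.
      from scipy.stats import chi2_contingency

      table = [[75, 99 - 75],     # area sampling: over-standard vs within-standard
               [16, 241 - 16]]    # personal sampling
      chi2, p, dof, expected = chi2_contingency(table)
      print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.3g}")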