Science.gov

Sample records for importance sampling method

  1. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  2. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
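
    The two records above describe prescribing how densely each part of the integration domain is sampled. As a generic illustration of the underlying importance-sampling identity (estimating a grid-box average E_p[f(X)] by drawing from a modeler-chosen density q and weighting by p/q), the sketch below uses a toy process rate and Gaussian densities; none of it is taken from the SILHS code.

      import numpy as np

      rng = np.random.default_rng(0)

      def process_rate(x):
          # Toy "microphysics" rate: nonzero only in a narrow, important region.
          return np.where(x > 2.5, x - 2.5, 0.0)

      # Subgrid variability p(x): standard normal.  Importance density q(x):
      # a unit-variance normal shifted toward the region where the rate is nonzero.
      mu_q, n = 2.5, 2000
      x = rng.normal(mu_q, 1.0, size=n)          # draw from q
      log_p = -0.5 * x**2                        # log N(0,1), constant dropped
      log_q = -0.5 * (x - mu_q)**2               # log N(mu_q,1), same constant
      w = np.exp(log_p - log_q)                  # importance weights p/q
      estimate = np.mean(w * process_rate(x))    # unbiased estimate of E_p[f(X)]
      print(estimate)

    Drawing more points where the rate is nonzero (the analogue of oversampling the rain-evaporation category) lowers the variance of the estimate without biasing it.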

  3. Importance Sampling Approach for the Nonstationary Approximation Error Method

    NASA Astrophysics Data System (ADS)

    Huttunen, J. M. J.; Lehikoinen, A.; Hämäläinen, J.; Kaipio, J. P.

    2010-09-01

    The approximation error approach has been earlier proposed to handle modelling, numerical and computational errors in inverse problems. The idea of the approach is to include the errors to the forward model and compute the approximate statistics of the errors using Monte Carlo sampling. This can be a computationally tedious task but the key property of the approach is that the approximate statistics can be calculated off-line before measurement process takes place. In nonstationary problems, however, information is accumulated over time, and the initial uncertainties may turn out to have been exaggerated. In this paper, we propose an importance weighing algorithm with which the approximation error statistics can be updated during the accumulation of measurement information. As a computational example, we study an estimation problem that is related to a convection-diffusion problem in which the velocity field is not accurately specified.
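
    A minimal sketch of the kind of importance re-weighting described above, with hypothetical Gaussian densities and variable names (it is not the authors' algorithm): error samples precomputed off-line under a broad prior are re-weighted so that their statistics reflect a density updated by accumulated measurements.

      import numpy as np

      rng = np.random.default_rng(1)

      def logpdf_normal(x, mu, sigma):
          return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)

      # Off-line: approximation-error samples computed under a broad prior N(0, 2^2).
      e = rng.normal(0.0, 2.0, size=5000)

      # On-line: accumulated measurements suggest a tighter density N(0.5, 1^2);
      # re-weight the precomputed samples instead of re-running the Monte Carlo step.
      w = np.exp(logpdf_normal(e, 0.5, 1.0) - logpdf_normal(e, 0.0, 2.0))
      w /= w.sum()                                   # self-normalized weights

      mean_new = np.sum(w * e)                       # updated error mean
      var_new = np.sum(w * (e - mean_new) ** 2)      # updated error variance
      print(mean_new, var_new)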

  4. Importance sampling variance reduction for the Fokker-Planck rarefied gas particle method

    NASA Astrophysics Data System (ADS)

    Collyer, B. S.; Connaughton, C.; Lockerby, D. A.

    2016-11-01

    The Fokker-Planck approximation to the Boltzmann equation, solved numerically by stochastic particle schemes, is used to provide estimates for rarefied gas flows. This paper presents a variance reduction technique for a stochastic particle method that is able to greatly reduce the uncertainty of the estimated flow fields when the characteristic speed of the flow is small in comparison to the thermal velocity of the gas. The method relies on importance sampling, requiring minimal changes to the basic stochastic particle scheme. We test the importance sampling scheme on a homogeneous relaxation, planar Couette flow and a lid-driven-cavity flow, and find that our method is able to greatly reduce the noise of estimated quantities. Significantly, we find that as the characteristic speed of the flow decreases, the variance of the noisy estimators becomes independent of the characteristic speed.

  5. Importance sampling: promises and limitations.

    SciTech Connect

    West, Nicholas J.; Swiler, Laura Painton

    2010-04-01

    Importance sampling is an unbiased sampling method used to sample random variables from different densities than originally defined. These importance sampling densities are constructed to pick 'important' values of input random variables to improve the estimation of a statistical response of interest, such as a mean or probability of failure. Conceptually, importance sampling is very attractive: for example one wants to generate more samples in a failure region when estimating failure probabilities. In practice, however, importance sampling can be challenging to implement efficiently, especially in a general framework that will allow solutions for many classes of problems. We are interested in the promises and limitations of importance sampling as applied to computationally expensive finite element simulations which are treated as 'black-box' codes. In this paper, we present a customized importance sampler that is meant to be used after an initial set of Latin Hypercube samples has been taken, to help refine a failure probability estimate. The importance sampling densities are constructed based on kernel density estimators. We examine importance sampling with respect to two main questions: is importance sampling efficient and accurate for situations where we can only afford small numbers of samples? And does importance sampling require the use of surrogate methods to generate a sufficient number of samples so that the importance sampling process does increase the accuracy of the failure probability estimate? We present various case studies to address these questions.
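
    As a generic illustration of the idea (a shifted Gaussian importance density and a made-up limit state, not the report's kernel-density sampler), the sketch below estimates a small failure probability by concentrating samples in the failure region and re-weighting them back to the nominal density.

      import numpy as np

      rng = np.random.default_rng(2)

      def limit_state(x):
          # Hypothetical "black-box" response; failure when it exceeds 4.
          return x[:, 0] + 0.5 * x[:, 1] ** 2

      n = 5000
      # Nominal input density p: independent standard normals.
      # Importance density q: the same normals shifted toward the failure region.
      shift = np.array([3.0, 0.0])
      x = rng.normal(0.0, 1.0, size=(n, 2)) + shift
      log_p = -0.5 * np.sum(x ** 2, axis=1)
      log_q = -0.5 * np.sum((x - shift) ** 2, axis=1)
      w = np.exp(log_p - log_q)

      fail = limit_state(x) > 4.0
      print(np.mean(w * fail))        # unbiased estimate of the failure probability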

  6. Adaptive importance sampling for network growth models

    PubMed Central

    Holmes, Susan P.

    2016-01-01

    Network Growth Models such as Preferential Attachment and Duplication/Divergence are popular generative models with which to study complex networks in biology, sociology, and computer science. However, analyzing them within the framework of model selection and statistical inference is often complicated and computationally difficult, particularly when comparing models that are not directly related or nested. In practice, ad hoc methods are often used with uncertain results. If possible, the use of standard likelihood-based statistical model selection techniques is desirable. With this in mind, we develop an Adaptive Importance Sampling algorithm for estimating likelihoods of Network Growth Models. We introduce the use of the classic Plackett-Luce model of rankings as a family of importance distributions. Updates to importance distributions are performed iteratively via the Cross-Entropy Method with an additional correction for degeneracy/over-fitting inspired by the Minimum Description Length principle. This correction can be applied to other estimation problems using the Cross-Entropy method for integration/approximate counting, and it provides an interpretation of Adaptive Importance Sampling as iterative model selection. Empirical results for the Preferential Attachment model are given, along with a comparison to an alternative established technique, Annealed Importance Sampling. PMID:27182098
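
    The Cross-Entropy update at the core of the record above can be illustrated on a much simpler family than Plackett-Luce. The sketch below uses a hypothetical one-dimensional Gaussian importance family with fixed unit variance and the rare event X > 4 under a standard normal; the paper's ranking distributions and degeneracy correction are not reproduced.

      import numpy as np

      rng = np.random.default_rng(3)

      threshold = 4.0     # rare event: X > 4 under a standard normal, prob. ~3.2e-5
      mu = 0.0            # location of the Gaussian importance density (unit variance)

      for it in range(6):
          x = rng.normal(mu, 1.0, size=5000)
          # Importance weights p/q for samples drawn from the current N(mu, 1).
          w = np.exp(-0.5 * x**2 + 0.5 * (x - mu)**2)
          # Multilevel CE: raise the level gradually toward the rare-event threshold.
          level = min(threshold, np.quantile(x, 0.95))
          elite = x >= level
          # Cross-entropy update: weighted mean of the elite samples.
          mu = np.sum(w[elite] * x[elite]) / np.sum(w[elite])
          if level >= threshold:
              print(it, mu, np.mean(w * (x >= threshold)))   # estimate of P(X > 4)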

  7. Adaptive Importance Sampling for Control and Inference

    NASA Astrophysics Data System (ADS)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution, we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within the PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state-feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows one to learn feedback controllers using an arbitrary parametrisation. We refer to this method as the path integral cross entropy method or PICE. We illustrate this method for some simple examples. The PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience, these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.

  8. The Importance of Microhabitat for Biodiversity Sampling

    PubMed Central

    Mehrabi, Zia; Slade, Eleanor M.; Solis, Angel; Mann, Darren J.

    2014-01-01

    Responses to microhabitat are often neglected when ecologists sample animal indicator groups. Microhabitats may be particularly influential in non-passive biodiversity sampling methods, such as baited traps or light traps, and for certain taxonomic groups which respond to fine scale environmental variation, such as insects. Here we test the effects of microhabitat on measures of species diversity, guild structure and biomass of dung beetles, a widely used ecological indicator taxon. We demonstrate that choice of trap placement influences dung beetle functional guild structure and species diversity. We found that locally measured environmental variables were unable to fully explain trap-based differences in species diversity metrics or microhabitat specialism of functional guilds. To compare the effects of habitat degradation on biodiversity across multiple sites, sampling protocols must be standardized and scale-relevant. Our work highlights the importance of considering microhabitat scale responses of indicator taxa and designing robust sampling protocols which account for variation in microhabitats during trap placement. We suggest that this can be achieved either through standardization of microhabitat or through better efforts to record relevant environmental variables that can be incorporated into analyses to account for microhabitat effects. This is especially important when rapidly assessing the consequences of human activity on biodiversity loss and associated ecosystem function and services. PMID:25469770

  9. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively

  10. Annealed Importance Sampling Reversible Jump MCMC algorithms

    SciTech Connect

    Karagiannis, Georgios; Andrieu, Christophe

    2013-03-20

    It will soon be 20 years since reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithms were proposed. They have significantly extended the scope of Markov chain Monte Carlo simulation methods, offering the promise of routinely tackling transdimensional sampling problems, as encountered in Bayesian model selection problems for example, in a principled and flexible fashion. Their efficient practical implementation, however, remains a challenge. A particular difficulty encountered in practice is in the choice of the dimension matching variables (both their nature and their distribution) and the reversible transformations which allow one to define the one-to-one mappings underpinning the design of these algorithms. Indeed, even seemingly sensible choices can lead to algorithms with very poor performance. The focus of this paper is the development and performance evaluation of a method, annealed importance sampling RJ-MCMC (aisRJ), which addresses this problem by mitigating the sensitivity of RJ-MCMC algorithms to the aforementioned poor design. As we shall see, the algorithm can be understood as an “exact approximation” of an idealized MCMC algorithm that would sample from the model probabilities directly in a model selection set-up. Such an idealized algorithm may have good theoretical convergence properties, but typically cannot be implemented, and our algorithms can approximate the performance of such idealized algorithms to an arbitrary degree while not introducing any bias for any degree of approximation. Our approach combines the dimension matching ideas of RJ-MCMC with annealed importance sampling and its Markov chain Monte Carlo implementation. We illustrate the performance of the algorithm with numerical simulations which indicate that, although the approach may at first appear computationally involved, it is in fact competitive.
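
    Plain annealed importance sampling, the ingredient the aisRJ method builds on, can be sketched in a few lines. The example below uses a Gaussian prior, a made-up bimodal target, and random-walk Metropolis transitions (none taken from the paper) to estimate the target's normalizing constant from the average of the annealing weights.

      import numpy as np

      rng = np.random.default_rng(4)

      def log_prior(x):                 # N(0, 3^2): normalized, easy to sample
          return -0.5 * (x / 3.0) ** 2 - np.log(3.0) - 0.5 * np.log(2 * np.pi)

      def log_target(x):                # unnormalized bimodal target
          return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

      betas = np.linspace(0.0, 1.0, 51)             # annealing schedule
      n_particles, step = 500, 0.8

      x = rng.normal(0.0, 3.0, size=n_particles)    # start from the prior
      log_w = np.zeros(n_particles)

      for b_prev, b in zip(betas[:-1], betas[1:]):
          # Weight update: ratio of consecutive intermediate distributions at x.
          log_w += (b - b_prev) * (log_target(x) - log_prior(x))

          # One Metropolis step leaving pi_b = prior^(1-b) * target^b invariant.
          def log_pi(z, beta=b):
              return (1.0 - beta) * log_prior(z) + beta * log_target(z)

          prop = x + rng.normal(0.0, step, size=n_particles)
          accept = np.log(rng.uniform(size=n_particles)) < log_pi(prop) - log_pi(x)
          x = np.where(accept, prop, x)

      # The mean of exp(log_w) estimates Z_target / Z_prior; the prior is normalized,
      # so this estimates Z_target (exact value 2*sqrt(2*pi) for this toy target).
      print(np.mean(np.exp(log_w)))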

  11. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  12. Annealed Importance Sampling for Neural Mass Models.

    PubMed

    Penny, Will; Sengupta, Biswa

    2016-03-01

    Neural Mass Models provide a compact description of the dynamical activity of cell populations in neocortical regions. Moreover, models of regional activity can be connected together into networks, and inferences made about the strength of connections, using M/EEG data and Bayesian inference. To date, however, Bayesian methods have been largely restricted to the Variational Laplace (VL) algorithm which assumes that the posterior distribution is Gaussian and finds model parameters that are only locally optimal. This paper explores the use of Annealed Importance Sampling (AIS) to address these restrictions. We implement AIS using proposals derived from Langevin Monte Carlo (LMC) which uses local gradient and curvature information for efficient exploration of parameter space. In terms of the estimation of Bayes factors, VL and AIS agree about which model is best but report different degrees of belief. Additionally, AIS finds better model parameters and we find evidence of non-Gaussianity in their posterior distribution. PMID:26942606

  14. Annealed Importance Sampling for Neural Mass Models

    PubMed Central

    Penny, Will; Sengupta, Biswa

    2016-01-01

    Neural Mass Models provide a compact description of the dynamical activity of cell populations in neocortical regions. Moreover, models of regional activity can be connected together into networks, and inferences made about the strength of connections, using M/EEG data and Bayesian inference. To date, however, Bayesian methods have been largely restricted to the Variational Laplace (VL) algorithm which assumes that the posterior distribution is Gaussian and finds model parameters that are only locally optimal. This paper explores the use of Annealed Importance Sampling (AIS) to address these restrictions. We implement AIS using proposals derived from Langevin Monte Carlo (LMC) which uses local gradient and curvature information for efficient exploration of parameter space. In terms of the estimation of Bayes factors, VL and AIS agree about which model is best but report different degrees of belief. Additionally, AIS finds better model parameters and we find evidence of non-Gaussianity in their posterior distribution. PMID:26942606

  15. Improved Sampling Method Reduces Isokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Karels, Gale G.

    The particulate sampling system currently in use by the Bay Area Air Pollution Control District, San Francisco, California is described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. The method represents a practical, inexpensive tool that can…

  16. Sampling system and method

    DOEpatents

    Decker, David L; Lyles, Brad F; Purcell, Richard G; Hershey, Ronald Lee

    2014-05-20

    An apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. The method includes deploying the tubing bundle and wireline together. The tubing bundle is periodically secured to the wireline using a clamp.

  17. Stochastic seismic inversion using greedy annealed importance sampling

    NASA Astrophysics Data System (ADS)

    Xue, Yang; Sen, Mrinal K.

    2016-10-01

    A global optimization method called very fast simulated annealing (VFSA) inversion has been applied to seismic inversion. Here we address some of the limitations of VFSA by developing a new stochastic inference method, named greedy annealed importance sampling (GAIS). GAIS combines VFSA and greedy importance sampling (GIS), which uses a greedy search in the important regions located by VFSA, in order to attain fast convergence and provide unbiased estimation. We demonstrate the performance of GAIS with application to seismic inversion of field post- and pre-stack datasets. The results indicate that GAIS can improve lateral continuity of the inverted impedance profiles and provide better estimation of uncertainties than using VFSA alone. Thus this new hybrid method combining global and local optimization methods can be applied in seismic reservoir characterization and reservoir monitoring for accurate estimation of reservoir models and their uncertainties.

  18. Improved metropolis light transport algorithm based on multiple importance sampling

    NASA Astrophysics Data System (ADS)

    He, Huaiqing; Yang, Jiaqian; Liu, Haohan

    2015-12-01

    Metropolis light transport is an unbiased and robust Monte Carlo method that can efficiently reduce noise when rendering realistic images, addressing the global illumination problem. We improve the basic Metropolis light transport algorithm by combining it with multiple importance sampling, which alleviates the strong correlation and high variance between samples produced by the basic algorithm. Experiments show that, for the same scene settings, the improved algorithm generates higher-quality images than the basic Metropolis light transport.
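
    Multiple importance sampling itself is easy to show in one dimension. The sketch below combines estimates from two made-up sampling strategies with the balance heuristic, the standard way of weighting samples that could have been produced by either strategy; the integrand and densities are illustrative and unrelated to the paper's renderer.

      import numpy as np

      rng = np.random.default_rng(9)

      def f(x):                       # integrand on [0, 1]; true integral is 0.3
          return x ** 4 + 0.1

      def p1(x):                      # strategy 1: uniform density on [0, 1]
          return np.ones_like(x)

      def p2(x):                      # strategy 2: density proportional to x^4
          return 5.0 * x ** 4

      n = 4000
      x1 = rng.uniform(0.0, 1.0, size=n)            # samples from p1
      x2 = rng.uniform(0.0, 1.0, size=n) ** 0.2     # inverse-CDF samples from p2

      def balance_weight(x, p_self, p_other):
          # Balance heuristic for equal sample counts: w_i = p_i / (p1 + p2).
          return p_self(x) / (p_self(x) + p_other(x))

      estimate = (np.mean(balance_weight(x1, p1, p2) * f(x1) / p1(x1))
                  + np.mean(balance_weight(x2, p2, p1) * f(x2) / p2(x2)))
      print(estimate)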

  19. 9 CFR 327.11 - Receipts to importers for import product samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    9 CFR, Animals and Animal Products, Imported Products, § 327.11 Receipts to importers for import product samples: In order that importers may be assured that samples of foreign products collected...

  20. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    SciTech Connect

    Peplow, Douglas E.; Mosher, Scott W; Evans, Thomas M

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS Consistent Adjoint Driven Importance Sampling. This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
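
    The weight-window step is generic Monte Carlo machinery and can be sketched independently of the adjoint calculation that CADIS uses to choose the bounds. The function below (hypothetical window bounds, not code from the record) splits particles whose weight rises above the window and plays Russian roulette with those that fall below it, keeping the expected total weight unchanged.

      import random

      def apply_weight_window(weight, w_low, w_high, rng=random):
          """Return the list of particle weights that replace one particle.

          Splitting above the window and Russian roulette below it leave the
          expected total weight unchanged, so the tally stays unbiased.
          """
          if weight > w_high:                        # split into lighter particles
              n_split = int(weight / w_high) + 1
              return [weight / n_split] * n_split
          if weight < w_low:                         # Russian roulette
              survival_weight = 0.5 * (w_low + w_high)
              if rng.random() < weight / survival_weight:
                  return [survival_weight]           # survives with boosted weight
              return []                              # terminated
          return [weight]                            # inside the window: unchanged

      print(apply_weight_window(0.01, 0.1, 0.5))     # usually killed, sometimes 0.3
      print(apply_weight_window(2.3, 0.1, 0.5))      # split into 5 particles of 0.46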

  1. Sampling methods for phlebotomine sandflies.

    PubMed

    Alexander, B

    2000-06-01

    A review is presented of methods for sampling phlebotomine sandflies (Diptera: Psychodidae). Among approximately 500 species of Phlebotominae so far described, mostly in the New World genus Lutzomyia and the Old World genus Phlebotomus, about 10% are known vectors of Leishmania parasites or other pathogens. Despite being small and fragile, sandflies have a wide geographical range with species occupying a considerable diversity of ecotopes and habitats, from deserts to humid forests, so that suitable methods for collecting them are influenced by environmental conditions where they are sought. Because immature phlebotomines occupy obscure terrestrial habitats, it is difficult to find their breeding sites. Therefore, most trapping methods and sampling procedures focus on sandfly adults, whether resting or active. The diurnal resting sites of adult sandflies include tree holes, buttress roots, rock crevices, houses, animal shelters and burrows, from which they may be aspirated directly or trapped after being disturbed. Sandflies can be collected during their periods of activity by interception traps, or by using attractants such as bait animals, CO2 or light. The method of trapping used should: (a) be suited to the habitat and area to be surveyed, (b) take into account the segment of the sandfly population to be sampled (species, sex and reproduction condition) and (c) yield specimens of appropriate condition for the study objectives (e.g. identification of species present, population genetics or vector implication). Methods for preservation and transportation of sandflies to the laboratory also depend on the objectives of a particular study and are described accordingly. PMID:10872855

  2. Elaborating transition interface sampling methods

    SciTech Connect

    Erp, Titus S. van

    2005-05-01

    We review two recently developed efficient methods for calculating rate constants of processes dominated by rare events in high-dimensional complex systems. The first is transition interface sampling (TIS), based on the measurement of effective fluxes through hypersurfaces in phase space. TIS improves efficiency with respect to standard transition path sampling (TPS) rate constant techniques, because it allows a variable path length and is less sensitive to recrossings. The second method is the partial path version of TIS. Developed for diffusive processes, it exploits the loss of long time correlation. We discuss the relation between the new techniques and the standard reactive flux methods in detail. Path sampling algorithms can suffer from ergodicity problems, and we introduce several new techniques to alleviate these problems, notably path swapping, stochastic configurational bias Monte Carlo shooting moves and order-parameter free path sampling. In addition, we give algorithms to calculate other interesting properties from path ensembles besides rate constants, such as activation energies and reaction mechanisms.

  3. Sample normalization methods in quantitative metabolomics.

    PubMed

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts.

  4. Sequential Importance Sampling for Rare Event Estimation with Computer Experiments

    SciTech Connect

    Williams, Brian J.; Picard, Richard R.

    2012-06-25

    Importance sampling often drastically improves the variance of percentile and quantile estimators of rare events. We propose a sequential strategy for iterative refinement of importance distributions for sampling uncertain inputs to a computer model to estimate quantiles of model output or the probability that the model output exceeds a fixed or random threshold. A framework is introduced for updating a model surrogate to maximize its predictive capability for rare event estimation with sequential importance sampling. Examples of the proposed methodology involving materials strength and nuclear reactor applications will be presented. The conclusions are: (1) Importance sampling improves UQ of percentile and quantile estimates relative to brute force approach; (2) Benefits of importance sampling increase as percentiles become more extreme; (3) Iterative refinement improves importance distributions in relatively few iterations; (4) Surrogates are necessary for slow running codes; (5) Sequential design improves surrogate quality in region of parameter space indicated by importance distributions; and (6) Importance distributions and VRFs stabilize quickly, while quantile estimates may converge slowly.

  5. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, David R.

    1998-01-01

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis.

  6. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, D.R.

    1998-02-03

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis. 3 figs.

  7. Adaptive importance sampling of random walks on continuous state spaces

    SciTech Connect

    Baggerly, K.; Cox, D.; Picard, R.

    1998-11-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material.

  8. Duplex sampling apparatus and method

    DOEpatents

    Brown, Paul E.; Lloyd, Robert

    1992-01-01

    An improved apparatus is provided for sampling a gaseous mixture and for measuring mixture components. The apparatus includes two sampling containers connected in series serving as a duplex sampling apparatus. The apparatus is adapted to independently determine the amounts of condensable and noncondensable gases in admixture from a single sample. More specifically, a first container includes a first port capable of selectively connecting to and disconnecting from a sample source and a second port capable of selectively connecting to and disconnecting from a second container. A second container also includes a first port capable of selectively connecting to and disconnecting from the second port of the first container and a second port capable of either selectively connecting to and disconnecting from a differential pressure source. By cooling a mixture sample in the first container, the condensable vapors form a liquid, leaving noncondensable gases either as free gases or dissolved in the liquid. The condensed liquid is heated to drive out dissolved noncondensable gases, and all the noncondensable gases are transferred to the second container. Then the first and second containers are separated from one another in order to separately determine the amount of noncondensable gases and the amount of condensable gases in the sample.

  9. Apparatus and method for handheld sampling

    DOEpatents

    Staab, Torsten A.

    2005-09-20

    The present invention includes an apparatus, and corresponding method, for taking a sample. The apparatus is built around a frame designed to be held in at least one hand. A sample media is used to secure the sample. A sample media adapter for securing the sample media is operated by a trigger mechanism connectively attached within the frame to the sample media adapter.

  10. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, C.V.

    1991-02-05

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds. 11 figures.

  11. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, Cyril V.

    1991-01-01

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds.

  12. Importance-Sampling Monte Carlo Approach to Classical Spin Systems

    NASA Astrophysics Data System (ADS)

    Huang, Hsing-Mei

    A new approach for carrying out static Monte Carlo calculations of thermodynamic quantities for classical spin systems is proposed. Combining the ideas of coincidence counting and importance sampling, we formulate a scheme for obtaining Γ(E), the number of states for a fixed energy E, and use Γ(E) to compute thermodynamic properties. Using the Ising model as an example, we demonstrate that our procedure leads to accurate numerical results without excessive use of computer time. We also show that the procedure is easily extended to obtaining magnetic properties of the Ising model.
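
    For reference, thermodynamic quantities follow from Γ(E) through the standard relations (added here for context, not quoted from the record):

      Z(\beta) = \sum_{E} \Gamma(E)\, e^{-\beta E}, \qquad
      \langle E \rangle = \frac{1}{Z(\beta)} \sum_{E} E\, \Gamma(E)\, e^{-\beta E}, \qquad
      F(\beta) = -\frac{1}{\beta} \ln Z(\beta).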

  13. Sampling High-Altitude and Stratified Mating Flights of Red Imported Fire Ant

    Technology Transfer Automated Retrieval System (TEKTRAN)

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens ...

  14. Duplex sampling apparatus and method

    SciTech Connect

    Brown, P.E.; Lloyd, R.

    1992-07-07

    This patent describes a method of measuring the condensable vapor content and the noncondensable gaseous content of a mixture of condensable vapors and noncondensable gases. It comprises collecting a quantity of a mixture of condensable vapors and noncondensable gases in a first container, cooling the first container whereby the condensable vapors in the mixture are condensed, transferring the noncondensable gases from the first container to a downstream second container, determining the quantity of condensable vapors retained in the first container, and measuring the pressure of noncondensable gases in the second container, measuring the temperature of noncondensable gases in the second container, thereby determining the quantity of noncondensable gases in the second container from the temperature, pressure and volume, using the ideal gas law: PV = nRT, whereby the ratio of condensable vapors to noncondensable gases in the mixture is determined.
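
    For context, the final step reduces to the ideal-gas relation; a worked example with illustrative numbers (not taken from the patent):

      n = \frac{PV}{RT}
        = \frac{(50\,000\ \mathrm{Pa})(5\times 10^{-4}\ \mathrm{m^3})}
               {(8.314\ \mathrm{J\,mol^{-1}\,K^{-1}})(293\ \mathrm{K})}
        \approx 1.0\times 10^{-2}\ \mathrm{mol}.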

  15. Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2005-05-01

    Probabilistic inference for Bayesian networks is in general NP-hard, whether exact algorithms or approximate methods are used. However, for very complex networks, only approximate methods such as stochastic sampling can provide a solution under a given time constraint. Several simulation methods are currently available: logic sampling (the first proposed stochastic method for Bayesian networks), the likelihood weighting algorithm (the most commonly used simulation method because of its simplicity and efficiency), the Markov blanket scoring method, and the importance sampling algorithm. In this paper, we first briefly review and compare these available simulation methods, then we propose an improved importance sampling algorithm, the linear Gaussian importance sampling (LGIS) algorithm for general hybrid models. LGIS is aimed at hybrid Bayesian networks consisting of both discrete and continuous random variables with arbitrary distributions. It uses a linear function and Gaussian additive noise to approximate the true conditional probability distribution of a continuous variable given both its parents and evidence in a Bayesian network. One of the most important features of the newly developed method is that it can adaptively learn the optimal importance function from previous samples. We test the inference performance of LGIS using a 16-node linear Gaussian model and a 6-node general hybrid model. A performance comparison with other well-known methods such as junction tree (JT) and likelihood weighting (LW) shows that LGIS is very promising.
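
    Likelihood weighting, the baseline the abstract compares against, is compact enough to sketch on a two-node discrete network (made-up probabilities; the paper's LGIS handles hybrid discrete/continuous networks and is not reproduced here).

      import random

      random.seed(5)

      # Toy network Rain -> WetGrass with P(Rain=1)=0.2,
      # P(Wet=1 | Rain=1)=0.9 and P(Wet=1 | Rain=0)=0.1.  Evidence: Wet=1.
      def likelihood_weighting(n_samples=100_000):
          num = den = 0.0
          for _ in range(n_samples):
              rain = 1 if random.random() < 0.2 else 0
              # Evidence nodes are clamped; the sample is weighted by the
              # likelihood of the evidence given its sampled parents.
              weight = 0.9 if rain == 1 else 0.1
              num += weight * rain
              den += weight
          return num / den                 # estimate of P(Rain=1 | Wet=1)

      # Exact answer: 0.2*0.9 / (0.2*0.9 + 0.8*0.1) = 0.18 / 0.26, about 0.692
      print(likelihood_weighting())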

  16. Clever particle filters, sequential importance sampling and the optimal proposal

    NASA Astrophysics Data System (ADS)

    Snyder, Chris

    2014-05-01

    Particle filters rely on sequential importance sampling and it is well known that their performance can depend strongly on the choice of proposal distribution from which new ensemble members (particles) are drawn. The use of clever proposals has seen substantial recent interest in the geophysical literature, with schemes such as the implicit particle filter and the equivalent-weights particle filter. Both these schemes employ proposal distributions at time tk+1 that depend on the state at tk and the observations at time tk+1. I show that, beginning with particles drawn randomly from the conditional distribution of the state at tk given observations through tk, the optimal proposal (the distribution of the state at tk+1 given the state at tk and the observations at tk+1) minimizes the variance of the importance weights for particles at tk over all possible proposal distributions. This means that bounds on the performance of the optimal proposal, such as those given by Snyder (2011), also bound the performance of the implicit and equivalent-weights particle filters. In particular, in spite of the fact that they may be dramatically more effective than other particle filters in specific instances, those schemes will suffer degeneracy (maximum importance weight approaching unity) unless the ensemble size is exponentially large in a quantity that, in the simplest case that all degrees of freedom in the system are i.i.d., is proportional to the system dimension. I will also discuss the behavior to be expected in more general cases, such as global numerical weather prediction, and how that behavior depends qualitatively on the observing network. Snyder, C., 2012: Particle filters, the "optimal" proposal and high-dimensional systems. Proceedings, ECMWF Seminar on Data Assimilation for Atmosphere and Ocean, 6-9 September 2011.
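
    In standard particle-filter notation (not quoted from the abstract), the optimal proposal and the weight update it induces are

      q^{\ast}(x_{k+1}) = p(x_{k+1} \mid x_k, y_{k+1}), \qquad
      w_{k+1} \propto w_k \, p(y_{k+1} \mid x_k)
              = w_k \int p(y_{k+1} \mid x_{k+1})\, p(x_{k+1} \mid x_k)\, \mathrm{d}x_{k+1}.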

  17. Surface self-diffusion constants at low temperature: Monte Carlo transition state theory with importance sampling

    SciTech Connect

    Voter, A.F.; Doll, J.D.

    1984-06-01

    We present an importance-sampling method which, when combined with a Monte Carlo procedure for evaluating transition state theory rates, allows computation of classically exact, transition state theory surface diffusion constants at arbitrarily low temperature. In the importance-sampling method, a weighting factor is applied to the transition state region, and Metropolis steps are chosen from a special distribution which facilitates transfer between the two important regions of configuration space: the binding site minimum and the saddle point between two binding sites. We apply the method to the diffusion of Rh on Rh(111) and Rh on Rh(100), in the temperature range of existing field ion microscope experiments.

  18. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004

  19. Methods for Studying Ciliary Import Mechanisms.

    PubMed

    Takao, Daisuke; Verhey, Kristen J

    2016-01-01

    Cilia and flagella are microtubule-based organelles that play important roles in human health by contributing to cellular motility as well as sensing and responding to environmental cues. Defects in cilia formation and function cause a broad class of human genetic diseases called ciliopathies. To carry out their specialized functions, cilia contain a unique complement of proteins that must be imported into the ciliary compartment. In this chapter, we describe methods to measure the permeability barrier of the ciliary gate by microinjection of fluorescent proteins and dextrans of different sizes into ciliated cells. We also describe a fluorescence recovery after photobleaching (FRAP) assay to measure the entry of ciliary proteins into the ciliary compartment. These assays can be used to determine the molecular mechanisms that regulate the formation and function of cilia in mammalian cells. PMID:27514912

  20. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    ERIC Educational Resources Information Center

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…

  1. Subrandom methods for multidimensional nonuniform sampling.

    PubMed

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.

  2. Subrandom methods for multidimensional nonuniform sampling

    NASA Astrophysics Data System (ADS)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.
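
    A seed-free schedule of the general kind discussed above can be illustrated with a golden-ratio additive recurrence mapped through an exponentially weighted one-dimensional grid. The grid size, weighting, and function names below are made up; this is not the paper's scheduler.

      import numpy as np

      def subrandom_schedule(grid_size=128, n_points=32, decay=0.02):
          """Select grid indices with a golden-ratio additive recurrence.

          The low-discrepancy sequence u_k = frac(k * phi) replaces the
          pseudorandom draws (and hence the seed) used in weighted sampling.
          """
          phi = (np.sqrt(5.0) - 1.0) / 2.0
          u = np.mod(np.arange(1, 10 * grid_size) * phi, 1.0)

          # Exponentially decaying sampling weights over the grid, as is common
          # for NMR evolution times; the cumulative weights map [0,1) to indices.
          weights = np.exp(-decay * np.arange(grid_size))
          cdf = np.cumsum(weights) / np.sum(weights)

          schedule = []
          for value in u:
              idx = int(np.searchsorted(cdf, value))
              if idx not in schedule:
                  schedule.append(idx)
              if len(schedule) == n_points:
                  break
          return sorted(schedule)

      print(subrandom_schedule())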

  3. Understanding Mars: The Geologic Importance of Returned Samples

    NASA Astrophysics Data System (ADS)

    Christensen, P. R.

    2011-12-01

    what are the nature, ages, and origin of the diverse suite of aqueous environments, were any of them habitable, how, when, and why did environments vary through time, and finally, did any of them host life or its precursors? A critical next step toward answering these questions would be provided through the analysis of carefully selected samples from geologically diverse and well-characterized sites that are returned to Earth for detailed study. This sample return campaign is envisioned as a sequence of three missions that collect the samples, place them into Mars orbit, and return them to Earth. Our existing scientific knowledge of Mars makes it possible to select a site at which specific, detailed hypotheses can be tested, and from which the orbital mapping can be validated and extended globally. Existing and future analysis techniques developed in laboratories around the world will provide the means to perform a wide array of tests on these samples, develop hypotheses for the origin of their chemical, isotopic, and morphologic signatures, and, most importantly, perform follow-up measurements to test and validate the findings. These analyses will dramatically improve our understanding of the geologic processes and history of Mars, and through their ties to the global geologic context, will once again revolutionize our understanding of this complex planet.

  4. Method and apparatus for data sampling

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.

  5. Method and apparatus for data sampling

    DOEpatents

    Odell, D.M.C.

    1994-04-19

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples is described. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.
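
    The digital discrimination step can be pictured with a toy trace (simulated baseline noise plus two injected pulses; the threshold and pulse sizes are arbitrary, not taken from the patent): only samples above the discrimination level are kept as event data.

      import numpy as np

      rng = np.random.default_rng(6)

      # Simulated high-rate A/D samples: baseline noise plus two injected pulses.
      samples = rng.normal(0.0, 1.0, size=1000)
      samples[300:310] += 40.0
      samples[700:712] += 25.0

      threshold = 5.0                      # digital discrimination level
      keep = samples > threshold
      # Keep only the digital values that represent detected events.
      events = list(zip(np.flatnonzero(keep).tolist(),
                        samples[keep].round(2).tolist()))
      print(len(events), events[:3])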

  6. Mixed Methods Sampling: A Typology with Examples

    ERIC Educational Resources Information Center

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  7. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  8. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  9. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  10. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  11. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  12. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling, including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
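
    Two of the probability methods listed above are easy to contrast in code. The sampling frame below (1,000 patients in three strata) and the stratum names are purely illustrative.

      import random

      random.seed(7)

      # Hypothetical sampling frame: 1,000 patients in three clinical strata.
      frame = ([("cardiac", i) for i in range(500)]
               + [("stroke", i) for i in range(300)]
               + [("diabetes", i) for i in range(200)])

      # Simple random sampling: every element has an equal, independent chance.
      srs = random.sample(frame, 100)

      # Proportionate stratified sampling: sample within each stratum so the
      # sample mirrors the population's stratum proportions.
      strata = {}
      for unit in frame:
          strata.setdefault(unit[0], []).append(unit)

      stratified = []
      for name, units in strata.items():
          k = round(100 * len(units) / len(frame))
          stratified.extend(random.sample(units, k))

      counts = {name: sum(u[0] == name for u in stratified) for name in strata}
      print(len(srs), counts)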

  13. Dynamic Method for Identifying Collected Sample Mass

    NASA Technical Reports Server (NTRS)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
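
    The idea of identifying mass as a computation can be boiled down to a least-squares fit of F = m a. The numbers and noise level below are synthetic; G-Sample's actual maximum-likelihood estimator and spacecraft dynamics model are not reproduced here.

      import numpy as np

      rng = np.random.default_rng(8)

      m_base = 150.0          # known spacecraft mass, kg (hypothetical)
      m_sample_true = 0.35    # collected sample mass to be identified, kg

      # Known thruster force profile and noisy on-board acceleration measurements.
      forces = rng.uniform(5.0, 20.0, size=200)                              # N
      accels = forces / (m_base + m_sample_true) + rng.normal(0.0, 1e-4, size=200)

      # Least-squares fit of the total mass from F = m * a, then remove the base mass.
      m_total_hat = np.sum(forces * accels) / np.sum(accels ** 2)
      print(m_total_hat - m_base)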

  14. Photographic sampling: a photographic sampling method for mites on plants.

    PubMed

    Sircom, J

    2000-01-01

    A photographic sampling method for mites on plants was evaluated using Tetranychus urticae and Phytoseiulus persimilis on pepper plants. It was found to be 92% accurate for T. urticae eggs and 98% accurate for P. persimilis eggs at densities up to 45 eggs per cm² for T. urticae, and up to 3 eggs per cm² for P. persimilis. The motiles of the two species were not confused, nor were they confused with exuviae or other matter.

  15. Systems and methods for sample analysis

    SciTech Connect

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-10-20

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  16. Systems and methods for sample analysis

    DOEpatents

    Cooks, Robert Graham; Li, Guangtao; Li, Xin; Ouyang, Zheng

    2015-01-13

    The invention generally relates to systems and methods for sample analysis. In certain embodiments, the invention provides a system for analyzing a sample that includes a probe including a material connected to a high voltage source, a device for generating a heated gas, and a mass analyzer.

  17. Sampling Plant Diversity and Rarity at Landscape Scales: Importance of Sampling Time in Species Detectability

    PubMed Central

    Zhang, Jian; Nielsen, Scott E.; Grainger, Tess N.; Kohler, Monica; Chipchar, Tim; Farr, Daniel R.

    2014-01-01

    Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed on how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter-hectare time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2 year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. For rare species, time-unlimited surveys had ∼65% more detections after the first 20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys. PMID:24740179
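
    The cut-off comparison in this abstract is easy to reproduce on one's own field data. The sketch below uses made-up first-detection times for a single plot (not the study's records) to count how many species a fixed-time protocol would miss at 20-, 60-, and 90-minute cut-offs.

        def species_missed(first_detection_minutes, cutoffs=(20, 60, 90)):
            """Count species whose first detection falls after each time cut-off."""
            return {c: sum(1 for t in first_detection_minutes.values() if t > c)
                    for c in cutoffs}

        # First-detection time (minutes) per species in one plot, invented for illustration
        detections = {"Carex aquatilis": 4, "Rosa acicularis": 12,
                      "Drosera anglica": 47, "Listera borealis": 83,
                      "Sparganium natans": 131}
        print(species_missed(detections))   # {20: 3, 60: 2, 90: 1}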

  18. 40 CFR 80.1349 - Alternative sampling and testing requirements for importers who import gasoline into the United...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirements for importers who import gasoline into the United States by truck. 80.1349 Section 80.1349... FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1349 Alternative sampling and testing requirements for importers who import gasoline into the United States...

  19. Method and apparatus for sampling atmospheric mercury

    DOEpatents

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  20. 40 CFR 80.1630 - Sampling and testing requirements for refiners, gasoline importers and producers and importers of...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... refiners, gasoline importers and producers and importers of certified ethanol denaturant. 80.1630 Section... refiners, gasoline importers and producers and importers of certified ethanol denaturant. (a) Sample and test each batch of gasoline and certified ethanol denaturant. (1) Refiners and importers shall...

  1. An Importance Sampling EM Algorithm for Latent Regression Models

    ERIC Educational Resources Information Center

    von Davier, Matthias; Sinharay, Sandip

    2007-01-01

    Reporting methods used in large-scale assessments such as the National Assessment of Educational Progress (NAEP) rely on latent regression models. To fit the latent regression model using the maximum likelihood estimation technique, multivariate integrals must be evaluated. In the computer program MGROUP used by the Educational Testing Service for…

  2. A method for selecting training samples based on camera response

    NASA Astrophysics Data System (ADS)

    Zhang, Leihong; Li, Bei; Pan, Zilan; Liang, Dong; Kang, Yi; Zhang, Dawei; Ma, Xiuhua

    2016-09-01

    In the process of spectral reflectance reconstruction, sample selection plays an important role in the accuracy of the constructed model and in reconstruction effects. In this paper, a method for training sample selection based on camera response is proposed. It has been proved that the camera response value has a close correlation with the spectral reflectance. Consequently, in this paper we adopt the technique of drawing a sphere in camera response value space to select the training samples which have a higher correlation with the test samples. In addition, the Wiener estimation method is used to reconstruct the spectral reflectance. Finally, we find that the method of sample selection based on camera response value has the smallest color difference and root mean square error after reconstruction compared to the method using the full set of Munsell color charts, the Mohammadi training sample selection method, and the stratified sampling method. Moreover, the goodness of fit coefficient of this method is also the highest among the four sample selection methods. Taking all the factors mentioned above into consideration, the method of training sample selection based on camera response value enhances the reconstruction accuracy from both the colorimetric and spectral perspectives.
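
    The paper's selection rule is described only verbally above. The sketch below shows one plausible reading of it, with invented data and a simple ridge-regularized (Wiener-style) reconstruction standing in for the paper's exact estimator: training patches are kept when their camera RGB response falls within a Euclidean radius of the test patch's response.

        import numpy as np

        def select_training_samples(train_rgb, test_rgb, radius):
            """Indices of training patches whose response lies within the sphere."""
            d = np.linalg.norm(train_rgb - test_rgb, axis=1)
            return np.where(d <= radius)[0]

        def wiener_like_reconstruction(train_rgb, train_refl, test_rgb, reg=1e-3):
            """Ridge-regularized linear map from camera response to reflectance."""
            R, C = train_refl.T, train_rgb.T                     # (bands x n), (3 x n)
            W = R @ C.T @ np.linalg.inv(C @ C.T + reg * np.eye(3))
            return W @ test_rgb

        # Invented data: 200 training patches, 31-band reflectances, a made-up camera model
        rng = np.random.default_rng(0)
        train_refl = rng.uniform(0.0, 1.0, (200, 31))
        camera = rng.uniform(0.0, 1.0, (31, 3))
        train_rgb = train_refl @ camera
        test_refl = rng.uniform(0.0, 1.0, 31)
        test_rgb = test_refl @ camera

        radius = np.percentile(np.linalg.norm(train_rgb - test_rgb, axis=1), 25)
        idx = select_training_samples(train_rgb, test_rgb, radius)
        estimate = wiener_like_reconstruction(train_rgb[idx], train_refl[idx], test_rgb)
        print(idx.size, estimate.shape)                          # subset size, (31,)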

  3. What Is Hypercalcemia? The Importance of Fasting Samples

    PubMed Central

    Siyam, Fadi F.; Klachko, David M.

    2013-01-01

    The differentiation between primary or tertiary (both hypercalcemic) and secondary (normocalcemic) hyperparathyroidism requires the identification of hypercalcemia. Calcium in the blood exists as bound, complexed and ionized fractions. Calcium sensors on parathyroid cells interact only with the ionized fraction (about 50% of the total calcium concentration). Many formulas using albumin, total protein or phosphate to correct or adjust total calcium to reflect the level of ionized calcium may be accurate only within a limited range. In addition, they can introduce errors based on inaccuracies in the measurement of these other metabolites. Clinical conditions, mainly those illnesses affecting acid-base balance, can alter the proportions of bound and free calcium. How and when the blood samples are drawn can alter the level of total calcium. Prolonged standing or prolonged venous stasis causes hemoconcentration, increasing the bound fraction. Preceding exercise can also affect blood calcium levels. Ingestion of calcium supplements or calcium-containing nutrients can cause transient elevations in blood calcium levels lasting several hours, leading to unnecessary further testing. Fasting total calcium levels may be sufficient for monitoring progress. However, for diagnostic purposes, fasting ionized calcium levels should be used. Therefore, for an isolated high total calcium level, we recommend obtaining a repeat fasting total and ionized calcium measurement before further investigations. Hypercalcemia may be diagnosed if there are persistent or frequent total or, preferably, ionized calcium levels >3 SD above the mean of the normal range or if there are progressively rising levels. PMID:24474951
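
    For context on the correction formulas the authors caution against over-trusting, the snippet below shows one widely taught albumin adjustment in conventional US units; the specific formula is not given in the abstract and is included only as an illustration of the kind of correction being discussed.

        def albumin_corrected_calcium(total_ca_mg_dl, albumin_g_dl):
            """Corrected Ca (mg/dL) = total Ca + 0.8 * (4.0 - albumin); illustrative only."""
            return total_ca_mg_dl + 0.8 * (4.0 - albumin_g_dl)

        print(albumin_corrected_calcium(10.4, 2.5))   # 11.6 mg/dL for a hypoalbuminemic patient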

  4. Bayesian individualization via sampling-based methods.

    PubMed

    Wakefield, J

    1996-02-01

    We consider the situation where we wish to adjust the dosage regimen of a patient based on (in general) sparse concentration measurements taken on-line. A Bayesian decision theory approach is taken which requires the specification of an appropriate prior distribution and loss function. A simple method for obtaining samples from the posterior distribution of the pharmacokinetic parameters of the patient is described. In general, these samples are used to obtain a Monte Carlo estimate of the expected loss which is then minimized with respect to the dosage regimen. Some special cases which yield analytic solutions are described. When the prior distribution is based on a population analysis then a method of accounting for the uncertainty in the population parameters is described. Two simulation studies showing how the methods work in practice are presented. PMID:8827585
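
    The abstract sketches the decision-theoretic recipe in words. The code below is a schematic illustration of that recipe only; the one-compartment steady-state model, quadratic loss, dosing interval, and posterior draws are all assumptions introduced here, not the paper's model.

        import numpy as np

        def expected_loss(dose, post_clearance, tau=12.0, target=15.0):
            """Monte Carlo expected quadratic loss around a target concentration."""
            conc = dose / (post_clearance * tau)      # average steady-state concentration
            return np.mean((conc - target) ** 2)

        # Posterior draws of clearance (L/h) for one patient -- invented numbers
        rng = np.random.default_rng(2)
        post_clearance = rng.lognormal(mean=np.log(0.5), sigma=0.2, size=5000)

        candidate_doses = np.arange(50, 201, 5)       # mg given every 12 h
        losses = [expected_loss(d, post_clearance) for d in candidate_doses]
        print(candidate_doses[int(np.argmin(losses))])  # dose minimizing expected loss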

  5. Actinide recovery method -- Large soil samples

    SciTech Connect

    Maxwell, S.L. III

    2000-04-25

    There is a need to measure actinides in environmental samples with lower and lower detection limits, requiring larger sample sizes. This analysis is adversely affected by sample-matrix interferences, which make analyzing soil samples above five grams very difficult. A new Actinide-Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides from large soil samples. Diphonix Resin (Eichrom Industries), a 1994 R&D 100 winner, is used to preconcentrate the actinides from large soil samples; the actinides bind powerfully to the resin's diphosphonic acid groups. A rapid microwave-digestion technique is used to remove the actinides from the Diphonix Resin, which effectively eliminates interfering matrix components from the soil matrix. The microwave-digestion technique is more effective and less tedious than catalyzed hydrogen peroxide digestions of the resin or digestion of diphosphonic stripping agents such as HEDPA. After resin digestion, the actinides are recovered in a small volume of nitric acid which can be loaded onto small extraction chromatography columns, such as TEVA Resin, U-TEVA Resin or TRU Resin (Eichrom Industries). Small, selective extraction columns do not generate large volumes of liquid waste and provide consistent tracer recoveries after soil matrix elimination.

  6. A comparison of two ozone sampling methods

    SciTech Connect

    Downey, E.B.; Buchan, R.M.; Blehm, K.D.; Gunter, B.J.

    1983-05-01

    A study was conducted to compare the alkaline potassium iodide (AKI) impinger method with a direct-reading chemiluminescent monitor for determining ozone concentrations. Comparisons were made in both a controlled laboratory situation and in the field during MIG welding. Laboratory results indicated that the accuracy of the AKI procedure is affected by sample size. In the field, AKI impinger samples seemed to give very low estimations of the true ozone concentration. The direct-reading chemiluminescent monitor performed excellently in both the laboratory and field, and exhibited its merit as an industrial hygiene field instrument.

  7. Actinide Recovery Method for Large Soil Samples

    SciTech Connect

    Maxwell, S.L. III; Nichols, S.

    1998-11-01

    A new Actinide Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides in very large soil samples. Diphonix Resin® is used to eliminate soil matrix interferences and preconcentrate actinides after soil leaching or soil fusion. A rapid microwave digestion technique is used to remove the actinides from the Diphonix Resin®. After the resin digestion, the actinides are recovered in a small volume of nitric acid which can be easily loaded onto small extraction-chromatography columns, such as TEVA Resin®, U-TEVA Resin® or TRU Resin® (Eichrom Industries). This method enables the application of small, selective extraction columns to recover actinides from very large soil samples with high selectivity, consistent tracer recoveries and minimal liquid waste.

  8. Methods for Sampling of Airborne Viruses

    PubMed Central

    Verreault, Daniel; Moineau, Sylvain; Duchaine, Caroline

    2008-01-01

    Summary: To better understand the underlying mechanisms of aerovirology, accurate sampling of airborne viruses is fundamental. The sampling instruments commonly used in aerobiology have also been used to recover viruses suspended in the air. We reviewed over 100 papers to evaluate the methods currently used for viral aerosol sampling. Differentiating infections caused by direct contact from those caused by airborne dissemination can be a very demanding task given the wide variety of sources of viral aerosols. While epidemiological data can help to determine the source of the contamination, direct data obtained from air samples can provide very useful information for risk assessment purposes. Many types of samplers have been used over the years, including liquid impingers, solid impactors, filters, electrostatic precipitators, and many others. The efficiencies of these samplers depend on a variety of environmental and methodological factors that can affect the integrity of the virus structure. The aerodynamic size distribution of the aerosol also has a direct effect on sampler efficiency. Viral aerosols can be studied under controlled laboratory conditions, using biological or nonbiological tracers and surrogate viruses, which are also discussed in this review. Lastly, general recommendations are made regarding future studies on the sampling of airborne viruses. PMID:18772283

  9. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2006-08-01

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional Taqman.RTM. probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as, Cy3.TM., as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA, on the 5' end.

  10. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2011-07-05

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional Taqman.RTM. probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as, Cy3.TM., as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM.TM. on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA.TM., on the 5' end.

  11. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  12. Field evaluation of a VOST sampling method

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Fuerst, R.G.; McGaughey, J.F.; Bursey, J.T.; Merrill, R.G.

    1994-12-31

    The VOST (SW-846 Method 0030) specifies the use of Tenax® and a particular petroleum-based charcoal (SKC Lot 104, or its equivalent), which is no longer commercially available. In field evaluation studies of VOST methodology, a replacement petroleum-based charcoal has been used: candidate replacement sorbents for charcoal were studied, and Anasorb® 747, a carbon-based sorbent, was selected for field testing. The sampling train was modified to use only Anasorb® in the back tube and Tenax® in the two front tubes to avoid analytical difficulties associated with the analysis of the sequential-bed back tube used in the standard VOST train. The standard (SW-846 Method 0030) and the modified VOST methods were evaluated at a chemical manufacturing facility using a quadruple probe system with quadruple trains. In this field test, known concentrations of the halogenated volatile organic compounds that are listed in the Clean Air Act Amendments of 1990, Title 3, were introduced into the VOST train and the modified VOST train, using the same certified gas cylinder as a source of test compounds. Statistical tests of the comparability of methods were performed on a compound-by-compound basis. For most compounds, the VOST and modified VOST methods were found to be statistically equivalent.
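
    The abstract reports compound-by-compound comparability tests without naming the procedure. The snippet below illustrates one common choice, a paired t-test on co-located recoveries, with hypothetical numbers; it is not a claim about the statistical test actually used in the study.

        from scipy import stats

        # Hypothetical percent recoveries for co-located standard / modified VOST runs
        recoveries = {
            "chloroform":           ([92, 95, 90, 94], [91, 96, 92, 93]),
            "vinyl chloride":       ([78, 81, 80, 79], [80, 82, 79, 81]),
            "carbon tetrachloride": ([88, 90, 87, 91], [89, 88, 90, 92]),
        }
        for compound, (standard, modified) in recoveries.items():
            t, p = stats.ttest_rel(standard, modified)
            verdict = "no significant difference" if p > 0.05 else "methods differ"
            print(f"{compound}: t = {t:.2f}, p = {p:.3f} -> {verdict}")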

  13. A method for sampling waste corn

    USGS Publications Warehouse

    Frederick, R.B.; Klaas, E.E.; Baldassarre, G.A.; Reinecke, K.J.

    1984-01-01

    Corn has become one of the most important wildlife foods in the United States. It is eaten by a wide variety of animals, including white-tailed deer (Odocoileus virginianus), raccoon (Procyon lotor), ring-necked pheasant (Phasianus colchicus), wild turkey (Meleagris gallopavo), and many species of aquatic birds. Damage to unharvested crops has been documented, but many birds and mammals eat waste grain after harvest and do not conflict with agriculture. A good method for measuring waste-corn availability can be essential to studies concerning food density and the food and feeding habits of field-feeding wildlife. Previous methods were developed primarily for approximating losses due to harvest machinery. In this paper, a method is described for estimating the amount of waste corn potentially available to wildlife. Detection of temporal changes in food availability and differences caused by agricultural operations (e.g., recently harvested stubble fields vs. plowed fields) are discussed.

  14. Standard methods for sampling North American freshwater fishes

    USGS Publications Warehouse

    Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.

    2009-01-01

    This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing cold and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.

  15. System and Method for Isolation of Samples

    NASA Technical Reports Server (NTRS)

    Zhang, Ye (Inventor); Wu, Honglu (Inventor)

    2014-01-01

    Systems and methods for isolating samples are provided. The system comprises a first membrane and a second membrane disposed within an enclosure. First and second reservoirs can also be disposed within the enclosure and adapted to contain one or more reagents therein. A first valve can be disposed within the enclosure and in fluid communication with the first reservoir, the second reservoir, or both. The first valve can also be in fluid communication with the first or second membranes or both. The first valve can be adapted to selectively regulate the flow of the reagents from the first reservoir, through at least one of the first and second membranes, and into the second reservoir.

  16. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    PubMed

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, amino acids are important analytes in biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Some typical materials in sample preparation, including silica, polymer, carbon, boric acid and so on, are introduced in this paper. Meanwhile, the applications and developments of analytical methods of polar small molecules, such as reversed-phase liquid chromatography, hydrophilic interaction chromatography, etc., are also reviewed. PMID:26753274

  17. Catching Stardust and Bringing it Home: The Astronomical Importance of Sample Return

    NASA Astrophysics Data System (ADS)

    Brownlee, D.

    2002-12-01

    The return of lunar samples by the Apollo program provided the first opportunity to perform detailed laboratory studies of ancient solid materials from a known astronomical body. The highly detailed study of the samples, using the best available laboratory instruments and techniques, revolutionized our understanding of the Moon and provided fundamental insight into the remarkable and violent processes that occur early in the history of moons and terrestrial planets. This type of astronomical paleontology is only possible with samples, and yet the last US sample return was made by Apollo 17, over thirty years ago. The NASA Stardust mission began a new era of sample missions with its 1999 launch to retrieve samples from the short-period comet Wild 2. Genesis (a solar wind collector) was launched in 2001, the Japanese MUSES-C asteroid sample return mission will launch in 2003, and Mars sample return missions are under study. All of these missions will use sophisticated ground-based instrumentation to provide types of information that cannot be obtained by astronomical and spacecraft remote sensing methods. In the case of Stardust, the goal is to determine the fundamental nature of the initial solid building blocks of solar systems at atomic-scale spatial resolution. The samples returned by the mission will be samples from the Kuiper Belt region, and they are probably composed of submicron silicate and organic materials of both presolar and nebular origin. The detailed records contained in the elemental, chemical, isotopic, and mineralogical composition of these tiny components can only be appropriately explored with the full power, precision, and flexibility of laboratory instrumentation. Laboratory instrumentation has the advantage that it is state-of-the-art and is not limited by serious considerations of power, mass, cost, or even reliability. The comparison of the comet sample, accumulated beyond Neptune, with asteroidal meteorites that accumulated just beyond the

  18. Sampling high-altitude and stratified mating flights of red imported fire ant.

    PubMed

    Fritz, Gary N; Fritz, Ann H; Vander Meer, Robert K

    2011-05-01

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens and males during mating flights at altitudinal intervals reaching as high as approximately 140 m. Our trapping system uses an electric winch and a 1.2-m spindle bolted to a swiveling platform. The winch dispenses up to 183 m of Kevlar-core nylon rope and the spindle stores 10 panels (0.9 by 4.6 m each) of nylon tulle impregnated with Tangle-Trap. The panels can be attached to the rope at various intervals and hoisted into the air by using a 3-m-diameter, helium-filled balloon. Raising or lowering all 10 panels takes approximately 15-20 min. This trap also should be useful for altitudinal sampling of other insects of medical importance.

  19. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., up to four) used in optimal seasons were not present. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative for monitoring lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.
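
    One way to see the "few gears in the right seasons" result is to accumulate species richness as gear-season combinations are added. The sketch below does this with an invented detection table; the gears, seasons, and species are placeholders, not the study's data.

        # Invented detection table: which species each gear-season combination caught
        detections = {
            ("boat electrofishing", "fall"):  {"bluegill", "largemouth bass", "walleye", "yellow perch"},
            ("benthic trawl", "summer"):      {"yellow perch", "johnny darter", "trout-perch"},
            ("modified fyke net", "summer"):  {"bluegill", "black bullhead", "golden shiner"},
            ("mini fyke net", "summer"):      {"golden shiner", "brook stickleback"},
        }
        all_species = set().union(*detections.values())
        covered = set()
        for combo, species in detections.items():
            covered |= species
            print(f"after {combo}: {len(covered)}/{len(all_species)} species")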

  20. The experience sampling method: Investigating students' affective experience

    NASA Astrophysics Data System (ADS)

    Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.

    2013-01-01

    Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).

  1. System and method for extracting a sample from a surface

    DOEpatents

    Van Berkel, Gary; Covey, Thomas

    2015-06-23

    A system and method is disclosed for extracting a sample from a sample surface. A sample is provided and a sample surface receives the sample which is deposited on the sample surface. A hydrophobic material is applied to the sample surface, and one or more devices are configured to dispense a liquid on the sample, the liquid dissolving the sample to form a dissolved sample material, and the one or more devices are configured to extract the dissolved sample material from the sample surface.

  2. Examination of Hydrate Formation Methods: Trying to Create Representative Samples

    SciTech Connect

    Kneafsey, T.J.; Rees, E.V.L.; Nakagawa, S.; Kwon, T.-H.

    2011-04-01

    Forming representative gas hydrate-bearing laboratory samples is important so that the properties of these materials may be measured, while controlling the composition and other variables. Natural samples are rare, and have often experienced pressure and temperature changes that may affect the property to be measured [Waite et al., 2008]. Forming methane hydrate samples in the laboratory has been done a number of ways, each having advantages and disadvantages. The ice-to-hydrate method [Stern et al., 1996] contacts melting ice with methane at the appropriate pressure to form hydrate. The hydrate can then be crushed and mixed with mineral grains under controlled conditions, and then compacted to create laboratory samples of methane hydrate in a mineral medium. The hydrate in these samples will be part of the load-bearing frame of the medium. In the excess gas method [Handa and Stupin, 1992], water is distributed throughout a mineral medium (e.g., packed moist sand, drained sand, moistened silica gel, or other porous media) and the mixture is brought to hydrate-stable conditions (chilled and pressurized with gas), allowing hydrate to form. This method typically produces grain-cementing hydrate from pendular water in sand [Waite et al., 2004]. In the dissolved gas method [Tohidi et al., 2002], water with sufficient dissolved guest molecules is brought to hydrate-stable conditions where hydrate forms. In the laboratory, this can be done by pre-dissolving the gas of interest in water and then introducing it to the sample under the appropriate conditions. With this method, it is easier to form hydrate from more soluble gases such as carbon dioxide. It is thought that this method more closely simulates the way most natural gas hydrate has formed. Laboratory implementation, however, is difficult, and sample formation is prohibitively time consuming [Minagawa et al., 2005; Spangenberg and Kulenkampff, 2005]. In another version of this technique, a specified quantity of gas

  3. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or... 7 Agriculture 3 2011-01-01 2011-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods...

  4. Potassium bromide method of infrared sampling

    USGS Publications Warehouse

    Milkey, R.G.

    1958-01-01

    In the preparation of potassium bromide pressed windows for use in the infrared analysis of solids, severe grinding of the potassium bromide powder may produce strong absorption bands that could interfere seriously with the spectra of the sample. These absorption bands appear to be due to some crystal alteration of the potassium bromide as a result of the grinding process. They were less apt to occur when the coarser powder, which had received a relatively gentle grinding, was used. Window blanks prepared from the coarser powders showed smaller adsorbed water peaks and generally higher over-all transmittance readings than windows pressed from the very fine powders.

  5. Methods for the analysis of carpet samples for asbestos

    SciTech Connect

    Millette, J.R.; Clark, P.J.; Brackett, K.A.; Wheeles, R.K.

    1993-01-01

    Because of the nature of carpet pile, no samples can be directly prepared from carpet for analysis by transmission electron microscopy (TEM). Two indirect methods are currently used by laboratories when preparing samples for measuring the amount of asbestos present in carpet material. One is an ultrasonic shaking technique which requires that a portion of the carpet be cut and sent to the laboratory. The other is a micro-vacuuming technique which has been used generally in the assessment of asbestos in settled dust in buildings. It is not destructive to the carpet. Both methods utilize TEM to identify, measure and count the asbestos fibers found. Each can provide important but different information when an assessment of the level of contamination of carpeting is being made.

  6. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    The information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used for clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, finding the links between samples is more important than the other uses of methamphetamine profiling. Many Asian countries including Japan and South Korea have been using the method developed by the National Research Institute of Police Science of Japan. The method used gas chromatography with flame ionization detection (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collected the results from the data into an Excel file and then corrected the retention time shift and response deviation generated from the sample preparation and instrument analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of the two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
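
    The comparison step described above is straightforward to express in code. The original work used Excel/VBA modules; the sketch below shows only the similarity calculation, in Python, on hypothetical peak-area vectors assumed to be already aligned to common retention-time slots.

        import numpy as np

        def pearson_similarity(peaks_a, peaks_b):
            """Pearson correlation of two aligned, normalized impurity-peak profiles."""
            a = np.asarray(peaks_a, dtype=float)
            b = np.asarray(peaks_b, dtype=float)
            a, b = a / a.sum(), b / b.sum()          # normalize total impurity response
            return np.corrcoef(a, b)[0, 1]

        sample_1 = [120, 35, 8, 410, 56, 12, 3]      # impurity peak areas, seizure A (invented)
        sample_2 = [118, 33, 9, 402, 60, 11, 4]      # second seizure, same hypothetical origin
        sample_3 = [15, 210, 90, 40, 5, 160, 70]     # unrelated seizure
        print(pearson_similarity(sample_1, sample_2))   # close to 1 -> likely common origin
        print(pearson_similarity(sample_1, sample_3))   # much lower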

  7. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    The information about the sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used for clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, finding the links between samples is more important than the other uses of methamphetamine profiling. Many Asian countries including Japan and South Korea have been using the method developed by the National Research Institute of Police Science of Japan. The method used gas chromatography with flame ionization detection (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collected the results from the data into an Excel file and then corrected the retention time shift and response deviation generated from the sample preparation and instrument analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of the two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical

  8. Evaluation of Sampling Methods and Development of Sample Plans for Estimating Predator Densities in Cotton

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The cost-reliability of five sampling methods (visual search, drop cloth, beat bucket, shake bucket and sweep net) was determined for predatory arthropods on cotton plants. The beat bucket sample method was the most cost-reliable while the visual sample method was the least cost-reliable. The beat ...

  9. Modified electrokinetic sample injection method in chromatography and electrophoresis analysis

    DOEpatents

    Davidson, J. Courtney; Balch, Joseph W.

    2001-01-01

    A sample injection method for horizontally configured multiple chromatography or electrophoresis units, each containing a number of separation/analysis channels, that enables efficient introduction of analyte samples. This method for loading, when taken in conjunction with horizontal microchannels, allows much reduced sample volumes and a means of sample stacking to greatly reduce the concentration of the sample. This reduction in the amount of sample can lead to great cost savings in sample preparation, particularly in massively parallel applications such as DNA sequencing. The essence of this method is in the preparation of the input of the separation channel, the physical sample introduction, and the subsequent removal of excess material. By this method, sample volumes of 100 nanoliters to 2 microliters have been used successfully, compared to the typical 5 microliters of sample required by the prior separation/analysis method.

  10. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official... 7 Agriculture 3 2011-01-01 2011-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable...

  11. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  12. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  13. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  14. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  15. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Representative Sampling Methods I...—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with the... Crushed or powdered material—ASTM Standard D346-75 Soil or rock-like material—ASTM Standard D420-69...

  16. Microfluidic DNA sample preparation method and device

    DOEpatents

    Krulevitch, Peter A.; Miles, Robin R.; Wang, Xiao-Bo; Mariella, Raymond P.; Gascoyne, Peter R. C.; Balch, Joseph W.

    2002-01-01

    Manipulation of DNA molecules in solution has become an essential aspect of genetic analyses used for biomedical assays, the identification of hazardous bacterial agents, and in decoding the human genome. Currently, most of the steps involved in preparing a DNA sample for analysis are performed manually and are time, labor, and equipment intensive. These steps include extraction of the DNA from spores or cells, separation of the DNA from other particles and molecules in the solution (e.g. dust, smoke, cell/spore debris, and proteins), and separation of the DNA itself into strands of specific lengths. Dielectrophoresis (DEP), a phenomenon whereby polarizable particles move in response to a gradient in electric field, can be used to manipulate and separate DNA in an automated fashion, considerably reducing the time and expense involved in DNA analyses, as well as allowing for the miniaturization of DNA analysis instruments. These applications include direct transport of DNA, trapping of DNA to allow for its separation from other particles or molecules in the solution, and the separation of DNA into strands of varying lengths.

  17. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    PubMed

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function.
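
    The abstract describes the WPIS acceptance step in words. The toy sketch below illustrates the rejection idea in one dimension with invented parameters: closed free-particle paths (discrete Brownian bridges about the centroid) are accepted with a harmonic weight that is at most one, so the weight can serve directly as an acceptance probability. It is not the authors' production code and omits all of the path-integral bookkeeping.

        import numpy as np

        rng = np.random.default_rng(3)

        def brownian_bridge(n_beads, sigma):
            """Discrete free-particle fluctuation pinned to return to the centroid."""
            steps = rng.normal(0.0, sigma, n_beads)
            walk = np.cumsum(steps)
            t = np.arange(1, n_beads + 1) / n_beads
            return walk - t * walk[-1]               # last bead forced back to 0

        def sample_guided_path(n_beads=64, sigma=0.05, k=40.0, dt=1.0 / 64):
            """Rejection-sample a free-particle path against a harmonic weight."""
            while True:
                path = brownian_bridge(n_beads, sigma)
                weight = np.exp(-0.5 * k * np.sum(path ** 2) * dt)   # always <= 1
                if rng.uniform() < weight:                           # harmonic acceptance test
                    return path

        accepted = sample_guided_path()
        print(accepted.shape, float(accepted.std()))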

  18. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    PubMed

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function. PMID:26801023

  19. Coalescent: an open-science framework for importance sampling in coalescent theory.

    PubMed

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only
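
    The quantity such frameworks estimate has a simple generic form. The sketch below shows that skeleton with a deliberately trivial stand-in "history" (a single exponential time) so the estimator can be checked against a known answer; the coalescent target and proposal of the actual package are not reproduced here.

        import math, random

        def importance_sampled_likelihood(draw_history, target_joint, proposal_density,
                                          theta, n_draws=10000, seed=4):
            """L(theta) ~= (1/N) * sum_i p(data, H_i | theta) / q(H_i)."""
            rng = random.Random(seed)
            total = 0.0
            for _ in range(n_draws):
                history = draw_history(rng, theta)
                total += target_joint(history, theta) / proposal_density(history, theta)
            return total / n_draws

        # Deliberately trivial stand-in "history": one exponential coalescence time,
        # proposed from a different rate than the target so the weights vary.
        draw = lambda rng, th: rng.expovariate(1.0)            # proposal q: Exp(1)
        q_pdf = lambda t, th: math.exp(-t)
        p_joint = lambda t, th: th * math.exp(-th * t)         # target: Exp(theta)
        print(importance_sampled_likelihood(draw, p_joint, q_pdf, theta=2.0))  # ~1.0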

  20. Coalescent: an open-science framework for importance sampling in coalescent theory

    PubMed Central

    Spouge, John L.

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  1. Coalescent: an open-science framework for importance sampling in coalescent theory.

    PubMed

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  2. Method for sampling sub-micron particles

    DOEpatents

    Gay, Don D.; McMillan, William G.

    1985-01-01

    Apparatus and method steps for collecting sub-micron sized particles include a collection chamber and cryogenic cooling. The cooling is accomplished by coil tubing carrying nitrogen in liquid form, with the liquid nitrogen changing to the gas phase before exiting from the collection chamber in the tubing. Standard filters are used to filter out particles of diameter greater than or equal to 0.3 microns; however, the present invention is used to trap particles of less than 0.3 micron in diameter. A blower draws air to said collection chamber through a filter which filters particles with diameters greater than or equal to 0.3 micron. The air is then cryogenically cooled so that moisture and sub-micron sized particles in the air condense into ice on the coil. The coil is then heated so that the ice melts, and the liquid is then drawn off and passed through a Buchner funnel where the liquid is passed through a Nuclepore membrane. A vacuum draws the liquid through the Nuclepore membrane, with the Nuclepore membrane trapping sub-micron sized particles therein. The Nuclepore membrane is then covered on its top and bottom surfaces with sheets of Mylar.RTM. and the assembly is then crushed into a pellet. This effectively traps the sub-micron sized particles for later analysis.

  3. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W [West Richland, WA; Wise, Barry M [Manson, WA

    2002-01-01

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  4. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W.; Wise, Barry M.

    2003-08-12

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  5. Field evaluation of endotoxin air sampling assay methods.

    PubMed

    Thorne, P S; Reynolds, S J; Milton, D K; Bloebaum, P D; Zhang, X; Whitten, P; Burmeister, L F

    1997-11-01

    This study tested the importance of filter media, extraction and assay protocol, and bioaerosol source on the determination of endotoxin under field conditions in swine and poultry confinement buildings. Multiple simultaneous air samples were collected using glass fiber (GF) and polycarbonate (PC) filters, and these were assayed using two methods in two separate laboratories: an endpoint chromogenic Limulus amebocyte lysate (LAL) assay (QCL) performed in water and a kinetic chromogenic LAL assay (KQCL) performed in buffer with resistant-parallel line estimation analysis (KLARE). In addition, two aqueous filter extraction methods were compared in the QCL assay: 120 min extraction at 22 degrees C with vigorous shaking and 30 min extraction at 68 degrees C with gentle rocking. These extraction methods yielded endotoxin activities that were not significantly different and were very highly correlated. Reproducibility of endotoxin determinations from duplicate air sampling filters was very high (Cronbach alpha all > 0.94). When analyzed by the QCL method, GF filters yielded significantly higher endotoxin activity than PC filters. QCL and KLARE methods gave similar estimates for endotoxin activity from PC filters; however, GF filters analyzed by the QCL method yielded significantly higher endotoxin activity estimates, suggesting enhancement of the QCL assay or inhibition of the KLARE assay with GF filters. Correlation between QCL-GF and QCL-PC was high (r = 0.98) while that between KLARE-GF and KLARE-PC was moderate (r = 0.68). Analysis of variance demonstrated that assay methodology, filter type, barn type, and interactions between assay and filter type and between assay and barn type were important factors influencing endotoxin exposure assessment.

  6. Evaluation of sample preservation methods for poultry manure.

    PubMed

    Pan, J; Fadel, J G; Zhang, R; El-Mashad, H M; Ying, Y; Rumsey, T

    2009-08-01

    When poultry manure is collected but cannot be analyzed immediately, a method for storing the manure is needed to ensure accurate subsequent analyses. This study has 3 objectives: (1) to investigate effects of 4 poultry manure sample preservation methods (refrigeration, freezing, acidification, and freeze-drying) on the compositional characteristics of poultry manure; (2) to determine compositional differences in fresh manure with manure samples at 1, 2, and 3 d of accumulation under bird cages; and (3) to assess the influence of 14-d freezing storage on the composition of manure when later exposed to 25 degrees C for 7 d as compared with fresh manure. All manure samples were collected from a layer house. Analyses performed on the manure samples included total Kjeldahl nitrogen, uric acid nitrogen, ammonia nitrogen, and urea nitrogen. In experiment 1, the storage methods most similar to fresh manure, in order of preference, were freezing, freeze-drying, acidification, and refrigeration. Thoroughly mixing manure samples and compressing them to 2 to 3 mm is important for the freezing and freeze-dried samples. In general, refrigeration was found unacceptable for nitrogen analyses. A significant effect (P < 0.0001) of time for refrigeration was found on uric acid nitrogen and ammonia nitrogen. In experiment 2, the total Kjeldahl nitrogen and uric acid nitrogen were significantly lower (P < 0.05) for 1, 2, and 3 d of accumulation compared with fresh manure. Manure after 1, 2, and 3 d of accumulation had similar nitrogen compositions. The results from experiment 3 show that nitrogen components from fresh manure samples and thawed samples from 14 d of freezing are similar at 7 d but high variability of nitrogen compositions during intermediate times from 0 to 7 d prevents the recommendation of freezing manure for use in subsequent experiments and warrants future experimentation. In conclusion, fresh poultry manure can be frozen for accurate subsequent nitrogen

  7. Evaluation of sample preservation methods for poultry manure.

    PubMed

    Pan, J; Fadel, J G; Zhang, R; El-Mashad, H M; Ying, Y; Rumsey, T

    2009-08-01

    When poultry manure is collected but cannot be analyzed immediately, a method for storing the manure is needed to ensure accurate subsequent analyses. This study has 3 objectives: (1) to investigate effects of 4 poultry manure sample preservation methods (refrigeration, freezing, acidification, and freeze-drying) on the compositional characteristics of poultry manure; (2) to determine compositional differences in fresh manure with manure samples at 1, 2, and 3 d of accumulation under bird cages; and (3) to assess the influence of 14-d freezing storage on the composition of manure when later exposed to 25 degrees C for 7 d as compared with fresh manure. All manure samples were collected from a layer house. Analyses performed on the manure samples included total Kjeldahl nitrogen, uric acid nitrogen, ammonia nitrogen, and urea nitrogen. In experiment 1, the storage methods most similar to fresh manure, in order of preference, were freezing, freeze-drying, acidification, and refrigeration. Thoroughly mixing manure samples and compressing them to 2 to 3 mm is important for the freezing and freeze-dried samples. In general, refrigeration was found unacceptable for nitrogen analyses. A significant effect (P < 0.0001) of time for refrigeration was found on uric acid nitrogen and ammonia nitrogen. In experiment 2, the total Kjeldahl nitrogen and uric acid nitrogen were significantly lower (P < 0.05) for 1, 2, and 3 d of accumulation compared with fresh manure. Manure after 1, 2, and 3 d of accumulation had similar nitrogen compositions. The results from experiment 3 show that nitrogen components from fresh manure samples and thawed samples from 14 d of freezing are similar at 7 d but high variability of nitrogen compositions during intermediate times from 0 to 7 d prevents the recommendation of freezing manure for use in subsequent experiments and warrants future experimentation. In conclusion, fresh poultry manure can be frozen for accurate subsequent nitrogen

  8. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  9. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  10. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  11. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  12. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling....

  13. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling...

  14. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 2 2014-04-01 2014-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling....

  15. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling....

  16. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of the monitoring of sediment processes is unquestionable: the sediment balance of regulated rivers has suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to possible pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly being researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less well known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to the outdated methodology and poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at different expert forums, that the sediment balance of the river Danube has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station density and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  17. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of the monitoring of sediment processes is unquestionable: the sediment balance of regulated rivers has suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to possible pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly being researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less well known. Unlike the investigation of flow processes, sediment-related research is scarce, which is partly due to the outdated methodology and poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. In relation to the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at different expert forums, that the sediment balance of the river Danube has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station density and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  18. Importance of sites of tracer administration and sampling in turnover studies

    SciTech Connect

    Katz, J.

    1982-01-01

    Our recent studies with tritium- and ¹⁴C-labeled lactate and alanine in starved rats revealed that the sites of tracer administration and sampling have a profound effect on the kinetics of the specific activity curves and the calculation of metabolic parameters. The importance of the sites of tracer administration and sampling for the experimental design and interpretation of tracer data in vivo is discussed.

  19. Analysis of Host–Parasite Incongruence in Papillomavirus Evolution Using Importance Sampling

    PubMed Central

    Shah, Seena D.; Doorbar, John; Goldstein, Richard A.

    2010-01-01

    The papillomaviruses (PVs) are a family of viruses infecting several mammalian and nonmammalian species that cause cervical cancer in humans. The evolutionary history of the PVs, as they became associated with a wide range of host species, is not well understood. Incongruities between the phylogenetic trees of various viral genes as well as between these genes and the host phylogenies suggest historical viral recombination as well as violations of strict virus–host cospeciation. The extent of recombination events among PVs is uncertain, however, and there is little evidence to support a theory of PV spread via recent host transfers. We have investigated incongruence between PV genes, and hence the possibility of recombination, using Bayesian phylogenetic methods. We find significant evidence for phylogenetic incongruence among the six PV genes E1, E2, E6, E7, L1, and L2, indicating substantial recombination. Analysis of E1 and L1 phylogenies suggests ancestral recombination events. We also describe a new method for examining alternative host–parasite association mechanisms by applying importance sampling to Bayesian divergence time estimation. This new approach is not restricted by a fixed viral tree topology or knowledge of viral divergence times, multiple parasite taxa per host may be included, and it can distinguish between prior divergence of the virus before host speciation and host transfer of the virus following speciation. Using this method, we find prior divergence of PV lineages associated with the ancestral mammalian host resulting in at least 6 PV lineages prior to speciation of this host. These PV lineages have then followed paths of prior divergence and cospeciation to eventually become associated with the extant host species. Only one significant instance of host transfer is supported, the transfer of the ancestral L1 gene between a Primate and Hystricognathi host based on the divergence times between the υ human type 41 and porcupine PVs. PMID:20093429
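    As a purely illustrative sketch of the importance-sampling idea used here (not the authors' implementation), the snippet below reweights posterior draws of a divergence time, generated under one prior, so that they reflect an alternative host-association hypothesis; the gamma and normal priors and all numerical values are assumptions chosen for the example.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Hypothetical posterior draws of a viral divergence time (Myr) generated
        # under a reference prior; in practice these would come from a Bayesian
        # phylogenetic sampler.
        times = rng.gamma(shape=4.0, scale=20.0, size=5000)

        reference_prior = lambda t: stats.gamma.pdf(t, a=4.0, scale=20.0)
        # Alternative prior concentrating divergence near a host speciation time
        # of ~90 Myr (illustrative numbers only).
        cospeciation_prior = lambda t: stats.norm.pdf(t, loc=90.0, scale=10.0)

        # Importance weights let the same draws be re-evaluated under the
        # alternative association hypothesis without re-running the sampler.
        w = cospeciation_prior(times) / reference_prior(times)
        w /= w.sum()

        print("reweighted mean divergence time:", np.sum(w * times))
        print("effective sample size:", 1.0 / np.sum(w ** 2))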

  20. Photoacoustic sample vessel and method of elevated pressure operation

    DOEpatents

    Autrey, Tom; Yonker, Clement R.

    2004-05-04

    An improved photoacoustic vessel and method of photoacoustic analysis. The photoacoustic sample vessel comprises an acoustic detector, an acoustic couplant, and an acoustic coupler having a chamber for holding the acoustic couplant and a sample. The acoustic couplant is selected from the group consisting of liquid, solid, and combinations thereof. Passing electromagnetic energy through the sample generates an acoustic signal within the sample, whereby the acoustic signal propagates through the sample to and through the acoustic couplant to the acoustic detector.

  1. Survey of predators and sampling method comparison in sweet corn.

    PubMed

    Musser, Fred R; Nyrop, Jan P; Shelton, Anthony M

    2004-02-01

    Natural predation is an important component of integrated pest management that is often overlooked because it is difficult to quantify and perceived to be unreliable. To begin incorporating natural predation into sweet corn, Zea mays L., pest management, a predator survey was conducted and then three sampling methods were compared for their ability to accurately monitor the most abundant predators. A predator survey on sweet corn foliage in New York between 1999 and 2001 identified 13 species. Orius insidiosus (Say), Coleomegilla maculata (De Geer), and Harmonia axyridis (Pallas) were the most numerous predators in all years. To determine the best method for sampling adult and immature stages of these predators, comparisons were made among nondestructive field counts, destructive counts, and yellow sticky cards. Field counts were correlated with destructive counts for all populations, but field counts of small insects were biased. Sticky cards underrepresented immature populations. Yellow sticky cards were more attractive to C. maculata adults than H. axyridis adults, especially before pollen shed, making coccinellid population estimates based on sticky cards unreliable. Field counts were the most precise method for monitoring adult and immature stages of the three major predators. Future research on predicting predation of pests in sweet corn should be based on field counts of predators because these counts are accurate, have no associated supply costs, and can be made quickly. PMID:14998137

  2. Systems and methods for self-synchronized digital sampling

    NASA Technical Reports Server (NTRS)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
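    The core idea, deriving the ADC sampling rate from the rotational frequency signal so that a fixed number of samples is captured per revolution, can be sketched in a few lines; the 64 samples-per-revolution figure is an arbitrary assumption, not a value from the patent.

        # Minimal sketch: lock the sampling rate to shaft speed so that a fixed
        # number of samples is captured per revolution (synchronous sampling).
        SAMPLES_PER_REV = 64  # assumed value for illustration

        def sampling_rate_hz(rotational_freq_hz: float) -> float:
            """Sampling control value implied by the rotational frequency signal."""
            return SAMPLES_PER_REV * rotational_freq_hz

        for rpm in (600, 1800, 3600):
            f_rot = rpm / 60.0  # revolutions per second
            print(f"{rpm:5d} rpm -> sample at {sampling_rate_hz(f_rot):8.1f} Hz")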

  3. Large Deviations and Importance Sampling for Systems of Slow-Fast Motion

    SciTech Connect

    Spiliopoulos, Konstantinos

    2013-02-15

    In this paper we develop the large deviations principle and a rigorous mathematical framework for asymptotically efficient importance sampling schemes for general, fully dependent systems of stochastic differential equations of slow and fast motion with small noise in the slow component. We assume periodicity with respect to the fast component. Depending on the interaction of the fast scale with the smallness of the noise, we get different behavior. We examine how one range of interaction differs from the other one both for the large deviations and for the importance sampling. We use the large deviations results to identify asymptotically optimal importance sampling schemes in each case. Standard Monte Carlo schemes perform poorly in the small noise limit. In the presence of multiscale aspects one faces additional difficulties and straightforward adaptation of importance sampling schemes for standard small noise diffusions will not produce efficient schemes. It turns out that one has to consider the so called cell problem from the homogenization theory for Hamilton-Jacobi-Bellman equations in order to guarantee asymptotic optimality. We use stochastic control arguments.
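    The advantage of importance sampling over standard Monte Carlo in a small-noise limit can be seen even in a scalar toy problem; the sketch below estimates a rare-event probability for Gaussian noise of variance eps using a mean-shifted (exponentially tilted) proposal. It illustrates only the general principle, not the slow-fast SDE schemes constructed in the paper, and all numbers are assumptions.

        import numpy as np
        from math import erfc, sqrt

        rng = np.random.default_rng(1)
        eps, a, n = 0.05, 1.0, 100_000   # noise size, threshold, number of samples

        # Standard Monte Carlo estimate of P(X > a) with X ~ N(0, eps): for small
        # eps the event is so rare that most runs see no hits at all.
        x = rng.normal(0.0, np.sqrt(eps), n)
        mc_est = np.mean(x > a)

        # Importance sampling with the mean-shifted proposal N(a, eps); the
        # likelihood ratio of N(0, eps) to N(a, eps) is exp(-(a*y - a**2/2)/eps).
        y = rng.normal(a, np.sqrt(eps), n)
        weights = np.exp(-(a * y - 0.5 * a ** 2) / eps)
        is_est = np.mean((y > a) * weights)

        exact = 0.5 * erfc(a / sqrt(2.0 * eps))
        print(f"MC: {mc_est:.2e}  IS: {is_est:.2e}  exact: {exact:.2e}")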

  4. Sampling bee communities using pan traps: alternative methods increase sample size

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  5. Methods for cultivation of luminal parasitic protists of clinical importance.

    PubMed

    Clark, C Graham; Diamond, Louis S

    2002-07-01

    Cultivation of luminal protistan parasites has a long history. In this review we discuss the methods and media that are most widely used for the establishment and maintenance of the following organisms in culture: Entamoeba histolytica, Giardia intestinalis, Trichomonas vaginalis, Dientamoeba fragilis, Blastocystis hominis, and Balantidium coli. While cultivation is of limited importance in the diagnostic laboratory, it is essential to most research laboratories, and it is toward the latter that this review is primarily aimed.

  6. Methods for cultivation of luminal parasitic protists of clinical importance.

    PubMed

    Clark, C Graham; Diamond, Louis S

    2002-07-01

    Cultivation of luminal protistan parasites has a long history. In this review we discuss the methods and media that are most widely used for the establishment and maintenance of the following organisms in culture: Entamoeba histolytica, Giardia intestinalis, Trichomonas vaginalis, Dientamoeba fragilis, Blastocystis hominis, and Balantidium coli. While cultivation is of limited importance in the diagnostic laboratory, it is essential to most research laboratories, and it is toward the latter that this review is primarily aimed. PMID:12097242

  7. Passive Samplers for Investigations of Air Quality: Method Description, Implementation, and Comparison to Alternative Sampling Methods

    EPA Science Inventory

    This Paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The Paper also discusses field sampling and sample analysis considerations to ensu...

  8. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs with different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make the comprehensive evaluation by comparing it with the standard method. The experimental results showed that the convenience of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, the relative accuracy was 99.50-99.89%, the reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method was easy to operate, and the selected samples were distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility and can be put into use in drug analysis. PMID:26525264

  9. Methods for sampling and inorganic analysis of coal

    USGS Publications Warehouse

    Golightly, D. W.; Simon, Frederick Otto

    1989-01-01

    Methods used by the U.S. Geological Survey for the sampling, comminution, and inorganic analysis of coal are summarized in this bulletin. Details, capabilities, and limitations of the methods are presented.

  10. DOE methods for evaluating environmental and waste management samples.

    SciTech Connect

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  11. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    SciTech Connect

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.
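    A minimal, self-contained sketch of the sampling step is given below: self-normalized importance sampling with a fixed two-component Gaussian-mixture proposal for a bimodal target density. The adaptive construction of the mixture and the polynomial chaos surrogate described in the abstract are not reproduced, and the target and proposal are invented for illustration.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)

        def target_pdf(x):
            # Bimodal stand-in for a multimodal posterior (illustration only).
            return 0.5 * stats.norm.pdf(x, -3.0, 0.6) + 0.5 * stats.norm.pdf(x, 3.0, 0.6)

        # Two-component Gaussian-mixture proposal roughly covering both modes;
        # in the paper this mixture is constructed adaptively.
        means, sds, wts = np.array([-3.0, 3.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5])

        def proposal_pdf(x):
            return sum(w * stats.norm.pdf(x, m, s) for w, m, s in zip(wts, means, sds))

        # Draw from the mixture and form self-normalized importance weights.
        comp = rng.choice(2, size=20000, p=wts)
        x = rng.normal(means[comp], sds[comp])
        w = target_pdf(x) / proposal_pdf(x)
        w /= w.sum()

        print("posterior mean estimate:", np.sum(w * x))      # ~0 by symmetry
        print("P(x > 0) estimate:", np.sum(w * (x > 0)))      # ~0.5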

  12. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    SciTech Connect

    Li, Weixuan; Lin, Guang

    2015-08-01

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.

  13. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE PAGES

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.

  14. In-depth analysis of sampling optimization methods

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Kim, Myoungsoo; Habets, Boris; Buhl, Stefan; Guhlemann, Steffen; Rößiger, Martin; Bellmann, Enrico; Kim, Seop

    2016-03-01

    High order overlay and alignment models require good coverage of overlay or alignment marks on the wafer, but dense sampling plans are not possible for throughput reasons. Therefore, sampling plan optimization has become a key issue. We analyze the different methods for sampling optimization and discuss the different knobs for fine-tuning the methods to the constraints of high-volume manufacturing. We propose a method to judge sampling plan quality with respect to overlay performance, run-to-run stability and dispositioning criteria, using a number of use cases from the most advanced lithography processes.

  15. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    2001-01-01

    A method and apparatus for imaging a sample are provided. An electromagnetic radiation source generates excitation radiation which is sized by excitation optics to a line. The line is directed at a sample resting on a support and excites a plurality of regions on the sample. Collection optics collect response radiation reflected from the sample and image the reflected radiation. A detector senses the reflected radiation and is positioned to permit discrimination between radiation reflected from a certain focal plane in the sample and certain other planes within the sample.

  16. Current methods for identifying clinically important cryptic Candida species.

    PubMed

    Criseo, Giuseppe; Scordino, Fabio; Romeo, Orazio

    2015-04-01

    In recent years, the taxonomy of the most important pathogenic Candida species (Candida albicans, Candida parapsilosis and Candida glabrata) has undergone profound changes due to the description of new closely related species. This has resulted in the establishment of cryptic species complexes that are difficult to recognize in clinical diagnostic laboratories. The identification of these novel Candida species seems to be clinically relevant because it is likely that they differ in virulence and drug resistance. Nevertheless, current phenotypic methods are not suitable for accurately distinguishing all the species belonging to a specific cryptic complex, and therefore their recognition still requires molecular methods. Since traditional mycological techniques have not been useful, a number of molecular-based methods have recently been developed. These range from simple PCR-based methods to more sophisticated real-time PCR and/or MALDI-TOF methods. In this article, we review the current methods designed for discriminating among closely related Candida species by highlighting, in particular, the limits of the existing phenotypic tests and the development of rapid and specific molecular tools for their proper identification.

  17. Rapid method for sampling metals for materials identification

    NASA Technical Reports Server (NTRS)

    Higgins, L. E.

    1971-01-01

    Nondamaging process similar to electrochemical machining is useful in obtaining metal samples from places inaccessible to conventional sampling methods or where methods would be hazardous or contaminating to specimens. Process applies to industries where metals or metal alloys play a vital role.

  18. Engineering Study of 500 ML Sample Bottle Transportation Methods

    SciTech Connect

    BOGER, R.M.

    1999-08-25

    This engineering study reviews and evaluates all available methods for transportation of 500-mL grab sample bottles, reviews and evaluates transportation requirements and schedules and analyzes and recommends the most cost-effective method for transporting 500-mL grab sample bottles.

  19. GROUND WATER PURGING AND SAMPLING METHODS: HISTORY VS. HYSTERIA

    EPA Science Inventory

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  20. The Importance of Sample Processing in Analysis of Asbestos Content in Rocks and Soils

    NASA Astrophysics Data System (ADS)

    Neumann, R. D.; Wright, J.

    2012-12-01

    Analysis of asbestos content in rocks and soils using Air Resources Board (ARB) Test Method 435 (M435) involves the processing of samples for subsequent analysis by polarized light microscopy (PLM). The use of different equipment and procedures by commercial laboratories to pulverize rock and soil samples could result in different particle size distributions. It has long been theorized that asbestos-containing samples can be over-pulverized to the point where the particle dimensions of the asbestos no longer meet the required 3:1 length-to-width aspect ratio or the particles become so small that they no longer can be tested for optical characteristics using PLM, where maximum PLM magnification is typically 400X. Recent work has shed some light on this issue. ARB staff conducted an interlaboratory study to investigate variability in preparation and analytical procedures used by laboratories performing M435 analysis. With regard to sample processing, ARB staff found that different pulverization equipment and processing procedures produced powders that have varying particle size distributions. PLM analysis of the finest powders produced by one laboratory showed that all but one of the 12 samples were non-detect or below the PLM reporting limit, in contrast to the other 36 coarser samples from the same field sample processed by three other laboratories, where 21 samples were above the reporting limit. The set of 12 exceptionally fine powder samples produced by the same laboratory was re-analyzed by transmission electron microscopy (TEM) and results showed that these samples contained asbestos above the TEM reporting limit. However, the use of TEM as a stand-alone analytical procedure, usually performed at magnifications between 3,000 and 20,000X, also has its drawbacks because of the minuscule mass of sample that this method examines. The small amount of powder analyzed by TEM may not be representative of the field sample. The actual mass of the sample powder analyzed by

  1. A method and fortran program for quantitative sampling in paleontology

    USGS Publications Warehouse

    Tipper, J.C.

    1976-01-01

    The Unit Sampling Method is a binomial sampling method applicable to the study of fauna preserved in rocks too well cemented to be disaggregated. Preliminary estimates of the probability of detecting each group in a single sampling unit can be converted to estimates of the group's volumetric abundance by means of correction curves obtained by a computer simulation technique. This paper describes the technique and gives the FORTRAN program. ?? 1976.
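    The FORTRAN program itself is not reproduced here, but the flavor of the correction-curve idea can be sketched with a toy Monte Carlo model in which individuals of a group are randomly scattered, so the count in a sampling unit is Poisson; the model and all numbers below are assumptions for illustration only.

        import numpy as np

        rng = np.random.default_rng(3)

        # Toy model: individuals of a faunal group are scattered at random, so the
        # count seen in one sampling unit is Poisson with mean proportional to the
        # group's volumetric abundance; "detection" means at least one individual.
        abundances = np.linspace(0.0, 5.0, 11)   # mean individuals per sampling unit
        trials = 10_000

        for a in abundances:
            detect_prob = (rng.poisson(a, trials) > 0).mean()
            # The simulated curve approaches the analytic 1 - exp(-a); inverting
            # such a curve converts an observed detection frequency back into an
            # abundance estimate.
            print(f"abundance {a:4.2f}  detection prob {detect_prob:5.3f}  analytic {1 - np.exp(-a):5.3f}")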

  2. Fe(3+)-Fe(2+) transformation method: an important antioxidant assay.

    PubMed

    Gülçin, İlhami

    2015-01-01

    If we look at the multitude of varied and interesting reactions that constitute biochemistry and bioorganic chemistry, it is possible to classify a great many as either oxidation or reduction reactions. The reducing agent transfers electrons to another substance and is thus itself oxidized. Because it donates electrons, it is also called an electron donor. Electron donors can also form charge transfer complexes with electron acceptors. Reductants in biochemistry are very diverse, and the reduction of ferric ions (Fe(3+)) to ferrous ions (Fe(2+)) is widely used to assess their reducing power. Different bioanalytical reduction methods are available, such as the Fe(3+)-Fe(2+) reduction method and the ferric reducing antioxidant power (FRAP) assay. In this section, Fe(3+)-Fe(2+) transformation will be discussed. Recently there has been growing interest in research into the role of plant-derived antioxidants in food and human health. The beneficial influence of many foodstuffs and beverages, including fruits, vegetables, tea, coffee, and cacao, on human health has recently been recognized to originate from their antioxidant activity. For this purpose, the method most commonly used for the in vitro determination of the reducing capacity of pure food constituents or plant extracts is the Fe(3+) reducing ability assay. This commonly used reducing power method is reviewed and presented in this study, and the general chemistry underlying the assay is clarified. Hence, this overview provides a basis and rationale for developing standardized antioxidant capacity methods for the food, nutraceutical, and dietary supplement industries. In addition, the most important advantages of this method are identified and highlighted. The chemical principles of the Fe(3+)-Fe(2+) transformation assay are outlined and critically discussed. PMID:25323511

  3. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  4. Improved importance sampling technique for efficient simulation of digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed derivations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these derivations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
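    A simple hedged illustration of the translation (mean-shift) idea is estimating the bit-error probability of a binary antipodal signal in Gaussian noise: the noise density is shifted so that errors occur often and the bias is removed by the likelihood ratio. The amplitude and noise level are arbitrary assumptions, and this is not the paper's exact CIS/IIS formulation.

        import numpy as np
        from math import erfc, sqrt

        rng = np.random.default_rng(4)
        A, sigma, n = 1.0, 0.25, 200_000   # signal amplitude, noise std, trials

        # An error on a "+A" symbol occurs when the noise drives the received
        # sample below the zero threshold, i.e. noise < -A.
        noise = rng.normal(0.0, sigma, n)
        mc_ber = np.mean(noise < -A)       # standard Monte Carlo

        # Importance sampling: translate the noise density to mean -A, then undo
        # the bias with the Gaussian likelihood ratio.
        shift = -A
        biased = rng.normal(shift, sigma, n)
        lr = np.exp(((biased - shift) ** 2 - biased ** 2) / (2.0 * sigma ** 2))
        is_ber = np.mean((biased < -A) * lr)

        exact = 0.5 * erfc(A / (sqrt(2.0) * sigma))
        print(f"MC: {mc_ber:.2e}  IS: {is_ber:.2e}  exact: {exact:.2e}")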

  5. Toxicological importance of human biomonitoring of metallic and metalloid elements in different biological samples.

    PubMed

    Gil, F; Hernández, A F

    2015-06-01

    Human biomonitoring has become an important tool for the assessment of internal doses of metallic and metalloid elements. These elements are of great significance because of their toxic properties and wide distribution in environmental compartments. Although blood and urine are the most widely used and accepted matrices for human biomonitoring, other non-conventional samples (saliva, placenta, meconium, hair, nails, teeth, breast milk) may have practical advantages and would provide additional information on health risk. Nevertheless, the analysis of these elements in biological matrices other than blood and urine has not yet been accepted as a useful tool for biomonitoring. The validation of analytical procedures is absolutely necessary for a proper implementation of non-conventional samples in biomonitoring programs. However, the lack of reliable and useful analytical methodologies to assess exposure to metallic elements, and the potential interference of external contamination and variation in biological features of non-conventional samples, are important limitations for setting health-based reference values. The influence of potential confounding factors on metallic concentration should always be considered. More research is needed to ascertain whether or not non-conventional matrices offer definitive advantages over the traditional samples and to broaden the available database for establishing worldwide accepted reference values in non-exposed populations.

  6. [Current methods for preparing samples on working with hematology analyzers].

    PubMed

    Tsyganova, A V; Pogorelov, V M; Naumova, I N; Kozinets, G I; Antonov, V S

    2011-03-01

    The paper raises the problem of preparing samples in hematology. It considers whether the preanalytical stage is of importance in hematological studies. The use of disposable vacuum blood collection systems is shown to solve the problem of standardizing the blood sampling procedure. The benefits of closed-tube hematology analyzers are also considered. PMID:21584966

  7. Application of the SAMR method to high magnetostrictive samples

    NASA Astrophysics Data System (ADS)

    Sanchez, P.; Lopez, E.; Trujillo, M. C. Sanchez; Aroca, C.

    1988-12-01

    Magnetostriction measurement using the small angle magnetization rotation (SAMR) method has been performed on highly magnetostrictive amorphous samples. To apply the SAMR method to these samples, a theoretical model of the influence of the internal stresses and magnetization distribution has been proposed. The dependence of the magnetostriction, λs, on temperature and applied stress was measured in as-cast and in differently annealed samples. In the as-cast samples the existence of a stray field and a dependence of λs on the applied stress have been observed.

  8. A modified Monte Carlo 'local importance function transform' method

    SciTech Connect

    Keady, K. P.; Larsen, E. W.

    2013-07-01

    The Local Importance Function Transform (LIFT) method uses an approximation of the contribution transport problem to bias a forward Monte-Carlo (MC) source-detector simulation [1-3]. Local (cell-based) biasing parameters are calculated from an inexpensive deterministic adjoint solution and used to modify the physics of the forward transport simulation. In this research, we have developed a new expression for the LIFT biasing parameter, which depends on a cell-average adjoint current to scalar flux (J*/φ*) ratio. This biasing parameter differs significantly from the original expression, which uses adjoint cell-edge scalar fluxes to construct a finite difference estimate of the flux derivative; the resulting biasing parameters exhibit spikes in magnitude at material discontinuities, causing the original LIFT method to lose efficiency in problems with high spatial heterogeneity. The new J*/φ* expression, while more expensive to obtain, generates biasing parameters that vary smoothly across the spatial domain. The result is an improvement in simulation efficiency. A representative test problem has been developed and analyzed to demonstrate the advantage of the updated biasing parameter expression with regards to solution figure of merit (FOM). For reference, the two variants of the LIFT method are compared to a similar variance reduction method developed by Depinay [4, 5], as well as MC with deterministic adjoint weight windows (WW). (authors)
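    The contrast between the two parameter definitions can be illustrated with made-up numbers (these arrays are not from an actual transport calculation): a finite-difference estimate of the adjoint flux derivative spikes at a material interface, while a cell-average current-to-flux ratio varies more smoothly.

        import numpy as np

        # Hypothetical 1-D adjoint quantities on a uniform mesh (illustration only):
        # cell-edge adjoint scalar fluxes with a jump at a material interface, and
        # smoothly varying cell-average adjoint currents.
        dx = 1.0
        phi_edge = np.array([1.0, 1.3, 1.7, 6.0, 7.2, 8.1])
        phi_avg = 0.5 * (phi_edge[:-1] + phi_edge[1:])
        J_avg = np.array([0.10, 0.12, 0.15, 0.17, 0.18])

        # Finite-difference style parameter (original LIFT flavor): spikes where
        # the edge fluxes jump.
        fd_param = np.diff(phi_edge) / dx / phi_avg

        # Ratio-style parameter built from cell-average current over flux: varies
        # smoothly across the discontinuity.
        ratio_param = J_avg / phi_avg

        print("finite-difference parameter:", np.round(fd_param, 3))
        print("current/flux ratio parameter:", np.round(ratio_param, 3))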

  9. Method for using polarization gating to measure a scattering sample

    DOEpatents

    Baba, Justin S.

    2015-08-04

    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.
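    The quantity extracted from the exiting beam, the degree of polarization, reduces to a one-line formula in terms of Stokes parameters; the sketch below uses invented Stokes vectors and does not model the patent's optical train.

        import numpy as np

        def degree_of_polarization(I, Q, U, V):
            """DOP = sqrt(Q^2 + U^2 + V^2) / I for a Stokes vector (I, Q, U, V)."""
            return np.sqrt(Q ** 2 + U ** 2 + V ** 2) / I

        # Illustrative values: light exiting a weakly vs. a strongly scattering sample.
        print(degree_of_polarization(1.0, 0.85, 0.10, 0.02))   # largely polarized
        print(degree_of_polarization(1.0, 0.15, 0.05, 0.01))   # heavily depolarized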

  10. The Importance of Meteorite Collections to Sample Return Missions: Past, Present, and Future Considerations

    NASA Technical Reports Server (NTRS)

    Welzenbach, L. C.; McCoy, T. J.; Glavin, D. P.; Dworkin, J. P.; Abell, P. A.

    2012-01-01

    turn led to a new wave of Mars exploration that ultimately could lead to sample return focused on evidence for past or present life. This partnership between collections and missions will be increasingly important in the coming decades as we discover new questions to be addressed and identify targets for both robotic and human exploration. Nowhere is this more true than in the ultimate search for the abiotic and biotic processes that produced life. Existing collections also provide the essential materials for developing and testing new analytical schemes to detect the rare markers of life and distinguish them from abiotic processes. Large collections of meteorites and the new types being identified within these collections, which come to us at a fraction of the cost of a sample return mission, will continue to shape the objectives of future missions and provide new ways of interpreting returned samples.

  11. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments.
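    The simulation idea can be sketched compactly: scatter behavior episodes at random over an observation period and compare momentary time sampling (MTS), partial-interval recording (PIR), and whole-interval recording (WIR) against the true fraction of time occupied. The episode counts, durations, and interval length below are arbitrary assumptions, not the parameters of the published study.

        import numpy as np

        rng = np.random.default_rng(5)
        T, interval = 600.0, 10.0          # observation period (s), interval length (s)

        # Random behavior episodes (start, end), merged so they do not overlap.
        starts = np.sort(rng.uniform(0, T, 30))
        episodes = []
        for s, d in zip(starts, rng.uniform(2, 10, 30)):
            e = min(s + d, T)
            if episodes and s <= episodes[-1][1]:
                episodes[-1] = (episodes[-1][0], max(episodes[-1][1], e))
            else:
                episodes.append((s, e))

        def occupied(t0, t1):
            """Fraction of [t0, t1) during which the behavior occurs."""
            overlap = sum(max(0.0, min(e, t1) - max(s, t0)) for s, e in episodes)
            return overlap / (t1 - t0)

        edges = np.arange(0.0, T, interval)
        true_frac = occupied(0.0, T)
        mts = np.mean([occupied(t, t + 1e-6) > 0 for t in edges])             # sample the instant at interval onset
        pir = np.mean([occupied(t, t + interval) > 0 for t in edges])          # any occurrence in the interval
        wir = np.mean([occupied(t, t + interval) > 1 - 1e-9 for t in edges])   # behavior fills the whole interval

        print(f"true {true_frac:.2f}  MTS {mts:.2f}  PIR {pir:.2f}  WIR {wir:.2f}")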

  12. [Recent advances in sample preparation methods of plant hormones].

    PubMed

    Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng

    2014-04-01

    Plant hormones are a group of naturally occurring trace substances which play a crucial role in controlling plant development, growth and environmental response. With the development of chromatography and mass spectrometry techniques, chromatographic analysis has become a widely used approach for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones, namely acidic and basic plant hormones, brassinosteroids, and plant polypeptides, the sample preparation methods are reviewed in sequence, with emphasis on recently developed methods. The review covers novel methods, devices, extractive materials and derivatization reagents for sample preparation in phytohormone analysis, including related work from our group. Finally, future developments in this field are discussed.

  13. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
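    One of the techniques named, the statistical tolerance interval, can be sketched for the normal case using Howe's approximation for the two-sided factor. This assumes normality and is only an illustration of how a small sample yields a deliberately wide, conservative interval intended to bound the central 95% of the population; it is not the report's implementation.

        import numpy as np
        from scipy import stats

        def normal_tolerance_interval(sample, coverage=0.95, confidence=0.95):
            """Two-sided normal tolerance interval via Howe's approximation."""
            n = len(sample)
            nu = n - 1
            zp = stats.norm.ppf(0.5 + coverage / 2.0)
            chi2 = stats.chi2.ppf(1.0 - confidence, nu)
            k = zp * np.sqrt(nu * (1.0 + 1.0 / n) / chi2)
            m, s = np.mean(sample), np.std(sample, ddof=1)
            return m - k * s, m + k * s

        rng = np.random.default_rng(6)
        sparse_sample = rng.normal(10.0, 2.0, size=8)   # only eight observations
        lo, hi = normal_tolerance_interval(sparse_sample)
        # The interval is intentionally wide: it aims to bound the 0.025-0.975
        # range of the underlying population with 95% confidence despite sparse data.
        print(f"tolerance interval: ({lo:.2f}, {hi:.2f})")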

  14. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background: Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods: We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results: This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions: Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
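    A stripped-down version of the point-generation step might look like the sketch below, which draws uniform random survey points inside rectangular strata; the stratum names and bounding boxes are invented, and a real application would use digitized study-area polygons in GIS rather than simple rectangles.

        import random

        random.seed(7)

        # Hypothetical strata as (min_lon, min_lat, max_lon, max_lat) bounding boxes;
        # a real survey would use digitized community boundaries in a GIS instead.
        strata = {
            "community_A": (-90.60, 14.80, -90.55, 14.84),
            "community_B": (-90.52, 14.78, -90.47, 14.82),
        }

        def random_points(bbox, n):
            """Draw n uniform random survey points (lat, lon) inside a bounding box."""
            min_lon, min_lat, max_lon, max_lat = bbox
            return [(random.uniform(min_lat, max_lat), random.uniform(min_lon, max_lon))
                    for _ in range(n)]

        # Allocate 10 points per stratum; field teams would then navigate to the
        # household nearest each point using a GPS receiver.
        for name, bbox in strata.items():
            for lat, lon in random_points(bbox, 10):
                print(f"{name}: {lat:.5f}, {lon:.5f}")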

  15. A LITERATURE REVIEW OF WIPE SAMPLING METHODS FOR CHEMICAL WARFARE AGENTS AND TOXIC INDUSTRIAL CHEMICALS

    EPA Science Inventory

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, we...

  16. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
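    The 'basic' two-stage idea can be sketched as a small simulation: score half the planned sample, stop early if the interim prevalence is clearly below or above the decision thresholds, and otherwise score the remainder. The sample size and thresholds below are placeholders, not the values used in the study.

        import numpy as np

        rng = np.random.default_rng(8)

        def two_stage_classification(true_prevalence, n_full=60, low=0.10, high=0.30):
            """Basic two-stage scheme (sample size and thresholds are illustrative)."""
            stage1 = rng.random(n_full // 2) < true_prevalence   # lameness scores, stage 1
            p1 = stage1.mean()
            if p1 < low:                     # clearly acceptable: stop early
                return "pass", n_full // 2
            if p1 > high:                    # clearly unacceptable: stop early
                return "fail", n_full // 2
            stage2 = rng.random(n_full - n_full // 2) < true_prevalence
            p = np.concatenate([stage1, stage2]).mean()
            return ("fail" if p > 0.20 else "pass"), n_full

        for prevalence in (0.05, 0.18, 0.40):
            outcome, cows_scored = two_stage_classification(prevalence)
            print(f"true prevalence {prevalence:.2f}: {outcome} after scoring {cows_scored} cows")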

  17. Clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile cultivated from stool samples of hospitalized patients

    PubMed Central

    Predrag, Stojanovic; Branislava, Kocic; Miodrag, Stojanovic; Biljana, Miljkovic – Selimovic; Suzana, Tasic; Natasa, Miladinovic – Tasic; Tatjana, Babic

    2012-01-01

    The aim of this study was to establish the clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile isolated from stool samples of hospitalized patients. This survey included 80 hospitalized patients with diarrhea and positive findings of Clostridium difficile in stool samples, and 100 hospitalized patients with formed stool as a control group. Bacteriological examination of the stool samples was conducted using standard microbiological methods. Stool samples were inoculated directly on nutrient media for bacterial cultivation (blood agar using 5% sheep blood, Endo agar, selective Salmonella Shigella agar, Selenite-F broth, CIN agar and Skirrow’s medium), and to selective cycloserine-cefoxitin-fructose agar (CCFA) (Biomedics, Parg qe tehnicologico, Madrid, Spain) for isolation of Clostridium difficile. Clostridium difficile toxin was detected by ELISA-ridascreen Clostridium difficile Toxin A/B (R-Biopharm AG, Germany) and ColorPAC ToxinA test (Becton Dickinson, USA). Examination of stool specimens for the presence of parasites (causing diarrhea) was done using standard methods (conventional microscopy), commercial concentration test Paraprep S Gold kit (Dia Mondial, France) and RIDA®QUICK Cryptosporidium/Giardia Combi test (R-Biopharm AG, Germany). Examination of stool specimens for the presence of fungi (causing diarrhea) was performed by standard methods. All stool samples positive for Clostridium difficile were tested for Rota, Noro, Astro and Adeno viruses by ELISA – ridascreen (R-Biopharm AG, Germany). In this research we isolated 99 Clostridium difficile strains from 116 stool samples of 80 hospitalized patients with diarrhea. Fifty-three (66.25%) of the patients with diarrhea were positive for toxins A and B, and one (1.25%) was positive for toxin B only. Non-toxigenic Clostridium difficile was isolated from samples of 26 (32.5%) patients. However, other pathogenic microorganisms of the intestinal tract were cultivated from samples of 16 patients

  18. Clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile cultivated from stool samples of hospitalized patients.

    PubMed

    Predrag, Stojanovic; Branislava, Kocic; Miodrag, Stojanovic; Biljana, Miljkovic-Selimovic; Suzana, Tasic; Natasa, Miladinovic-Tasic; Tatjana, Babic

    2012-01-01

    The aim of this study was to establish the clinical importance and representation of toxigenic and non-toxigenic Clostridium difficile isolated from stool samples of hospitalized patients. This survey included 80 hospitalized patients with diarrhea and positive findings of Clostridium difficile in stool samples, and 100 hospitalized patients with formed stool as a control group. Bacteriological examination of the stool samples was conducted using standard microbiological methods. Stool samples were inoculated directly onto nutrient media for bacterial cultivation (blood agar using 5% sheep blood, Endo agar, selective Salmonella Shigella agar, Selenite-F broth, CIN agar and Skirrow's medium), and onto selective cycloserine-cefoxitin-fructose agar (CCFA) (Biomedics, Parque Tecnológico, Madrid, Spain) for isolation of Clostridium difficile. Clostridium difficile toxin was detected by ELISA-ridascreen Clostridium difficile Toxin A/B (R-Biopharm AG, Germany) and the ColorPAC ToxinA test (Becton Dickinson, USA). Examination of stool specimens for the presence of parasites (causing diarrhea) was done using standard methods (conventional microscopy), the commercial concentration test Paraprep S Gold kit (Dia Mondial, France) and the RIDA®QUICK Cryptosporidium/Giardia Combi test (R-Biopharm AG, Germany). Examination of stool specimens for the presence of fungi (causing diarrhea) was performed by standard methods. All stool samples positive for Clostridium difficile were tested for Rota, Noro, Astro and Adeno viruses by ELISA-ridascreen (R-Biopharm AG, Germany). In this research we isolated 99 Clostridium difficile strains from 116 stool samples of 80 hospitalized patients with diarrhea. Fifty-three (66.25%) of the patients with diarrhea were positive for toxins A and B, and one (1.25%) was positive for toxin B only. Non-toxigenic Clostridium difficile was isolated from samples of 26 (32.5%) patients. However, other pathogenic microorganisms of the intestinal tract were cultivated from the samples of 16 patients

  19. Method and sample spinning apparatus for measuring the NMR spectrum of an orientationally disordered sample

    DOEpatents

    Pines, Alexander; Samoson, Ago

    1990-01-01

    An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus spins the sample about an axis. The angle of the axis is mechanically varied such that the time averages of two or more Legendre polynomials are zero.

  20. Methods for collection and analysis of water samples

    USGS Publications Warehouse

    Rainwater, Frank Hays; Thatcher, Leland Lincoln

    1960-01-01

    This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.

  1. Nominal Weights Mean Equating: A Method for Very Small Samples

    ERIC Educational Resources Information Center

    Babcock, Ben; Albano, Anthony; Raymond, Mark

    2012-01-01

    The authors introduced nominal weights mean equating, a simplified version of Tucker equating, as an alternative for dealing with very small samples. The authors then conducted three simulation studies to compare nominal weights mean equating to six other equating methods under the nonequivalent groups anchor test design with sample sizes of 20,…

  2. Isolation of Legionella from water samples using various culture methods.

    PubMed

    Kusnetsov, J M; Jousimies-Somer, H R; Nevalainen, A I; Martikainen, P J

    1994-02-01

    The efficacies of a non-selective medium and two selective media were compared for the isolation of legionellas from water samples. The effect of acid wash treatment for decontamination of the water samples on the isolation frequency of legionellas was also studied. The 236 samples were taken from cooling, humidifying and drinking water systems; 21% were legionella-positive when inoculated directly on modified Wadowsky-Yee (MWY) medium and 26% were positive when concentrated (×200) before cultivation on MWY or CCVC media. Inoculation on MWY medium after concentration followed by decontamination by the acid-wash technique gave the highest isolation frequency (31%). The lowest frequency (8%) was found with the non-selective BCYE alpha medium. An isolation frequency of 28% was achieved with the BCYE alpha medium after concentration and acid-wash treatment of the samples. Forty per cent of the samples were positive for legionellas when the results from all the culture methods were combined. Not all the legionella-positive samples were identified by a single culture method. Ninety-three of the 95 positive samples were detected with the two best combinations of three culture methods. The best culture method for detecting legionellas depended on the source of the water sample. Some water quality characteristics, like temperature and organic matter content, affected the isolation frequency of Legionella spp.

  3. A cryopreservation method for Pasteurella multocida from wetland samples

    USGS Publications Warehouse

    Moore, Melody K.; Shadduck, D.J.; Goldberg, D.R.; Samuel, M.D.

    1998-01-01

    A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.

  4. Communication: importance sampling including path correlation in semiclassical initial value representation calculations for time correlation functions.

    PubMed

    Pan, Feng; Tao, Guohua

    2013-03-01

    Full semiclassical (SC) initial value representation (IVR) for time correlation functions involves a double phase space average over a set of two phase points, each of which evolves along a classical path. Conventionally, the two initial phase points are sampled independently for all degrees of freedom (DOF) in the Monte Carlo procedure. Here, we present an efficient importance sampling scheme by including the path correlation between the two initial phase points for the bath DOF, which greatly improves the performance of the SC-IVR calculations for large molecular systems. Satisfactory convergence in the study of quantum coherence in vibrational relaxation has been achieved for a benchmark system-bath model with up to 21 DOF.

  5. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    PubMed

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group or the group is concerned that making its population public would bring social stigma, we say the population is hidden. It is difficult to approach this kind of population survey methodologically because the response rate is low and its members are not quite honest with their responses when probability sampling is used. The only alternative known to address the problems caused by previous methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain, and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the influence of the chain-referral sampling in RDS tends to diminish as the sample gets bigger, and the estimates stabilize as the waves progress. Therefore, it shows that the final sample information can be completely independent from the initial seeds if a sufficient sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered as an alternative which can improve upon both key informant sampling and ethnographic surveys, and it needs to be utilized for various cases domestically as well. PMID:26107223

  6. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    PubMed

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group or the group is concerned that making its population public would bring social stigma, we say the population is hidden. It is difficult to approach this kind of population survey methodologically because the response rate is low and its members are not quite honest with their responses when probability sampling is used. The only alternative known to address the problems caused by previous methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain, and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the influence of the chain-referral sampling in RDS tends to diminish as the sample gets bigger, and the estimates stabilize as the waves progress. Therefore, it shows that the final sample information can be completely independent from the initial seeds if a sufficient sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered as an alternative which can improve upon both key informant sampling and ethnographic surveys, and it needs to be utilized for various cases domestically as well.
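
    The Markov-chain picture behind RDS can be made concrete with a toy simulation. The sketch below is a schematic illustration only: the random contact network, the 30% trait prevalence, the five all-trait convenience seeds, and the three coupons per recruit are assumptions, not the design used in the study. It shows the composition of each recruitment wave drifting away from the biased seeds toward the population value as the waves progress.

```python
import random
from collections import deque

def simulate_rds(n=2000, mean_degree=8, coupons=3, target=500, seed=0):
    """Toy chain-referral (RDS-style) recruitment on a random contact network."""
    rng = random.Random(seed)
    # Build a simple Erdos-Renyi contact network (illustrative assumption).
    p_edge = mean_degree / (n - 1)
    nbrs = [set() for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p_edge:
                nbrs[i].add(j)
                nbrs[j].add(i)
    trait = [1 if rng.random() < 0.3 else 0 for _ in range(n)]
    # Seeds chosen by 'convenience': all carry the trait, so wave 0 is biased.
    seeds = rng.sample([i for i in range(n) if trait[i] == 1], 5)
    sampled = set(seeds)
    queue = deque((s, 0) for s in seeds)
    per_wave = {}
    while queue and len(sampled) < target:
        node, wave = queue.popleft()
        per_wave.setdefault(wave, []).append(trait[node])
        unrecruited = [v for v in nbrs[node] if v not in sampled]
        for v in rng.sample(unrecruited, min(coupons, len(unrecruited))):
            sampled.add(v)
            queue.append((v, wave + 1))
    for wave in sorted(per_wave):
        share = sum(per_wave[wave]) / len(per_wave[wave])
        print(f"wave {wave}: recruits {len(per_wave[wave]):3d}, trait share {share:.2f}")

simulate_rds()
```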

  7. Methods of human body odor sampling: the effect of freezing.

    PubMed

    Lenochova, Pavlina; Roberts, S Craig; Havlicek, Jan

    2009-02-01

    Body odor sampling is an essential tool in human chemical ecology research. However, methodologies of individual studies vary widely in terms of sampling material, length of sampling, and sample processing. Although these differences might have a critical impact on results obtained, almost no studies test validity of current methods. Here, we focused on the effect of freezing samples between collection and use in experiments involving body odor perception. In 2 experiments, we tested whether axillary odors were perceived differently by raters when presented fresh or having been frozen and whether several freeze-thaw cycles affected sample quality. In the first experiment, samples were frozen for 2 weeks, 1 month, or 4 months. We found no differences in ratings of pleasantness, attractiveness, or masculinity between fresh and frozen samples. Similarly, almost no differences between repeatedly thawed and fresh samples were found. We found some variations in intensity; however, this was unrelated to length of storage. The second experiment tested differences between fresh samples and those frozen for 6 months. Again no differences in subjective ratings were observed. These results suggest that freezing has no significant effect on perceived odor hedonicity and that samples can be reliably used after storage for relatively long periods.

  8. Soil separator and sampler and method of sampling

    DOEpatents

    O'Brien, Barry H [Idaho Falls, ID; Ritter, Paul D [Idaho Falls, ID

    2010-02-16

    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  9. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    1996-01-01

    The present invention provides methods and systems for detecting a labeled marker on a sample located on a support. The imaging system comprises a body for immobilizing the support, an excitation radiation source and excitation optics to generate and direct the excitation radiation at the sample. In response, labeled material on the sample emits radiation which has a wavelength that is different from the excitation wavelength, which radiation is collected by collection optics and imaged onto a detector which generates an image of the sample.

  10. System and method for measuring fluorescence of a sample

    SciTech Connect

    Riot, Vincent J

    2015-03-24

    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal-amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.

  11. DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  12. Convenient mounting method for electrical measurements of thin samples

    NASA Technical Reports Server (NTRS)

    Matus, L. G.; Summers, R. L.

    1986-01-01

    A method for mounting thin samples for electrical measurements is described. The technique is based on a vacuum chuck concept in which the vacuum chuck simultaneously holds the sample and establishes electrical contact. The mounting plate is composed of a glass-ceramic insulating material and the surfaces of the plate and vacuum chuck are polished. The operation of the vacuum chuck is examined. The contacts on the sample and mounting plate, which are sputter-deposited through metal masks, are analyzed. The mounting method was utilized for van der Pauw measurements.

  13. Comparison of surface sampling methods for virus recovery from fomites.

    PubMed

    Julian, Timothy R; Tamayo, Francisco J; Leckie, James O; Boehm, Alexandria B

    2011-10-01

    The role of fomites in infectious disease transmission relative to other exposure routes is difficult to discern due, in part, to the lack of information on the level and distribution of virus contamination on surfaces. Comparisons of studies intending to fill this gap are difficult because multiple different sampling methods are employed and authors rarely report their method's lower limit of detection. In the present study, we compare a subset of sampling methods identified from a literature review to demonstrate that sampling method significantly influences study outcomes. We then compare a subset of methods identified from the review to determine the most efficient methods for recovering virus from surfaces in a laboratory trial using MS2 bacteriophage as a model virus. Recoveries of infective MS2 and MS2 RNA are determined using both a plaque assay and quantitative reverse transcription-PCR, respectively. We conclude that the method that most effectively recovers virus from nonporous fomites uses polyester-tipped swabs prewetted in either one-quarter-strength Ringer's solution or saline solution. This method recovers a median fraction for infective MS2 of 0.40 and for MS2 RNA of 0.07. Use of the proposed method for virus recovery in future fomite sampling studies would provide opportunities to compare findings across multiple studies.

  14. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations

    PubMed Central

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime-coated (dirty) materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  15. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    PubMed

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty.

  16. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    PubMed

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty. PMID:25644630

  17. Molecular characterization of Salmonella enterica serovar Saintpaul isolated from imported seafood, pepper, environmental and clinical samples.

    PubMed

    Akiyama, Tatsuya; Khan, Ashraf A; Cheng, Chorng-Ming; Stefanova, Rossina

    2011-09-01

    A total of 39 Salmonella enterica serovar Saintpaul strains from imported seafood, pepper and from environmental and clinical samples were analyzed for the presence of virulence genes, antibiotic resistance, plasmid and plasmid replicon types. Pulsed-field gel electrophoresis (PFGE) fingerprinting using the XbaI restriction enzyme and plasmid profiling were performed to assess genetic diversity. None of the isolates showed resistance to ampicillin, chloramphenicol, gentamicin, kanamycin, streptomycin, sulfisoxazole, and tetracycline. Seventeen virulence genes were screened for by PCR. All strains were positive for 14 genes (spiA, sifA, invA, spaN, sopE, sipB, iroN, msgA, pagC, orgA, prgH, lpfC, sitC, and tolC) and negative for three genes (spvB, pefA, and cdtB). Twelve strains, including six from clinical samples and six from seafood, carried one or more plasmids. Large plasmids, sized greater than 50 kb, were detected in one clinical and three food isolates. One plasmid was able to be typed as IncI1 by PCR-based replicon typing. There were 25 distinct PFGE-XbaI patterns, clustered into two groups. Cluster A, with 68.5% similarity, consisted mainly of clinical isolates, while Cluster C, with 67.6% similarity, consisted mainly of shrimp isolates from India. Our findings indicated the genetic diversity of S. Saintpaul in clinical samples, imported seafood, and the environment, and that this serotype possesses several virulence genes and plasmids which can cause salmonellosis. PMID:21645810

  18. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  19. Do Women's Voices Provide Cues of the Likelihood of Ovulation? The Importance of Sampling Regime

    PubMed Central

    Fischer, Julia; Semple, Stuart; Fickenscher, Gisela; Jürgens, Rebecca; Kruse, Eberhard; Heistermann, Michael; Amir, Ofer

    2011-01-01

    The human voice provides a rich source of information about individual attributes such as body size, developmental stability and emotional state. Moreover, there is evidence that female voice characteristics change across the menstrual cycle. A previous study reported that women speak with higher fundamental frequency (F0) in the high-fertility compared to the low-fertility phase. To gain further insights into the mechanisms underlying this variation in perceived attractiveness and the relationship between vocal quality and the timing of ovulation, we combined hormone measurements and acoustic analyses, to characterize voice changes on a day-to-day basis throughout the menstrual cycle. Voice characteristics were measured from free speech as well as sustained vowels. In addition, we asked men to rate vocal attractiveness from selected samples. The free speech samples revealed marginally significant variation in F0 with an increase prior to and a distinct drop during ovulation. Overall variation throughout the cycle, however, precluded unequivocal identification of the period with the highest conception risk. The analysis of vowel samples revealed a significant increase in degree of unvoiceness and noise-to-harmonic ratio during menstruation, possibly related to an increase in tissue water content. Neither estrogen nor progestogen levels predicted the observed changes in acoustic characteristics. The perceptual experiments revealed a preference by males for voice samples recorded during the pre-ovulatory period compared to other periods in the cycle. While overall we confirm earlier findings in that women speak with a higher and more variable fundamental frequency just prior to ovulation, the present study highlights the importance of taking the full range of variation into account before drawing conclusions about the value of these cues for the detection of ovulation. PMID:21957453

  20. Beryllium Wipe Sampling (differing methods - differing exposure potentials)

    SciTech Connect

    Kerr, Kent

    2005-03-09

    This research compared three wipe sampling techniques currently used to test for beryllium contamination on room and equipment surfaces in Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling without a wetting agent, with water-moistened wipe materials, and by methanol-moistened wipes. Analysis indicated that methanol-moistened wipe sampling removed about twice as much beryllium/oil-film surface contamination as water-moistened wipes, which removed about twice as much residue as dry wipes. Criteria at 10 CFR 850.30 and .31 were established on unspecified wipe sampling method(s). The results of this study reveal a need to identify criteria-setting method and equivalency factors. As facilities change wipe sampling methods among the three compared in this study, these results may be useful for approximate correlations. Accurate decontamination decision-making depends on the selection of appropriate wetting agents for the types of residues and surfaces. Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents that provide enhanced removal efficiency such as methanol when surface contamination includes oil mist residue.

  1. 76 FR 65165 - Importation of Plants for Planting; Risk-Based Sampling and Inspection Approach and Propagative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-20

    ... Animal and Plant Health Inspection Service Importation of Plants for Planting; Risk-Based Sampling and...-based sampling approach for the inspection of imported plants for planting. In our previous approach, we... risk posed by the plants for planting. The risk-based sampling and inspection approach will allow us...

  2. Determining the relative importance of soil sample locations to predict risk of child lead exposure.

    PubMed

    Zahran, Sammy; Mielke, Howard W; McElmurry, Shawn P; Filippelli, Gabriel M; Laidlaw, Mark A S; Taylor, Mark P

    2013-10-01

    Soil lead in urban neighborhoods is a known predictor of child blood lead levels. In this paper, we address the question of where one ought to concentrate soil sample collection efforts to efficiently predict children at risk of soil Pb exposure. Two extensive data sets are combined, including 5467 surface soil samples collected from 286 census tracts, and geo-referenced blood Pb data for 55,551 children in metropolitan New Orleans, USA. Random intercept least squares, random intercept logistic, and quantile regression results indicate that soils collected within 1 m adjacent to residential streets most reliably predict child blood Pb levels. Regression decomposition results show that residential street soils account for 39.7% of between-neighborhood explained variation, followed by busy street soils (21.97%), open space soils (20.25%), and home foundation soils (18.71%). Just as the age of housing stock is used as a statistical shortcut for child risk of exposure to lead-based paint, our results indicate that one can shortcut the characterization of child risk of exposure to neighborhood soil Pb by concentrating sampling efforts within 1 m of residential and busy streets, while significantly reducing the total costs of collection and analysis. This efficiency gain can help advance proactive upstream, preventive methods of environmental Pb discovery.

  3. Rapid method for the determination of 226Ra in hydraulic fracturing wastewater samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian K.; Warren, Richard A.; McAlister, Daniel R.

    2016-03-24

    A new method that rapidly preconcentrates and measures 226Ra from hydraulic fracturing wastewater samples was developed in the Savannah River Environmental Laboratory. The method improves the quality of 226Ra measurements using gamma spectrometry by providing up to 100x preconcentration of 226Ra from this difficult sample matrix, which contains very high levels of calcium, barium, strontium, magnesium and sodium. The high chemical yield, typically 80-90%, facilitates a low detection limit, important for lower level samples, and indicates method ruggedness. Ba-133 tracer is used to determine chemical yield and correct for geometry-related counting issues. The 226Ra sample preparation takes < 2 hours.

  4. Importance of sampling design and analysis in animal population studies: a comment on Sergio et al

    USGS Publications Warehouse

    Kery, M.; Royle, J. Andrew; Schmid, Hans

    2008-01-01

    1. The use of predators as indicators and umbrellas in conservation has been criticized. In the Trentino region, Sergio et al. (2006; hereafter SEA) counted almost twice as many bird species in quadrats located in raptor territories than in controls. However, SEA detected astonishingly few species. We used contemporary Swiss Breeding Bird Survey data from an adjacent region and a novel statistical model that corrects for overlooked species to estimate the expected number of bird species per quadrat in that region. 2. There are two anomalies in SEA which render their results ambiguous. First, SEA detected on average only 6.8 species, whereas a value of 32 might be expected. Hence, they probably overlooked almost 80% of all species. Secondly, the precision of their mean species counts was greater in two-thirds of cases than in the unlikely case that all quadrats harboured exactly the same number of equally detectable species. This suggests that they detected consistently only a biased, unrepresentative subset of species. 3. Conceptually, expected species counts are the product of true species number and species detectability p. Plenty of factors may affect p, including date, hour, observer, previous knowledge of a site and mobbing behaviour of passerines in the presence of predators. Such differences in p between raptor and control quadrats could have easily created the observed effects. Without a method that corrects for such biases, or without quantitative evidence that species detectability was indeed similar between raptor and control quadrats, the meaning of SEA's counts is hard to evaluate. Therefore, the evidence presented by SEA in favour of raptors as indicator species for enhanced levels of biodiversity remains inconclusive. 4. Synthesis and application. Ecologists should pay greater attention to sampling design and analysis in animal population estimation. Species richness estimation means sampling a community. Samples should be representative for the
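
    The detectability argument in points 2 and 3 can be made concrete with a little arithmetic. If roughly 32 species were truly present per quadrat and the mean count was 6.8, the implied per-species detection probability under equal detectability is about p ≈ 6.8/32 ≈ 0.21. The short simulation below uses only these illustrative numbers to show the kind of counts such incomplete detection would generate.

```python
import random

# Illustrative numbers only: an observed mean of ~6.8 species where ~32 would
# be expected implies a per-species detection probability of roughly
# p = 6.8 / 32 ~= 0.21, if every species is equally detectable.
S_TRUE, P_DETECT, N_QUADRATS = 32, 6.8 / 32, 1000

rng = random.Random(0)
counts = [sum(rng.random() < P_DETECT for _ in range(S_TRUE))
          for _ in range(N_QUADRATS)]
mean = sum(counts) / N_QUADRATS
var = sum((c - mean) ** 2 for c in counts) / (N_QUADRATS - 1)
print(f"mean observed species: {mean:.1f} (true richness {S_TRUE})")
print(f"sampling variance of the observed count: {var:.1f}")
```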

  5. Fluidics platform and method for sample preparation and analysis

    SciTech Connect

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analysis can be performed without user intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  6. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.

    1996-03-26

    Apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  7. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, Gary J.; Motes, Billy G.; Bird, Susan K.; Kotter, Dale K.

    1996-01-01

    Apparatus for obtaining a whole gas sample, composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method of obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant.

  8. Estimation variance bounds of importance sampling simulations in digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
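
    The quantities the authors bound can be illustrated with a textbook importance sampling estimator. The sketch below estimates a small Gaussian tail probability (a stand-in for a bit error rate) by biasing the noise distribution, and computes the empirical estimator variance that such bounds are meant to anticipate; the mean-shift biasing and all parameter values are generic assumptions, not the techniques analyzed in the paper.

```python
import math
import random

def is_tail_probability(threshold=4.0, shift=4.0, n=50_000, seed=1):
    """Estimate P(X > threshold) for X ~ N(0,1) by importance sampling.

    Samples are drawn from the biased density N(shift, 1); each sample is
    weighted by the likelihood ratio f(x)/g(x) = exp(shift**2/2 - shift*x).
    Returns the estimate and the variance of the mean estimator.
    """
    rng = random.Random(seed)
    weighted = []
    for _ in range(n):
        x = rng.gauss(shift, 1.0)                    # draw from the biased density
        w = math.exp(0.5 * shift**2 - shift * x)     # likelihood ratio f/g
        weighted.append(w if x > threshold else 0.0)
    est = sum(weighted) / n
    var = sum((v - est) ** 2 for v in weighted) / (n - 1)
    return est, var / n

est, var_of_mean = is_tail_probability()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2.0))        # true Gaussian tail for comparison
print(f"IS estimate {est:.3e}  (exact {exact:.3e}),  estimator std {math.sqrt(var_of_mean):.1e}")
```

    Minimizing an upper bound on this empirical estimator variance over the biasing parameter (here, the mean shift) is the kind of procedure the bounds described above are intended to support.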

  9. Model reduction algorithms for optimal control and importance sampling of diffusions

    NASA Astrophysics Data System (ADS)

    Hartmann, Carsten; Schütte, Christof; Zhang, Wei

    2016-08-01

    We propose numerical algorithms for solving optimal control and importance sampling problems based on simplified models. The algorithms combine model reduction techniques for multiscale diffusions and stochastic optimization tools, with the aim of reducing the original, possibly high-dimensional problem to a lower dimensional representation of the dynamics, in which only a few relevant degrees of freedom are controlled or biased. Specifically, we study situations in which either a reaction coordinate onto which the dynamics can be projected is known, or situations in which the dynamics shows strongly localized behavior in the small noise regime. No explicit assumptions about small parameters or scale separation have to be made. We illustrate the approach with simple, but paradigmatic numerical examples.

  10. Detecting spatial structures in throughfall data: the effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-04-01

    In the last three decades, an increasing number of studies analyzed spatial patterns in throughfall to investigate the consequences of rainfall redistribution for biogeochemical and hydrological processes in forests. In the majority of cases, variograms were used to characterize the spatial properties of the throughfall data. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and an appropriate layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation methods on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with heavy outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least

  11. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous
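
    The method-of-moments (Matheron) estimator that both records refer to is compact enough to write out. The sketch below bins squared differences of point values by separation distance; the randomly scattered, spatially uncorrelated 'throughfall' values are only an illustrative stand-in (a real analysis would use the measured data and, as the study suggests, possibly robust or likelihood-based estimators instead), so the estimated semivariance should come out roughly flat.

```python
import math
import random

def empirical_variogram(points, values, bin_width=5.0, max_lag=50.0):
    """Matheron's method-of-moments variogram estimator.

    gamma(h) = (1 / (2 * N(h))) * sum over pairs (i, j) whose separation
    falls in the lag bin around h of (z_i - z_j) ** 2.
    """
    nbins = int(max_lag / bin_width)
    sums, counts = [0.0] * nbins, [0] * nbins
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            b = int(d / bin_width)
            if b < nbins:
                sums[b] += (values[i] - values[j]) ** 2
                counts[b] += 1
    return [(b * bin_width + bin_width / 2, s / (2 * c))
            for b, (s, c) in enumerate(zip(sums, counts)) if c > 0]

# Synthetic stand-in data: 150 random locations on a 100 m x 100 m plot with a
# heavy-tailed, spatially uncorrelated 'throughfall' value attached to each.
rng = random.Random(42)
pts = [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(150)]
vals = [10 + rng.gauss(0, 2) + (rng.random() < 0.05) * rng.expovariate(0.2) for _ in pts]
for lag, gamma in empirical_variogram(pts, vals):
    print(f"lag {lag:5.1f} m   gamma {gamma:6.2f}")
```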

  12. Comparison of pigment content of paint samples using spectrometric methods.

    PubMed

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-15

    The aim of the paper was to evaluate the influence of pigment concentration and its distribution in the polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples, one containing a red and the other a green pigment, were prepared. Each set consisted of 13 samples differing gradually in the concentration of pigment. To obtain the sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Colour and white pigments were identified on the basis of Raman spectra. The colours of the samples were compared by Vis spectrometry according to colour theory. It was found that the samples were homogeneous (colour-similarity parameter ΔE < 3). The values of ΔE between neighbouring samples in a set followed a decreasing linear function, and between the first sample and each subsequent one, a logarithmic function.
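
    The colour-similarity parameter ΔE used here is, in its simplest (CIE76) form, the Euclidean distance between two colours in CIELAB space, so ΔE < 3 means the two samples are nearly indistinguishable. The snippet below computes that quantity for two hypothetical L*a*b* readings; the paper does not state which ΔE formula was used, so this is only the most common variant with made-up coordinates.

```python
import math

def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference: Euclidean distance in CIELAB space."""
    return math.dist(lab1, lab2)

# Hypothetical L*, a*, b* coordinates for two neighbouring paint samples.
sample_a = (62.5, 41.0, 18.3)
sample_b = (63.1, 39.7, 18.9)
print(f"dE = {delta_e_cie76(sample_a, sample_b):.2f}")  # < 3 -> visually very similar
```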

  13. Off-axis angular spectrum method with variable sampling interval

    NASA Astrophysics Data System (ADS)

    Kim, Yong-Hae; Byun, Chun-Won; Oh, Himchan; Pi, Jae-Eun; Choi, Ji-Hun; Kim, Gi Heon; Lee, Myung-Lae; Ryu, Hojun; Hwang, Chi-Sun

    2015-08-01

    We proposed a novel off-axis angular spectrum method (ASM) for simulating free space wave propagation with a large shifted destination plane. The off-axis numerical simulation took wave propagation between a parallel source and a destination plane, but a destination plane was shifted from a source plane. The shifted angular spectrum method was proposed for diffraction simulation with a shifted destination plane and satisfied the Nyquist condition for sampling by limiting a bandwidth of a propagation field to avoid an aliasing error due to under sampling. However, the effective sampling number of the shifted ASM decreased when the shifted distance of the destination plane was large which caused a numerical error in the diffraction simulation. To compensate for the decrease of an effective sampling number for the large shifted destination plane, we used a variable sampling interval in a Fourier space to maintain the same effective sampling number independent of the shifted distance of the destination plane. As a result, our proposed off-axis ASM with a variable sampling interval can produce simulation results with high accuracy for nearly every shifted distance of a destination plane when an off-axis angle is less than 75°. We compared the performances of the off-axis ASM using the Chirp Z transform and non-uniform FFT for implementing a variable spatial frequency in a Fourier space.
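
    For reference, the conventional on-axis angular spectrum method that the proposed off-axis variant builds on fits in a dozen lines: Fourier transform the field, multiply by the free-space transfer function H(fx, fy) = exp(i·2π·z·sqrt(1/λ² − fx² − fy²)), and transform back. The NumPy sketch below implements only this textbook baseline with illustrative grid parameters; the shifted spectrum, bandwidth limiting, and variable sampling interval proposed in the paper are not reproduced.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Baseline on-axis angular spectrum propagation of a sampled 2-D field.

    field: complex array sampled with pitch dx [m]; z: propagation distance [m].
    """
    n, m = field.shape
    fx = np.fft.fftfreq(m, d=dx)                       # spatial frequencies [cycles/m]
    fy = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2          # (1/lambda)^2 - fx^2 - fy^2
    kz = 2.0 * np.pi * z * np.sqrt(np.maximum(arg, 0.0))
    H = np.where(arg >= 0.0, np.exp(1j * kz), 0.0)     # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative parameters only: 256 x 256 grid, 10 um pitch, 532 nm wavelength.
aperture = np.zeros((256, 256), dtype=complex)
aperture[96:160, 96:160] = 1.0                         # square aperture
propagated = angular_spectrum_propagate(aperture, 532e-9, 10e-6, z=5e-3)
print(np.abs(propagated).max())
```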

  14. Comparison of pigment content of paint samples using spectrometric methods

    NASA Astrophysics Data System (ADS)

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-01

    The aim of the paper was to evaluate the influence of pigment concentration and its distribution in the polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples, one containing a red and the other a green pigment, were prepared. Each set consisted of 13 samples differing gradually in the concentration of pigment. To obtain the sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Colour and white pigments were identified on the basis of Raman spectra. The colours of the samples were compared by Vis spectrometry according to colour theory. It was found that the samples were homogeneous (colour-similarity parameter ΔE < 3). The values of ΔE between neighbouring samples in a set followed a decreasing linear function, and between the first sample and each subsequent one, a logarithmic function.

  15. RAPID METHOD FOR DETERMINATION OF RADIOSTRONTIUM IN EMERGENCY MILK SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-07-17

    A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that will allow rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins, and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation-exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  16. Prey selection by an apex predator: the importance of sampling uncertainty.

    PubMed

    Davis, Miranda L; Stephens, Philip A; Willis, Stephen G; Bassi, Elena; Marcon, Andrea; Donaggio, Emanuela; Capitani, Claudia; Apollonio, Marco

    2012-01-01

    The impact of predation on prey populations has long been a focus of ecologists, but a firm understanding of the factors influencing prey selection, a key predictor of that impact, remains elusive. High levels of variability observed in prey selection may reflect true differences in the ecology of different communities but might also reflect a failure to deal adequately with uncertainties in the underlying data. Indeed, our review showed that less than 10% of studies of European wolf predation accounted for sampling uncertainty. Here, we relate annual variability in wolf diet to prey availability and examine temporal patterns in prey selection; in particular, we identify how considering uncertainty alters conclusions regarding prey selection. Over nine years, we collected 1,974 wolf scats and conducted drive censuses of ungulates in Alpe di Catenaia, Italy. We bootstrapped scat and census data within years to construct confidence intervals around estimates of prey use, availability and selection. Wolf diet was dominated by boar (61.5 ± 3.90 [SE] % of biomass eaten) and roe deer (33.7 ± 3.61%). Temporal patterns of prey densities revealed that the proportion of roe deer in wolf diet peaked when boar densities were low, not when roe deer densities were highest. Considering only the two dominant prey types, Manly's standardized selection index using all data across years indicated selection for boar (mean = 0.73 ± 0.023). However, sampling error resulted in wide confidence intervals around estimates of prey selection. Thus, despite considerable variation in yearly estimates, confidence intervals for all years overlapped. Failing to consider such uncertainty could lead erroneously to the assumption of differences in prey selection among years. This study highlights the importance of considering temporal variation in relative prey availability and accounting for sampling uncertainty when interpreting the results of dietary studies. PMID:23110122

  17. Prey selection by an apex predator: the importance of sampling uncertainty.

    PubMed

    Davis, Miranda L; Stephens, Philip A; Willis, Stephen G; Bassi, Elena; Marcon, Andrea; Donaggio, Emanuela; Capitani, Claudia; Apollonio, Marco

    2012-01-01

    The impact of predation on prey populations has long been a focus of ecologists, but a firm understanding of the factors influencing prey selection, a key predictor of that impact, remains elusive. High levels of variability observed in prey selection may reflect true differences in the ecology of different communities but might also reflect a failure to deal adequately with uncertainties in the underlying data. Indeed, our review showed that less than 10% of studies of European wolf predation accounted for sampling uncertainty. Here, we relate annual variability in wolf diet to prey availability and examine temporal patterns in prey selection; in particular, we identify how considering uncertainty alters conclusions regarding prey selection. Over nine years, we collected 1,974 wolf scats and conducted drive censuses of ungulates in Alpe di Catenaia, Italy. We bootstrapped scat and census data within years to construct confidence intervals around estimates of prey use, availability and selection. Wolf diet was dominated by boar (61.5 ± 3.90 [SE] % of biomass eaten) and roe deer (33.7 ± 3.61%). Temporal patterns of prey densities revealed that the proportion of roe deer in wolf diet peaked when boar densities were low, not when roe deer densities were highest. Considering only the two dominant prey types, Manly's standardized selection index using all data across years indicated selection for boar (mean = 0.73 ± 0.023). However, sampling error resulted in wide confidence intervals around estimates of prey selection. Thus, despite considerable variation in yearly estimates, confidence intervals for all years overlapped. Failing to consider such uncertainty could lead erroneously to the assumption of differences in prey selection among years. This study highlights the importance of considering temporal variation in relative prey availability and accounting for sampling uncertainty when interpreting the results of dietary studies.
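
    Manly's standardized selection index and the bootstrap confidence intervals described above are straightforward to reproduce in outline: for prey type i the index is (u_i/a_i) / Σ_j (u_j/a_j), where u is the proportion used (here, of scats) and a the proportion available (from the census). The sketch below uses invented counts for two prey types purely to show the computation; it is not the authors' data or resampling code.

```python
import random

def manly_index(used, available):
    """Manly's standardized selection index for each prey type.

    used, available: dicts of counts (e.g. scats containing each prey type
    and individuals counted in the census).  Values are illustrative only.
    """
    u_tot, a_tot = sum(used.values()), sum(available.values())
    ratios = {k: (used[k] / u_tot) / (available[k] / a_tot) for k in used}
    s = sum(ratios.values())
    return {k: r / s for k, r in ratios.items()}

def bootstrap_ci(used, available, n_boot=2000, seed=0):
    """Percentile bootstrap CI for the boar index, resampling both data sets."""
    rng = random.Random(seed)
    scats = [k for k, n in used.items() for _ in range(n)]
    census = [k for k, n in available.items() for _ in range(n)]
    stats = []
    for _ in range(n_boot):
        bu = {k: 0 for k in used}
        ba = {k: 1 for k in available}      # start at 1 to avoid empty categories
        for k in rng.choices(scats, k=len(scats)):
            bu[k] += 1
        for k in rng.choices(census, k=len(census)):
            ba[k] += 1
        stats.append(manly_index(bu, ba)["boar"])
    stats.sort()
    return stats[int(0.025 * n_boot)], stats[int(0.975 * n_boot)]

# Hypothetical example counts, not the study's data.
used = {"boar": 120, "roe deer": 75}
available = {"boar": 60, "roe deer": 140}
print(manly_index(used, available))
print(bootstrap_ci(used, available))
```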

  18. Sampling efficacy for the red imported fire ant Solenopsis invicta (Hymenoptera: Formicidae).

    PubMed

    Stringer, Lloyd D; Suckling, David Maxwell; Baird, David; Vander Meer, Robert K; Christian, Sheree J; Lester, Philip J

    2011-10-01

    Cost-effective detection of invasive ant colonies before establishment in new ranges is imperative for the protection of national borders and reducing their global impact. We examined the sampling efficiency of food baits and pitfall traps (baited and nonbaited) in detecting isolated red imported fire ant (Solenopsis invicta Buren) nests in multiple environments in Gainesville, FL. Fire ants demonstrated a significantly higher preference for a mixed protein food type (hotdog or ground meat combined with sweet peanut butter) than for the sugar or water baits offered. Foraging distance success was a function of colony size, detection trap used, and surveillance duration. Colony gyne number did not influence detection success. Workers from small nests (0- to 15-cm mound diameter) traveled no more than 3 m to a food source, whereas large colonies (>30-cm mound diameter) traveled up to 17 m. Baited pitfall traps performed best at detecting incipient ant colonies, followed by nonbaited pitfall traps and then food baits, whereas food baits performed well when trying to detect large colonies. These results were used to create an interactive model in Microsoft Excel, whereby surveillance managers can alter trap type, density, and duration parameters to estimate the probability of detecting specified or unknown S. invicta colony sizes. This model will support decision makers who need to balance the sampling cost and risk of failure to detect fire ant colonies.
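
    The Excel decision-support model described above reduces, in outline, to a geometric probability: a colony is detected if at least one trap lies within its size-dependent foraging radius. The sketch below reimplements that reasoning for a square grid of traps with an all-or-nothing detection rule; the radii are taken loosely from the 3 m and 17 m distances reported above, while the grid geometry and detection rule are simplifying assumptions rather than the authors' fitted model.

```python
import random

def detection_probability(trap_spacing_m, foraging_radius_m, sims=20_000, seed=0):
    """P(at least one trap within foraging range) for a colony placed
    uniformly at random inside one cell of a square trap grid."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(sims):
        # Colony location within a grid cell; the nearest traps sit at the
        # four cell corners, so checking those is sufficient.
        x, y = rng.uniform(0, trap_spacing_m), rng.uniform(0, trap_spacing_m)
        corners = [(0, 0), (trap_spacing_m, 0),
                   (0, trap_spacing_m), (trap_spacing_m, trap_spacing_m)]
        if any((x - cx) ** 2 + (y - cy) ** 2 <= foraging_radius_m ** 2
               for cx, cy in corners):
            hits += 1
    return hits / sims

# Illustrative radii: ~3 m for small nests, ~17 m for large ones (see above).
for spacing in (10, 20, 30):
    print(spacing,
          round(detection_probability(spacing, 3), 2),
          round(detection_probability(spacing, 17), 2))
```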

  19. A method to optimize sampling locations for measuring indoor air distributions

    NASA Astrophysics Data System (ADS)

    Huang, Yan; Shen, Xiong; Li, Jianmin; Li, Bingye; Duan, Ran; Lin, Chao-Hsin; Liu, Junjie; Chen, Qingyan

    2015-02-01

    Indoor air distributions, such as the distributions of air temperature, air velocity, and contaminant concentrations, are very important to occupants' health and comfort in enclosed spaces. When point data are collected for interpolation to form field distributions, the sampling locations (the locations of the point sensors) have a significant effect on the time invested, the labor costs, and the accuracy of the interpolated field. This investigation compared two different methods for determining sampling locations: the grid method and the gradient-based method. The two methods were applied to obtain point air parameter data in an office room and in a section of an economy-class aircraft cabin. The point data obtained were then interpolated to form field distributions by the ordinary Kriging method. Our error analysis shows that the gradient-based sampling method yields 32.6% smaller interpolation error than the grid sampling method. We also derived the relationship between interpolation error and sampling size (the number of sampling points); according to this relationship, the sampling size has an optimal value, and the maximum useful sampling size is determined by the sensor and system errors. This study recommends the gradient-based sampling method for measuring indoor air distributions.
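
    A toy version of this comparison can be run on a synthetic field. The sketch below is only illustrative: it substitutes linear interpolation (scipy's griddata) for the ordinary Kriging used in the paper and a simple gradient-magnitude-weighted random draw for the paper's gradient-based placement; the field, sample size, and printed errors are hypothetical.

```python
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Hypothetical reference field standing in for a measured air-temperature map.
x = np.linspace(0, 1, 80)
y = np.linspace(0, 1, 80)
X, Y = np.meshgrid(x, y)
field = np.tanh(8 * (X - 0.5)) + 0.3 * np.sin(4 * np.pi * Y)  # sharp front plus waves

def rms_error(sample_idx):
    """Interpolate from the chosen sample points and compare with the full field."""
    pts = np.column_stack([X.ravel()[sample_idx], Y.ravel()[sample_idx]])
    vals = field.ravel()[sample_idx]
    est = griddata(pts, vals, (X, Y), method="linear")
    mask = ~np.isnan(est)                 # ignore points outside the convex hull
    return np.sqrt(np.mean((est[mask] - field[mask]) ** 2))

n_samples = 64

# (a) Regular-grid sampling: every 10th node in each direction (8 x 8 points).
grid_mask = np.zeros_like(field, dtype=bool)
grid_mask[::10, ::10] = True
grid_idx = np.flatnonzero(grid_mask)

# (b) Gradient-based sampling: draw nodes with probability proportional to |grad field|.
gy, gx = np.gradient(field)
weights = np.hypot(gx, gy).ravel() + 1e-6
grad_idx = rng.choice(field.size, size=n_samples, replace=False,
                      p=weights / weights.sum())

print(f"grid sampling RMSE:     {rms_error(grid_idx):.3f}")
print(f"gradient sampling RMSE: {rms_error(grad_idx):.3f}")
```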

  20. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... independent laboratory shall also include with the retained sample the test result for benzene as...

  1. NEW COLUMN SEPARATION METHOD FOR EMERGENCY URINE SAMPLES

    SciTech Connect

    Maxwell, S; Culligan, B

    2007-08-28

    The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.

  2. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    NASA Technical Reports Server (NTRS)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
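
    The abstract does not spell out the estimator, so the sketch below shows one standard way to exploit cheap sensitivity derivatives for variance reduction: subtract the first-order Taylor expansion, whose mean is known in closed form for Gaussian inputs, as a control variate. The output function, input statistics, and sample size are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical "analysis code": a scalar response of two uncertain inputs.
def f(x):
    return np.sin(x[..., 0]) + 0.5 * x[..., 1] ** 2

def grad_f(x0):
    # Sensitivity derivatives at the nominal input; in practice these would come
    # cheaply from the analysis code itself (e.g. an adjoint solve).
    return np.array([np.cos(x0[0]), x0[1]])

mu = np.array([0.3, 1.0])       # input means
sigma = np.array([0.2, 0.1])    # input standard deviations
n = 2_000
x = mu + sigma * rng.standard_normal((n, 2))

# Plain Monte Carlo estimate of E[f(X)].
plain = f(x).mean()

# Control-variate estimate: the linear Taylor term has mean f(mu) because
# E[X - mu] = 0, so subtract it sample-wise and add f(mu) back analytically.
g = grad_f(mu)
linear = f(mu) + (x - mu) @ g
enhanced = (f(x) - linear).mean() + f(mu)

print(f"plain Monte Carlo estimate:   {plain:.4f}")
print(f"derivative-enhanced estimate: {enhanced:.4f}")
```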

  3. RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-08-27

    The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2008. A new rapid column separation method was used for analysis of actinides and Sr-90 in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved, less than 3 hours for determination of Sr-90 and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times from NRIP 2007 and a ~100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and Sr-90 analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples and rugged methods are essential. Extremely high levels of Po-210 were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced Po-210 removal step, which will be described.

  4. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-01

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  5. Exposure to airborne allergens: a review of sampling methods.

    PubMed

    Renström, Anne

    2002-10-01

    A number of methods are used to assess exposure to high-molecular weight allergens. In the occupational setting, airborne dust is often collected on filters using pumps, the filters are eluted and allergen content in the eluate analysed using immunoassays. Collecting inhalable dust using person-carried pumps may be considered the gold standard. Other allergen sampling methods are available. Recently, a method that collects nasally inhaled dust on adhesive surfaces within nasal samplers has been developed. Allergen content can be analysed in eluates using sensitive enzyme immunoassays, or allergen-bearing particles can be immunostained using antibodies, and studied under the microscope. Settling airborne dust can be collected in petri dishes, a cheap and simple method that has been utilised in large-scale exposure studies. Collection of reservoir dust from surfaces using vacuum cleaners with a dust collector is commonly used to measure pet or mite allergens in homes. The sampling methods differ in properties and relevance to personal allergen exposure. Since methods for all steps from sampling to analysis differ between laboratories, determining occupational exposure limits for protein allergens is today unfeasible. A general standardisation of methods is needed.

  6. The impact of particle size selective sampling methods on occupational assessment of airborne beryllium particulates.

    PubMed

    Sleeth, Darrah K

    2013-05-01

    In 2010, the American Conference of Governmental Industrial Hygienists (ACGIH) formally changed its Threshold Limit Value (TLV) for beryllium from a 'total' particulate sample to an inhalable particulate sample. This change may have important implications for workplace air sampling of beryllium. A history of particle size-selective sampling methods, with a special focus on beryllium, will be provided. The current state of the science on inhalable sampling will also be presented, including a look to the future at what new methods or technology may be on the horizon. This includes new sampling criteria focused on particle deposition in the lung, proposed changes to the existing inhalable convention, as well as how the issues facing beryllium sampling may help drive other changes in sampling technology.

  7. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... include with the retained sample the test result for benzene as conducted pursuant to § 80.46(e). (b... sample the test result for benzene as conducted pursuant to § 80.47....

  8. 40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Practice for Manual Sampling of Petroleum and Petroleum Products.” (ii) Samples collected under the... present that could affect the sulfur test result. (2) Automatic sampling of petroleum products in..., entitled “Standard Practice for Automatic Sampling of Petroleum and Petroleum Products.” (c) Test...

  9. Method and apparatus for sampling low-yield wells

    DOEpatents

    Last, George V.; Lanigan, David C.

    2003-04-15

    An apparatus and method for collecting a sample from a low-yield well or perched aquifer includes a pump and a controller responsive to water level sensors for filling a sample reservoir. The controller activates the pump to fill the reservoir when the water level in the well reaches a high level as indicated by the sensors. The controller deactivates the pump when the water level reaches a lower level as indicated by the sensors. The controller continuously activates and deactivates the pump until the sample reservoir is filled with a desired volume, as indicated by a reservoir sensor. At the beginning of each activation cycle, the controller optionally can select to purge an initial quantity of water prior to filling the sample reservoir. The reservoir can be substantially devoid of air and the pump is a low volumetric flow rate pump. Both the pump and the reservoir can be located either inside or outside the well.
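
    The fill cycle described above is essentially a hysteresis loop. The class below is a toy sketch of that logic, not the patented device: the sensor arguments are hypothetical callables returning True/False, and the pump object is assumed to expose on(), off(), and purge() methods.

```python
import time

class LowYieldWellController:
    """Toy sketch of the hysteresis fill cycle: pump while water remains above the
    low-level sensor, pause and wait for recharge otherwise, until the sample
    reservoir is full. All hardware interfaces here are hypothetical stand-ins."""

    def __init__(self, high_sensor, low_sensor, reservoir_full, pump, purge_each_cycle=False):
        self.high = high_sensor       # True when the well has recharged to the high level
        self.low = low_sensor         # True while water remains above the low level
        self.full = reservoir_full    # True once the reservoir holds the target volume
        self.pump = pump
        self.purge_each_cycle = purge_each_cycle

    def run(self, poll_seconds=1.0):
        while not self.full():
            if self.high():                        # start an activation cycle
                if self.purge_each_cycle:
                    self.pump.purge()              # optionally discard an initial slug of water
                self.pump.on()
                while not self.full() and self.low():
                    time.sleep(poll_seconds)       # pump until the low-level sensor trips
                self.pump.off()
            time.sleep(poll_seconds)               # wait for the well to recharge
```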

  10. Sampling methods for monitoring changes in gonococcal populations.

    PubMed Central

    Bindayna, K. M.; Ison, C. A.

    1989-01-01

    A total of 160 consecutive isolates of Neisseria gonorrhoeae was collected over a 3-month period. They were tested for their susceptibility to penicillin, erythromycin and spectinomycin and the auxotype and the serotype determined. We have evaluated two sampling methods, the collection of every fifth isolate and the first 20 isolates (10 male and 10 female) each month, to determine whether either is representative of the total population. There was no significant difference between either method of sampling and the total for detecting the predominant auxotypes and serovars or the distributions in antibiotic susceptibility. It is possible to monitor major changes in a gonococcal population, particularly susceptibility to antibiotics, using a sample of the total population. PMID:2528473
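
    The two subsampling schemes evaluated above (every fifth consecutive isolate, and the first 10 male plus 10 female isolates per month) are easy to express directly; the records and susceptibility rate below are simulated placeholders, not the study's data.

```python
import random

random.seed(3)

# Hypothetical isolate records: (month, sex, penicillin_susceptible) for 160 isolates.
isolates = [(month, random.choice("MF"), random.random() < 0.8)
            for month in (1, 2, 3) for _ in range(54)][:160]

# Scheme 1: every fifth consecutive isolate.
every_fifth = isolates[::5]

# Scheme 2: the first 10 male and first 10 female isolates from each month.
first_20 = []
for month in (1, 2, 3):
    for sex in "MF":
        first_20 += [r for r in isolates if r[0] == month and r[1] == sex][:10]

def susceptible_fraction(sample):
    return sum(r[2] for r in sample) / len(sample)

for name, sample in [("all isolates", isolates),
                     ("every 5th", every_fifth),
                     ("first 20/month", first_20)]:
    print(f"{name:>15}: n = {len(sample):3d}, susceptible = {susceptible_fraction(sample):.2f}")
```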

  11. A New GP Recombination Method Using Random Tree Sampling

    NASA Astrophysics Data System (ADS)

    Tanji, Makoto; Iba, Hitoshi

    We propose a new program evolution method named PORTS (Program Optimization by Random Tree Sampling) which is motivated by the idea of preservation and control of tree fragments in GP (Genetic Programming). We assume that to recombine genetic materials efficiently, tree fragments of any size should be preserved into the next generation. PORTS samples tree fragments and concatenates them by traversing and transitioning between promising trees instead of using subtree crossover and mutation. Because the size of a fragment preserved during a generation update follows a geometric distribution, merits of the method are that it is relatively easy to predict the behavior of tree fragments over time and to control the sampling size by changing a single parameter. From experimental results on the RoyalTree, Symbolic Regression and 6-Multiplexer problems, we observed that the performance of PORTS is competitive with Simple GP. Furthermore, the average node size of optimal solutions obtained by PORTS was smaller than that of Simple GP.
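
    The abstract notes that preserved fragment size follows a geometric distribution governed by a single parameter. The snippet below is not the PORTS implementation; it only illustrates how one such parameter (here a hypothetical per-node stopping probability) controls the expected fragment size.

```python
import numpy as np

rng = np.random.default_rng(7)

# If fragment growth stops at each node with probability p, the preserved fragment
# size is geometric with mean 1/p, so a single parameter tunes how much genetic
# material is carried intact into the next generation.
for p in (0.5, 0.2, 0.05):
    sizes = rng.geometric(p, size=100_000)
    print(f"stop probability p = {p:<4}  mean fragment size ~ {sizes.mean():5.1f}  (theory {1 / p:5.1f})")
```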

  12. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs, researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected were related to relative length of larval and breeding periods and tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  13. Universal nucleic acids sample preparation method for cells, spores and their mixture

    DOEpatents

    Bavykin, Sergei

    2011-01-18

    The present invention relates to a method for extracting nucleic acids from biological samples. More specifically the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e. spores). Unlike prior art methods which are focused on extracting nucleic acids from vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample, isolates, labels, and fragments nucleic acids, and purifies labeled samples from excess dye.

  14. A General Linear Method for Equating with Small Samples

    ERIC Educational Resources Information Center

    Albano, Anthony D.

    2015-01-01

    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…

  15. Periodicity detection method for small-sample time series datasets.

    PubMed

    Tominaga, Daisuke

    2010-01-01

    Time series of gene expression often exhibit periodic behavior under the influence of multiple signal pathways, and are represented by a model that incorporates multiple harmonics and noise. Most of these data, which are observed using DNA microarrays, consist of few sampling points in time, but most periodicity detection methods require a relatively large number of sampling points. We have previously developed a detection algorithm based on the discrete Fourier transform and Akaike's information criterion. Here we demonstrate the performance of the algorithm for small-sample time series data through a comparison with conventional and newly proposed periodicity detection methods based on a statistical analysis of the power of harmonics. We show that this method has higher sensitivity for data consisting of multiple harmonics, and is more robust against noise than other methods. Although "combinatorial explosion" occurs for large datasets, the computational time is not a problem for small-sample datasets. The MATLAB/GNU Octave script of the algorithm is available on the author's web site: http://www.cbrc.jp/%7Etominaga/piccolo/. PMID:21151841
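
    The published algorithm combines the discrete Fourier transform with Akaike's information criterion; the sketch below follows that general recipe on a synthetic 12-point series (fit the k strongest harmonics by least squares, score each k with AIC, keep the minimizer), but it is not the author's MATLAB/Octave implementation and all data are simulated.

```python
import numpy as np

rng = np.random.default_rng(5)

# Short, noisy gene-expression-like series: one dominant period plus noise.
n = 12                                   # few sampling points, as in microarray time series
t = np.arange(n)
y = 1.5 * np.sin(2 * np.pi * t / 6) + 0.4 * rng.standard_normal(n)

def aic_for_k_harmonics(y, k):
    """Least-squares fit of a mean plus the k strongest Fourier harmonics; return AIC."""
    t = np.arange(y.size)
    spectrum = np.abs(np.fft.rfft(y - y.mean()))
    order = np.argsort(spectrum[1:])[::-1] + 1          # harmonic indices, strongest first
    cols = [np.ones_like(t, dtype=float)]
    for j in order[:k]:
        cols += [np.cos(2 * np.pi * j * t / y.size), np.sin(2 * np.pi * j * t / y.size)]
    X = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ coef) ** 2)
    n_params = X.shape[1] + 1                           # regression coefficients + noise variance
    return y.size * np.log(rss / y.size) + 2 * n_params

best_k = min(range(0, 4), key=lambda k: aic_for_k_harmonics(y, k))
print(f"number of harmonics selected by AIC: {best_k}")
```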

  16. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... rail car for import to the U.S., the importer must obtain a copy of the terminal test result that... diesel fuel samples and perform audits. These inspections or audits may be either announced...

  17. Comparison of DNA preservation methods for environmental bacterial community samples

    USGS Publications Warehouse

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.

    2013-01-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  18. Comparison of aquatic macroinvertebrate samples collected using different field methods

    USGS Publications Warehouse

    Lenz, Bernard N.; Miller, Michael A.

    1996-01-01

    Government agencies, academic institutions, and volunteer monitoring groups in the State of Wisconsin collect aquatic macroinvertebrate data to assess water quality. Sampling methods differ among agencies, reflecting the differences in the sampling objectives of each agency. Lack of information about data comparability impedes data sharing among agencies, which can result in duplicated sampling efforts or the underutilization of available information. To address these concerns, comparisons were made of macroinvertebrate samples collected from wadeable streams in Wisconsin by personnel from the U.S. Geological Survey-National Water Quality Assessment Program (USGS-NAWQA), the Wisconsin Department of Natural Resources (WDNR), the U.S. Department of Agriculture-Forest Service (USDA-FS), and volunteers from the Water Action Volunteer-Water Quality Monitoring Program (WAV). This project was part of the Intergovernmental Task Force on Monitoring Water Quality (ITFM) Wisconsin Water Resources Coordination Project. The numbers, types, and environmental tolerances of the organisms collected were analyzed to determine if the four different field methods that were used by the different agencies and volunteer groups provide comparable results. Additionally, this study compared the results of samples taken from different locations and habitats within the same streams.

  19. New Methods of Sample Preparation for Atom Probe Specimens

    NASA Technical Reports Server (NTRS)

    Kuhlman, Kimberly, R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to contain manganese and aluminum. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, to analyze samples of interest for geomicrobiology, and for the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples: many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. The dicing saw, commonly used to section semiconductor wafers into individual devices following manufacture, is a time-effective method for preparing high-aspect-ratio posts of poorly conducting materials. Femtosecond laser micromachining is also suitable for the preparation of posts. The FIB time required is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated using the dicing saw.

  20. A method for sampling microbial aerosols using high altitude balloons.

    PubMed

    Bryan, N C; Stewart, M; Granger, D; Guzik, T G; Christner, B C

    2014-12-01

    Owing to the challenges posed to microbial aerosol sampling at high altitudes, very little is known about the abundance, diversity, and extent of microbial taxa in the Earth-atmosphere system. To directly address this knowledge gap, we designed, constructed, and tested a system that passively samples aerosols during ascent through the atmosphere while tethered to a helium-filled latex sounding balloon. The sampling payload is ~2.7 kg and comprises an electronics box and three sampling chambers (one serving as a procedural control). Each chamber is sealed with retractable doors that can be commanded to open and close at designated altitudes. The payload is deployed together with radio beacons that transmit GPS coordinates (latitude, longitude and altitude) in real time for tracking and recovery. A cut mechanism separates the payload string from the balloon at any desired altitude, returning all equipment safely to the ground on a parachute. When the chambers are opened, aerosol sampling is performed using the Rotorod® collection method (40 rods per chamber), with each rod passing through 0.035 m3 per km of altitude sampled. Based on quality control measurements, the collection of ~100 cells per rod provided a 3-sigma confidence level of detection. The payload system described can be mated with any type of balloon platform and provides a tool for characterizing the vertical distribution of microorganisms in the troposphere and stratosphere. PMID:25455021
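
    The sampling-volume figures quoted above translate directly into a minimum detectable airborne concentration; the short calculation below just applies those quoted numbers (0.035 m3 per rod per km of ascent, ~100 cells per rod for detection) to a few illustrative ascent heights.

```python
# Figures quoted in the abstract; the derived concentrations are simple arithmetic.
volume_per_rod_per_km = 0.035   # m^3 of air swept by one rod per km of ascent
detection_limit_cells = 100     # cells per rod needed for 3-sigma detection

for ascent_km in (10, 20, 30):
    volume_m3 = volume_per_rod_per_km * ascent_km
    min_concentration = detection_limit_cells / volume_m3
    print(f"{ascent_km:>2} km ascent: {volume_m3:.2f} m^3 per rod -> "
          f"detectable above ~{min_concentration:.0f} cells per m^3")
```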

  1. Genomic DNA microextraction: a method to screen numerous samples.

    PubMed

    Ramírez-Solis, R; Rivera-Pérez, J; Wallace, J D; Wims, M; Zheng, H; Bradley, A

    1992-03-01

    Many experimental designs require the analysis of genomic DNA from a large number of samples. Although the polymerase chain reaction (PCR) can be used, the Southern blot is preferred for many assays because of its inherent reliability. The rapid acceptance of PCR, despite a significant rate of false positive/negative results, is partly due to the disadvantages of the sample preparation process for Southern blot analysis. We have devised a rapid protocol to extract high-molecular-weight genomic DNA from a large number of samples. It involves the use of a single 96-well tissue culture dish to carry out all the steps of the sample preparation. This, coupled with the use of a multichannel pipette, facilitates the simultaneous analysis of multiple samples. The procedure may be automated since no centrifugation, mixing, or transferring of the samples is necessary. The method has been used to screen embryonic stem cell clones for the presence of targeted mutations at the Hox-2.6 locus and to obtain data from human blood.

  2. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    SciTech Connect

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  3. On the Importance of Accounting for Competing Risks in Pediatric Brain Cancer: II. Regression Modeling and Sample Size

    SciTech Connect

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-03-15

    Purpose: To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. Methods and Materials: We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. Results: The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Conclusions: Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest.
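
    As a rough illustration of the cause-specific approach described above, a standard Cox model can be fitted after treating competing events as censoring; the sketch below uses the third-party lifelines package on simulated data with hypothetical column names. The subdistribution (Fine-Gray) model generally requires dedicated tools (e.g. the cmprsk package in R) and is not shown.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # assumes lifelines is installed

rng = np.random.default_rng(11)

# Simulated trial data: time to first event, event type (0 = censored,
# 1 = radiotherapy after progression, 2-3 = competing events), tumour group.
n = 300
df = pd.DataFrame({
    "time": rng.exponential(24, size=n),
    "event_type": rng.choice([0, 1, 2, 3], size=n, p=[0.3, 0.35, 0.2, 0.15]),
    "ependymoma": rng.integers(0, 2, size=n),
})

# Cause-specific hazard for Event A: competing events are treated as censored.
df["event_A"] = (df["event_type"] == 1).astype(int)

cph = CoxPHFitter()
cph.fit(df[["time", "event_A", "ependymoma"]], duration_col="time", event_col="event_A")
cph.print_summary()
```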

  4. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention...

  5. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention...

  6. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention...

  7. Field Methods and Sample Collection Techniques for the Surveillance of West Nile Virus in Avian Hosts.

    PubMed

    Wheeler, Sarah S; Boyce, Walter M; Reisen, William K

    2016-01-01

    Avian hosts play an important role in the spread, maintenance, and amplification of West Nile virus (WNV). Avian susceptibility to WNV varies from species to species thus surveillance efforts can focus both on birds that survive infection and those that succumb. Here we describe methods for the collection and sampling of live birds for WNV antibodies or viremia, and methods for the sampling of dead birds. Target species and study design considerations are discussed. PMID:27188560

  8. 40 CFR 80.1644 - Sampling and testing requirements for producers and importers of certified ethanol denaturant.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... producers and importers of certified ethanol denaturant. 80.1644 Section 80.1644 Protection of Environment... ethanol denaturant. (a) Sample and test each batch of certified ethanol denaturant. (1) Producers and importers of certified ethanol denaturant shall collect a representative sample from each batch of...

  9. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY SOIL SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.; Noyes, G.

    2009-11-09

    A new rapid method for the determination of actinides in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for samples up to 2 grams in emergency response situations. The actinides in soil method utilizes a rapid sodium hydroxide fusion method, a lanthanum fluoride soil matrix removal step, and a streamlined column separation process with stacked TEVA, TRU and DGA Resin cartridges. Lanthanum was separated rapidly and effectively from Am and Cm on DGA Resin. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha sources are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency soil samples received in the NRIP Emergency Response exercise administered by the National Institute for Standards and Technology (NIST) in April, 2009. The actinides in soil results were reported within 4-5 hours with excellent quality.

  10. Comparison of clinical samples and methods in chronic cutaneous leishmaniasis.

    PubMed

    Eroglu, Fadime; Uzun, Soner; Koltas, Ismail Soner

    2014-11-01

    This study aimed at finding out the most effective clinical samples and methods in chronic cutaneous leishmaniasis (CCL). Smear, aspiration fluid, and filter paper samples were taken from 104 skin lesions of suspected cases with CCL, and they were compared using microscopic examination, culture, and molecular methods. We characterized four different forms of CCL and identified the causative agents in CCL forms using high-resolution melting curve real-time polymerase chain reaction assay. We observed that smear was detected to be the most sensitive (63.5%) among clinical samples, and real-time polymerase chain reaction method was the most sensitive (96.8%) among the methods used in diagnosis of CCL. We identified 68.8% Leishmania tropica and 31.2% L. infantum in papular lesions, 69.2% L. infantum and 30.8% L. tropica in nodular lesions, 57.9% L. tropica and 42.1% L. major in ulcerating plaque lesions, and 55.5% L. tropica and 44.5% L. major in noduloulcerative lesions in CCL patients.

  11. Spanish Multicenter Normative Studies (NEURONORMA Project): methods and sample characteristics.

    PubMed

    Peña-Casanova, Jordi; Blesa, Rafael; Aguilar, Miquel; Gramunt-Fombuena, Nina; Gómez-Ansón, Beatriz; Oliva, Rafael; Molinuevo, José Luis; Robles, Alfredo; Barquero, María Sagrario; Antúnez, Carmen; Martínez-Parra, Carlos; Frank-García, Anna; Fernández, Manuel; Alfonso, Verónica; Sol, Josep M

    2009-06-01

    This paper describes the methods and sample characteristics of a series of Spanish normative studies (The NEURONORMA project). The primary objective of our research was to collect normative and psychometric information on a sample of people aged over 49 years. The normative information was based on a series of selected, but commonly used, neuropsychological tests covering attention, language, visuo-perceptual abilities, constructional tasks, memory, and executive functions. A sample of 356 community dwelling individuals was studied. Demographics, socio-cultural, and medical data were collected. Cognitive normality was validated via informants and a cognitive screening test. Norms were calculated for midpoint age groups. Effects of age, education, and sex were determined. The use of these norms should improve neuropsychological diagnostic accuracy in older Spanish subjects. These data may also be of considerable use for comparisons with other normative studies. Limitations of these normative data are also commented on.

  12. Inferring the demographic history from DNA sequences: An importance sampling approach based on non-homogeneous processes.

    PubMed

    Ait Kaci Azzou, S; Larribe, F; Froda, S

    2016-10-01

    In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure which extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of non homogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history.
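
    The waiting times mentioned above come from a non-homogeneous Poisson process whose rate is driven by a piecewise-constant population size. The sketch below is not the authors' sampler; it only shows the generic Lewis-Shedler thinning step for a coalescent-style rate λ(t) = k(k-1)/(2N(t)) with made-up epoch boundaries and sizes.

```python
import numpy as np

rng = np.random.default_rng(13)

# Piecewise-constant effective population size N(t), as in the skywis approximation.
breaks = np.array([0.0, 0.5, 1.5])   # epoch start times
sizes = np.array([1.0, 0.2, 2.0])    # N(t) within each epoch (arbitrary units)

def N(t):
    return sizes[np.searchsorted(breaks, t, side="right") - 1]

def coalescent_rate(t, k):
    """Pairwise coalescence rate with k lineages under population size N(t)."""
    return k * (k - 1) / (2.0 * N(t))

def next_event_time(t0, k):
    """Waiting time to the next coalescence via Lewis-Shedler thinning."""
    lam_max = k * (k - 1) / (2.0 * sizes.min())   # upper bound on the rate
    t = t0
    while True:
        t += rng.exponential(1.0 / lam_max)        # candidate event from the bounding process
        if rng.uniform() < coalescent_rate(t, k) / lam_max:
            return t                               # accept with probability lambda(t)/lambda_max

# Simulate coalescence times for a sample of 5 lineages.
t, times = 0.0, []
for k in range(5, 1, -1):
    t = next_event_time(t, k)
    times.append(t)
print("coalescence times:", np.round(times, 3))
```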

  13. Bayesian Methods for Determining the Importance of Effects

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from the Fisher Significance Test and Hypothesis Test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...

  14. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  15. A direct method for e-cigarette aerosol sample collection.

    PubMed

    Olmedo, Pablo; Navas-Acien, Ana; Hess, Catherine; Jarmul, Stephanie; Rule, Ana

    2016-08-01

    E-cigarette use is increasing in populations around the world. Recent evidence has shown that the aerosol produced by e-cigarettes can contain a variety of toxicants. Published studies characterizing toxicants in e-cigarette aerosol have relied on filters, impingers or sorbent tubes, which are methods that require diluting or extracting the sample in a solution during collection. We have developed a collection system that directly condenses e-cigarette aerosol samples for chemical and toxicological analyses. The collection system consists of several cut pipette tips connected with short pieces of tubing. The pipette tip-based collection system can be connected to a peristaltic pump, a vacuum pump, or directly to an e-cigarette user for the e-cigarette aerosol to flow through the system. The pipette tip-based system condenses the aerosol produced by the e-cigarette and collects a liquid sample that is ready for analysis without the need of intermediate extraction solutions. We tested a total of 20 e-cigarettes from 5 different brands commercially available in Maryland. The pipette tip-based collection system condensed between 0.23 and 0.53 mL of post-vaped e-liquid after 150 puffs. The proposed method is highly adaptable, can be used during field work and in experimental settings, and allows collecting aerosol samples from a wide variety of e-cigarette devices, yielding a condensate of what is likely the exact substance being delivered to the lungs.

  17. [Sample preparation methods for chromatographic analysis of organic components in atmospheric particulate matter].

    PubMed

    Hao, Liang; Wu, Dapeng; Guan, Yafeng

    2014-09-01

    The determination of the organic composition of atmospheric particulate matter (PM) is of great importance in understanding how PM affects human health, the environment, climate, and ecosystems. Organic components are also the scientific basis for emission source tracking, PM regulation, and risk management. Therefore, the molecular characterization of the organic fraction of PM has become one of the priority research issues in the field of environmental analysis. Because of the extreme complexity of PM samples, chromatographic methods have been the primary choice. The common procedure for the analysis of organic components in PM includes several steps: sample collection on fiber filters, sample preparation (transforming the sample into a form suitable for chromatographic analysis), and analysis by chromatographic methods. Among these steps, the sample preparation methods largely determine the throughput and the data quality. Solvent extraction methods followed by sample pretreatment (e.g., pre-separation, derivatization, pre-concentration) have long been used for PM sample analysis, whereas thermal desorption methods have mainly been applied to the analysis of non-polar organic components in PM. In this paper, the sample preparation methods used prior to chromatographic analysis of organic components in PM are reviewed comprehensively, and the corresponding merits and limitations of each method are briefly discussed.

  18. Rock sampling. [method for controlling particle size distribution

    NASA Technical Reports Server (NTRS)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  19. Sampling Small Mammals in Southeastern Forests: The Importance of Trapping in Trees

    SciTech Connect

    Loeb, S.C.; Chapman, G.L.; Ridley, T.R.

    1999-01-01

    We investigated the effect of sampling methodology on the richness and abundance of small mammal communities in loblolly pine forests. Trapping in trees using Sherman live traps was included along with routine ground trapping using the same device. Estimates of species richness did not differ among samples in which tree traps were included or excluded. However, diversity indices (Shannon-Wiener, Simpson, Shannon and Brillouin) were strongly affected. The indices were significantly greater when tree samples were included, primarily as a result of flying squirrel captures. Without tree traps, the results suggested that cotton mice dominated the community. We recommend that tree traps be included in sampling.

  20. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists of using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we turn to the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Finally, we note an overly rapid concentration of measure that appears in the quantum state space under this parametrization.
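
    Two of the constructions reviewed above are short enough to show directly: a random state vector from a normalized complex Gaussian vector, and a random density matrix from the Ginibre ensemble, ρ = GG†/Tr(GG†). The sketch below is a generic NumPy version of those textbook recipes, not the article's code.

```python
import numpy as np

rng = np.random.default_rng(21)

def random_pure_state(d):
    """Random state vector: a normalized vector of i.i.d. complex Gaussians."""
    v = rng.standard_normal(d) + 1j * rng.standard_normal(d)
    return v / np.linalg.norm(v)

def random_density_matrix_ginibre(d):
    """Random density matrix via the Ginibre ensemble: rho = G G^dagger / Tr(G G^dagger)."""
    G = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
    rho = G @ G.conj().T
    return rho / np.trace(rho)

psi = random_pure_state(4)
rho = random_density_matrix_ginibre(4)
print("pure-state norm:", round(float(np.linalg.norm(psi)), 6))
print("density-matrix trace:", round(float(np.trace(rho).real), 6))
print("eigenvalues:", np.round(np.linalg.eigvalsh(rho), 4))  # non-negative, sum to 1
```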

  1. Importance of Sample Size for the Estimation of Repeater F Waves in Amyotrophic Lateral Sclerosis

    PubMed Central

    Fang, Jia; Liu, Ming-Sheng; Guan, Yu-Zhou; Cui, Bo; Cui, Li-Ying

    2015-01-01

    Background: In amyotrophic lateral sclerosis (ALS), repeater F waves are increased. Accurate assessment of repeater F waves requires an adequate sample size. Methods: We studied the F waves of left ulnar nerves in ALS patients. Based on the presence or absence of pyramidal signs in the left upper limb, the ALS patients were divided into two groups: One group with pyramidal signs designated as P group and the other without pyramidal signs designated as NP group. The Index repeating neurons (RN) and Index repeater F waves (Freps) were compared among the P, NP and control groups following 20 and 100 stimuli respectively. For each group, the Index RN and Index Freps obtained from 20 and 100 stimuli were compared. Results: In the P group, the Index RN (P = 0.004) and Index Freps (P = 0.001) obtained from 100 stimuli were significantly higher than from 20 stimuli. For F waves obtained from 20 stimuli, no significant differences were identified between the P and NP groups for Index RN (P = 0.052) and Index Freps (P = 0.079); The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the control group; The Index RN (P = 0.002) of the NP group was significantly higher than the control group. For F waves obtained from 100 stimuli, the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than the NP group; The Index RN (P < 0.001) and Index Freps (P < 0.001) of the P and NP groups were significantly higher than the control group. Conclusions: Increased repeater F waves reflect increased excitability of motor neuron pool and indicate upper motor neuron dysfunction in ALS. For an accurate evaluation of repeater F waves in ALS patients especially those with moderate to severe muscle atrophy, 100 stimuli would be required. PMID:25673456

  2. Bandpass Sampling--An Opportunity to Stress the Importance of In-Depth Understanding

    ERIC Educational Resources Information Center

    Stern, Harold P. E.

    2010-01-01

    Many bandpass signals can be sampled at rates lower than the Nyquist rate, allowing significant practical advantages. Illustrating this phenomenon after discussing (and proving) Shannon's sampling theorem provides a valuable opportunity for an instructor to reinforce the principle that innovation is possible when students strive to have a complete…

  3. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this..., 2015, to determine its benzene concentration for compliance with the requirements of this...

  4. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this... benzene concentration for compliance with the requirements of this subpart. (ii) Independent...

  5. 40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...(a) and 1 CFR part 51. Copies may be obtained from the American Society for Testing and Materials... alternative method is correlated to the method provided in § 80.46(a)(1). (d) Test method for sulfur in butane... alternative method is correlated to the method provided in § 80.46(a)(2). (e) Incorporations by...

  6. 40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...(a) and 1 CFR part 51. Copies may be obtained from the American Society for Testing and Materials... alternative method is correlated to the method provided in § 80.46(a)(1). (d) Test method for sulfur in butane... alternative method is correlated to the method provided in § 80.46(a)(2). (e) Incorporations by...

  7. THE IMPORTANCE OF THE MAGNETIC FIELD FROM AN SMA-CSO-COMBINED SAMPLE OF STAR-FORMING REGIONS

    SciTech Connect

    Koch, Patrick M.; Tang, Ya-Wen; Ho, Paul T. P.; Chen, Huei-Ru Vivien; Liu, Hau-Yu Baobab; Yen, Hsi-Wei; Lai, Shih-Ping; Zhang, Qizhou; Chen, How-Huan; Ching, Tao-Chung; Girart, Josep M.; Frau, Pau; Li, Hua-Bai; Li, Zhi-Yun; Padovani, Marco; Qiu, Keping; Rao, Ramprasad

    2014-12-20

    Submillimeter dust polarization measurements of a sample of 50 star-forming regions, observed with the Submillimeter Array (SMA) and the Caltech Submillimeter Observatory (CSO) covering parsec-scale clouds to milliparsec-scale cores, are analyzed in order to quantify the magnetic field importance. The magnetic field misalignment δ, the local angle between magnetic field and dust emission gradient, is found to be a prime observable, revealing distinct distributions for sources where the magnetic field is preferentially aligned with or perpendicular to the source minor axis. Source-averaged misalignment angles ⟨|δ|⟩ fall into systematically different ranges, reflecting the different source-magnetic field configurations. Possible bimodal ⟨|δ|⟩ distributions are found for the separate SMA and CSO samples. Combining both samples broadens the distribution with a wide maximum peak at small ⟨|δ|⟩ values. Assuming the 50 sources to be representative, the prevailing source-magnetic field configuration is one that statistically prefers small magnetic field misalignments |δ|. When interpreting |δ| together with a magnetohydrodynamics force equation, as developed in the framework of the polarization-intensity gradient method, a sample-based log-linear scaling fits the magnetic field tension-to-gravity force ratio Σ_B versus ⟨|δ|⟩ with ⟨Σ_B⟩ = 0.116 · exp(0.047 · ⟨|δ|⟩) ± 0.20 (mean error), providing a way to estimate the relative importance of the magnetic field, only based on measurable field misalignments |δ|. The force ratio Σ_B discriminates systems that are collapsible on average (⟨Σ_B⟩ < 1) from other molecular clouds where the magnetic field still provides enough resistance against gravitational collapse (⟨Σ_B⟩ > 1). The sample-wide trend shows a transition around ⟨|δ|⟩ ≈ 45°. Defining an effective gravitational force ∼1 − ⟨Σ_B⟩, the average magnetic-field-reduced star formation efficiency is at least a

  8. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY AIR FILTER SAMPLES

    SciTech Connect

    Maxwell, S.; Noyes, G.; Culligan, B.

    2010-02-03

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides and strontium in air filter method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified Sr-90 fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute for Standards and Technology (NIST) in April, 2009. The actinide and Sr-90 in air filter results were reported in ~4 hours with excellent quality.

  9. Rapid identification of clinically important bacteroides by coagglutination method.

    PubMed

    Lalitha, M K; Anandi, V; Elias, L; Kalpana, C R

    1991-03-01

    A coagglutination technique using indigenous reagents was applied for the rapid identification of Bacteroides fragilis and the black pigmented bacteroides group, using colony suspensions. All the 58 strains of B. fragilis and 42 strains of black pigmented bacteroides tested could be correctly identified by this method. The specificity of the coagglutination reagent was confirmed by the absence of cross reactivity with the related species of bacteroides, viz., B. distasonis, B. ovatus, B. vulgatus and B. thetaiotaomicron as well as other anaerobic and aerobic bacteria. A panel of four antisera against B. fragilis was required for correct identification of the strains tested, indicating the presence of multiple serotypes. On the other hand, all 42 strains of black pigmented bacteroides tested could be identified, using a single reagent as these strains appeared to have no antigenic type variants. PMID:1855827

  10. Method for microRNA isolation from clinical serum samples.

    PubMed

    Li, Yu; Kowdley, Kris V

    2012-12-01

    MicroRNAs are a group of intracellular noncoding RNA molecules that have been implicated in a variety of human diseases. Because of their high stability in blood, microRNAs released into circulation could be potentially utilized as noninvasive biomarkers for diagnosis or prognosis. Current microRNA isolation protocols are specifically designed for solid tissues and are impractical for biomarker development utilizing small-volume serum samples on a large scale. Thus, a protocol for microRNA isolation from serum is needed to accommodate these conditions in biomarker development. To establish such a protocol, we developed a simplified approach to normalize sample input by using single synthetic spike-in microRNA. We evaluated three commonly used commercial microRNA isolation kits for the best performance by comparing RNA quality and yield. The manufacturer's protocol was further modified to improve the microRNA yield from 200μl of human serum. MicroRNAs isolated from a large set of clinical serum samples were tested on the miRCURY LNA real-time PCR panel and confirmed to be suitable for high-throughput microRNA profiling. In conclusion, we have established a proven method for microRNA isolation from clinical serum samples suitable for microRNA biomarker development.

  11. Hand held sample tube manipulator, system and method

    DOEpatents

    Kenny, Donald V [Liberty Township, OH]; Smith, Deborah L [Liberty Township, OH]; Severance, Richard A [late of Columbus, OH]

    2001-01-01

    A manipulator apparatus, system and method for measuring analytes present in sample tubes. The manipulator apparatus includes a housing having a central bore with an inlet end and outlet end; a plunger mechanism with at least a portion thereof slideably disposed for reciprocal movement within the central bore, the plunger mechanism having a tubular gas channel with an inlet end and an outlet end, the gas channel inlet end disposed in the same direction as said inlet end of the central bore, wherein the inlet end of said plunger mechanism is adapted for movement so as to expel a sample tube inserted in the bore at the outlet end of the housing, and the inlet end of the plunger mechanism is adapted for connection to a gas supply; a first seal is disposed in the housing for sealing between the central bore and the plunger mechanism; a second seal is disposed at the outlet end of the housing for sealing between the central bore and a sample tube; a holder mounted on the housing for holding the sample tube; and a biasing mechanism for returning the plunger mechanism to a starting position.

  12. A time domain sampling method for inverse acoustic scattering problems

    NASA Astrophysics Data System (ADS)

    Guo, Yukun; Hömberg, Dietmar; Hu, Guanghui; Li, Jingzhi; Liu, Hongyu

    2016-06-01

    This work concerns the inverse scattering problems of imaging unknown/inaccessible scatterers by transient acoustic near-field measurements. Based on the analysis of the migration method, we propose efficient and effective sampling schemes for imaging small and extended scatterers from knowledge of time-dependent scattered data due to incident impulsive point sources. Though the inverse scattering problems are known to be nonlinear and ill-posed, the proposed imaging algorithms are totally "direct" involving only integral calculations on the measurement surface. Theoretical justifications are presented and numerical experiments are conducted to demonstrate the effectiveness and robustness of our methods. In particular, the proposed static imaging functionals enhance the performance of the total focusing method (TFM) and the dynamic imaging functionals show analogous behavior to the time reversal inversion but without solving time-dependent wave equations.

  13. A Novel Method for Sampling Alpha-Helical Protein Backbones

    DOE R&D Accomplishments Database

    Fain, Boris; Levitt, Michael

    2001-01-01

    We present a novel technique of sampling the configurations of helical proteins. Assuming knowledge of native secondary structure, we employ assembly rules gathered from a database of existing structures to enumerate the geometrically possible 3-D arrangements of the constituent helices. We produce a library of possible folds for 25 helical protein cores. In each case the method finds significant numbers of conformations close to the native structure. In addition we assign coordinates to all atoms for 4 of the 25 proteins. In the context of database-driven exhaustive enumeration our method performs extremely well, yielding significant percentages of structures (0.02%–82%) within 6 Å of the native structure. The method's speed and efficiency make it a valuable contribution towards the goal of predicting protein structure.
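
    Because each generated fold is judged by its distance to the native structure (e.g., within 6 Å), a coordinate RMSD after optimal rigid superposition is the natural metric. The sketch below is not the authors' code; the coordinate arrays are invented stand-ins for Cα traces, and the alignment uses the standard Kabsch procedure:

        import numpy as np

        def kabsch_rmsd(P, Q):
            """RMSD between two (N, 3) coordinate arrays after optimal rigid superposition."""
            Pc = P - P.mean(axis=0)          # center both structures
            Qc = Q - Q.mean(axis=0)
            H = Pc.T @ Qc                    # covariance matrix
            U, S, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # optimal rotation, reflection-corrected
            return np.sqrt(((Pc @ R.T - Qc) ** 2).sum() / len(P))

        # toy usage: a model counts as "near-native" here if its RMSD to the native coordinates is < 6 A
        rng = np.random.default_rng(0)
        native = rng.normal(scale=5.0, size=(60, 3))           # hypothetical native C-alpha trace
        model = native + rng.normal(scale=1.0, size=(60, 3))   # hypothetical candidate fold
        rmsd = kabsch_rmsd(model, native)
        print(f"RMSD = {rmsd:.2f} A, near-native: {rmsd < 6.0}")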

  14. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    SciTech Connect

    Liu, Bin

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, and thus translate the problem into finding a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then tailor its parameters elaborately to make it resemble the posterior to the greatest extent possible. We use the effective sample size (ESS) calculated based on the IS draws to measure the degree of approximation. The larger the ESS, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in the adjustment of the number of mixing components in the mixture proposal. Brute-force methods simply preset it as a large constant, which leads to an increase in the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step to adapt this number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
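
    To illustrate how the effective sample size reflects how well a proposal resembles a multimodal posterior, the following minimal sketch (a simplified stand-in, not the authors' adaptive algorithm; the bimodal target and both proposals are invented) compares the ESS obtained with a single-Gaussian proposal against that of a two-component mixture proposal:

        import numpy as np

        rng = np.random.default_rng(0)

        def gauss(x, mu, sd):
            return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

        # stand-in "posterior": an unnormalized bimodal density on the real line
        def target(x):
            return 0.5 * gauss(x, -3.0, 0.7) + 0.5 * gauss(x, 3.0, 0.7)

        def ess(draws, proposal_pdf):
            """Effective sample size of an importance sample with the given draws and proposal density."""
            w = target(draws) / proposal_pdf(draws)          # importance weights
            return w.sum() ** 2 / (w ** 2).sum()

        n = 5000
        # proposal 1: a single broad Gaussian that covers both modes only loosely
        x1 = rng.normal(0.0, 3.0, size=n)
        ess1 = ess(x1, lambda x: gauss(x, 0.0, 3.0))

        # proposal 2: a two-component mixture roughly matched to the two modes
        comp = rng.integers(0, 2, size=n)
        x2 = rng.normal(np.where(comp == 0, -3.0, 3.0), 0.8)
        ess2 = ess(x2, lambda x: 0.5 * gauss(x, -3.0, 0.8) + 0.5 * gauss(x, 3.0, 0.8))

        print(f"ESS, single Gaussian proposal: {ess1:.0f} / {n}")
        print(f"ESS, mixture proposal:         {ess2:.0f} / {n}")

    The mixture proposal, being closer in shape to the bimodal target, gives an ESS near the nominal sample size, while the single Gaussian typically yields a markedly lower ESS.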

  15. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. (d... plus a sample of the ethanol used to conduct the handblend testing pursuant to § 80.69 must be retained....

  16. Well fluid isolation and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.

    1995-01-01

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. A seal may be positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Purged well fluid is stored in a riser above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.

  17. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    PubMed

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics.
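
    As a schematic illustration of the transition path sampling ingredient only (not the enzymatic simulations themselves), the sketch below runs one-way shooting moves for overdamped Brownian dynamics on a one-dimensional double well; the potential, temperature, path length, and basin definitions are all invented for the toy example:

        import numpy as np

        rng = np.random.default_rng(1)

        # Toy system: overdamped dynamics on V(x) = (x^2 - 1)^2 with basins A (x < -0.8) and B (x > 0.8)
        def force(x):
            return -4.0 * x * (x * x - 1.0)

        def in_A(x): return x < -0.8
        def in_B(x): return x > 0.8

        DT, KT, N_STEPS = 0.005, 0.15, 4000

        def segment(x0, n):
            """Brownian-dynamics segment of n steps starting from x0 (returns n + 1 positions)."""
            xs = np.empty(n + 1)
            xs[0] = x0
            kicks = rng.normal(scale=np.sqrt(2.0 * KT * DT), size=n)
            for t in range(n):
                xs[t + 1] = xs[t] + force(xs[t]) * DT + kicks[t]
            return xs

        def is_reactive(path):
            return in_A(path[0]) and in_B(path[-1])

        # Initial reactive path: shoot two half-segments off the barrier top (x = 0) and join them.
        path = None
        while path is None or not is_reactive(path):
            back = segment(0.0, N_STEPS // 2)[::-1]
            path = np.concatenate([back, segment(0.0, N_STEPS // 2)[1:]])

        # One-way shooting: regenerate the path on one side of a random slice; keep only reactive trials.
        n_accept = 0
        for _ in range(300):
            k = int(rng.integers(1, len(path) - 1))
            if rng.random() < 0.5:   # forward shot: regenerate the segment after slice k with fresh noise
                trial = np.concatenate([path[:k + 1], segment(path[k], len(path) - 1 - k)[1:]])
            else:                    # backward shot: valid here because equilibrium overdamped dynamics is time-reversible
                trial = np.concatenate([segment(path[k], k)[::-1][:-1], path[k:]])
            if is_reactive(trial):
                path, n_accept = trial, n_accept + 1

        print(f"accepted {n_accept}/300 shooting moves; path still reactive: {is_reactive(path)}")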

  18. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    PubMed

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics. PMID:27497161

  19. Vadose Zone Sampling Methods for Detection of Preferential Pesticides Transport

    NASA Astrophysics Data System (ADS)

    Peranginangin, N.; Richards, B. K.; Steenhuis, T. S.

    2003-12-01

    Leaching of agriculturally applied chemicals through the vadose zone is a major cause for the occurrence of agrichemicals in groundwater. Accurate soil water sampling methods are needed to ensure meaningful monitoring results, especially for soils that have significant preferential flow paths. The purpose of this study was to assess the capability and the effectiveness of various soil water sampling methods in detecting preferential transport of pesticides in a strongly-structured silty clay loam (Hudson series) soil. Soil water sampling devices tested were wick pan and gravity pan lysimeters, tile lines, porous ceramic cups, and pipe lysimeters; all installed at 45 to 105 cm depth below the ground surface. A reasonable worst-case scenario was tested by applying a simulated rain storm soon after pesticides were sprayed at agronomic rates. Herbicides atrazine (6-chloro-N2-ethyl-N4-isopropyl-1,3,5-triazine-2,4-diamine) and 2,4-D (2,4-dichloro-phenoxyacetic acid) were chosen as model compounds. Chloride (KCl) tracer was used to determine the spatial and temporal distribution of non-reactive solute and water, as well as a basis for determining the retardation in pesticide movement. Results show that observed pesticide mobility was much greater than would be predicted by uniform flow. Under relatively high soil moisture conditions, gravity and wick pan lysimeters had comparably good collection efficiencies, whereas the wick samplers had an advantage over gravity-driven samplers when the soil moisture content was below field capacity. Pipe lysimeters had breakthrough patterns that were similar to pan samplers. At small plot scale, tile line samplers tended to underestimate solute concentration because of water dilution around the samplers. The porous cup samplers performed poorly because of their sensitivity to local profile characteristics: only by chance can they intercept and sample the preferential flow paths that are critical to transport. Wick sampler had the least

  20. Comparison of individual and pooled sampling methods for detecting bacterial pathogens of fish

    USGS Publications Warehouse

    Mumford, Sonia; Patterson, Chris; Evered, J.; Brunson, Ray; Levine, J.; Winton, J.

    2005-01-01

    Examination of finfish populations for viral and bacterial pathogens is an important component of fish disease control programs worldwide. Two methods are commonly used for collecting tissue samples for bacteriological culture, the currently accepted standards for detection of bacterial fish pathogens. The method specified in the Office International des Epizooties Manual of Diagnostic Tests for Aquatic Animals permits combining renal and splenic tissues from as many as 5 fish into pooled samples. The American Fisheries Society (AFS) Blue Book/US Fish and Wildlife Service (USFWS) Inspection Manual specifies the use of a bacteriological loop for collecting samples from the kidney of individual fish. An alternative would be to more fully utilize the pooled samples taken for virology. If implemented, this approach would provide substantial savings in labor and materials. To compare the relative performance of the AFS/USFWS method and this alternative approach, cultures of Yersinia ruckeri were used to establish low-level infections in groups of rainbow trout (Oncorhynchus mykiss) that were sampled by both methods. Yersinia ruckeri was cultured from 22 of 37 groups by at least 1 method. The loop method yielded 18 positive groups, with 1 group positive in the loop samples but negative in the pooled samples. The pooled samples produced 21 positive groups, with 4 groups positive in the pooled samples but negative in the loop samples. There was statistically significant agreement (Spearman coefficient 0.80, P < 0.001) in the relative ability of the 2 sampling methods to permit detection of low-level bacterial infections of rainbow trout.

  1. Artifact free denuder method for sampling of carbonaceous aerosols

    NASA Astrophysics Data System (ADS)

    Mikuška, P.; Vecera, Z.; Broškovicová, A.

    2003-04-01

    Over the past decade, growing attention has been focused on the carbonaceous aerosols. Although they may account for 30–60% of the total fine aerosol mass, their concentration and formation mechanisms are not well understood, particularly in comparison with major fine particle inorganic species. The deficiency in knowledge of carbonaceous aerosols results from their complexity and from problems associated with their collection. Conventional sampling techniques of the carbonaceous aerosols, which utilize filters/backup adsorbents, suffer from sampling artifacts. Positive artifacts are mainly due to adsorption of gas-phase organic compounds by the filter material or by the already collected particles, whereas negative artifacts arise from the volatilisation of already collected organic compounds from the filter. Furthermore, in the course of the sampling, the composition of the collected organic compounds may be modified by oxidants (O3, NO2, PAN, peroxides) that are present in the air passing through the sampler. It is clear that a new, artifact-free method for sampling of carbonaceous aerosols is needed. A combination of a diffusion denuder and a filter in series is very promising in this respect. The denuder is expected to collect gaseous oxidants and gas-phase organic compounds from the sample air stream prior to collection of aerosol particles on filters, and thus eliminate both positive and negative sampling artifacts for carbonaceous aerosols. This combination is the subject of the presentation. Several designs of diffusion denuders (cylindrical, annular, parallel plate, multi-channel) in combination with various types of wall coatings (dry, liquid) were examined. Special attention was given to preservation of the long-term collection efficiency. Different adsorbents (activated charcoal, molecular sieve, porous polymers) and sorbents coated with various chemical reagents (KI, Na2SO3, MnO2, ascorbic acid) or chromatographic stationary phases (silicon oils

  2. Drum plug piercing and sampling device and method

    DOEpatents

    Counts, Kevin T.

    2011-04-26

    An apparatus and method for piercing a drum plug of a drum in order to sample and/or vent gases that may accumulate in a space of the drum is provided. The drum is not damaged and can be reused since the pierced drum plug can be subsequently replaced. The apparatus includes a frame that is configured for engagement with the drum. A cylinder actuated by a fluid is mounted to the frame. A piercer is placed into communication with the cylinder so that actuation of the cylinder causes the piercer to move in a linear direction so that the piercer may puncture the drum plug of the drum.

  3. Methods of scaling threshold color difference using printed samples

    NASA Astrophysics Data System (ADS)

    Huang, Min; Cui, Guihua; Liu, Haoxue; Luo, M. Ronnier

    2012-01-01

    A series of printed samples on a semi-gloss paper substrate, with color differences around the visual threshold, was prepared to scale the visual color difference and to evaluate the performance of different methods. The probabilities of perceptibility were normalized to Z-scores, and the different color differences were scaled against these Z-scores. The resulting visual color-difference scale was obtained and checked with the STRESS factor. The results indicated that only the scales changed; the relative scales between pairs in the data were preserved.
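
    The probability-to-Z-score step described above is a probit transform. A minimal sketch follows (the perceptibility proportions and color differences below are fabricated for illustration; the study used observer judgments on printed pairs), using the standard library's inverse normal CDF:

        from statistics import NormalDist

        # hypothetical fraction of observers reporting a perceptible difference for each sample pair,
        # alongside the measured color difference (Delta E) of that pair
        observed = [
            (0.20, 0.4),   # (proportion perceptible, Delta E)
            (0.45, 0.8),
            (0.70, 1.2),
            (0.90, 1.8),
        ]

        inv_cdf = NormalDist().inv_cdf
        for p, delta_e in observed:
            z = inv_cdf(p)                      # probit (Z-score) of the perceptibility proportion
            print(f"Delta E = {delta_e:.1f}  ->  Z = {z:+.2f}")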

  4. Generalized Jones matrix method for homogeneous biaxial samples.

    PubMed

    Ortega-Quijano, Noé; Fade, Julien; Alouini, Mehdi

    2015-08-10

    The generalized Jones matrix (GJM) is a recently introduced tool to describe linear transformations of three-dimensional light fields. Based on this framework, a specific method for obtaining the GJM of uniaxial anisotropic media was recently presented. However, the GJM of biaxial media had not been tackled so far, as the previous method made use of a simplified rotation matrix that lacks a degree of freedom in the three-dimensional rotation, thus being not suitable for calculating the GJM of biaxial media. In this work we propose a general method to derive the GJM of arbitrarily-oriented homogeneous biaxial media. It is based on the differential generalized Jones matrix (dGJM), which is the three-dimensional counterpart of the conventional differential Jones matrix. We show that the dGJM provides a simple and elegant way to describe uniaxial and biaxial media, with the capacity to model multiple simultaneous optical effects. The practical usefulness of this method is illustrated by the GJM modeling of the polarimetric properties of a negative uniaxial KDP crystal and a biaxial KTP crystal for any three-dimensional sample orientation. The results show that this method constitutes an advantageous and straightforward way to model biaxial media, which show a growing relevance for many interesting applications.
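
    As a deliberately simplified numerical illustration of the differential-to-integral step described above (not the authors' formalism; the transversality corrections of the actual GJM treatment are omitted, and the indices, wavelength, thickness, and orientation angles are invented), the sketch builds a principal-frame differential matrix, applies a full three-dimensional rotation, and obtains the layer matrix by matrix exponentiation:

        import numpy as np
        from scipy.linalg import expm

        def rotation(alpha, beta, gamma):
            """Full 3-D rotation from z-y-z Euler angles (the third angle is what biaxial media require)."""
            ca, sa = np.cos(alpha), np.sin(alpha)
            cb, sb = np.cos(beta), np.sin(beta)
            cg, sg = np.cos(gamma), np.sin(gamma)
            Rz1 = np.array([[ca, -sa, 0], [sa, ca, 0], [0, 0, 1]])
            Ry = np.array([[cb, 0, sb], [0, 1, 0], [-sb, 0, cb]])
            Rz2 = np.array([[cg, -sg, 0], [sg, cg, 0], [0, 0, 1]])
            return Rz1 @ Ry @ Rz2

        # hypothetical biaxial principal indices, wavelength, and path length
        n1, n2, n3 = 1.74, 1.75, 1.84
        wavelength, thickness = 633e-9, 50e-6
        k0 = 2 * np.pi / wavelength

        # principal-frame differential matrix: phase accumulation per unit length along each principal axis
        dGJM_principal = 1j * k0 * np.diag([n1, n2, n3])

        R = rotation(0.3, 0.7, 0.5)                 # arbitrary 3-D sample orientation
        dGJM_lab = R @ dGJM_principal @ R.T         # differential matrix in the laboratory frame
        GJM = expm(dGJM_lab * thickness)            # layer generalized Jones matrix

        print(np.round(GJM, 3))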

  5. A microRNA isolation method from clinical samples

    PubMed Central

    Zununi Vahed, Sepideh; Barzegari, Abolfazl; Rahbar Saadat, Yalda; Mohammadi, Somayeh; Samadi, Nasser

    2016-01-01

    Introduction: microRNAs (miRNAs) are considered to be novel molecular biomarkers that could be exploited in the diagnosis and treatment of different diseases. The present study aimed to develop an efficient miRNA isolation method from different clinical specimens. Methods: Total RNAs were isolated by Trizol reagent followed by precipitation of the large RNAs with potassium acetate (CH3COOK), polyethylene glycol (PEG) 4000 and 6000, and lithium chloride (LiCl). Then, small RNAs were enriched and recovered from the supernatants by applying a combination of LiCl and ethanol. The efficiency of the method was evaluated through the quality, quantity, and integrity of the recovered RNAs using the A260/280 absorbance ratio, reverse transcription PCR (RT-PCR), and quantitative real-time PCR (q-PCR). Results: Comparison of different RNA isolation methods based on the precipitation of DNA and large RNAs, high miRNA recovery and PCR efficiency revealed that applying potassium acetate with final precipitation of small RNAs using 2.5 M LiCl plus ethanol can provide high yield and quality small RNAs that can be exploited for clinical purposes. Conclusion: The current isolation method can be applied for most clinical samples including cells, formalin-fixed and paraffin-embedded (FFPE) tissues and even body fluids with a wide applicability in molecular biology investigations. PMID:27340621

  6. A stochastic optimization method to estimate the spatial distribution of a pathogen from a sample.

    PubMed

    Parnell, S; Gottwald, T R; Irey, M S; Luo, W; van den Bosch, F

    2011-10-01

    Information on the spatial distribution of plant disease can be utilized to implement efficient and spatially targeted disease management interventions. We present a pathogen-generic method to estimate the spatial distribution of a plant pathogen using a stochastic optimization process which is epidemiologically motivated. Based on an initial sample, the method simulates the individual spread processes of a pathogen between patches of host to generate optimized spatial distribution maps. The method was tested on data sets of Huanglongbing of citrus and was compared with a kriging method from the field of geostatistics using the well-established kappa statistic to quantify map accuracy. Our method produced accurate maps of disease distribution with kappa values as high as 0.46 and was able to outperform the kriging method across a range of sample sizes based on the kappa statistic. As expected, map accuracy improved with sample size but there was a high amount of variation between different random sample placements (i.e., the spatial distribution of samples). This highlights the importance of sample placement on the ability to estimate the spatial distribution of a plant pathogen and we thus conclude that further research into sampling design and its effect on the ability to estimate disease distribution is necessary. PMID:21916625
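
    The kappa statistic used above scores agreement between an estimated presence/absence map and the true map relative to chance agreement. A minimal sketch (the maps are random stand-ins, not the Huanglongbing data):

        import numpy as np

        def cohens_kappa(truth, estimate):
            """Cohen's kappa for two binary (presence/absence) maps, flattened to 1-D arrays."""
            truth, estimate = np.asarray(truth).ravel(), np.asarray(estimate).ravel()
            p_observed = np.mean(truth == estimate)
            p_yes = np.mean(truth) * np.mean(estimate)             # chance agreement on "diseased"
            p_no = (1 - np.mean(truth)) * (1 - np.mean(estimate))  # chance agreement on "healthy"
            p_expected = p_yes + p_no
            return (p_observed - p_expected) / (1 - p_expected)

        rng = np.random.default_rng(0)
        truth = rng.random((40, 40)) < 0.2                      # hypothetical true disease map, ~20% prevalence
        noise = rng.random((40, 40)) < 0.1
        estimate = np.logical_xor(truth, noise)                 # estimated map that disagrees on ~10% of patches
        print(f"kappa = {cohens_kappa(truth, estimate):.2f}")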

  7. Eigenvector method for umbrella sampling enables error analysis

    NASA Astrophysics Data System (ADS)

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-08-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.
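
    The following minimal sketch conveys the idea of casting the window-combination step as an eigenproblem, in the spirit of the method described above (the one-dimensional toy landscape, harmonic biases, and Metropolis sampler are illustrative, not the authors' implementation): each window's weight is recovered as the stationary eigenvector of an overlap matrix estimated from the biased samples.

        import numpy as np

        rng = np.random.default_rng(0)

        # toy landscape U(x) and harmonic umbrella biases centered on a grid of windows
        def U(x):
            return (x ** 2 - 1.0) ** 2

        centers = np.linspace(-1.5, 1.5, 13)
        kspring, kT = 15.0, 0.3

        def bias(x, c):
            return np.exp(-0.5 * kspring * (x - c) ** 2 / kT)   # Boltzmann factor of the bias

        def sample_window(c, n=4000):
            """Metropolis sampling of the biased density exp(-U/kT) * bias(x, c)."""
            x, xs = c, np.empty(n)
            for i in range(n):
                xp = x + rng.normal(scale=0.15)
                logratio = -(U(xp) - U(x)) / kT - 0.5 * kspring * ((xp - c) ** 2 - (x - c) ** 2) / kT
                if np.log(rng.random()) < logratio:
                    x = xp
                xs[i] = x
            return xs

        L = len(centers)
        F = np.zeros((L, L))
        for i, c in enumerate(centers):
            xs = sample_window(c)
            psi = np.array([bias(xs, cj) for cj in centers])     # bias values of every window at window i's samples
            F[i] = (psi / psi.sum(axis=0)).mean(axis=1)          # overlap row: F_ij = <psi_j / sum_k psi_k> over window i

        # window weights z solve z^T F = z^T: the stationary left eigenvector of the row-stochastic matrix F
        vals, vecs = np.linalg.eig(F.T)
        z = np.real(vecs[:, np.argmax(np.real(vals))])
        z = np.abs(z) / np.abs(z).sum()
        print(np.round(z, 3))   # relative weights of the umbrella windows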

  8. Eigenvector method for umbrella sampling enables error analysis.

    PubMed

    Thiede, Erik H; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R

    2016-08-28

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence. PMID:27586912

  9. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, John F.

    1996-01-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants.

  10. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, J.F.

    1996-10-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants. 2 figs.

  11. Photothermal method using a pyroelectric sensor for thermophysical characterization of agricultural and biological samples

    NASA Astrophysics Data System (ADS)

    Frandas, A.; Dadarlat, Dorin; Chirtoc, Mihai; Jalink, Henk; Bicanic, Dane D.; Paris, D.; Antoniow, Jean S.; Egee, Michel; Ungureanu, Costica

    1998-07-01

    The photopyroelectric method in different experimental configurations was used for thermophysical characterization of agricultural and biological samples. The study is important because thermal parameters relate to the quality of foodstuffs (their preservation, storage and adulteration), to migration profiles in biodegradable packages, and to the mechanism of desiccation tolerance of seeds. Results are presented on the measurement of thermal parameters and their dependence on temperature and water content for samples such as honey, starch and seeds.

  12. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur... certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. (d... plus a sample of the ethanol used to conduct the handblend testing pursuant to § 80.69 must be retained....

  13. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. (d... 40 Protection of Environment 17 2013-07-01 2013-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline...

  14. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. (d... 40 Protection of Environment 17 2014-07-01 2014-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline...

  15. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... certify that the procedures meet the requirements of the ASTM procedures required under 40 CFR 80.330. (d... 40 Protection of Environment 17 2012-07-01 2012-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline...

  16. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self consistent set of computer algorithms that support large order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large order systems with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any
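
    One of the operations listed above, evaluating integrals involving the matrix exponential, is exactly what converts a continuous state-space model into its sampled (discrete-time) form. The sketch below is not SAMSAN code; it uses the well-known block-matrix (Van Loan) construction with an illustrative two-state system and sample period:

        import numpy as np
        from scipy.linalg import expm

        # illustrative continuous-time state-space model x' = A x + B u, sampled with period T
        A = np.array([[0.0, 1.0],
                      [-2.0, -0.5]])
        B = np.array([[0.0],
                      [1.0]])
        T = 0.1

        # block construction: expm([[A, B], [0, 0]] * T) packs Ad = e^{AT} and Bd = integral_0^T e^{As} B ds
        n, m = A.shape[0], B.shape[1]
        M = np.zeros((n + m, n + m))
        M[:n, :n] = A
        M[:n, n:] = B
        E = expm(M * T)
        Ad, Bd = E[:n, :n], E[:n, n:]

        print("Ad =\n", np.round(Ad, 4))
        print("Bd =\n", np.round(Bd, 4))

    Ad and Bd are the matrices that a zero-order-hold sampled-system analysis then works with directly.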

  17. RESULTS FROM EPA FUNDED RESEARCH PROGRAMS ON THE IMPORTANCE OF PURGE VOLUME, SAMPLE VOLUME, SAMPLE FLOW RATE AND TEMPORAL VARIATIONS ON SOIL GAS CONCENTRATIONS

    EPA Science Inventory

    Two research studies funded and overseen by EPA have been conducted since October 2006 on soil gas sampling methods and variations in shallow soil gas concentrations with the purpose of improving our understanding of soil gas methods and data for vapor intrusion applications. Al...

  18. BMAA extraction of cyanobacteria samples: which method to choose?

    PubMed

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition, according to the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from the cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080–2.5797 μg g⁻¹ DW).

  19. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    PubMed

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
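
    The temperature-swap step shared by the RE-family methods above is a Metropolis criterion on the energies of the two replicas. A minimal sketch (the harmonic toy energies and the temperature ladder are illustrative, not the peptide simulations):

        import numpy as np

        rng = np.random.default_rng(0)

        def swap_accepted(beta_i, beta_j, U_i, U_j):
            """Metropolis acceptance for exchanging configurations between inverse temperatures beta_i and beta_j."""
            delta = (beta_i - beta_j) * (U_i - U_j)
            return np.log(rng.random()) < delta        # accept with probability min(1, exp(delta))

        # toy demo: replicas of a 1-D harmonic oscillator equilibrated on a geometric temperature ladder
        temperatures = np.array([1.0, 1.3, 1.7, 2.2])
        betas = 1.0 / temperatures
        energies = 0.5 * rng.normal(scale=np.sqrt(temperatures)) ** 2   # one equilibrium energy per replica

        attempts = accepts = 0
        for i in range(len(betas) - 1):                 # attempt swaps between neighboring temperatures
            attempts += 1
            if swap_accepted(betas[i], betas[i + 1], energies[i], energies[i + 1]):
                energies[i], energies[i + 1] = energies[i + 1], energies[i]
                accepts += 1

        print(f"accepted {accepts}/{attempts} neighbor swaps")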

  20. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave assisted extraction, MAE, and organic solvent extraction [5]. Similarly, development of miniaturized analytical space flight instruments that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid-chromatography mass-spectrometrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to be present at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds with natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e. salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  1. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... requirements apply to importers who transport motor vehicle diesel fuel, NRLM diesel fuel, or ECA marine fuel... (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Motor Vehicle Diesel Fuel... alternative sampling and testing requirements apply to importers who transport motor vehicle diesel fuel,...

  2. Approaches of using the beard testing method to obtain complete length distributions of the original samples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Fiber testing instruments such as HVI can rapidly measure fiber length by testing a tapered fiber beard of the sample. However, these instruments, which use the beard testing method, report only a limited number of fiber length parameters instead of the complete length distribution that is important fo...

  3. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    PubMed

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons for the underestimation by either sampling method and the biases shown towards certain families. Information about the sampling techniques indicating which would be more appropriate to detect or find a particular family is provided.

  4. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    PubMed

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons for the underestimation by either sampling method and the biases shown towards certain families. Information about the sampling techniques indicating which would be more appropriate to detect or find a particular family is provided. PMID:26732526

  5. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  6. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  7. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  8. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  9. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  10. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... form and consistency of the waste materials to be sampled. Samples collected using the sampling... Crushed or powdered material—ASTM Standard D346-75 Soil or rock-like material—ASTM Standard D420-69...

  11. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... form and consistency of the waste materials to be sampled. Samples collected using the sampling... Crushed or powdered material—ASTM Standard D346-75 Soil or rock-like material—ASTM Standard D420-69...

  12. Method of remotely characterizing thermal properties of a sample

    NASA Technical Reports Server (NTRS)

    Heyman, Joseph S. (Inventor); Heath, D. Michele (Inventor); Welch, Christopher (Inventor); Winfree, William P. (Inventor); Miller, William E. (Inventor)

    1992-01-01

    A sample in a wind tunnel is irradiated by a thermal energy source located outside of the wind tunnel. A thermal imager system, also located outside of the wind tunnel, records surface radiation from the sample as a function of time. The produced thermal images are characteristic of the heat transferred from the sample to the flow across the sample. In turn, the measured rates of heat loss of the sample are characteristic of the flow and the sample.

  13. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
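
    A minimal simulation in the spirit of the first study (not the authors' code; the cluster profiles, sample size, and the simple Hamming-distance K-means variant are all illustrative) generates binary code profiles from known clusters and checks how accurately cluster membership is recovered:

        import numpy as np
        from itertools import permutations

        rng = np.random.default_rng(0)

        # three "true" clusters of interviewees, each with its own probability profile over 12 binary codes
        profiles = np.array([rng.uniform(0.1, 0.9, size=12) for _ in range(3)])
        true_labels = np.repeat([0, 1, 2], 17)                       # roughly 50 participants
        X = (rng.random((len(true_labels), 12)) < profiles[true_labels]).astype(int)

        def kmodes(X, k, n_iter=25):
            """Simple K-means variant for binary data: Hamming distance and majority-vote centroids."""
            centroids = X[rng.choice(len(X), size=k, replace=False)]
            for _ in range(n_iter):
                d = (X[:, None, :] != centroids[None, :, :]).sum(axis=2)   # Hamming distances to each centroid
                labels = d.argmin(axis=1)
                centroids = np.array([(X[labels == j].mean(axis=0) >= 0.5).astype(int)
                                      if np.any(labels == j) else centroids[j] for j in range(k)])
            return labels

        labels = kmodes(X, k=3)

        # accuracy up to relabeling: best agreement over all permutations of the cluster labels
        accuracy = max(np.mean(np.array(perm)[labels] == true_labels) for perm in permutations(range(3)))
        print(f"assignment accuracy: {accuracy:.2f}")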

  14. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  15. Markov chain Monte Carlo posterior sampling with the Hamiltonian method.

    SciTech Connect

    Hanson, Kenneth M.

    2001-01-01

    A major advantage of Bayesian data analysis is that it provides a characterization of the uncertainty in the model parameters estimated from a given set of measurements in the form of a posterior probability distribution. When the analysis involves a complicated physical phenomenon, the posterior may not be available in analytic form, but only calculable by means of a simulation code. In such cases, the uncertainty in inferred model parameters requires characterization of a calculated functional. An appealing way to explore the posterior, and hence characterize the uncertainty, is to employ the Markov Chain Monte Carlo technique. The goal of MCMC is to generate a sequence of random parameter samples x from a target pdf (probability density function), π(x). In Bayesian analysis, this sequence corresponds to a set of model realizations that follow the posterior distribution. There are two basic MCMC techniques. In Gibbs sampling, typically one parameter is drawn from the conditional pdf at a time, holding all others fixed. In the Metropolis algorithm, all the parameters can be varied at once. The parameter vector is perturbed from the current sequence point by adding a trial step drawn randomly from a symmetric pdf. The trial position is either accepted or rejected on the basis of the probability at the trial position relative to the current one. The Metropolis algorithm is often employed because of its simplicity. The aim of this work is to develop MCMC methods that are useful for large numbers of parameters, n, say hundreds or more. In this regime the Metropolis algorithm can be unsuitable, because its efficiency drops as 0.3/n. The efficiency is defined as the reciprocal of the number of steps in the sequence needed to effectively provide a statistically independent sample from π.
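
    A minimal random-walk Metropolis sketch of the algorithm described above (the two-dimensional correlated Gaussian target and the step size are illustrative):

        import numpy as np

        rng = np.random.default_rng(0)

        # target pdf pi(x): an unnormalized 2-D correlated Gaussian, handled on the log scale
        cov_inv = np.linalg.inv(np.array([[1.0, 0.8], [0.8, 1.0]]))
        def log_pi(x):
            return -0.5 * x @ cov_inv @ x

        def metropolis(n_samples, step=0.5):
            """Random-walk Metropolis: perturb all parameters at once with a symmetric Gaussian trial step."""
            x = np.zeros(2)
            chain, n_accept = np.empty((n_samples, 2)), 0
            for i in range(n_samples):
                trial = x + step * rng.normal(size=2)            # symmetric proposal
                if np.log(rng.random()) < log_pi(trial) - log_pi(x):
                    x, n_accept = trial, n_accept + 1            # accept based on relative probability
                chain[i] = x
            return chain, n_accept / n_samples

        chain, acc = metropolis(20000)
        print(f"acceptance rate: {acc:.2f}")
        print("sample covariance:\n", np.round(np.cov(chain[5000:].T), 2))   # discard burn-in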

  16. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree ), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal ) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree ) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. PMID

  17. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed. Integr

  18. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  19. Importance of closely spaced vertical sampling in delineating chemical and microbiological gradients in groundwater studies

    USGS Publications Warehouse

    Smith, R.L.; Harvey, R.W.; LeBlanc, D.R.

    1991-01-01

    Vertical gradients of selected chemical constituents, bacterial populations, bacterial activity and electron acceptors were investigated for an unconfined aquifer contaminated with nitrate and organic compounds on Cape Cod, Massachusetts, U.S.A. Fifteen-port multilevel sampling devices (MLS's) were installed within the contaminant plume at the source of the contamination, and at 250 and 2100 m downgradient from the source. Depth profiles of specific conductance and dissolved oxygen at the downgradient sites exhibited vertical gradients that were both steep and inversely related. Narrow zones (2-4 m thick) of high N2O and NH4+ concentrations were also detected within the contaminant plume. A 27-fold change in bacterial abundance; a 35-fold change in frequency of dividing cells (FDC), an indicator of bacterial growth; a 23-fold change in 3H-glucose uptake, a measure of heterotrophic activity; and substantial changes in overall cell morphology were evident within a 9-m vertical interval at 250 m downgradient. The existence of these gradients argues for the need for closely spaced vertical sampling in groundwater studies because small differences in the vertical placement of a well screen can lead to incorrect conclusions about the chemical and microbiological processes within an aquifer.

  20. A Method For Parallel, Automated, Thermal Cycling of Submicroliter Samples

    PubMed Central

    Nakane, Jonathan; Broemeling, David; Donaldson, Roger; Marziali, Andre; Willis, Thomas D.; O'Keefe, Matthew; Davis, Ronald W.

    2001-01-01

    A large fraction of the cost of DNA sequencing and other DNA-analysis processes results from the reagent costs incurred during cycle sequencing or PCR. In particular, the high cost of the enzymes and dyes used in these processes often results in thermal cycling costs exceeding $0.50 per sample. In the case of high-throughput DNA sequencing, this is a significant and unnecessary expense. Improved detection efficiency of new sequencing instrumentation allows the reaction volumes for cycle sequencing to be scaled down to one-tenth of presently used volumes, resulting in at least a 10-fold decrease in the cost of this process. However, commercially available thermal cyclers and automated reaction setup devices have inherent design limitations which make handling volumes of <1 μL extremely difficult. In this paper, we describe a method for thermal cycling aimed at reliable, automated cycling of submicroliter reaction volumes. PMID:11230168

  1. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    PubMed

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  2. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    USGS Publications Warehouse

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  3. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    PubMed

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. PMID:24419241

  4. Importance of closely spaced vertical sampling in delineating chemical and microbiological gradients in groundwater studies

    NASA Astrophysics Data System (ADS)

    Smith, Richard L.; Harvey, Ronald W.; LeBlanc, Denis R.

    1991-02-01

    Vertical gradients of selected chemical constituents, bacterial populations, bacterial activity and electron acceptors were investigated for an unconfined aquifer contaminated with nitrate and organic compounds on Cape Cod, Massachusetts, U.S.A. Fifteen-port multilevel sampling devices (MLS's) were installed within the contaminant plume at the source of the contamination, and at 250 and 2100 m downgradient from the source. Depth profiles of specific conductance and dissolved oxygen at the downgradient sites exhibited vertical gradients that were both steep and inversely related. Narrow zones (2-4 m thick) of high N2O and NH4+ concentrations were also detected within the contaminant plume. A 27-fold change in bacterial abundance; a 35-fold change in frequency of dividing cells (FDC), an indicator of bacterial growth; a 23-fold change in 3H-glucose uptake, a measure of heterotrophic activity; and substantial changes in overall cell morphology were evident within a 9-m vertical interval at 250 m downgradient. The existence of these gradients argues for the need for closely spaced vertical sampling in groundwater studies because small differences in the vertical placement of a well screen can lead to incorrect conclusions about the chemical and microbiological processes within an aquifer.

  5. Characterization of spinal cord lesions in cattle and horses with rabies: the importance of correct sampling.

    PubMed

    Bassuino, Daniele M; Konradt, Guilherme; Cruz, Raquel A S; Silva, Gustavo S; Gomes, Danilo C; Pavarini, Saulo P; Driemeier, David

    2016-07-01

    Twenty-six cattle and 7 horses were diagnosed with rabies. Samples of brain and spinal cord were processed for hematoxylin and eosin staining and immunohistochemistry (IHC). In addition, refrigerated fragments of brain and spinal cord were tested by direct fluorescent antibody test and intracerebral inoculation in mice. Statistical analyses and the Fisher exact test were performed using commercial software. Histologic lesions were observed in the spinal cord in all of the cattle and horses. Inflammatory lesions in horses were moderate at the thoracic, lumbar, and sacral levels, and marked at the lumbar enlargement level. Gitter cells were present in large numbers in the lumbar enlargement region. IHC staining intensity ranged from moderate to strong. Inflammatory lesions in cattle were moderate in all spinal cord sections, and gitter cells were present in small numbers. IHC staining intensity was strong in all spinal cord sections. Only 2 horses exhibited lesions in the brain, located mainly in the obex and cerebellum; in contrast, brain lesions were observed in 25 of the cattle. The Fisher exact test showed that the odds of detecting lesions caused by rabies in horses are 3.5 times higher when spinal cord sections are analyzed, as compared to analysis of brain samples alone.

  6. De novo mutations from sporadic schizophrenia cases highlight important signaling genes in an independent sample.

    PubMed

    Kranz, Thorsten M; Harroch, Sheila; Manor, Orly; Lichtenberg, Pesach; Friedlander, Yechiel; Seandel, Marco; Harkavy-Friedman, Jill; Walsh-Messinger, Julie; Dolgalev, Igor; Heguy, Adriana; Chao, Moses V; Malaspina, Dolores

    2015-08-01

    Schizophrenia is a debilitating syndrome with high heritability. Genomic studies reveal more than a hundred genetic variants, largely nonspecific and of small effect size, and not accounting for its high heritability. De novo mutations are one mechanism whereby disease related alleles may be introduced into the population, although these have not been leveraged to explore the disease in general samples. This paper describes a framework to find high impact genes for schizophrenia. This study consists of two different datasets. First, whole exome sequencing was conducted to identify disruptive de novo mutations in 14 complete parent-offspring trios with sporadic schizophrenia from Jerusalem, which identified 5 sporadic cases with de novo gene mutations in 5 different genes (PTPRG, TGM5, SLC39A13, BTK, CDKN3). Next, targeted exome capture of these genes was conducted in 48 well-characterized, unrelated, ethnically diverse schizophrenia cases, recruited and characterized by the same research team in New York (NY sample), which demonstrated extremely rare and potentially damaging variants in three of the five genes (MAF<0.01) in 12/48 cases (25%), including PTPRG (5 cases), SLC39A13 (4 cases) and TGM5 (4 cases), a higher number than usually identified by whole exome sequencing. Cases differed in cognition and illness features based on which mutation-enriched gene they carried. Functional de novo mutations in protein-interaction domains in sporadic schizophrenia can illuminate risk genes that increase the propensity to develop schizophrenia across ethnicities. PMID:26091878

  7. De novo mutations from sporadic schizophrenia cases highlight important signaling genes in an independent sample

    PubMed Central

    Kranz, Thorsten M; Harroch, Sheila; Manor, Orly; Lichtenberg, Pesach; Friedlander, Yechiel; Seandel, Marco; Harkavy-Friedman, Jill; Walsh-Messinger, Julie; Dolgalev, Igor; Heguy, Adriana; Chao, Moses V; Malaspina, Dolores

    2015-01-01

    Schizophrenia is a debilitating syndrome with high heritability. Genomic studies reveal more than a hundred genetic variants, largely nonspecific and of small effect size, and not accounting for its high heritability. De novo mutations are one mechanism whereby disease related alleles may be introduced into the population, although these have not been leveraged to explore the disease in general samples. This paper describes a framework to find high impact genes for schizophrenia. This study consists of two different datasets. First, whole exome sequencing was conducted to identify disruptive de novo mutations in 14 complete parent–offspring trios with sporadic schizophrenia from Jerusalem, which identified 5 sporadic cases with de novo gene mutations in 5 different genes (PTPRG, TGM5, SLC39A13, BTK, CDKN3). Next, targeted exome capture of these genes was conducted in 48 well-characterized, unrelated, ethnically diverse schizophrenia cases, recruited and characterized by the same research team in New York (NY sample), which demonstrated extremely rare and potentially damaging variants in three of the five genes (MAF < 0.01) in 12/48 cases (25%), including PTPRG (5 cases), SLC39A13 (4 cases) and TGM5 (4 cases), a higher number than usually identified by whole exome sequencing. Cases differed in cognition and illness features based on which mutation-enriched gene they carried. Functional de novo mutations in protein-interaction domains in sporadic schizophrenia can illuminate risk genes that increase the propensity to develop schizophrenia across ethnicities. PMID:26091878

  8. De novo mutations from sporadic schizophrenia cases highlight important signaling genes in an independent sample.

    PubMed

    Kranz, Thorsten M; Harroch, Sheila; Manor, Orly; Lichtenberg, Pesach; Friedlander, Yechiel; Seandel, Marco; Harkavy-Friedman, Jill; Walsh-Messinger, Julie; Dolgalev, Igor; Heguy, Adriana; Chao, Moses V; Malaspina, Dolores

    2015-08-01

    Schizophrenia is a debilitating syndrome with high heritability. Genomic studies reveal more than a hundred genetic variants, largely nonspecific and of small effect size, and not accounting for its high heritability. De novo mutations are one mechanism whereby disease related alleles may be introduced into the population, although these have not been leveraged to explore the disease in general samples. This paper describes a framework to find high impact genes for schizophrenia. This study consists of two different datasets. First, whole exome sequencing was conducted to identify disruptive de novo mutations in 14 complete parent-offspring trios with sporadic schizophrenia from Jerusalem, which identified 5 sporadic cases with de novo gene mutations in 5 different genes (PTPRG, TGM5, SLC39A13, BTK, CDKN3). Next, targeted exome capture of these genes was conducted in 48 well-characterized, unrelated, ethnically diverse schizophrenia cases, recruited and characterized by the same research team in New York (NY sample), which demonstrated extremely rare and potentially damaging variants in three of the five genes (MAF<0.01) in 12/48 cases (25%), including PTPRG (5 cases), SLC39A13 (4 cases) and TGM5 (4 cases), a higher number than usually identified by whole exome sequencing. Cases differed in cognition and illness features based on which mutation-enriched gene they carried. Functional de novo mutations in protein-interaction domains in sporadic schizophrenia can illuminate risk genes that increase the propensity to develop schizophrenia across ethnicities.

  9. Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples

    SciTech Connect

    Maxwell, Sherrod L.; Hutchison, Jay B.; McAlister, Daniel R.

    2015-02-14

    Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S. Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. The failures illustrate the importance of rugged soil dissolution methods for the accurate measurement of analytes in the sample matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 °C in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. Plutonium can also be separated and measured using this method. Batches of 12 samples can be prepared for measurement in <5 hours.

  10. Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples

    DOE PAGES

    Maxwell, Sherrod L.; Hutchison, Jay B.; McAlister, Daniel R.

    2015-02-14

    Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S. Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. The failures illustrate the importance of rugged soil dissolution methods for the accurate measurement of analytes in the sample matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 °C in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. Plutonium can also be separated and measured using this method. Batches of 12 samples can be prepared for measurement in <5 hours.

  11. MARKOV CHAIN MONTE CARLO POSTERIOR SAMPLING WITH THE HAMILTONIAN METHOD

    SciTech Connect

    K. HANSON

    2001-02-01

    The Markov Chain Monte Carlo technique provides a means for drawing random samples from a target probability density function (pdf). MCMC allows one to assess the uncertainties in a Bayesian analysis described by a numerically calculated posterior distribution. This paper describes the Hamiltonian MCMC technique in which a momentum variable is introduced for each parameter of the target pdf. In analogy to a physical system, a Hamiltonian H is defined as a kinetic energy involving the momenta plus a potential energy φ, where φ is minus the logarithm of the target pdf. Hamiltonian dynamics allows one to move along trajectories of constant H, taking large jumps in the parameter space with relatively few evaluations of φ and its gradient. The Hamiltonian algorithm alternates between picking a new momentum vector and following such trajectories. The efficiency of the Hamiltonian method for multidimensional isotropic Gaussian pdfs is shown to remain constant at around 7% for up to several hundred dimensions. The Hamiltonian method handles correlations among the variables much better than the standard Metropolis algorithm. A new test, based on the gradient of φ, is proposed to measure the convergence of the MCMC sequence.
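
    To make the alternation described in the abstract concrete, the following Python fragment is a minimal Hamiltonian MCMC sampler for an isotropic Gaussian target. It is an illustrative sketch written for this summary, not code from the cited report; the function names, step size, and trajectory length are assumptions chosen for readability.

      import numpy as np

      def hamiltonian_mc(log_p, grad_log_p, x0, n_samples=2000, eps=0.1, n_leapfrog=20, seed=0):
          """Toy Hamiltonian MCMC: resample momenta, follow a leapfrog trajectory of
          (approximately) constant H = kinetic + phi, then accept or reject (Metropolis)."""
          rng = np.random.default_rng(seed)
          x, samples = np.asarray(x0, dtype=float), []
          for _ in range(n_samples):
              p = rng.standard_normal(x.shape)               # fresh momentum vector
              x_new, p_new = x.copy(), p.copy()
              p_new += 0.5 * eps * grad_log_p(x_new)         # leapfrog integration
              for _ in range(n_leapfrog - 1):
                  x_new += eps * p_new
                  p_new += eps * grad_log_p(x_new)
              x_new += eps * p_new
              p_new += 0.5 * eps * grad_log_p(x_new)
              h_old = 0.5 * p.dot(p) - log_p(x)              # phi = -log(target pdf)
              h_new = 0.5 * p_new.dot(p_new) - log_p(x_new)
              if rng.random() < np.exp(min(0.0, h_old - h_new)):
                  x = x_new                                   # accept trajectory end point
              samples.append(x.copy())
          return np.array(samples)

      # usage: isotropic Gaussian target in 100 dimensions
      dim = 100
      chain = hamiltonian_mc(lambda x: -0.5 * x.dot(x), lambda x: -x, np.zeros(dim))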

  12. Martian Radiative Transfer Modeling Using the Optimal Spectral Sampling Method

    NASA Technical Reports Server (NTRS)

    Eluszkiewicz, J.; Cady-Pereira, K.; Uymin, G.; Moncet, J.-L.

    2005-01-01

    The large volume of existing and planned infrared observations of Mars has prompted the development of a new martian radiative transfer model that could be used in the retrievals of atmospheric and surface properties. The model is based on the Optimal Spectral Sampling (OSS) method [1]. The method is a fast and accurate monochromatic technique applicable to a wide range of remote sensing platforms (from microwave to UV) and was originally developed for the real-time processing of infrared and microwave data acquired by instruments aboard the satellites forming part of the next-generation global weather satellite system NPOESS (National Polar-orbiting Operational Environmental Satellite System) [2]. As part of our on-going research related to the radiative properties of the martian polar caps, we have begun the development of a martian OSS model with the goal of using it to perform self-consistent atmospheric corrections necessary to retrieve cap emissivity from the Thermal Emission Spectrometer (TES) spectra. While the caps will provide the initial focus area for applying the new model, it is hoped that the model will be of interest to the wider Mars remote sensing community.

  13. Appraisal of fracture sampling methods and a new workflow to characterise heterogeneous fracture networks at outcrop

    NASA Astrophysics Data System (ADS)

    Watkins, Hannah; Bond, Clare E.; Healy, Dave; Butler, Robert W. H.

    2015-03-01

    Fractures at outcrop can be characterised by several methods for use as analogues to fractured reservoirs. Four important fracture data collection methods are linear scanline sampling, areal sampling, window sampling and circular scanline sampling. In regions of homogeneous fracture networks these methods are adequate to characterise fracture patterns for use as outcrop analogues; however, where fractures are heterogeneous, it is more difficult to characterise fracture networks and a different approach is needed. We develop a workflow for fracture data collection in a region of heterogeneous fractures in a fold and thrust belt, which we believe has applicability to a wide variety of fracture networks in different tectonic settings. We use an augmented circular scanline method, along with areal sampling, to collect a range of fracture attribute data, including orientation, length, aperture, spatial distribution and intensity. This augmented circular scanline method more than halves the time taken for data collection, provides accurate, unbiased data that is representative of local fracture network attributes and involves data collection of a wider range of fracture attributes than other sampling techniques alone.

  14. Insights on Antioxidant Assays for Biological Samples Based on the Reduction of Copper Complexes—The Importance of Analytical Conditions

    PubMed Central

    Marques, Sara S.; Magalhães, Luís M.; Tóth, Ildikó V.; Segundo, Marcela A.

    2014-01-01

    Total antioxidant capacity assays are recognized as instrumental for establishing the antioxidant status of biological samples; however, varying experimental conditions result in conclusions that may not be transposable to other settings. After selection of the complexing agent, reagent addition order, buffer type and concentration, copper reducing assays were adapted to a high-throughput scheme and validated using the model biological antioxidant compounds ascorbic acid, Trolox (a soluble analogue of vitamin E), uric acid and glutathione. A critical comparison was made based on real samples, including the NIST-909c certified human serum sample and five study samples. The validated method provided a linear range up to 100 µM Trolox (limit of detection 2.3 µM; limit of quantification 7.7 µM), with recovery results above 85% and precision <5%. The validated method, with its increased sensitivity, is a sound choice for assessment of TAC in serum samples. PMID:24968275
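
    The validation figures quoted above (slope, linear range, LOD, LOQ) all derive from a calibration curve. The Python sketch below shows one common way such figures are estimated from Trolox standards; the calibration numbers are invented for illustration, and the ICH-style 3.3σ/10σ convention is an assumption here, not necessarily the authors' procedure.

      import numpy as np
      from scipy import stats

      # Hypothetical calibration of a copper-reduction assay: Trolox standards (uM) vs absorbance
      conc = np.array([0, 10, 20, 40, 60, 80, 100], dtype=float)
      absorbance = np.array([0.052, 0.118, 0.185, 0.311, 0.442, 0.568, 0.690])

      fit = stats.linregress(conc, absorbance)
      residual_sd = np.std(absorbance - (fit.intercept + fit.slope * conc), ddof=2)

      # ICH-style detection and quantification limits from the calibration residuals
      lod = 3.3 * residual_sd / fit.slope
      loq = 10.0 * residual_sd / fit.slope
      print(f"slope = {fit.slope:.4f} AU/uM, r = {fit.rvalue:.4f}, LOD ~ {lod:.1f} uM, LOQ ~ {loq:.1f} uM")

      # Trolox-equivalent antioxidant capacity of an unknown sample from its absorbance
      unknown_abs = 0.35
      teac = (unknown_abs - fit.intercept) / fit.slope
      print(f"sample TEAC ~ {teac:.1f} uM Trolox equivalents")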

  15. An economic passive sampling method to detect particulate pollutants using magnetic measurements.

    PubMed

    Cao, Liwan; Appel, Erwin; Hu, Shouyun; Ma, Mingming

    2015-10-01

    Identifying particulate matter (PM) emitted from industrial processes into the atmosphere is an important issue in environmental research. This paper presents a passive sampling method using simple artificial samplers that maintains the advantage of bio-monitoring, but overcomes some of its disadvantages. The samplers were tested in a heavily polluted area (Linfen, China) and compared to results from leaf samples. Spatial variations of magnetic susceptibility from artificial passive samplers and leaf samples show very similar patterns. Scanning electron microscopy suggests that the collected PM are mostly in the range of 2-25 μm; the frequent occurrence of spherical particles indicates that industrial combustion dominates PM emission. Magnetic properties around power plants show different features than other plants. This sampling method provides a suitable and economic tool for semi-quantifying the temporal and spatial distribution of air quality; the samplers can be installed in a regular grid and calibrated against the collected weight of PM.

  16. Probing methane hydrate nucleation through the forward flux sampling method.

    PubMed

    Bi, Yuanfei; Li, Tianshu

    2014-11-26

    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, by combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS in studying the formation of methane hydrate, we developed an effective order parameter λ on the basis of the topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integration of the order parameter λ with FFS allows explicitly computing hydrate nucleation rates and obtaining an ensemble of nucleation trajectories under conditions where spontaneous hydrate nucleation becomes too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution for the spontaneously formed hydrate seeds obtained from the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from the direct simulation. Analyzing the obtained large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface, and facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate. PMID:24849698
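
    The FFS bookkeeping used above (an initial flux through the first interface in the order parameter λ, multiplied by a chain of conditional crossing probabilities) does not depend on the molecular model. The Python sketch below applies the same bookkeeping to a one-dimensional toy barrier-crossing problem with overdamped Langevin dynamics; the potential, interface placement, and run lengths are illustrative assumptions and are unrelated to the mW hydrate simulations of the paper.

      import numpy as np

      rng = np.random.default_rng(1)
      dt, sqrt2dt = 1e-3, np.sqrt(2e-3)
      grad_U = lambda x: 4.0 * x * (x * x - 1.0)      # double-well potential U = (x^2 - 1)^2

      def step(x):
          """One overdamped Langevin step at kT = 1 (toy dynamics)."""
          return x - grad_U(x) * dt + sqrt2dt * rng.standard_normal()

      lam_A = -0.9                          # boundary of basin A; order parameter is x itself
      interfaces = [-0.6, -0.2, 0.2, 0.6]   # lambda_0 ... lambda_n; the last defines state B

      # Stage 1: flux through lambda_0 from a long trajectory confined mostly to basin A
      x, in_A, crossings, configs = -1.0, True, 0, []
      n_steps = 500_000
      for _ in range(n_steps):
          x = step(x)
          if x <= lam_A:
              in_A = True
          elif x >= interfaces[0] and in_A:
              crossings += 1
              configs.append(x)             # store the crossing configuration
              in_A = False
      flux = crossings / (n_steps * dt)

      # Stage 2: conditional probabilities P(lambda_{i+1} | lambda_i) from trial runs
      probs = []
      for i in range(len(interfaces) - 1):
          successes, new_configs, n_trials = 0, [], 300
          for _ in range(n_trials):
              xt = rng.choice(configs)      # start a trial from a stored configuration
              while lam_A < xt < interfaces[i + 1]:
                  xt = step(xt)
              if xt >= interfaces[i + 1]:
                  successes += 1
                  new_configs.append(xt)
          probs.append(successes / n_trials)
          if successes == 0:                # no trial reached the next interface
              break
          configs = new_configs

      rate = flux * np.prod(probs)          # nucleation-style rate A -> B
      print("flux:", flux, "P_i:", probs, "rate:", rate)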

  17. COMPARISON OF USEPA FIELD SAMPLING METHODS FOR BENTHIC MACROINVERTEBRATE STUDIES

    EPA Science Inventory

    Two U.S. Environmental Protection Agency (USEPA) macroinvertebrate sampling protocols were compared in the Mid-Atlantic Highlands region. The Environmental Monitoring and Assessment Program (EMAP) wadeable streams protocol results in a single composite sample from nine transects...

  18. Evaluation of Environmental Sample Analysis Methods and Results Reporting in the National Children's Study Vanguard Study.

    PubMed

    Heikkinen, Maire S A; Khalaf, Abdisalam; Beard, Barbara; Viet, Susan M; Dellarco, Michael

    2016-05-01

    During the initial Vanguard phase of the U.S. National Children's Study (NCS), about 2000 tap water, surface wipe, and air samples were collected and analyzed immediately. The shipping conditions, analysis methods, results, and laboratory performance were evaluated to determine the best approaches for use in the NCS Main Study. The main conclusions were (1) to employ established sample analysis methods, when possible, and alternate methodologies only after careful consideration with method validation studies; (2) lot control and prescreening sample collection materials are important quality assurance procedures; (3) packing samples correctly requires careful training and adjustment of shipping conditions to local conditions; (4) trip blanks and spiked samples should be considered for samplers with short expiration times and labile analytes; (5) two study-specific results reports should be required: laboratory electronic data deliverables (EDD) of sample results in a useable electronic format (CSV or SEDD XML/CSV) and a data package with sample results and supporting information in PDF format. These experiences and lessons learned can be applied to any long-term study.

  19. Preparation of samples for leaf architecture studies, a method for mounting cleared leaves

    PubMed Central

    Vasco, Alejandra; Thadeo, Marcela; Conover, Margaret; Daly, Douglas C.

    2014-01-01

    • Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. • Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. • Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration. PMID:25225627

  20. An evaluation of long-term preservation methods for brown bear (Ursus arctos) faecal DNA samples

    USGS Publications Warehouse

    Murphy, M.A.; Waits, L.P.; Kendall, K.C.; Wasser, S.K.; Higbee, J.A.; Bogden, R.

    2002-01-01

    Relatively few large-scale faecal DNA studies have been initiated due to difficulties in amplifying low quality and quantity DNA template. To improve brown bear faecal DNA PCR amplification success rates and to determine post collection sample longevity, five preservation methods were evaluated: 90% ethanol, DETs buffer, silica-dried, oven-dried stored at room temperature, and oven-dried stored at -20°C. Preservation effectiveness was evaluated for 50 faecal samples by PCR amplification of a mitochondrial DNA (mtDNA) locus (~146 bp) and a nuclear DNA (nDNA) locus (~200 bp) at time points of one week, one month, three months and six months. Preservation method and storage time significantly impacted mtDNA and nDNA amplification success rates. For mtDNA, all preservation methods had ≥75% success at one week, but storage time had a significant impact on the effectiveness of the silica preservation method. Ethanol preserved samples had the highest success rates for both mtDNA (86.5%) and nDNA (84%). Nuclear DNA amplification success rates ranged from 26-88%, and storage time had a significant impact on all methods but ethanol. Preservation method and storage time should be important considerations for researchers planning projects utilizing faecal DNA. We recommend preservation of faecal samples in 90% ethanol when feasible, although when collecting in remote field conditions or for both DNA and hormone assays a dry collection method may be advantageous.

  1. Sampling and Decontamination Method for Culture of Nontuberculous Mycobacteria in Respiratory Samples of Cystic Fibrosis Patients

    PubMed Central

    De Geyter, Deborah; De Schutter, Iris; Mouton, Christine; Wellemans, Isabelle; Hanssens, Laurence; Schelstraete, Petra; Malfroot, Anne; Pierard, Denis

    2013-01-01

    We confirmed that chlorhexidine decontamination yielded more nontuberculous mycobacteria than did the N-acetyl-l-cysteine-NaOH-oxalic acid procedure from respiratory samples of cystic fibrosis patients on solid cultures. However, this improved recovery is mostly balanced if the latter is combined with liquid culture. Furthermore, none of the 145 cough swabs, used to sample young children, cultured positive, suggesting that swabs are low-quality samples. PMID:24048532

  2. Photoacoustic spectroscopy sample array vessels and photoacoustic spectroscopy methods for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.

    2006-02-14

    Methods and apparatus for simultaneous or sequential, rapid analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically positioned near the sample cells. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  3. A comparison of four gravimetric fine particle sampling methods.

    PubMed

    Yanosky, J D; MacIntosh, D L

    2001-06-01

    A study was conducted to compare four gravimetric methods of measuring fine particle (PM2.5) concentrations in air: the BGI, Inc. PQ200 Federal Reference Method PM2.5 (FRM) sampler; the Harvard-Marple Impactor (HI); the BGI, Inc. GK2.05 KTL Respirable/Thoracic Cyclone (KTL); and the AirMetrics MiniVol (MiniVol). Pairs of FRM, HI, and KTL samplers and one MiniVol sampler were collocated and 24-hr integrated PM2.5 samples were collected on 21 days from January 6 through April 9, 2000. The mean and standard deviation of PM2.5 levels from the FRM samplers were 13.6 and 6.8 µg/m3, respectively. Significant systematic bias was found between mean concentrations from the FRM and the MiniVol (1.14 µg/m3, p = 0.0007), the HI and the MiniVol (0.85 µg/m3, p = 0.0048), and the KTL and the MiniVol (1.23 µg/m3, p = 0.0078) according to paired t test analyses. Linear regression on all pairwise combinations of the sampler types was used to evaluate measurements made by the samplers. None of the regression intercepts was significantly different from 0, and only two of the regression slopes were significantly different from 1, that for the FRM and the MiniVol [β1 = 0.91, 95% CI (0.83-0.99)] and that for the KTL and the MiniVol [β1 = 0.88, 95% CI (0.78-0.98)]. Regression R2 terms were 0.96 or greater between all pairs of samplers, and regression root mean square error terms (RMSE) were 1.65 µg/m3 or less. These results suggest that the MiniVol will underestimate measurements made by the FRM, the HI, and the KTL by an amount proportional to PM2.5 concentration. Nonetheless, these results indicate that all of the sampler types are comparable if approximately 10% variation on the mean levels and on individual measurement levels is considered acceptable and the actual concentration is within the range of this study (5-35 µg/m3).
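
    The bias and regression statistics reported above follow a standard recipe for comparing collocated samplers. The Python sketch below reproduces that recipe on made-up data; the numbers, sampler names, and noise model are assumptions for illustration only and are not the study's measurements.

      import numpy as np
      from scipy import stats

      # Hypothetical collocated 24-h PM2.5 measurements (ug/m3) from two samplers
      rng = np.random.default_rng(3)
      frm = rng.uniform(5, 35, size=21)                      # reference-type sampler
      minivol = 0.91 * frm + rng.normal(0, 1.2, size=21)     # collocated comparison sampler

      # Paired t-test for systematic bias between the two collocated samplers
      t_res = stats.ttest_rel(frm, minivol)

      # Ordinary least-squares regression of one sampler on the other, plus RMSE
      reg = stats.linregress(minivol, frm)
      rmse = np.sqrt(np.mean((frm - (reg.intercept + reg.slope * minivol)) ** 2))

      print(f"mean bias = {np.mean(frm - minivol):.2f} ug/m3, paired t p = {t_res.pvalue:.4f}")
      print(f"slope = {reg.slope:.2f}, intercept = {reg.intercept:.2f}, "
            f"R^2 = {reg.rvalue**2:.2f}, RMSE = {rmse:.2f} ug/m3")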

  4. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    PubMed

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence. PMID:18596335

  5. Alternative methods for determining the electrical conductivity of core samples.

    PubMed

    Lytle, R J; Duba, A G; Willows, J L

    1979-05-01

    Electrode configurations are described that can be used in measuring the electrical conductivity of a core sample and that do not require access to the core end faces. The use of these configurations eliminates the need for machining the core ends for placement of end electrodes. This is because the conductivity in the cases described is relatively insensitive to the length of the sample. We validated the measurement technique by comparing mathematical models with actual measurements that were made perpendicular and parallel to the core axis of granite samples.

  6. Field sampling method for quantifying odorants in humid environments.

    PubMed

    Trabue, Steven L; Scoggin, Kenwood D; Li, Hong; Burns, Robert; Xin, Hongwei

    2008-05-15

    Most air quality studies in agricultural environments use thermal desorption analysis for quantifying semivolatile organic compounds (SVOCs) associated with odor. The objective of this study was to develop a robust sampling technique for measuring SVOCs in humid environments. Test atmospheres were generated at ambient temperatures (23 +/- 1.5 degrees C) and 25, 50, and 80% relative humidity (RH). Sorbent material used included Tenax, graphitized carbon, and carbon molecular sieve (CMS). Sorbent tubes were challenged with 2, 4, 8, 12, and 24 L of air at various RHs. Sorbent tubes with CMS material performed poorly at both 50 and 80% RH due to excessive sorption of water. Heating of CMS tubes during sampling or dry-purging of CMS tubes post sampling effectively reduced water sorption, with heating of tubes being preferred due to the higher recovery and reproducibility. Tenax tubes had breakthrough of the more volatile compounds and tended to form artifacts with increasing volumes of air sampled. Graphitized carbon sorbent tubes containing Carbopack X and Carbopack C performed best, with quantitative recovery of all compounds at all RHs and sampling volumes tested. The graphitized carbon tubes were taken to the field for further testing. Field samples taken from inside swine feeding operations showed that butanoic acid, 4-methylphenol, 4-ethylphenol, indole, and 3-methylindole were the compounds detected most often above their odor threshold values. Field samples taken from a poultry facility demonstrated that butanoic acid, 3-methylbutanoic acid, and 4-methylphenol were the compounds above their odor threshold values detected most often. PMID:18546717

  7. Photoacoustic spectroscopy sample array vessel and photoacoustic spectroscopy method for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.; Green, David

    2005-03-29

    Methods and apparatus for analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically coupled with the vessel body. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  8. Method for preconcentrating a sample for subsequent analysis

    DOEpatents

    Zaromb, Solomon

    1990-01-01

    A system for analysis of trace concentration of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  9. Analytical instrument with apparatus and method for sample concentrating

    DOEpatents

    Zaromb, S.

    1986-08-04

    A system for analysis of trace concentrations of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  10. Comparison of Relative Bias, Precision, and Efficiency of Sampling Methods for Natural Enemies of Soybean Aphid (Hemiptera: Aphididae).

    PubMed

    Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W

    2015-06-01

    Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods: quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards to determine the most practical methods for sampling the three most prominent species, which included Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time-by-sampling-method interaction indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggest that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods, and sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean include walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications, when efficiency is not paramount.

  11. Comparison of Relative Bias, Precision, and Efficiency of Sampling Methods for Natural Enemies of Soybean Aphid (Hemiptera: Aphididae).

    PubMed

    Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W

    2015-06-01

    Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods: quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards to determine the most practical methods for sampling the three most prominent species, which included Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time-by-sampling-method interaction indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggest that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods, and sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean include walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications, when efficiency is not paramount. PMID:26470267

  12. Human breath analysis: methods for sample collection and reduction of localized background effects.

    PubMed

    Martin, Audrey N; Farquar, George R; Jones, A Daniel; Frank, Matthias

    2010-01-01

    Solid-phase microextraction (SPME) was applied, in conjunction with gas chromatography-mass spectrometry, to the analysis of volatile organic compounds (VOCs) in human breath samples without requiring exhaled breath condensate collection. A new procedure, exhaled breath vapor (EBV) collection, involving the active sampling and preconcentration of a breath sample with a SPME fiber fitted inside a modified commercial breath-collection device, the RTube, is described. Immediately after sample collection, compounds are desorbed from the SPME fiber at 250 degrees C in the GC-MS injector. Experiments were performed using EBV collected at -80 degrees C and at room temperature, and the results compared to the traditional method of collecting exhaled breath condensate at -80 degrees C followed by passive SPME sampling of the collected condensate. Methods are compared in terms of portability, ease-of-use, speed of analysis, and detection limits. The need for a clean air supply for the study subjects is demonstrated using several localized sources of VOC contaminants including nail polish, lemonade, and gasoline. Various simple methods to supply clean inhaled air to a subject are presented. Chemical exposures are used to demonstrate the importance of providing cleaned air (organic vapor respirator) or an external air source (tubing stretched to a separate room). These techniques allow for facile data interpretation by minimizing background contaminants. It is demonstrated herein that this active SPME breath-sampling device provides advantages in the forms of faster sample collection and data analysis, apparatus portability and avoidance of power or cooling requirements, and performance for sample collection in a contaminated environment. PMID:19844696

  13. On analysis-based two-step interpolation methods for randomly sampled seismic data

    NASA Astrophysics Data System (ADS)

    Yang, Pengliang; Gao, Jinghuai; Chen, Wenchao

    2013-02-01

    Interpolating the missing traces of a regularly or irregularly sampled seismic record is an exceedingly important issue in the geophysical community. Many modern acquisition and reconstruction methods are designed to exploit the transform domain sparsity of the few randomly recorded but informative seismic data using thresholding techniques. In this paper, to regularize randomly sampled seismic data, we introduce two accelerated, analysis-based two-step interpolation algorithms, the analysis-based FISTA (fast iterative shrinkage-thresholding algorithm) and the FPOCS (fast projection onto convex sets) algorithm, derived from the IST (iterative shrinkage-thresholding) and POCS (projection onto convex sets) algorithms, respectively. A MATLAB package is developed for the implementation of these thresholding-related interpolation methods. Based on this package, we compare the reconstruction performance of these algorithms, using synthetic and real seismic data. Combined with several thresholding strategies, the accelerated convergence of the proposed methods is also highlighted.
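
    The basic projection-onto-convex-sets iteration that these accelerated schemes build on alternates a sparsity projection (thresholding in a transform domain) with a data-consistency projection (re-inserting the recorded traces). The Python sketch below illustrates that plain POCS loop with a 2-D FFT and an exponentially decaying hard threshold; it is a simplified stand-in written for this summary, not the authors' MATLAB package, and the synthetic section and threshold schedule are assumptions.

      import numpy as np

      def pocs_interpolate(data, observed, n_iter=50, p=0.99):
          """POCS-style interpolation of randomly sampled traces.
          `observed` is a boolean mask, True where traces were recorded."""
          x = np.where(observed, data, 0.0)
          tau_max = np.abs(np.fft.fft2(x)).max()
          for k in range(n_iter):
              spec = np.fft.fft2(x)
              tau = tau_max * p ** (k + 1)            # exponentially decreasing threshold
              spec[np.abs(spec) < tau] = 0.0          # keep only the strongest coefficients
              x = np.real(np.fft.ifft2(spec))
              x = np.where(observed, data, x)         # restore the recorded samples
          return x

      # usage on a synthetic record: two dipping plane waves with half the traces removed
      nt, nx = 256, 64
      t, ix = np.meshgrid(np.arange(nt), np.arange(nx), indexing="ij")
      section = np.sin(2 * np.pi * (t + 0.8 * ix) / 32.0) + 0.5 * np.sin(2 * np.pi * (t - 1.5 * ix) / 24.0)
      mask = np.zeros((nt, nx), dtype=bool)
      mask[:, np.random.default_rng(7).choice(nx, nx // 2, replace=False)] = True
      recovered = pocs_interpolate(section * mask, mask)
      print("relative error:", np.linalg.norm(recovered - section) / np.linalg.norm(section))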

  14. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    SciTech Connect

    Wen, Haiming; Lin, Yaojun; Seidman, David N.; Schoenung, Julie M.; van Rooyen, Isabella J.; Lavernia, Enrique J.

    2015-09-09

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  15. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE PAGES

    Wen, Haiming; Lin, Yaojun; Seidman, David N.; Schoenung, Julie M.; van Rooyen, Isabella J.; Lavernia, Enrique J.

    2015-09-09

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  16. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    SciTech Connect

    Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  17. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…

  18. Apparatus and method for centrifugation and robotic manipulation of samples

    NASA Technical Reports Server (NTRS)

    Vellinger, John C. (Inventor); Ormsby, Rachel A. (Inventor); Kennedy, David J. (Inventor); Thomas, Nathan A. (Inventor); Shulthise, Leo A. (Inventor); Kurk, Michael A. (Inventor); Metz, George W. (Inventor)

    2007-01-01

    A device for centrifugation and robotic manipulation of specimen samples, including incubating eggs, and uses thereof are provided. The device may advantageously be used for the incubation of avian, reptilian or any type of vertebrate eggs. The apparatus comprises a mechanism for holding samples individually, rotating them individually, rotating them on a centrifuge collectively, injecting them individually with a fixative or other chemical reagent, and maintaining them at controlled temperature, relative humidity and atmospheric composition. The device is applicable to experiments involving entities other than eggs, such as invertebrate specimens, plants, microorganisms and molecular systems.

  19. Effect of sampling method on measured concentrations of sulfide and ammonia in sediment toxicity tests

    SciTech Connect

    Phillips, B.M.; Anderson, B.S.; Hunt, J.W.

    1994-12-31

    Sulfide and ammonia are natural components of marine sediments which may occur in concentrations toxic to marine organisms. Because these compounds are toxic, it is important to measure them accurately to determine their influence on toxicity test results. Standard solid phase test protocols may not adequately address sampling methodology for ammonia and sulfide analysis. Samples are commonly taken from overlying water in test containers, which may not adequately characterize the medium to which test animals are exposed. As part of research conducted under the California State Water Resources Control Board's Bay Protection and Toxic Cleanup Program, the authors are investigating alternative sampling methods to more accurately characterize sulfide and ammonia in sediments. Measurements taken from water overlying test sediment are compared to those taken from interstitial water in tests using Neanthes and Rhepoxynius. Pre-test interstitial samples are extracted from sediment using centrifugation. Final measurements are made on water centrifuged from sediment in an additional laboratory replicate. Oxidation can affect the measurement of both constituents, therefore efforts are made to reduce oxidation by centrifuging with no head space. Ammonia is analyzed immediately using an ion specific electrode, and sulfide samples are preserved for spectrophotometric analysis. In preliminary studies sulfide concentrations were 13 times higher and ammonia concentrations 4 times higher in the interstitial water than in samples taken from overlying water. Results will be discussed in terms of sulfide and ammonia toxicity and possible ways of improving sampling methodology.

  20. Improved sample management in the cylindrical-tube microelectrophoresis method

    NASA Technical Reports Server (NTRS)

    Smolka, A. J. K.

    1980-01-01

    A modification to an analytical microelectrophoresis system is described that improves the manipulation of the sample particles and fluid. The apparatus modification and improved operational procedure should yield more accurate measurements of particle mobilities and permit less skilled operators to use the apparatus.

  1. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... protocols listed below, for sampling waste with properties similar to the indicated materials, will be considered by the Agency to be representative of the waste. Extremely viscous liquid—ASTM Standard D140-70...-like material—ASTM Standard D1452-65 Fly Ash-like material—ASTM Standard D2234-76 Containerized...

  2. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... protocols listed below, for sampling waste with properties similar to the indicated materials, will be considered by the Agency to be representative of the waste. Extremely viscous liquid—ASTM Standard D140-70...-like material—ASTM Standard D1452-65 Fly Ash-like material—ASTM Standard D2234-76 Containerized...

  3. Comparison of immunohistochemical, histochemical and immunochemical methods for the detection of wheat protein allergens in meat samples and cooked, dry, raw and fermented sausage samples.

    PubMed

    Lukášková, Z Řezáčová; Tremlová, B; Pospiech, M; Renčová, E; Randulová, Z; Steinhauser, L; Reichová, A; Bednář, J

    2011-01-01

    Nowadays it is common practice to add vegetable protein in the production of meat products. Because of the possible substitution of high-quality raw meat with vegetable protein without labelling it on the product package, and because of the allergenic potential of many vegetable proteins, it is important to develop accurate methods for its detection. The objective of the study was to compare histochemical, immunochemical (ELISA, ALERT gliadin screening test) and immunohistochemical methods for the detection of wheat protein in meat samples and sausages. Histochemical methods were useful for the detection of flour in meat samples, but the immunohistochemical method was better for the detection of wheat protein. The ALERT gliadin screening test detected gliadin from 10 mg kg(-1), while the immunohistochemical method detected wheat protein concentrations from 1 g kg(-1) and the ELISA method detected wheat protein concentrations from 4 g kg(-1). The ALERT gliadin screening test showed results within 1 day, whilst the ELISA detection method took 2 days, and the immunohistochemical procedure took 5 days at the soonest, all including sample preparation. This study also focused on optimisation of an immunohistochemical method for samples of cooked sausage. In addition, three samples were sufficient for wheat protein detection at a concentration of 1 g kg(-1) (and greater) with a confidence level greater than 95%.

  4. A method for estimating population sex ratio for sage-grouse using noninvasive genetic samples.

    PubMed

    Baumgardt, J A; Goldberg, C S; Reese, K P; Connelly, J W; Musil, D D; Garton, E O; Waits, L P

    2013-05-01

    Population sex ratio is an important metric for wildlife management and conservation, but estimates can be difficult to obtain, particularly for sexually monomorphic species or for species that differ in detection probability between the sexes. Noninvasive genetic sampling (NGS) using polymerase chain reaction (PCR) has become a common method for identifying sex from sources such as hair, feathers or faeces, and is a potential source for estimating sex ratio. If, however, PCR success is sex-biased, naively using NGS could lead to a biased sex ratio estimator. We measured PCR success rates and error rates for amplifying the W and Z chromosomes from greater sage-grouse (Centrocercus urophasianus) faecal samples, examined how success and error rates for sex identification changed in response to faecal sample exposure time, and used simulation models to evaluate precision and bias of three sex assignment criteria for estimating population sex ratio with variable sample sizes and levels of PCR replication. We found PCR success rates were higher for females than males and that choice of sex assignment criteria influenced the bias and precision of corresponding sex ratio estimates. Our simulations demonstrate the importance of considering the interplay between the sex bias of PCR success, number of genotyping replicates, sample size, true population sex ratio and accuracy of assignment rules for designing future studies. Our results suggest that using faecal DNA for estimating the sex ratio of sage-grouse populations has great potential and, with minor adaptations and similar marker evaluations, should be applicable to numerous species.
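    The abstract's point that sex-biased PCR success can bias a naive sex-ratio estimator can be illustrated with a toy simulation. The sketch below is not the authors' model; the sample size, success probabilities and function name are invented for demonstration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def simulate_naive_sex_ratio(n_samples, true_prop_female,
                                 p_success_female, p_success_male, n_reps=1000):
        """Average naive estimate of the female proportion over many simulated surveys."""
        estimates = []
        for _ in range(n_reps):
            is_female = rng.random(n_samples) < true_prop_female
            p_success = np.where(is_female, p_success_female, p_success_male)
            amplified = rng.random(n_samples) < p_success        # sex-biased PCR success
            sexed = is_female[amplified]                          # only amplified samples get sexed
            if sexed.size:
                estimates.append(sexed.mean())
        return np.mean(estimates)

    # Higher female PCR success inflates the apparent proportion of females.
    print(simulate_naive_sex_ratio(200, 0.5, 0.9, 0.7))  # ~0.56 rather than the true 0.50
    ```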

  5. Exploring biomolecular dynamics and interactions using advanced sampling methods

    NASA Astrophysics Data System (ADS)

    Luitz, Manuel; Bomblies, Rainer; Ostermeir, Katja; Zacharias, Martin

    2015-08-01

    Molecular dynamics (MD) and Monte Carlo (MC) simulations have emerged as a valuable tool to investigate the statistical mechanics and kinetics of biomolecules and synthetic soft-matter materials. However, major limitations for routine applications are due to the accuracy of the molecular mechanics force field and to the maximum simulation time that can be achieved in current simulation studies. To improve the sampling, a number of advanced sampling approaches have been designed in recent years. In particular, variants of the parallel tempering replica-exchange methodology are widely used in many simulation studies. Recent methodological advancements and a discussion of specific aims and advantages are given. This includes improved free energy simulation approaches and conformational search applications.
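    At the core of the replica-exchange (parallel tempering) methodology mentioned above is a Metropolis-style test for swapping configurations between two replicas held at neighbouring temperatures. The snippet below is a minimal sketch of that acceptance test only, not a full MD/MC engine; the function name and the reduced units (k_B = 1) are assumptions.

    ```python
    import math
    import random

    def attempt_swap(energy_i, energy_j, temp_i, temp_j, rng=random.random):
        """Return True if the replicas at temp_i and temp_j should exchange configurations."""
        beta_i, beta_j = 1.0 / temp_i, 1.0 / temp_j
        delta = (beta_i - beta_j) * (energy_j - energy_i)
        # Metropolis criterion: always accept if delta <= 0, otherwise accept with exp(-delta).
        return delta <= 0.0 or rng() < math.exp(-delta)
    ```

    In a full simulation this test would be applied periodically to adjacent temperature pairs, and accepted swaps would exchange the replicas' coordinates (or, equivalently, their temperatures).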

  6. Comparison of methods for sampling plant bugs on cotton in South Texas (2010)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A total of 26 cotton fields were sampled by experienced and inexperienced samplers at 3 growth stages using 5 methods to compare the most efficient and accurate method for sampling plant bugs in cotton. Each of the 5 methods had its own distinct advantages and disadvantages as a sampling method (too...

  7. A new method of snowmelt sampling for water stable isotopes

    USGS Publications Warehouse

    Penna, D.; Ahmad, M.; Birks, S. J.; Bouchaou, L.; Brencic, M.; Butt, S.; Holko, L.; Jeelani, G.; Martinez, D. E.; Melikadze, G.; Shanley, J.B.; Sokratov, S. A.; Stadnyk, T.; Sugimoto, A.; Vreca, P.

    2014-01-01

    We modified a passive capillary sampler (PCS) to collect snowmelt water for isotopic analysis. Past applications of PCSs have been to sample soil water, but the novel aspect of this study was the placement of the PCSs at the ground-snowpack interface to collect snowmelt. We deployed arrays of PCSs at 11 sites in ten partner countries on five continents representing a range of climate and snow cover worldwide. The PCS reliably collected snowmelt at all sites and caused negligible evaporative fractionation effects in the samples. PCS is low-cost, easy to install, and collects a representative integrated snowmelt sample throughout the melt season or at the melt event scale. Unlike snow cores, the PCS collects the water that would actually infiltrate the soil; thus, its isotopic composition is appropriate to use for tracing snowmelt water through the hydrologic cycle. The purpose of this Briefing is to show the potential advantages of PCSs and recommend guidelines for constructing and installing them based on our preliminary results from two snowmelt seasons.

  8. Effects of Heterogeneities, Sampling Frequencies, Tools and Methods on Uncertainties in Subsurface Contaminant Concentration Measurements

    NASA Astrophysics Data System (ADS)

    Ezzedine, S. M.; McNab, W. W.

    2007-12-01

    Long-term monitoring (LTM) is particularly important for contaminants which are mitigated by natural processes of dilution, dispersion, and degradation. At many sites, LTM can require decades of expensive sampling at tens or even hundreds of existing monitoring wells, resulting in hundreds of thousands or millions of dollars per year for sampling and data management. Therefore, contaminant sampling tools, methods and frequencies are chosen to minimize waste and data management costs while ensuring a reliable and informative time-history of contaminant measurement for regulatory compliance. The interplay between cause (i.e. subsurface heterogeneities, sampling techniques, measurement frequencies) and effect (unreliable data and measurement gaps) has been overlooked in many field applications, which can lead to inconsistencies in the time-histories of contaminant samples. In this study we address the relationship between cause and effect for different hydrogeological sampling settings: porous and fractured media. A numerical model has been developed using AMR-FEM to solve the physicochemical processes that take place in the aquifer and the monitoring well. In the latter, the flow is governed by the Navier-Stokes equations while in the former the flow is governed by the diffusivity equation; both are fully coupled to mimic stressed conditions and to assess the effect of the dynamic sampling tool on the formation surrounding the monitoring well. First of all, different sampling tools (i.e., Easy Pump, Snapper Grab Sampler) were simulated in a monitoring well screened in different homogeneous layered aquifers to assess their effect on the sampling measurements. Secondly, in order to make the computer runs more CPU-efficient, the flow in the monitoring well was replaced by its counterpart flow in porous media with infinite permeability, and the new model was used to simulate the effect of heterogeneities, sampling depth, sampling tool and sampling frequencies on the

  9. Capture-recapture and removal methods for sampling closed populations

    USGS Publications Warehouse

    White, Gary C.; Anderson, David R.; Burnham, Kenneth P.; Otis, David L.

    1982-01-01

    The problem of estimating animal abundance is common in wildlife management and environmental impact assessment. Capture-recapture and removal methods are often used to estimate population size. Statistical Inference From Capture Data On Closed Animal Populations, a monograph by Otis et al. (1978), provides a comprehensive synthesis of much of the wildlife and statistical literature on the methods, as well as some extensions of the general theory. In our primer, we focus on capture-recapture and removal methods for trapping studies in which a population is assumed to be closed and do not treat open-population models, such as the Jolly-Seber model, or catch-effort methods in any detail. The primer, written for students interested in population estimation, is intended for use with the more theoretical monograph.
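    The monograph covers a family of closed-population models; as a minimal, hypothetical illustration of the simplest two-occasion case, the sketch below computes the Chapman-corrected Lincoln-Petersen abundance estimate and its variance. The capture counts are invented and the helper name is an assumption.

    ```python
    def chapman_estimate(n1, n2, m2):
        """n1: animals marked on occasion 1; n2: caught on occasion 2; m2: marked recaptures."""
        n_hat = (n1 + 1) * (n2 + 1) / (m2 + 1) - 1
        variance = ((n1 + 1) * (n2 + 1) * (n1 - m2) * (n2 - m2)
                    / ((m2 + 1) ** 2 * (m2 + 2)))
        return n_hat, variance

    # Hypothetical trapping study: 120 marked, 100 caught on the second occasion, 30 recaptures.
    n_hat, var = chapman_estimate(n1=120, n2=100, m2=30)
    print(round(n_hat), round(var ** 0.5))  # point estimate and its standard error
    ```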

  10. A method for reducing sampling jitter in digital control systems

    NASA Technical Reports Server (NTRS)

    Anderson, T. O.; Hurd, W. J.

    1969-01-01

    A digital phase-locked loop system is designed by smoothing the proportional control with a low-pass filter. This method does not significantly affect the loop dynamics when the smoothing filter bandwidth is wide compared to the loop bandwidth.
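    As a rough illustration of the idea (not the authors' design), the sketch below applies a wide-band single-pole low-pass filter to the proportional branch of a discrete-time proportional-plus-integral loop filter before it is combined with the integral branch; the gains and the smoothing coefficient are placeholder values.

    ```python
    def make_smoothed_pi_controller(kp=0.1, ki=0.01, alpha=0.5):
        """alpha in (0, 1]; larger alpha means a wider smoothing-filter bandwidth."""
        state = {"lp": 0.0, "integral": 0.0}

        def step(phase_error):
            # Single-pole low-pass on the proportional branch reduces sampling jitter
            # without materially changing the loop dynamics when alpha is large.
            state["lp"] += alpha * (phase_error - state["lp"])
            state["integral"] += ki * phase_error
            return kp * state["lp"] + state["integral"]

        return step

    # Usage: feed successive phase-error samples and apply the returned control value.
    controller = make_smoothed_pi_controller()
    control = controller(0.2)
    ```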

  11. Rapid methods to detect organic mercury and total selenium in biological samples

    PubMed Central

    2011-01-01

    Background Organic mercury (Hg) is a global pollutant of concern and selenium is believed to afford protection against mercury risk though few approaches exist to rapidly assess both chemicals in biological samples. Here, micro-scale and rapid methods to detect organic mercury (< 1.5 ml total sample volume, < 1.5 hour) and total selenium (Se; < 3.0 ml total volume, < 3 hour) from a range of biological samples (10-50 mg) are described. Results For organic Hg, samples are digested using Tris-HCl buffer (with sequential additions of protease, NaOH, cysteine, CuSO4, acidic NaBr) followed by extraction with toluene and Na2S2O3. The final product is analyzed via commercially available direct/total mercury analyzers. For Se, a fluorometric assay has been developed for microplate readers that involves digestion (HNO3-HClO4 and HCl), conjugation (2,3-diaminonaphthalene), and cyclohexane extraction. Recovery of organic Hg (86-107%) and Se (85-121%) were determined through use of Standard Reference Materials and lemon shark kidney tissues. Conclusions The approaches outlined provide an easy, rapid, reproducible, and cost-effective platform for monitoring organic Hg and total Se in biological samples. Owing to the importance of organic Hg and Se in the pathophysiology of Hg, integration of such methods into established research monitoring efforts (that largely focus on screening total Hg only) will help increase understanding of Hg's true risks. PMID:21232132

  12. Evaluation of sample pretreatment methods for analysis of polonium isotopes in herbal medicines.

    PubMed

    Sreejith, Sathyapriya R; Nair, Madhu G; Rao, D D

    2014-12-01

    Herbal infusions like ayurvedic aristas are widely consumed by the Indian population for good health. With increasing awareness about radiological assessment, an effort was made to assess the radioactivity concentration of naturally occurring radionuclides in herbal medicines. (210)Po is an important alpha particle emitter contributing to the internal dose to man from ingestion. Though (210)Po can be spontaneously deposited on a silver disk for alpha spectrometric measurements with few radiochemical steps, great care has to be taken during the sample pretreatment step owing to the high volatility of polonium even at low temperatures. The aim of the study was to evaluate an appropriate sample pretreatment method for estimation of polonium in herbal medicines. (209)Po was used for radiochemical yield calculation. Conventional open vessel wet ashing, physical evaporation, freeze-drying and microwave digestion in a Teflon vessel were examined. The recovery ranged between 9 and 79%. The lowest recovery was obtained for the samples that were processed by open vessel digestion without any volume reduction. The recoveries were comparable for samples that were freeze-dried and subjected to HNO3 + HClO4 + H2O2 + HF acid digestion and for microwave-digested samples. (210)Po concentration in the samples ranged from 11.3 to 39.6 mBq/L.
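    The (209)Po tracer makes the measurement self-correcting for chemical losses, because both isotopes experience the same recovery and detector efficiency. The sketch below is a hypothetical back-calculation of the (210)Po activity concentration from alpha-spectrometry counts; every count, time and activity value is invented for illustration.

    ```python
    def po210_activity(counts_210, counts_209, count_time_s,
                       tracer_added_bq, sample_volume_l):
        """Yield-corrected (210)Po activity concentration from alpha-spectrometry counts."""
        # The 209Po count rate divided by the added tracer activity gives the product of
        # chemical recovery and detector efficiency, which cancels for the 210Po result.
        recovery_times_efficiency = (counts_209 / count_time_s) / tracer_added_bq
        activity_210_bq = (counts_210 / count_time_s) / recovery_times_efficiency
        return recovery_times_efficiency, activity_210_bq / sample_volume_l

    # Invented example: 0.05 Bq of 209Po tracer added to a 0.5 L aliquot, ~22 h count.
    factor, conc_bq_per_l = po210_activity(300, 700, 80000, 0.05, 0.5)
    print(factor, conc_bq_per_l)  # ~0.18 and ~0.043 Bq/L (about 43 mBq/L) for these numbers
    ```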

  13. Evaluation of sample pretreatment methods for analysis of polonium isotopes in herbal medicines.

    PubMed

    Sreejith, Sathyapriya R; Nair, Madhu G; Rao, D D

    2014-12-01

    Herbal infusions like ayurvedic aristas are widely consumed by the Indian population for good health. With increasing awareness about radiological assessment, an effort was made to assess the radioactivity concentration of naturally occurring radionuclides in herbal medicines. (210)Po is an important alpha particle emitter contributing to the internal dose to man from ingestion. Though (210)Po can be spontaneously deposited on a silver disk for alpha spectrometric measurements with few radiochemical steps, great care has to be taken during the sample pretreatment step owing to the high volatility of polonium even at low temperatures. The aim of the study was to evaluate an appropriate sample pretreatment method for estimation of polonium in herbal medicines. (209)Po was used for radiochemical yield calculation. Conventional open vessel wet ashing, physical evaporation, freeze-drying and microwave digestion in a Teflon vessel were examined. The recovery ranged between 9 and 79%. The lowest recovery was obtained for the samples that were processed by open vessel digestion without any volume reduction. The recoveries were comparable for samples that were freeze-dried and subjected to HNO3 + HClO4 + H2O2 + HF acid digestion and for microwave-digested samples. (210)Po concentration in the samples ranged from 11.3 to 39.6 mBq/L. PMID:25176601

  14. Ant colony optimization as a method for strategic genotype sampling.

    PubMed

    Spangler, M L; Robbins, K R; Bertrand, J K; Macneil, M; Rekaya, R

    2009-06-01

    A simulation study was carried out to develop an alternative method of selecting animals to be genotyped. Simulated pedigrees included 5000 animals, each assigned genotypes for a bi-allelic single nucleotide polymorphism (SNP) based on assumed allelic frequencies of 0.7/0.3 and 0.5/0.5. In addition to simulated pedigrees, two beef cattle pedigrees, one from field data and the other from a research population, were used to test selected methods using simulated genotypes. The proposed method of ant colony optimization (ACO) was evaluated based on the number of alleles correctly assigned to ungenotyped animals (AK(P)), the probability of assigning true alleles (AK(G)) and the probability of correctly assigning genotypes (APTG). The proposed animal selection method of ant colony optimization was compared to selection using the diagonal elements of the inverse of the relationship matrix (A(-1)). Comparisons of these two methods showed that ACO yielded an increase in AK(P) ranging from 4.98% to 5.16% and an increase in APTG from 1.6% to 1.8% using simulated pedigrees. Gains in field data and research pedigrees were slightly lower. These results suggest that ACO can provide a better genotyping strategy, when compared to A(-1), with different pedigree sizes and structures. PMID:19220227
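    Ant colony optimization is a stochastic search in which candidate solutions are built probabilistically from a pheromone table that is reinforced toward good solutions. The sketch below is a generic subset-selection ACO, not the authors' implementation; the scoring function, colony size and pheromone parameters are all placeholders.

    ```python
    import random

    def aco_select(n_animals, subset_size, score, n_ants=20, n_iter=50,
                   evaporation=0.1, rng=random):
        """Select a subset of animals that maximizes a user-supplied score()."""
        pheromone = [1.0] * n_animals
        best_set, best_score = None, float("-inf")
        for _ in range(n_iter):
            for _ in range(n_ants):
                chosen = set()
                while len(chosen) < subset_size:
                    # Sample the next animal in proportion to pheromone, skipping chosen ones.
                    weights = [0.0 if i in chosen else pheromone[i] for i in range(n_animals)]
                    chosen.add(rng.choices(range(n_animals), weights=weights)[0])
                s = score(chosen)
                if s > best_score:
                    best_set, best_score = set(chosen), s
            # Evaporate pheromone, then reinforce the animals in the best set found so far.
            pheromone = [(1 - evaporation) * p for p in pheromone]
            for i in best_set:
                pheromone[i] += evaporation
        return best_set, best_score
    ```

    In the genotype-sampling context, score(chosen) would stand in for a pedigree-based criterion such as the expected number of alleles correctly assigned to ungenotyped relatives.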

  15. A Study of Tapered Beard Sampling Method as Used in HVI

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Beard method is used for sampling cotton fibers to generate fibrograms from which length parameters can be obtained. It is the sampling method used by HVI. HVI uses a fiber comb to sample cotton fibers and form a fiber beard for measuring fiber length parameters. A fundamental issue about this sampl...

  16. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    SciTech Connect

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  17. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  18. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  19. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  20. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  1. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  2. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  3. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  4. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  5. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  6. The origin and radiation of Macaronesian beetles breeding in Euphorbia: the relative importance of multiple data partitions and population sampling.

    PubMed

    Jordal, Bjarte H; Hewitt, Godfrey M

    2004-10-01

    Species-level phylogenies derived from many independent character sources and wide geographical sampling provide a powerful tool in assessing the importance of various factors associated with cladogenesis. In this study, we explore the relative importance of insular isolation and host plant switching in the diversification of a group of bark beetles (Curculionidae: Scolytinae) feeding and breeding in woody Euphorbia spurges. All species in the genus Aphanarthrum are each associated with only one species group of Euphorbia (succulents or one of three different arborescent groups), and the majority of species are endemic to one or several of the Macaronesian Islands. Hence, putative mechanisms of speciation could be assessed by identifying pairs of sister species in a phylogenetic analysis. We used DNA sequences from two nuclear and two mitochondrial genes, and morphological characters, to reconstruct the genealogical relationships among 92 individuals of 25 species and subspecies of Aphanarthrum and related genera. A stable tree topology was highly dependent on multiple character sources, but much less so on wide population sampling. However, multiple samples per species demonstrated one case of species paraphyly, as well as deep coalescence among three putative subspecies pairs. The phylogenetic analyses consistently placed the arborescent-breeding and West African-Lanzarote-distributed species A. armatum in the most basal position in Aphanarthrum, rendering this genus paraphyletic with respect to Coleobothrus. Two major radiations followed, one predominantly African lineage of succulent-feeding species, and one island radiation associated with arborescent host plants. Sister comparisons showed that most recent divergences occurred in allopatry on closely related hosts, with subsequent expansions obscuring more ancient events. Only 6 out of 24 cladogenetic events were associated with host switching, rendering geographical factors more important in recent

  7. In-syringe reversed dispersive liquid-liquid microextraction for the evaluation of three important bioactive compounds of basil, tarragon and fennel in human plasma and urine samples.

    PubMed

    Barfi, Azadeh; Nazem, Habibollah; Saeidi, Iman; Peyrovi, Moazameh; Afsharzadeh, Maryam; Barfi, Behruz; Salavati, Hossein

    2016-03-20

    In the present study, an efficient and environmentally friendly method (called in-syringe reversed dispersive liquid-liquid microextraction (IS-R-DLLME)) was developed to extract three important components (i.e. para-anisaldehyde, trans-anethole and its isomer estragole) simultaneously from different plant extracts (basil, fennel and tarragon), human plasma and urine samples prior to their determination using high-performance liquid chromatography. The importance of choosing these plant extracts as samples stems from the dual roles of their bioactive compounds (trans-anethole and estragole), which can positively or negatively alter different cellular processes, and from the need for a simple and efficient method for the extraction and sensitive determination of these compounds in the mentioned samples. Under the optimum conditions (extraction solvent: 120 μL of n-octanol; dispersive solvent: 600 μL of acetone; collecting solvent: 1000 μL of acetone; sample pH 3; no salt), the limits of detection (LODs), linear dynamic ranges (LDRs) and recoveries (R) were 79-81 ng mL(-1), 0.26-6.9 μg mL(-1) and 94.1-99.9%, respectively. The obtained results showed that IS-R-DLLME is a simple, fast and sensitive method with low consumption of extraction solvent which provides high recovery under the optimum conditions. The present method was applied to investigate the absorbed amounts of the mentioned analytes by determining the analytes before (in the plant extracts) and after (in the human plasma and urine samples) consumption, which can establish the toxicity levels of the analytes (on the basis of their dosages) in the extracts. PMID:26802527

  8. In-syringe reversed dispersive liquid-liquid microextraction for the evaluation of three important bioactive compounds of basil, tarragon and fennel in human plasma and urine samples.

    PubMed

    Barfi, Azadeh; Nazem, Habibollah; Saeidi, Iman; Peyrovi, Moazameh; Afsharzadeh, Maryam; Barfi, Behruz; Salavati, Hossein

    2016-03-20

    In the present study, an efficient and environmentally friendly method (called in-syringe reversed dispersive liquid-liquid microextraction (IS-R-DLLME)) was developed to extract three important components (i.e. para-anisaldehyde, trans-anethole and its isomer estragole) simultaneously from different plant extracts (basil, fennel and tarragon), human plasma and urine samples prior to their determination using high-performance liquid chromatography. The importance of choosing these plant extracts as samples stems from the dual roles of their bioactive compounds (trans-anethole and estragole), which can positively or negatively alter different cellular processes, and from the need for a simple and efficient method for the extraction and sensitive determination of these compounds in the mentioned samples. Under the optimum conditions (extraction solvent: 120 μL of n-octanol; dispersive solvent: 600 μL of acetone; collecting solvent: 1000 μL of acetone; sample pH 3; no salt), the limits of detection (LODs), linear dynamic ranges (LDRs) and recoveries (R) were 79-81 ng mL(-1), 0.26-6.9 μg mL(-1) and 94.1-99.9%, respectively. The obtained results showed that IS-R-DLLME is a simple, fast and sensitive method with low consumption of extraction solvent which provides high recovery under the optimum conditions. The present method was applied to investigate the absorbed amounts of the mentioned analytes by determining the analytes before (in the plant extracts) and after (in the human plasma and urine samples) consumption, which can establish the toxicity levels of the analytes (on the basis of their dosages) in the extracts.

  9. Several methods for concentrating bacteria in fluid samples

    NASA Technical Reports Server (NTRS)

    Thomas, R. R.

    1976-01-01

    The sensitivities of the firefly luciferase-ATP flow system and the luminol flow system were established as 300,000 E. coli per milliliter and 10,000 E. coli per milliliter, respectively. To achieve the detection limit of 1,000 bacteria per milliliter previously established, a method of concentrating microorganisms using a Sartorius membrane filter system is investigated. Catalase in 50% ethanol is found to be a stable luminol standard and can be used up to 24 hours with only a 10% loss of activity. The luminol reagent is also stable over a 24 hour period. A method of preparing relatively inexpensive luciferase from desiccated firefly tails is developed.

  10. Comparison of Gingival Crevicular Fluid Sampling Methods in Patients with Severe Chronic Periodontitis

    PubMed Central

    Guentsch, Arndt; Kramesberger, Martin; Sroka, Aneta; Pfister, Wolfgang; Potempa, Jan; Eick, Sigrun

    2011-01-01

    Background Analysis of samples from periodontal pockets is important in diagnosis and therapy control of periodontitis. In this study, three different sampling techniques were compared to determine if one method can yield samples suitable for reproducible and simultaneous determination of bacterial load, cytokines, neutrophil elastase, and Arg-specific gingipains. R-gingipains are an important virulence factor of Porphyromonas gingivalis, the exact concentration of which in gingival crevicular fluid (GCF) has not yet been quantified. Methods GCF was sampled from four sites per patient (two sites per method) in 36 chronic periodontitis patients. One week later, the procedure was repeated with alternative methods. The variables determined were: loads of Aggregatibacter actinomycetemcomitans and P. gingivalis, levels of interleukin-6 and interleukin-8, activity of neutrophil elastase and level of R-gingipains. Results The detected cytokine levels were higher using paper strips compared to paper points. Bacteria were found in similar loads from the paper strips and paper points. R-gingipains were detectable in high quantities only by washing of the periodontal pocket. The level of R-gingipains correlated with the load of P. gingivalis. Conclusion The use of paper strips is suitable for simultaneous determination of microbial and immunological parameters. Obtaining GCF by washing can be useful for special purposes. Gingipain concentration in periodontal pockets was directly determined to be up to 1.5 μM. This value indicates that most of the substrates of these proteases identified so far by in vitro assays can be easily degraded in P. gingivalis-infected sites. PMID:21235330

  11. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  12. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  13. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  14. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  15. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural gas... June 23, 1987 and revised on April 18, 1991. This document is available on EPA's Web site at...

  16. TESTING METHODS FOR DETECTION OF CRYPTOSPORIDIUM SPP. IN WATER SAMPLES

    EPA Science Inventory

    A large waterborne outbreak of cryptosporidiosis in Milwaukee, Wisconsin, U.S.A. in 1993 prompted a search for ways to prevent large-scale waterborne outbreaks of protozoan parasitoses. Methods for detecting Cryptosporidium parvum play an integral role in strategies that lead to...

  17. A New IRT-Based Small Sample DIF Method.

    ERIC Educational Resources Information Center

    Tang, Huixing

    This paper describes an item response theory (IRT) based method of differential item functioning (DIF) detection that involves neither separate calibration nor ability grouping. IRT is used to generate residual scores, scores free of the effects of person or group ability and item difficulty. Analysis of variance is then used to test the group…

  18. COMPARISON OF LARGE RIVER SAMPLING METHODS ON ALGAL METRICS

    EPA Science Inventory

    We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principal Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...

  19. COMPARISON OF LARGE RIVER SAMPLING METHOD USING DIATOM METRICS

    EPA Science Inventory

    We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principal Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...

  20. High-throughput liquid-absorption preconcentrator sampling methods

    DOEpatents

    Zaromb, Solomon

    1994-01-01

    A system for detecting trace concentrations of an analyte in air includes a preconcentrator for the analyte and an analyte detector. The preconcentrator includes an elongated tubular container comprising a wettable material. The wettable material is continuously wetted with an analyte-sorbing liquid which flows from one part of the container to a lower end. Sampled air flows through the container in contact with the wetted material with a swirling motion which results in efficient transfer of analyte vapors or aerosol particles to the sorbing liquid and preconcentration of traces of analyte in the liquid. The preconcentrated traces of analyte may be either detected within the container or removed therefrom for injection into a separate detection means or for subsequent analysis.

  1. Comparison of methods for the quantification of carbonate carbon in atmospheric PM10 aerosol samples

    NASA Astrophysics Data System (ADS)

    Jankowski, Nicole; Schmidl, Christoph; Marr, Iain L.; Bauer, Heidi; Puxbaum, Hans

    Carbonate carbon (CC) represents an important fraction of atmospheric PM10, along with organic carbon (OC) and elemental carbon (EC), if specific sources (e.g. street abrasion, construction sites, desert dust) contribute to its composition. However, analytical methods for an easy and unambiguous determination of CC in atmospheric aerosols collected on filter matrices are scarce. We propose here a method for the determination of CC based on a heating pretreatment of the sample to remove OC and EC, followed by a total carbon determination to measure CC. This procedure is used for the correction of EC also determined by a heating pretreatment (Cachier, H., Bremond, M.P., Buat-Ménard, P., 1989. Determination of atmospheric soot carbon with a simple thermal method. Tellus 41B, 379-390) but without previous HCl fumigation, as proposed. The carbon remaining after the proposed thermal treatment at 460 °C for 60 min in an oxygen stream showed good correlation with the carbonate carbon derived by calculation from the ionic balance for ambient air and street dust samples. Using the "three step" combustion technique it is now possible to determine OC, EC and CC by the use of a TC analyser in the concentration range of 2-200 μg carbon per sample aliquot, with good precision (3-5% RSD for TC and 5-10% for CC) and accuracy. In ambient air samples from a sampling site in Vienna with elevated PM10 levels ("Liesing"), CC values as high as 25% of TC and 27% CO3(2-), and for street dust samples 32% of TC and 25% CO3(2-) of total PM10 mass, were observed.

  2. Twenty-four cases of imported zika virus infections diagnosed by molecular methods.

    PubMed

    Alejo-Cancho, Izaskun; Torner, Nuria; Oliveira, Inés; Martínez, Ana; Muñoz, José; Jane, Mireia; Gascón, Joaquim; Requena-Méndez, Ana; Vilella, Anna; Marcos, M Ángeles; Pinazo, María Jesús; Gonzalo, Verónica; Rodriguez, Natalia; Martínez, Miguel J

    2016-10-01

    Zika virus is an emerging flavivirus widely spreading through Latin America. Molecular diagnosis of the infection can be performed using serum, urine and saliva samples, although a well-defined diagnostic algorithm is not yet established. We describe a series of 24 cases of Zika virus infection imported into Catalonia (northeastern Spain). Based on our findings, testing of paired serum and urine samples is recommended.

  3. RAPID METHOD FOR PLUTONIUM, AMERICIUM AND CURIUM IN VERY LARGE SOIL SAMPLES

    SciTech Connect

    Maxwell, S

    2007-01-08

    The analysis of actinides in environmental soil and sediment samples is very important for environmental monitoring. There is a need to measure actinide isotopes with very low detection limits. A new, rapid actinide separation method has been developed and implemented that allows the measurement of plutonium, americium and curium isotopes in very large soil samples (100-200 g) with high chemical recoveries and effective removal of matrix interferences. This method uses stacked TEVA Resin®, TRU Resin® and DGA-Resin® cartridges from Eichrom Technologies (Darien, IL, USA) that allow the rapid separation of plutonium (Pu), americium (Am), and curium (Cm) using a single multistage column combined with alpha spectrometry. The method combines an acid leach step and innovative matrix removal using cerium fluoride precipitation to remove the difficult soil matrix. This method is unique in that it provides high tracer recoveries and effective removal of interferences with small extraction chromatography columns instead of large ion exchange resin columns that generate large amounts of acid waste. By using vacuum box cartridge technology with rapid flow rates, sample preparation time is minimized.

  4. Discharge measurement with salt dilution method in irrigation canals: direct sampling and geophysical controls

    NASA Astrophysics Data System (ADS)

    Comina, C.; Lasagna, M.; De Luca, D. A.; Sambuelli, L.

    2013-08-01

    An important starting point for designing management improvements, particularly in irrigation areas, is to record the baseline state of the water resources, including the amount of discharge from canals. In this respect, discharge measurement by means of the salt dilution method is a traditional and well-documented technique. However, this methodology can be strongly influenced by the natural streaming characteristics of the canal (e.g. laminar vs. turbulent flow), and careful attention must be paid to the choice of both the measuring section and the length of the measuring reach of the canal, which can affect the plume shape. Knowledge of the plume distribution in the measuring cross-section is of primary importance for correctly locating the sampling points and obtaining a reliable measurement. To this end, geophysical imaging of an NaCl plume from a slug-injection salt dilution test has been performed in this paper by means of cross-flow fast electric resistivity tomography (FERT) in a real case history. Direct sampling of the same plume has also been performed with a multisampling optimization technique to obtain an average value over the measuring section by simultaneously sampling water at nine points. Results show that a correct visualization of the passage of the salt plume is possible by means of geophysical controls and that this can potentially help in the correct location of sampling points.
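    For the slug-injection variant of the salt dilution method, discharge follows from the injected tracer mass divided by the time integral of the concentration rise above background at the measuring section. The sketch below works through that calculation on a synthetic breakthrough curve; the injected mass, background level and curve shape are invented for illustration.

    ```python
    import numpy as np

    def slug_dilution_discharge(times_s, conc_mg_per_l, background_mg_per_l, mass_g):
        """Return discharge in m^3/s from a measured NaCl breakthrough curve."""
        excess = np.clip(np.asarray(conc_mg_per_l) - background_mg_per_l, 0.0, None)
        integral = np.trapz(excess, times_s)   # (mg/L)*s, i.e. g*s/m^3
        return mass_g / integral               # g / (g*s/m^3) = m^3/s

    t = np.arange(0, 300, 5)                             # seconds after injection
    c = 0.2 + 4.0 * np.exp(-((t - 120) / 40.0) ** 2)     # synthetic plume passage, mg/L
    print(slug_dilution_discharge(t, c, 0.2, 500.0))     # ~1.8 m^3/s for 500 g of NaCl
    ```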

  5. Tissue sampling methods and standards for vertebrate genomics

    PubMed Central

    2012-01-01

    The recent rise in speed and efficiency of new sequencing technologies have facilitated high-throughput sequencing, assembly and analyses of genomes, advancing ongoing efforts to analyze genetic sequences across major vertebrate groups. Standardized procedures in acquiring high quality DNA and RNA and establishing cell lines from target species will facilitate these initiatives. We provide a legal and methodological guide according to four standards of acquiring and storing tissue for the Genome 10K Project and similar initiatives as follows: four-star (banked tissue/cell cultures, RNA from multiple types of tissue for transcriptomes, and sufficient flash-frozen tissue for 1 mg of DNA, all from a single individual); three-star (RNA as above and frozen tissue for 1 mg of DNA); two-star (frozen tissue for at least 700 μg of DNA); and one-star (ethanol-preserved tissue for 700 μg of DNA or less of mixed quality). At a minimum, all tissues collected for the Genome 10K and other genomic projects should consider each species’ natural history and follow institutional and legal requirements. Associated documentation should detail as much information as possible about provenance to ensure representative sampling and subsequent sequencing. Hopefully, the procedures outlined here will not only encourage success in the Genome 10K Project but also inspire the adaptation of standards by other genomic projects, including those involving other biota. PMID:23587255

  6. A sampling method for conducting relocation studies with freshwater mussels

    USGS Publications Warehouse

    Waller, D.L.; Rach, J.J.; Cope, W.G.; Luoma, J.A.

    1993-01-01

    Low recovery of transplanted mussels often prevents accurate estimates of survival. We developed a method that provided a high recovery of transplanted mussels and allowed for a reliable assessment of mortality. A 3 x 3 m polyvinyl chloride (PVC) pipe grid was secured to the sediment with iron reinforcing bars. The grid was divided into nine 1-m² segments, and each treatment segment was stocked with 100 marked mussels. The recovery of mussels after six months exceeded 80% in all but one treatment group.

  7. Sampling and analytical methods of stable isotopes and dissolved inorganic carbon from CO2 injection sites

    NASA Astrophysics Data System (ADS)

    van Geldern, Robert; Myrttinen, Anssi; Becker, Veith; Barth, Johannes A. C.

    2010-05-01

    The isotopic composition (δ13C) of dissolved inorganic carbon (DIC), in combination with DIC concentration measurements, can be used to quantify geochemical trapping of CO2 in water. This is of great importance in monitoring the fate of CO2 in the subsurface in CO2 injection projects. When CO2 mixes with water, a shift in the δ13C values, as well as an increase in DIC concentrations, is observed in the CO2-H2O system. However, when using standard on-site titration methods, it is often challenging to determine accurate in-situ DIC concentrations. This may be due to CO2 degassing and CO2 exchange between the sample and the atmosphere during titration, causing a change in the pH value, or due to other unfavourable conditions such as turbid water samples or limited availability of fluid samples. A way to resolve this problem is by simultaneously determining the DIC concentration and carbon isotopic composition using a standard continuous flow Isotope Ratio Mass Spectrometry (CF-IRMS) setup with a Gasbench II coupled to a Delta plus XP mass spectrometer. During sampling, in order to avoid atmospheric contact, water samples taken from the borehole fluid sampler should be transferred directly into a suitable container, such as a gasbag. Also, to avoid isotope fractionation due to biological activity in the sample, it is recommended to stabilize the gasbags with HgCl2 prior to sampling for the subsequent stable isotope analysis. The DIC concentration of the samples can be determined from the area of the sample peaks in a chromatogram from a CF-IRMS analysis, since it is directly proportional to the CO2 generated by the reaction of the water with H3PO4. A set of standards with known DIC concentrations should be prepared by mixing NaHCO3 with DIC-free water. Since the DIC concentrations of samples taken from CO2 injection sites are expected to be exceptionally high due to the additional high amounts of added CO2, the DIC concentration range of the standards should be set high
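    The quantification step described above reduces to a linear calibration: peak area is proportional to the CO2 released from the DIC, so a line fitted to NaHCO3 standards of known concentration converts sample peak areas into DIC concentrations. The sketch below illustrates this with invented standard values and an invented sample peak area.

    ```python
    import numpy as np

    # Hypothetical NaHCO3 standards (known DIC) and their measured peak areas.
    std_dic_mmol_l = np.array([1.0, 5.0, 10.0, 20.0, 40.0])
    std_peak_area  = np.array([2.1, 10.3, 20.8, 41.5, 83.0])

    # Straight-line calibration: peak area = slope * DIC + intercept.
    slope, intercept = np.polyfit(std_dic_mmol_l, std_peak_area, 1)

    def dic_from_area(peak_area):
        """Invert the calibration line to get a DIC concentration in mmol/L."""
        return (peak_area - intercept) / slope

    print(dic_from_area(55.0))  # an invented sample peak area -> approx. 26.5 mmol/L
    ```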

  8. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  9. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  10. Application of mosquito sampling count and geospatial methods to improve dengue vector surveillance.

    PubMed

    Chansang, Chitti; Kittayapong, Pattamaporn

    2007-11-01

    Dengue hemorrhagic fever is a major public health problem in several countries around the world. Dengue vector surveillance is an important methodology to determine when and where to take control action. We used a combination of Global Positioning System (GPS)/Geographic Information System (GIS) technology and an immature sampling count method to improve dengue vector surveillance. Both complete count and sampling count methods were used simultaneously to collect immature dengue vectors in all houses and all containers in one village in eastern Thailand to determine the efficiency of the sampling count technique. A hand-held GPS unit was used to record the location of surveyed houses. Linear regression indicated a high correlation between total immature populations resulting from the complete count and estimates from the sampling count of immature stages. The immature survey data and the GPS coordinates of house locations were combined into GIS maps showing the distribution of immature density and the clustering of immature stages and positive containers in the study area. This approach could be used to improve the efficiency and accuracy of dengue vector surveillance for targeting vector control.
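    The validation step in the abstract is essentially a regression of sampling-count estimates against complete counts of immatures per house. The snippet below reproduces that kind of check on made-up counts; a slope near one and a high correlation coefficient would support substituting the sampling count for the complete count.

    ```python
    import numpy as np

    complete_count    = np.array([12, 40, 7, 55, 23, 31, 9, 68])   # all containers counted
    sampling_estimate = np.array([10, 44, 6, 50, 25, 28, 11, 72])  # scaled-up subsample counts

    # Least-squares fit of estimate vs. complete count, plus the correlation coefficient.
    slope, intercept = np.polyfit(complete_count, sampling_estimate, 1)
    r = np.corrcoef(complete_count, sampling_estimate)[0, 1]
    print(slope, intercept, r)
    ```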

  11. In situ hybridization method for studies of cell wall deficient M. paratuberculosis in tissue samples.

    PubMed

    Hulten, K; Karttunen, T J; El-Zimaity, H M; Naser, S A; Almashhrawi, A; Graham, D Y; El-Zaatari, F A

    2000-12-20

    Cell wall deficient forms of mycobacteria may be important in the pathogenesis of Crohn's disease and sarcoidosis. However, no method has been available to localize this type of organism in tissue sections. We developed an in situ hybridization method for the demonstration of Mycobacterium paratuberculosis spheroplasts (cell wall deficient forms) in paraffin embedded tissue sections. M. paratuberculosis spheroplasts were prepared by treatment with glycine and lysozyme. Pieces of beef were injected with the prepared spheroplasts. The samples were fixed in buffered formalin and paraffin embedded. A M. paratuberculosis-specific probe derived from the IS900 gene was used. Specificity was controlled by using an irrelevant probe and by hybridizing sections with spheroplasts from other bacteria. Beef samples injected with M. paratuberculosis spheroplasts were the only samples that hybridized with the probe. Beef samples containing acid-fast or spheroplast forms of M. smegmatis and M. tuberculosis, as well as the acid-fast forms of M. paratuberculosis, did not hybridize with the probe. Unrelated bacterial controls, i.e. Helicobacter pylori and Escherichia coli, were also negative in the assay. In situ hybridization with the IS900 probe provides a specific way to localize M. paratuberculosis spheroplasts in tissue sections and may be useful for studies of the connection between M. paratuberculosis and Crohn's disease and sarcoidosis. The assay may also be valuable for studies of animals with Johne's disease. PMID:11118736

  12. An Improved Method for High Quality Metagenomics DNA Extraction from Human and Environmental Samples

    PubMed Central

    Bag, Satyabrata; Saha, Bipasa; Mehta, Ojasvi; Anbumani, D.; Kumar, Naveen; Dayal, Mayanka; Pant, Archana; Kumar, Pawan; Saxena, Shruti; Allin, Kristine H.; Hansen, Torben; Arumugam, Manimozhiyan; Vestergaard, Henrik; Pedersen, Oluf; Pereira, Verima; Abraham, Philip; Tripathi, Reva; Wadhwa, Nitya; Bhatnagar, Shinjini; Prakash, Visvanathan Gnana; Radha, Venkatesan; Anjana, R. M.; Mohan, V.; Takeda, Kiyoshi; Kurakawa, Takashi; Nair, G. Balakrish; Das, Bhabatosh

    2016-01-01

    To explore the natural microbial community of any ecosystem by high-resolution molecular approaches, including next generation sequencing, it is extremely important to develop a sensitive and reproducible DNA extraction method that facilitates isolation of microbial DNA of sufficient purity and quantity from culturable and uncultured microbial species living in that environment. Proper lysis of heterogeneous community microbial cells without damaging their genomes is a major challenge. In this study, we have developed an improved method for extraction of community DNA from different environmental and human origin samples. We introduced a combination of physical, chemical and mechanical lysis methods for proper lysis of microbial inhabitants. The community microbial DNA was precipitated by using salt and organic solvent. Both the quality and quantity of the isolated DNA were compared with those obtained using existing methodologies, and the superiority of our method was confirmed. Maximum recovery of genomic DNA in the absence of substantial amounts of impurities makes the method convenient for nucleic acid extraction. The nucleic acids obtained using this method are suitable for different downstream applications. This improved method has been named the THSTI method after the Institute where it was developed. PMID:27240745

  13. Determination of methylmercury in marine biota samples: method validation.

    PubMed

    Carrasco, Luis; Vassileva, Emilia

    2014-05-01

    Regulatory authorities are expected to measure concentrations of contaminants in foodstuffs, but the simple determination of the total amount is not sufficient for fully judging the impact on human health. In particular, the methylation of metals generally increases their toxicity; therefore, validated analytical methods producing reliable results for the assessment of methylated species are highly needed. Nowadays, there is no legal limit for methylmercury (MeHg) in food matrices; hence, no standardized method for the determination of MeHg exists within international jurisdiction. Contemplating the possibility of a future legislative limit, a method for low level determination of MeHg in marine biota matrices, based on aqueous-phase ethylation followed by purge and trap and gas chromatography (GC) coupled to pyrolysis-atomic fluorescence spectrometry (Py-AFS) detection, has been developed and validated. Five different extraction procedures, namely acid and alkaline leaching assisted by microwave and conventional oven heating, as well as enzymatic digestion, were evaluated in terms of their efficiency to extract MeHg from Scallop soft tissue IAEA-452 Certified Reference Material. Alkaline extraction with 25% (w/w) KOH in methanol, microwave-assisted extraction (MAE) with 5M HCl and enzymatic digestion with protease XIV yielded the highest extraction recoveries. Standard addition or the introduction of a dilution step was successfully applied to overcome the matrix effects observed when microwave-assisted extraction using 25% (w/w) KOH in methanol or 25% (w/v) aqueous TMAH was used. ISO 17025 and Eurachem guidelines were followed to perform the validation of the methodology. Accordingly, blanks, selectivity, calibration curve, linearity (0.9995), working range (1-800pg), recovery (97%), precision, traceability, limit of detection (0.45pg), limit of quantification (0.85pg) and expanded uncertainty (15.86%, k=2) were assessed with Fish protein Dorm-3 Certified

  14. Geoscience Education Research Methods: Thinking About Sample Size

    NASA Astrophysics Data System (ADS)

    Slater, S. J.; Slater, T. F.; CenterAstronomy; Physics Education Research

    2011-12-01

    Geoscience education research is at a critical point in which conditions are sufficient to propel our field forward toward meaningful improvements in geosciences education practices. Our field has now reached a point where the outcomes of our research are deemed important to end-users and funding agencies, and where we now have a large number of scientists who are either formally trained in geosciences education research, or who have dedicated themselves to excellence in this domain. At this point we must collectively work through our epistemology, our rules of what methodologies will be considered sufficiently rigorous, and what data and analysis techniques will be acceptable for constructing evidence. In particular, we have to work out our answer to that most difficult of research questions: "How big should my 'N' be?" This paper presents a very brief answer to that question, addressing both quantitative and qualitative methodologies. Research question/methodology alignment, effect size and statistical power will be discussed, in addition to a defense of the notion that bigger is not always better.

  15. Determination of optimal sampling times for a two blood sample clearance method using (51)Cr-EDTA in cats.

    PubMed

    Vandermeulen, Eva; De Sadeleer, Carlos; Piepsz, Amy; Ham, Hamphrey R; Dobbeleir, André A; Vermeire, Simon T; Van Hoek, Ingrid M; Daminet, Sylvie; Slegers, Guido; Peremans, Kathelijne Y

    2010-08-01

    Estimation of the glomerular filtration rate (GFR) is a useful tool in the evaluation of kidney function in feline medicine. GFR can be determined by measuring the rate of tracer disappearance from the blood, and although these measurements are generally performed by multi-sampling techniques, simplified methods are more convenient in clinical practice. The optimal times for a simplified sampling strategy with two blood samples (2BS) for GFR measurement in cats using plasma (51)chromium ethylene diamine tetra-acetic acid ((51)Cr-EDTA) clearance were investigated. After intravenous administration of (51)Cr-EDTA, seven blood samples were obtained in 46 cats (19 euthyroid and 27 hyperthyroid cats, none with previously diagnosed chronic kidney disease (CKD)). The plasma clearance was then calculated from the seven point blood kinetics (7BS) and used for comparison to define the optimal sampling strategy by correlating different pairs of time points to the reference method. Mean GFR estimation for the reference method was 3.7+/-2.5 ml/min/kg (mean+/-standard deviation (SD)). Several pairs of sampling times were highly correlated with this reference method (r(2) ≥ 0.980), with the best results when the first sample was taken 30 min after tracer injection and the second sample between 198 and 222 min after injection; or with the first sample at 36 min and the second at 234 or 240 min (r(2) for both combinations=0.984). Because of the similarity of GFR values obtained with the 2BS method in comparison to the values obtained with the 7BS reference method, the simplified method may offer an alternative for GFR estimation. Although a wide range of GFR values was found in the included group of cats, the applicability should be confirmed in cats suspected of renal disease and with confirmed CKD. Furthermore, although no indications of age-related effect were found in this study, a possible influence of age should be included in future studies. PMID:20452793

  16. Fire ant-detecting canines: a complementary method in detecting red imported fire ants.

    PubMed

    Lin, Hui-Min; Chi, Wei-Lien; Lin, Chung-Chi; Tseng, Yu-Ching; Chen, Wang-Ting; Kung, Yu-Ling; Lien, Yi-Yang; Chen, Yang-Yuan

    2011-02-01

    In this investigation, detection dogs are trained and used in identifying red imported fire ants, Solenopsis invicta Buren, and their nests. The methodology could assist in reducing the frequency and scope of chemical treatments for red imported fire ant management and thus reduce labor costs and chemical use as well as improve control and quarantine efficiency. Three dogs previously trained for customs quarantine were retrained to detect the scents of red imported fire ants. After passing tests involving different numbers of live red imported fire ants and three other ant species--Crematogaster rogenhoferi Mayr, Paratrechina longicornis Latreille, and Pheidole megacephala F.--placed in containers, a joint field survey for red imported fire ant nests by detection dogs and bait traps was conducted to demonstrate their use as a supplement to conventional detection methods. The most significant findings in this report are (1) with 10 or more red imported fire ants in scent containers, the dogs had >98% chance of tracing the red imported fire ant. Upon the introduction of other ant species, the dogs still achieved, on average, a 93% correct red imported fire ant indication rate. Moreover, the dogs demonstrated great competence in pinpointing emerging and smaller red imported fire ant nests in red imported fire ant-infested areas that had been previously confirmed by bait trap stations. (2) Along with the bait trap method, we also discovered that approximately 90% of red imported fire ants foraged within a distance of 14 m away from their nests. The results prove detection dogs to be most effective for red imported fire ant control in areas that have been previously treated with pesticides and therefore contain a low density of remaining red imported fire ant nests. Furthermore, as a complement to other red imported fire ant monitoring methods, this strategy will significantly increase the efficacy of red imported fire ant control in cases of individual mound treatment.

  17. Sampling strategies and post-processing methods for increasing the time resolution of organic aerosol measurements requiring long sample-collection times

    NASA Astrophysics Data System (ADS)

    Modini, Rob L.; Takahama, Satoshi

    2016-07-01

    The composition and properties of atmospheric organic aerosols (OAs) change on timescales of minutes to hours. However, some important OA characterization techniques typically require greater than a few hours of sample-collection time (e.g., Fourier transform infrared (FTIR) spectroscopy). In this study we have performed numerical modeling to investigate and compare sample-collection strategies and post-processing methods for increasing the time resolution of OA measurements requiring long sample-collection times. Specifically, we modeled the measurement of hydrocarbon-like OA (HOA) and oxygenated OA (OOA) concentrations at a polluted urban site in Mexico City, and investigated how to construct hourly resolved time series from samples collected for 4, 6, and 8 h. We modeled two sampling strategies - sequential and staggered sampling - and a range of post-processing methods including interpolation and deconvolution. The results indicated that relative to the more sophisticated and costly staggered sampling methods, linear interpolation between sequential measurements is a surprisingly effective method for increasing time resolution. Additional error can be added to a time series constructed in this manner if a suboptimal sequential sampling schedule is chosen. Staggering measurements is one way to avoid this effect. There is little to be gained from deconvolving staggered measurements, except at very low values of random measurement error (< 5 %). Assuming 20 % random measurement error, one can expect average recovery errors of 1.33-2.81 µg m-3 when using 4-8 h-long sequential and staggered samples to measure time series of concentration values ranging from 0.13-29.16 µg m-3. For 4 h samples, 19-47 % of this total error can be attributed to the process of increasing time resolution alone, depending on the method used, meaning that measurement precision would only be improved by 0.30-0.75 µg m-3 if samples could be collected over 1 h instead of 4 h. Devising a
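
    The post-processing step described above amounts to simple interpolation of long, sequential filter samples back onto a finer time grid. The following minimal sketch (not the authors' code; the hourly series, block length, and noise level are illustrative assumptions) shows how linearly interpolating 4 h sequential averages to hourly resolution might be evaluated against a synthetic "true" series.

```python
import numpy as np

# Hypothetical hourly "true" OA concentration series (values are illustrative).
rng = np.random.default_rng(0)
hours = np.arange(96)                                  # 4 days at 1 h resolution
true = 5 + 3 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.5, hours.size)

# Sequential 4 h samples: each filter integrates (averages) 4 consecutive hours.
block = 4
seq = true.reshape(-1, block).mean(axis=1)             # one value per 4 h sample
mid = hours.reshape(-1, block).mean(axis=1)            # midpoint time of each sample

# Post-processing: linear interpolation of the sequential measurements
# back onto the hourly grid.
recovered = np.interp(hours, mid, seq)

rmse = np.sqrt(np.mean((recovered - true) ** 2))
print(f"hourly recovery RMSE: {rmse:.2f} (concentration units)")
```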

  18. Sparsity-weighted outlier FLOODing (OFLOOD) method: Efficient rare event sampling method using sparsity of distribution.

    PubMed

    Harada, Ryuhei; Nakamura, Tomotake; Shigeta, Yasuteru

    2016-03-30

    As an extension of the Outlier FLOODing (OFLOOD) method [Harada et al., J. Comput. Chem. 2015, 36, 763], the sparsity of the outliers defined by a hierarchical clustering algorithm, FlexDice, was considered to achieve an efficient conformational search as sparsity-weighted "OFLOOD." In OFLOOD, FlexDice detects areas of sparse distribution as outliers. The outliers are regarded as candidates that have high potential to promote conformational transitions and are employed as initial structures for conformational resampling by restarting molecular dynamics simulations. When detecting outliers, FlexDice assigns each outlier a rank in the hierarchy, which relates to its sparsity in the distribution. In this study, we define lower-rank (first-ranked), medium-rank (second-ranked), and highest-rank (third-ranked) outliers, respectively. For instance, the first-ranked outliers are located in regions of conformational space away from the clusters (highly sparse distribution), whereas the third-ranked outliers lie near the clusters (a moderately sparse distribution). To achieve the conformational search efficiently, resampling from the outliers with a given rank is performed. As demonstrations, this method was applied to several model systems: alanine dipeptide, Met-enkephalin, Trp-cage, T4 lysozyme, and glutamine binding protein. In each demonstration, the present method successfully reproduced transitions among metastable states. In particular, the first-ranked OFLOOD strongly accelerated the exploration of conformational space by expanding its edges. In contrast, the third-ranked OFLOOD intensively reproduced local transitions among neighboring metastable states. For quantitative evaluation of the sampled snapshots, free energy calculations were performed in combination with umbrella sampling, providing rigorous landscapes of the biomolecules.
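
    FlexDice itself is not reproduced here, but the core idea of ranking snapshots by local sparsity and restarting simulations from the sparsest ones can be illustrated with a toy stand-in. In the sketch below (an assumption-laden analogy, not the OFLOOD implementation), the distance to the k-th nearest neighbour in a two-dimensional collective-variable space serves as the sparsity measure.

```python
import numpy as np
from scipy.spatial import cKDTree

# Toy 2D "collective variable" snapshots: two dense metastable clusters plus a
# few sparsely populated transition points (all values are illustrative).
rng = np.random.default_rng(1)
cluster_a = rng.normal([0.0, 0.0], 0.2, size=(500, 2))
cluster_b = rng.normal([3.0, 3.0], 0.2, size=(500, 2))
sparse = rng.uniform(0.5, 2.5, size=(20, 2))
points = np.vstack([cluster_a, cluster_b, sparse])

# Sparsity proxy: distance to the k-th nearest neighbour (larger = sparser),
# standing in for the hierarchical ranks assigned by FlexDice.
k = 10
dist, _ = cKDTree(points).query(points, k=k + 1)       # column 0 is the point itself
sparsity = dist[:, -1]

# "First-ranked" outliers = sparsest snapshots, used as restart structures.
n_restart = 15
restart_idx = np.argsort(sparsity)[-n_restart:]
print("restart candidates (snapshot indices):", restart_idx)
```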

  19. Determination of methylmercury in marine biota samples with advanced mercury analyzer: method validation.

    PubMed

    Azemard, Sabine; Vassileva, Emilia

    2015-06-01

    In this paper, we present a simple, fast and cost-effective method for the determination of methylmercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with the ISO-17025 (ISO/IEC, 2005) and Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009ng), limit of quantification (0.045ng) and expanded uncertainty (15%, k=2) were assessed. Estimation of the uncertainty contribution of each parameter and demonstration of the traceability of measurement results were provided as well. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was provided by participation in the IAEA-461 worldwide inter-laboratory comparison exercise. PMID:25624245

  20. Rapid method for recovery of strongylid third stage larvae of parasitic nematodes from small soil samples.

    PubMed

    Knapp-Lawitzke, Friederike; von Samson-Himmelstjerna, Georg; Demeler, Janina

    2014-07-01

    Livestock with access to pasture is generally exposed to infections with parasitic nematode species by uptake of infective third stage larvae (L3) with the grass. L3 can survive on pasture and particularly also in the soil up to several months and sometimes even longer, depending on temperature and humidity. As indicators for health and productivity of grazing animals it is important to determine the intensity and species spectrum of parasitic nematode larvae by analysing grass as well as soil samples. A rapid method for the recovery of L3 using a centrifugal-flotation technique from soil samples of 50-500 g was developed. The method takes advantage of the low specific weight of larvae to separate them from equal sized soil and debris particles by centrifuging them in a saturated sugar solution. A stack of differently sized sieves is used to achieve elimination of larger particles, dust and sugar from the sample to enable easy counting of larvae. Independent of the number of larvae used for inoculation of the samples a mean recovery of 75.3% was obtained. The recovery rates obtained ranged between 60.8% and 88.0% which demonstrates a considerably lower variability compared to earlier approaches and therefore a more precise estimation of the actual numbers of parasite larvae in soil is achieved. Further advantages over already developed methods are the use of easy, affordable and eco-friendly materials, the simplicity of the procedure and a faster processing time with the possibility to examine up to 20 samples per day.

  1. [Comparative Analysis of Spectrophotometric Methods of the Protein Measurement in the Pectic Polysaccharide Samples].

    PubMed

    Ponomareva, S A; Golovchenko, V V; Patova, O A; Vanchikova, E V; Ovodov, Y S

    2015-01-01

    To assess the reliability of determining the protein content in pectic polysaccharide samples by absorbance in the ultraviolet and visible regions of the spectrum, eleven techniques were compared: the Flores, Lowry, Bradford, Sedmak and Rueman (ninhydrin reaction) methods, ultraviolet spectrophotometry, the Benedict's reagent method, the Nessler's reagent method, the amido black method, the bicinchoninic acid reagent method and the biuret method. The data obtained show that seven of the listed techniques lack the sensitivity required for determination of the protein content in pectic polysaccharide samples. The Lowry, Bradford and Sedmak methods and the Nessler's reagent method, however, may be used for this purpose; the Bradford method is advisable for determining the content of protein contaminants in pectic polysaccharide samples when the protein content is less than 15%, and the Lowry method when it is more than 15%. PMID:26165122

  2. Swab Sample Transfer for Point-Of-Care Diagnostics: Characterization of Swab Types and Manual Agitation Methods

    PubMed Central

    Panpradist, Nuttada; Toley, Bhushan J.; Zhang, Xiaohong; Byrnes, Samantha; Buser, Joshua R.; Englund, Janet A.; Lutz, Barry R.

    2014-01-01

    Background The global need for disease detection and control has increased effort to engineer point-of-care (POC) tests that are simple, robust, affordable, and non-instrumented. In many POC tests, sample collection involves swabbing the site (e.g., nose, skin), agitating the swab in a fluid to release the sample, and transferring the fluid to a device for analysis. Poor performance in sample transfer can reduce sensitivity and reproducibility. Methods In this study, we compared bacterial release efficiency of seven swab types using manual-agitation methods typical of POC devices. Transfer efficiency was measured using quantitative PCR (qPCR) for Staphylococcus aureus under conditions representing a range of sampling scenarios: 1) spiking low-volume samples onto the swab, 2) submerging the swab in excess-volume samples, and 3) swabbing dried sample from a surface. Results Excess-volume samples gave the expected recovery for most swabs (based on tip fluid capacity); a polyurethane swab showed enhanced recovery, suggesting an ability to accumulate organisms during sampling. Dry samples led to recovery of ∼20–30% for all swabs tested, suggesting that swab structure and volume is less important when organisms are applied to the outer swab surface. Low-volume samples led to the widest range of transfer efficiencies between swab types. Rayon swabs (63 µL capacity) performed well for excess-volume samples, but showed poor recovery for low-volume samples. Nylon (100 µL) and polyester swabs (27 µL) showed intermediate recovery for low-volume and excess-volume samples. Polyurethane swabs (16 µL) showed excellent recovery for all sample types. This work demonstrates that swab transfer efficiency can be affected by swab material, structure, and fluid capacity and details of the sample. Results and quantitative analysis methods from this study will assist POC assay developers in selecting appropriate swab types and transfer methods. PMID:25181250

  3. Method for sequential injection of liquid samples for radioisotope separations

    DOEpatents

    Egorov, Oleg B.; Grate, Jay W.; Bray, Lane A.

    2000-01-01

    The present invention is a method of separating a short-lived daughter isotope from a longer lived parent isotope, with recovery of the parent isotope for further use. Using a system with a bi-directional pump and one or more valves, a solution of the parent isotope is processed to generate two separate solutions, one of which contains the daughter isotope, from which the parent has been removed with a high decontamination factor, and the other solution contains the recovered parent isotope. The process can be repeated on this solution of the parent isotope. The system with the fluid drive and one or more valves is controlled by a program on a microprocessor executing a series of steps to accomplish the operation. In one approach, the cow solution is passed through a separation medium that selectively retains the desired daughter isotope, while the parent isotope and the matrix pass through the medium. After washing this medium, the daughter is released from the separation medium using another solution. With the automated generator of the present invention, all solution handling steps necessary to perform a daughter/parent radionuclide separation, e.g. Bi-213 from Ac-225 "cow" solution, are performed in a consistent, enclosed, and remotely operated format. Operator exposure and spread of contamination are greatly minimized compared to the manual generator procedure described in U.S. patent application Ser. No. 08/789,973, now U.S. Pat. No. 5,749,042, herein incorporated by reference. Using 16 mCi of Ac-225 there was no detectable external contamination of the instrument components.

  4. Trace iodine quantitation in biological samples by mass spectrometric methods: the optimum internal standard.

    PubMed

    Dyke, Jason V; Dasgupta, Purnendu K; Kirk, Andrea B

    2009-07-15

    Accurate quantitation of iodine in biological samples is essential for studies of nutrition and medicine, as well as for epidemiological studies for monitoring intake of this essential nutrient. Despite the importance of accurate measurement, a standardized method for iodine analysis of biological samples is yet to be established. We have evaluated the effectiveness of (72)Ge, (115)In, and (129)I as internal standards for measurement of iodine in milk and urine samples by inductively coupled plasma mass spectrometry (ICP-MS) and of (35)Cl(18)O(4)(-), (129)I(-), and 2-chlorobenzenesulfonate (2-CBS) as internal standards for ion chromatography-tandem mass spectrometry (IC-MS/MS). We found recovery of iodine to be markedly low when IC-MS/MS was used without an internal standard. Percent recovery was similarly low using (35)Cl(18)O(4) as an internal standard for milk and unpredictable when used for urine. 2-Chlorobenzenesulfonate provided accurate recovery of iodine from milk, but overestimated iodine in urine samples by as much as a factor of 2. Percent recovery of iodine from milk and urine using ICP-MS without an internal standard was approximately 120%. Use of (115)In predicted approximately 60% of known values for both milk and urine samples. (72)Ge provided reasonable and consistent percent recovery for iodine in milk samples (approximately 108%) but resulted in approximately 80% recovery of iodine from urine. Use of (129)I as an internal standard resulted in excellent recovery of iodine from both milk and urine samples using either IC-MS/MS or ICP-MS.

  5. A Method for Selective Enrichment and Analysis of Nitrotyrosine-Containing Peptides in Complex Proteome Samples

    SciTech Connect

    Zhang, Qibin; Qian, Weijun; Knyushko, Tanya V.; Clauss, Therese RW; Purvine, Samuel O.; Moore, Ronald J.; Sacksteder, Colette A.; Chin, Mark H.; Smith, Desmond J.; Camp, David G.; Bigelow, Diana J.; Smith, Richard D.

    2007-06-01

    Elevated levels of protein tyrosine nitration have been found in various neurodegenerative diseases and aging related pathologies; however, the lack of an efficient enrichment method has prevented the analysis of this important low level protein modification. We have developed an efficient method for specific enrichment of nitrotyrosine containing peptides that permits nitrotyrosine peptides and specific nitration sites to be unambiguously identified with LC-MS/MS. The method is based on the derivatization of nitrotyrosine into free sulfhydryl groups followed by high efficiency enrichment of sulfhydryl-containing peptides with thiopropyl sepharose beads. The derivatization process starts with acetylation with acetic anhydride to block all primary amines, followed by reduction of nitrotyrosine to aminotyrosine, then derivatization of aminotyrosine with N-Succinimidyl S-Acetylthioacetate (SATA), and finally deprotection of the S-acetyl group on SATA to form free sulfhydryl groups. This method was evaluated using nitrotyrosine containing peptides, in-vitro nitrated human histone 1.2, and bovine serum albumin (BSA). 91% and 62% of the identified peptides from the enriched histone and BSA samples, respectively, were nitrotyrosine-derivatized peptides, suggesting relatively high specificity of the enrichment method. The application of this method to in-vitro nitrated mouse brain homogenate resulted in 35% of identified peptides containing nitrotyrosine (compared to only 5.9% observed from the global analysis of the unenriched sample), and a total of 150 unique nitrated peptides covering 102 proteins were identified with a false discovery rate estimated at 3.3% from duplicate LC-MS/MS analyses of a single enriched sample.

  6. A novel videography method for generating crack-extension resistance curves in small bone samples.

    PubMed

    Katsamenis, Orestis L; Jenkins, Thomas; Quinci, Federico; Michopoulou, Sofia; Sinclair, Ian; Thurner, Philipp J

    2013-01-01

    Assessment of bone quality is an emerging solution for quantifying the effects of bone pathology or treatment. Perhaps one of the most important parameters characterising bone quality is the toughness behaviour of bone. In particular, fracture toughness is becoming a popular means for evaluating bone quality. The method is moving from a single value approach that models bone as a linear-elastic material (using the stress intensity factor, K) towards full crack extension resistance curves (R-curves) using a non-linear model (the strain energy release rate in J-R curves). However, for explanted human bone or small animal bones, there are difficulties in measuring crack-extension resistance curves due to size constraints at the millimetre and sub-millimetre scale. This research proposes a novel "whitening front tracking" method that uses videography to generate full fracture resistance curves in small bone samples where crack propagation cannot typically be observed. Here we present this method on sharp edge notched samples (<1 mm × 1 mm × length) prepared from four human femora tested in three-point bending. Each sample was loaded in a mechanical tester with the crack propagation recorded using videography and analysed using an algorithm to track the whitening (damage) zone. Using the "whitening front tracking" method, full R-curves and J-R curves could be generated for these samples. The curves for this antiplane longitudinal orientation were similar to those found in the literature, being between the published longitudinal and transverse orientations. The proposed technique shows the ability to generate full "crack" extension resistance curves by tracking the whitening front propagation, overcoming the small size limitations and the single value approach. PMID:23405186

  7. Initial evaluation of Centroidal Voronoi Tessellation method for statistical sampling and function integration.

    SciTech Connect

    Romero, Vicente Jose; Peterson, Janet S.; Burkhardt, John V.; Gunzburger, Max Donald

    2003-09-01

    A recently developed Centroidal Voronoi Tessellation (CVT) unstructured sampling method is investigated here to assess its suitability for use in statistical sampling and function integration. CVT efficiently generates a highly uniform distribution of sample points over arbitrarily shaped M-Dimensional parameter spaces. It has recently been shown on several 2-D test problems to provide superior point distributions for generating locally conforming response surfaces. In this paper, its performance as a statistical sampling and function integration method is compared to that of Latin-Hypercube Sampling (LHS) and Simple Random Sampling (SRS) Monte Carlo methods, and Halton and Hammersley quasi-Monte-Carlo sequence methods. Specifically, sampling efficiencies are compared for function integration and for resolving various statistics of response in a 2-D test problem. It is found that on balance CVT performs best of all these sampling methods on our test problems.
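
    A CVT sampler is not available in common scientific Python libraries, but the flavour of such a comparison can be reproduced for the other samplers mentioned above. The sketch below (an illustrative setup, not the study's test problems) compares Simple Random Sampling, Latin Hypercube Sampling, and the Halton sequence on a 2-D integral with a known answer.

```python
import numpy as np
from scipy.stats import qmc

# Integrate f(x, y) = sin(pi*x) * sin(pi*y) over the unit square
# (exact value 4/pi^2). CVT is not included; only SRS, LHS and Halton.
f = lambda p: np.sin(np.pi * p[:, 0]) * np.sin(np.pi * p[:, 1])
exact = 4 / np.pi ** 2
n = 1024

rng = np.random.default_rng(42)
samples = {
    "SRS": rng.random((n, 2)),
    "LHS": qmc.LatinHypercube(d=2, seed=42).random(n),
    "Halton": qmc.Halton(d=2, seed=42).random(n),
}
for name, pts in samples.items():
    err = abs(f(pts).mean() - exact)
    print(f"{name:6s} absolute integration error: {err:.2e}")
```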

  8. [A New Method of Accurately Extracting Spectral Values for Discrete Sampling Points].

    PubMed

    Lü, Zhen-zhen; Liu, Guang-ming; Yang, Jin-song

    2015-08-01

    In the establishment of remote sensing information inversion models, the measured data at discrete sampling points and the spectral data of the corresponding pixels of the remote sensing image are used to establish a relationship, and thereby to retrieve the information of interest. Accurate extraction of spectral values is therefore very important for establishing a remote sensing inversion model. Converting the target point layer to a ROI (region of interest) and then saving the ROI as ASCII is one of the methods that researchers often use to extract spectral values. Analyzing the coordinates and spectral values extracted using the original coordinates in ENVI, we found that the extracted and original coordinates were not consistent and that some of the spectral values did not belong to the pixel containing the sampling point. An inversion model based on such information cannot really reflect the relationship between the target properties and the spectral values, so the model is meaningless. We divided each pixel into four equal parts and examined the outcome for each part. It was found that only when a sampling point is located in the upper left quarter of a pixel are the extracted values correct. On the basis of these observations, this paper systematically studied the principle of extracting target coordinates and spectral values and summarized the underlying rule. A new method is proposed for extracting the spectral values of the pixel in which a sampling point is located, within the ENVI software environment. Firstly, the pixel corner coordinates corresponding to the sampling points were extracted in ENVI using the original coordinates. Secondly, the quadrant of the pixel in which each sampling point is located was determined by comparing the absolute differences between the longitude and latitude of the original and extracted coordinates. Lastly, all points were adjusted to the upper left corner of their pixels by the symmetry principle, and spectral values were extracted in the same way as in the first step. The results indicated that the extracted spectrum
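
    The correction rule described above (only points falling in the upper left quarter of a pixel are extracted correctly, so all points are moved toward the upper left corner of their pixel) can be sketched outside ENVI as a small coordinate adjustment. The raster origin, pixel size, and epsilon offset below are hypothetical placeholders, not values from the paper.

```python
import numpy as np

# Hypothetical raster geometry: upper-left image origin and square pixel size
# in degrees (illustrative numbers only).
x0, y0 = 100.0, 40.0        # longitude, latitude of the image's upper-left corner
px = 0.01                   # pixel size

def adjust_to_upper_left(lon, lat, eps=1e-3):
    """Return a point just inside the upper-left corner of its containing pixel,
    where (per the observation in the paper) the extracted spectral value
    matches the pixel that actually contains the sampling point."""
    col = np.floor((lon - x0) / px)
    row = np.floor((y0 - lat) / px)
    return x0 + (col + eps) * px, y0 - (row + eps) * px

print(adjust_to_upper_left(100.037, 39.973))   # point near that pixel's upper-left corner
```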

  9. The Importance of Sample Return in Establishing Chemical Evidence for Life on Mars or Other Solar System Bodies

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    The search for evidence of life on Mars and elsewhere will continue to be one of the primary goals of NASA's robotic exploration program over the next decade. NASA and ESA are currently planning a series of robotic missions to Mars with the goal of understanding its climate, resources, and potential for harboring past or present life. One key goal will be the search for chemical biomarkers including complex organic compounds important in life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes, nucleobases and sugars which form the backbone of DNA and RNA, and lipids, the structural components of cell membranes. Many of these organic compounds can also be formed abiotically as demonstrated by their prevalence in carbonaceous meteorites [1], though, their molecular characteristics may distinguish a biological source [2]. It is possible that in situ instruments may reveal such characteristics, however, return of the right sample (i.e. one with biosignatures or having a high probability of biosignatures) to Earth would allow for more intensive laboratory studies using a broad array of powerful instrumentation for bulk characterization, molecular detection, isotopic and enantiomeric compositions, and spatially resolved chemistry that may be required for confirmation of extant or extinct Martian life. Here we will discuss the current analytical capabilities and strategies for the detection of organics on the Mars Science Laboratory (MSL) using the Sample Analysis at Mars (SAM) instrument suite and how sample return missions from Mars and other targets of astrobiological interest will help advance our understanding of chemical biosignatures in the solar system.

  10. A Typology of Mixed Methods Sampling Designs in Social Science Research

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Collins, Kathleen M. T.

    2007-01-01

    This paper provides a framework for developing sampling designs in mixed methods research. First, we present sampling schemes that have been associated with quantitative and qualitative research. Second, we discuss sample size considerations and provide sample size recommendations for each of the major research designs for quantitative and…

  11. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    PubMed Central

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels are constructed repeatedly through the addition of sampling points, namely, the extremum points of the metamodel and the minimum points of a density function. In this way, progressively more accurate metamodels are constructed. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
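
    The paper's exact infill criteria are not reproduced here, but the general loop (fit an RBF metamodel, add a new sample at a point suggested by the metamodel, refit) can be sketched with SciPy. The test function, bounds, and stopping rule below are illustrative assumptions; only the metamodel-minimum infill point is used, whereas the paper also adds density-based points.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.optimize import minimize

def expensive(x):
    """Stand-in for a computationally expensive simulation (1-D test function)."""
    return np.sin(3 * x) + 0.5 * x ** 2

X = np.linspace(-2.0, 2.0, 5).reshape(-1, 1)      # initial design
y = expensive(X.ravel())

for _ in range(8):
    model = RBFInterpolator(X, y)                 # current RBF metamodel
    res = minimize(lambda x: model(np.atleast_2d(x))[0], x0=np.array([0.0]),
                   bounds=[(-2.0, 2.0)])
    x_new = np.atleast_2d(res.x)
    if np.min(np.abs(X - x_new)) < 1e-3:          # new point duplicates an old one
        break
    X = np.vstack([X, x_new])                     # add the infill point to the design
    y = np.append(y, expensive(x_new.ravel()))

print("best sampled point:", X[np.argmin(y)].item(), "value:", y.min())
```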

  12. Differences in Movement Pattern and Detectability between Males and Females Influence How Common Sampling Methods Estimate Sex Ratio

    PubMed Central

    Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco

    2016-01-01

    Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now, no study has evaluated how efficient the sampling methods commonly used in biodiversity surveys are at estimating the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population's sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability of species is important information to guide field studies aiming to understand sex ratio related patterns. PMID:27441554
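
    The virtual-ecologist idea, simulating a known population and "sampling" it with an imperfect method, can be illustrated with a very small toy model. In the sketch below (detection probabilities, population size, and number of survey occasions are made-up assumptions, not the paper's simulation), a true 1:1 sex ratio appears male-biased simply because males are more detectable.

```python
import numpy as np

rng = np.random.default_rng(7)

# True population: 1:1 sex ratio, but males are more detectable than females
# (illustrative detection probabilities).
n_males = n_females = 500
p_detect = {"M": 0.30, "F": 0.15}

# Passive-capture-style survey: individuals are recognizable, so repeated
# detections of the same animal are not double-counted.
n_surveys = 3
detected = {"M": set(), "F": set()}
for _ in range(n_surveys):
    detected["M"] |= set(np.flatnonzero(rng.random(n_males) < p_detect["M"]))
    detected["F"] |= set(np.flatnonzero(rng.random(n_females) < p_detect["F"]))

m, f = len(detected["M"]), len(detected["F"])
print(f"true sex ratio (M:F) = 1.00, estimated = {m / f:.2f} ({m} males, {f} females)")
```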

  13. Comparison of Sample Preparation Methods Used for the Next-Generation Sequencing of Mycobacterium tuberculosis.

    PubMed

    Tyler, Andrea D; Christianson, Sara; Knox, Natalie C; Mabon, Philip; Wolfe, Joyce; Van Domselaar, Gary; Graham, Morag R; Sharma, Meenu K

    2016-01-01

    The advent and widespread application of next-generation sequencing (NGS) technologies to the study of microbial genomes has led to a substantial increase in the number of studies in which whole genome sequencing (WGS) is applied to the analysis of microbial genomic epidemiology. However, microorganisms such as Mycobacterium tuberculosis (MTB) present unique problems for sequencing and downstream analysis based on their unique physiology and the composition of their genomes. In this study, we compare the quality of sequence data generated using the Nextera and TruSeq isolate preparation kits for library construction prior to Illumina sequencing-by-synthesis. Our results confirm that MTB NGS data quality is highly dependent on the purity of the DNA sample submitted for sequencing and its guanine-cytosine content (or GC-content). Our data additionally demonstrate that the choice of library preparation method plays an important role in mitigating downstream sequencing quality issues. Importantly for MTB, the Illumina TruSeq library preparation kit produces more uniform data quality than the Nextera XT method, regardless of the quality of the input DNA. Furthermore, specific genomic sequence motifs are commonly missed by the Nextera XT method, as are regions of especially high GC-content relative to the rest of the MTB genome. As coverage bias is highly undesirable, this study illustrates the importance of appropriate protocol selection when performing NGS studies in order to ensure that sound inferences can be made regarding mycobacterial genomes. PMID:26849565

  14. Enhanced Sampling in Free Energy Calculations: Combining SGLD with the Bennett's Acceptance Ratio and Enveloping Distribution Sampling Methods.

    PubMed

    König, Gerhard; Miller, Benjamin T; Boresch, Stefan; Wu, Xiongwu; Brooks, Bernard R

    2012-10-01

    One of the key requirements for the accurate calculation of free energy differences is proper sampling of conformational space. Especially in biological applications, molecular dynamics simulations are often confronted with rugged energy surfaces and high energy barriers, leading to insufficient sampling and, in turn, poor convergence of the free energy results. In this work, we address this problem by employing enhanced sampling methods. We explore the possibility of using self-guided Langevin dynamics (SGLD) to speed up the exploration process in free energy simulations. To obtain improved free energy differences from such simulations, it is necessary to account for the effects of the bias due to the guiding forces. We demonstrate how this can be accomplished for the Bennett's acceptance ratio (BAR) and the enveloping distribution sampling (EDS) methods. While BAR is considered among the most efficient methods available for free energy calculations, the EDS method developed by Christ and van Gunsteren is a promising development that reduces the computational costs of free energy calculations by simulating a single reference state. To evaluate the accuracy of both approaches in connection with enhanced sampling, EDS was implemented in CHARMM. For testing, we employ benchmark systems with analytical reference results and the mutation of alanine to serine. We find that SGLD with reweighting can provide accurate results for BAR and EDS where conventional molecular dynamics simulations fail. In addition, we compare the performance of EDS with other free energy methods. We briefly discuss the implications of our results and provide practical guidelines for conducting free energy simulations with SGLD.
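
    Bennett's acceptance ratio itself reduces to a one-dimensional self-consistency equation in the forward and reverse work values, which is easy to sketch independently of CHARMM, SGLD, or EDS. The snippet below (a minimal sketch using synthetic Gaussian work distributions with a known answer; not the authors' implementation, and it ignores the guiding-force reweighting discussed above) solves that equation numerically.

```python
import numpy as np
from scipy.optimize import brentq

def bar_delta_f(w_F, w_R):
    """Self-consistent BAR estimate of the reduced free-energy difference
    (units of kT) from forward (0->1) and reverse (1->0) work samples."""
    M = np.log(len(w_F) / len(w_R))
    def residual(df):
        lhs = np.sum(1.0 / (1.0 + np.exp(M + w_F - df)))
        rhs = np.sum(1.0 / (1.0 + np.exp(-M + w_R + df)))
        return lhs - rhs
    return brentq(residual, -50.0, 50.0)

# Synthetic Gaussian work distributions obeying the Crooks relation:
# true delta_f = 2 kT, dissipation sigma^2/2 = 1 kT (illustrative numbers).
rng = np.random.default_rng(3)
true_df, sigma = 2.0, np.sqrt(2.0)
w_F = rng.normal(true_df + sigma ** 2 / 2, sigma, 2000)    # forward work
w_R = rng.normal(-true_df + sigma ** 2 / 2, sigma, 2000)   # reverse work
print(f"BAR estimate: {bar_delta_f(w_F, w_R):.3f} kT (true value {true_df} kT)")
```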

  15. Comparing Respondent-Driven Sampling and Targeted Sampling Methods of Recruiting Injection Drug Users in San Francisco

    PubMed Central

    Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N.; Lorvick, Jennifer; McFarland, Willi; Raymond, H. Fisher

    2010-01-01

    The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one which used targeted sampling (TS) and the other which used respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization. Prevalence estimates and 95% confidence intervals (CIs) were calculated to assess whether there were differences in these variables by sampling method. There was overlap in 95% CIs for all demographic variables except African American race (TS: 45%, 53%; RDS: 29%, 44%). Maps showed that the proportion of IDUs distributed across zip codes were similar for the TS and RDS sample, with the exception of a single zip code that was more represented in the TS sample. This zip code includes an isolated, predominantly African American neighborhood where only the TS study had a field site. Risk behavior estimates were similar for both TS and RDS samples, although self-reported hepatitis C infection was lower in the RDS sample. In terms of service utilization, more IDUs in the RDS sample reported no recent use of drug treatment and syringe exchange program services. Our study suggests that perhaps a hybrid sampling plan is best suited for recruiting IDUs in San Francisco, whereby the more intensive ethnographic and secondary analysis components of TS would aid in the planning of seed placement and field locations for RDS. PMID:20582573

  16. Comparing respondent-driven sampling and targeted sampling methods of recruiting injection drug users in San Francisco.

    PubMed

    Kral, Alex H; Malekinejad, Mohsen; Vaudrey, Jason; Martinez, Alexis N; Lorvick, Jennifer; McFarland, Willi; Raymond, H Fisher

    2010-09-01

    The objective of this article is to compare demographic characteristics, risk behaviors, and service utilization among injection drug users (IDUs) recruited from two separate studies in San Francisco in 2005, one which used targeted sampling (TS) and the other which used respondent-driven sampling (RDS). IDUs were recruited using TS (n = 651) and RDS (n = 534) and participated in quantitative interviews that included demographic characteristics, risk behaviors, and service utilization. Prevalence estimates and 95% confidence intervals (CIs) were calculated to assess whether there were differences in these variables by sampling method. There was overlap in 95% CIs for all demographic variables except African American race (TS: 45%, 53%; RDS: 29%, 44%). Maps showed that the proportion of IDUs distributed across zip codes were similar for the TS and RDS sample, with the exception of a single zip code that was more represented in the TS sample. This zip code includes an isolated, predominantly African American neighborhood where only the TS study had a field site. Risk behavior estimates were similar for both TS and RDS samples, although self-reported hepatitis C infection was lower in the RDS sample. In terms of service utilization, more IDUs in the RDS sample reported no recent use of drug treatment and syringe exchange program services. Our study suggests that perhaps a hybrid sampling plan is best suited for recruiting IDUs in San Francisco, whereby the more intensive ethnographic and secondary analysis components of TS would aid in the planning of seed placement and field locations for RDS.

  17. (222)Rn in water: A comparison of two sample collection methods and two sample transport methods, and the determination of temporal variation in North Carolina ground water

    SciTech Connect

    Hightower, J.H. III

    1994-12-31

    Objectives of this field experiment were: (1) determine whether there was a statistically significant difference between the radon concentrations of samples collected by EPA's standard method, using a syringe, and an alternative, slow-flow method; (2) determine whether there was a statistically significant difference between the measured radon concentrations of samples mailed vs samples not mailed; and (3) determine whether there was a temporal variation of water radon concentration over a 7-month period. The field experiment was conducted at 9 sites (5 private wells and 4 public wells) at various locations in North Carolina. Results showed that a syringe is not necessary for sample collection, there was generally no significant radon loss due to mailing samples, and there was statistically significant evidence of temporal variations in water radon concentrations.

  18. Extended Phase-Space Methods for Enhanced Sampling in Molecular Simulations: A Review

    PubMed Central

    Fujisaki, Hiroshi; Moritsugu, Kei; Matsunaga, Yasuhiro; Morishita, Tetsuya; Maragliano, Luca

    2015-01-01

    Molecular Dynamics simulations are a powerful approach to study biomolecular conformational changes or protein–ligand, protein–protein, and protein–DNA/RNA interactions. Straightforward applications, however, are often hampered by incomplete sampling, since in a typical simulated trajectory the system will spend most of its time trapped by high energy barriers in restricted regions of the configuration space. Over the years, several techniques have been designed to overcome this problem and enhance space sampling. Here, we review a class of methods that rely on the idea of extending the set of dynamical variables of the system by adding extra ones associated to functions describing the process under study. In particular, we illustrate the Temperature Accelerated Molecular Dynamics (TAMD), Logarithmic Mean Force Dynamics (LogMFD), and Multiscale Enhanced Sampling (MSES) algorithms. We also discuss combinations with techniques for searching reaction paths. We show the advantages presented by this approach and how it allows to quickly sample important regions of the free-energy landscape via automatic exploration. PMID:26389113

  19. Molecular method to assess the diversity of Burkholderia species in environmental samples.

    PubMed

    Salles, Joana Falcão; De Souza, Francisco Adriano; van Elsas, Jan Dirk

    2002-04-01

    In spite of the importance of many members of the genus Burkholderia in the soil microbial community, no direct method to assess the diversity of this genus has been developed so far. The aim of this work was the development of soil DNA-based PCR-denaturing gradient gel electrophoresis (DGGE), a powerful tool for studying the diversity of microbial communities, for detection and analysis of the Burkholderia diversity in soil samples. Primers specific for the genus Burkholderia were developed based on the 16S rRNA gene sequence and were evaluated in PCRs performed with genomic DNAs from Burkholderia and non-Burkholderia species as the templates. The primer system used exhibited good specificity and sensitivity for the majority of established species of the genus Burkholderia. DGGE analyses of the PCR products obtained showed that there were sufficient differences in migration behavior to distinguish the majority of the 14 Burkholderia species tested. Sequence analysis of amplicons generated with soil DNA exclusively revealed sequences affiliated with sequences of Burkholderia species, demonstrating that the PCR-DGGE method is suitable for studying the diversity of this genus in natural settings. A PCR-DGGE analysis of the Burkholderia communities in two grassland plots revealed differences in diversity mainly between bulk and rhizosphere soil samples; the communities in the latter samples produced more complex patterns.

  20. Molecular Method To Assess the Diversity of Burkholderia Species in Environmental Samples

    PubMed Central

    Salles, Joana Falcão; De Souza, Francisco Adriano; van Elsas, Jan Dirk

    2002-01-01

    In spite of the importance of many members of the genus Burkholderia in the soil microbial community, no direct method to assess the diversity of this genus has been developed so far. The aim of this work was the development of soil DNA-based PCR-denaturing gradient gel electrophoresis (DGGE), a powerful tool for studying the diversity of microbial communities, for detection and analysis of the Burkholderia diversity in soil samples. Primers specific for the genus Burkholderia were developed based on the 16S rRNA gene sequence and were evaluated in PCRs performed with genomic DNAs from Burkholderia and non-Burkholderia species as the templates. The primer system used exhibited good specificity and sensitivity for the majority of established species of the genus Burkholderia. DGGE analyses of the PCR products obtained showed that there were sufficient differences in migration behavior to distinguish the majority of the 14 Burkholderia species tested. Sequence analysis of amplicons generated with soil DNA exclusively revealed sequences affiliated with sequences of Burkholderia species, demonstrating that the PCR-DGGE method is suitable for studying the diversity of this genus in natural settings. A PCR-DGGE analysis of the Burkholderia communities in two grassland plots revealed differences in diversity mainly between bulk and rhizosphere soil samples; the communities in the latter samples produced more complex patterns. PMID:11916673

  1. Sample Size Determination: A Comparison of Attribute, Continuous Variable, and Cell Size Methods.

    ERIC Educational Resources Information Center

    Clark, Philip M.

    1984-01-01

    Describes three methods of sample size determination, each having its use in investigation of social science problems: Attribute method; Continuous Variable method; Galtung's Cell Size method. Statistical generalization, benefits of cell size method (ease of use, trivariate analysis and trichotomized variables), and choice of method are…

  2. A method for sampling halothane and enflurane present in trace amounts in ambient air.

    PubMed

    Burm, A G; Spierdijk, J

    1979-03-01

    A method for the sampling of small amounts of halothane and enflurane in ambient air is described. Sampling is performed by drawing air through a sampling tube packed with Porapak Q, which absorbs the anesthetic agent. The amount absorbed is determined by gas chromatography after thermal desorption. This method can be used for "spot" or personal sampling or for determining mean whole-room concentrations over relatively long periods (several hours).

  3. The Importance of Reference Gene Analysis of Formalin-Fixed, Paraffin-Embedded Samples from Sarcoma Patients — An Often Underestimated Problem

    PubMed Central

    Aggerholm-Pedersen, Ninna; Safwat, Akmal; Bærentzen, Steen; Nordsmark, Marianne; Nielsen, Ole Steen; Alsner, Jan; Sørensen, Brita S.

    2014-01-01

    Objective: Reverse transcription quantitative real-time polymerase chain reaction is efficient for quantification of gene expression, but the choice of reference genes is of paramount importance as it is essential for correct interpretation of data. This is complicated by the fact that the materials often available are routinely collected formalin-fixed, paraffin-embedded (FFPE) samples in which the mRNA is known to be highly degraded. The purpose of this study was to investigate 22 potential reference genes in sarcoma FFPE samples and to study the variation in expression level within different samples taken from the same tumor and between different histologic types. Methods: Twenty-nine patients treated for sarcoma were enrolled. The samples encompassed 82 FFPE specimens. Extraction of total RNA from 7-μm FFPE sections was performed using a fully automated, bead-based RNA isolation procedure, and 22 potential reference genes were analyzed by reverse transcription quantitative real-time polymerase chain reaction. The stability of the genes was analyzed by RealTime Statminer. The intra-sample variation and the interclass correlation coefficients were calculated. The linear regression model was used to calculate the degradation of the mRNA over time. Results: The quality of RNA was sufficient for analysis in 84% of the samples. Recommended reference genes differed with histologic types. However, PPIA, SF3A1, and MRPL19 were stably expressed regardless of the histologic type included. The variation in ∆Cq value for samples from the same patients was similar to the variation between patients. It was possible to compensate for the time-dependent degradation of the mRNA when normalization was made using the selected reference genes. Conclusion: PPIA, SF3A1, and MRPL19 are suitable reference genes for normalization in gene expression studies of FFPE samples from sarcoma regardless of the histology. PMID:25500077

  4. On Authentication Method Impact upon Data Sampling Delay in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Szalachowski, Pawel; Ksiezopolski, Bogdan; Kotulski, Zbigniew

    Traffic in a Wireless Sensor Network (WSN) consists of short packets sent by nodes that are usually identical in terms of the software applied and their hardware architecture. In such a communication environment it is important to guarantee authentication of the nodes. The most popular way to achieve this basic security service is using a Message Authentication Code (MAC). The sensor node's hardware is very limited, so the cryptography used must be very efficient. In this article we focus on the influence of the authentication method's performance on delays in data sampling by the sensor nodes. We present efficiency results for MAC generation in the node. We compare the results for approved, standardized and commonly-used schemes: CMAC, GMAC and HMAC based on MD5 and SHA-1. Additionally, we compare the obtained results with the performance of a PKC-based authentication method using ECDSA.
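
    The standard library is enough to reproduce the spirit of the HMAC part of that comparison; CMAC, GMAC, and the ECDSA comparison require an external cryptography library and are omitted. The micro-benchmark below (packet size, key size, and iteration count are arbitrary assumptions) times per-packet HMAC generation for short WSN-style messages.

```python
import hashlib
import hmac
import os
import time

key = os.urandom(16)          # shared node key (illustrative size)
packet = os.urandom(32)       # short sensor packet
N = 100_000

for name, digestmod in [("HMAC-MD5", hashlib.md5), ("HMAC-SHA1", hashlib.sha1)]:
    t0 = time.perf_counter()
    for _ in range(N):
        hmac.new(key, packet, digestmod).digest()
    dt = time.perf_counter() - t0
    print(f"{name}: {1e6 * dt / N:.2f} microseconds per packet")
```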

  5. Method validation for control determination of mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry.

    PubMed

    Torres, Daiane Placido; Martins-Teixeira, Maristela Braga; Cadore, Solange; Queiroz, Helena Müller

    2015-01-01

    A method for the determination of total mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) has been validated following international foodstuff protocols in order to fulfill the Brazilian National Residue Control Plan. The experimental parameters have been previously studied and optimized according to specific legislation on validation and inorganic contaminants in foodstuff. Linearity, sensitivity, specificity, detection and quantification limits, precision (repeatability and within-laboratory reproducibility), robustness as well as accuracy of the method have been evaluated. Linearity of response was satisfactory for the two concentration ranges available on the TDA AAS equipment, between approximately 25.0 and 200.0 μg kg(-1) (quadratic regression) and 250.0 and 2000.0 μg kg(-1) (linear regression) of mercury. The residues for both ranges were homoscedastic and independent, with normal distribution. Correlation coefficients obtained for these ranges were higher than 0.995. The limit of quantification (LOQ) and the limit of detection of the method (LDM), based on the signal standard deviation (SD) for a low-in-mercury sample, were 3.0 and 1.0 μg kg(-1), respectively. Repeatability of the method was better than 4%. Within-laboratory reproducibility achieved a relative SD better than 6%. Robustness of the current method was evaluated and pointed to sample mass as a significant factor. Accuracy (assessed as the analyte recovery) was calculated on the basis of the repeatability, and ranged from 89% to 99%. The obtained results showed the suitability of the present method for direct mercury measurement in fresh fish and shrimp samples and the importance of monitoring the analysis conditions for food control purposes. Additionally, the competence of this method was recognized by accreditation under the standard ISO/IEC 17025. PMID:25996815
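
    To make the validation quantities concrete, the short sketch below computes detection limits and recovery using one common convention (LOD ≈ 3×SD and LOQ ≈ 10×SD of replicate measurements of a low-mercury sample; recovery as measured/spiked). The replicate and spike values are hypothetical, and the exact formulas prescribed by the protocol cited in the paper may differ.

      import statistics

      def detection_limits(replicates_ug_kg, k_lod=3.0, k_loq=10.0):
          """LOD/LOQ estimated from replicate measurements of a low-mercury sample."""
          sd = statistics.stdev(replicates_ug_kg)
          return k_lod * sd, k_loq * sd

      def recovery_percent(measured_ug_kg, spiked_ug_kg):
          """Accuracy expressed as analyte recovery."""
          return 100.0 * measured_ug_kg / spiked_ug_kg

      low_sample = [0.9, 1.1, 1.0, 0.8, 1.2, 1.0, 0.9]   # hypothetical replicates, ug/kg
      lod, loq = detection_limits(low_sample)
      print(f"LOD ~ {lod:.1f} ug/kg, LOQ ~ {loq:.1f} ug/kg")
      print(f"Recovery: {recovery_percent(47.5, 50.0):.0f} %")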

  6. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
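
    To illustrate the SISR machinery the abstract refers to, the sketch below runs a particle filter on a toy exponential-growth state-space model with simulated counts, an informative prior on the growth rate, multinomial resampling, and a small kernel-smoothing jitter on the static parameter. The model, priors, and noise levels are invented for illustration and are far simpler than the authors' structured population models.

      import numpy as np

      rng = np.random.default_rng(1)

      # Simulate 25 years of counts from a simple exponential-growth state-space model.
      true_growth, n_years = 1.03, 25
      N = np.empty(n_years)
      N[0] = 500.0
      for t in range(1, n_years):
          N[t] = N[t - 1] * true_growth * rng.lognormal(0.0, 0.05)
      counts = rng.poisson(N)

      # Sequential importance sampling/resampling with an informative prior on growth.
      n_particles = 5000
      growth = rng.normal(1.03, 0.02, n_particles)      # informative prior
      state = rng.uniform(300, 700, n_particles)        # prior on initial population size
      for t in range(n_years):
          if t > 0:
              state = state * growth * rng.lognormal(0.0, 0.05, n_particles)
          # Importance weights: Poisson likelihood of the observed count (up to a constant).
          logw = counts[t] * np.log(state) - state
          w = np.exp(logw - logw.max())
          w /= w.sum()
          idx = rng.choice(n_particles, n_particles, p=w)   # multinomial resampling
          state, growth = state[idx], growth[idx]
          # Kernel-smoothing jitter to limit particle depletion of the static parameter.
          growth = growth + rng.normal(0.0, 0.002, n_particles)

      print(f"final-year estimate: {state.mean():.0f}  (truth {N[-1]:.0f})")
      print(f"posterior growth rate: {growth.mean():.3f} +/- {growth.std():.3f}")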

  7. Sampling uncertainty evaluation for data acquisition board based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ge, Leyi; Wang, Zhongyu

    2008-10-01

    Evaluating the sampling uncertainty of a data acquisition board is a difficult problem in the field of signal sampling. This paper first analyzes the sources of data acquisition board sampling uncertainty, then introduces a simulation approach to evaluating data acquisition board sampling uncertainty based on the Monte Carlo method and puts forward a model relating the sampling uncertainty results to the number of samples and the number of simulation runs. For different sample numbers and different signal ranges, the authors establish a random sampling uncertainty evaluation program for a PCI-6024E data acquisition board to execute the simulation. The results of the proposed Monte Carlo simulation method are in good agreement with the GUM ones, demonstrating the validity of the Monte Carlo method.
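
    A minimal sketch of the Monte Carlo idea is given below: a DC level acquired by an idealised board is simulated many times with random gain, offset, noise, and quantisation contributions, and the spread of the resulting mean values gives the sampling uncertainty as a function of the number of samples. The error model and its parameters are assumptions for illustration, not the specifications of the PCI-6024E board used in the paper.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulate_acquisition(n_samples, full_scale=10.0, bits=12,
                               gain_sd=1e-3, offset_sd=2e-3, noise_sd=1e-3):
          """One simulated acquisition of a 1 V DC level by an idealised DAQ board."""
          true_level = 1.0
          lsb = 2 * full_scale / 2**bits
          gain = 1.0 + rng.normal(0.0, gain_sd)          # per-acquisition gain error
          offset = rng.normal(0.0, offset_sd)            # per-acquisition offset error
          samples = gain * true_level + offset + rng.normal(0.0, noise_sd, n_samples)
          quantised = np.round(samples / lsb) * lsb      # ADC quantisation
          return quantised.mean()

      for n_samples in (64, 256, 1024):
          trials = np.array([simulate_acquisition(n_samples) for _ in range(5000)])
          print(f"n={n_samples:5d}: mean={trials.mean():.4f} V, "
                f"sampling uncertainty (1 sigma)={trials.std(ddof=1)*1e3:.2f} mV")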

  8. A comparison of swab and maceration methods for bacterial sampling of pig carcasses.

    PubMed Central

    Morgan, I. R.; Krautil, F.; Craven, J. A.

    1985-01-01

    A swabbing technique was compared with an excision and maceration technique for bacteriological sampling of pig carcass skin surfaces. Total viable counts at 37 degrees C obtained by swabbing were 46% of those obtained by maceration. At 21 degrees C, swabbing gave total viable counts which were 54% of the counts obtained from excision samples. Escherichia coli counts showed wide variation with both sampling methods. Neither method was more efficient than the other in recovering E. coli, although excision sampling gave generally higher counts. Both methods were equally effective at recovering salmonellae from carcass surfaces. There was no significant difference between the methods in recovering particular Salmonella serotypes. PMID:3905957

  9. A Novel Method of Failure Sample Selection for Electrical Systems Using Ant Colony Optimization

    PubMed Central

    Tian, Shulin; Yang, Chenglin; Liu, Cheng

    2016-01-01

    The influence of failure propagation is ignored in failure sample selection based on the traditional testability demonstration experiment method. Traditional failure sample selection generally omits some failures during the selection, and this can create serious risks in use because those omitted failures may trigger severe propagation failures. This paper proposes a new failure sample selection method to solve the problem. First, the method uses a directed graph and ant colony optimization (ACO) to obtain a subsequent failure propagation set (SFPS) based on a failure propagation model, and then we propose a new failure sample selection method on the basis of the number of SFPS. Compared with the traditional sampling plan, this method is able to improve the coverage of testing failure samples, increase the capacity of diagnosis, and decrease the risk of use. PMID:27738424
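
    The sketch below only illustrates what a subsequent failure propagation set (SFPS) is and how its size can rank failures for sample selection; it uses plain graph reachability on a made-up failure-propagation graph, whereas the paper builds the set with ant colony optimization on a real failure propagation model.

      from collections import defaultdict

      # Hypothetical failure-propagation graph: an edge f1 -> f2 means failure f1
      # can propagate to failure f2.
      edges = [("F1", "F3"), ("F1", "F4"), ("F2", "F4"),
               ("F3", "F5"), ("F4", "F5"), ("F4", "F6")]
      graph = defaultdict(list)
      for src, dst in edges:
          graph[src].append(dst)

      def sfps(failure):
          """Subsequent failure propagation set: all failures reachable from `failure`."""
          seen, stack = set(), [failure]
          while stack:
              for nxt in graph[stack.pop()]:
                  if nxt not in seen:
                      seen.add(nxt)
                      stack.append(nxt)
          return seen

      failures = ["F1", "F2", "F3", "F4", "F5", "F6"]
      # Rank failures by |SFPS| so sample selection favours failures with wide impact.
      for f in sorted(failures, key=lambda f: len(sfps(f)), reverse=True):
          print(f, sorted(sfps(f)))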

  10. An intercomparison study of analytical methods used for quantification of levoglucosan in ambient aerosol filter samples

    NASA Astrophysics Data System (ADS)

    Yttri, K. E.; Schnelle-Kreis, J.; Maenhaut, W.; Abbaszade, G.; Alves, C.; Bjerke, A.; Bonnier, N.; Bossi, R.; Claeys, M.; Dye, C.; Evtyugina, M.; García-Gacio, D.; Hillamo, R.; Hoffer, A.; Hyder, M.; Iinuma, Y.; Jaffrezo, J.-L.; Kasper-Giebl, A.; Kiss, G.; López-Mahia, P. L.; Pio, C.; Piot, C.; Ramirez-Santa-Cruz, C.; Sciare, J.; Teinilä, K.; Vermeylen, R.; Vicente, A.; Zimmermann, R.

    2015-01-01

    laboratories the mean PE was within ±10%. For galactosan, the mean PE for the participating laboratories ranged from -84 to 593%, and as for mannosan 33% of the laboratories reported a mean PE within ±10%. The variability of the various analytical methods, as defined by their minimum and maximum PE value, was typically better for levoglucosan than for mannosan and galactosan, ranging from 3.2 to 41% for levoglucosan, from 10 to 67% for mannosan, and from 6 to 364% for galactosan. For the levoglucosan to mannosan ratio, which may be used to assess the relative importance of softwood versus hardwood burning, the variability only ranged from 3.5 to 24%. To our knowledge, this is the first major intercomparison on analytical methods used to quantify monosaccharide anhydrides in ambient aerosol filter samples conducted and reported in the scientific literature. The results show that for levoglucosan the accuracy is only slightly lower than that reported for analysis of SO42- (sulfate) on filter samples, a constituent that has been analysed by numerous laboratories for several decades, typically by ion chromatography, and which is considered a fairly easy constituent to measure. Hence, the results obtained for levoglucosan with respect to accuracy are encouraging and suggest that levels of levoglucosan, and to a lesser extent mannosan and galactosan, obtained by most of the analytical methods currently used to quantify monosaccharide anhydrides in ambient aerosol filter samples, are comparable. Finally, the various analytical methods used in the current study should be tested for other aerosol matrices and concentrations as well, the most obvious being summertime aerosol samples affected by wildfires and/or agricultural fires.

  11. An intercomparison study of analytical methods used for quantification of levoglucosan in ambient aerosol filter samples

    NASA Astrophysics Data System (ADS)

    Yttri, K. E.; Schnelle-Kreiss, J.; Maenhaut, W.; Alves, C.; Bossi, R.; Bjerke, A.; Claeys, M.; Dye, C.; Evtyugina, M.; García-Gacio, D.; Gülcin, A.; Hillamo, R.; Hoffer, A.; Hyder, M.; Iinuma, Y.; Jaffrezo, J.-L.; Kasper-Giebl, A.; Kiss, G.; López-Mahia, P. L.; Pio, C.; Piot, C.; Ramirez-Santa-Cruz, C.; Sciare, J.; Teinilä, K.; Vermeylen, R.; Vicente, A.; Zimmermann, R.

    2014-07-01

    the laboratories the mean PE was within ±10%. For galactosan, the mean PE for the participating laboratories ranged from -84 to 593%, and as for mannosan 33% of the laboratories reported a mean PE within ±10%. The variability of the various analytical methods, as defined by their minimum and maximum PE value, was typically better for levoglucosan than for mannosan and galactosan, ranging from 3.2 to 41% for levoglucosan, from 10 to 67% for mannosan, and from 6 to 364% for galactosan. For the levoglucosan to mannosan ratio, which may be used to assess the relative importance of softwood vs. hardwood burning, the variability only ranged from 3.5 to 24%. To our knowledge, this is the first major intercomparison on analytical methods used to quantify monosaccharide anhydrides in ambient aerosol filter samples conducted and reported in the scientific literature. The results show that for levoglucosan the accuracy is only slightly lower than that reported for analysis of SO42- on filter samples, a constituent that has been analyzed by numerous laboratories for several decades, typically by ion chromatography, and which is considered a fairly easy constituent to measure. Hence, the results obtained for levoglucosan with respect to accuracy are encouraging and suggest that levels of levoglucosan, and to a lesser extent mannosan and galactosan, obtained by most of the analytical methods currently used to quantify monosaccharide anhydrides in ambient aerosol filter samples, are comparable. Finally, the various analytical methods used in the current study should be tested for other aerosol matrices and concentrations as well, the most obvious being summertime aerosol samples affected by wild fires and/or agricultural fires.

  12. Persistence Pays Off: Follow-Up Methods for Difficult-to-Track Longitudinal Samples*

    PubMed Central

    Kleschinsky, John H.; Bosworth, Leslie B.; Nelson, Sarah E.; Walsh, Erinn K.; Shaffer, Howard J.

    2009-01-01

    Objective: Evolving privacy and confidentiality regulations make achieving high completion rates in longitudinal studies challenging. Periodically reviewing the methods researchers use to retain participants throughout the follow-up period is important. We review the effectiveness of methods to maximize completion rates in a 1-year longitudinal study of repeat driving-under-the-influence (DUI) offenders. Method: During the course of 21 months, we attempted to follow-up with 704 participants of a licensed residential treatment facility for repeat DUI offenders. High rates of lifetime Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition, substance-use disorders (97.6%) and nonsubstance- or nongambling-related psychiatric disorders (44.5%) among the sample made tracking participants difficult. To locate participants and complete follow-up interviews, we obtained baseline information, contacted collaterals, sent mailed reminders, searched Internet databases, and gave a monetary incentive for completing study interviews. Results: We located 608 participants with active telephone numbers (87.4%) and completed interviews with 488 (70.1% of the entire eligible sample and 80.3% of those with active telephone numbers), after an average (SD) of 8.6 (9.1) calls (median = 5.0). Increasing the number of calls continued to yield additional completions at 10, 20, and 30 calls; at approximately 40 telephone calls, the potential return for additional calls did not justify the added effort. Conclusions: These results suggest that researchers need to (1) employ more than 10 telephone calls to adequately track difficult-to-follow substance-using populations, and (2) prepare for a subsample of participants who might require more extensive contact. These results highlight the importance of using empirical guidelines to plan estimates for the number of contacts needed to achieve an adequate follow-up completion rate. PMID:19737500

  13. [Study on a method of selecting calibration samples in NIR spectral analysis].

    PubMed

    Qin, Chong; Chen, Wen-Wen; He, Xiong-Kui; Zhang, Lu-Da; Ma, Xiang

    2009-10-01

    In the present paper, a simple but novel method based on a maximum linearly independent group was introduced into near-infrared (NIR) spectral analysis for selecting representative calibration samples. The experimental material consisted of 2,652 tobacco powder samples, with 1,001 samples randomly selected as the prediction set and the rest as the candidate set from which the calibration sample set was selected. The method of locating maximum linearly independent vectors was used to select representative samples from the spectral vectors of the candidate set; the algorithm was implemented with the function rref(X,q) in Matlab. The maximum linearly independent spectral vectors were treated as the calibration sample set. When different calculation precisions q were given, different numbers of representative samples were obtained. The selected calibration sample set was used to build a PLS regression model to predict the total sugar of tobacco powder samples, and the model was used to analyze the 1,001 samples in the prediction set. When 32 representative samples were selected, the model showed good predictive accuracy, with a mean relative prediction error of 3.6210% and a correlation coefficient of 0.9643. By a paired-samples t-test, we found that the difference between the predictions of the model built from 32 samples and those of the model built from 146 samples was not significant (alpha=0.05). We also compared random selection of calibration samples with maximum linearly independent selection in terms of the predictive performance of the resulting models; correspondingly, six calibration sample sets were selected, one of which included 28 samples, while the others included 32, 41, 76, 146 and 163 samples respectively. Selecting samples by maximum linear independence turned out to be clearly better than random selection. The results indicated that the proposed method can not only effectively enhance the cost-effectiveness of NIR
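
    The core idea, picking a numerically maximal linearly independent subset of candidate spectra, can be sketched with a rank-revealing (pivoted) QR factorization standing in for Matlab's rref(X,q). The synthetic spectra, the tolerance choice, and the 30-dimensional structure below are assumptions for illustration only.

      import numpy as np
      from scipy.linalg import qr

      rng = np.random.default_rng(3)

      # Hypothetical candidate set: 200 spectra with 500 wavelength points, lying
      # close to a 30-dimensional subspace plus noise (a stand-in for NIR data).
      basis = rng.normal(size=(30, 500))
      spectra = rng.normal(size=(200, 30)) @ basis + 0.01 * rng.normal(size=(200, 500))

      # Columns of spectra.T are candidate samples; pivoted QR orders them so the
      # leading pivots form a (numerically) maximal linearly independent group.
      _, r, pivots = qr(spectra.T, mode="economic", pivoting=True)
      tol = 0.05 * abs(r[0, 0])                     # tolerance playing the role of q
      rank = int(np.sum(np.abs(np.diag(r)) > tol))
      calibration_idx = sorted(pivots[:rank])
      print(f"selected {rank} calibration samples:", calibration_idx[:10], "...")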

  14. Methods of biological fluids sample preparation - biogenic amines, methylxanthines, water-soluble vitamins.

    PubMed

    Płonka, Joanna

    2015-01-01

    In recent years, demands on the amount of information that can be obtained from the analysis of a single sample have increased. For time and economic reasons it is necessary to examine a larger number of compounds, and compounds from different groups, at the same time. This can best be seen in areas such as clinical analysis. In many diseases, the best results for patients are obtained when treatment fits the individual characteristics of the patient. Dosage monitoring is important at the beginning of therapy and throughout the course of treatment. In the treatment of many diseases, biogenic amines (dopamine, serotonin) and methylxanthines (theophylline, theobromine, caffeine) play an important role. They are used as drugs separately or in combination with others to support and strengthen the action of other drugs - for example, the combination of caffeine and paracetamol. Vitamin supplementation may also be an integral part of the treatment process. Complete sample preparation parameters for the extraction of the above compounds from biological matrices are reviewed, with particular attention given to the preparation stage and extraction methods. This review provides universal guidance on establishing common procedures across laboratories to facilitate the preparation and analysis of all discussed compounds. PMID:25381720

  15. Rapid Method for Ra-226 and Ra-228 in Water Samples

    SciTech Connect

    Maxwell, Sherrod, L. III

    2006-02-10

    The measurement of radium isotopes in natural waters is important for oceanographic studies and for public health reasons. Ra-226 (1620 year half-life) is one of the most toxic of the long-lived alpha emitters present in the environment due to its long life and its tendency to concentrate in bones, which increases the internal radiation dose of individuals. The analysis of radium-226 and radium-228 in natural waters can be tedious and time-consuming. Different sample preparation methods are often required to prepare Ra-226 and Ra-228 for separate analyses. A rapid method has been developed at the Savannah River Environmental Laboratory that effectively separates both Ra-226 and Ra-228 (via Ac-228) for assay. This method uses MnO2 Resin from Eichrom Technologies (Darien, IL, USA) to preconcentrate Ra-226 and Ra-228 rapidly from water samples, along with a Ba-133 tracer. DGA Resin® (Eichrom) and Ln-Resin® (Eichrom) are employed in tandem to prepare Ra-226 for assay by alpha spectrometry and to determine Ra-228 via the measurement of Ac-228 by gas proportional counting. After preconcentration, the manganese dioxide is dissolved from the resin and passed through stacked Ln-Resin-DGA Resin cartridges that remove uranium and thorium interferences and retain Ac-228 on the DGA Resin. The eluate that passes through this column is evaporated, redissolved at a lower acidity and passed through Ln-Resin again to further remove interferences before performing a barium sulfate microprecipitation. The Ac-228 is stripped from the resin, collected using cerium fluoride microprecipitation and counted by gas proportional counting. By using vacuum box cartridge technology with rapid flow rates, sample preparation time is minimized.

  16. METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN

    EPA Science Inventory

    An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...

  17. Practical method for extraction of PCR-quality DNA from environmental soil samples.

    PubMed

    Fitzpatrick, Kelly A; Kersh, Gilbert J; Massung, Robert F

    2010-07-01

    Methods for the extraction of PCR-quality DNA from environmental soil samples by using pairs of commercially available kits were evaluated. Coxiella burnetii DNA was detected in spiked soil samples at <1,000 genome equivalents per gram of soil and in 12 (16.4%) of 73 environmental soil samples.

  18. A robust adaptive sampling method for faster acquisition of MR images.

    PubMed

    Vellagoundar, Jaganathan; Machireddy, Ramasubba Reddy

    2015-06-01

    A robust adaptive k-space sampling method is proposed for faster acquisition and reconstruction of MR images. In this method, undersampling patterns are generated based on magnitude profile of a fully acquired 2-D k-space data. Images are reconstructed using compressive sampling reconstruction algorithm. Simulation experiments are done to assess the performance of the proposed method under various signal-to-noise ratio (SNR) levels. The performance of the method is better than non-adaptive variable density sampling method when k-space SNR is greater than 10dB. The method is implemented on a fully acquired multi-slice raw k-space data and a quality assurance phantom data. Data reduction of up to 60% is achieved in the multi-slice imaging data and 75% is achieved in the phantom imaging data. The results show that reconstruction accuracy is improved over non-adaptive or conventional variable density sampling method. The proposed sampling method is signal dependent and the estimation of sampling locations is robust to noise. As a result, it eliminates the necessity of mathematical model and parameter tuning to compute k-space sampling patterns as required in non-adaptive sampling methods.
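
    A minimal sketch of the magnitude-guided undersampling idea is given below: sampling locations are drawn with probability proportional to the magnitude of a fully acquired k-space, here for a synthetic 128x128 phantom. The keep fraction, the phantom, and the absence of a guaranteed fully sampled centre region are simplifying assumptions, and the compressive-sensing reconstruction step is omitted.

      import numpy as np

      rng = np.random.default_rng(7)

      def adaptive_mask(kspace_mag, keep_fraction=0.4):
          """Sample k-space locations with probability proportional to magnitude."""
          p = kspace_mag / kspace_mag.sum()
          n_keep = int(keep_fraction * kspace_mag.size)
          flat_idx = rng.choice(kspace_mag.size, size=n_keep, replace=False,
                                p=p.ravel())
          mask = np.zeros(kspace_mag.size, dtype=bool)
          mask[flat_idx] = True
          return mask.reshape(kspace_mag.shape)

      # Hypothetical fully sampled k-space of a simple 128x128 phantom image.
      image = np.zeros((128, 128))
      image[40:90, 30:100] = 1.0
      kspace = np.fft.fftshift(np.fft.fft2(image))
      mask = adaptive_mask(np.abs(kspace), keep_fraction=0.4)
      print(f"kept {mask.mean():.0%} of k-space; "
            f"centre 16x16 sampling density: {mask[56:72, 56:72].mean():.0%}")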

  19. Comparison of uranium determination in some Syrian geologic samples using three reactor based methods

    PubMed

    Jubeli

    2000-04-01

    A set of 25 samples of soil, sediments, carbonate and phosphate rocks from Syria was analysed for uranium using three reactor-based methods: instrumental neutron activation analysis (INAA), delayed neutron counting (DNC) and one cycle of irradiation utilizing the cyclic activation system (CAS). Although all three methods are capable of irradiating samples, the last is the least established for U determination in rocks. The measurements obtained by the three methods are compared. The results show good agreement, with a distinct linear relationship and significant positive correlation coefficients. It was concluded that the CAS method could reliably be used to rapidly determine uranium in geological samples. PMID:10800739

  20. Multivariate Methods for Prediction of Geologic Sample Composition with Laser-Induced Breakdown Spectroscopy

    NASA Technical Reports Server (NTRS)

    Morris, Richard; Anderson, R.; Clegg, S. M.; Bell, J. F., III

    2010-01-01

    Laser-induced breakdown spectroscopy (LIBS) uses pulses of laser light to ablate a material from the surface of a sample and produce an expanding plasma. The optical emission from the plasma produces a spectrum which can be used to classify target materials and estimate their composition. The ChemCam instrument on the Mars Science Laboratory (MSL) mission will use LIBS to rapidly analyze targets remotely, allowing more resource- and time-intensive in-situ analyses to be reserved for targets of particular interest. ChemCam will also be used to analyze samples that are not reachable by the rover's in-situ instruments. Due to these tactical and scientific roles, it is important that ChemCam-derived sample compositions are as accurate as possible. We have compared the results of partial least squares (PLS), multilayer perceptron (MLP) artificial neural networks (ANNs), and cascade correlation (CC) ANNs to determine which technique yields better estimates of quantitative element abundances in rock and mineral samples. The number of hidden nodes in the MLP ANNs was optimized using a genetic algorithm. The influence of two data preprocessing techniques were also investigated: genetic algorithm feature selection and averaging the spectra for each training sample prior to training the PLS and ANN algorithms. We used a ChemCam-like laboratory stand-off LIBS system to collect spectra of 30 pressed powder geostandards and a diverse suite of 196 geologic slab samples of known bulk composition. We tested the performance of PLS and ANNs on a subset of these samples, choosing to focus on silicate rocks and minerals with a loss on ignition of less than 2 percent. This resulted in a set of 22 pressed powder geostandards and 80 geologic samples. Four of the geostandards were used as a validation set and 18 were used as the training set for the algorithms. We found that PLS typically resulted in the lowest average absolute error in its predictions, but that the optimized MLP ANN and
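
    As a sketch of the PLS step, the code below fits a partial least squares regression that maps synthetic "spectra" to element abundances and reports a validation error. The random mixing model, channel count, and choice of 10 latent components are placeholders and bear no relation to the actual ChemCam training set or the paper's optimized models.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.metrics import mean_absolute_error

      rng = np.random.default_rng(42)

      # Hypothetical stand-in for LIBS data: training/validation spectra whose
      # element abundances are linear mixtures buried in noise.
      n_train, n_val, n_channels, n_elements = 80, 20, 2000, 8
      mixing = rng.normal(size=(n_elements, n_channels))
      y_train = rng.uniform(0, 50, size=(n_train, n_elements))     # wt% abundances
      y_val = rng.uniform(0, 50, size=(n_val, n_elements))
      X_train = y_train @ mixing + rng.normal(0, 5, size=(n_train, n_channels))
      X_val = y_val @ mixing + rng.normal(0, 5, size=(n_val, n_channels))

      pls = PLSRegression(n_components=10)
      pls.fit(X_train, y_train)
      pred = pls.predict(X_val)
      print(f"mean absolute error on validation set: "
            f"{mean_absolute_error(y_val, pred):.2f} wt%")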

  1. Systems and methods for separating particles and/or substances from a sample fluid

    DOEpatents

    Mariella, Jr., Raymond P.; Dougherty, George M.; Dzenitis, John M.; Miles, Robin R.; Clague, David S.

    2016-11-01

    Systems and methods for separating particles and/or toxins from a sample fluid. A method according to one embodiment comprises simultaneously passing a sample fluid and a buffer fluid through a chamber such that a fluidic interface is formed between the sample fluid and the buffer fluid as the fluids pass through the chamber, the sample fluid having particles of interest therein; applying a force to the fluids for urging the particles of interest to pass through the interface into the buffer fluid; and substantially separating the buffer fluid from the sample fluid.

  2. A method for estimating the relative importance of characters in cladistic analyses.

    PubMed

    DeGusta, David

    2004-08-01

    The method of character importance ranking (CIR) is proposed here as a means for estimating the relative "importance" of characters in cladistic analyses, especially those based on morphological features. CIR uses the weighting variable to incrementally remove one character at a time from the analysis, and then evaluates the impact of the removal on the shape of the cladogram. The greater the impact, the more important the character. The CIR method for determining which characters drive the shape of a particular cladogram has several applications. It identifies the characters with the strongest (though not necessarily most accurate) signal in a cladistic analysis; it permits the informed prioritization of characters for further investigation via genetic, developmental, and functional approaches; and it highlights characters whose definition, scoring, independence, and variation should be reviewed with particular care. The application of CIR reveals that at least some cladograms depend entirely on a single character.

  3. Flexible backbone sampling methods to model and design protein alternative conformations.

    PubMed

    Ollikainen, Noah; Smith, Colin A; Fraser, James S; Kortemme, Tanja

    2013-01-01

    Sampling alternative conformations is key to understanding how proteins work and engineering them for new functions. However, accurately characterizing and modeling protein conformational ensembles remain experimentally and computationally challenging. These challenges must be met before protein conformational heterogeneity can be exploited in protein engineering and design. Here, as a stepping stone, we describe methods to detect alternative conformations in proteins and strategies to model these near-native conformational changes based on backrub-type Monte Carlo moves in Rosetta. We illustrate how Rosetta simulations that apply backrub moves improve modeling of point mutant side-chain conformations, native side-chain conformational heterogeneity, functional conformational changes, tolerated sequence space, protein interaction specificity, and amino acid covariation across protein-protein interfaces. We include relevant Rosetta command lines and RosettaScripts to encourage the application of these types of simulations to other systems. Our work highlights that critical scoring and sampling improvements will be necessary to approximate conformational landscapes. Challenges for the future development of these methods include modeling conformational changes that propagate away from designed mutation sites and modulating backbone flexibility to predictively design functionally important conformational heterogeneity.

  4. [Based on Trigger Sampling Method and Phase Correction of Infrared Spectrum Measurement Applications].

    PubMed

    Li, Yan; Gao, Min-guang; Xu, Liang; Li, Sheng; Li, Xiang-xian; Ye, Shu-bin; Liu, Jian-guo

    2015-07-01

    A Fourier transform infrared spectrometer can measure multiple components of high-temperature flue gas simultaneously and has broad application prospects in this field. One of the important factors determining the success of such applications is controlling the phase error in interferogram sampling by the measuring system. This paper discusses the main causes of phase error in the system: by analyzing the zero-crossing uniformity of the helium-neon laser interference signal, it shows that the phase difference between the laser signal and the reference signal is the main source of phase error. The influence of the phase error on the instrument's signal-to-noise ratio (SNR) is analyzed quantitatively, and the Mertz phase correction method improves the original signal-to-noise ratio of the instrument by a factor of several thousand. Related experiments show that a system based on this interferogram sampling method satisfies the needs of high-temperature flue gas measurements. PMID:26717778
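
    The sketch below illustrates the core of Mertz-style phase correction on a synthetic interferogram: a low-resolution phase spectrum is estimated from a short double-sided segment around the zero-path-difference (ZPD) point and then removed multiplicatively from the full spectrum. Apodization, sub-sample ZPD handling, and other practical details of a real FT-IR pipeline are deliberately omitted, and the band shape and phase error are invented for illustration.

      import numpy as np

      n = 4096
      nu = np.fft.rfftfreq(n)                            # normalised wavenumber axis
      true_spec = np.exp(-((nu - 0.2) / 0.02) ** 2)      # one synthetic emission band

      # Interferogram whose spectrum carries a slowly varying phase error,
      # mimicking imperfect interferogram sampling.
      phase_err = 0.4 + 3.0 * nu
      igram = np.fft.irfft(true_spec * np.exp(1j * phase_err), n)

      raw = np.fft.rfft(igram)                           # phase-distorted spectrum

      # Mertz-style correction: low-resolution phase from a short double-sided
      # segment around the ZPD (index 0 here), interpolated to the full grid.
      half = 128
      short = np.concatenate([igram[:half], igram[-half:]])   # FFT-ordered segment
      phase_lowres = np.angle(np.fft.rfft(short))
      phase = np.interp(nu, np.fft.rfftfreq(2 * half), phase_lowres)
      corrected = np.real(raw * np.exp(-1j * phase))

      band = np.abs(nu - 0.2) < 0.05                     # evaluate near the band only
      print("max error, uncorrected:", np.max(np.abs(raw.real - true_spec)[band]))
      print("max error, corrected  :", np.max(np.abs(corrected - true_spec)[band]))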

  5. Analysis of polyethylene microplastics in environmental samples, using a thermal decomposition method.

    PubMed

    Dümichen, Erik; Barthel, Anne-Kathrin; Braun, Ulrike; Bannick, Claus G; Brand, Kathrin; Jekel, Martin; Senz, Rainer

    2015-11-15

    Small polymer particles with a diameter of less than 5 mm, called microplastics, find their way into the environment from polymer debris and industrial production. A method is therefore needed to identify and quantify microplastics in various environmental samples and generate reliable concentration values. Such concentration values, i.e. quantitative results, are necessary for an assessment of microplastics in environmental media. This was achieved by thermal extraction in thermogravimetric analysis (TGA) connected to a solid-phase adsorber. These adsorbers were subsequently analysed by thermal desorption gas chromatography mass spectrometry (TDS-GC-MS). In comparison to other chromatographic methods, such as pyrolysis gas chromatography mass spectrometry (Py-GC-MS), the relatively high sample masses used in TGA (about 200 times higher than in Py-GC-MS) enable the measurement of complex matrices that are not homogeneous on a small scale. Through the decomposition products characteristic of each kind of polymer, it is possible to identify and even quantify polymer particles in various matrices. Polyethylene (PE), one of the most important representatives of microplastics, was chosen as an example for identification and quantification.

  6. Total nitrogen determination of various sample types: a comparison of the Hach, Kjeltec, and Kjeldahl methods.

    PubMed

    Watkins, K L; Veum, T L; Krause, G F

    1987-01-01

    Conventional Kjeldahl analysis with modifications, Kjeltec analysis with block digestion and semiautomated distillation, and the Hach method for determining nitrogen (N) were compared using a wide range of samples. Twenty different sample types were ground and mixed. Each sample type was divided into 5 subsamples which were analyzed for N by each of the 3 methods. In each sample type, differences (P less than 0.05) were detected among the 3 N determination methods in 5 of the 20 N sources analyzed. The mean N content over all 20 samples was higher with Kjeldahl analysis (P less than 0.05) than with Kjeltec, while Hach analysis produced intermediate results. Results also indicated that the Hach procedure had the greatest ability to detect differences in N content among sample types, being more sensitive than either of the other methods (P less than 0.05).

  7. Improved Butanol-Methanol (BUME) Method by Replacing Acetic Acid for Lipid Extraction of Biological Samples.

    PubMed

    Cruz, Mutya; Wang, Miao; Frisch-Daiello, Jessica; Han, Xianlin

    2016-07-01

    Extraction of lipids from biological samples is a critical step in lipidomics, especially for shotgun lipidomics where lipid extracts are directly infused into a mass spectrometer. The butanol-methanol (BUME) extraction method was originally developed to extract lipids from plasma samples with 1 % acetic acid. Considering some lipids are sensitive to acidic environments, we modified this protocol by replacing acetic acid with lithium chloride solution and extended the modified extraction to tissue samples. Although no significant reduction of plasmalogen levels in the acidic BUME extracts of rat heart samples was found, the modified method was established to extract various tissue samples, including rat liver, heart, and plasma. Essentially identical profiles of the majority of lipid classes were obtained from the extracts of the modified BUME and traditional Bligh-Dyer methods. However, it was found that neither the original, nor the modified BUME method was suitable for 4-hydroxyalkenal species measurement in biological samples. PMID:27245345

  8. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    PubMed

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

    To strengthen the scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to ensure the standardization of quality assurance and quality control methods for background CO2 sampling and analysis. Based on the greenhouse gas sampling and observation experience of the CMA, and using portable sampling observation and the WS-CRDS analysis technique as an example, this paper systematically introduces the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass bottle quality assurance measures and the systematic quality control method during sample analysis, the correction method during data processing, as well as the data grading quality markers and the data fitting and interpolation method. Finally, using this approach, the CO2 sampling and observation data at atmospheric background stations in 3 typical regions were processed and the concentration variation characteristics were analyzed, indicating that the method can capture the influences of regional and local environmental factors on the observation results and reflect the characteristics of natural and human activities in an objective and accurate way.

  9. Importance of mixed methods in pragmatic trials and dissemination and implementation research.

    PubMed

    Albright, Karen; Gechter, Katherine; Kempe, Allison

    2013-01-01

    With increased attention to the importance of translating research to clinical practice and policy, recent years have seen a proliferation of particular types of research, including pragmatic trials and dissemination and implementation research. Such research seeks to understand how and why interventions function in real-world settings, as opposed to highly controlled settings involving conditions not likely to be repeated outside the research study. Because understanding the context in which interventions are implemented is imperative for effective pragmatic trials and dissemination and implementation research, the use of mixed methods is critical to understanding trial results and the success or failure of implementation efforts. This article discusses a number of dimensions of mixed methods research, utilizing at least one qualitative method and at least one quantitative method, that may be helpful when designing projects or preparing grant proposals. Although the strengths and emphases of qualitative and quantitative approaches differ substantially, methods may be combined in a variety of ways to achieve a deeper level of understanding than can be achieved by one method alone. However, researchers must understand when and how to integrate the data as well as the appropriate order, priority, and purpose of each method. The ability to demonstrate an understanding of the rationale for and benefits of mixed methods research is increasingly important in today's competitive funding environment, and many funding agencies now expect applicants to include mixed methods in proposals. The increasing demand for mixed methods research necessitates broader methodological training and deepened collaboration between medical, clinical, and social scientists. Although a number of challenges to conducting and disseminating mixed methods research remain, the potential for insight generated by such work is substantial.

  10. Optical method for the characterization of laterally-patterned samples in integrated circuits

    DOEpatents

    Maris, Humphrey J.

    2001-01-01

    Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.

  11. Sampling Methods and the Accredited Population in Athletic Training Education Research

    ERIC Educational Resources Information Center

    Carr, W. David; Volberding, Jennifer

    2009-01-01

    Context: We describe methods of sampling the widely-studied, yet poorly defined, population of accredited athletic training education programs (ATEPs). Objective: There are two purposes to this study; first to describe the incidence and types of sampling methods used in athletic training education research, and second to clearly define the…

  12. 7 CFR 51.308 - Methods of sampling and calculation of percentages.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and... weigh ten pounds or less, or in any container where the minimum diameter of the smallest apple does not vary more than 1/2 inch from the minimum diameter of the largest apple, percentages shall be...

  13. 7 CFR 51.308 - Methods of sampling and calculation of percentages.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and... weigh ten pounds or less, or in any container where the minimum diameter of the smallest apple does not vary more than 1/2 inch from the minimum diameter of the largest apple, percentages shall be...

  14. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the American... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect...

  15. 40 CFR 80.8 - Sampling methods for gasoline, diesel fuel, fuel additives, and renewable fuels.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of the Federal Register under 5 U.S.C. 552(a) and 1 CFR part 51. To enforce any edition other than... 40 Protection of Environment 17 2014-07-01 2014-07-01 false Sampling methods for gasoline, diesel... Provisions § 80.8 Sampling methods for gasoline, diesel fuel, fuel additives, and renewable fuels....

  16. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the American... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect...

  17. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Register in accordance with 5 U.S.C. 552(a) and 1 CFR part 51. Copies may be obtained from the American... 40 Protection of Environment 16 2010-07-01 2010-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect...

  18. Evaluation of beef trim sampling methods for detection of Shiga toxin-producing Escherichia coli (STEC)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Presence of Shiga toxin-producing Escherichia coli (STEC) is a major concern in ground beef. Several methods for sampling beef trim prior to grinding are currently used in the beef industry. The purpose of this study was to determine the efficacy of the sampling methods for detecting STEC in beef ...

  19. Application of a Permethrin Immunosorbent Assay Method to Residential Soil and Dust Samples

    EPA Science Inventory

    A low-cost, high throughput bioanalytical screening method was developed for monitoring cis/trans-permethrin in dust and soil samples. The method consisted of a simple sample preparation procedure [sonication with dichloromethane followed by a solvent exchange into methanol:wate...

  20. THE INFLUENCE OF PHYSICAL FACTORS ON COMPARATIVE PERFORMANCE OF SAMPLING METHODS IN LARGE RIVERS

    EPA Science Inventory

    In 1999, we compared five existing benthic macroinvertebrate sampling methods used in boatable rivers. Each sampling protocol was performed at each of 60 sites distributed among four rivers in the Ohio River drainage basin. Initial comparison of methods using key macroinvertebr...

  1. Viability of Actinobacillus pleuropneumoniae in frozen pig lung samples and comparison of different methods of direct diagnosis in fresh samples.

    PubMed

    Gutierrez, C B; Rodriguez Barbosa, J I; Gonzalez, O R; Tascon, R I; Rodriguez Ferri, E F

    1992-04-01

    A comparative study of different methods for the diagnosis of Actinobacillus pleuropneumoniae from both fresh and frozen pig lungs is described. A total of 196 lung tissues with pneumonic lesions were examined by culture isolation on chocolate blood agar, as well as by antigen detection by means of the coagglutination test, the immunodiffusion test and an indirect ELISA. These samples were subsequently frozen for 1 yr and then recultured. A. pleuropneumoniae was recovered from fresh lung specimens in 30 cases (15.3%) and from frozen samples in only two cases (0.9%). This difference in isolation rates demonstrates that prolonged freezing had an adverse effect on the viability of this organism in lung samples. A. pleuropneumoniae detection was positive in 134 samples (68.4%) by at least one of the immunological techniques examined. The indirect ELISA was the most sensitive and specific test, with antigen detected in 125 lungs (63.8%). In comparison with the coagglutination and immunodiffusion tests, the sensitivities of the indirect ELISA were 95.8 and 93.7%, and the specificities were 67.0 and 63.4%, respectively.

  2. Estimation of the sugar cane cultivated area from LANDSAT images using the two phase sampling method

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.; Mendonca, F. J.; Lee, D. C. L.; Shimabukuro, Y. E.

    1982-01-01

    A two phase sampling method and the optimal sampling segment dimensions for the estimation of sugar cane cultivated area were developed. This technique employs visual interpretations of LANDSAT images and panchromatic aerial photographs considered as the ground truth. The estimates, as a mean value of 100 simulated samples, represent 99.3% of the true value with a CV of approximately 1%; the relative efficiency of the two phase design was 157% when compared with a one phase aerial photographs sample.
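
    The two-phase (double sampling) regression estimator underlying this design can be sketched in a few lines: a cheap LANDSAT interpretation is available for all phase-1 segments, aerial-photo "ground truth" only for a phase-2 subsample, and the regression between the two is used to correct the cheap estimate. All numbers and the error model below are invented for illustration and are not the study's data.

      import numpy as np

      rng = np.random.default_rng(5)

      # Hypothetical segment data: x = cane proportion interpreted from LANDSAT,
      # y = proportion from aerial photographs (taken as ground truth).
      n1 = 400                                  # phase-1 segments (LANDSAT only)
      true = rng.beta(2, 5, n1)                 # underlying cane proportion per segment
      x = np.clip(true + rng.normal(0, 0.05, n1), 0, 1)        # LANDSAT interpretation
      phase2 = rng.choice(n1, size=60, replace=False)          # phase-2 subsample
      y2 = np.clip(true[phase2] + rng.normal(0, 0.01, 60), 0, 1)   # photo "truth"
      x2 = x[phase2]

      # Two-phase regression estimator of the mean cane proportion.
      b = np.polyfit(x2, y2, 1)[0]
      y_reg = y2.mean() + b * (x.mean() - x2.mean())
      print(f"regression estimate: {y_reg:.3f}   (true mean {true.mean():.3f})")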

  3. Quantitative method of determining beryllium or a compound thereof in a sample

    DOEpatents

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.

    2010-08-24

    A method of determining beryllium or a beryllium compound thereof in a sample, includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.

  4. Quantitative method of determining beryllium or a compound thereof in a sample

    DOEpatents

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.

    2006-10-31

    A method of determining beryllium or a beryllium compound thereof in a sample, includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.

  5. Map showing locations of samples dated by radiocarbon methods in the San Francisco Bay region

    USGS Publications Warehouse

    Wright, Robert H.

    1971-01-01

    The potential value of a radiocarbon date is diminished, however, if adequate site data are not taken with the sample and do not accompany the date in publication.  At a minimum, published dates should include an accurate location for the dated sample, type of material dated and method of dating, nature of the site, depth below surface (or other accurately defined datum) of date sample, stratigraphy of material overlying date sample, and the significance of the data in the study.

  6. Analyses of ammonium in geologic samples: comparison of indophenol-blue and fluorometric methods

    NASA Astrophysics Data System (ADS)

    Johnson, B. W.; Goldblatt, C.; El-Sabaawi, R.; Hanson, N.

    2015-12-01

    Nitrogen in geologic materials is a tracer and record of biologic activity. Analysis can be difficult, however, as concentrations are often low (~1s to 10s of ppm). The most stable form in which N is preserved in rocks and minerals is NH4+, which is derived from organic matter and substitutes into mineral lattices for K+. Thus, measuring NH4+ content serves as a good proxy for total N content. Fluorometry (FL) is the standard technique for aqueous samples and has a number of advantages over older indophenol-blue-based colorimetric (IBC) techniques (Hall, 1993). These include lower sample processing time, safer reagents, more stable reactions, and greater precision. In this study, I adapt a fluorometry technique (Holmes et al., 1999) for use in analyzing NH4+ concentration in geologic materials. Samples and standards were dissolved in hydrofluoric acid, neutralized with potassium hydroxide, then analyzed either with FL or IBC. As part of IBC, a distillation step was carried out to concentrate NH4+, and both techniques were also applied after this step. Initial results show promise, as reproducible, ppm-level concentrations are demonstrated in several different rock types of various ages: Neoproterozoic siliciclastic sediments, a Jurassic granodiorite, a Cretaceous serpentinite, and Neoproterozoic carbonates. Concentrations are comparable between the two methods, though curiously fluorometry indicates NH4+ concentrations about 22% lower. Distillation appears effective in strengthening the signal for FL, though it may not be necessary for accurate results. There are several factors affecting the quality of FL. Sample pH appears to be the most important. At pH>9, NH3 is stable and can easily escape a solution. Ensuring samples remain below pH 9 should prove key. Additional tests are being carried out to improve sample recovery, lessen the time of HF dissolution, and improve accuracy. This new application of a standard technique should prove useful not only in determining

  7. Environmental and mental conditions predicting the experience of involuntary musical imagery: An experience sampling method study.

    PubMed

    Floridou, Georgia A; Müllensiefen, Daniel

    2015-05-01

    An experience sampling method (ESM) study with 40 volunteers was conducted to explore the environmental factors and psychological conditions related to involuntary musical imagery (INMI) in everyday life. Participants reported 6 times per day for one week on their INMI experiences, relevant contextual information and associated environmental conditions. The resulting data were modeled with Bayesian networks, leading to insights into the interplay of factors related to INMI experiences. The activity that a person is engaged in was found to play an important role in the experience of mind wandering, which in turn enables the experience of INMI. INMI occurrence is independent of the time of day, while the INMI trigger affects the subjective evaluation of the INMI experience. The results are compared to findings from earlier studies based on retrospective surveys and questionnaires and highlight the advantage of ESM techniques in research on spontaneous experiences like INMI. PMID:25800098

  8. [Preparation of sub-standard samples and XRF analytical method of powder non-metallic minerals].

    PubMed

    Kong, Qin; Chen, Lei; Wang, Ling

    2012-05-01

    To address the problem that standard samples of non-metallic minerals are not always satisfactory for practical X-ray fluorescence (XRF) analysis of pressed powder pellets, this study examined how to prepare sub-standard samples from standard samples of non-metallic minerals and how to adapt them to the analysis of mineral powder samples, taking the K-feldspar ore in Ebian-Wudu, Sichuan as an example. Based on characterization of the K-feldspar ore and the standard samples by X-ray diffraction (XRD) and chemical methods, and guided by the principle that sub-standard samples should be the same as or similar to the unknown samples, a method for preparing sub-standard samples was developed: the two kinds of samples should contain the same minerals and similar chemical components, undergo suitable mineral processing, and be well suited to constructing a working curve. Under the optimum experimental conditions, a method for the determination of SiO2, Al2O3, Fe2O3, TiO2, CaO, MgO, K2O and Na2O in K-feldspar ore by XRF was established. The determination results are in good agreement with those of classical chemical methods, which indicates that the method is accurate.

  9. Quantifying Responses of Dung Beetles to Fire Disturbance in Tropical Forests: The Importance of Trapping Method and Seasonality

    PubMed Central

    de Andrade, Rafael Barreto; Barlow, Jos; Louzada, Julio; Vaz-de-Mello, Fernando Zagury; Souza, Mateus; Silveira, Juliana M.; Cochrane, Mark A.

    2011-01-01

    Understanding how biodiversity responds to environmental changes is essential to provide the evidence-base that underpins conservation initiatives. The present study provides a standardized comparison between unbaited flight intercept traps (FIT) and baited pitfall traps (BPT) for sampling dung beetles. We examine the effectiveness of the two to assess fire disturbance effects and how trap performance is affected by seasonality. The study was carried out in a transitional forest between Cerrado (Brazilian Savanna) and Amazon Forest. Dung beetles were collected during one wet and one dry sampling season. The two methods sampled different portions of the local beetle assemblage. Both FIT and BPT were sensitive to fire disturbance during the wet season, but only BPT detected community differences during the dry season. Both traps showed similar correlation with environmental factors. Our results indicate that seasonality had a stronger effect than trap type, with BPT more effective and robust under low population numbers, and FIT more sensitive to fine scale heterogeneity patterns. This study shows the strengths and weaknesses of two commonly used methodologies for sampling dung beetles in tropical forests, as well as highlighting the importance of seasonality in shaping the results obtained by both sampling strategies. PMID:22028831

  10. Evaluation of a modified sampling method for molecular analysis of air microflora.

    PubMed

    Lech, T; Ziembinska-Buczynska, A

    2015-04-10

    Biodeterioration is a serious issue for the durability of materials of great cultural and economic importance. As a result of this phenomenon, priceless works of art, documents, and old prints undergo decomposition caused by microorganisms. It is therefore important to constantly monitor the presence and diversity of microorganisms in exposition rooms and storage areas for historical objects. In addition, the use of molecular biology tools in conservation studies enables detailed research and reduces the time needed to perform the analyses compared with conventional microbiological and conservation methods. The aim of this study was to adapt an indoor air sampling method for direct DNA extraction from microorganisms, including evaluating the quality and concentration of the extracted DNA. The obtained DNA was used to study the diversity of mold fungi in indoor air using polymerase chain reaction-denaturing gradient gel electrophoresis in specific archive and museum environments. The research was conducted in 2 storage rooms of the National Archives in Krakow and in 1 exposition room of the Archaeological Museum in Krakow (Poland).

  11. An improved regulatory sampling method for mapping and representing plant disease from a limited number of samples.

    PubMed

    Luo, W; Pietravalle, S; Parnell, S; van den Bosch, F; Gottwald, T R; Irey, M S; Parker, S R

    2012-06-01

    A key challenge for plant pathologists is to develop efficient methods to describe spatial patterns of disease spread accurately from a limited number of samples. Knowledge of disease spread is essential for informing and justifying plant disease management measures. A mechanistic modelling approach is adopted for disease mapping which is based on disease dispersal gradients and consideration of host pattern. The method is extended to provide measures of uncertainty for the estimates of disease at each host location. In addition, improvements have been made to increase computational efficiency by better initialising the disease status of unsampled hosts and speeding up the optimisation process of the model parameters. These improvements facilitate the practical use of the method by providing information on: (a) mechanisms of pathogen dispersal, (b) distance and pattern of disease spread, and (c) prediction of infection probabilities for unsampled hosts. Two data sets of disease observations, Huanglongbing (HLB) of citrus and strawberry powdery mildew, were used to evaluate the performance of the new method for disease mapping. The result showed that our method gave better estimates of precision for unsampled hosts, compared to both the original method and spatial interpolation. This enables decision makers to understand the spatial aspects of disease processes, and thus formulate regulatory actions accordingly to enhance disease control. PMID:22664065
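
    One ingredient of such mechanistic mapping, turning a dispersal gradient plus the locations of infected sampled hosts into infection probabilities for unsampled hosts, can be sketched as below. The exponential kernel, its parameters, the host layout, and the sampled/infected sets are all invented for illustration and are not the authors' fitted model.

      import numpy as np

      rng = np.random.default_rng(11)

      # Hypothetical host layout: 400 trees on a grid, 80 of them sampled.
      coords = np.array([(i, j) for i in range(20) for j in range(20)], dtype=float)
      sampled = rng.choice(len(coords), size=80, replace=False)
      infected = sampled[rng.random(80) < 0.2]          # sampled hosts found infected

      def infection_probability(host_xy, alpha=3.0, beta=0.15):
          """Probability an unsampled host is infected, given exponential dispersal
          from the infected sampled hosts (illustrative kernel only)."""
          d = np.linalg.norm(coords[infected] - host_xy, axis=1)
          force = beta * np.exp(-d / alpha).sum()       # force of infection
          return 1.0 - np.exp(-force)

      unsampled = np.setdiff1d(np.arange(len(coords)), sampled)
      probs = np.array([infection_probability(coords[i]) for i in unsampled])
      print(f"predicted infection probability: mean {probs.mean():.2f}, "
            f"max {probs.max():.2f} over {len(unsampled)} unsampled hosts")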

  12. Comparability among four invertebrate sampling methods, Fountain Creek Basin, Colorado, 2010-2012

    USGS Publications Warehouse

    Zuellig, Robert E.; Bruce, James F.; Stogner, Robert W.; Brown, Krystal D.

    2014-01-01

    The U.S. Geological Survey, in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, designed a study to determine if sampling method and sample timing resulted in comparable samples and assessments of biological condition. To accomplish this task, annual invertebrate samples were collected concurrently using four sampling methods at 15 U.S. Geological Survey streamflow gages in the Fountain Creek basin from 2010 to 2012. Collectively, the four methods are used by local (U.S. Geological Survey cooperative monitoring program) and State monitoring programs (Colorado Department of Public Health and Environment) in the Fountain Creek basin to produce two distinct sample types for each program that target single and multiple habitats. This study found distinguishable differences between single- and multi-habitat sample types using both community similarities and multi-metric index values, while methods from each program within a sample type were comparable. This indicates that the Colorado Department of Public Health and Environment methods were compatible with the cooperative monitoring program methods within multi- and single-habitat sample types. Comparisons between September and October samples found distinguishable differences based on community similarities for both sample types, whereas differences were found only for single-habitat samples when multi-metric index values were considered. At one site, differences between September and October index values from single-habitat samples resulted in opposing assessments of biological condition. Direct application of the results to inform the revision of the existing Fountain Creek basin U.S. Geological Survey cooperative monitoring program is discussed.

  13. Sample preparation methods for subsequent determination of metals and non-metals in crude oil--a review.

    PubMed

    Mello, Paola A; Pereira, Juliana S F; Mesko, Marcia F; Barin, Juliano S; Flores, Erico M M

    2012-10-01

    In this review, sample preparation strategies used for crude oil digestion over the last ten years are discussed, focusing on the subsequent determination of metals and non-metals. One of the main challenges of the proposed methods has been to overcome the difficulty of bringing crude oil samples into solution in a form compatible with the analytical techniques used for element determination. In this respect, this review summarizes the sample preparation methods for metals and non-metals determination in crude oil, including those based on wet digestion, combustion, emulsification, extraction, and sample dilution with organic solvents, among others. Conventional methods based on wet digestion with concentrated acids or combustion are also covered, with special emphasis on closed systems. Trends in sample digestion, such as microwave-assisted digestion using diluted acids combined with high-efficiency decomposition systems, are discussed. In addition, strategies based on sample dilution in organic solvents and procedures recommended for speciation analysis are reported, as well as the use of direct analysis, in view of its recent importance to the crude oil field. A compilation of official methods for crude oil sample preparation, as well as certified reference materials available for accuracy evaluation, is also presented and discussed. PMID:22975177

  14. Synchronization sampling method based on delta-sigma analog-digital converter for underwater towed array system

    NASA Astrophysics Data System (ADS)

    Jiang, Jia-Jia; Duan, Fa-Jie; Li, Yan-Chao; Hua, Xiang-Ning

    2014-03-01

    Synchronization sampling is very important in underwater towed array systems, where every acquisition node (AN) samples analog signals with its own analog-to-digital converter (ADC). In this paper, a simple and effective synchronization sampling method is proposed to ensure synchronized operation among the different ANs of an underwater towed array system. We first present a master-slave synchronization sampling model, and then design a high-accuracy phase-locked loop to synchronize all delta-sigma ADCs to a reference clock. However, when the master-slave synchronization sampling model is used, both the time delay (TD) of messages traveling along the wired transmission medium and the jitter of the clocks introduce synchronization sampling error (SSE). Therefore, a simple method is proposed to estimate and compensate for the TD of message transmission, and another effective method is presented to overcome the SSE caused by clock jitter. An experimental system with three ANs was set up, and the related experimental results verify the validity of the proposed synchronization sampling method.

  15. A Rapid and Specific Method for the Detection of Indole in Complex Biological Samples

    PubMed Central

    Chappell, Cynthia; Gonzales, Christopher; Okhuysen, Pablo

    2015-01-01

    Indole, a bacterial product of tryptophan degradation, has a variety of important applications in the pharmaceutical industry and is a biomarker in biological and clinical specimens. Yet, specific assays to quantitate indole are complex and require expensive equipment and a high level of training. Thus, indole in biological samples is often estimated using the simple and rapid Kovács assay, which nonspecifically detects a variety of commonly occurring indole analogs. We demonstrate here a sensitive, specific, and rapid method for measuring indole in complex biological samples using a specific reaction between unsubstituted indole and hydroxylamine. We compared the hydroxylamine-based indole assay (HIA) to the Kovács assay and confirmed that the two assays are capable of detecting microgram amounts of indole. However, the HIA is specific to indole and does not detect other naturally occurring indole analogs. We further demonstrated the utility of the HIA in measuring indole levels in clinically relevant biological materials, such as fecal samples and bacterial cultures. Mean and median fecal indole concentrations from 53 healthy adults were 2.59 mM and 2.73 mM, respectively, but varied widely (0.30 mM to 6.64 mM) among individuals. We also determined that enterotoxigenic Escherichia coli strain H10407 produces 3.3 ± 0.22 mM indole during a 24-h period in the presence of 5 mM tryptophan. The sensitive and specific HIA should be of value in a variety of settings, such as the evaluation of various clinical samples and the study of indole-producing bacterial species in the gut microbiota. PMID:26386049

  16. Molecular cancer classification using a meta-sample-based regularized robust coding method

    PubMed Central

    2014-01-01

    Motivation Previous studies have demonstrated that machine learning based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size typical of GEP data. Recently, the sparse representation (SR) method has been successfully applied to cancer classification. Nevertheless, its efficiency needs to be improved when analyzing large-scale GEP data. Results In this paper we present the meta-sample-based regularized robust coding classification (MRRCC), a novel and effective cancer classification technique that combines the meta-sample-based clustering idea with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are respectively independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples and then encodes a testing sample as a sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Conclusions Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient, while its prediction accuracy is equivalent to existing MSRC-based methods and better than other state-of-the-art dimension-reduction-based methods. PMID:25473795
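
    For orientation, the sketch below implements a simplified MSRC-style pipeline of the kind the abstract describes: meta-samples are taken as per-class k-means centroids, a test sample is sparsely coded against them with an l1 penalty, and the class with the smallest l2 coding residual wins. The regularized robust coding weights specific to MRRCC are not reproduced; the function names, the choice of k, and the Lasso penalty are assumptions for illustration.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.linear_model import Lasso

    def fit_meta_samples(X, y, k=3):
        """Extract k meta-samples (cluster centroids) per class from training GEP data.
        X: (n_samples, n_genes); y: class labels. k is an illustrative choice."""
        metas, labels = [], []
        for c in np.unique(y):
            km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X[y == c])
            metas.append(km.cluster_centers_)
            labels.extend([c] * k)
        return np.vstack(metas), np.array(labels)

    def classify(x, metas, meta_labels, alpha=0.01):
        """Encode test sample x as a sparse combination of meta-samples (l1 penalty),
        then assign the class whose meta-samples give the smallest l2 coding residual."""
        coef = Lasso(alpha=alpha, max_iter=10000).fit(metas.T, x).coef_
        residuals = {}
        for c in np.unique(meta_labels):
            mask = meta_labels == c
            residuals[c] = np.linalg.norm(x - metas[mask].T @ coef[mask])
        return min(residuals, key=residuals.get)

    # Usage sketch with random stand-in data (60 samples, 200 genes, 3 classes)
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(60, 200)), np.repeat([0, 1, 2], 20)
    metas, meta_labels = fit_meta_samples(X, y)
    print(classify(X[0], metas, meta_labels))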

  17. A remark on the theory of measuring thermal diffusivity by the modified Angstrom's method. [in lunar samples

    NASA Technical Reports Server (NTRS)

    Horai, K.-I.

    1981-01-01

    A theory of the measurement of the thermal diffusivity of a sample by the modified Angstrom method is developed for the case in which radiative heat loss from the end surface of the sample is not negligible, and applied to measurements performed on lunar samples. Formulas allowing sample thermal diffusivity to be determined from the amplitude decay and phase lag of a temperature wave traveling through the sample are derived for a flat disk sample for which only heat loss from the end surface is important, and a sample of finite diameter and length for which heat loss through the end and side surfaces must be considered. It is noted that in the case of a flat disk, measurements at a single angular frequency of the temperature wave are sufficient, while the sample of finite diameter and length requires measurements at two discrete angular frequencies. Comparison of the values of the thermal diffusivities of two lunar samples of dimensions approximately 1 x 1 x 2 cm derived by the present methods and by the Angstrom theory for a finite bar reveals them to differ by not more than 5%, and indicates that more refined data are required as the measurement theory becomes more complicated.
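
    For context, the classical Ångström relation for a semi-infinite rod (lateral heat loss present, end-surface radiative loss ignored) combines the amplitude decay and phase lag of the temperature wave as shown below; the paper's modified formulas for the flat-disk and finite-cylinder cases generalize this expression rather than reproduce it.

    \[
      \alpha \;=\; \frac{\omega\, l^{2}}{2\,\Delta\varphi\,\ln\!\left(A_{1}/A_{2}\right)},
    \]
    where $\omega$ is the angular frequency of the temperature wave, $l$ the distance between the two measurement points, $A_{1}/A_{2}$ their amplitude ratio, and $\Delta\varphi$ the phase lag between them.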

  18. Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2011-01-01

    Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to appropriately utilize these resources. Indeed, users of one popular dataset were generally found "not" to have modeled the analyses…
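
    As a minimal numeric illustration of why weights and design effects matter, the sketch below computes a weighted mean, the Kish effective sample size, and the variance inflation due to unequal weighting alone. These are standard textbook quantities used here for illustration and are not drawn from the article; clustering and stratification effects are not shown.

    import numpy as np

    rng = np.random.default_rng(0)

    def weighted_estimates(y, w):
        """Weighted mean, Kish effective sample size, and the approximate design
        effect from unequal weighting; the adjusted standard error inflates the
        naive one by sqrt(DEFF)."""
        y, w = np.asarray(y, float), np.asarray(w, float)
        mean_w = np.sum(w * y) / np.sum(w)
        n_eff = np.sum(w) ** 2 / np.sum(w ** 2)          # Kish effective sample size
        deff_w = y.size / n_eff                          # design effect from weighting alone
        se_adj = y.std(ddof=1) / np.sqrt(y.size) * np.sqrt(deff_w)
        return mean_w, n_eff, deff_w, se_adj

    # Example: a heavily up-weighted subgroup shrinks the effective sample size
    y = np.r_[rng.normal(50, 10, 900), rng.normal(60, 10, 100)]
    w = np.r_[np.full(900, 1.0), np.full(100, 9.0)]
    print(weighted_estimates(y, w))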

  19. Development of a sensitive method to extract and detect low numbers of Cryptosporidium oocysts from adult cattle faecal samples.

    PubMed

    Wells, B; Thomson, S; Ensor, H; Innes, E A; Katzer, F

    2016-08-30

    Cryptosporidium transmission studies to date have concluded that adult cattle are not a significant source of oocysts contributing to clinical cryptosporidiosis in calves on farm. However, current methods of sample processing have been optimised for calf faecal samples and may be less sensitive when used on adult samples due to lower numbers of oocysts and the larger size of samples. A modified and novel method of oocyst extraction and concentration was developed and applied in an experiment involving spiking adult cattle faecal samples with known concentrations of Cryptosporidium oocysts. The results showed an increased sensitivity of detection from 100 oocysts/g of faecal sample using conventional protocols to 5 oocysts/g using the newly developed method. As it is important to be able to accurately assess the contribution of adult ruminants to the transmission of Cryptosporidium, both on farm and in the environment, the development of the techniques described here is likely to make an important contribution to Cryptosporidium transmission studies in future and to subsequent control strategies aimed at the reduction of Cryptosporidium infection in calves on farm. PMID:27523933

  1. 40 CFR 80.1642 - Sampling and testing requirements for producers and importers of denatured fuel ethanol and other...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    Sampling and testing requirements for producers and importers of denatured fuel ethanol and other oxygenates for use by oxygenate blenders (40 CFR 80.1642). Beginning January 1, 2017, producers and importers of denatured fuel ethanol (DFE) and...

  2. Melting Temperature Mapping Method: A Novel Method for Rapid Identification of Unknown Pathogenic Microorganisms within Three Hours of Sample Collection.

    PubMed

    Niimi, Hideki; Ueno, Tomohiro; Hayashi, Shirou; Abe, Akihito; Tsurue, Takahiro; Mori, Masashi; Tabata, Homare; Minami, Hiroshi; Goto, Michihiko; Akiyama, Makoto; Yamamoto, Yoshihiro; Saito, Shigeru; Kitajima, Isao

    2015-07-28

    Acquiring the earliest possible identification of pathogenic microorganisms is critical for selecting the appropriate antimicrobial therapy in infected patients. We herein report the novel "melting temperature (Tm) mapping method" for rapidly identifying the dominant bacteria in a clinical sample from sterile sites. Employing only seven primer sets, the method can identify more than 100 bacterial species. In particular, using the Difference Value, it is possible to identify samples suitable for Tm mapping identification. Moreover, this method can be used to rapidly diagnose the absence of bacteria in clinical samples. We tested the Tm mapping method using 200 whole blood samples obtained from patients with suspected sepsis, 85% (171/200) of which matched the culture results based on the detection level. A total of 130 samples were negative according to the Tm mapping method, 98% (128/130) of which were also negative based on the culture method. Meanwhile, 70 samples were positive according to the Tm mapping method, and of the 59 suitable for identification, 100% (59/59) exhibited a "match" or "broad match" with the culture or sequencing results. These findings were obtained within three hours of whole blood collection. The Tm mapping method is therefore useful for identifying infectious diseases requiring prompt treatment.

  3. A structured sparse regression method for estimating isoform expression level from multi-sample RNA-seq data.

    PubMed

    Zhang, L; Liu, X J

    2016-01-01

    With the rapid development of next-generation high-throughput sequencing technology, RNA-seq has become a standard and important technique for transcriptome analysis. For multi-sample RNA-seq data, existing expression estimation methods usually deal with each RNA-seq sample separately and ignore the fact that read distributions are consistent across multiple samples. In the current study, we propose a structured sparse regression method, SSRSeq, to estimate isoform expression using multi-sample RNA-seq data. SSRSeq uses a nonparametric model to capture the general tendency of non-uniform read distribution for all genes across multiple samples. Additionally, our method adds a structured sparse regularization, which not only incorporates the sparse specificity between a gene and its corresponding isoform expression levels, but also reduces the effects of noisy reads, especially for lowly expressed genes and isoforms. Four real datasets were used to evaluate our method on isoform expression estimation. Compared with other popular methods, SSRSeq reduced the variance between multiple samples and produced more accurate isoform expression estimations, and thus more meaningful biological interpretations. PMID:27323111
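
    To make the underlying regression concrete, the sketch below fits a plain sparse non-negative regression of binned read counts on an isoform-membership design matrix, one sample at a time; it omits the structured regularization and the nonparametric read-distribution model that SSRSeq adds, and the toy bin/design construction is an assumption for illustration only.

    import numpy as np
    from sklearn.linear_model import Lasso

    def estimate_isoform_expression(counts, design, alpha=0.1):
        """Sparse non-negative regression of exon/bin read counts on an
        isoform-membership design matrix, fitted per sample.
        counts: (n_samples, n_bins); design: (n_bins, n_isoforms) of 0/1 indicators
        (effective lengths and read-distribution modelling are omitted)."""
        estimates = []
        for c in counts:
            m = Lasso(alpha=alpha, positive=True, max_iter=10000).fit(design, c)
            estimates.append(m.coef_)                # one abundance vector per sample
        return np.vstack(estimates)

    # Toy gene with 4 bins and 2 isoforms that share the first two bins
    design = np.array([[1, 1], [1, 1], [1, 0], [0, 1]], float)
    counts = np.array([[90, 95, 40, 55], [30, 28, 25, 5]], float)
    print(estimate_isoform_expression(counts, design))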

  4. A case-base sampling method for estimating recurrent event intensities.

    PubMed

    Saarela, Olli

    2016-10-01

    Case-base sampling provides an alternative to risk set sampling based methods to estimate hazard regression models, in particular when absolute hazards are also of interest in addition to hazard ratios. The case-base sampling approach results in a likelihood expression of the logistic regression form, but instead of categorized time, such an expression is obtained through sampling of a discrete set of person-time coordinates from all follow-up data. In this paper, in the context of a time-dependent exposure such as vaccination, and a potentially recurrent adverse event outcome, we show that the resulting partial likelihood for the outcome event intensity has the asymptotic properties of a likelihood. We contrast this approach to self-matched case-base sampling, which involves only within-individual comparisons. The efficiency of the case-base methods is compared to that of standard methods through simulations, suggesting that the information loss due to sampling is minimal.
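
    A minimal simulation sketch of the general case-base idea (in the Hanley-Miettinen style) is shown below: event records form the case series, a base series of person-moments is sampled proportional to person-time, and a logistic regression with offset log(T/b) recovers the log intensity and the rate ratio. It is an illustration under a constant-intensity assumption, not the recurrent-event or self-matched estimators developed in the paper.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Simulated cohort: binary exposure x, person-time t, Poisson event counts
    n = 2000
    x = rng.binomial(1, 0.4, n)
    t = rng.uniform(1.0, 5.0, n)
    events = rng.poisson(0.05 * np.exp(0.7 * x) * t)     # true log rate ratio = 0.7

    # Case series: one record per event; base series: b person-moments sampled
    # with probability proportional to each subject's person-time.
    b = 4000
    case_x = np.repeat(x, events)
    base_x = x[rng.choice(n, size=b, p=t / t.sum())]

    y = np.r_[np.ones(case_x.size), np.zeros(b)]
    X = sm.add_constant(np.r_[case_x, base_x])
    offset = np.full(y.size, np.log(t.sum() / b))        # log(total person-time / base size)

    fit = sm.GLM(y, X, family=sm.families.Binomial(), offset=offset).fit()
    print(fit.params)   # approx. [log baseline intensity, log rate ratio]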

  5. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    NASA Astrophysics Data System (ADS)

    Adamic, M. L.; Lister, T. E.; Dufek, E. J.; Jenson, D. D.; Olson, J. E.; Vockenhuber, C.; Watrous, M. G.

    2015-10-01

    This paper presents an evaluation of an alternate method for preparing environmental samples for 129I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  6. Sampling/analytical method evaluation for ethylene oxide emission and control-unit efficiency determinations

    SciTech Connect

    Steger, J.; Gergen, W.; Margeson, J.H.

    1988-05-01

    Radian Corporation, assisting the Environmental Monitoring Systems Laboratory, Environmental Protection Agency, Research Triangle Park, North Carolina, performed a field evaluation of a method for sampling and analyzing ethylene oxide (EO) in the vent stream from a sterilization chamber and a dilute-acid scrubber. The utility of the sampling method for measuring the efficiency of the control unit was also evaluated. The evaluated sampling and analysis procedure used semi-continuous direct sampling with on-line gas chromatographic analysis. Laboratory studies of the sampling method previous to the field test showed that semi-continuous direct sampling was capable of measuring EO emissions to within 11% of the expected value with a between-trial precision of 5%.

  7. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    SciTech Connect

    Adamic, M. L.; Lister, T. E.; Dufek, E. J.; Jenson, D. D.; Olson, J. E.; Vockenhuber, C.; Watrous, M. G.

    2015-03-25

    This paper presents an evaluation of an alternate method for preparing environmental samples for 129I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Furthermore, precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  8. Assessment of dust sampling methods for the study of cultivable-microorganism exposure in stables.

    PubMed

    Normand, Anne-Cécile; Vacheyrou, Mallory; Sudre, Bertrand; Heederik, Dick J J; Piarroux, Renaud

    2009-12-01

    Studies have shown a link between living on a farm, exposure to microbial components (e.g., endotoxins or beta-d-glucans), and a lower risk for allergic diseases and asthma. Due to the lack of validated sampling methods, studies of asthma and atopy have not relied on exposure assessment based on culture techniques. Our objective was therefore to compare several dust sampling methods for the detection of cultivable-microorganism exposure in stables. Sixteen French farms were sampled using four different methods: (i) active air sampling using a pump, (ii) passive dust sampling with a plastic box, (iii) dust sampling with an electrostatic dust fall collector (wipe), and (iv) dust sampling using a spatula to collect dust already settled on a windowsill. The results showed that collection of settled dust samples with either plastic boxes or wipes was reproducible (pairwise correlations, 0.72 and 0.73, respectively) and resulted in highly correlated results (pairwise correlation between the two methods, 0.82). We also found that settled dust samples collected with a plastic box correctly reflected the composition of the samples collected in the air of the stable when there was no farmer activity. A loss of microbial diversity was observed when dust was kept for 3 months at room temperature. We therefore conclude that measurement of viable microorganisms within a reasonable time frame gives an accurate representation of the microbial composition of stable air.

  9. Kinetics of iron import into developing mouse organs determined by a pup-swapping method.

    PubMed

    Chakrabarti, Mrinmoy; Barlas, Mirza Nofil; McCormick, Sean P; Lindahl, Lora S; Lindahl, Paul A

    2015-01-01

    The kinetics of dietary iron import into various organs of mice were evaluated using a novel pup-swapping approach. Newborn pups whose bodies primarily contained (56)Fe or (57)Fe were swapped at birth such that each nursed on milk containing the opposite isotope. A pup from each litter was euthanized weekly over a 7-week period. Blood plasma was obtained, and organs were isolated typically after flushing with Ringer's buffer. (56)Fe and (57)Fe concentrations were determined for organs and plasma; organ volumes were also determined. Mössbauer spectra of equivalent (57)Fe-enriched samples were used to quantify residual blood in organs; this fraction was excluded from later analysis. Rates of import into brain, spleen, heart, and kidneys were highest during the first 2 weeks of life. In contrast, half of iron in the newborn liver exited during that time, and influx peaked later. Two mathematical models were developed to analyze the import kinetics. The only model that simulated the data adequately assumed that an iron-containing species enters the plasma and converts into a second species and that both are independently imported into organs. Consistent with this, liquid chromatography with an on-line ICP-MS detector revealed numerous iron species in plasma besides transferrin. Model fitting required that the first species, assigned to non-transferrin-bound iron, imports faster into organs than the second, assigned to transferrin-bound-iron. Non-transferrin-bound iron rather than transferrin-bound-iron appears to play the dominant role in importing iron into organs during early development of healthy mice.
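
    The description of two independently imported plasma species suggests a simple compartment structure; the toy ODE sketch below encodes one such structure with arbitrary rate constants, purely to illustrate the kind of model being fitted. It is a hypothetical sketch, not the authors' model or their fitted parameters.

    import numpy as np
    from scipy.integrate import odeint

    def two_species_model(y, t, k_in, k_conv, k1, k2):
        """Hypothetical two-pool model: species A (non-transferrin-bound iron) enters
        plasma, converts to species B (transferrin-bound iron), and each pool is
        imported into the organ at its own rate (k1 > k2 mimics faster import of A)."""
        A, B, organ = y
        dA = k_in - k_conv * A - k1 * A
        dB = k_conv * A - k2 * B
        dorgan = k1 * A + k2 * B
        return [dA, dB, dorgan]

    t = np.linspace(0.0, 49.0, 200)                      # days, roughly the 7-week window
    sol = odeint(two_species_model, [0.0, 0.0, 0.0], t, args=(1.0, 0.5, 0.3, 0.05))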

  10. Modified shifted angular spectrum method for numerical propagation at reduced spatial sampling rates.

    PubMed

    Ritter, André

    2014-10-20

    The shifted angular spectrum method allows a reduction of the number of samples required for numerical off-axis propagation of scalar wave fields. In this work, a modification of the shifted angular spectrum method is presented. It allows a further reduction of the spatial sampling rate for certain wave fields. We calculate the benefit of this method for spherical waves. Additionally, a working implementation is presented showing the example of a spherical wave propagating through a circular aperture. PMID:25401659
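
    For reference, the basic (unshifted) angular spectrum method that the paper modifies can be written in a few lines; the sketch below shows that baseline propagation only, with the off-axis shift and the sampling-rate reduction left out, and the grid and wavelength values chosen purely for illustration.

    import numpy as np

    def angular_spectrum_propagate(u0, wavelength, dx, z):
        """Standard on-axis angular spectrum propagation of a sampled scalar field.
        u0: complex field on an N x N grid with pixel pitch dx; z: propagation distance.
        Evanescent components are suppressed."""
        n = u0.shape[0]
        fx = np.fft.fftfreq(n, d=dx)
        FX, FY = np.meshgrid(fx, fx)
        arg = 1.0 / wavelength**2 - FX**2 - FY**2
        H = np.zeros_like(u0, dtype=complex)
        prop = arg > 0
        H[prop] = np.exp(2j * np.pi * z * np.sqrt(arg[prop]))
        return np.fft.ifft2(np.fft.fft2(u0) * H)

    # Toy example loosely echoing the paper's test case: a circular aperture
    n, dx, wl = 512, 5e-6, 633e-9
    x = (np.arange(n) - n / 2) * dx
    X, Y = np.meshgrid(x, x)
    u0 = (X**2 + Y**2 < (0.4e-3)**2).astype(complex)
    u1 = angular_spectrum_propagate(u0, wl, dx, z=0.05)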

  11. Guidance for characterizing explosives contaminated soils: Sampling and selecting on-site analytical methods

    SciTech Connect

    Crockett, A.B.; Craig, H.D.; Jenkins, T.F.; Sisk, W.E.

    1996-09-01

    A large number of defense-related sites are contaminated with elevated levels of secondary explosives. Levels of contamination range from barely detectable to levels above 10% that need special handling due to the detonation potential. Characterization of explosives-contaminated sites is particularly difficult due to the very heterogeneous distribution of contamination in the environment and within samples. To improve site characterization, several options exist, including collecting more samples, providing on-site analytical data to help direct the investigation, compositing samples, improving homogenization of samples, and extracting larger samples. On-site analytical methods are essential to more economical and improved characterization. On-site methods might suffer in terms of precision and accuracy, but this is more than offset by the increased number of samples that can be run. While verification using a standard analytical procedure should be part of any quality assurance program, reducing the number of samples analyzed by the more expensive methods can result in significantly reduced costs. Often 70 to 90% of the soil samples analyzed during an explosives site investigation do not contain detectable levels of contamination. Two basic types of on-site analytical methods are in wide use for explosives in soil: colorimetric and immunoassay. Colorimetric methods generally detect broad classes of compounds such as nitroaromatics or nitramines, while immunoassay methods are more compound specific. Since TNT or RDX is usually present in explosives-contaminated soils, the use of procedures designed to detect only these or similar compounds can be very effective.

  12. Simple method to measure power density entering a plane biological sample at millimeter wavelengths.

    PubMed

    Shen, Z Y; Birenbaum, L; Chu, A; Motzkin, S; Rosenthal, S; Sheng, K M

    1987-01-01

    A simple method for measuring microwave power density is described. It is applicable to situations where exposure of samples in the near field of a horn is necessary. A transmitted power method is used to calibrate the power density entering the surface of the sample. Once the calibration is available, the power density is known in terms of the incident and reflected powers within the waveguide. The calibration has been carried out for liquid samples in a quartz cell. Formulas for calculating specific absorption rate (SAR) are derived in terms of the power density and the complex dielectric constant of the sample. An error analysis is also given.
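
    For orientation, the textbook link between the internal field, the complex dielectric constant, and SAR is given below; the paper derives expressions specific to its waveguide and quartz-cell geometry, which are not reproduced here.

    \[
      \mathrm{SAR} \;=\; \frac{\sigma\,\lvert E_{\mathrm{rms}}\rvert^{2}}{\rho},
      \qquad
      \sigma \;=\; \omega\,\varepsilon_{0}\,\varepsilon'' ,
    \]
    where $\rho$ is the sample density and $\varepsilon = \varepsilon_{0}(\varepsilon' - j\varepsilon'')$ its complex permittivity, so SAR follows once the power density entering the sample surface (and hence the internal field) is known.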

  13. A novel nonstationary deconvolution method based on spectral modeling and variable-step sampling hyperbolic smoothing

    NASA Astrophysics Data System (ADS)

    Li, Fang; Wang, Shoudong; Chen, Xiaohong; Liu, Guochang; Zheng, Qiang

    2014-04-01

    Deconvolution is an important seismic processing tool for improving resolution. One of the key assumptions made in most deconvolution methods is that the seismic data are stationary. However, due to anelastic absorption, seismic data are usually nonstationary. In this paper, a novel nonstationary deconvolution approach is proposed based on spectral modeling and variable-step sampling (VSS) hyperbolic smoothing. To facilitate our method, we first apply the Gabor transform to perform a time-frequency decomposition of the nonstationary seismic trace. Second, we estimate the source wavelet amplitude spectrum by spectral modeling. Third, smoothing the Gabor magnitude spectrum of the seismic data along hyperbolic paths with VSS yields the magnitude of the attenuation function and also eliminates the effect of the source wavelet. Fourth, by assuming that the source wavelet and attenuation function are minimum phase, their phases can be determined by the Hilbert transform. Finally, the two estimated factors are removed by dividing the Gabor spectrum of the trace by them, which yields an estimate of the Gabor spectrum of the reflectivity. An inverse Gabor transform gives the time-domain reflectivity estimate. Tests on synthetic and field data show that the presented method is an effective tool that not only has the advantages of stationary deconvolution, but also compensates for energy absorption without knowing or estimating the quality factor Q.
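
    The sketch below shows a heavily simplified time-frequency deconvolution of this general flavour: the STFT (Gabor) spectrum of the trace is divided by a smoothed copy of its own magnitude, which stands in for the wavelet-times-attenuation estimate. The paper's spectral modeling, variable-step hyperbolic smoothing, and minimum-phase correction are not reproduced, and the window length, smoothing widths, and stabilization factor are illustrative assumptions.

    import numpy as np
    from scipy.signal import stft, istft
    from scipy.ndimage import gaussian_filter

    def gabor_domain_deconvolution(trace, fs, stab=0.05, smooth=(3, 8)):
        """Crude nonstationary deconvolution: stabilized division of the STFT
        spectrum by a smoothed magnitude estimate of the wavelet/attenuation."""
        f, t, Z = stft(trace, fs=fs, nperseg=64)
        mag = np.abs(Z)
        smooth_mag = gaussian_filter(mag, sigma=smooth)        # smooth over (freq, time)
        Z_refl = Z / (smooth_mag + stab * smooth_mag.max())    # stabilized spectral division
        _, refl = istft(Z_refl, fs=fs, nperseg=64)
        return refl

    # Toy usage: a noisy synthetic trace sampled at 500 Hz
    rng = np.random.default_rng(0)
    refl_est = gabor_domain_deconvolution(rng.normal(size=2000), fs=500.0)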

  14. Utility of the microculture method for Leishmania detection in non-invasive samples obtained from a blood bank.

    PubMed

    Ates, Sezen Canim; Bagirova, Malahat; Allahverdiyev, Adil M; Kocazeybek, Bekir; Kosan, Erdogan

    2013-10-01

    In recent years, donor blood has taken on an important role in the epidemiology of leishmaniasis. According to the WHO, only 5-20% of individuals with leishmaniasis are symptomatic. In this study, for the detection of Leishmania infection in donor blood samples, 343 samples from the Capa Red Crescent Blood Center were obtained and primarily analyzed by microscopic and serological methods. Subsequently, the traditional culture (NNN), immunochromatographic test (ICT) and polymerase chain reaction (PCR) methods were applied to the 21 samples that were found positive by at least one method. Buffy coat (BC) samples from 343 blood donors were analyzed: 15 (4.3%) were positive by a microculture method (MCM) and 4 (1.1%) by smear. Of the sera from these 343 samples, 9 (2.6%) were determined positive by ELISA and 7 (2%) by IFAT. Thus, 21 (6.1%) of the 343 subjects studied by smear, MCM, IFAT and ELISA techniques were identified as positive for leishmaniasis by at least one of the techniques, and the sensitivity of each method was assessed. According to our data, the sensitivities of the methods were MCM (71%), smear (19%), IFAT (33%), ELISA (42%), NNN (4%), PCR (14%) and ICT (4%). Thus, with this study, the sensitivity of the MCM was examined in blood donors for the first time by comparing MCM with the methods used in the diagnosis of leishmaniasis. As a result, MCM was found to be the most sensitive method for the detection of Leishmania parasites in samples obtained from a blood bank. In addition, the presence of Leishmania parasites was detected in donor blood in Istanbul, a non-endemic region of Turkey, and these results are of vital importance for the health of blood recipients. PMID:23806567

  16. RAPID FUSION METHOD FOR DETERMINATION OF PLUTONIUM ISOTOPES IN LARGE RICE SAMPLES

    SciTech Connect

    Maxwell, S.

    2013-03-01

    A new rapid fusion method for the determination of plutonium in large rice samples has been developed at the Savannah River National Laboratory (Aiken, SC, USA) that can be used to determine very low levels of plutonium isotopes in rice. The recent accident at Fukushima Nuclear Power Plant in March, 2011 reinforces the need to have rapid, reliable radiochemical analyses for radionuclides in environmental and food samples. Public concern regarding foods, particularly foods such as rice in Japan, highlights the need for analytical techniques that will allow very large sample aliquots of rice to be used for analysis so that very low levels of plutonium isotopes may be detected. The new method to determine plutonium isotopes in large rice samples utilizes a furnace ashing step, a rapid sodium hydroxide fusion method, a lanthanum fluoride matrix removal step, and a column separation process with TEVA Resin cartridges. The method can be applied to rice sample aliquots as large as 5 kg. Plutonium isotopes can be determined using alpha spectrometry or inductively-coupled plasma mass spectrometry (ICP-MS). The method showed high chemical recoveries and effective removal of interferences. The rapid fusion technique is a rugged sample digestion method that ensures that any refractory plutonium particles are effectively digested. The MDA for a 5 kg rice sample using alpha spectrometry is 7E-5 mBq g⁻¹. The method can easily be adapted for use by ICP-MS to allow detection of plutonium isotopic ratios.

  17. Rapid method to determine actinides and 89/90Sr in limestone and marble samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; Utsey, Robin C.; Sudowe, Ralf; McAlister, Daniel R.

    2016-04-12

    A new method for the determination of actinides and radiostrontium in limestone and marble samples has been developed that utilizes a rapid sodium hydroxide fusion to digest the sample. Following rapid pre-concentration steps to remove sample matrix interferences, the actinides and 89/90Sr are separated using extraction chromatographic resins and measured radiometrically. The advantages of sodium hydroxide fusion versus other fusion techniques will be discussed. Lastly, this approach has a sample preparation time for limestone and marble samples of <4 hours.

  18. DEVELOPMENT AND FIELD IMPLEMENTATION OF AN IMPROVED METHOD FOR HEADSPACE GAS SAMPLING OF TRANSURANIC WASTE DRUMS

    SciTech Connect

    Polley, M.; Ankrom, J.; Wickland, T.; Warren, J.

    2003-02-27

    A fast, safe, and cost-effective method for obtaining headspace gas samples has been developed and implemented at Los Alamos National Laboratory (LANL). A sample port is installed directly into a drum lid using a pneumatic driver, allowing sampling with a side-port needle. Testing has shown that the sample port can be installed with no release of radioactive material. Use of this system at LANL has significantly reduced the time required for sampling, and eliminates the need for many safety precautions previously used. The system has significantly improved productivity and lowered radiation exposure and cost.

  19. Probe Heating Method for the Analysis of Solid Samples Using a Portable Mass Spectrometer.

    PubMed

    Kumano, Shun; Sugiyama, Masuyuki; Yamada, Masuyoshi; Nishimura, Kazushige; Hasegawa, Hideki; Morokuma, Hidetoshi; Inoue, Hiroyuki; Hashimoto, Yuichiro

    2015-01-01

    We previously reported on the development of a portable mass spectrometer for the onsite screening of illicit drugs, but our previous sampling system could only be used for liquid samples. In this study, we report on an attempt to develop a probe heating method that also permits solid samples to be analyzed using a portable mass spectrometer. An aluminum rod is used as the sampling probe. The powdered sample is affixed to the sampling probe or a droplet of sample solution is placed on the tip of the probe and dried. The probe is then placed on a heater to vaporize the sample. The vapor is then introduced into the portable mass spectrometer and analyzed. With the heater temperature set to 130°C, the developed system detected 1 ng of methamphetamine, 1 ng of amphetamine, 3 ng of 3,4-methylenedioxymethamphetamine, 1 ng of 3,4-methylenedioxyamphetamine, and 0.3 ng of cocaine. Even from mixtures consisting of clove powder and methamphetamine powder, methamphetamine ions were detected by tandem mass spectrometry. The developed probe heating method provides a simple method for the analysis of solid samples. A portable mass spectrometer incorporating this method would thus be useful for the onsite screening of illicit drugs. PMID:26819909
