Science.gov

Sample records for importance sampling method

  1. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  2. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
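
    To make the category-based weighting concrete, here is a minimal Python sketch (not the SILHS code): two invented categories stand in for the paper's eight, the modeler prescribes a sampling fraction q_k for each, and the weight p_k/q_k keeps the grid-box average unbiased. The category probabilities, within-category distributions, and process rate are all assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Two hypothetical categories standing in for the paper's eight
    # (say, "no rain" vs. "rain"): known probability mass p of each
    # category, and a modeler-prescribed sampling fraction q.
    p = np.array([0.9, 0.1])
    q = np.array([0.5, 0.5])
    draw = [lambda n: rng.normal(0.0, 1.0, n),   # subgrid variate, category 0
            lambda n: rng.normal(3.0, 1.0, n)]   # subgrid variate, category 1
    rate = lambda x: np.maximum(x, 0.0) ** 2     # toy nonlinear process rate

    N, est = 100_000, 0.0
    for k in range(2):
        x = draw[k](int(q[k] * N))
        # the weight p_k/q_k undoes the prescribed oversampling, so the
        # grid-box-average estimate stays unbiased
        est += (p[k] / q[k]) * rate(x).sum() / N
    print("grid-box-averaged rate:", est)
    ```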

  3. Importance sampling variance reduction for the Fokker-Planck rarefied gas particle method

    NASA Astrophysics Data System (ADS)

    Collyer, B. S.; Connaughton, C.; Lockerby, D. A.

    2016-11-01

    The Fokker-Planck approximation to the Boltzmann equation, solved numerically by stochastic particle schemes, is used to provide estimates for rarefied gas flows. This paper presents a variance reduction technique for a stochastic particle method that is able to greatly reduce the uncertainty of the estimated flow fields when the characteristic speed of the flow is small in comparison to the thermal velocity of the gas. The method relies on importance sampling, requiring minimal changes to the basic stochastic particle scheme. We test the importance sampling scheme on a homogeneous relaxation, planar Couette flow and a lid-driven-cavity flow, and find that our method is able to greatly reduce the noise of estimated quantities. Significantly, we find that as the characteristic speed of the flow decreases, the variance of the noisy estimators becomes independent of the characteristic speed.

  4. Importance sampling variance reduction for the Fokker–Planck rarefied gas particle method

    SciTech Connect

    Collyer, B.S.; Connaughton, C.; Lockerby, D.A.

    2016-11-15

    The Fokker–Planck approximation to the Boltzmann equation, solved numerically by stochastic particle schemes, is used to provide estimates for rarefied gas flows. This paper presents a variance reduction technique for a stochastic particle method that is able to greatly reduce the uncertainty of the estimated flow fields when the characteristic speed of the flow is small in comparison to the thermal velocity of the gas. The method relies on importance sampling, requiring minimal changes to the basic stochastic particle scheme. We test the importance sampling scheme on a homogeneous relaxation, planar Couette flow and a lid-driven-cavity flow, and find that our method is able to greatly reduce the noise of estimated quantities. Significantly, we find that as the characteristic speed of the flow decreases, the variance of the noisy estimators becomes independent of the characteristic speed.
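
    A minimal sketch of the low-speed variance-reduction idea, not the authors' scheme: velocities are drawn from the zero-drift equilibrium and weighted by the likelihood ratio w = f_u0/f0; since the equilibrium mean vanishes exactly, only (w - 1) carries signal, and the statistical noise then scales with the small flow speed. The Gaussian velocity model and all numbers are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    sigma, u0, N = 1.0, 1e-3, 100_000   # thermal speed, small flow speed

    # Analog estimator: sample the drifted Maxwellian directly; the noise
    # is ~ sigma/sqrt(N), enormous relative to u0.
    u_analog = rng.normal(u0, sigma, N).mean()

    # Importance-sampled estimator: draw from the zero-drift equilibrium f0
    # and weight by w = f_u0/f0.  Because the equilibrium mean is exactly 0,
    # only (w - 1) carries signal, and the noise now scales with u0.
    v = rng.normal(0.0, sigma, N)
    w = np.exp((2 * u0 * v - u0**2) / (2 * sigma**2))
    u_is = ((w - 1.0) * v).mean()

    print(f"analog {u_analog:+.2e}  importance {u_is:+.2e}  true {u0:+.2e}")
    ```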

  5. Importance sampling: promises and limitations.

    SciTech Connect

    West, Nicholas J.; Swiler, Laura Painton

    2010-04-01

    Importance sampling is an unbiased sampling method used to sample random variables from different densities than originally defined. These importance sampling densities are constructed to pick 'important' values of input random variables to improve the estimation of a statistical response of interest, such as a mean or probability of failure. Conceptually, importance sampling is very attractive: for example, one wants to generate more samples in a failure region when estimating failure probabilities. In practice, however, importance sampling can be challenging to implement efficiently, especially in a general framework that will allow solutions for many classes of problems. We are interested in the promises and limitations of importance sampling as applied to computationally expensive finite element simulations which are treated as 'black-box' codes. In this paper, we present a customized importance sampler that is meant to be used after an initial set of Latin Hypercube samples has been taken, to help refine a failure probability estimate. The importance sampling densities are constructed based on kernel density estimators. We examine importance sampling with respect to two main questions: is importance sampling efficient and accurate for situations where we can only afford small numbers of samples? And does importance sampling require the use of surrogate methods to generate a sufficient number of samples so that the importance sampling process actually increases the accuracy of the failure probability estimate? We present various case studies to address these questions.
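
    A toy version of the failure-probability setting, assuming a one-dimensional limit state and a hand-placed normal importance density (the paper instead builds its densities from kernel density estimates over prior Latin hypercube samples):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    g = lambda x: 4.0 - x        # toy limit state: failure when g(x) < 0
    def pdf(x, mu=0.0):          # standard/shifted normal density
        return np.exp(-0.5 * (x - mu)**2) / np.sqrt(2.0 * np.pi)

    N = 10_000
    p_mc = np.mean(g(rng.normal(0.0, 1.0, N)) < 0)   # plain MC: ~0 hits

    x = rng.normal(4.0, 1.0, N)           # density centered on the failure region
    p_is = np.mean(pdf(x) / pdf(x, 4.0) * (g(x) < 0))

    print(f"plain MC {p_mc:.1e}  importance sampling {p_is:.2e}")
    # exact: 1 - Phi(4) ~ 3.17e-5
    ```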

  6. Improved algorithms and coupled neutron-photon transport for auto-importance sampling method

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Li, Jun-Li; Wu, Zhen; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Gang, Zhi; Xu, Hong

    2017-01-01

    The Auto-Importance Sampling (AIS) method is a Monte Carlo variance reduction technique proposed for deep penetration problems, which can significantly improve computational efficiency without pre-calculations for the importance distribution. However, the AIS method has only been validated on several simple examples, and cannot be used for coupled neutron-photon transport. This paper presents improved algorithms for the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation and calculation of the estimated relative error. These improvements allow the AIS method to be applied to complicated deep penetration problems with complex geometry and multiple materials. A Completely coupled Neutron-Photon Auto-Importance Sampling (CNP-AIS) method is proposed to solve deep penetration problems of coupled neutron-photon transport using the improved algorithms. The NUREG/CR-6115 PWR benchmark was calculated using CNP-AIS, geometry splitting with Russian roulette, and analog Monte Carlo, respectively. The results of CNP-AIS are in good agreement with those of geometry splitting with Russian roulette and with the benchmark solutions. The computational efficiency of CNP-AIS for both neutrons and photons is much better than that of geometry splitting with Russian roulette in most cases, and is several orders of magnitude higher than that of analog Monte Carlo. Supported by the National Science and Technology Major Project of China (2013ZX06002001-007, 2011ZX06004-007) and the National Natural Science Foundation of China (11275110, 11375103)
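
    For context, a self-contained toy of the geometry splitting with Russian roulette baseline that CNP-AIS is compared against: a random walk over invented shield layers with importance 2^layer, split 2-for-1 when moving deeper and rouletted when moving shallower. Nothing here reproduces the paper's setup.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    P_FWD, L = 0.4, 10    # per-step forward probability; 10 shield layers

    def analog(n):
        score = 0
        for _ in range(n):
            pos = 1
            while 0 < pos < L:
                pos += 1 if rng.uniform() < P_FWD else -1
            score += pos == L
        return score / n

    def split_roulette(n):
        total = 0.0
        for _ in range(n):
            bank = [(1, 1.0)]                 # (layer, weight) of live particles
            while bank:
                pos, w = bank.pop()
                while 0 < pos < L:
                    if rng.uniform() < P_FWD:
                        pos += 1              # deeper: importance doubles,
                        w /= 2                # so split 2-for-1 at half weight
                        bank.append((pos, w))
                    elif rng.uniform() < 0.5: # shallower: Russian roulette,
                        pos -= 1              # survivors carry doubled weight
                        w *= 2
                    else:
                        w = 0.0               # rouletted away
                        break
                total += w if pos == L else 0.0
        return total / n

    r = (1 - P_FWD) / P_FWD                   # exact answer, by gambler's ruin
    exact = (r - 1) / (r**L - 1)
    print(f"analog {analog(50_000):.2e}  split/RR {split_roulette(5_000):.2e}  "
          f"exact {exact:.2e}")
    ```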

  7. A new method for estimating the demographic history from DNA sequences: an importance sampling approach

    PubMed Central

    Ait Kaci Azzou, Sadoune; Larribe, Fabrice; Froda, Sorana

    2015-01-01

    The effective population size over time (demographic history) can be retraced from a sample of contemporary DNA sequences. In this paper, we propose a novel methodology based on importance sampling (IS) for exploring such demographic histories. Our starting point is the generalized skyline plot, with the main difference being that our procedure, the skywis plot, uses a large number of genealogies. The information provided by these genealogies is combined according to the IS weights. Thus, we compute a weighted average of the effective population sizes on specific time intervals (epochs), where the genealogies that agree more with the data are given more weight. We illustrate by a simulation study that the skywis plot correctly reconstructs the recent demographic history under the scenarios most commonly considered in the literature. In particular, our method can capture a change point in the effective population size, and its overall performance is comparable with that of the Bayesian skyline plot. We also introduce the case of serially sampled sequences and illustrate that it is possible to improve the performance of the skywis plot in the case of an exponential expansion of the effective population size. PMID:26300910

  8. Comparison of estimates of hardwood bole volume using importance sampling, the centroid method, and some taper equations

    Treesearch

    Wiant, Harry V., Jr.; Spangler, Michael L.; Baumgras, John E.

    2002-01-01

    Various taper systems and the centroid method were compared to unbiased volume estimates made by importance sampling for 720 hardwood trees selected throughout the state of West Virginia. Only the centroid method consistently gave volume estimates that did not differ significantly from those made by importance sampling, although some taper equations did well for most...

  9. Glyphosate herbicide residue determination in samples of environmental importance using spectrophotometric method.

    PubMed

    Jan, M Rasul; Shah, Jasmin; Muhammad, Mian; Ara, Behisht

    2009-09-30

    A simple selective spectrophotometric method has been developed for the determination of glyphosate herbicide in environmental and biological samples. Glyphosate was reacted with carbon disulphide to form dithiocarbamic acid, which then formed a complex with copper in the presence of ammonia. The absorbance of the resulting yellow copper dithiocarbamate complex was measured at 435 nm, with a molar absorptivity of 1.864 × 10³ L mol⁻¹ cm⁻¹. The analytical parameters were optimized and Beer's law was obeyed in the range of 1.0-70 µg mL⁻¹. The composition ratio of the complex was glyphosate:copper (2:1), as established by Job's method, with a formation constant of 1.06 × 10⁵. Glyphosate was satisfactorily determined with limits of detection and quantification of 1.1 and 3.7 µg mL⁻¹, respectively. The investigated method was applied successfully to environmental samples. Recovery values in soil, wheat grain and water samples were found to be 80.0±0.46 to 87.0±0.28%, 95.0±0.88 to 102.0±0.98% and 85.0±0.68 to 92.0±0.37%, respectively.
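
    A worked Beer-Lambert check using the reported molar absorptivity; the absorbance reading and the 1 cm cell are assumptions for the example (glyphosate molar mass 169.07 g/mol):

    ```python
    # Beer-Lambert law A = eps * l * c with the reported eps; the measured
    # absorbance A and the 1 cm path length are assumed for illustration.
    eps, l, A = 1.864e3, 1.0, 0.45          # L mol^-1 cm^-1, cm, dimensionless
    c = A / (eps * l)                       # mol L^-1
    print(f"glyphosate: {c * 169.07 * 1e3:.0f} ug/mL")
    # ~41 ug/mL, inside the reported 1.0-70 ug/mL Beer's-law range
    ```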

  10. Fast computation of diffuse reflectance in optical coherence tomography using an importance sampling-based Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Lima, Ivan T., Jr.; Kalra, Anshul; Hernández-Figueroa, Hugo E.; Sherif, Sherif S.

    2012-03-01

    Computer simulations of light transport in multi-layered turbid media are an effective way to theoretically investigate light transport in tissue, which can be applied to the analysis, design and optimization of optical coherence tomography (OCT) systems. We present a computationally efficient method to calculate the diffuse reflectance due to ballistic and quasi-ballistic components of photons scattered in turbid media, which represents the signal in optical coherence tomography systems. Our importance sampling based Monte Carlo method enables the calculation of the OCT signal with less than one hundredth of the computational time required by the conventional Monte Carlo method. It also does not produce a systematic bias in the statistical result that is typically observed in existing methods to speed up Monte Carlo simulations of light transport in tissue. This method can be used to assess and optimize the performance of existing OCT systems, and it can also be used to design novel OCT systems.

  11. An efficient reliability algorithm for locating design point using the combination of importance sampling concepts and response surface method

    NASA Astrophysics Data System (ADS)

    Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin

    2017-06-01

    Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm, employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a proposed two-step rule for updating the design point. This part finishes after a small number of samples have been generated. RSM then takes over, using Bucher's experimental design with the last design point as the center point and a proposed effective length as the radius. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.

  12. DMATIS: Dark Matter ATtenuation Importance Sampling

    NASA Astrophysics Data System (ADS)

    Mahdawi, Mohammad Shafi; Farrar, Glennys R.

    2017-05-01

    DMATIS (Dark Matter ATtenuation Importance Sampling) calculates the trajectories of DM particles that propagate in the Earth's crust and the lead shield to reach the DAMIC detector, using an importance sampling Monte Carlo simulation. A detailed Monte Carlo simulation avoids the deficiencies of the SGED/KS method, which uses a mean energy loss description to calculate the lower bound on the DM-proton cross section. The code implementing the importance sampling technique makes the brute-force Monte Carlo simulation of moderately strongly interacting DM with nucleons computationally feasible. DMATIS is written in Python 3 and MATHEMATICA.

  13. Diagnosis of cerebral toxoplasmosis in AIDS patients in Brazil: importance of molecular and immunological methods using peripheral blood samples.

    PubMed

    Colombo, Fabio A; Vidal, José E; Penalva de Oliveira, Augusto C; Hernandez, Adrián V; Bonasser-Filho, Francisco; Nogueira, Roberta S; Focaccia, Roberto; Pereira-Chioccola, Vera Lucia

    2005-10-01

    Cerebral toxoplasmosis is the most common cerebral focal lesion in AIDS and still accounts for high morbidity and mortality in Brazil. Its occurrence is more frequent in patients with low CD4+ T-cell counts. It is directly related to the prevalence of anti-Toxoplasma gondii antibodies in the population. Therefore, it is important to evaluate sensitive, less invasive, and rapid diagnostic tests. We evaluated the value of PCR using peripheral blood samples on the diagnosis of cerebral toxoplasmosis and whether its association with immunological assays can contribute to a timely diagnosis. We prospectively analyzed blood samples from 192 AIDS patients divided into two groups. The first group was composed of samples from 64 patients with cerebral toxoplasmosis diagnosed by clinical and radiological features. The second group was composed of samples from 128 patients with other opportunistic diseases. Blood collection from patients with cerebral toxoplasmosis was done before or on the third day of anti-toxoplasma therapy. PCR for T. gondii, indirect immunofluorescence, enzyme-linked immunosorbent assay, and an avidity test for toxoplasmosis were performed on all samples. The PCR sensitivity and specificity for diagnosis of cerebral toxoplasmosis in blood were 80% and 98%, respectively. Patients with cerebral toxoplasmosis (89%) presented higher titers of anti-T. gondii IgG antibodies than patients with other diseases (57%) (P < 0.001). These findings suggest the clinical value of the use of both PCR and high titers of anti-T. gondii IgG antibodies for the diagnosis of cerebral toxoplasmosis. This strategy may prevent more invasive approaches.

  14. Diagnosis of Cerebral Toxoplasmosis in AIDS Patients in Brazil: Importance of Molecular and Immunological Methods Using Peripheral Blood Samples

    PubMed Central

    Colombo, Fabio A.; Vidal, José E.; Oliveira, Augusto C. Penalva de; Hernandez, Adrián V.; Bonasser-Filho, Francisco; Nogueira, Roberta S.; Focaccia, Roberto; Pereira-Chioccola, Vera Lucia

    2005-01-01

    Cerebral toxoplasmosis is the most common cerebral focal lesion in AIDS and still accounts for high morbidity and mortality in Brazil. Its occurrence is more frequent in patients with low CD4+ T-cell counts. It is directly related to the prevalence of anti-Toxoplasma gondii antibodies in the population. Therefore, it is important to evaluate sensitive, less invasive, and rapid diagnostic tests. We evaluated the value of PCR using peripheral blood samples on the diagnosis of cerebral toxoplasmosis and whether its association with immunological assays can contribute to a timely diagnosis. We prospectively analyzed blood samples from 192 AIDS patients divided into two groups. The first group was composed of samples from 64 patients with cerebral toxoplasmosis diagnosed by clinical and radiological features. The second group was composed of samples from 128 patients with other opportunistic diseases. Blood collection from patients with cerebral toxoplasmosis was done before or on the third day of anti-toxoplasma therapy. PCR for T. gondii, indirect immunofluorescence, enzyme-linked immunosorbent assay, and an avidity test for toxoplasmosis were performed on all samples. The PCR sensitivity and specificity for diagnosis of cerebral toxoplasmosis in blood were 80% and 98%, respectively. Patients with cerebral toxoplasmosis (89%) presented higher titers of anti-T. gondii IgG antibodies than patients with other diseases (57%) (P < 0.001). These findings suggest the clinical value of the use of both PCR and high titers of anti-T. gondii IgG antibodies for the diagnosis of cerebral toxoplasmosis. This strategy may prevent more invasive approaches. PMID:16207959

  15. Adaptive importance sampling for network growth models

    PubMed Central

    Holmes, Susan P.

    2016-01-01

    Network Growth Models such as Preferential Attachment and Duplication/Divergence are popular generative models with which to study complex networks in biology, sociology, and computer science. However, analyzing them within the framework of model selection and statistical inference is often complicated and computationally difficult, particularly when comparing models that are not directly related or nested. In practice, ad hoc methods are often used with uncertain results. If possible, the use of standard likelihood-based statistical model selection techniques is desirable. With this in mind, we develop an Adaptive Importance Sampling algorithm for estimating likelihoods of Network Growth Models. We introduce the use of the classic Plackett-Luce model of rankings as a family of importance distributions. Updates to importance distributions are performed iteratively via the Cross-Entropy Method with an additional correction for degeneracy/over-fitting inspired by the Minimum Description Length principle. This correction can be applied to other estimation problems using the Cross-Entropy method for integration/approximate counting, and it provides an interpretation of Adaptive Importance Sampling as iterative model selection. Empirical results for the Preferential Attachment model are given, along with a comparison to an alternative established technique, Annealed Importance Sampling. PMID:27182098
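
    A minimal cross-entropy adaptive importance sampling loop on a toy rare event; the Gaussian importance family below stands in for the paper's Plackett-Luce family over node orderings, and all numbers are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Cross-entropy (CE) adaptive importance sampling for P(X > 4), X ~ N(0,1),
    # with importance family N(mu, 1).
    def log_phi(x, mu=0.0):
        return -0.5 * (x - mu) ** 2          # unnormalized log N(mu, 1)

    mu, N, rho, level = 0.0, 2_000, 0.1, 4.0
    for _ in range(6):
        x = rng.normal(mu, 1.0, N)
        gamma = min(level, np.quantile(x, 1.0 - rho))   # raise the level gradually
        elite = x[x >= gamma]
        w = np.exp(log_phi(elite) - log_phi(elite, mu)) # likelihood ratios
        mu = np.sum(w * elite) / np.sum(w)              # CE update = weighted MLE

    x = rng.normal(mu, 1.0, N)
    w = np.exp(log_phi(x) - log_phi(x, mu))
    print("adapted mu:", round(mu, 2), " P(X>4):", np.mean(w * (x > level)))
    # exact: 1 - Phi(4) ~ 3.17e-5
    ```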

  16. Adaptive Importance Sampling for Control and Inference

    NASA Astrophysics Data System (ADS)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows one to learn feedback controllers using an arbitrary parametrisation. We refer to this method as the path integral cross entropy method, or PICE. We illustrate this method for some simple examples. The PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.

  17. Honest Importance Sampling with Multiple Markov Chains.

    PubMed

    Tan, Aixin; Doss, Hani; Hobert, James P

    2015-01-01

    Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general set up, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable selection in
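
    The iid case the abstract builds on fits in a few lines: estimate an expectation under a target π with draws from a heavier-tailed trial density π1, and attach the CLT-based standard error that the two moment conditions justify. The target, trial density, and test function are chosen only for illustration.

    ```python
    import numpy as np
    from math import lgamma, log, pi

    rng = np.random.default_rng(5)

    # Target pi = N(0,1); trial pi1 = Student-t(5), whose heavier tails keep
    # the weight's second moment finite (the moment condition for the CLT).
    N = 50_000
    x = rng.standard_t(5, N)
    log_pi = -0.5 * x**2 - 0.5 * log(2 * pi)
    log_pi1 = (lgamma(3.0) - lgamma(2.5) - 0.5 * log(5 * pi)
               - 3.0 * np.log1p(x**2 / 5.0))
    wh = np.exp(log_pi - log_pi1) * x**2        # weight times h(x) = x^2
    est = wh.mean()                             # estimates E_pi[X^2] = 1
    se = wh.std(ddof=1) / np.sqrt(N)            # simple consistent SE (iid case)
    print(f"{est:.4f} +/- {1.96 * se:.4f}")
    ```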

  18. The importance of microhabitat for biodiversity sampling.

    PubMed

    Mehrabi, Zia; Slade, Eleanor M; Solis, Angel; Mann, Darren J

    2014-01-01

    Responses to microhabitat are often neglected when ecologists sample animal indicator groups. Microhabitats may be particularly influential in non-passive biodiversity sampling methods, such as baited traps or light traps, and for certain taxonomic groups which respond to fine scale environmental variation, such as insects. Here we test the effects of microhabitat on measures of species diversity, guild structure and biomass of dung beetles, a widely used ecological indicator taxon. We demonstrate that choice of trap placement influences dung beetle functional guild structure and species diversity. We found that locally measured environmental variables were unable to fully explain trap-based differences in species diversity metrics or microhabitat specialism of functional guilds. To compare the effects of habitat degradation on biodiversity across multiple sites, sampling protocols must be standardized and scale-relevant. Our work highlights the importance of considering microhabitat scale responses of indicator taxa and designing robust sampling protocols which account for variation in microhabitats during trap placement. We suggest that this can be achieved either through standardization of microhabitat or through better efforts to record relevant environmental variables that can be incorporated into analyses to account for microhabitat effects. This is especially important when rapidly assessing the consequences of human activity on biodiversity loss and associated ecosystem function and services.

  19. The Importance of Microhabitat for Biodiversity Sampling

    PubMed Central

    Mehrabi, Zia; Slade, Eleanor M.; Solis, Angel; Mann, Darren J.

    2014-01-01

    Responses to microhabitat are often neglected when ecologists sample animal indicator groups. Microhabitats may be particularly influential in non-passive biodiversity sampling methods, such as baited traps or light traps, and for certain taxonomic groups which respond to fine scale environmental variation, such as insects. Here we test the effects of microhabitat on measures of species diversity, guild structure and biomass of dung beetles, a widely used ecological indicator taxon. We demonstrate that choice of trap placement influences dung beetle functional guild structure and species diversity. We found that locally measured environmental variables were unable to fully explain trap-based differences in species diversity metrics or microhabitat specialism of functional guilds. To compare the effects of habitat degradation on biodiversity across multiple sites, sampling protocols must be standardized and scale-relevant. Our work highlights the importance of considering microhabitat scale responses of indicator taxa and designing robust sampling protocols which account for variation in microhabitats during trap placement. We suggest that this can be achieved either through standardization of microhabitat or through better efforts to record relevant environmental variables that can be incorporated into analyses to account for microhabitat effects. This is especially important when rapidly assessing the consequences of human activity on biodiversity loss and associated ecosystem function and services. PMID:25469770

  20. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively

  1. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively

  2. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    NASA Astrophysics Data System (ADS)

    Cao, Youfang; Liang, Jie

    2013-07-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively

  3. Annealed Importance Sampling Reversible Jump MCMC algorithms

    SciTech Connect

    Karagiannis, Georgios; Andrieu, Christophe

    2013-03-20

    It will soon be 20 years since reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithms were proposed. They have significantly extended the scope of Markov chain Monte Carlo simulation methods, offering the promise of routinely tackling transdimensional sampling problems, as encountered in Bayesian model selection problems for example, in a principled and flexible fashion. Their practical efficient implementation, however, still remains a challenge. A particular difficulty encountered in practice is the choice of the dimension matching variables (both their nature and their distribution) and the reversible transformations which allow one to define the one-to-one mappings underpinning the design of these algorithms. Indeed, even seemingly sensible choices can lead to algorithms with very poor performance. The focus of this paper is the development and performance evaluation of a method, annealed importance sampling RJ-MCMC (aisRJ), which addresses this problem by mitigating the sensitivity of RJ-MCMC algorithms to the aforementioned poor design. As we shall see, the algorithm can be understood as an “exact approximation” of an idealized MCMC algorithm that would sample from the model probabilities directly in a model selection set-up. Such an idealized algorithm may have good theoretical convergence properties, but typically cannot be implemented, and our algorithms can approximate the performance of such idealized algorithms to an arbitrary degree while not introducing any bias for any degree of approximation. Our approach combines the dimension matching ideas of RJ-MCMC with annealed importance sampling and its Markov chain Monte Carlo implementation. We illustrate the performance of the algorithm with numerical simulations which indicate that, although the approach may at first appear computationally involved, it is in fact competitive.
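
    For readers new to the annealed importance sampling ingredient, a minimal Neal-style AIS sketch (far simpler than aisRJ): tempered bridges between an easy density and a bimodal target, one Metropolis move per bridge, and accumulated log-weights. Densities and schedule are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    # Bridge from an easy density f0 = N(0, 2^2) to a bimodal target f1
    # through f_b ~ f0^(1-b) * f1^b, accumulating log-weights.
    log_f0 = lambda x: -0.5 * (x / 2.0)**2
    log_f1 = lambda x: np.logaddexp(-0.5 * (x - 3)**2, -0.5 * (x + 3)**2)

    N = 5_000
    x = rng.normal(0.0, 2.0, N)
    logw = np.zeros(N)
    betas = np.linspace(0.0, 1.0, 51)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        logw += (b1 - b0) * (log_f1(x) - log_f0(x))   # weight update at current x
        prop = x + rng.normal(0.0, 1.0, N)            # one Metropolis move per bridge
        log_acc = ((1 - b1) * (log_f0(prop) - log_f0(x))
                   + b1 * (log_f1(prop) - log_f1(x)))
        x = np.where(np.log(rng.uniform(size=N)) < log_acc, prop, x)

    # mean weight estimates Z1/Z0 (= 1 by construction here, a handy check)
    print("log(Z1/Z0):", np.logaddexp.reduce(logw) - np.log(N))
    w = np.exp(logw - logw.max())
    print("E_f1[|x|]:", np.sum(w * np.abs(x)) / np.sum(w))   # ~3
    ```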

  4. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  5. Annealed Importance Sampling for Neural Mass Models

    PubMed Central

    Penny, Will; Sengupta, Biswa

    2016-01-01

    Neural Mass Models provide a compact description of the dynamical activity of cell populations in neocortical regions. Moreover, models of regional activity can be connected together into networks, and inferences made about the strength of connections, using M/EEG data and Bayesian inference. To date, however, Bayesian methods have been largely restricted to the Variational Laplace (VL) algorithm which assumes that the posterior distribution is Gaussian and finds model parameters that are only locally optimal. This paper explores the use of Annealed Importance Sampling (AIS) to address these restrictions. We implement AIS using proposals derived from Langevin Monte Carlo (LMC) which uses local gradient and curvature information for efficient exploration of parameter space. In terms of the estimation of Bayes factors, VL and AIS agree about which model is best but report different degrees of belief. Additionally, AIS finds better model parameters and we find evidence of non-Gaussianity in their posterior distribution. PMID:26942606

  6. State estimation in large-scale open channel networks using sequential Monte Carlo methods: Optimal sampling importance resampling and implicit particle filters

    NASA Astrophysics Data System (ADS)

    Rafiee, Mohammad; Barrau, Axel; Bayen, Alexandre M.

    2013-06-01

    This article investigates the performance of Monte Carlo-based estimation methods for estimating the flow state in large-scale open channel networks. After constructing a state space model of the flow based on the Saint-Venant equations, we implement the optimal sampling importance resampling filter to perform state estimation in a case in which measurements are available at every time step. Considering a case in which measurements become available intermittently, a random-map implementation of the implicit particle filter is applied to estimate the state trajectory in the interval between measurements. Finally, some heuristics are proposed, which are shown to improve the estimation results and lower the computational cost. In the first heuristic, considering the case in which measurements are available at every time step, we apply the implicit particle filter over time intervals of a desired size while incorporating all the available measurements over the corresponding time interval. As a second heuristic, we introduce a maximum a posteriori (MAP) method, which does not require sampling. It will be seen, through implementation, that the MAP method provides more accurate results in our application while having a smaller computational cost. All estimation methods are tested on a network of 19 tidally forced subchannels and 1 reservoir, Clifton Court Forebay, in the Sacramento-San Joaquin Delta in California, and numerical results are presented.
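
    A bootstrap sampling importance resampling filter on a scalar toy model gives the flavor of the estimators discussed; the article's models are Saint-Venant channel networks, and it also uses the optimal proposal and the implicit particle filter, none of which is reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    T, N, q, r = 50, 1_000, 0.1, 0.5    # steps, particles, noise std devs
    x_true = np.cumsum(rng.normal(0, q, T))       # random-walk state
    y = x_true + rng.normal(0, r, T)              # noisy observations

    particles = rng.normal(0.0, 1.0, N)
    est = np.empty(T)
    for t in range(T):
        particles = particles + rng.normal(0, q, N)   # propagate (prior proposal)
        logw = -0.5 * ((y[t] - particles) / r)**2     # importance weight = likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        est[t] = w @ particles
        # systematic resampling keeps the ensemble from degenerating
        u = (rng.uniform() + np.arange(N)) / N
        particles = particles[np.searchsorted(np.cumsum(w), u)]

    print("filter RMSE:", np.sqrt(np.mean((est - x_true)**2)))
    ```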

  7. Transition Path Sampling Methods

    NASA Astrophysics Data System (ADS)

    Dellago, C.; Bolhuis, P. G.; Geissler, P. L.

    Transition path sampling, based on statistical mechanics in trajectory space, is a set of computational methods for the simulation of rare events in complex systems. In this chapter we give an overview of these techniques and describe their statistical mechanical basis as well as their application.

  8. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2017-03-07

    In one embodiment, the present disclosure provides an apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. In various examples, the clamp is external to the tubing bundle or integral with the tubing bundle. According to one method, a tubing bundle and wireline are deployed together and the tubing bundle periodically secured to the wireline using a clamp. In another embodiment, the present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit. In a specific example, one or more clamps are used to connect the first and/or second conduits to an external wireline.

  9. Acceptance sampling methods for sample results verification

    SciTech Connect

    Jesse, C.A.

    1993-06-01

    This report proposes a statistical sampling method for use during the sample results verification portion of the validation of data packages. In particular, this method was derived specifically for the validation of data packages for metals target analyte analysis performed under United States Environmental Protection Agency Contract Laboratory Program protocols, where sample results verification can be quite time consuming. The purpose of such a statistical method is to provide options in addition to the "all or nothing" options that currently exist for sample results verification. The proposed method allows the amount of data validated during the sample results verification process to be based on a balance between risks and the cost of inspection.

  10. Importance of sampling frequency when collecting diatoms

    NASA Astrophysics Data System (ADS)

    Wu, Naicheng; Faber, Claas; Sun, Xiuming; Qu, Yueming; Wang, Chao; Ivetic, Snjezana; Riis, Tenna; Ulrich, Uta; Fohrer, Nicola

    2016-11-01

    There has been increasing interest in diatom-based bio-assessment but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To cover this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013–30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1–5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June-July 2013) and 13 days at Cluster 5 (February-April 2014), whereas no specific ASFs were found at Cluster 1 (April-May 2013), 3 (August-November 2013) (>30 days) and Cluster 4 (December 2013 - January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency.

  11. Importance of sampling frequency when collecting diatoms

    PubMed Central

    Wu, Naicheng; Faber, Claas; Sun, Xiuming; Qu, Yueming; Wang, Chao; Ivetic, Snjezana; Riis, Tenna; Ulrich, Uta; Fohrer, Nicola

    2016-01-01

    There has been increasing interest in diatom-based bio-assessment but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To cover this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013–30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1–5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June-July 2013) and 13 days at Cluster 5 (February-April 2014), whereas no specific ASFs were found at Cluster 1 (April-May 2013), 3 (August-November 2013) (>30 days) and Cluster 4 (December 2013 - January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency. PMID:27841310

  12. Improved Sampling Method Reduces Isokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Karels, Gale G.

    The particulate sampling system currently in use by the Bay Area Air Pollution Control District, San Francisco, California is described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. The method represents a practical, inexpensive tool that can…

  13. [The importance of the sample design effect].

    PubMed

    Guillén, Montserrat; Ayuso, Mercedes

    2004-01-01

    Sample selection through a complex design influences the subsequent statistical analysis. The different means of sample selection may result in bias and greater variance of estimators; simple randomized sampling is the reference design. Diverse examples are provided, illustrating how the various sampling strategies can result in bias and increase variance. The inclusion of different weighting techniques reduces bias. Evaluation of the effect of design enables measurement of the degree of variance distortion due to the sampling design used and therefore provides a direct evaluation of the alteration in the confidence intervals estimated when the sampling design deviates from simple randomized sampling. We recommend measurement of the effect of the design on analysis of the data obtained by sampling and inclusion of weighting techniques in statistical analyses.
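
    A small simulation makes the design effect tangible: the variance of a cluster-sample mean divided by the variance of a simple-random-sample mean of the same size. The population, cluster structure, and sample sizes below are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    # Fixed population: 100 clusters of 20, with a shared cluster effect
    # that makes members of a cluster resemble each other (ICC = 0.5).
    pop = rng.normal(0, 1, (100, 20)) + rng.normal(0, 1, (100, 1))

    srs, clu = [], []
    for _ in range(2_000):
        srs.append(rng.choice(pop.ravel(), 200).mean())             # SRS, n = 200
        clu.append(pop[rng.choice(100, 10, replace=False)].mean())  # 10 clusters
    deff = np.var(clu) / np.var(srs)
    print("design effect:", round(deff, 1))   # ~ 1 + (m-1)*ICC = 10.5 here
    ```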

  14. Sampling system and method

    SciTech Connect

    Decker, David L; Lyles, Brad F; Purcell, Richard G; Hershey, Ronald Lee

    2014-05-20

    An apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. The method includes deploying the tubing bundle and wireline together; the tubing bundle is periodically secured to the wireline using a clamp.

  15. Accelerated Nonrigid Intensity-Based Image Registration Using Importance Sampling

    PubMed Central

    Bhagalia, Roshni; Fessler, Jeffrey A.; Kim, Boklye

    2015-01-01

    Nonrigid image registration methods using intensity-based similarity metrics are becoming increasingly common tools to estimate many types of deformations. Nonrigid warps can be very flexible with a large number of parameters and gradient optimization schemes are widely used to estimate them. However for large datasets, the computation of the gradient of the similarity metric with respect to these many parameters becomes very time consuming. Using a small random subset of image voxels to approximate the gradient can reduce computation time. This work focuses on the use of importance sampling to improve accuracy and reduce the variance of this gradient approximation. The proposed importance sampling framework is based on an edge-dependent adaptive sampling distribution designed for use with intensity-based registration algorithms. We compare the performance of registration based on stochastic approximations with and without importance sampling to that using deterministic gradient descent. Empirical results, on simulated MR brain data and real CT inhale-exhale lung data from 8 subjects, show that a combination of stochastic approximation methods and importance sampling improves the rate of convergence of the registration process while preserving accuracy. PMID:19211343

  16. Accelerated nonrigid intensity-based image registration using importance sampling.

    PubMed

    Bhagalia, Roshni; Fessler, Jeffrey A; Kim, Boklye

    2009-08-01

    Nonrigid image registration methods using intensity-based similarity metrics are becoming increasingly common tools to estimate many types of deformations. Nonrigid warps can be very flexible with a large number of parameters and gradient optimization schemes are widely used to estimate them. However, for large datasets, the computation of the gradient of the similarity metric with respect to these many parameters becomes very time consuming. Using a small random subset of image voxels to approximate the gradient can reduce computation time. This work focuses on the use of importance sampling to reduce the variance of this gradient approximation. The proposed importance sampling framework is based on an edge-dependent adaptive sampling distribution designed for use with intensity-based registration algorithms. We compare the performance of registration based on stochastic approximations with and without importance sampling to that using deterministic gradient descent. Empirical results, on simulated magnetic resonance brain data and real computed tomography inhale-exhale lung data from eight subjects, show that a combination of stochastic approximation methods and importance sampling accelerates the registration process while preserving accuracy.
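
    A one-dimensional toy of the idea, not the authors' registration code: recover a translation by stochastic gradient descent on a sum-of-squared-differences metric, drawing a small voxel subset from an edge-weighted distribution and reweighting each term by the inverse sampling probability so the gradient estimate stays unbiased.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    n = 512
    x = np.arange(n, dtype=float)
    fixed = np.tanh((x - 200) / 5) + np.tanh((300 - x) / 5)  # smooth "box" image
    true_t = 3.0
    moving = np.interp(x - true_t, x, fixed)                 # fixed, shifted right
    dmov = np.gradient(moving)

    # edge-dependent sampling distribution: probability ~ |gradient| + floor
    prob = np.abs(np.gradient(fixed)) + 1e-4
    prob /= prob.sum()

    def grad_est(t, m_samp=32):
        """Unbiased estimate of d/dt sum_i (moving(i + t) - fixed(i))^2 from
        m_samp voxels i ~ prob, each term reweighted by 1/prob (importance)."""
        idx = rng.choice(n, size=m_samp, p=prob)
        m = np.interp(x[idx] + t, x, moving)
        dm = np.interp(x[idx] + t, x, dmov)   # chain rule: d moving(i+t)/dt
        g = 2.0 * (m - fixed[idx]) * dm
        return np.mean(g / prob[idx])

    t = 0.0
    for _ in range(300):                      # stochastic gradient descent
        t -= 0.02 * grad_est(t)
    print(f"recovered shift {t:.2f} (true {true_t})")
    ```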

  17. Importance of tissue sampling, laboratory methods, and patient characteristics for detection of Pneumocystis in autopsied lungs of non-immunosuppressed individuals.

    PubMed

    Vargas, S L; Ponce, C; Bustamante, R; Calderón, E; Nevez, G; De Armas, Y; Matos, O; Miller, R F; Gallo, M J

    2017-06-05

    To understand the epidemiological significance of Pneumocystis detection in a lung tissue sample of non-immunosuppressed individuals, we examined the sampling procedures, laboratory methodology, and patient characteristics of autopsy series reported in the literature. The number of tissue specimens, DNA-extraction procedures, age and underlying diagnosis strongly influence yield and are critical to understanding yield differences of Pneumocystis among reports of pulmonary colonization in immunocompetent individuals.

  18. Computing ensembles of transitions from stable states: Dynamic importance sampling.

    PubMed

    Perilla, Juan R; Beckstein, Oliver; Denning, Elizabeth J; Woolf, Thomas B

    2011-01-30

    There is an increasing dataset of solved biomolecular structures in more than one conformation and increasing evidence that large-scale conformational change is critical for biomolecular function. In this article, we present our implementation of a dynamic importance sampling (DIMS) algorithm that is directed toward improving our understanding of important intermediate states between experimentally defined starting and ending points. This complements traditional molecular dynamics methods where most of the sampling time is spent in the stable free energy wells defined by these initial and final points. As such, the algorithm creates a candidate set of transitions that provide insights for the much slower and probably most important, functionally relevant degrees of freedom. The method is implemented in the program CHARMM and is tested on six systems of growing size and complexity. These systems, the folding of Protein A and of Protein G, the conformational changes in the calcium sensor S100A6, the glucose-galactose-binding protein, maltodextrin, and lactoferrin, are also compared against other approaches that have been suggested in the literature. The results suggest good sampling on a diverse set of intermediates for all six systems with an ability to control the bias and thus to sample distributions of trajectories for the analysis of intermediate states.

  19. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    SciTech Connect

    Peplow, Douglas E.; Mosher, Scott W; Evans, Thomas M

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS: Consistent Adjoint Driven Importance Sampling. This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution, and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.

  1. Adaptive importance sampling Monte Carlo simulation of rare transition events.

    PubMed

    de Koning, Maurice; Cai, Wei; Sadigh, Babak; Oppelstrup, Tomas; Kalos, Malvin H; Bulatov, Vasily V

    2005-02-15

    We develop a general theoretical framework for the recently proposed importance sampling method for enhancing the efficiency of rare-event simulations [W. Cai, M. H. Kalos, M. de Koning, and V. V. Bulatov, Phys. Rev. E 66, 046703 (2002)], and discuss practical aspects of its application. We define the success/fail ensemble of all possible successful and failed transition paths of any duration and demonstrate that in this formulation the rare-event problem can be interpreted as a "hit-or-miss" Monte Carlo quadrature calculation of a path integral. The fact that the integrand contributes significantly only for a very tiny fraction of all possible paths then naturally leads to a "standard" importance sampling approach to Monte Carlo (MC) quadrature and the existence of an optimal importance function. In addition to showing that the approach is general and expected to be applicable beyond the realm of Markovian path simulations, for which the method was originally proposed, the formulation reveals a conceptual analogy with the variational MC (VMC) method. The search for the optimal importance function in the former is analogous to finding the ground-state wave function in the latter. In two model problems we discuss practical aspects of finding a suitable approximation for the optimal importance function. For this purpose we follow the strategy that is typically adopted in VMC calculations: the selection of a trial functional form for the optimal importance function, followed by the optimization of its adjustable parameters. The latter is accomplished by means of an adaptive optimization procedure based on a combination of steepest-descent and genetic algorithms.
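
    The "hit-or-miss" path-integral picture can be made concrete on a Markovian toy: a down-drifting random walk rarely reaches the upper boundary, and tilting each step so the drift is reversed acts here as a near-optimal importance function, because every successful path then carries the same likelihood-ratio weight. All parameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)
    p_up, b = 0.3, 10                 # down-drifting walk; success = reach +b

    def estimate(n, q_up):
        vals = np.zeros(n)
        for i in range(n):
            pos, logw = 0, 0.0
            while -b < pos < b:       # accumulate the path likelihood ratio
                if rng.uniform() < q_up:
                    pos += 1
                    logw += np.log(p_up / q_up)
                else:
                    pos -= 1
                    logw += np.log((1 - p_up) / (1 - q_up))
            vals[i] = np.exp(logw) if pos == b else 0.0
        return vals.mean(), vals.std() / np.sqrt(n)

    r = (1 - p_up) / p_up
    exact = (r**b - 1) / (r**(2 * b) - 1)          # gambler's ruin
    for q in (p_up, 1 - p_up):                     # analog, then drift-reversed
        est, se = estimate(20_000, q)
        print(f"q_up={q:.1f}: {est:.2e} +/- {se:.2e}  (exact {exact:.2e})")
    ```

    With q_up = 1 - p_up, the weight of any path reaching +b is exactly (p_up/(1 - p_up))^b regardless of the path taken, so the only randomness left is whether the tilted walk succeeds, which it almost always does; this is a small instance of the optimal importance function discussed in the abstract.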

  2. Sampling methods for phlebotomine sandflies.

    PubMed

    Alexander, B

    2000-06-01

    A review is presented of methods for sampling phlebotomine sandflies (Diptera: Psychodidae). Among approximately 500 species of Phlebotominae so far described, mostly in the New World genus Lutzomyia and the Old World genus Phlebotomus, about 10% are known vectors of Leishmania parasites or other pathogens. Despite being small and fragile, sandflies have a wide geographical range with species occupying a considerable diversity of ecotopes and habitats, from deserts to humid forests, so that suitable methods for collecting them are influenced by environmental conditions where they are sought. Because immature phlebotomines occupy obscure terrestrial habitats, it is difficult to find their breeding sites. Therefore, most trapping methods and sampling procedures focus on sandfly adults, whether resting or active. The diurnal resting sites of adult sandflies include tree holes, buttress roots, rock crevices, houses, animal shelters and burrows, from which they may be aspirated directly or trapped after being disturbed. Sandflies can be collected during their periods of activity by interception traps, or by using attractants such as bait animals, CO2 or light. The method of trapping used should: (a) be suited to the habitat and area to be surveyed, (b) take into account the segment of the sandfly population to be sampled (species, sex and reproductive condition) and (c) yield specimens of appropriate condition for the study objectives (e.g. identification of species present, population genetics or vector implication). Methods for preservation and transportation of sandflies to the laboratory also depend on the objectives of a particular study and are described accordingly.

  3. Elaborating transition interface sampling methods

    SciTech Connect

    van Erp, Titus S.; Bolhuis, Peter G.

    2005-05-01

    We review two recently developed efficient methods for calculating rate constants of processes dominated by rare events in high-dimensional complex systems. The first is transition interface sampling (TIS), based on the measurement of effective fluxes through hypersurfaces in phase space. TIS improves efficiency with respect to standard transition path sampling (TPS) rate constant techniques, because it allows a variable path length and is less sensitive to recrossings. The second method is the partial path version of TIS. Developed for diffusive processes, it exploits the loss of long time correlation. We discuss the relation between the new techniques and the standard reactive flux methods in detail. Path sampling algorithms can suffer from ergodicity problems, and we introduce several new techniques to alleviate these problems, notably path swapping, stochastic configurational bias Monte Carlo shooting moves and order-parameter free path sampling. In addition, we give algorithms to calculate other interesting properties from path ensembles besides rate constants, such as activation energies and reaction mechanisms.
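    The interface factorization at the heart of TIS can be illustrated on a toy double-well diffusion: the rate is an effective flux through the first interface times the product of conditional crossing probabilities between successive interfaces. The sketch below estimates those factors by shooting fresh trajectories from each interface, which deliberately simplifies away the proper first-crossing path ensembles of TIS; the interface placement and all parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def force(x):
    # Double-well potential V(x) = (x^2 - 1)^2, minima at x = -1 and x = +1
    return -4.0 * x * (x**2 - 1.0)

def crossing_prob(lam_i, lam_next, lam_A=-0.8, kT=0.15, dt=1e-3, n_paths=500):
    """Fraction of overdamped Langevin paths started at interface lam_i
    that reach lam_next before falling back into state A (x < lam_A):
    one factor P(lam_{i+1} | lam_i) of the TIS rate expression."""
    noise = np.sqrt(2.0 * kT * dt)
    hits = 0
    for _ in range(n_paths):
        x = lam_i
        while lam_A < x < lam_next:
            x += force(x) * dt + noise * rng.standard_normal()
        hits += x >= lam_next
    return hits / n_paths

interfaces = [-0.6, -0.3, 0.0, 0.3, 0.6]
probs = [crossing_prob(a, b) for a, b in zip(interfaces, interfaces[1:])]
print(probs, "product:", np.prod(probs))
```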

  4. Elaborating transition interface sampling methods

    NASA Astrophysics Data System (ADS)

    van Erp, Titus S.; Bolhuis, Peter G.

    2005-05-01

    We review two recently developed efficient methods for calculating rate constants of processes dominated by rare events in high-dimensional complex systems. The first is transition interface sampling (TIS), based on the measurement of effective fluxes through hypersurfaces in phase space. TIS improves efficiency with respect to standard transition path sampling (TPS) rate constant techniques, because it allows a variable path length and is less sensitive to recrossings. The second method is the partial path version of TIS. Developed for diffusive processes, it exploits the loss of long time correlation. We discuss the relation between the new techniques and the standard reactive flux methods in detail. Path sampling algorithms can suffer from ergodicity problems, and we introduce several new techniques to alleviate these problems, notably path swapping, stochastic configurational bias Monte Carlo shooting moves and order-parameter free path sampling. In addition, we give algorithms to calculate other interesting properties from path ensembles besides rate constants, such as activation energies and reaction mechanisms.

  5. Importance-sampling computation of statistical properties of coupled oscillators

    NASA Astrophysics Data System (ADS)

    Gupta, Shamik; Leitão, Jorge C.; Altmann, Eduardo G.

    2017-07-01

    We introduce and implement an importance-sampling Monte Carlo algorithm to study systems of globally coupled oscillators. Our computational method efficiently obtains estimates of the tails of the distribution of various measures of dynamical trajectories corresponding to states occurring with (exponentially) small probabilities. We demonstrate the general validity of our results by applying the method to two contrasting cases: the driven-dissipative Kuramoto model, a paradigm in the study of spontaneous synchronization; and the conservative Hamiltonian mean-field model, a prototypical system of long-range interactions. We present results for the distribution of the finite-time Lyapunov exponent and a time-averaged order parameter. Among other features, our results show most notably that the distributions exhibit a vanishing standard deviation but a skewness that is increasing in magnitude with the number of oscillators, implying that nontrivial asymmetries and states yielding rare or atypical values of the observables persist even for a large number of oscillators.

  6. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co

  7. 40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... initio. (b) Sampling methods. For purposes of paragraph (a) of this section, refiners and importers shall sample each batch of gasoline by using one of the following methods: (1) Manual sampling of tanks and... applicable procedures in ASTM method D 5842-95, entitled “Standard Practice for Sampling and Handling of...

  8. Sequential Importance Sampling for Rare Event Estimation with Computer Experiments

    SciTech Connect

    Williams, Brian J.; Picard, Richard R.

    2012-06-25

    Importance sampling often drastically improves the variance of percentile and quantile estimators of rare events. We propose a sequential strategy for iterative refinement of importance distributions for sampling uncertain inputs to a computer model to estimate quantiles of model output or the probability that the model output exceeds a fixed or random threshold. A framework is introduced for updating a model surrogate to maximize its predictive capability for rare event estimation with sequential importance sampling. Examples of the proposed methodology involving materials strength and nuclear reactor applications will be presented. The conclusions are: (1) Importance sampling improves UQ of percentile and quantile estimates relative to brute force approach; (2) Benefits of importance sampling increase as percentiles become more extreme; (3) Iterative refinement improves importance distributions in relatively few iterations; (4) Surrogates are necessary for slow running codes; (5) Sequential design improves surrogate quality in region of parameter space indicated by importance distributions; and (6) Importance distributions and VRFs stabilize quickly, while quantile estimates may converge slowly.
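    A minimal sketch of the sequential idea, substituting a cross-entropy-style update for the authors' surrogate-based framework: the importance density is repeatedly refit to the current elite samples until the rare threshold is reachable, after which the exceedance probability is estimated with likelihood-ratio weights. The toy model g and all tuning constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def g(x):                                  # stand-in for a slow computer model
    return x.sum(axis=1)

def sequential_is(threshold=12.0, dim=4, n=2000, rho=0.1, iters=10):
    """Shift a Gaussian importance density toward the rare region
    g(X) > threshold, then estimate the exceedance probability with
    likelihood-ratio weights (nominal inputs are N(0, I))."""
    mu = np.zeros(dim)
    for _ in range(iters):
        x = rng.normal(mu, 1.0, size=(n, dim))
        y = g(x)
        level = min(threshold, np.quantile(y, 1.0 - rho))
        mu = x[y >= level].mean(axis=0)    # refit importance density on elites
        if level >= threshold:
            break
    x = rng.normal(mu, 1.0, size=(n, dim))
    log_w = -0.5 * (x**2).sum(axis=1) + 0.5 * ((x - mu)**2).sum(axis=1)
    return np.mean((g(x) >= threshold) * np.exp(log_w))

# Exact value for comparison: P(N(0, 4) > 12) = P(Z > 6) ~ 1e-9
print(sequential_is())
```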

  9. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, David R.

    1998-01-01

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis.

  10. Fluid sampling apparatus and method

    DOEpatents

    Yeamans, D.R.

    1998-02-03

    Incorporation of a bellows in a sampling syringe eliminates ingress of contaminants, permits replication of amounts and compression of multiple sample injections, and enables remote sampling for off-site analysis. 3 figs.

  11. 9 CFR 327.11 - Receipts to importers for import product samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Receipts to importers for import product samples. 327.11 Section 327.11 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE... AND VOLUNTARY INSPECTION AND CERTIFICATION IMPORTED PRODUCTS § 327.11 Receipts to importers for...

  12. On the importance of incorporating sampling weights in ...

    EPA Pesticide Factsheets

    Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey design requirements for occupancy models focus on the number of sample units and the pattern of revisits to a sample unit within a season. We focus on the sampling design, or how the sample units are selected in geographic space (e.g., stratified, simple random, unequal probability, etc.). In a probability design, each sample unit has a sample weight which quantifies the number of sample units it represents in the finite (oftentimes areal) sampling frame. We demonstrate the importance of including sampling weights in occupancy model estimation when the design is not a simple random sample or equal probability design. We assume a finite areal sampling frame as proposed for a national bat monitoring program. We compare several unequal and equal probability designs and varying sampling intensity within a simulation study. We found the traditional single season occupancy model produced biased estimates of occupancy and lower confidence interval coverage rates compared to occupancy models that accounted for the sampling design. We also discuss how our findings inform the analyses proposed for the nascent North American Bat Monitoring Program and other collaborative synthesis efforts that propose h
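    Setting imperfect detection aside, the design effect highlighted above can be reproduced in a few lines: under an unequal-probability design the unweighted sample mean of occupancy is biased, while a design-weighted (Hajek) estimator recovers the truth. A hypothetical sketch with a made-up habitat covariate driving both occupancy and inclusion probability:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical finite areal frame: occupancy and inclusion probability
# both increase with the same habitat covariate (an unequal design).
N = 10_000
habitat = rng.random(N)
occupied = rng.random(N) < 0.2 + 0.6 * habitat   # true occupancy ~ 0.5
pi = 0.01 + 0.09 * habitat                       # inclusion probabilities

sampled = rng.random(N) < pi
z, w = occupied[sampled], 1.0 / pi[sampled]      # sampling weights = 1 / pi

naive = z.mean()                                 # ignores the design: biased high
weighted = np.sum(w * z) / np.sum(w)             # Hajek (design-weighted) estimator
print(f"true {occupied.mean():.3f}  naive {naive:.3f}  weighted {weighted:.3f}")
```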

  13. Duplex sampling apparatus and method

    DOEpatents

    Brown, Paul E.; Lloyd, Robert

    1992-01-01

    An improved apparatus is provided for sampling a gaseous mixture and for measuring mixture components. The apparatus includes two sampling containers connected in series serving as a duplex sampling apparatus. The apparatus is adapted to independently determine the amounts of condensable and noncondensable gases in admixture from a single sample. More specifically, a first container includes a first port capable of selectively connecting to and disconnecting from a sample source and a second port capable of selectively connecting to and disconnecting from a second container. A second container also includes a first port capable of selectively connecting to and disconnecting from the second port of the first container and a second port capable of selectively connecting to and disconnecting from a differential pressure source. By cooling a mixture sample in the first container, the condensable vapors form a liquid, leaving noncondensable gases either as free gases or dissolved in the liquid. The condensed liquid is heated to drive out dissolved noncondensable gases, and all the noncondensable gases are transferred to the second container. Then the first and second containers are separated from one another in order to separately determine the amount of noncondensable gases and the amount of condensable gases in the sample.

  14. Adaptive importance sampling of random walks on continuous state spaces

    SciTech Connect

    Baggerly, K.; Cox, D.; Picard, R.

    1998-11-01

    The authors consider adaptive importance sampling for a random walk with scoring in a general state space. Conditions under which exponential convergence occurs to the zero-variance solution are reviewed. These results generalize previous work for finite, discrete state spaces in Kollman (1993) and in Kollman, Baggerly, Cox, and Picard (1996). This paper is intended for nonstatisticians and includes considerable explanatory material.

  15. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Treesearch

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  16. Apparatus and method for handheld sampling

    DOEpatents

    Staab, Torsten A.

    2005-09-20

    The present invention includes an apparatus, and corresponding method, for taking a sample. The apparatus is built around a frame designed to be held in at least one hand. A sample media is used to secure the sample. A sample media adapter for securing the sample media is operated by a trigger mechanism connectively attached within the frame to the sample media adapter.

  17. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, C.V.

    1991-02-05

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds. 11 figures.

  18. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, Cyril V.

    1991-01-01

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method for taking a soil sample using the soil sampling device and soil sample containment device to minimize the loss of any volatile organic compounds contained in the soil sample prior to analysis is disclosed. The soil sampling device comprises two close fitting, longitudinal tubular members of suitable length, the inner tube having the outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus which analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis of the soil sample for the volatile organic compounds.

  19. Toward cost-efficient sampling methods

    NASA Astrophysics Data System (ADS)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    Sampling methods have received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small fraction of high-degree vertices can carry most of the structural information of a complex network. The two proposed sampling methods are efficient at sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. To demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods on three commonly used simulated networks (scale-free, random, and small-world) as well as on two real networks. The experimental results show that the two proposed sampling methods recover the true network structure, as reflected by the clustering coefficient, Bonacich centrality, and average path length, much better than existing sampling methods, especially when the sampling rate is low.

  20. Sampling High-Altitude and Stratified Mating Flights of Red Imported Fire Ant

    USDA-ARS?s Scientific Manuscript database

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens ...

  1. New prior sampling methods for nested sampling - Development and testing

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
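    The simplest baseline for this sampling step is plain rejection from the prior until a draw clears the current likelihood threshold; the methods proposed in the paper target exactly the regime where this baseline becomes infeasible. A minimal sketch with a toy Gaussian likelihood and a flat prior (all choices illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

def log_like(theta):                 # toy Gaussian likelihood
    return -0.5 * np.sum(theta**2)

def draw_restricted(log_l_star, lo=-5.0, hi=5.0, dim=2, max_tries=100_000):
    """Rejection sampling from a flat prior on [lo, hi]^dim subject to
    the nested-sampling constraint log L(theta) > log L*."""
    for _ in range(max_tries):
        theta = rng.uniform(lo, hi, size=dim)
        if log_like(theta) > log_l_star:
            return theta
    raise RuntimeError("constrained region too small for plain rejection")

# One compressive step: replace the worst live point with a constrained draw.
live = rng.uniform(-5.0, 5.0, size=(20, 2))
logls = np.array([log_like(t) for t in live])
worst = int(np.argmin(logls))
live[worst] = draw_restricted(logls[worst])
```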

  2. Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2005-05-01

    Probabilistic inference for Bayesian networks is in general NP-hard for both exact algorithms and approximate methods. However, for very complex networks, only approximate methods such as stochastic sampling can provide a solution under a time constraint. Several simulation methods are currently available. They include logic sampling (the first proposed stochastic method for Bayesian networks), the likelihood weighting algorithm (the most commonly used simulation method because of its simplicity and efficiency), the Markov blanket scoring method, and the importance sampling algorithm. In this paper, we first briefly review and compare these available simulation methods, and then propose an improved importance sampling algorithm, the linear Gaussian importance sampling algorithm for general hybrid models (LGIS). LGIS is aimed at hybrid Bayesian networks consisting of both discrete and continuous random variables with arbitrary distributions. It uses a linear function with additive Gaussian noise to approximate the true conditional probability distribution of a continuous variable given both its parents and the evidence in a Bayesian network. One of the most important features of the newly developed method is that it can adaptively learn the optimal importance function from previous samples. We test the inference performance of LGIS using a 16-node linear Gaussian model and a 6-node general hybrid model. A performance comparison with other well-known methods such as the junction tree (JT) and likelihood weighting (LW) shows that LGIS is very promising.
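    LGIS itself is more involved, but the likelihood weighting baseline it is compared against fits in a few lines: non-evidence variables are sampled from their priors, and each sample is weighted by the likelihood of the evidence given its parents. A sketch on a minimal hybrid network, one discrete parent and one Gaussian child, with assumed parameters:

```python
import numpy as np

rng = np.random.default_rng(6)

# Tiny hybrid network A -> X with evidence on the continuous child X:
# P(A = 1) = 0.3 and X | A ~ Normal(mu[A], 1) with mu = (0, 3).
p_a1, mu = 0.3, np.array([0.0, 3.0])

def gauss_pdf(x, m):
    return np.exp(-0.5 * (x - m)**2) / np.sqrt(2.0 * np.pi)

def likelihood_weighting(x_obs=2.5, n=100_000):
    """Estimate P(A = 1 | X = x_obs): sample A from its prior and weight
    each sample by the likelihood of the observed evidence."""
    a = (rng.random(n) < p_a1).astype(int)
    w = gauss_pdf(x_obs, mu[a])          # weight = P(evidence | parents)
    return np.sum(w * (a == 1)) / np.sum(w)

# Exact posterior: 0.3*N(2.5; 3, 1) / (0.3*N(2.5; 3, 1) + 0.7*N(2.5; 0, 1)) ~ 0.90
print(likelihood_weighting())
```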

  3. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises.

    PubMed

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B; Pereira, Nuno Sousa; Behrman, Jere

    2012-05-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization's Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples' statistical properties.

  4. An importance sampling algorithm for estimating extremes of perpetuity sequences

    NASA Astrophysics Data System (ADS)

    Collamore, Jeffrey F.

    2012-09-01

    In a wide class of problems in insurance and financial mathematics, it is of interest to study the extremal events of a perpetuity sequence. This paper addresses the problem of numerically evaluating these rare event probabilities. Specifically, an importance sampling algorithm is described which is efficient in the sense that it exhibits bounded relative error, and which is optimal in an appropriate asymptotic sense. The main idea of the algorithm is to use a "dual" change of measure, which is applied to an associated Markov chain over a randomly-stopped time interval. The algorithm also makes use of the so-called forward sequences generated by the given stochastic recursion, together with elements of Markov chain theory.

  5. Subrandom methods for multidimensional nonuniform sampling

    NASA Astrophysics Data System (ADS)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.
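    To make the idea concrete: a golden-ratio additive recurrence u_n = frac(n * phi) is deterministic, seed-free, and low-discrepancy, and it can be pushed through the inverse CDF of a weighted Nyquist grid to yield a schedule. The exponential weighting and all sizes below are illustrative assumptions, not the specific subrandom schemes compared in the paper:

```python
import numpy as np

def subrandom_schedule(grid_size=256, n_points=64, decay=0.02):
    """Seed-independent nonuniform sampling schedule: a golden-ratio
    additive recurrence is mapped through the inverse CDF of
    exponentially decaying grid weights (favoring early evolution times)."""
    phi = (np.sqrt(5.0) - 1.0) / 2.0
    u = np.mod(np.arange(1, 4 * n_points) * phi, 1.0)   # subrandom in [0, 1)

    weights = np.exp(-decay * np.arange(grid_size))
    cdf = np.cumsum(weights) / weights.sum()

    idx = np.searchsorted(cdf, u)          # inverse-CDF mapping onto the grid
    schedule = []
    for i in idx:                          # keep the first n_points unique cells
        if i not in schedule:
            schedule.append(i)
        if len(schedule) == n_points:
            break
    return np.sort(schedule)

print(subrandom_schedule()[:10])
```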

  6. The Importance of Introductory Statistics Students Understanding Appropriate Sampling Techniques

    ERIC Educational Resources Information Center

    Menil, Violeta C.

    2005-01-01

    In this paper the author discusses the meaning of sampling, the reasons for sampling, the Central Limit Theorem, and the different techniques of sampling. Practical and relevant examples are given to make the appropriate sampling techniques understandable to students of Introductory Statistics courses. With a thorough knowledge of sampling…

  7. Method and apparatus for data sampling

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion are performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provide for temporary or permanent storage, either serially or in parallel, to a digital storage medium.
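    The digital discrimination step can be illustrated with a simple threshold rule: contiguous runs of digitized samples above a threshold are kept as detected events and integrated, and everything else is discarded. A hypothetical sketch (the patent's actual discrimination logic is not reproduced here):

```python
import numpy as np

def find_events(samples, threshold, baseline=0.0):
    """Digital discrimination of an A/D sample stream: contiguous runs
    above threshold are treated as detected events. Returns a list of
    (start, end, integrated area above baseline) per event."""
    above = samples > threshold
    edges = np.diff(above.astype(int))
    starts = np.where(edges == 1)[0] + 1
    ends = np.where(edges == -1)[0] + 1
    if above[0]:
        starts = np.r_[0, starts]
    if above[-1]:
        ends = np.r_[ends, len(samples)]
    return [(int(s), int(e), float(np.sum(samples[s:e] - baseline)))
            for s, e in zip(starts, ends)]

# Synthetic detector trace: Gaussian noise floor plus two pulses.
rng = np.random.default_rng(7)
trace = rng.normal(0.0, 1.0, 1000)
trace[200:210] += 40.0
trace[600:615] += 25.0
print(find_events(trace, threshold=5.0))
```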

  8. Method and apparatus for data sampling

    DOEpatents

    Odell, D.M.C.

    1994-04-19

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples are described. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion are performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provide for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.

  9. Different methods for volatile sampling in mammals

    PubMed Central

    Kücklich, Marlen; Möller, Manfred; Marcillo, Andrea; Einspanier, Almuth; Weiß, Brigitte M.; Birkemeyer, Claudia; Widdig, Anja

    2017-01-01

    Previous studies showed that olfactory cues are important for mammalian communication. However, many specific compounds that convey information between conspecifics are still unknown. To understand mechanisms and functions of olfactory cues, olfactory signals such as volatile compounds emitted from individuals need to be assessed. Sampling of animals with and without scent glands was typically conducted using cotton swabs rubbed over the skin or fur and analysed by gas chromatography-mass spectrometry (GC-MS). However, this method has various drawbacks, including a high level of contaminations. Thus, we adapted two methods of volatile sampling from other research fields and compared them to sampling with cotton swabs. To do so we assessed the body odor of common marmosets (Callithrix jacchus) using cotton swabs, thermal desorption (TD) tubes and, alternatively, a mobile GC-MS device containing a thermal desorption trap. Overall, TD tubes comprised most compounds (N = 113), with half of those compounds being volatile (N = 52). The mobile GC-MS captured the fewest compounds (N = 35), of which all were volatile. Cotton swabs contained an intermediate number of compounds (N = 55), but very few volatiles (N = 10). Almost all compounds found with the mobile GC-MS were also captured with TD tubes (94%). Hence, we recommend TD tubes for state of the art sampling of body odor of mammals or other vertebrates, particularly for field studies, as they can be easily transported, stored and analysed with high performance instruments in the lab. Nevertheless, cotton swabs capture compounds which still may contribute to the body odor, e.g. after bacterial fermentation, while profiles from mobile GC-MS include only the most abundant volatiles of the body odor. PMID:28841690

  10. Different methods for volatile sampling in mammals.

    PubMed

    Kücklich, Marlen; Möller, Manfred; Marcillo, Andrea; Einspanier, Almuth; Weiß, Brigitte M; Birkemeyer, Claudia; Widdig, Anja

    2017-01-01

    Previous studies showed that olfactory cues are important for mammalian communication. However, many specific compounds that convey information between conspecifics are still unknown. To understand mechanisms and functions of olfactory cues, olfactory signals such as volatile compounds emitted from individuals need to be assessed. Sampling of animals with and without scent glands was typically conducted using cotton swabs rubbed over the skin or fur and analysed by gas chromatography-mass spectrometry (GC-MS). However, this method has various drawbacks, including a high level of contaminations. Thus, we adapted two methods of volatile sampling from other research fields and compared them to sampling with cotton swabs. To do so we assessed the body odor of common marmosets (Callithrix jacchus) using cotton swabs, thermal desorption (TD) tubes and, alternatively, a mobile GC-MS device containing a thermal desorption trap. Overall, TD tubes comprised most compounds (N = 113), with half of those compounds being volatile (N = 52). The mobile GC-MS captured the fewest compounds (N = 35), of which all were volatile. Cotton swabs contained an intermediate number of compounds (N = 55), but very few volatiles (N = 10). Almost all compounds found with the mobile GC-MS were also captured with TD tubes (94%). Hence, we recommend TD tubes for state of the art sampling of body odor of mammals or other vertebrates, particularly for field studies, as they can be easily transported, stored and analysed with high performance instruments in the lab. Nevertheless, cotton swabs capture compounds which still may contribute to the body odor, e.g. after bacterial fermentation, while profiles from mobile GC-MS include only the most abundant volatiles of the body odor.

  11. A Comparison of EPI Sampling, Probability Sampling, and Compact Segment Sampling Methods for Micro and Small Enterprises

    PubMed Central

    Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere

    2011-01-01

    Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004

  12. Mixed Methods Sampling: A Typology with Examples

    ERIC Educational Resources Information Center

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  14. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    SciTech Connect

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above water settler sludge sampling methods to determine if the added cost of underwater sampling for the sole purpose of worker dose reductions is justified. Initial planning for sludge sampling included container, settler and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling of settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation will compare and contrast the present method of above water sampling to the underwater method planned by the Sludge Treatment Project (STP) and determine if settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining doses to workers As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing and personnel retraining.

  15. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags are...

  16. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags are...

  17. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  18. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  19. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  20. Uniform sampling table method and its applications: establishment of a uniform sampling method.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Wang, Wei

    2013-01-01

    A novel uniform sampling method is proposed in this paper. According to the requirements of uniform sampling, we propose the properties that must be met by analyzing the distribution of samples. Based on this, the proposed uniform sampling method is demonstrated and evaluated strictly by mathematical means such as inference. The uniform sampling tables with respect to Cn(t2) and Cn(t3) are established. Furthermore, a one-dimension uniform sampling method and a multidimension method are proposed. The proposed novel uniform sampling method, which is guided by uniform design theory, enjoys the advantages of simplified use and good representativeness of the whole sample.

  1. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling including simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion into the research study and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from the cardiovascular research literature. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
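    Minimal sketches of three of the probability designs named above may help make the distinctions concrete; the frame, strata, and sample sizes are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(8)
population = np.arange(1000)      # element IDs in the sampling frame
strata = population % 4           # e.g., four clinical sites

# Simple random sampling: every element has an equal, independent chance.
srs = rng.choice(population, size=100, replace=False)

# Systematic sampling: random start, then every k-th element.
k = len(population) // 100
start = rng.integers(k)
systematic = population[start::k][:100]

# Stratified sampling: independent simple random samples within each stratum.
stratified = np.concatenate([
    rng.choice(population[strata == s], size=25, replace=False)
    for s in range(4)
])
print(len(srs), len(systematic), len(stratified))
```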

  2. Fluidics platform and method for sample preparation

    DOEpatents

    Benner, Henry W.; Dzenitis, John M.

    2016-06-21

    Provided herein are fluidics platforms and related methods for performing integrated sample collection and solid-phase extraction of a target component of the sample all in one tube. The fluidics platform comprises a pump, particles for solid-phase extraction and a particle-holding means. The method comprises contacting the sample with one or more reagents in a pump, coupling a particle-holding means to the pump and expelling the waste out of the pump while the particle-holding means retains the particles inside the pump. The fluidics platform and methods herein described allow solid-phase extraction without pipetting and centrifugation.

  3. Dynamic Method for Identifying Collected Sample Mass

    NASA Technical Reports Server (NTRS)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
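    Stripped of the full spacecraft dynamics model, the estimation idea reduces to a Gaussian maximum-likelihood (equivalently least-squares) fit of total mass to a known thrust profile and measured acceleration, with the collected mass recovered by subtracting the known dry mass. All numbers in this sketch are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(9)

dry_mass = 750.0                 # kg, spacecraft without the sample
true_sample = 0.050              # kg, the quantity to be recovered
thrust = 20.0 * np.abs(np.sin(np.linspace(0.0, 6.0, 500)))            # N, known profile
accel = thrust / (dry_mass + true_sample) + rng.normal(0, 1e-6, 500)  # noisy sensor

def estimate_sample_mass(F, a, m_dry):
    """Least-squares total-mass estimate from F = M * a (the Gaussian
    maximum-likelihood solution), minus the known dry mass."""
    M_hat = np.sum(F * a) / np.sum(a * a)
    return M_hat - m_dry

print(estimate_sample_mass(thrust, accel, dry_mass))   # ~ 0.05 kg
```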

  4. Innovative methods for inorganic sample preparation

    SciTech Connect

    Essling, A.M.; Huff, E.A.; Graczyk, D.G.

    1992-04-01

    Procedures and guidelines are given for the dissolution of a variety of selected materials using fusion, microwave, and Parr bomb techniques. These materials include germanium glass, corium-concrete mixtures, and zeolites. Emphasis is placed on sample-preparation approaches that produce a single master solution suitable for complete multielement characterization of the sample. In addition, data are presented on the soil microwave digestion method approved by the Environmental Protection Agency (EPA). Advantages and disadvantages of each sample-preparation technique are summarized.

  5. Identification methods for Legionella from environmental samples.

    PubMed

    Bartie, C; Venter, S N; Nel, L H

    2003-03-01

    Laboratories responsible for Legionella diagnostics around the world use a number of different culturing methods of non-equivalent sensitivities and specificities, to detect Legionella species in environmental samples. Specific countries usually standardize and use one approved method. For example, laboratories in Australia use the Australian Standard (AS) method and those in Europe, the International Standard method (ISO). However, no standard culturing methods have been established in South Africa to date. As a result, there is uncertainty about the true prevalence and most common species of Legionella present in the South African environment. In an attempt to provide guidelines for the development of a standard method specific for South Africa, the ISO, AS and a most probable number method were evaluated and compared. In addition, the effect of sample re-incubation with autochthonous amoebae on culture outcome was studied. Samples were collected from four environments, representing industrial water, mine water and biofilm. The samples were concentrated by membrane filtration and divided into three portions and cultured without pretreatment, after acid treatment and after heat treatment, on four culture media namely alphaBCYE, BMPA, MWY and GVPC agar. A selective approach, incorporating heat treatment, but not acid treatment, combined with culture on alphaBCYE and GVPC or MWY, was most appropriate for legionellae detection in the samples evaluated. Legionellae were cultured from 82% of the environmental samples we evaluated. In 54% of the samples tested, legionellae were present in numbers equal to or exceeding 10(2) colony-forming units per milliliter (cfu/ml). Legionella pneumophila serogroups (SGs) 1-14 were the most prevalent species and were present as single, or a combination of two or more SGs in a number of samples tested. Re-incubation of sample concentrates with autochthonous amoebae improved the culturability of legionellae in 50% of cultures on alphaBCYE.

  6. New methods for sampling sparse populations

    Treesearch

    Anna Ringvall

    2007-01-01

    To improve surveys of sparse objects, methods that use auxiliary information have been suggested. Guided transect sampling uses prior information, e.g., from aerial photographs, for the layout of survey strips. Instead of being laid out straight, the strips will wind between potentially more interesting areas. 3P sampling (probability proportional to prediction) uses...

  7. The rank product method with two samples.

    PubMed

    Koziol, James A

    2010-11-05

    Breitling et al. (2004) introduced a statistical technique, the rank product method, for detecting differentially regulated genes in replicated microarray experiments. The technique has achieved widespread acceptance and is now used more broadly, in such diverse fields as RNAi analysis, proteomics, and machine learning. In this note, we extend the rank product method to the two sample setting, provide distribution theory attending the rank product method in this setting, and give numerical details for implementing the method.
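    A compact sketch of one natural two-sample rank product, assuming log-scale expression data: every pairwise comparison of one replicate from each group is ranked by fold change, and each gene's statistic is the geometric mean of its ranks across comparisons (significance would be assessed by permutation, omitted here):

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(10)

def rank_product(group1, group2):
    """Two-sample rank product: rank genes by fold change within every
    pairwise (replicate from group2 vs. replicate from group1)
    comparison; return the geometric mean of ranks per gene."""
    log_rp = np.zeros(group1.shape[0])
    n_comp = 0
    for i in range(group1.shape[1]):
        for j in range(group2.shape[1]):
            fold = group2[:, j] - group1[:, i]   # log-scale expression
            log_rp += np.log(rankdata(-fold))    # rank 1 = most up-regulated
            n_comp += 1
    return np.exp(log_rp / n_comp)

# 100 genes, 3 replicates per condition; gene 0 is truly up-regulated.
g1 = rng.normal(0.0, 1.0, (100, 3))
g2 = rng.normal(0.0, 1.0, (100, 3))
g2[0] += 3.0
rp = rank_product(g1, g2)
print("smallest rank products:", np.argsort(rp)[:5])
```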

  8. 40 CFR 80.1349 - Alternative sampling and testing requirements for importers who import gasoline into the United...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirements for importers who import gasoline into the United States by truck. 80.1349 Section 80.1349... FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1349 Alternative sampling and testing requirements for importers who import gasoline into the United States by...

  9. Sampling plant diversity and rarity at landscape scales: importance of sampling time in species detectability.

    PubMed

    Zhang, Jian; Nielsen, Scott E; Grainger, Tess N; Kohler, Monica; Chipchar, Tim; Farr, Daniel R

    2014-01-01

    Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed on how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter hectare time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2 year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. Relative to rare species, time-unlimited surveys had ∼ 65% higher rare plant detections post-20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted among observers when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys.

  10. Sampling Plant Diversity and Rarity at Landscape Scales: Importance of Sampling Time in Species Detectability

    PubMed Central

    Zhang, Jian; Nielsen, Scott E.; Grainger, Tess N.; Kohler, Monica; Chipchar, Tim; Farr, Daniel R.

    2014-01-01

    Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed on how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter hectare time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2 year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. Relative to rare species, time-unlimited surveys had ∼65% higher rare plant detections post-20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted among observers when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys. PMID:24740179

  11. Method and apparatus for sampling atmospheric mercury

    DOEpatents

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  12. Subrandom methods for multidimensional nonuniform sampling.

    PubMed

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Selection of species and sampling areas: The importance of inference

    Treesearch

    Paul Stephen Corn

    2009-01-01

    Inductive inference, the process of drawing general conclusions from specific observations, is fundamental to the scientific method. Platt (1964) termed conclusions obtained through rigorous application of the scientific method as "strong inference" and noted the following basic steps: generating alternative hypotheses; devising experiments, the...

  14. Sample Bytes to Protect Important Data from Unintentional Transmission in Advanced Embedded Device

    NASA Astrophysics Data System (ADS)

    Chung, Bo-Heung; Kim, Jung-Nye

    Illegal or unintentional transmission of important data is a major security issue in embedded and mobile devices. Given restricted resources such as small memory and low battery capacity, a simple and efficient method is needed to prevent such activity with little effort. We therefore discuss a protection technique that takes these constraints into account. In our method, sample bytes are extracted from an important file and then used to prohibit illegal transfer or modification of that file. To keep an attacker from easily predicting which bytes are sampled, the sample bytes are drawn at random locations distributed evenly across the whole extent of the file. To avoid a large increase in the number of sample bytes, the size of the candidate sampling area is chosen carefully after analyzing the length and number of files. Considering the computational overhead of calculating the number and positions of the sample bytes, we propose three types of sampling methods. We present an evaluation of these methods and recommend an appropriate sampling approach for embedded devices with low computational power. With this technique, data leakage can be detected and prevented effectively and the device can be managed securely with low overhead.
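    One plausible reading of the selection scheme, sketched below: the file is split into equal-width regions, one byte is sampled at a random offset inside each region, giving even coverage with unpredictable positions, and the resulting (offset, byte) fingerprint is verified before a transfer is allowed. The function names and region count are hypothetical:

```python
import os
import random

def sample_bytes(path, n_regions=64):
    """Fingerprint a file with one byte drawn at a random offset inside
    each of n_regions equal-width regions: evenly distributed coverage,
    but offsets an attacker cannot predict."""
    size = os.path.getsize(path)
    n_regions = min(n_regions, size)      # degenerate case: very small files
    region = size // n_regions if n_regions else 0
    fingerprint = []
    with open(path, "rb") as f:
        for i in range(n_regions):
            off = i * region + random.randrange(region)
            f.seek(off)
            fingerprint.append((off, f.read(1)))
    return fingerprint

def matches(path, fingerprint):
    """Re-read the sampled offsets; any mismatch means the file was
    modified or a different file is being passed off under its name."""
    with open(path, "rb") as f:
        for off, byte in fingerprint:
            f.seek(off)
            if f.read(1) != byte:
                return False
    return True
```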

  15. Comparison of sampling methods for urine cultures.

    PubMed

    Unlü, Hayriye; Sardan, Yeşim Cetinkaya; Ulker, Saadet

    2007-01-01

    To compare efficacy and cost of conventional and alternative sampling methods for urine cultures. An experimental study with two replications was carried out in a 900-bed university hospital in Ankara, Turkey. The sample comprised 160 hospitalized female patients, who were asked to give urine specimens between September 10, 2000 and September 1, 2001. They were patients on urology and obstetrics and gynaecology wards. The authors informed the patients about the study first and then obtained two samples from each patient under their observation. The number of specimens was 320. Statistical methods were descriptive. The rates of contamination and significant growth, respectively, were 4.4% and 7.5% for the conventional method and 5.6% and 10% for the alternative method. The cost per culture was 2.588.257 TL (2.10 USD) for the conventional method and 57.021 TL (0.05 USD) for the alternative method. The cost difference was statistically significant. The two methods yielded similar results but the alternative method was less expensive.

  16. 40 CFR 80.1630 - Sampling and testing requirements for refiners, gasoline importers and producers and importers of...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... refiners, gasoline importers and producers and importers of certified ethanol denaturant. 80.1630 Section... refiners, gasoline importers and producers and importers of certified ethanol denaturant. (a) Sample and test each batch of gasoline and certified ethanol denaturant. (1) Refiners and importers shall...

  17. An Importance Sampling EM Algorithm for Latent Regression Models

    ERIC Educational Resources Information Center

    von Davier, Matthias; Sinharay, Sandip

    2007-01-01

    Reporting methods used in large-scale assessments such as the National Assessment of Educational Progress (NAEP) rely on latent regression models. To fit the latent regression model using the maximum likelihood estimation technique, multivariate integrals must be evaluated. In the computer program MGROUP used by the Educational Testing Service for…

  19. Sparse Sampling Methods In Multidimensional NMR

    PubMed Central

    Mobli, Mehdi; Maciejewski, Mark W.; Schuyler, Adam D.; Stern, Alan S.; Hoch, Jeffrey C.

    2014-01-01

    Although the discrete Fourier transform played an enabling role in the development of modern NMR spectroscopy, it suffers from a well-known difficulty in providing high-resolution spectra from short data records. In multidimensional NMR experiments, so-called indirect time dimensions are sampled parametrically, with each instance of evolution times along the indirect dimensions sampled via separate one-dimensional experiments. The time required to conduct multidimensional experiments is directly proportional to the number of indirect evolution times sampled. Despite remarkable advances in resolution with increasing magnetic field strength, multiple dimensions remain essential for resolving individual resonances in NMR spectra of biological macromolecules. Conventional Fourier-based methods of spectrum analysis limit the resolution that can be practically achieved in the indirect dimensions. Nonuniform or sparse data collection strategies, together with suitable non-Fourier methods of spectrum analysis, enable high-resolution multidimensional spectra to be obtained. Although some of these approaches were first employed in NMR more than two decades ago, it is only relatively recently that they have been widely adopted. Here we describe the current practice of sparse sampling methods and prospects for further development of the approach to improve resolution and sensitivity and shorten experiment time in multidimensional NMR. While sparse sampling is particularly promising for multidimensional NMR, the basic principles could apply to other forms of multidimensional spectroscopy. PMID:22481242
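
    As a concrete illustration of nonuniform sampling, the sketch below draws a random subset of indirect-dimension evolution increments, biased toward early times where the decaying signal envelope is strongest. The exponential weighting and all names are assumptions chosen for illustration; published NUS schedule generators (e.g., Poisson-gap sampling) are more sophisticated.

```python
import math
import random

def sparse_schedule(n_increments, fraction, decay=2.0, seed=0):
    """Return a sorted subset of indirect-dimension evolution-time indices
    covering `fraction` of the full grid, drawn with exponentially
    decaying weights so early increments (strongest signal) are favored."""
    rng = random.Random(seed)
    k = max(1, round(fraction * n_increments))
    weights = [math.exp(-decay * i / n_increments) for i in range(n_increments)]
    chosen = set()
    while len(chosen) < k:
        chosen.add(rng.choices(range(n_increments), weights=weights, k=1)[0])
    return sorted(chosen)

# Keep 25% of 256 increments; the omitted points are what a non-Fourier
# reconstruction method must compensate for.
schedule = sparse_schedule(256, 0.25)
```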

  20. Turbidity threshold sampling: Methods and instrumentation

    Treesearch

    Rand Eads; Jack Lewis

    2001-01-01

    Traditional methods for determining the frequency of suspended sediment sample collection often rely on measurements, such as water discharge, that are not well correlated to sediment concentration. Stream power is generally not a good predictor of sediment concentration for rivers that transport the bulk of their load as fines, due to the highly variable routing of...

  1. Sampling methods for terrestrial amphibians and reptiles.

    Treesearch

    Paul Stephen Corn; R. Bruce. Bury

    1990-01-01

    Methods described for sampling amphibians and reptiles in Douglas-fir forests in the Pacific Northwest include pitfall trapping, time-constrained collecting, and surveys of coarse woody debris. The herpetofauna of this region differ in breeding and nonbreeding habitats and vagility, so that no single technique is sufficient for a community study. A combination of...

  2. Sampling technique is important for optimal isolation of pharyngeal gonorrhoea.

    PubMed

    Mitchell, M; Rane, V; Fairley, C K; Whiley, D M; Bradshaw, C S; Bissessor, M; Chen, M Y

    2013-11-01

    Culture is insensitive for the detection of pharyngeal gonorrhoea but isolation is pivotal to antimicrobial resistance surveillance. The aim of this study was to ascertain whether recommendations provided to clinicians (doctors and nurses) on pharyngeal swabbing technique could improve gonorrhoea detection rates and to determine which aspects of swabbing technique are important for optimal isolation. This study was undertaken at the Melbourne Sexual Health Centre, Australia. Detection rates among clinicians for pharyngeal gonorrhoea were compared before (June 2006-May 2009) and after (June 2009-June 2012) recommendations on swabbing technique were provided. Associations between detection rates and reported swabbing technique obtained via a clinician questionnaire were examined. The overall yield from testing before and after provision of the recommendations among 28 clinicians was 1.6% (134/8586) and 1.8% (264/15,046) respectively (p=0.17). Significantly higher detection rates were seen following the recommendations among clinicians who reported a change in their swabbing technique in response to the recommendations (2.1% vs. 1.5%; p=0.004), swabbing a larger surface area (2.0% vs. 1.5%; p=0.02), applying more swab pressure (2.5% vs. 1.5%; p<0.001) and a change in the anatomical sites they swabbed (2.2% vs. 1.5%; p=0.002). The predominant change in sites swabbed was an increase in swabbing of the oropharynx: from a median of 0% to 80% of the time. More thorough swabbing improves the isolation of pharyngeal gonorrhoea using culture. Clinicians should receive training to ensure swabbing is performed with sufficient pressure and that it covers an adequate area that includes the oropharynx.

  3. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    SciTech Connect

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages, and problems easily get out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and the gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results are promising.
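
    The core importance sampling step, estimating an expectation by drawing from a distribution that concentrates on the influential scenarios and reweighting by the likelihood ratio, can be shown generically. The sketch below is not Infanger's algorithm, only the estimator it relies on; the toy tail-probability example and all names are illustrative.

```python
import math
import random

def importance_estimate(f, sample_q, pdf_p, pdf_q, n, seed=0):
    """Estimate E_p[f(X)] by drawing X ~ q and averaging the weighted
    values f(x) * p(x) / q(x). Choosing q to concentrate on the scenarios
    where |f| * p is large is what reduces the variance."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = sample_q(rng)
        total += f(x) * pdf_p(x) / pdf_q(x)
    return total / n

# Toy usage: estimate the rare tail P(X > 3), X ~ N(0, 1), by sampling
# from N(3, 1) instead; plain Monte Carlo would need far more samples.
phi = lambda x, m: math.exp(-0.5 * (x - m) ** 2) / math.sqrt(2 * math.pi)
est = importance_estimate(
    f=lambda x: 1.0 if x > 3 else 0.0,
    sample_q=lambda rng: rng.gauss(3.0, 1.0),
    pdf_p=lambda x: phi(x, 0.0),
    pdf_q=lambda x: phi(x, 3.0),
    n=100_000,
)
print(est)  # close to the exact value of about 1.35e-3
```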

  4. Actinide recovery method -- Large soil samples

    SciTech Connect

    Maxwell, S.L. III

    2000-04-25

    There is a need to measure actinides in environmental samples with ever lower detection limits, requiring larger sample sizes. Such analyses are adversely affected by sample-matrix interferences, which make analyzing soil samples above five grams very difficult. A new Actinide Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides from large soil samples. Diphonix Resin (Eichrom Industries), a 1994 R&D 100 winner, is used to preconcentrate the actinides, which bind powerfully to the resin's diphosphonic acid groups. A rapid microwave-digestion technique is used to remove the actinides from the Diphonix Resin, effectively eliminating interfering components of the soil matrix. The microwave-digestion technique is more effective and less tedious than catalyzed hydrogen peroxide digestions of the resin or digestion of diphosphonic stripping agents such as HEDPA. After resin digestion, the actinides are recovered in a small volume of nitric acid which can be loaded onto small extraction chromatography columns, such as TEVA Resin, U-TEVA Resin or TRU Resin (Eichrom Industries). Small, selective extraction columns do not generate large volumes of liquid waste and provide consistent tracer recoveries after soil matrix elimination.

  5. Use of enveloping distribution sampling to evaluate important characteristics of biomolecular force fields.

    PubMed

    Huang, Wei; Lin, Zhixiong; van Gunsteren, Wilfred F

    2014-06-19

    The predictive power of biomolecular simulation critically depends on the quality of the force field or molecular model used and on the extent of conformational sampling that can be achieved. Both issues are addressed. First, it is shown that widely used force fields for simulation of proteins in aqueous solution appear to have rather different propensities to stabilize or destabilize α-, π-, and 3(10)-helical structures, which is an important feature of a biomolecular force field due to the omnipresence of such secondary structure in proteins. Second, the relative stability of secondary structure elements in proteins can only be computationally determined through so-called free-energy calculations, the accuracy of which critically depends on the extent of configurational sampling. It is shown that the method of enveloping distribution sampling is a very efficient way to extensively sample different parts of configurational space.

  6. Separation methods for taurine analysis in biological samples.

    PubMed

    Mou, Shifen; Ding, Xiaojing; Liu, Yongjian

    2002-12-05

    Taurine plays an important role in a variety of physiological functions, pharmacological actions and pathological conditions. Many methods for taurine analysis, therefore, have been reported to monitor its levels in biological samples. This review discusses the following techniques: sample preparation; separation and determination methods including high-performance liquid chromatography, gas chromatography, ion chromatography, capillary electrophoresis and hyphenation procedures. It covers articles published between 1990 and 2001.

  7. Constrained sampling method for analytic continuation

    NASA Astrophysics Data System (ADS)

    Sandvik, Anders W.

    2016-12-01

    A method for analytic continuation of imaginary-time correlation functions (here obtained in quantum Monte Carlo simulations) to real-frequency spectral functions is proposed. Stochastically sampling a spectrum parametrized by a large number of δ functions, treated as a statistical-mechanics problem, it avoids distortions caused by (as demonstrated here) configurational entropy in previous sampling methods. The key development is the suppression of entropy by constraining the spectral weight to within identifiable optimal bounds and imposing a set number of peaks. As a test case, the dynamic structure factor of the S = 1/2 Heisenberg chain is computed. Very good agreement is found with Bethe ansatz results in the ground state (including a sharp edge) and with exact diagonalization of small systems at elevated temperatures.

  8. Actinide Recovery Method for Large Soil Samples

    SciTech Connect

    Maxwell, S.L. III; Nichols, S.

    1998-11-01

    A new Actinide Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides in very large soil samples. Diphonix Resin® is used to eliminate soil matrix interferences and preconcentrate actinides after soil leaching or soil fusion. A rapid microwave digestion technique is used to remove the actinides from the Diphonix Resin®. After the resin digestion, the actinides are recovered in a small volume of nitric acid which can be easily loaded onto small extraction-chromatography columns, such as TEVA Resin®, U-TEVA Resin® or TRU Resin® (Eichrom Industries). This method enables the application of small, selective extraction columns to recover actinides from very large soil samples with high selectivity, consistent tracer recoveries and minimal liquid waste.

  9. Constrained sampling method for analytic continuation.

    PubMed

    Sandvik, Anders W

    2016-12-01

    A method for analytic continuation of imaginary-time correlation functions (here obtained in quantum Monte Carlo simulations) to real-frequency spectral functions is proposed. Stochastically sampling a spectrum parametrized by a large number of δ functions, treated as a statistical-mechanics problem, it avoids distortions caused by (as demonstrated here) configurational entropy in previous sampling methods. The key development is the suppression of entropy by constraining the spectral weight to within identifiable optimal bounds and imposing a set number of peaks. As a test case, the dynamic structure factor of the S=1/2 Heisenberg chain is computed. Very good agreement is found with Bethe ansatz results in the ground state (including a sharp edge) and with exact diagonalization of small systems at elevated temperatures.
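
    A minimal sketch of the sampling core is given below, under strong assumptions: kernel exp(−τω) (zero temperature, ω ≥ 0), equal-amplitude δ functions, and a fixed frequency window [w_min, w_max] standing in for the paper's optimized bounds; the constrained peak count is not reproduced. All names and parameter values are illustrative, not the published algorithm.

```python
import math
import random

def sample_spectrum(taus, g_data, sigma, w_min, w_max,
                    n_delta=100, theta=1.0, sweeps=5000, step=0.1, seed=0):
    """Metropolis sampling of n_delta equal-weight delta functions whose
    frequencies are constrained to the window [w_min, w_max], with
    chi^2 / (2 * theta) as the effective energy. Averaging the returned
    configurations over many runs approximates the sampled spectrum."""
    rng = random.Random(seed)
    omegas = [rng.uniform(w_min, w_max) for _ in range(n_delta)]

    def chi2(ws):
        total = 0.0
        for t, g, s in zip(taus, g_data, sigma):
            model = sum(math.exp(-t * w) for w in ws) / n_delta
            total += ((model - g) / s) ** 2
        return total

    energy = chi2(omegas)
    for _ in range(sweeps):
        k = rng.randrange(n_delta)
        old = omegas[k]
        omegas[k] = min(w_max, max(w_min, old + rng.uniform(-step, step)))
        new_energy = chi2(omegas)
        delta = new_energy - energy
        if delta <= 0 or rng.random() < math.exp(-delta / (2 * theta)):
            energy = new_energy   # accept the constrained move
        else:
            omegas[k] = old       # reject and restore
    return sorted(omegas)
```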

  10. Methods for Sampling of Airborne Viruses

    PubMed Central

    Verreault, Daniel; Moineau, Sylvain; Duchaine, Caroline

    2008-01-01

    Summary: To better understand the underlying mechanisms of aerovirology, accurate sampling of airborne viruses is fundamental. The sampling instruments commonly used in aerobiology have also been used to recover viruses suspended in the air. We reviewed over 100 papers to evaluate the methods currently used for viral aerosol sampling. Differentiating infections caused by direct contact from those caused by airborne dissemination can be a very demanding task given the wide variety of sources of viral aerosols. While epidemiological data can help to determine the source of the contamination, direct data obtained from air samples can provide very useful information for risk assessment purposes. Many types of samplers have been used over the years, including liquid impingers, solid impactors, filters, electrostatic precipitators, and many others. The efficiencies of these samplers depend on a variety of environmental and methodological factors that can affect the integrity of the virus structure. The aerodynamic size distribution of the aerosol also has a direct effect on sampler efficiency. Viral aerosols can be studied under controlled laboratory conditions, using biological or nonbiological tracers and surrogate viruses, which are also discussed in this review. Lastly, general recommendations are made regarding future studies on the sampling of airborne viruses. PMID:18772283

  11. SOIL AND SEDIMENT SAMPLING METHODS

    EPA Pesticide Factsheets

    The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout the United States. Inadequate site characterization and a lack of knowledge of surface and subsurface contaminant distributions hinder EPA's ability to make the best decisions on remediation options and to conduct the most effective cleanup efforts. To assist OSWER in characterizing Superfund, RCRA, LUST, oil spill, and brownfield sites more accurately, precisely, and efficiently, and to improve its risk-based decision-making capabilities, NERL is conducting research on improving soil and sediment sampling techniques and on the sampling and handling of volatile organic compound (VOC) contaminated soils, among the many research programs and tasks being performed at ESD-LV. Under this task, improved sampling approaches and devices will be developed for characterizing the concentration of VOCs in soils. Current approaches and devices can lose up to 99% of the VOCs present in the sample due to inherent weaknesses in the devices and improper or inadequate collection techniques. This error generally causes decision makers to markedly underestimate soil VOC concentrations and, therefore, to greatly underestimate the ecological

  12. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz [Livermore, CA; Langlois, Richard G [Livermore, CA; Venkateswaran, Kodumudi S [Round Rock, TX

    2011-07-05

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional TaqMan® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM™, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA™, on the 5' end.

  13. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2006-08-01

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional TaqMan® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM™, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA™, on the 5' end.

  14. Evaluation of Common Methods for Sampling Invertebrate Pollinator Assemblages: Net Sampling Out-Perform Pan Traps

    PubMed Central

    Popic, Tony J.; Davila, Yvonne C.; Wardle, Glenda M.

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service. PMID:23799127

  15. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    PubMed

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  16. Well purge and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.; Gustafson, Gregg S.

    1995-01-01

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly with a packer, pump and exhaust, that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. The packer is positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.

  17. Well purge and sample apparatus and method

    DOEpatents

    Schalla, R.; Smith, R.M.; Hall, S.H.; Smart, J.E.; Gustafson, G.S.

    1995-10-24

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly with a packer, pump and exhaust, that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. The packer is positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion. 8 figs.

  18. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  19. Estimating cross-validatory predictive p-values with integrated importance sampling for disease mapping models.

    PubMed

    Li, Longhai; Feng, Cindy X; Qiu, Shi

    2017-06-30

    An important statistical task in disease mapping problems is to identify divergent regions with unusually high or low risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is the gold standard for estimating predictive p-values that can flag such divergent regions. However, actual LOOCV is time-consuming because one needs to rerun a Markov chain Monte Carlo analysis for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called integrated importance sampling (iIS), for estimating LOOCV predictive p-values with only Markov chain samples drawn from the posterior based on the full data set. The key step in iIS is that we integrate away the latent variables associated with the test observation with respect to their conditional distribution, without reference to the actual observation. By following the general theory for importance sampling, the formula used by iIS can be proved to be equivalent to the LOOCV predictive p-value. We compare iIS with three other existing methods in the literature on two disease mapping datasets. Our empirical results show that the predictive p-values estimated with iIS are almost identical to those estimated with actual LOOCV and outperform those given by the three existing methods, namely, posterior predictive checking, ordinary importance sampling, and the ghosting method of Marshall and Spiegelhalter (2003). Copyright © 2017 John Wiley & Sons, Ltd.
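
    For orientation, the ordinary importance sampling baseline that the paper improves on can be written compactly: posterior draws from the full-data fit are reweighted by w_s = 1/p(y_i | θ_s) to approximate the leave-one-out posterior. The sketch below shows only that baseline, not iIS itself, which additionally integrates out the latent variables attached to the held-out observation; all function names are illustrative.

```python
def is_loocv_pvalue(y_i, draws, likelihood, tail_prob):
    """Ordinary importance-sampling estimate of the LOOCV predictive
    p-value P(y_rep <= y_i | data without observation i).

    draws      : posterior samples theta_s from the full-data fit
    likelihood : p(y_i | theta); weights w = 1 / likelihood re-target the
                 full posterior toward the leave-one-out posterior, but
                 can be very unstable when observation i is influential,
                 the failure mode that motivates iIS
    tail_prob  : P(y_rep <= y_i | theta) under the sampling model
    """
    num = den = 0.0
    for theta in draws:
        w = 1.0 / likelihood(y_i, theta)
        num += w * tail_prob(y_i, theta)
        den += w
    return num / den
```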

  20. Field evaluation of a VOST sampling method

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Fuerst, R.G.; McGaughey, J.F.; Bursey, J.T.; Merrill, R.G.

    1994-12-31

    The VOST (SW-846 Method 0030) specifies the use of Tenax® and a particular petroleum-based charcoal (SKC Lot 104, or its equivalent) that is no longer commercially available. In field evaluation studies of VOST methodology, a replacement petroleum-based charcoal has been used: candidate replacement sorbents for charcoal were studied, and Anasorb® 747, a carbon-based sorbent, was selected for field testing. The sampling train was modified to use only Anasorb® in the back tube and Tenax® in the two front tubes, to avoid analytical difficulties associated with the analysis of the sequential-bed back tube used in the standard VOST train. The standard (SW-846 Method 0030) and the modified VOST methods were evaluated at a chemical manufacturing facility using a quadruple probe system with quadruple trains. In this field test, known concentrations of the halogenated volatile organic compounds listed in the Clean Air Act Amendments of 1990, Title 3, were introduced into the VOST train and the modified VOST train, using the same certified gas cylinder as a source of test compounds. Statistical tests of the comparability of methods were performed on a compound-by-compound basis. For most compounds, the VOST and modified VOST methods were found to be statistically equivalent.

  1. A laser microdissection-based workflow for FFPE tissue microproteomics: Important considerations for small sample processing.

    PubMed

    Longuespée, Rémi; Alberts, Deborah; Pottier, Charles; Smargiasso, Nicolas; Mazzucchelli, Gabriel; Baiwir, Dominique; Kriegsmann, Mark; Herfs, Michael; Kriegsmann, Jörg; Delvenne, Philippe; De Pauw, Edwin

    2016-07-15

    Proteomic methods are today widely applied to formalin-fixed paraffin-embedded (FFPE) tissue samples for several applications in research, especially in molecular pathology. To date, there is an unmet need for the analysis of small tissue samples, such as early cancerous lesions. Indeed, no method has yet been proposed for the reproducible processing of small FFPE tissue samples to allow biomarker discovery. In this work, we tested several procedures to process laser-microdissected tissue pieces bearing fewer than 3000 cells. Combined with appropriate settings for liquid chromatography-tandem mass spectrometry (LC-MS/MS) analysis, a citric acid antigen retrieval (CAAR)-based procedure was established, allowing identification of more than 1400 proteins from a single microdissected breast cancer tissue biopsy. This work demonstrates important considerations concerning the handling and processing of laser-microdissected tissue samples of extremely limited size, in the process opening new perspectives in molecular pathology. A proof of concept of the proposed method for biomarker discovery, with respect to these specific handling considerations, is illustrated using the differential proteomic analysis of invasive breast carcinoma of no special type and invasive lobular triple-negative breast cancer tissues. This work will be of utmost importance for early biomarker discovery or in support of matrix-assisted laser desorption/ionization (MALDI) imaging for microproteomics from small regions of interest. Copyright © 2016. Published by Elsevier Inc.

  2. A method for sampling waste corn

    USGS Publications Warehouse

    Frederick, R.B.; Klaas, E.E.; Baldassarre, G.A.; Reinecke, K.J.

    1984-01-01

    Corn has become one of the most important wildlife foods in the United States. It is eaten by a wide variety of animals, including white-tailed deer (Odocoileus virginianus), raccoon (Procyon lotor), ring-necked pheasant (Phasianus colchicus), wild turkey (Meleagris gallopavo), and many species of aquatic birds. Damage to unharvested crops has been documented, but many birds and mammals eat waste grain after harvest and do not conflict with agriculture. A good method for measuring waste-corn availability is essential to studies concerning food density and the food and feeding habits of field-feeding wildlife. Previous methods were developed primarily for approximating losses due to harvest machinery. In this paper, a method is described for estimating the amount of waste corn potentially available to wildlife. Detection of temporal changes in food availability and differences caused by agricultural operations (e.g., recently harvested stubble fields vs. plowed fields) are discussed.

  3. Standard methods for sampling North American freshwater fishes

    USGS Publications Warehouse

    Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.

    2009-01-01

    This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing cold and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.

  4. Methods for sample size determination in cluster randomized trials

    PubMed Central

    Rutterford, Clare; Copas, Andrew; Eldridge, Sandra

    2015-01-01

    Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
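
    The simplest calculation mentioned above, inflating an individually randomized sample size by a design effect, is easy to make concrete. The sketch assumes equal cluster sizes and a two-arm parallel design, with n_individual interpreted as the total across both arms; the function name and these interpretations are illustrative choices, not the paper's notation.

```python
import math

def cluster_trial_size(n_individual, mean_cluster_size, icc):
    """Inflate a two-arm, individually randomized total sample size by the
    design effect DE = 1 + (m - 1) * ICC for (equal) clusters of size m.
    Returns the inflated total, the design effect, and clusters per arm."""
    design_effect = 1.0 + (mean_cluster_size - 1.0) * icc
    n_total = n_individual * design_effect
    clusters_per_arm = math.ceil(n_total / (2 * mean_cluster_size))
    return n_total, design_effect, clusters_per_arm

# 400 participants under individual randomization, clusters of 20,
# ICC = 0.05 -> DE = 1.95, 780 participants, 20 clusters per arm.
print(cluster_trial_size(400, 20, 0.05))
```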

  5. Performance of sampling methods to estimate log characteristics for wildlife.

    Treesearch

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton

    2004-01-01

    Accurate estimation of the characteristics of log resources, or coarse woody debris (CWD), is critical to effective management of wildlife and other forest resources. Despite the importance of logs as wildlife habitat, methods for sampling logs have traditionally focused on silvicultural and fire applications. These applications have emphasized estimates of log volume...

  6. Sampling and sample preparation methods for determining concentrations of mycotoxins in foods and feeds.

    PubMed

    2012-01-01

    Sample variation is often the largest error in determining concentrations of mycotoxins in food commodities. The worldwide safety evaluation of mycotoxins requires sampling plans that give acceptably accurate values for the levels of contamination in specific batches or lots of a commodity. Mycotoxin concentrations show a skewed or uneven distribution in foods and feeds, especially in whole kernels (or nuts), so it is extremely difficult to collect a sample that accurately represents the mean batch concentration. Sample variance studies and sampling plans have been published for select mycotoxins such as aflatoxin, fumonisin, and deoxynivalenol, emphasizing the importance of sample selection, sample size, and the number of incremental samples. For meaningful data to be generated from surveillance studies, representative samples should be collected from carefully selected populations (batches or lots) of food that, in turn, should be representative of clearly defined locations (e.g. a country, a region within a country). Although sampling variability is unavoidable, it is essential that the precision of the sampling plan be clearly defined and be considered acceptable by those responsible for interpreting and reporting the surveillance data. The factors influencing variability are detailed here, with reference to both major mycotoxins and major commodities. Sampling of large bag stacks, bulk shipments, and domestic supplies are all discussed. Sampling plans currently accepted in international trade are outlined. Acceptance sampling plans and the variabilities that affect operating characteristic curves of such plans are also detailed. The constraints and issues related to the sampling of harvested crops within subsistence farming areas are also discussed in this chapter, as are the essential rules of sample labelling and storage. The chapter concludes with a short section on sample preparation methods.
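
    The operating characteristic curves mentioned above describe how an acceptance sampling plan trades off producer and consumer risk. A minimal binomial sketch follows; it assumes independent, identically contaminated sample increments, which understates the variability caused by the skewed, clustered distribution of mycotoxins that the chapter stresses, so it is a first approximation only.

```python
from math import comb

def acceptance_probability(n, c, p):
    """P(accept lot) for a single sampling plan: inspect n increments and
    accept if at most c exceed the limit, with p the true proportion of
    over-limit increments (simple binomial model)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

# One operating characteristic (OC) curve, for the plan n=10, c=0:
for p in (0.01, 0.05, 0.10, 0.20):
    print(f"true defect rate {p:.2f} -> "
          f"P(accept) = {acceptance_probability(10, 0, p):.3f}")
```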

  7. System and Method for Isolation of Samples

    NASA Technical Reports Server (NTRS)

    Zhang, Ye (Inventor); Wu, Honglu (Inventor)

    2014-01-01

    Systems and methods for isolating samples are provided. The system comprises a first membrane and a second membrane disposed within an enclosure. First and second reservoirs can also be disposed within the enclosure and adapted to contain one or more reagents therein. A first valve can be disposed within the enclosure and in fluid communication with the first reservoir, the second reservoir, or both. The first valve can also be in fluid communication with the first or second membranes or both. The first valve can be adapted to selectively regulate the flow of the reagents from the first reservoir, through at least one of the first and second membranes, and into the second reservoir.

  8. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    PubMed

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, and amino acids are important analytes in the biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for the rapid analysis of these polar small molecules in complex matrices. Typical materials used in sample preparation, including silica, polymers, carbon, boric acid, and others, are introduced in this paper. The applications and development of analytical methods for polar small molecules, such as reversed-phase liquid chromatography and hydrophilic interaction chromatography, are also reviewed.

  9. Catching Stardust and Bringing it Home: The Astronomical Importance of Sample Return

    NASA Astrophysics Data System (ADS)

    Brownlee, D.

    2002-12-01

    The return of lunar samples by the Apollo program provided the first opportunity to perform detailed laboratory studies of ancient solid materials from a known astronomical body. The highly detailed study of the samples, using the best available laboratory instruments and techniques, revolutionized our understanding of the Moon and provided fundamental insight into the remarkable and violent processes that occur early in the history of moons and terrestrial planets. This type of astronomical paleontology is only possible with samples, and yet the last US sample return was made by Apollo 17, over thirty years ago! The NASA Stardust mission began a new era of sample missions with its 1999 launch to retrieve samples from the short-period comet Wild 2. Genesis (a solar wind collector) was launched in 2001, the Japanese MUSES-C asteroid sample return mission will launch in 2003, and Mars sample return missions are under study. All of these missions will use sophisticated ground-based instrumentation to provide types of information that cannot be obtained by astronomical and spacecraft remote sensing methods. In the case of Stardust, the goal is to determine the fundamental nature of the initial solid building blocks of solar systems at atomic-scale spatial resolution. The samples returned by the mission will be samples from the Kuiper Belt region, and they are probably composed of submicron silicate and organic materials of both presolar and nebular origin. Unlocking the detailed records contained in the elemental, chemical, isotopic and mineralogical composition of these tiny components can only be appropriately explored with the full power, precision and flexibility of laboratory instrumentation. Laboratory instrumentation has the advantage that it is state-of-the-art and is not limited by serious considerations of power, mass, cost or even reliability. The comparison of the comet sample, accumulated beyond Neptune, with asteroidal meteorites that accumulated just beyond the

  10. Sampling high-altitude and stratified mating flights of red imported fire ant.

    PubMed

    Fritz, Gary N; Fritz, Ann H; Vander Meer, Robert K

    2011-05-01

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens and males during mating flights at altitudinal intervals reaching as high as approximately 140 m. Our trapping system uses an electric winch and a 1.2-m spindle bolted to a swiveling platform. The winch dispenses up to 183 m of Kevlar-core nylon rope, and the spindle stores 10 panels (0.9 by 4.6 m each) of nylon tulle impregnated with Tangle-Trap. The panels can be attached to the rope at various intervals and hoisted into the air using a 3-m-diameter, helium-filled balloon. Raising or lowering all 10 panels takes approximately 15-20 min. This trap should also be useful for altitudinal sampling of other insects of medical importance.

  11. Sample preparation methods for determination of drugs of abuse in hair samples: A review.

    PubMed

    Vogliardi, Susanna; Tucci, Marianna; Stocchero, Giulia; Ferrara, Santo Davide; Favretto, Donata

    2015-02-01

    Hair analysis has assumed increasing importance in the determination of substances of abuse, both in clinical and forensic toxicology investigations. Hair analysis offers particular advantages over other biological matrices (blood and urine), including a larger window of detection, ease of collection and sample stability. In the present work, an overview of sample preparation techniques for the determination of substances of abuse in hair is provided, specifically regarding the principal steps in hair sample treatment-decontamination, extraction and purification. For this purpose, a survey of publications found in the MEDLINE database from 2000 to date was conducted. The most widely consumed substances of abuse and psychotropic drugs were considered. Trends in simplification of hair sample preparation, washing procedures and cleanup methods are discussed. Alternative sample extraction techniques, such as head-space solid phase microextraction (HS-SPDE), supercritical fluid extraction (SFE) and molecularly imprinted polymers (MIP) are also reported.

  12. Log sampling methods and software for stand and landscape analyses.

    Treesearch

    Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton; Shawn C. Clabough

    2008-01-01

    We describe methods for efficient, accurate sampling of logs at landscape and stand scales to estimate density, total length, cover, volume, and weight. Our methods focus on optimizing the sampling effort by choosing an appropriate sampling method and transect length for specific forest conditions and objectives. Sampling methods include the line-intersect method and...

  13. Efficiency of snake sampling methods in the Brazilian semiarid region.

    PubMed

    Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z

    2013-09-01

    The choice of sampling methods is a crucial step in every field survey in herpetology. In countries where time and financial support are limited, this choice is critical. The methods used to sample snakes often lack objective criteria, and tradition has apparently carried more weight than evidence when making the choice. Consequently, studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area in Brazil, evaluating the cost-benefit of each method in terms of the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated, and they were not complementary to the other methods in terms of species abundance and assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.

  14. A Review of Methods for Detecting Melamine in Food Samples.

    PubMed

    Lu, Yang; Xia, Yinqiang; Liu, Guozhen; Pan, Mingfei; Li, Mengjuan; Lee, Nanju Alice; Wang, Shuo

    2017-01-02

    Melamine is a synthetic chemical used in the manufacture of resins, pigments, and superplasticizers. Human beings can be exposed to melamine through various sources such as migration from related products into foods, pesticide contamination, and illegal addition to foods. Toxicity studies suggest that prolonged consumption of melamine could lead to the formation of kidney stones or even death. Therefore, reliable and accurate detection methods are essential to prevent human exposure to melamine. Sample preparation is of critical importance, since it could directly affect the performance of analytical methods. Some methods for the detection of melamine include instrumental analysis, immunoassays, and sensor methods. In this paper, we have summarized the state-of-the-art methods used for food sample preparation as well as the various detection techniques available for melamine. Combinations of multiple techniques and new materials used in the detection of melamine have also been reviewed. Finally, future perspectives on the applications of microfluidic devices have also been provided.

  15. Statistical sampling methods for soils monitoring

    Treesearch

    Ann M. Abbott

    2010-01-01

    Development of the best sampling design to answer a research question should be an interactive venture between the land manager or researcher and statisticians, and is the result of answering various questions. A series of questions that can be asked to guide the researcher in making decisions that will arrive at an effective sampling plan is described, and a case...

  16. Evaluation of Sampling Methods for Bacillus Spore ...

    EPA Pesticide Factsheets

    Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  17. Characterizing lentic freshwater fish assemblages using multiple sampling methods.

    PubMed

    Fischer, Jesse R; Quist, Michael C

    2014-07-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48-1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  18. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90 % of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.

  19. Classifying Imbalanced Data Streams via Dynamic Feature Group Weighting with Importance Sampling.

    PubMed

    Wu, Ke; Edwards, Andrea; Fan, Wei; Gao, Jing; Zhang, Kun

    2014-04-01

    Data stream classification and imbalanced data learning are two important areas of data mining research. Each has been well studied to date, with many interesting algorithms developed. However, only a few approaches reported in the literature address the intersection of these two fields, due to their complex interplay. In this work, we propose an importance sampling driven, dynamic feature group weighting framework (DFGW-IS) for classifying data streams of imbalanced distribution. Two components are tightly incorporated into the proposed approach to address the intrinsic characteristics of concept-drifting, imbalanced streaming data. Specifically, the ever-evolving concepts are tackled by a weighted ensemble trained on a set of feature groups, with each sub-classifier (i.e., a single classifier or an ensemble) weighted by its discriminative power and stability. The uneven class distribution, on the other hand, is battled by the sub-classifier built in a specific feature group, with the underlying distribution rebalanced by the importance sampling technique. We derive the theoretical upper bound for the generalization error of the proposed algorithm. We also study the empirical performance of our method on a set of benchmark synthetic and real-world datasets, and significant improvement has been achieved over the competing algorithms in terms of standard evaluation metrics and parallel running time. Algorithm implementations and datasets are available upon request.

  20. Classifying Imbalanced Data Streams via Dynamic Feature Group Weighting with Importance Sampling

    PubMed Central

    Wu, Ke; Edwards, Andrea; Fan, Wei; Gao, Jing; Zhang, Kun

    2014-01-01

    Data stream classification and imbalanced data learning are two important areas of data mining research. Each has been well studied to date, with many interesting algorithms developed. However, only a few approaches reported in the literature address the intersection of these two fields, due to their complex interplay. In this work, we propose an importance sampling driven, dynamic feature group weighting framework (DFGW-IS) for classifying data streams of imbalanced distribution. Two components are tightly incorporated into the proposed approach to address the intrinsic characteristics of concept-drifting, imbalanced streaming data. Specifically, the ever-evolving concepts are tackled by a weighted ensemble trained on a set of feature groups, with each sub-classifier (i.e., a single classifier or an ensemble) weighted by its discriminative power and stability. The uneven class distribution, on the other hand, is battled by the sub-classifier built in a specific feature group, with the underlying distribution rebalanced by the importance sampling technique. We derive the theoretical upper bound for the generalization error of the proposed algorithm. We also study the empirical performance of our method on a set of benchmark synthetic and real-world datasets, and significant improvement has been achieved over the competing algorithms in terms of standard evaluation metrics and parallel running time. Algorithm implementations and datasets are available upon request. PMID:25568835
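
    The rebalancing component can be illustrated generically: importance weights w(y) = target(y)/observed(y) make an imbalanced sample behave, in expectation, like one drawn from the desired class distribution. The sketch shows only this generic idea, not the paper's per-feature-group ensemble machinery; the names and the 50/50 target are illustrative assumptions.

```python
import random

def rebalance_weights(labels, target_pos=0.5):
    """Importance weights w(y) = target(y) / observed(y) that let a
    weighted learner treat an imbalanced binary sample as if it were drawn
    from the desired class distribution. Assumes both classes are present."""
    n = len(labels)
    p_pos = sum(labels) / n
    w = {1: target_pos / p_pos, 0: (1 - target_pos) / (1 - p_pos)}
    return [w[y] for y in labels]

# Toy usage: with ~5% positives, positives get weight ~10 and negatives
# ~0.53, so a weighted learner sees an effectively balanced chunk.
labels = [1 if random.random() < 0.05 else 0 for _ in range(10_000)]
weights = rebalance_weights(labels)
```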

  1. The experience sampling method: Investigating students' affective experience

    NASA Astrophysics Data System (ADS)

    Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.

    2013-01-01

    Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).

  2. Improved importance sampling for Monte Carlo simulation of time-domain optical coherence tomography

    PubMed Central

    Lima, Ivan T.; Kalra, Anshul; Sherif, Sherif S.

    2011-01-01

    We developed an importance-sampling-based method that significantly speeds up the calculation of the diffusive reflectance due to ballistic and quasi-ballistic components of photons scattered in turbid media: Class I diffusive reflectance. These components of scattered photons make up the signal in optical coherence tomography (OCT) imaging. We show that the use of this method reduces the computation time of this diffusive reflectance in time-domain OCT by up to three orders of magnitude compared with standard Monte Carlo simulation. Our method does not produce the systematic bias in the statistical result that is typically observed in existing methods for speeding up Monte Carlo simulations of light transport in tissue. This fast Monte Carlo calculation of the Class I diffusive reflectance can be used as a tool to further study the physical processes governing OCT signals, e.g., to obtain the statistics of the depth scan, including the effects of multiple scattering of light, in OCT. This is an important prerequisite to future research to increase penetration depth and to improve image extraction in OCT. PMID:21559120
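
    The principle behind such variance reduction in Monte Carlo photon transport can be sketched with a Henyey-Greenstein phase function: draw the scattering cosine from a more forward-peaked (biased) distribution so quasi-ballistic, Class I photons are produced more often, and multiply the photon weight by the true-to-biased density ratio so the estimator remains unbiased. This is only the generic weight-correction idea, not the paper's exact scheme; the anisotropy values and names are illustrative.

```python
import math
import random

def biased_scatter(weight, g_true=0.9, g_bias=0.98, rng=random):
    """Importance-sampled scattering step: sample mu from a biased
    Henyey-Greenstein (HG) distribution favoring near-forward directions,
    then correct the photon weight by p_true(mu) / p_bias(mu).
    Valid for nonzero anisotropy g."""
    def hg_pdf(g, mu):
        # HG density over the scattering cosine mu in [-1, 1]
        return 0.5 * (1 - g * g) / (1 + g * g - 2 * g * mu) ** 1.5

    def hg_sample(g):
        # Standard inverse-CDF sampling of the HG distribution
        s = 2 * rng.random() - 1
        t = (1 - g * g) / (1 + g * s)
        return (1 + g * g - t * t) / (2 * g)

    mu = hg_sample(g_bias)
    return mu, weight * hg_pdf(g_true, mu) / hg_pdf(g_bias, mu)

# Usage inside a photon-propagation loop: mu, w = biased_scatter(w)
```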

  3. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline sample...

  4. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline sample...

  5. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline sample...

  6. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline sample...

  7. Examination of Hydrate Formation Methods: Trying to Create Representative Samples

    SciTech Connect

    Kneafsey, T.J.; Rees, E.V.L.; Nakagawa, S.; Kwon, T.-H.

    2011-04-01

    Forming representative gas hydrate-bearing laboratory samples is important so that the properties of these materials may be measured while controlling the composition and other variables. Natural samples are rare, and have often experienced pressure and temperature changes that may affect the property to be measured [Waite et al., 2008]. Forming methane hydrate samples in the laboratory has been done in a number of ways, each having advantages and disadvantages. The ice-to-hydrate method [Stern et al., 1996] contacts melting ice with methane at the appropriate pressure to form hydrate. The hydrate can then be crushed and mixed with mineral grains under controlled conditions, and then compacted to create laboratory samples of methane hydrate in a mineral medium. The hydrate in these samples will be part of the load-bearing frame of the medium. In the excess gas method [Handa and Stupin, 1992], water is distributed throughout a mineral medium (e.g. packed moist sand, drained sand, moistened silica gel, other porous media) and the mixture is brought to hydrate-stable conditions (chilled and pressurized with gas), allowing hydrate to form. This method typically produces grain-cementing hydrate from pendular water in sand [Waite et al., 2004]. In the dissolved gas method [Tohidi et al., 2002], water with sufficient dissolved guest molecules is brought to hydrate-stable conditions where hydrate forms. In the laboratory, this can be done by pre-dissolving the gas of interest in water and then introducing it to the sample under the appropriate conditions. With this method, it is easier to form hydrate from more soluble gases such as carbon dioxide. It is thought that this method more closely simulates the way most natural gas hydrate has formed. Laboratory implementation, however, is difficult, and sample formation is prohibitively time consuming [Minagawa et al., 2005; Spangenberg and Kulenkampff, 2005]. In another version of this technique, a specified quantity of gas

  8. System and method for extracting a sample from a surface

    DOEpatents

    Van Berkel, Gary; Covey, Thomas

    2015-06-23

    A system and method are disclosed for extracting a sample from a sample surface. A sample is provided and deposited on the sample surface. A hydrophobic material is applied to the sample surface, and one or more devices are configured to dispense a liquid onto the sample, dissolving it to form a dissolved sample material, and to extract the dissolved sample material from the sample surface.

  9. Rapid detection and differentiation of important Campylobacter spp. in poultry samples by dot blot and PCR.

    PubMed

    Fontanot, Marco; Iacumin, Lucilla; Cecchini, Francesca; Comi, Giuseppe; Manzano, Marisa

    2014-10-01

    The detection of Campylobacter, the most commonly reported cause of foodborne gastroenteritis in the European Union, is very important for human health. The most commonly recognised risk factor for infection is the handling and/or consumption of undercooked poultry meat. The methods typically applied to evaluate the presence or absence of Campylobacter in food samples are direct plating and/or enrichment culture based on the Horizontal Method for Detection and Enumeration of Campylobacter spp. (ISO 10272-1B: 2006) and PCR. Molecular methods also allow for the detection of viable cells that cannot be cultivated on agar media, and they decrease the time required for species identification. The current study proposes the use of two molecular methods for species identification: dot blot and PCR. The dot blot method had a sensitivity of 25 ng for the detection of DNA extracted from a pure culture using a digoxigenin-labelled probe for hybridisation; the target DNA was extracted from the enrichment broth at 24 h. PCR was performed using a pair of sensitive and specific primers for the detection of Campylobacter jejuni and Campylobacter coli after 24 h of enrichment in Preston broth. The initial samples were contaminated by 5 × 10 C. jejuni cells/g and 1.5 × 10(2) C. coli cells/g; thus, the number of cells present in the enrichment broth at 0 h was 1 or 3 cells/g, respectively.

  10. A new approach to importance sampling for the simulation of false alarms. [in radar systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1987-01-01

    This paper presents a modified importance sampling technique for improving the convergence of importance sampling. By using this approach to estimate low false alarm rates in radar simulations, the number of Monte Carlo runs can be reduced significantly. For the one-dimensional exponential, Weibull, and Rayleigh distributions, a uniformly minimum variance unbiased estimator is obtained. For the Gaussian distribution, the estimator in this approach is uniformly better than that of the previously known importance sampling approach. For a cell averaging system, by combining this technique with group sampling, the reduction in Monte Carlo runs for a reference cell of 20 and a false alarm rate of 1E-6 is on the order of 170 compared with the previously known importance sampling approach.
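    The standard construction behind such estimators is a mean-shifted proposal with likelihood-ratio weights. Below is a minimal sketch for a Gaussian tail probability near 1E-6; this is the textbook scheme, not the paper's modified estimator, and the threshold and sample sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
t = 4.75                      # threshold giving a false alarm rate near 1e-6
n = 10_000

# naive Monte Carlo: with n = 10,000 samples, the tail is almost never hit
x = rng.standard_normal(n)
naive = np.mean(x > t)

# importance sampling: draw from N(t, 1) and reweight by the likelihood ratio
#   w(y) = p(y)/q(y) = exp(-y**2/2) / exp(-(y-t)**2/2) = exp(t**2/2 - t*y)
y = rng.normal(loc=t, size=n)
w = np.exp(0.5 * t**2 - t * y)
is_est = np.mean((y > t) * w)

print(naive, is_est)          # IS estimate is close to 1 - Phi(t), about 1e-6
```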

  12. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or...

  13. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official...

  14. GeoLab Concept: The Importance of Sample Selection During Long Duration Human Exploration Mission

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.; Evans, C. A.; Bell, M. S.; Graff, T. G.

    2011-01-01

    In the future, when humans explore planetary surfaces on the Moon, Mars, asteroids, or beyond, the return of geologic samples to Earth will be a high priority for human spaceflight operations. All future sample return missions will have strict down-mass and volume requirements; methods for in-situ sample assessment and prioritization will be critical for selecting the best samples for return to Earth.

  15. The jigsaw puzzle of sequence phenotype inference: Piecing together Shannon entropy, importance sampling, and Empirical Bayes

    PubMed Central

    Shreif, Zeina; Striegel, Deborah A.

    2015-01-01

    A nucleotide sequence 35 base pairs long can take 1,180,591,620,717,411,303,424 possible values. Protein binding microarrays, an example of systems biology datasets, contain activity data from about 40,000 such sequences. The discrepancy between the number of possible configurations and the available activities is enormous. Thus, although systems biology datasets are large in absolute terms, they often require methods developed for rare events, owing to the combinatorial increase in the number of possible configurations of biological systems. A plethora of techniques for handling large datasets, such as Empirical Bayes, or rare events, such as importance sampling, have been developed in the literature, but these cannot always be simultaneously utilized. Here we introduce a principled approach to Empirical Bayes based on importance sampling, information theory, and theoretical physics in the general context of sequence phenotype model induction. We present the analytical calculations that underlie our approach. We demonstrate the computational efficiency of the approach on concrete examples, and demonstrate its efficacy by applying the theory to publicly available protein binding microarray transcription factor datasets and to data on synthetic cAMP-regulated enhancer sequences. As further demonstrations, we find transcription factor binding motifs, predict the activity of new sequences and extract the locations of transcription factor binding sites. In summary, we present a novel method that is efficient (requiring minimal computational time and reasonable amounts of memory), has high predictive power comparable with that of models with hundreds of parameters, and has a limited number of optimized parameters, proportional to the sequence length. PMID:26092377
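    The count quoted above is simply 4^35 (four nucleotide choices at each of 35 positions), which is easy to verify:

```python
# 4 nucleotide choices (A, C, G, T) at each of 35 positions
assert 4 ** 35 == 1_180_591_620_717_411_303_424
```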

  16. The jigsaw puzzle of sequence phenotype inference: Piecing together Shannon entropy, importance sampling, and Empirical Bayes.

    PubMed

    Shreif, Zeina; Striegel, Deborah A; Periwal, Vipul

    2015-09-07

    A nucleotide sequence 35 base pairs long can take 1,180,591,620,717,411,303,424 possible values. Protein binding microarrays, an example of systems biology datasets, contain activity data from about 40,000 such sequences. The discrepancy between the number of possible configurations and the available activities is enormous. Thus, although systems biology datasets are large in absolute terms, they often require methods developed for rare events, owing to the combinatorial increase in the number of possible configurations of biological systems. A plethora of techniques for handling large datasets, such as Empirical Bayes, or rare events, such as importance sampling, have been developed in the literature, but these cannot always be simultaneously utilized. Here we introduce a principled approach to Empirical Bayes based on importance sampling, information theory, and theoretical physics in the general context of sequence phenotype model induction. We present the analytical calculations that underlie our approach. We demonstrate the computational efficiency of the approach on concrete examples, and demonstrate its efficacy by applying the theory to publicly available protein binding microarray transcription factor datasets and to data on synthetic cAMP-regulated enhancer sequences. As further demonstrations, we find transcription factor binding motifs, predict the activity of new sequences and extract the locations of transcription factor binding sites. In summary, we present a novel method that is efficient (requiring minimal computational time and reasonable amounts of memory), has high predictive power comparable with that of models with hundreds of parameters, and has a limited number of optimized parameters, proportional to the sequence length. Published by Elsevier Ltd.

  17. Blood Sampling Seasonality as an Important Preanalytical Factor for Assessment of Vitamin D Status

    PubMed Central

    Bonelli, Patrizia; Buonocore, Ruggero; Aloe, Rosalia

    2016-01-01

    Background The measurement of vitamin D is now commonplace for preventing osteoporosis and for restoring a concentration adequate to counteract the occurrence of other human disorders. The aim of this study was to establish whether blood sampling seasonality influences total vitamin D concentration in a general population of unselected Italian outpatients. Methods We performed a retrospective search in the laboratory information system of the University Hospital of Parma (Italy, temperate climate) to identify the values of total serum vitamin D (25-hydroxyvitamin D) measured in outpatients aged 18 years and older who were referred for routine health check-ups during the entire year 2014. Results The study population consisted of 11,150 outpatients (median age 62 years; 8592 women and 2558 men). The concentration of vitamin D was consistently lower in samples collected in Winter than in the other three seasons. The frequency of subjects with vitamin D deficiency was approximately double in samples drawn in Winter and Spring compared with Summer and Autumn. In multivariate analysis, the concentration of total vitamin D was independently associated with sex and season of blood testing, but not with the age of the patients. Conclusions According to these findings, blood sampling seasonality should be regarded as an important preanalytical factor in vitamin D assessment. It is also reasonable to suggest that the amount of total vitamin D synthesized during the summer should be high enough to maintain levels > 50 nmol/L throughout the remaining part of the year. PMID:28356869

  18. Exploration and Sampling Methods for Borrow Areas

    DTIC Science & Technology

    1990-12-01

    ... environmentally sound coastal project designs. The current state of knowledge regarding geological indicators of subaqueous... This report discusses the equipment and techniques that are used in coastal marine and lacustrine environments to locate and characterize potential... because the fill material used was unstable in the beach environment and rapidly washed away. More recently, methods for specifying fill material based

  19. Modified electrokinetic sample injection method in chromatography and electrophoresis analysis

    DOEpatents

    Davidson, J. Courtney; Balch, Joseph W.

    2001-01-01

    A sample injection method for horizontally configured multiple chromatography or electrophoresis units, each containing a number of separation/analysis channels, that enables efficient introduction of analyte samples. This loading method, used in conjunction with horizontal microchannels, allows much reduced sample volumes and provides a means of sample stacking to greatly increase the effective concentration of the sample. This reduction in the amount of sample can lead to great cost savings in sample preparation, particularly in massively parallel applications such as DNA sequencing. The essence of the method lies in the preparation of the input of the separation channel, the physical sample introduction, and the subsequent removal of excess material. By this method, sample volumes of 100 nanoliters to 2 microliters have been used successfully, compared to the typical 5 microliters of sample required by the prior separation/analysis method.

  20. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

    Methamphetamine profiling can yield information about sources of supply, trafficking routes, distribution patterns, and conspiracy links. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of the minor impurities contained in methamphetamine, and the similarity between samples can be evaluated using the peaks that appear in their chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount seized throughout the country was very small; finding links between samples is therefore more important than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column and four internal standards, and was developed to increase the amount of recovered impurities while minimizing the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed; the data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention time shift and response deviation introduced by sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment (see the sketch below), and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical
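    A hedged sketch of the similarity step only (the retention-time and response corrections are omitted): Pearson correlation between two matched impurity-peak profiles. The log scaling and the function name are illustrative assumptions, not the study's VBA code.

```python
import numpy as np

def profile_similarity(peaks_a, peaks_b):
    """Pearson correlation between two impurity profiles.

    peaks_a, peaks_b: 1-D arrays of peak areas for the same ordered set of
    target impurity peaks, already matched by (corrected) retention time.
    Areas are log-scaled first, a common step that keeps the largest peaks
    from dominating the correlation."""
    a = np.log1p(np.asarray(peaks_a, dtype=float))
    b = np.log1p(np.asarray(peaks_b, dtype=float))
    return np.corrcoef(a, b)[0, 1]

# samples from the same batch should correlate strongly (> 0.99 in the study)
same_batch = profile_similarity([120, 40, 5, 800], [118, 43, 6, 790])
print(same_batch)
```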

  1. Microfluidic DNA sample preparation method and device

    DOEpatents

    Krulevitch, Peter A.; Miles, Robin R.; Wang, Xiao-Bo; Mariella, Raymond P.; Gascoyne, Peter R. C.; Balch, Joseph W.

    2002-01-01

    Manipulation of DNA molecules in solution has become an essential aspect of genetic analyses used for biomedical assays, the identification of hazardous bacterial agents, and in decoding the human genome. Currently, most of the steps involved in preparing a DNA sample for analysis are performed manually and are time, labor, and equipment intensive. These steps include extraction of the DNA from spores or cells, separation of the DNA from other particles and molecules in the solution (e.g. dust, smoke, cell/spore debris, and proteins), and separation of the DNA itself into strands of specific lengths. Dielectrophoresis (DEP), a phenomenon whereby polarizable particles move in response to a gradient in electric field, can be used to manipulate and separate DNA in an automated fashion, considerably reducing the time and expense involved in DNA analyses, as well as allowing for the miniaturization of DNA analysis instruments. These applications include direct transport of DNA, trapping of DNA to allow for its separation from other particles or molecules in the solution, and the separation of DNA into strands of varying lengths.

  2. Surface Sampling Methods for Bacillus anthracis Spore Contamination

    PubMed Central

    Hein, Misty J.; Taylor, Lauralynn; Curwin, Brian D.; Kinnes, Gregory M.; Seitz, Teresa A.; Popovic, Tanja; Holmes, Harvey T.; Kellum, Molly E.; McAllister, Sigrid K.; Whaley, David N.; Tupin, Edward A.; Walker, Timothy; Freed, Jennifer A.; Small, Dorothy S.; Klusaritz, Brian; Bridges, John H.

    2002-01-01

    During an investigation conducted December 17–20, 2001, we collected environmental samples from a U.S. postal facility in Washington, D.C., known to be extensively contaminated with Bacillus anthracis spores. Because methods for collecting and analyzing B. anthracis spores have not yet been validated, our objective was to compare the relative effectiveness of sampling methods used for collecting spores from contaminated surfaces. Comparison of wipe, wet and dry swab, and HEPA vacuum sock samples on nonporous surfaces indicated good agreement between results with HEPA vacuum and wipe samples. However, results from HEPA vacuum sock and wipe samples agreed poorly with the swab samples. Dry swabs failed to detect spores >75% of the time they were detected by wipe and HEPA vacuum samples. Wipe samples collected after HEPA vacuum samples and HEPA vacuum samples after wipe samples indicated that neither method completely removed spores from the sampled surfaces. PMID:12396930

  3. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural Marketing...

  4. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA, Agricultural...

  5. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline...

  6. Molecular method for the diagnosis of imported pediatric malaria.

    PubMed

    Delhaes Jeanne, L; Berry, A; Dutoit, E; Leclerc, F; Beaudou, J; Leteurtre, S; Camus, D; Benoit-Vical, F

    2010-02-01

    Malaria is a polymorphous disease that can be life threatening, especially for children. We report a case of imported malaria in a boy, illustrating the epidemiological and clinical aspects of severe pediatric malaria. In this case, real-time PCR was used to quantify Plasmodium falciparum DNA levels, to monitor the evolution under treatment, and to determine genetic mutations involved in chloroquine resistance. The major epidemiological features of imported malaria and the difficulty of diagnosing severe childhood malaria are described. The contribution of molecular methods to the diagnosis of imported malaria is discussed.

  7. Method for sampling sub-micron particles

    DOEpatents

    Gay, Don D.; McMillan, William G.

    1985-01-01

    Apparatus and method steps for collecting sub-micron-sized particles include a collection chamber and cryogenic cooling. The cooling is accomplished by coiled tubing carrying nitrogen in liquid form, with the liquid nitrogen changing to the gas phase before exiting the collection chamber in the tubing. Standard filters are used to remove particles of diameter greater than or equal to 0.3 micron; the present invention is used to trap particles of less than 0.3 micron in diameter. A blower draws air into the collection chamber through a filter which removes particles with diameters greater than or equal to 0.3 micron. The air is then cryogenically cooled so that moisture and sub-micron-sized particles in the air condense into ice on the coil. The coil is then heated so that the ice melts, and the liquid is drawn off and passed through a Nuclepore membrane in a Buchner funnel. A vacuum draws the liquid through the Nuclepore membrane, which traps the sub-micron-sized particles. The Nuclepore membrane is then covered on its top and bottom surfaces with sheets of Mylar®, and the assembly is crushed into a pellet. This effectively traps the sub-micron-sized particles for later analysis.

  8. [A membrane filter sampling method for determining microbial air pollution].

    PubMed

    Cherneva, P; Kiranova, A

    1996-01-01

    The method is a contribution to the evaluation of exposure to, and the control of standards for, organic dusts. It covers the sample-taking procedure and the analytical technique for determining the concentration of microbial pollution of the air. It is based on filtering a known volume of air through a membrane filter, which is then processed to cultivate microbial colonies on its surface. The results are expressed as the number of microbial colonies per unit volume of air. The method makes it possible to select and vary the filtered volume of air, to determine the respirable fraction, and to determine personal exposure, as well as to determine microbial pollution simultaneously with other important parameters of particulate air pollutants (metal, fibre and others).

  9. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    PubMed

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function.
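    The rejection step the scheme rests on can be sketched generically: draw from a broad "free-particle-like" proposal g and accept with probability f(x)/(M·g(x)), which requires f(x) <= M·g(x) everywhere. The Gaussian toy below stands in for the harmonic importance function; it is a hedged illustration of the rejection mechanism, not the authors' path-space implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, M, n):
    """Draw n samples from target_pdf by rejection from a proposal.
    Requires target_pdf(x) <= M * proposal_pdf(x) for all x."""
    out = []
    while len(out) < n:
        x = proposal_sample()
        if rng.uniform() < target_pdf(x) / (M * proposal_pdf(x)):
            out.append(x)
    return np.array(out)

# toy analogue of WPIS: a wide "free-particle" proposal N(0, 2^2) and a
# narrower harmonic-style target N(0, 1); most wide draws are rejected,
# echoing the high rejection rates reported at low temperature
target = lambda x: np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)
prop_pdf = lambda x: np.exp(-0.5 * (x / 2)**2) / (2 * np.sqrt(2 * np.pi))
samples = rejection_sample(target, lambda: rng.normal(0, 2), prop_pdf, M=2.0, n=1000)
```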

  10. Coalescent: an open-science framework for importance sampling in coalescent theory.

    PubMed

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  11. Coalescent: an open-science framework for importance sampling in coalescent theory

    PubMed Central

    Spouge, John L.

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  12. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W [West Richland, WA; Wise, Barry M [Manson, WA

    2002-01-01

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  13. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W.; Wise, Barry M.

    2003-08-12

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  14. Field evaluation of endotoxin air sampling assay methods.

    PubMed

    Thorne, P S; Reynolds, S J; Milton, D K; Bloebaum, P D; Zhang, X; Whitten, P; Burmeister, L F

    1997-11-01

    This study tested the importance of filter media, extraction and assay protocol, and bioaerosol source on the determination of endotoxin under field conditions in swine and poultry confinement buildings. Multiple simultaneous air samples were collected using glass fiber (GF) and polycarbonate (PC) filters, and these were assayed using two methods in two separate laboratories: an endpoint chromogenic Limulus amebocyte lysate (LAL) assay (QCL) performed in water, and a kinetic chromogenic LAL assay (KQCL) performed in buffer with resistant-parallel line estimation analysis (KLARE). In addition, two aqueous filter extraction methods were compared in the QCL assay: 120 min extraction at 22 degrees C with vigorous shaking, and 30 min extraction at 68 degrees C with gentle rocking. These extraction methods yielded endotoxin activities that were not significantly different and were very highly correlated. Reproducibility of endotoxin determinations from duplicate air sampling filters was very high (Cronbach alpha all > 0.94). When analyzed by the QCL method, GF filters yielded significantly higher endotoxin activity than PC filters. QCL and KLARE methods gave similar estimates of endotoxin activity from PC filters; however, GF filters analyzed by the QCL method yielded significantly higher endotoxin activity estimates, suggesting enhancement of the QCL assay or inhibition of the KLARE assay with GF filters. Correlation between QCL-GF and QCL-PC was high (r = 0.98) while that between KLARE-GF and KLARE-PC was moderate (r = 0.68). Analysis of variance demonstrated that assay methodology, filter type, barn type, and interactions between assay and filter type and between assay and barn type were important factors influencing endotoxin exposure assessment.

  15. [Wound microbial sampling methods in surgical practice, imprint techniques].

    PubMed

    Chovanec, Z; Veverková, L; Votava, M; Svoboda, J; Peštál, A; Doležel, J; Jedlička, V; Veselý, M; Wechsler, J; Čapov, I

    2012-12-01

    A wound is damage to tissue. The process of healing is influenced by many systemic and local factors; the most crucial, and the most discussed, local factor in wound healing is infection. Surgical site infection in the wound is caused by micro-organisms. This has been known for many years, but the conditions leading to the occurrence of an infection have not yet been sufficiently described. Correct sampling technique and correct storage, transportation, evaluation, and valid interpretation of the resulting data are very important in clinical practice. There are many methods for microbiological sampling, but the best one has not yet been identified and validated. We discuss the problem with a focus on the imprint technique.

  16. Evaluation of sample preservation methods for poultry manure.

    PubMed

    Pan, J; Fadel, J G; Zhang, R; El-Mashad, H M; Ying, Y; Rumsey, T

    2009-08-01

    When poultry manure is collected but cannot be analyzed immediately, a method for storing the manure is needed to ensure accurate subsequent analyses. This study has 3 objectives: (1) to investigate effects of 4 poultry manure sample preservation methods (refrigeration, freezing, acidification, and freeze-drying) on the compositional characteristics of poultry manure; (2) to determine compositional differences in fresh manure with manure samples at 1, 2, and 3 d of accumulation under bird cages; and (3) to assess the influence of 14-d freezing storage on the composition of manure when later exposed to 25 degrees C for 7 d as compared with fresh manure. All manure samples were collected from a layer house. Analyses performed on the manure samples included total Kjeldahl nitrogen, uric acid nitrogen, ammonia nitrogen, and urea nitrogen. In experiment 1, the storage methods most similar to fresh manure, in order of preference, were freezing, freeze-drying, acidification, and refrigeration. Thoroughly mixing manure samples and compressing them to 2 to 3 mm is important for the freezing and freeze-dried samples. In general, refrigeration was found unacceptable for nitrogen analyses. A significant effect (P < 0.0001) of time for refrigeration was found on uric acid nitrogen and ammonia nitrogen. In experiment 2, the total Kjeldahl nitrogen and uric acid nitrogen were significantly lower (P < 0.05) for 1, 2, and 3 d of accumulation compared with fresh manure. Manure after 1, 2, and 3 d of accumulation had similar nitrogen compositions. The results from experiment 3 show that nitrogen components from fresh manure samples and thawed samples from 14 d of freezing are similar at 7 d but high variability of nitrogen compositions during intermediate times from 0 to 7 d prevents the recommendation of freezing manure for use in subsequent experiments and warrants future experimentation. In conclusion, fresh poultry manure can be frozen for accurate subsequent nitrogen

  17. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  18. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 2 2014-01-01 2014-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  19. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 2 2013-01-01 2013-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  20. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 19 Customs Duties 2 2012-04-01 2012-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  1. 19 CFR 151.83 - Method of sampling.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 19 Customs Duties 2 2014-04-01 2014-04-01 false Method of sampling. 151.83 Section 151.83 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Cotton § 151.83 Method of sampling. For...

  2. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 2 2012-01-01 2012-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  3. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  4. 7 CFR 29.110 - Method of sampling.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Method of sampling. 29.110 Section 29.110 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Regulations Inspectors, Samplers, and Weighers § 29.110 Method of sampling. In sampling tobacco...

  5. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, partly because of the outdated methodology and poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. For the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at different expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the river Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  6. Minimally important difference estimates and methods: a protocol

    PubMed Central

    Johnston, Bradley C; Ebrahim, Shanil; Carrasco-Labra, Alonso; Furukawa, Toshi A; Patrick, Donald L; Crawford, Mark W; Hemmelgarn, Brenda R; Schunemann, Holger J; Guyatt, Gordon H; Nesrallah, Gihad

    2015-01-01

    Introduction Patient-reported outcomes (PROs) are often the outcomes of greatest importance to patients. The minimally important difference (MID) provides a measure of the smallest change in the PRO that patients perceive as important. An anchor-based approach is the most appropriate method for MID determination. No study or database currently exists that provides all anchor-based MIDs associated with PRO instruments; nor are there any accepted standards for appraising the credibility of MID estimates. Our objectives are to complete a systematic survey of the literature to collect and characterise published anchor-based MIDs associated with PRO instruments used in evaluating the effects of interventions on chronic medical and psychiatric conditions and to assess their credibility. Methods and analysis We will search MEDLINE, EMBASE and PsycINFO (1989 to present) to identify studies addressing methods to estimate anchor-based MIDs of target PRO instruments or reporting empirical ascertainment of anchor-based MIDs. Teams of two reviewers will screen titles and abstracts, review full texts of citations, and extract relevant data. On the basis of findings from studies addressing methods to estimate anchor-based MIDs, we will summarise the available methods and develop an instrument addressing the credibility of empirically ascertained MIDs. We will evaluate the credibility of all studies reporting on the empirical ascertainment of anchor-based MIDs using the credibility instrument, and assess the instrument's inter-rater reliability. We will separately present reports for adult and paediatric populations. Ethics and dissemination No research ethics approval was required as we will be using aggregate data from published studies. Our work will summarise anchor-based methods available to establish MIDs, provide an instrument to assess the credibility of available MIDs, determine the reliability of that instrument, and provide a comprehensive compendium of published anchor

  7. Cool walking: a new Markov chain Monte Carlo sampling method.

    PubMed

    Brown, Scott; Head-Gordon, Teresa

    2003-01-15

    Effective relaxation processes for difficult systems like proteins or spin glasses require special simulation techniques that permit barrier crossing to ensure ergodic sampling. Numerous adaptations of the venerable Metropolis Monte Carlo (MMC) algorithm have been proposed to improve its sampling efficiency, including various hybrid Monte Carlo (HMC) schemes, and methods designed specifically for overcoming quasi-ergodicity problems such as Jump Walking (J-Walking), Smart Walking (S-Walking), Smart Darting, and Parallel Tempering. We present an alternative to these approaches that we call Cool Walking, or C-Walking. In C-Walking two Markov chains are propagated in tandem, one at a high (ergodic) temperature and the other at a low temperature. Nonlocal trial moves for the low temperature walker are generated by first sampling from the high-temperature distribution, then performing a statistical quenching process on the sampled configuration to generate a C-Walking jump move. C-Walking needs only one high-temperature walker, satisfies detailed balance, and offers the important practical advantage that the high and low-temperature walkers can be run in tandem with minimal degradation of sampling due to the presence of correlations. To make the C-Walking approach more suitable to real problems we decrease the required number of cooling steps by attempting to jump at intermediate temperatures during cooling. We further reduce the number of cooling steps by utilizing "windows" of states when jumping, which improves acceptance ratios and lowers the average number of cooling steps. We present C-Walking results with comparisons to J-Walking, S-Walking, Smart Darting, and Parallel Tempering on a one-dimensional rugged potential energy surface in which the exact normalized probability distribution is known. C-Walking shows superior sampling as judged by two ergodic measures.
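    A full C-Walking implementation (statistical quenching with its detailed-balance correction) is beyond a short sketch, but the tandem two-temperature idea it builds on can be illustrated in the J-Walking style that the paper compares against, here on a 1-D double well. All parameters are illustrative, and the jump acceptance assumes the hot walker is an approximately equilibrated draw from the hot distribution.

```python
import numpy as np

rng = np.random.default_rng(2)
E = lambda x: (x**2 - 1.0)**2            # 1-D double well; barrier at x = 0
beta_cold, beta_hot = 8.0, 1.0           # inverse temperatures

def mmc_step(x, beta, step=0.5):
    """One local Metropolis move at inverse temperature beta."""
    xp = x + rng.uniform(-step, step)
    return xp if rng.uniform() < np.exp(-beta * (E(xp) - E(x))) else x

x_cold, x_hot, samples = 1.0, 1.0, []
for it in range(20_000):
    x_hot = mmc_step(x_hot, beta_hot)    # hot walker crosses the barrier freely
    x_cold = mmc_step(x_cold, beta_cold)
    if it % 20 == 0:
        # Nonlocal jump: propose the hot walker's state to the cold chain.
        # Treating it as a draw from exp(-beta_hot * E) gives the
        # independence-sampler acceptance ratio below.
        a = np.exp(-(beta_cold - beta_hot) * (E(x_hot) - E(x_cold)))
        if rng.uniform() < min(1.0, a):
            x_cold = x_hot
    samples.append(x_cold)               # cold chain now visits both wells
```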

  8. Sampling methods in Clinical Research; an Educational Review.

    PubMed

    Elfil, Mohamed; Negida, Ahmed

    2017-01-01

    Clinical research usually involves patients with a certain disease or condition. The generalizability of clinical research findings is based on multiple factors related to the internal and external validity of the research methods. The main methodological issue that influences the generalizability of clinical research findings is the sampling method. In this educational article, we explain the different sampling methods used in clinical research.

  9. Photoacoustic sample vessel and method of elevated pressure operation

    DOEpatents

    Autrey, Tom; Yonker, Clement R.

    2004-05-04

    An improved photoacoustic vessel and method of photoacoustic analysis. The photoacoustic sample vessel comprises an acoustic detector, an acoustic couplant, and an acoustic coupler having a chamber for holding the acoustic couplant and a sample. The acoustic couplant is selected from the group consisting of liquid, solid, and combinations thereof. Passing electromagnetic energy through the sample generates an acoustic signal within the sample, whereby the acoustic signal propagates through the sample to and through the acoustic couplant to the acoustic detector.

  10. The Importance of Complete Sample Dissolution and Spike-Sample Equilibration on Lu-Hf Isotope Studies

    NASA Astrophysics Data System (ADS)

    Mahlen, N. J.; Beard, B. L.; Johnson, C. M.; Lapen, T. J.

    2005-12-01

    Lu-Hf geochronology has gained attention due to its potential for precisely determining the age of garnet growth in a wide variety of rocks. A unique aspect of Lu-Hf analysis, however, is the disparate chemical behavior of Hf and Lu. For example, Hf is soluble in HF and Lu is soluble in HCl, which can create problems for spike-sample equilibration during dissolution, as discussed by Unruh et al. 1984 JGR 89:B459 and later by Beard et al. 1998 GCA 62:525. Although partial dissolution may appear to be an attractive means to preferentially dissolve garnet relative to refractory inclusions such as rutile and zircon, our investigations have shown that incomplete spike-sample equilibration may occur in such approaches. This leads to erroneous Lu and Hf contents that can adversely affect Lu-Hf isochron ages and calculated initial Hf isotope compositions. Dissolution of whole-rock powders using hot plates (low pressure) or short-term microwave dissolution may produce inaccurate Lu-Hf isotope and concentration results, whereas high-temperature, high-pressure dissolution in traditional Teflon steel-jacketed (Parr) bombs produces precise and accurate results. The greatest disparity in the Lu-Hf isotope and concentration systematics of whole-rock powders among dissolution methods occurs for zircon- and garnet-bearing samples. In contrast, Sm-Nd isotope results are not affected by these different dissolution methods. Lu-Hf isochrons involving garnet may be affected by the dissolution method in a manner similar to that observed for whole-rock powders. Incomplete dissolution of garnet generally increases the measured Lu/Hf ratios, potentially increasing the precision of the isochron. In a number of lithologies, however, including garnet-bearing eclogites and amphibolites, significant errors may be introduced in the Lu-Hf age using hot plates (low pressure) or short-term microwave dissolution, as compared to those obtained using high-temperature, high-pressure dissolution bombs. These

  11. Selective Sampling Importance Resampling Particle Filter Tracking With Multibag Subspace Restoration.

    PubMed

    Jenkins, Mark David; Barrie, Peter; Buggy, Tom; Morison, Gordon

    2016-12-08

    The focus of this paper is a novel object tracking algorithm that combines an incrementally updated subspace-based appearance model, a reconstruction-error likelihood function, and a two-stage selective sampling importance resampling particle filter with motion estimation through autoregressive filtering techniques. The primary contribution of this paper is the use of multiple bags of subspaces, with which we aim to tackle the issue of appearance model update. The multibag approach allows our algorithm to revert to a previously successful appearance model in the event that the primary model fails. The aim is to eliminate tracker drift by undoing updates to the model that lead to error accumulation, and to redetect targets after periods of occlusion by removing the subspace updates carried out during the occlusion. We compare our algorithm with several state-of-the-art methods and test on a range of challenging, publicly available image sequences. Our findings indicate a significant robustness to drift and occlusion as a result of our multibag approach, and results show that our algorithm competes well with current state-of-the-art algorithms.
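    The paper's two-stage selective resampling is specific to this tracker, but the generic sampling importance resampling (SIR) step underneath can be sketched as follows. Systematic resampling is one common choice; the function name and weights are illustrative assumptions.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: map normalized particle weights to an index
    list in which particle i appears roughly weights[i] * N times."""
    rng = np.random.default_rng(rng)
    n = len(weights)
    positions = (rng.uniform() + np.arange(n)) / n
    return np.searchsorted(np.cumsum(weights), positions)

w = np.array([0.05, 0.05, 0.6, 0.3])   # normalized importance weights
idx = systematic_resample(w)           # particle 2 dominates the new set
print(idx)
```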

  12. Systems and methods for self-synchronized digital sampling

    NASA Technical Reports Server (NTRS)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.

  13. Fe(3+)-Fe(2+) transformation method: an important antioxidant assay.

    PubMed

    Gülçin, İlhami

    2015-01-01

    If we look at the multitude of varied and interesting reactions that constitute biochemistry and bioorganic chemistry, a great many can be classified as either oxidation or reduction reactions. A reducing agent transfers electrons to another substance and is thus itself oxidized; because it donates electrons, it is also called an electron donor. Electron donors can also form charge-transfer complexes with electron acceptors. Reductants in biochemistry are very diverse; ferric ions (Fe(3+)), for example, are readily reduced to ferrous ions (Fe(2+)) by them. Different bioanalytical reduction methods are available, such as the Fe(3+) to Fe(2+) reduction method and the ferric reducing antioxidant power (FRAP) assay. In this section, the Fe(3+)-Fe(2+) transformation is discussed. Recently there has been growing interest in research into the role of plant-derived antioxidants in food and human health. The beneficial influence of many foodstuffs and beverages, including fruits, vegetables, tea, coffee, and cacao, on human health has recently been recognized to originate from their antioxidant activity. For this purpose, the method most commonly used for the in vitro determination of the reducing capacity of pure food constituents or plant extracts is the Fe(3+) reducing ability. This commonly used reducing power method is reviewed and presented in this study, and the general chemistry underlying the assay is clarified. Hence, this overview provides a basis and rationale for developing standardized antioxidant capacity methods for the food, nutraceutical, and dietary supplement industries. In addition, the most important advantages of the method are highlighted, and its chemical principles are outlined and critically discussed.

  14. A field test of cut-off importance sampling for bole volume

    Treesearch

    Jeffrey H. Gove; Harry T. Valentine; Michael J. Holmes

    2000-01-01

    Cut-off importance sampling has recently been introduced as a technique for estimating bole volume to some point below the tree tip, termed the cut-off point. A field test of this technique was conducted on a small population of eastern white pine trees using dendrometry as the standard for volume estimation. Results showed that the differences in volume estimates...

  15. Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability

    DTIC Science & Technology

    2015-07-01

    Importance Sampling in the Evaluation and Optimization of Buffered Failure Probability. Marwan M. Harajli, Graduate Student, Dept. of Civil and Environ..., Seattle, USA; Johannes O. Royset, Associate Professor, Operations Research Dept., Naval Postgraduate School, Monterey, USA. ABSTRACT: Engineering design is... criterion is usually the failure probability. In this paper, we examine the buffered failure probability as an attractive alternative to the failure...

  16. Large Deviations and Importance Sampling for Systems of Slow-Fast Motion

    SciTech Connect

    Spiliopoulos, Konstantinos

    2013-02-15

    In this paper we develop the large deviations principle and a rigorous mathematical framework for asymptotically efficient importance sampling schemes for general, fully dependent systems of stochastic differential equations of slow and fast motion with small noise in the slow component. We assume periodicity with respect to the fast component. Depending on the interaction of the fast scale with the smallness of the noise, we get different behavior. We examine how one range of interaction differs from the other one both for the large deviations and for the importance sampling. We use the large deviations results to identify asymptotically optimal importance sampling schemes in each case. Standard Monte Carlo schemes perform poorly in the small noise limit. In the presence of multiscale aspects one faces additional difficulties and straightforward adaptation of importance sampling schemes for standard small noise diffusions will not produce efficient schemes. It turns out that one has to consider the so called cell problem from the homogenization theory for Hamilton-Jacobi-Bellman equations in order to guarantee asymptotic optimality. We use stochastic control arguments.
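
    The change-of-measure idea behind such schemes can be shown on a toy small-noise diffusion. A minimal sketch, assuming a one-dimensional Ornstein-Uhlenbeck process and a crude constant tilting control u; the paper derives asymptotically optimal controls from the homogenization cell problem, which this sketch does not attempt.

```python
import numpy as np

rng = np.random.default_rng(1)

def rare_event_is(eps=0.05, drift_shift=6.0, n_paths=20_000, n_steps=200, T=1.0, b=1.0):
    """Estimate P(X_T > b) for dX = -X dt + sqrt(eps) dW, X_0 = 0, by
    simulating under a tilted drift sigma*u and correcting with the
    Girsanov likelihood ratio exp(-sum u dW - 0.5 sum u^2 dt)."""
    dt = T / n_steps
    x = np.zeros(n_paths)
    logw = np.zeros(n_paths)
    u = drift_shift                 # constant control: a crude, suboptimal choice
    for _ in range(n_steps):
        xi = rng.standard_normal(n_paths)
        x += (-x + np.sqrt(eps) * u) * dt + np.sqrt(eps * dt) * xi
        logw -= u * np.sqrt(dt) * xi + 0.5 * u * u * dt
    # Weighted indicator recovers the probability under the original measure.
    return np.mean((x > b) * np.exp(logw))

print("IS estimate of P(X_T > 1):", rare_event_is())
```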

  17. Post awakening salivary cortisol secretion and trait well-being: The importance of sample timing accuracy.

    PubMed

    Smyth, Nina; Thorn, Lisa; Hucklebridge, Frank; Evans, Phil; Clow, Angela

    2015-08-01

    Indices of post awakening cortisol secretion (PACS) include the rise in cortisol (cortisol awakening response: CAR) and overall cortisol concentrations (e.g., area under the curve with reference to ground: AUCg) in the first 30-45 min. Both are commonly investigated in relation to psychosocial variables. Although sampling within the domestic setting is ecologically valid, participant non-adherence to the required timing protocol results in erroneous measurement of PACS, and this may explain discrepancies in the literature linking these measures to trait well-being (TWB). We have previously shown that delays of little over 5 min between awakening and the start of sampling result in erroneous CAR estimates. In this study, we report for the first time the negative impact of sample timing inaccuracy (verified by electronic monitoring) on the ability to detect significant relationships between PACS and TWB when measured in the domestic setting. Healthy females (N=49, 20.5±2.8 years) selected for differences in TWB collected saliva samples (S1-4) on 4 days at 0, 15, 30 and 45 min post awakening to determine PACS. Adherence to the sampling protocol was objectively monitored using a combination of electronic estimates of awakening (actigraphy) and sampling times (track caps). Relationships between PACS and TWB were found to depend on sample timing accuracy. Lower TWB was associated with higher post awakening cortisol AUCg in proportion to the mean sample timing accuracy (p<.005). There was no association between TWB and the CAR, even taking into account sample timing accuracy. These results highlight the importance of careful electronic monitoring of participant adherence for measurement of PACS in the domestic setting. Mean sample timing inaccuracy, mainly associated with delays of >5 min between awakening and collection of sample 1 (median=8 min delay), negatively impacts the sensitivity of analyses to detect associations between PACS and TWB.
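
    For readers unfamiliar with the PACS indices, both quantities can be computed directly from the four saliva samples. A minimal sketch with illustrative numbers only; CAR is taken here as the rise from the awakening sample to the peak, which is one common convention.

```python
import numpy as np

# Saliva sampling times (min after awakening) and cortisol (nmol/L);
# values are illustrative, not from the study.
t = np.array([0.0, 15.0, 30.0, 45.0])
cortisol = np.array([5.1, 9.8, 12.4, 10.2])

car = cortisol.max() - cortisol[0]    # cortisol awakening response (rise to peak)
# AUCg: trapezoidal area under the curve with reference to ground (zero).
aucg = np.sum((cortisol[1:] + cortisol[:-1]) / 2.0 * np.diff(t))

print(f"CAR = {car:.1f} nmol/L, AUCg = {aucg:.1f} nmol/L*min")
```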

  18. Monte Carlo dynamically weighted importance sampling for spatial models with intractable normalizing constants

    NASA Astrophysics Data System (ADS)

    Liang, Faming; Cheon, Sooyoung

    2009-12-01

    The problem of simulating from distributions with intractable normalizing constants has received much attention in the recent literature. In this paper, we propose a new MCMC algorithm, the so-called Monte Carlo dynamically weighted importance sampler, for tackling this problem. The new algorithm is illustrated with spatial autologistic models. The novelty of our algorithm is that it allows for the use of Monte Carlo estimates in MCMC simulations, while still leaving the target distribution invariant under the criterion of dynamically weighted importance sampling. Unlike the auxiliary variable MCMC algorithms, the new algorithm removes the need for perfect sampling, and thus can be applied to a wide range of problems for which perfect sampling is not available or very expensive. The new algorithm can also be used for simulating from the incomplete posterior distribution for the missing data problem.
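
    The dynamic weighting move that underlies this kind of sampler can be sketched in a few lines. This is a generic R-type move in the spirit of Wong and Liang's dynamic weighting, not the paper's full algorithm (which additionally plugs Monte Carlo estimates into the ratio r to handle intractable constants); theta, the proposal, and the toy target are illustrative choices, and because weights grow on average, the run is kept short.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

log_target = lambda x: -0.5 * (x - 3.0) ** 2          # N(3, 1), unnormalised

def dw_move(x, w, theta=1.0, step=1.0):
    """Vectorised R-type dynamic weighting move: accept with probability
    w*r/(w*r + theta) and update the importance weight so the weighted
    population stays correct (the IWIW property)."""
    y = x + rng.normal(0.0, step, size=x.shape)
    r = np.exp(log_target(y) - log_target(x))          # symmetric proposal
    a = w * r / (w * r + theta)
    accept = rng.random(x.shape) < a
    x_new = np.where(accept, y, x)
    w_new = np.where(accept, w * r + theta, w * (w * r + theta) / theta)
    return x_new, w_new

# Start from a correctly weighted population drawn from N(0, 2).
x = rng.normal(0.0, 2.0, size=5_000)
w = np.exp(log_target(x)) / norm.pdf(x, 0.0, 2.0)
for _ in range(20):                                    # weights grow ~2x/move,
    x, w = dw_move(x, w)                               # so keep runs short
print("weighted mean ~ 3:", np.sum(w * x) / np.sum(w))
```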

  19. Accounting for sampling patterns reverses the relative importance of trade and climate for the global sharing of exotic plants

    USGS Publications Warehouse

    Sofaer, Helen; Jarnevich, Catherine S.

    2017-01-01

    Aim: The distributions of exotic species reflect patterns of human-mediated dispersal, species climatic tolerances and a suite of other biotic and abiotic factors. The relative importance of each of these factors will shape how the spread of exotic species is affected by ongoing economic globalization and climate change. However, patterns of trade may be correlated with variation in scientific sampling effort globally, potentially confounding studies that do not account for sampling patterns. Location: Global. Time period: Museum records, generally from the 1800s up to 2015. Major taxa studied: Plant species exotic to the United States. Methods: We used data from the Global Biodiversity Information Facility (GBIF) to summarize the number of plant species with exotic occurrences in the United States that also occur in each other country world-wide. We assessed the relative importance of trade and climatic similarity for explaining variation in the number of shared species while evaluating several methods to account for variation in sampling effort among countries. Results: Accounting for variation in sampling effort reversed the relative importance of trade and climate for explaining numbers of shared species. Trade was strongly correlated with numbers of shared U.S. exotic plants between the United States and other countries before, but not after, accounting for sampling variation among countries. Conversely, accounting for sampling effort strengthened the relationship between climatic similarity and species sharing. Using the number of records as a measure of sampling effort provided a straightforward approach for the analysis of occurrence data, whereas species richness estimators and rarefaction were less effective at removing sampling bias. Main conclusions: Our work provides support for broad-scale climatic limitation on the distributions of exotic species, illustrates the need to account for variation in sampling effort in large biodiversity databases, and highlights the...
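
    One straightforward way to "account for sampling patterns" in the sense described is to carry the record count as an offset in a count regression. A hedged sketch on synthetic data with hypothetical variable names; the study's actual models and data differ.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)

# Hypothetical per-country data: trade volume, climatic similarity to the
# US, GBIF record counts (sampling effort), and shared exotic species.
n = 150
trade = rng.lognormal(0.0, 1.0, n)
climate_sim = rng.uniform(0.0, 1.0, n)
effort = rng.lognormal(3.0, 1.0, n)
mu = np.exp(0.2 * np.log(trade) + 1.5 * climate_sim + np.log(effort) - 2.0)
shared = rng.poisson(mu)

# Poisson regression with log(effort) as an offset: the trade and climate
# coefficients are then estimated net of sampling-effort differences.
X = sm.add_constant(np.column_stack([np.log(trade), climate_sim]))
fit = sm.GLM(shared, X, family=sm.families.Poisson(),
             offset=np.log(effort)).fit()
print(fit.params)   # const, log(trade), climate similarity
```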

  20. Passive Samplers for Investigations of Air Quality: Method Description, Implementation, and Comparison to Alternative Sampling Methods

    EPA Science Inventory

    This Paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The Paper also discusses field sampling and sample analysis considerations to ensu...

  2. DOE methods for evaluating environmental and waste management samples.

    SciTech Connect

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  3. Sampling bee communities using pan traps: alternative methods increase sample size

    USDA-ARS?s Scientific Manuscript database

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation has encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  4. Sampling methods for amphibians in streams in the Pacific Northwest.

    Treesearch

    R. Bruce Bury; Paul Stephen. Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  5. A random spatial sampling method in a rural developing nation

    Treesearch

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  6. Uniform Sampling Table Method and its Applications II--Evaluating the Uniform Sampling by Experiment.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Chen, Xuan; Wang, Min; Wang, Wei

    2015-01-01

    A new method of uniform sampling is evaluated in this paper. Items and indexes were adopted to evaluate the rationality of the uniform sampling. The evaluation items included convenience of operation, uniformity of sampling site distribution, and accuracy and precision of measured results. The evaluation indexes included operational complexity, occupation rate of sampling sites in a row and column, relative accuracy of pill weight, and relative deviation of pill weight. They were obtained from three kinds of drugs of different shapes and sizes by four kinds of sampling methods. Gray correlation analysis was adopted to make a comprehensive evaluation against the standard method. The experimental results showed that the convenience score of the uniform sampling method was 1 (100%), the odds ratio of the occupation rate in a row and column was infinity, relative accuracy was 99.50-99.89%, reproducibility RSD was 0.45-0.89%, and the weighted incidence degree exceeded that of the standard method. Hence, the uniform sampling method is easy to operate, and the selected samples are distributed uniformly. The experimental results demonstrated that the uniform sampling method has good accuracy and reproducibility, and can be put into use in drug analysis.

  7. Minimally important difference estimates and methods: a protocol.

    PubMed

    Johnston, Bradley C; Ebrahim, Shanil; Carrasco-Labra, Alonso; Furukawa, Toshi A; Patrick, Donald L; Crawford, Mark W; Hemmelgarn, Brenda R; Schunemann, Holger J; Guyatt, Gordon H; Nesrallah, Gihad

    2015-10-01

    Patient-reported outcomes (PROs) are often the outcomes of greatest importance to patients. The minimally important difference (MID) provides a measure of the smallest change in the PRO that patients perceive as important. An anchor-based approach is the most appropriate method for MID determination. No study or database currently exists that provides all anchor-based MIDs associated with PRO instruments; nor are there any accepted standards for appraising the credibility of MID estimates. Our objectives are to complete a systematic survey of the literature to collect and characterise published anchor-based MIDs associated with PRO instruments used in evaluating the effects of interventions on chronic medical and psychiatric conditions and to assess their credibility. We will search MEDLINE, EMBASE and PsycINFO (1989 to present) to identify studies addressing methods to estimate anchor-based MIDs of target PRO instruments or reporting empirical ascertainment of anchor-based MIDs. Teams of two reviewers will screen titles and abstracts, review full texts of citations, and extract relevant data. On the basis of findings from studies addressing methods to estimate anchor-based MIDs, we will summarise the available methods and develop an instrument addressing the credibility of empirically ascertained MIDs. We will evaluate the credibility of all studies reporting on the empirical ascertainment of anchor-based MIDs using the credibility instrument, and assess the instrument's inter-rater reliability. We will separately present reports for adult and paediatric populations. No research ethics approval was required as we will be using aggregate data from published studies. Our work will summarise anchor-based methods available to establish MIDs, provide an instrument to assess the credibility of available MIDs, determine the reliability of that instrument, and provide a comprehensive compendium of published anchor-based MIDs associated with PRO instruments which will help...

  8. In-depth analysis of sampling optimization methods

    NASA Astrophysics Data System (ADS)

    Lee, Honggoo; Han, Sangjun; Kim, Myoungsoo; Habets, Boris; Buhl, Stefan; Guhlemann, Steffen; Rößiger, Martin; Bellmann, Enrico; Kim, Seop

    2016-03-01

    High order overlay and alignment models require good coverage of overlay or alignment marks on the wafer. But dense sampling plans are not possible for throughput reasons. Therefore, sampling plan optimization has become a key issue. We analyze the different methods for sampling optimization and discuss the different knobs to fine-tune the methods to constraints of high volume manufacturing. We propose a method to judge sampling plan quality with respect to overlay performance, run-to-run stability and dispositioning criteria using a number of use cases from the most advanced lithography processes.

  9. Configurations and calibration methods for passive sampling techniques.

    PubMed

    Ouyang, Gangfeng; Pawliszyn, Janusz

    2007-10-19

    Passive sampling technology has developed very quickly in the past 15 years, and is widely used for the monitoring of pollutants in different environments. The design and quantification of passive sampling devices require an appropriate calibration method. Current calibration methods that exist for passive sampling, including equilibrium extraction, linear uptake, and kinetic calibration, are presented in this review. A number of state-of-the-art passive sampling devices that can be used for aqueous and air monitoring are introduced according to their calibration methods.
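
    Kinetic calibration as referenced here typically rests on a first-order uptake model. A minimal sketch, assuming the common form M(t) = K·V·Cw·(1 − exp(−ke·t)) with illustrative parameter values; real samplers require compound-specific calibration constants, and the symbols here are generic rather than drawn from the review.

```python
import numpy as np

def water_concentration(m_sorbed_ng, K, V_sampler_mL, k_e_per_day, t_days):
    """Kinetic calibration under the first-order uptake model
    M(t) = K * V * Cw * (1 - exp(-ke * t)): invert for the time-weighted
    water concentration Cw. In the linear-uptake regime (ke*t << 1) this
    reduces to Cw = M / (K * V * ke * t); at equilibrium (ke*t >> 1),
    Cw = M / (K * V)."""
    return m_sorbed_ng / (K * V_sampler_mL * (1.0 - np.exp(-k_e_per_day * t_days)))

# Toy numbers for a hypothetical sampler: 120 ng sorbed after 14 days.
print(water_concentration(120.0, K=5_000.0, V_sampler_mL=0.5,
                          k_e_per_day=0.05, t_days=14.0))
```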

  10. Engineering Study of 500 ML Sample Bottle Transportation Methods

    SciTech Connect

    BOGER, R.M.

    1999-08-25

    This engineering study reviews and evaluates all available methods for transporting 500-mL grab sample bottles, reviews and evaluates transportation requirements and schedules, and recommends the most cost-effective method for transporting 500-mL grab sample bottles.

  11. Investigating Test Equating Methods in Small Samples through Various Factors

    ERIC Educational Resources Information Center

    Asiret, Semih; Sünbül, Seçil Ömür

    2016-01-01

    In this study, we aimed to compare equating methods for the random groups design in small samples across factors such as sample size, difficulty difference between forms, and guessing parameter, and to investigate which method gives better results under which conditions. The study used 5,000 dichotomous simulated data…
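
    Of the classical equating methods compared in such studies, linear equating is the easiest to state. A minimal sketch under the random groups design; the simulation conditions of the study itself are not reproduced, and the score values are made up.

```python
import numpy as np

def linear_equate(x, mean_x, sd_x, mean_y, sd_y):
    """Linear equating of a score x on form X to the scale of form Y:
    l(x) = mu_Y + (sigma_Y / sigma_X) * (x - mu_X). With small samples the
    moment estimates themselves are noisy, which is what drives the
    comparisons in studies like this one."""
    return mean_y + (sd_y / sd_x) * (x - mean_x)

rng = np.random.default_rng(4)
form_x = rng.normal(50.0, 10.0, size=30)      # small random groups
form_y = rng.normal(53.0, 11.0, size=30)
print(linear_equate(55.0, form_x.mean(), form_x.std(ddof=1),
                    form_y.mean(), form_y.std(ddof=1)))
```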

  12. GROUND WATER PURGING AND SAMPLING METHODS: HISTORY VS. HYSTERIA

    EPA Science Inventory

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  13. Rapid method for sampling metals for materials identification

    NASA Technical Reports Server (NTRS)

    Higgins, L. E.

    1971-01-01

    Nondamaging process similar to electrochemical machining is useful in obtaining metal samples from places inaccessible to conventional sampling methods or where methods would be hazardous or contaminating to specimens. Process applies to industries where metals or metal alloys play a vital role.

  14. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    SciTech Connect

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample accurately and efficiently representing the posterior with a limited number of forward simulations.
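
    The adaptation loop of a Gaussian-mixture importance sampler can be sketched compactly. The toy below, assuming a synthetic bimodal target, adapts the mixture by refitting it to importance-resampled draws; the paper's algorithm additionally builds a polynomial chaos surrogate for the forward model, which is omitted here.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(5)

# Bimodal 2-D "posterior" standing in for a multimodal Bayesian target.
def log_target(x):
    p1 = multivariate_normal.logpdf(x, mean=[-2.0, 0.0])
    p2 = multivariate_normal.logpdf(x, mean=[3.0, 1.0])
    return np.logaddexp(p1 + np.log(0.4), p2 + np.log(0.6))

gm = GaussianMixture(n_components=2, covariance_type="full")
draws = rng.normal(0.0, 4.0, size=(2_000, 2))     # broad initial proposal
for _ in range(5):
    gm.fit(draws)
    draws = gm.sample(2_000)[0]
    # Importance weights: target density over mixture-proposal density.
    logw = log_target(draws) - gm.score_samples(draws)
    w = np.exp(logw - logw.max()); w /= w.sum()
    # Adapt: refit the mixture to importance-resampled points.
    draws = draws[rng.choice(len(draws), size=len(draws), p=w)]

print("mixture means:\n", gm.means_)   # should approach the two modes
```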

  17. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    2001-01-01

    A method and apparatus for imaging a sample are provided. An electromagnetic radiation source generates excitation radiation which is sized by excitation optics to a line. The line is directed at a sample resting on a support and excites a plurality of regions on the sample. Collection optics collect response radiation reflected from the sample and image the reflected radiation. A detector senses the reflected radiation and is positioned to permit discrimination between radiation reflected from a certain focal plane in the sample and certain other planes within the sample.

  18. [Clinical importance and diagnostic methods of minimal hepatic encephalopathy].

    PubMed

    Stawicka, Agnieszka; Zbrzeźniak, Justyna; Świderska, Aleksandra; Kilisińska, Natalia; Świderska, Magdalena; Jaroszewicz, Jerzy; Flisiak, Robert

    2016-02-01

    Minimal hepatic encephalopathy (MHE) encompasses a number of neuropsychological and neurophysiological disorders in patients suffering from liver cirrhosis who do not display abnormalities during a medical interview or physical examination. A negative influence of MHE on the quality of life of patients with liver cirrhosis has been confirmed, including impairment of the ability to operate motor vehicles and disruption of multiple health-related areas, as well as of functioning in society. The data on the frequency of traffic offences and accidents among patients diagnosed with MHE, in comparison to patients with liver cirrhosis without MHE as well as healthy persons, are alarming. These patients are unaware of their disorder and of the impairment of their ability to operate vehicles, so it is of utmost importance to identify this group. The term minimal hepatic encephalopathy (formerly "subclinical" encephalopathy) erroneously suggested that diagnostic and therapeutic procedures are unnecessary in patients with liver cirrhosis. Diagnosing MHE is an important predictive factor for the occurrence of overt encephalopathy: more than 50% of patients with this diagnosis develop overt encephalopathy within 30 months. Early diagnosis of MHE offers a chance to implement treatment that can prevent overt encephalopathy. Owing to the continuing lack of clinical research, there are no commonly agreed-upon standards for the definition, diagnostics, classification and treatment of hepatic encephalopathy. This article introduces the newest findings regarding the importance of MHE and scientific recommendations, and provides detailed descriptions of the most valuable diagnostic methods.

  19. Performance evaluation of an importance sampling technique in a Jackson network

    NASA Astrophysics Data System (ADS)

    Mahdipour, Ebrahim; Rahmani, Amir Masoud; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The article applies strict deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We have estimated the probability of network blocking for various sets of parameters, and also the probability of missing the deadline of customers for different loads and deadlines. We finally show that the probability of total population overflow may be affected by various deadline values, service rates and arrival rates.
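
    The change of measure that makes such overflow events common can be shown on a single M/M/1 node; the paper's setting, a two-node Jackson network with feedback and Markov-modulated rates, is considerably more elaborate. A minimal sketch: swap arrival and service probabilities in the embedded random walk and accumulate the likelihood ratio.

```python
import numpy as np

rng = np.random.default_rng(6)

def overflow_prob_is(lam=0.3, mu=1.0, N=20, n_runs=20_000):
    """Estimate P(queue reaches N before emptying | start at 1) for an
    M/M/1 queue via its embedded random walk. The classic change of
    measure swaps arrival and service rates, making overflow likely;
    each step multiplies the likelihood ratio by f(step)/g(step)."""
    p = lam / (lam + mu)            # original up-step probability
    q = 1.0 - p
    est = 0.0
    for _ in range(n_runs):
        level, w = 1, 1.0
        while 0 < level < N:
            if rng.random() < q:    # IS dynamics: up-steps with prob q
                level += 1
                w *= p / q          # original/IS probability of an up-step
            else:
                level -= 1
                w *= q / p
        if level == N:
            est += w
    return est / n_runs

print("IS estimate:", overflow_prob_is())
# Exact gambler's-ruin value for comparison (rho = lam/mu = 0.3):
rho = 0.3
print("exact     :", (1 - 1 / rho) / (1 - (1 / rho) ** 20))
```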

  20. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  1. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  2. The Importance of Sample Processing in Analysis of Asbestos Content in Rocks and Soils

    NASA Astrophysics Data System (ADS)

    Neumann, R. D.; Wright, J.

    2012-12-01

    Analysis of asbestos content in rocks and soils using Air Resources Board (ARB) Test Method 435 (M435) involves the processing of samples for subsequent analysis by polarized light microscopy (PLM). The use of different equipment and procedures by commercial laboratories to pulverize rock and soil samples could result in different particle size distributions. It has long been theorized that asbestos-containing samples can be over-pulverized to the point where the particle dimensions of the asbestos no longer meet the required 3:1 length-to-width aspect ratio or the particles become so small that they can no longer be tested for optical characteristics using PLM, where maximum PLM magnification is typically 400X. Recent work has shed some light on this issue. ARB staff conducted an interlaboratory study to investigate variability in preparation and analytical procedures used by laboratories performing M435 analysis. With regard to sample processing, ARB staff found that different pulverization equipment and processing procedures produced powders that have varying particle size distributions. PLM analysis of the finest powders produced by one laboratory showed all but one of the 12 samples were non-detect or below the PLM reporting limit, in contrast to the other 36 coarser samples from the same field sample processed by three other laboratories, where 21 samples were above the reporting limit. The set of 12 exceptionally fine powder samples produced by the same laboratory was re-analyzed by transmission electron microscopy (TEM), and results showed that these samples contained asbestos above the TEM reporting limit. However, the use of TEM as a stand-alone analytical procedure, usually performed at magnifications between 3,000 and 20,000X, also has its drawbacks because of the miniscule mass of sample that this method examines. The small amount of powder analyzed by TEM may not be representative of the field sample. The actual mass of the sample powder analyzed by...

  3. Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations

    SciTech Connect

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-10-13

    Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite: a single cellulose sponge samples multiple coupons; 2) single medium multi-pass composite: a single cellulose sponge is used to sample multiple coupons; and 3) multi-medium post-sample composite: a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2, respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared to single medium composite from both clean and grime coated materials. RE with the PSC method for concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wall board, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when...

  4. Improved importance sampling technique for efficient simulation of digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal random memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and IIS over CIS for simulations of digital communications systems.
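
    The translation idea attributed to IIS here, shifting the noise density toward the error region rather than scaling it, can be demonstrated on a Gaussian tail probability. A minimal sketch; the paper's linear systems with memory and signals are not modeled.

```python
import numpy as np

rng = np.random.default_rng(7)

def q_mc(t, n):
    """Plain Monte Carlo estimate of the tail probability Q(t) = P(N > t)."""
    return np.mean(rng.standard_normal(n) > t)

def q_iis(t, n):
    """Importance sampling with a translated Gaussian: sample N(t, 1) so
    threshold crossings are common, then unweight each sample by the
    likelihood ratio phi(x)/phi(x - t) = exp(-t*x + t^2/2)."""
    x = rng.standard_normal(n) + t
    w = np.exp(-t * x + 0.5 * t * t)
    return np.mean((x > t) * w)

t, n = 5.0, 100_000                      # true Q(5) is about 2.87e-7
print("MC :", q_mc(t, n))                # almost surely 0 at this n
print("IIS:", q_iis(t, n))               # close to the true value
```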

  6. Burnout and Engagement: Relative Importance of Predictors and Outcomes in Two Health Care Worker Samples.

    PubMed

    Fragoso, Zachary L; Holcombe, Kyla J; McCluney, Courtney L; Fisher, Gwenith G; McGonagle, Alyssa K; Friebe, Susan J

    2016-06-09

    This study's purpose was twofold: first, to examine the relative importance of job demands and resources as predictors of burnout and engagement, and second, to examine the relative importance of engagement and burnout as predictors of health, depressive symptoms, work ability, organizational commitment, and turnover intentions in two samples of health care workers. Nurse leaders (n = 162) and licensed emergency medical technicians (EMTs; n = 102) completed surveys. In both samples, job demands predicted burnout more strongly than job resources, and job resources predicted engagement more strongly than job demands. Engagement held more weight than burnout for predicting commitment, and burnout held more weight for predicting health outcomes, depressive symptoms, and work ability. Results have implications for the design, evaluation, and effectiveness of workplace interventions to reduce burnout and improve engagement among health care workers. Actionable recommendations for increasing engagement and decreasing burnout in health care organizations are provided.

  7. An antithetic variate to facilitate upper-stem height measurements for critical height sampling with importance sampling

    Treesearch

    Thomas B. Lynch; Jeffrey H. Gove

    2013-01-01

    Critical height sampling (CHS) estimates cubic volume per unit area by multiplying the sum of critical heights measured on trees tallied in a horizontal point sample (HPS) by the HPS basal area factor. One of the barriers to practical application of CHS is the fact that trees near the field location of the point-sampling sample point have critical heights that occur...
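
    The antithetic-variate principle invoked in the title can be illustrated on a generic integral; this is not the forestry estimator itself. A minimal sketch: pairing u with 1 − u cancels much of the variance for monotone integrands, at two function evaluations per pair.

```python
import numpy as np

rng = np.random.default_rng(8)

f = lambda u: np.exp(u)          # any monotone integrand on [0, 1]

n = 10_000
u = rng.random(n)
plain = f(u)                               # crude Monte Carlo
anti = 0.5 * (f(u) + f(1.0 - u))           # antithetic pairs (u, 1 - u)

print("true value:", np.e - 1.0)
# Note: each antithetic pair costs two evaluations, so the fair comparison
# is variance per evaluation; the reduction here far exceeds that factor.
print("plain  mean %.5f  var %.2e" % (plain.mean(), plain.var() / n))
print("antith mean %.5f  var %.2e" % (anti.mean(), anti.var() / n))
```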

  8. [Current methods for preparing samples on working with hematology analyzers].

    PubMed

    Tsyganova, A V; Pogorelov, V M; Naumova, I N; Kozinets, G I; Antonov, V S

    2011-03-01

    The paper raises a problem of preparing samples in hematology. It considers whether the preanalytical stage is of importance in hematological studies. The use of disposal vacuum blood collection systems is shown to solve the problem in the standardization of a blood sampling procedure. The benefits of the use of close tube hematology analyzers are also considered.

  9. Importance sampling allows Hd true tests of highly discriminating DNA profiles.

    PubMed

    Taylor, Duncan; Curran, James M; Buckleton, John

    2017-03-01

    Hd true testing is a way of assessing the performance of a model, or DNA profile interpretation system. These tests involve simulating DNA profiles of non-donors to a DNA mixture and calculating a likelihood ratio (LR) with one proposition postulating their contribution and the alternative postulating their non-contribution. Following Turing, it is possible to predict that "The average LR for the Hd true tests should be one" [1]. This suggests a way of validating software. During discussions on the ISFG software validation guidelines [2] it was argued by some that this prediction had not been sufficiently examined experimentally to serve as a criterion for validation. More recently a high profile report [3] has emphasised large scale empirical examination. A limitation of Hd true tests, when non-donor profiles are generated at random (or in accordance with expectation from allele frequencies), is that the number of tests required depends on the discrimination power of the evidence profile. If the Hd true tests are to fully explore the genotype space that yields non-zero LRs, then the number of simulations required could span tens of orders of magnitude (well outside practical computing limits). We describe here the use of importance sampling, which allows rare events to be simulated more often than they would occur at random, with this bias adjusted for at the end of the simulation in order to recover all diagnostic values of interest. Importance sampling, whilst having been employed by others for Hd true tests, is largely unknown in forensic genetics. We take time in this paper to explain how importance sampling works, the advantages of using it and its application to Hd true tests. We conclude by showing that employing an importance sampling scheme brings Hd true testing ability to all profiles, regardless of discrimination power. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
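
    The point that importance sampling makes Hd-true testing feasible for highly discriminating profiles can be seen in a one-marker toy. A hedged sketch with a made-up genotype frequency table: the matching genotype is so rare that plain sampling hardly ever produces a nonzero LR, while sampling from a biased proposal and reweighting recovers E[LR] = 1 cheaply.

```python
import numpy as np

rng = np.random.default_rng(9)

# Toy single-marker model: genotype i has population frequency freqs[i];
# the "suspect" genotype (index 3) is rare. Frequencies are invented.
freqs = np.array([0.6, 0.3, 0.0999, 1e-4]); g0 = 3
lr = np.where(np.arange(4) == g0, 1.0 / freqs[g0], 0.0)   # LR for a non-donor

# Plain Hd-true test: E[LR] = 1, but matches occur once per ~10,000 draws.
n = 100_000
mc = lr[rng.choice(4, size=n, p=freqs)].mean()

# Importance sampling: force the rare matching genotype to appear often,
# then downweight each draw by population/proposal probability.
prop = np.array([0.2, 0.2, 0.1, 0.5])
g = rng.choice(4, size=n, p=prop)
is_est = np.mean(lr[g] * freqs[g] / prop[g])

print("plain MC mean LR:", mc, " IS mean LR:", is_est)    # both ~ 1
```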

  10. Orientation sampling for dictionary-based diffraction pattern indexing methods

    NASA Astrophysics Data System (ADS)

    Singh, S.; De Graef, M.

    2016-12-01

    A general framework for dictionary-based indexing of diffraction patterns is presented. A uniform sampling method of orientation space using the cubochoric representation is introduced and used to derive an empirical relation between the average disorientation between neighboring sampling points and the number of grid points sampled along the semi-edge of the cubochoric cube. A method to uniformly sample misorientation iso-surfaces is also presented. This method is used to show that the dot product serves as a proxy for misorientation. Furthermore, it is shown that misorientation iso-surfaces in Rodrigues space are quadratic surfaces. Finally, using the concept of Riesz energies, it is shown that the sampling method results in a near optimal covering of orientation space.
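
    As a point of contrast with the deterministic cubochoric grid described here, uniform random orientations are easy to generate, and the dot-product proxy for misorientation is easy to state. A sketch using Shoemake's quaternion method, ignoring crystal symmetry; it is not the paper's sampling scheme.

```python
import numpy as np

rng = np.random.default_rng(10)

def random_quaternions(n):
    """Uniform random rotations via Shoemake's method: an alternative to
    grid-based (e.g. cubochoric) sampling when only random coverage of
    orientation space is needed."""
    u1, u2, u3 = rng.random((3, n))
    return np.stack([np.sqrt(1 - u1) * np.sin(2 * np.pi * u2),
                     np.sqrt(1 - u1) * np.cos(2 * np.pi * u2),
                     np.sqrt(u1) * np.sin(2 * np.pi * u3),
                     np.sqrt(u1) * np.cos(2 * np.pi * u3)], axis=1)

def misorientation_deg(q1, q2):
    """Rotation angle between two orientations (no crystal symmetry):
    theta = 2*arccos(|q1 . q2|), so the dot product is a monotone proxy
    for misorientation, as exploited by dictionary indexing."""
    d = np.clip(np.abs(np.sum(q1 * q2, axis=-1)), 0.0, 1.0)
    return np.degrees(2.0 * np.arccos(d))

q = random_quaternions(5)
print(misorientation_deg(q[:-1], q[1:]))
```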

  11. Importance of sample form and surface temperature for analysis by ambient plasma mass spectrometry (PADI).

    PubMed

    Salter, Tara La Roche; Bunch, Josephine; Gilmore, Ian S

    2014-09-16

    Many different types of samples have been analyzed in the literature using plasma-based ambient mass spectrometry sources; however, comprehensive studies of the important parameters for analysis are only just beginning. Here, we investigate the effect of the sample form and surface temperature on the signal intensities in plasma-assisted desorption ionization (PADI). The form of the sample is very important, with powders of all volatilities effectively analyzed. However, for the analysis of thin films at room temperature and using a low plasma power, a vapor pressure of greater than 10(-4) Pa is required to achieve a sufficiently good quality spectrum. Using thermal desorption, we are able to increase the signal intensity of less volatile materials with vapor pressures less than 10(-4) Pa, in thin film form, by between 4 and 7 orders of magnitude. This is achieved by increasing the temperature of the sample up to a maximum of 200 °C. Thermal desorption can also increase the signal intensity for the analysis of powders.

  12. Method for using polarization gating to measure a scattering sample

    DOEpatents

    Baba, Justin S.

    2015-08-04

    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.
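
    The quantity at the center of such polarization-gating measurements, the degree of polarization, has a standard Stokes-vector formula; the generic formula below is textbook optics, not quoted from the patent, and the numbers are invented.

```python
import numpy as np

def degree_of_polarization(I, Q, U, V):
    """Degree of polarization from a Stokes vector (I, Q, U, V):
    DOP = sqrt(Q^2 + U^2 + V^2) / I. Multiply-scattered light is
    depolarized, so a lower DOP indicates more scattering."""
    return np.sqrt(Q**2 + U**2 + V**2) / I

# Toy numbers: light emerging from a weakly vs strongly scattering sample.
print(degree_of_polarization(1.0, 0.85, 0.10, 0.02))   # mostly polarized
print(degree_of_polarization(1.0, 0.15, 0.05, 0.01))   # heavily depolarized
```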

  13. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
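
    The three interval methods compared in this simulation are easy to re-implement in miniature. A hedged sketch with arbitrary event parameters (not the study's conditions), reproducing the methods' well-known biases: partial-interval recording overestimates, whole-interval recording underestimates, and momentary time sampling is roughly unbiased.

```python
import numpy as np

rng = np.random.default_rng(11)

def simulate_methods(obs_s=600, interval_s=10, n_events=20, event_s=3.0):
    """Place events of fixed duration at random in an observation period,
    then score momentary time sampling (MTS), partial-interval (PIR) and
    whole-interval (WIR) recording against the true % of time occupied."""
    starts = rng.uniform(0, obs_s - event_s, n_events)
    occupied = lambda t: np.any((starts <= t) & (t < starts + event_s))
    edges = np.arange(0, obs_s, interval_s)
    grid = np.linspace(0, obs_s, 60_000)
    busy = np.array([occupied(t) for t in grid])
    true_pct = busy.mean() * 100
    mts = np.mean([occupied(e + interval_s) for e in edges]) * 100
    pir = np.mean([busy[(grid >= e) & (grid < e + interval_s)].any() for e in edges]) * 100
    wir = np.mean([busy[(grid >= e) & (grid < e + interval_s)].all() for e in edges]) * 100
    return true_pct, mts, pir, wir

print("true, MTS, PIR, WIR:", simulate_methods())
```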

  14. [Recent advances in sample preparation methods of plant hormones].

    PubMed

    Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng

    2014-04-01

    Plant hormones are a group of naturally occurring trace substances which play a crucial role in controlling the plant development, growth and environment response. With the development of the chromatography and mass spectroscopy technique, chromatographic analytical method has become a widely used way for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones including acidic plant hormones & basic plant hormones, brassinosteroids and plant polypeptides, the sample preparation methods are reviewed in sequence especially the recently developed methods. The review includes novel methods, devices, extractive materials and derivative reagents for sample preparation of phytohormones analysis. Especially, some related works of our group are included. At last, the future developments in this field are also prospected.

  15. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    PubMed

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. However, we try to prove its validity with populations that are not within a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The presentation of the study shows the utility of this type of sampling when the population is accessible but lacks a sampling frame. However, the sample obtained is not a statistically representative random sample of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates the target population but is not identical to it.
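
    Although this paper is methodological, data collected by RDS are usually analyzed with inverse-degree weighting. A minimal sketch of the standard RDS-II (Volz-Heckathorn) estimator with made-up toy data; the paper itself does not present this estimator.

```python
import numpy as np

def rds_ii(trait, degree):
    """RDS-II (Volz-Heckathorn) estimator: respondents recruited through
    network links are sampled roughly proportionally to their degree, so
    each is weighted by 1/degree to correct that bias."""
    w = 1.0 / np.asarray(degree, dtype=float)
    return np.sum(w * np.asarray(trait)) / np.sum(w)

# Toy data: does the respondent use drug X (1/0), and personal network size.
trait  = [1, 0, 1, 1, 0, 0, 1, 0]
degree = [20, 5, 30, 25, 4, 6, 40, 5]
print("naive prevalence:", np.mean(trait))
print("RDS-II estimate :", rds_ii(trait, degree))
```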

  16. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative; that it minimally over-estimates the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical Tolerance Intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
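
    One of the better-performing representations mentioned, the statistical tolerance interval, can be computed for normal data with Howe's approximation. A minimal sketch; the report's sparse-data kernel method is not reproduced here, and the sample is synthetic.

```python
import numpy as np
from scipy import stats

def normal_tolerance_interval(x, coverage=0.95, confidence=0.95):
    """Two-sided normal tolerance interval via Howe's approximation:
    bounds mean +/- k*sd chosen so that, with the stated confidence,
    they bracket at least `coverage` of the underlying population -- a
    deliberately conservative representation for sparse samples."""
    x = np.asarray(x, dtype=float)
    n, df = len(x), len(x) - 1
    z = stats.norm.ppf(0.5 + coverage / 2.0)
    chi2 = stats.chi2.ppf(1.0 - confidence, df)     # lower chi-square quantile
    k = np.sqrt(df * (1.0 + 1.0 / n) * z**2 / chi2)
    return x.mean() - k * x.std(ddof=1), x.mean() + k * x.std(ddof=1)

rng = np.random.default_rng(12)
sample = rng.normal(10.0, 2.0, size=8)              # only eight observations
print(normal_tolerance_interval(sample))
```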

  17. The importance of a priori sample size estimation in strength and conditioning research.

    PubMed

    Beck, Travis W

    2013-08-01

    The statistical power, or sensitivity of an experiment, is defined as the probability of rejecting a false null hypothesis. Only 3 factors can affect statistical power: (a) the significance level (α), (b) the magnitude or size of the treatment effect (effect size), and (c) the sample size (n). Of these 3 factors, only the sample size can be manipulated by the investigator because the significance level is usually selected before the study, and the effect size is determined by the effectiveness of the treatment. Thus, selection of an appropriate sample size is one of the most important components of research design but is often misunderstood by beginning researchers. The purpose of this tutorial is to describe procedures for estimating sample size for a variety of different experimental designs that are common in strength and conditioning research. Emphasis is placed on selecting an appropriate effect size because this step fully determines sample size when power and the significance level are fixed. There are many different software packages that can be used for sample size estimation. However, I chose to describe the procedures for the G*Power software package (version 3.1.4) because this software is freely downloadable and capable of estimating sample size for many of the different statistical tests used in strength and conditioning research. Furthermore, G*Power provides a number of different auxiliary features that can be useful for researchers when designing studies. It is my hope that the procedures described in this article will be beneficial for researchers in the field of strength and conditioning.
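
    The same a priori computation the article performs in G*Power can be done programmatically. A minimal sketch using statsmodels' power module for an independent-groups t-test, which reproduces the familiar n ≈ 64 per group for d = 0.5, alpha = 0.05, power = 0.80.

```python
from statsmodels.stats.power import TTestIndPower

# A priori sample size for an independent-groups t-test: medium effect
# (Cohen's d = 0.5), two-sided alpha = 0.05, desired power = 0.80.
analysis = TTestIndPower()
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80,
                                   alternative="two-sided")
print(f"n per group: {n_per_group:.1f}")   # ~64, matching G*Power's output
```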

  18. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473
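
    The point-generation step of such a design reduces to drawing uniform random survey points inside a study-area polygon. A minimal sketch using rejection sampling with shapely (an assumed dependency); the boundary coordinates are hypothetical, and the full GIS/GPS workflow of the paper is not reproduced.

```python
import numpy as np
from shapely.geometry import Point, Polygon

rng = np.random.default_rng(13)

def random_points_in_polygon(poly, n):
    """Rejection-sample n uniform random survey points inside a study-area
    polygon (e.g., a village boundary digitized from satellite imagery)."""
    minx, miny, maxx, maxy = poly.bounds
    pts = []
    while len(pts) < n:
        p = Point(rng.uniform(minx, maxx), rng.uniform(miny, maxy))
        if poly.contains(p):
            pts.append((p.x, p.y))
    return np.array(pts)

# Hypothetical study-area boundary in projected coordinates (metres).
boundary = Polygon([(0, 0), (1_000, 0), (1_200, 800), (400, 1_100), (0, 600)])
print(random_points_in_polygon(boundary, 5))
```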

  19. Transuranic waste characterization sampling and analysis methods manual

    SciTech Connect

    1995-05-01

    The Transuranic Waste Characterization Sampling and Analysis Methods Manual (Methods Manual) provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program). This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP.

  20. [Weighted estimation methods for multistage sampling survey data].

    PubMed

    Hou, Xiao-Yan; Wei, Yong-Yue; Chen, Feng

    2009-06-01

    Multistage sampling techniques are widely applied in cross-sectional epidemiological studies, yet methods based on an independence assumption are still often used to analyze such complex survey data. This paper introduces the application of weighted estimation methods for complex survey data. A brief overview of the basic theory is given, and a practical analysis illustrates the weighted estimation algorithm on data from a stratified two-stage clustered sampling design. For multistage sampling survey data, weighted estimation methods yield unbiased point estimates and more reasonable variance estimates, and thus support proper statistical inference by correcting for the effects of clustering, stratification and unequal selection probabilities.
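
    The weighted point and variance estimation described can be sketched with the usual ultimate-cluster approximation. A hedged sketch, assuming inverse-inclusion-probability weights and made-up toy data for a stratified two-stage design; survey packages implement refinements this omits.

```python
import numpy as np

def weighted_mean_and_se(y, w, strata, psu):
    """Design-weighted mean with a between-PSU variance estimate for a
    stratified multistage sample: weights w are inverse inclusion
    probabilities; variance is computed from weighted residual totals per
    primary sampling unit within each stratum (the 'ultimate cluster'
    approximation)."""
    y, w = np.asarray(y, float), np.asarray(w, float)
    mean = np.sum(w * y) / np.sum(w)
    var = 0.0
    for s in np.unique(strata):
        m = strata == s
        zs = np.array([np.sum(w[m & (psu == u)] * (y[m & (psu == u)] - mean))
                       for u in np.unique(psu[m])])
        nh = len(zs)
        if nh > 1:
            var += nh / (nh - 1) * np.sum((zs - zs.mean()) ** 2)
    return mean, np.sqrt(var) / np.sum(w)

# Toy stratified two-stage sample: 2 strata x 3 PSUs, unequal weights.
y      = np.array([3.1, 2.8, 3.5, 4.0, 4.4, 3.9, 5.1, 4.8, 5.5, 5.0, 4.2, 4.6])
w      = np.array([10, 10, 12, 12, 8, 8, 15, 15, 9, 9, 11, 11], float)
strata = np.array([1, 1, 1, 1, 1, 1, 2, 2, 2, 2, 2, 2])
psu    = np.array([1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6])
print(weighted_mean_and_se(y, w, strata, psu))
```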

  1. The Importance of Meteorite Collections to Sample Return Missions: Past, Present, and Future Considerations

    NASA Technical Reports Server (NTRS)

    Welzenbach, L. C.; McCoy, T. J.; Glavin, D. P.; Dworkin, J. P.; Abell, P. A.

    2012-01-01

    ...turn led to a new wave of Mars exploration that ultimately could lead to sample return focused on evidence for past or present life. This partnership between collections and missions will be increasingly important in the coming decades as we discover new questions to be addressed and identify targets for both robotic and human exploration. Nowhere is this more true than in the ultimate search for the abiotic and biotic processes that produced life. Existing collections also provide the essential materials for developing and testing new analytical schemes to detect the rare markers of life and distinguish them from abiotic processes. Large collections of meteorites, and the new types being identified within these collections, which come to us at a fraction of the cost of a sample return mission, will continue to shape the objectives of future missions and provide new ways of interpreting returned samples.

  2. Integration of sample analysis method (SAM) for polychlorinated biphenyls

    SciTech Connect

    Monagle, M.; Johnson, R.C.

    1996-05-01

    A completely integrated Sample Analysis Method (SAM) has been tested as part of the Contaminant Analysis Automation program. The SAM system was tested for polychlorinated biphenyl samples using five Standard Laboratory Modules™: two Soxtec™ modules, a high volume concentrator module, a generic materials handling module, and the gas chromatographic module. With over 300 samples completed within the first phase of the validation, recovery and precision data were comparable to manual methods. Based on experience derived from the first evaluation of the automated system, efforts are underway to improve sample recoveries and integrate a sample cleanup procedure. In addition, initial work in automating the extraction of semivolatile samples using this system will also be discussed.

  3. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall...
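
    The 'basic' two-stage scheme described can be simulated in a few lines. A hedged sketch: the ±0.05 early-stopping margin, the sample size, and the 10% threshold are illustrative guesses, not the paper's calibrated decision rules.

```python
import numpy as np

rng = np.random.default_rng(14)

def basic_sequential_scheme(herd_prev, full_n=60, threshold=0.10):
    """Sketch of a two-stage scheme: score half the fixed-size sample;
    stop early if the farm is clearly below/above the lameness threshold,
    otherwise score the second half and decide on the pooled estimate."""
    half = full_n // 2
    stage1 = rng.random(half) < herd_prev
    p1 = stage1.mean()
    if abs(p1 - threshold) > 0.05:               # clear pass/fail: stop early
        return p1 > threshold, half
    stage2 = rng.random(half) < herd_prev
    return np.concatenate([stage1, stage2]).mean() > threshold, full_n

# Average sample size over many simulated farms at 5% true prevalence.
results = [basic_sequential_scheme(0.05) for _ in range(10_000)]
frac_bad = np.mean([r[0] for r in results])
avg_n = np.mean([r[1] for r in results])
print(f"classified 'bad': {frac_bad:.3f}, average sample size: {avg_n:.1f}")
```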

  4. A LITERATURE REVIEW OF WIPE SAMPLING METHODS FOR CHEMICAL WARFARE AGENTS AND TOXIC INDUSTRIAL CHEMICALS

    EPA Science Inventory

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, we...

  6. Capillary microextraction: A new method for sampling methamphetamine vapour.

    PubMed

    Nair, M V; Miskelly, G M

    2016-11-01

    Clandestine laboratories pose a serious health risk to first responders, investigators, decontamination companies, and the public who may be inadvertently exposed to methamphetamine and other chemicals used in its manufacture. Therefore there is an urgent need for reliable methods to detect and measure methamphetamine at such sites. The most common method for determining methamphetamine contamination at former clandestine laboratory sites is selected surface wipe sampling, followed by analysis with gas chromatography-mass spectrometry (GC-MS). We are investigating the use of sampling for methamphetamine vapour to complement such wipe sampling. In this study, we report the use of capillary microextraction (CME) devices for sampling airborne methamphetamine, and compare their sampling efficiency with a previously reported dynamic SPME method. The CME devices consisted of PDMS-coated glass filter strips inside a glass tube. The devices were used to dynamically sample methamphetamine vapour in the range of 0.42-4.2 μg m⁻³, generated by a custom-built vapour dosing system, for 1-15 min, and methamphetamine was analysed using a GC-MS fitted with a ChromatoProbe thermal desorption unit. The devices showed good reproducibility (RSD < 15%), and a curvilinear pre-equilibrium relationship between sampling times and peak area, which can be utilised for calibration. Under identical sampling conditions, the CME devices were approximately 30 times more sensitive than the dynamic SPME method. The CME devices could be stored for up to 3 days after sampling prior to analysis. Consecutive sampling of methamphetamine and its isotopic substitute, d-9 methamphetamine, showed no competitive displacement. This suggests that CME devices, pre-loaded with an internal standard, could be a feasible method for sampling airborne methamphetamine at former clandestine laboratories.

  7. A random spatial sampling method in a rural developing nation.

    PubMed

    Kondo, Michelle C; Bream, Kent D W; Barg, Frances K; Branas, Charles C

    2014-04-10

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Random spatial sampling methodology can be used to survey a random sample of population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available.
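
    As a rough illustration of the approach (not the authors' GIS workflow), the Python sketch below draws uniform random survey points within per-stratum bounding boxes. The stratum names, coordinates and point allocations are hypothetical; a real application would test each point against digitized polygon boundaries rather than a bounding box.

      import random

      rng = random.Random(42)

      # Hypothetical strata, each approximated by a bounding box of
      # (lat_min, lat_max, lon_min, lon_max).
      strata = {
          "village_core": (14.60, 14.62, -91.52, -91.50),
          "outlying_area": (14.58, 14.66, -91.56, -91.46),
      }

      def random_points(bbox, n):
          # Draw n uniform random survey points inside the bounding box.
          lat0, lat1, lon0, lon1 = bbox
          return [(rng.uniform(lat0, lat1), rng.uniform(lon0, lon1))
                  for _ in range(n)]

      # Illustrative allocation of 25 points per stratum; the surveyor then
      # navigates to each point by GPS and selects the nearest household.
      sample = {name: random_points(box, 25) for name, box in strata.items()}
      for name, pts in sample.items():
          print(name, pts[:2])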

  8. Optimized method for dissolved hydrogen sampling in groundwater.

    PubMed

    Alter, Marcus D; Steiof, Martin

    2005-06-01

    Dissolved hydrogen concentrations are used to characterize redox conditions of contaminated aquifers. The currently accepted and recommended bubble strip method for hydrogen sampling (Wiedemeier et al., 1998) requires relatively long sampling times and immediate field analysis. In this study we present methods for optimized sampling and for sample storage. The bubble strip sampling method was examined for various flow rates, bubble sizes (headspace volume in the sampling bulb) and two different H2 concentrations. The results were compared to a theoretical equilibration model. Turbulent flow in the sampling bulb was optimized for gas transfer by reducing the inlet diameter. Extraction with a 5 mL headspace volume and flow rates higher than 100 mL/min resulted in 95-100% equilibrium within 10-15 min. To investigate sample storage, gas samples from the sampling bulb were kept in headspace vials for varying periods. Hydrogen samples (4.5 ppmv, corresponding to 3.5 nM in liquid phase) could be stored up to 48 h and 72 h with recovery rates of 100.1+/-2.6% and 94.6+/-3.2%, respectively. These results demonstrate that samples can be stored for 2-3 days before laboratory analysis. The optimized method was tested at a field site contaminated with chlorinated solvents. Duplicate gas samples were stored in headspace vials and analyzed after 24 h. Concentrations were measured in the range of 2.5-8.0 nM, corresponding to known concentrations in reduced aquifers.
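
    The equilibration behaviour described above is consistent with a simple first-order gas-transfer model. The sketch below is our illustration only: the rate constant is assumed, not the authors' fitted value, and merely shows how 10-15 min of sampling can approach full equilibrium.

      import math

      def fraction_equilibrated(t_min, k_per_min):
          # First-order approach to gas-liquid equilibrium:
          # C(t)/C_eq = 1 - exp(-k*t), where k lumps together flow rate,
          # headspace (bubble) volume and interfacial area.
          return 1.0 - math.exp(-k_per_min * t_min)

      k = 0.3  # hypothetical effective transfer rate in 1/min
      for t in (5, 10, 15):
          print(t, "min:", round(100 * fraction_equilibrated(t, k), 1),
                "% of equilibrium")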

  9. Methods for collection and analysis of water samples

    USGS Publications Warehouse

    Rainwater, Frank Hays; Thatcher, Leland Lincoln

    1960-01-01

    This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.

  10. A comprehensive comparison of perpendicular distance sampling methods for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2013-01-01

    Many new methods for sampling down coarse woody debris have been proposed in the last dozen or so years. One of the most promising in terms of field application, perpendicular distance sampling (PDS), has several variants that have been progressively introduced in the literature. In this study, we provide an overview of the different PDS variants and comprehensive...

  11. Method and sample spinning apparatus for measuring the NMR spectrum of an orientationally disordered sample

    DOEpatents

    Pines, Alexander; Samoson, Ago

    1990-01-01

    An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered or amorphous or otherwise orientationally disordered samples. The apparatus spins the sample about an axis. The angle of the axis is mechanically varied such that the time averages of two or more Legendre polynomials are zero.

  12. Field evaluation of personal sampling methods for multiple bioaerosols.

    PubMed

    Wang, Chi-Hsun; Chen, Bean T; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols.

  13. Adaptive cluster sampling: An efficient method for assessing inconspicuous species

    Treesearch

    Andrea M. Silletti; Joan Walker

    2003-01-01

    Restorationists typically evaluate the success of a project by estimating the population sizes of species that have been planted or seeded. Because a total census is rarely feasible, they must rely on sampling methods for population estimates. However, traditional random sampling designs may be inefficient for species that, for one reason or another, are challenging to...

  15. Nominal Weights Mean Equating: A Method for Very Small Samples

    ERIC Educational Resources Information Center

    Babcock, Ben; Albano, Anthony; Raymond, Mark

    2012-01-01

    The authors introduced nominal weights mean equating, a simplified version of Tucker equating, as an alternative for dealing with very small samples. The authors then conducted three simulation studies to compare nominal weights mean equating to six other equating methods under the nonequivalent groups anchor test design with sample sizes of 20,…

  17. A distance limited method for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

    A new sampling method for down coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  18. INTERVAL SAMPLING METHODS AND MEASUREMENT ERROR: A COMPUTER SIMULATION

    PubMed Central

    Wirth, Oliver; Slaven, James; Taylor, Matthew A.

    2015-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method’s inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. PMID:24127380
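
    The simulation logic is straightforward to reproduce. The Python sketch below is our minimal reconstruction, not the authors' program: it scores one randomly generated session with momentary time sampling, partial-interval recording and whole-interval recording, and exhibits their characteristic biases.

      import random

      rng = random.Random(7)

      def simulate(obs_s=600, interval_s=10, event_s=5, n_events=12):
          # Place fixed-duration target events at random start times
          # (overlap is possible, which makes the 'true' figure approximate).
          starts = [rng.uniform(0, obs_s - event_s) for _ in range(n_events)]

          def occurring(t):
              return any(s <= t < s + event_s for s in starts)

          n_int = obs_s // interval_s
          mts = pir = wir = 0
          for i in range(n_int):
              a = i * interval_s
              ticks = [a + 0.1 * j for j in range(interval_s * 10)]
              mts += occurring(a + interval_s - 0.05)  # momentary: interval end
              pir += any(occurring(t) for t in ticks)  # partial: any occurrence
              wir += all(occurring(t) for t in ticks)  # whole: continuous occurrence
          true_pct = 100 * n_events * event_s / obs_s
          return true_pct, 100 * mts / n_int, 100 * pir / n_int, 100 * wir / n_int

      # Typical pattern: PIR overestimates, WIR underestimates,
      # and MTS is roughly unbiased.
      print(simulate())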

  19. A cryopreservation method for Pasteurella multocida from wetland samples

    USGS Publications Warehouse

    Moore, Melody K.; Shadduck, D.J.; Goldberg, D.R.; Samuel, M.D.

    1998-01-01

    A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.

  20. On the sampling method of the JSZ-4 Doppler receiver.

    NASA Astrophysics Data System (ADS)

    Cha, D.-Y.; Huang, K.-Y.

    The authors discuss the properties of the JSZ-4 Doppler receiver and the problem of optimal recording. It is shown that the original sampling method loses information. A procedure for improvement is proposed.

  1. Sampling pig farms at the abattoir in a cross-sectional study - Evaluation of a sampling method.

    PubMed

    Birkegård, Anna Camilla; Halasa, Tariq; Toft, Nils

    2017-09-15

    A cross-sectional study design is relatively inexpensive, fast and easy to conduct when compared to other study designs. Careful planning is essential to obtaining a representative sample of the population, and the recommended approach is to use simple random sampling from an exhaustive list of units in the target population. This approach is rarely feasible in practice, and other sampling procedures must often be adopted. For example, when slaughter pigs are the target population, sampling the pigs on the slaughter line may be an alternative to on-site sampling at a list of farms. However, it is difficult to sample a large number of farms from an exact predefined list, due to the logistics and workflow of an abattoir. Therefore, it is necessary to have a systematic sampling procedure and to evaluate the obtained sample with respect to the study objective. We propose a method for 1) planning, 2) conducting, and 3) evaluating the representativeness and reproducibility of a cross-sectional study when simple random sampling is not possible. We used an example of a cross-sectional study with the aim of quantifying the association of antimicrobial resistance and antimicrobial consumption in Danish slaughter pigs. It was not possible to visit farms within the designated timeframe. Therefore, it was decided to use convenience sampling at the abattoir. Our approach was carried out in three steps: 1) planning: using data from meat inspection to plan at which abattoirs and how many farms to sample; 2) conducting: sampling was carried out at five abattoirs; 3) evaluation: representativeness was evaluated by comparing sampled and non-sampled farms, and the reproducibility of the study was assessed through simulated sampling based on meat inspection data from the period when the actual data collection was carried out. In the cross-sectional study, samples were taken from 681 Danish pig farms during five weeks from February to March 2015. The evaluation showed that the sampling

  2. Demonstration Report for Visual Sample Plan (VSP) Verification Sampling Methods at the Navy/DRI Site

    DTIC Science & Technology

    2011-08-01

    STATISTICAL VERIFICATION AND REMEDIATION SAMPLING METHODS (200837), August 2011, Pacific Northwest National Laboratory, Brent Pulsipher. Statistical Verification Sampling Methods in VSP; contents include 6.2.1 Transect Survey Design and Parameter Settings.

  3. On-capillary sample cleanup method for the electrophoretic determination of carbohydrates in juice samples.

    PubMed

    Morales-Cid, Gabriel; Simonet, Bartolomé M; Cárdenas, Soledad; Valcárcel, Miguel

    2007-05-01

    On many occasions, sample treatment is a critical step in electrophoretic analysis. As an alternative to batch procedures, in this work a new strategy is presented with a view to developing an on-capillary sample cleanup method. This strategy is based on the partial filling of the capillary with carboxylated single-walled carbon nanotubes (c-SWNTs). The nanoparticles retain interferences from the matrix, allowing the determination and quantification of carbohydrates (viz. glucose, maltose and fructose). The precision of the method for the analysis of real samples ranged from 5.3 to 6.4%. The proposed method was compared with a method based on batch filtration of the juice sample through diatomaceous earth and further electrophoretic determination. This method was also validated in this work. The RSD for this batch method ranged from 5.1 to 6%. The results obtained by both methods were statistically comparable, demonstrating the accuracy of the proposed methods and their effectiveness. Electrophoretic separation of carbohydrates was achieved using 200 mM borate solution as a buffer at pH 9.5 and applying 15 kV. During separation, the capillary temperature was kept constant at 40 degrees C. For the on-capillary cleanup method, a solution containing 50 mg/L of c-SWNTs prepared in 300 mM borate solution at pH 9.5 was introduced for 60 s into the capillary just before sample introduction. For the electrophoretic analysis of samples cleaned in batch with diatomaceous earth, it is also recommended to introduce into the capillary, just before the sample, a 300 mM borate solution, as it enhances the sensitivity and electrophoretic resolution.

  4. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for the calculated cross section evaluated from systematic studies, we used the neutron activation analysis technique (NAA) to determine the various constituent concentrations of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, that are essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, it allows measurements as accurate as the relative method. The results obtained by the absolute method showed values as precise as those of the relative method, which requires a standard sample for each element to be quantified.
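
    In its standard textbook form, simplified to ignore detector efficiency and counting-time corrections (our notation, not reproduced from the paper), the fundamental activation equation referred to above can be written as

      \[
      A = N \,\sigma\, \Phi \,\bigl(1 - e^{-\lambda t_{\mathrm{irr}}}\bigr)\, e^{-\lambda t_{\mathrm{d}}},
      \]

    where A is the measured activity of the activation product, N the number of target nuclei, σ the activation cross section, Φ the neutron flux, λ the decay constant, t_irr the irradiation time and t_d the decay time before counting. Solving for N yields the element concentration directly, which is why the absolute method needs no standard sample.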

  5. A quantitative sampling method for Oncomelania quadrasi by filter paper.

    PubMed

    Tanaka, H; Santos, M J; Matsuda, H; Yasuraoka, K; Santos, A T

    1975-08-01

    Filter paper was found to attract Oncomelania quadrasi in waters in the same way as fallen dried banana leaves, although fewer snails of other species were collected on the former than on the latter. Snails were collected in limited areas using a tube (85 cm² cross-sectional area) and a filter paper (20 × 20 cm) sampler. The sheet of filter paper was placed close to the spot where a tube sample was taken, and recovered after 24 hours. At each sampling, 30 samples were taken by each method in an area, and sampling was made four times. The correlation between the number of snails collected by the tube and that by filter paper was studied. The ratio of the snail counts by the tube sampler to those by the filter paper was 1.18. A loose correlation was observed between the snail counts of both methods, as shown by the correlation coefficient r = 0.6502. The formulas for the regression line were Y = 0.77X + 1.6 and X = 0.55Y + 1.35 for the three experiments, where Y is the number of snails collected by tube sampling and X is the number of snails collected on the sheet of filter paper. The type of snail distribution was studied in the 30 samples taken by each method and was observed to be nearly the same for both sampling methods. All sampling data were found to fit the negative binomial distribution, with values of the constant k varying widely from 0.5775 to 5.9186 in (q − p)^(−k). In each experiment, the constant k was always larger in tube sampling than in filter paper sampling. This indicates that the uneven distribution of snails on the soil surface becomes more conspicuous with filter paper sampling.

  6. Quality of plasma sampled by different methods for multiple blood sampling in mice.

    PubMed

    Christensen, S D; Mikkelsen, L F; Fels, J J; Bodvarsdóttir, T B; Hansen, A K

    2009-01-01

    For oral glucose tolerance test (OGTT) in mice, multiple blood samples need to be taken within a few hours from conscious mice. Today, a number of essential parameters may be analysed on very small amounts of plasma, thus reducing the number of animals to be used. It is, however, crucial to obtain high-quality plasma or serum in order to avoid increased data variation and thereby increased group sizes. The aim of this study was to find the most valid and reproducible method for withdrawal of blood samples when performing OGTT. Four methods, i.e. amputation of the tail tip, lateral tail incision, puncture of the tail tip and periorbital puncture, were selected for testing at 21 degrees C and 30 degrees C after a pilot study. For each method, four blood samples were drawn from C57BL/6 mice at 30 min intervals. The presence of clots was registered, haemolysis was monitored spectrophotometrically at 430 nm, and it was noted whether it was possible to obtain 30-50 microL of blood. Furthermore, a small amount of extra blood was sampled before and after the four samplings to test whether the sampling induced a blood glucose change over the 90 min test period. All methods resulted in acceptable amounts of plasma. Clots were observed in a sparse number of samples, with no significant differences between the methods. Periorbital puncture did not lead to any haemolysed samples at all, and lateral tail incision resulted in only a few haemolysed samples, while puncture or amputation of the tail tip induced haemolysis in a significant number of samples. All methods, except for puncture of the tail tip, influenced blood glucose. Periorbital puncture resulted in a dramatic increase in blood glucose of up to 3.5 mmol/L, indicating that it is stressful. Although lateral tail incision also had some impact on blood glucose, it seems to be the method of choice for OGTT, as it is likely to produce a clot-free non-haemolysed sample, while periorbital sampling, although producing a

  7. DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  8. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    PubMed

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or when the group is concerned that making its population public would bring social stigma, we say the population is hidden. Such populations are difficult to approach with standard survey methods because response rates are low and members may not respond honestly when probability sampling is used. The only alternative known to address the problems caused by previous methods such as snowball sampling is respondent-driven sampling (RDS), which was developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondent. This characteristic allows for probability sampling when we survey a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the dependence of RDS's chain-referral sample on its initial seeds tends to diminish as the sample gets bigger, and the sample becomes stabilized as the waves progress. Therefore, the final sample can be effectively independent of the initial seeds if a sufficient sample size is secured, even if the initial seeds were selected through convenience sampling. Thus, RDS can be considered as an alternative that can improve upon both key informant sampling and ethnographic surveys, and it needs to be utilized in various domestic cases as well.
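
    A minimal chain-referral simulation in Python, assuming an Erdős-Rényi-style social network and arbitrary seed, coupon and target-size parameters (all illustrative, none from the study), shows how the sample grows in waves away from the initial seeds:

      import random

      rng = random.Random(3)

      def rds_sample(n_nodes=1000, mean_degree=8, n_seeds=5, coupons=3, target=400):
          # Build a simple Erdos-Renyi-style undirected social network.
          nbrs = {i: set() for i in range(n_nodes)}
          p = mean_degree / (n_nodes - 1)
          for i in range(n_nodes):
              for j in range(i + 1, n_nodes):
                  if rng.random() < p:
                      nbrs[i].add(j)
                      nbrs[j].add(i)
          # Seeds are chosen by convenience; recruitment then proceeds in
          # waves, each respondent passing up to 'coupons' coupons to
          # randomly chosen, not-yet-sampled network neighbours.
          wave_of = {}
          frontier, wave = rng.sample(range(n_nodes), n_seeds), 0
          while frontier and len(wave_of) < target:
              next_frontier = []
              for person in frontier:
                  if person in wave_of:
                      continue
                  wave_of[person] = wave
                  recruits = [v for v in nbrs[person] if v not in wave_of]
                  next_frontier.extend(rng.sample(recruits, min(coupons, len(recruits))))
              frontier, wave = next_frontier, wave + 1
          return wave_of

      waves = rds_sample()
      print("sample size:", len(waves), "waves used:", max(waves.values()) + 1)

    Re-running with different seed sets illustrates the point made above: once enough waves have passed, the wave-by-wave composition stabilizes and the final sample's dependence on the seeds fades.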

  9. Convenient mounting method for electrical measurements of thin samples

    NASA Technical Reports Server (NTRS)

    Matus, L. G.; Summers, R. L.

    1986-01-01

    A method for mounting thin samples for electrical measurements is described. The technique is based on a vacuum chuck concept in which the vacuum chuck simultaneously holds the sample and establishes electrical contact. The mounting plate is composed of a glass-ceramic insulating material, and the surfaces of the plate and vacuum chuck are polished. The operation of the vacuum chuck is examined. The contacts on the sample and mounting plate, which are sputter-deposited through metal masks, are analyzed. The mounting method was utilized for van der Pauw measurements.

  10. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    1996-01-01

    The present invention provides methods and systems for detecting a labeled marker on a sample located on a support. The imaging system comprises a body for immobilizing the support, an excitation radiation source and excitation optics to generate and direct the excitation radiation at the sample. In response, labeled material on the sample emits radiation which has a wavelength that is different from the excitation wavelength, which radiation is collected by collection optics and imaged onto a detector which generates an image of the sample.

  11. Soil separator and sampler and method of sampling

    DOEpatents

    O'Brien, Barry H. [Idaho Falls, ID]; Ritter, Paul D. [Idaho Falls, ID]

    2010-02-16

    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  12. System and method for measuring fluorescence of a sample

    DOEpatents

    Riot, Vincent J

    2015-03-24

    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and the noises from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into digital domain for further processing and storage.

  13. System and method for measuring fluorescence of a sample

    DOEpatents

    Riot, Vincent J.

    2017-06-27

    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and the noises from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into digital domain for further processing and storage.
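
    The offset-and-integrate idea in these two patent records can be mimicked digitally. The sketch below is our illustration only: the values, units and sampling rate are invented, and the patent itself describes an analog integrator rather than this digital sum.

      def integrate_fluorescence(samples_mv, offset_mv, dt_s):
          # Accumulate (photodiode signal - electronic offset) over the
          # measurement window, so that background fluorescence and
          # system-noise offsets cancel out of the result.
          return sum((v - offset_mv) * dt_s for v in samples_mv)

      # Hypothetical 100 ms window sampled at 1 kHz: a 2.0 mV background
      # offset plus 0.5 mV of true fluorescence signal.
      trace_mv = [2.0 + 0.5] * 100
      print(integrate_fluorescence(trace_mv, 2.0, 0.001))  # ~0.05 mV*s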

  14. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols.

  15. Extending the alias Monte Carlo sampling method to general distributions

    SciTech Connect

    Edwards, A.L.; Rathkopf, J.A.; Smidt, R.K.

    1991-01-07

    The alias method is a Monte Carlo sampling technique that offers significant advantages over more traditional methods. It equals the accuracy of table lookup and the speed of equal probable bins. The original formulation of this method sampled from discrete distributions and was easily extended to histogram distributions. We have extended the method further to applications more germane to Monte Carlo particle transport codes: continuous distributions. This paper presents the alias method as originally derived and our extensions to simple continuous distributions represented by piecewise linear functions. We also present a method to interpolate accurately between distributions tabulated at points other than the point of interest. We present timing studies that demonstrate the method's increased efficiency over table lookup and show further speedup achieved through vectorization. 6 refs., 12 figs., 2 tabs.
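
    For the discrete case that the abstract takes as its starting point, the alias method can be set up in O(n) time and sampled in O(1) per draw. The Python sketch below uses Vose's variant of the table construction; the extensions to histogram and piecewise-linear continuous distributions described in the abstract are not shown.

      import random

      def build_alias(probs):
          # Vose's alias method: O(n) setup producing two tables that
          # allow O(1) sampling from any discrete distribution.
          n = len(probs)
          scaled = [p * n for p in probs]
          small = [i for i, s in enumerate(scaled) if s < 1.0]
          large = [i for i, s in enumerate(scaled) if s >= 1.0]
          prob, alias = [0.0] * n, [0] * n
          while small and large:
              s, l = small.pop(), large.pop()
              prob[s], alias[s] = scaled[s], l
              scaled[l] = scaled[l] + scaled[s] - 1.0
              (small if scaled[l] < 1.0 else large).append(l)
          for i in small + large:  # leftovers equal 1 up to rounding
              prob[i] = 1.0
          return prob, alias

      def alias_draw(prob, alias, rng=random):
          i = rng.randrange(len(prob))  # pick an equal-probable bin
          return i if rng.random() < prob[i] else alias[i]  # one table lookup

      prob, alias = build_alias([0.1, 0.2, 0.3, 0.4])
      counts = [0] * 4
      for _ in range(100000):
          counts[alias_draw(prob, alias)] += 1
      print([c / 100000 for c in counts])  # ~ [0.1, 0.2, 0.3, 0.4]

    Each draw costs one uniform index choice plus one comparison, which is what gives the method table-lookup accuracy at equal-probable-bin speed.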

  16. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations

    PubMed Central

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm²). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  17. Tests of a comparative method of dating plutonium samples

    NASA Astrophysics Data System (ADS)

    West, D.

    1987-04-01

    Tests of a comparative method of dating plutonium samples have been carried out using 241Pu in aqueous solution. The six samples were of known ages (between 0.25 and 15 yr) and, with one exception, the measured ages, using particular samples as standards, agreed with the stated ages. In one case the agreement was better than 1% in age. Mixed-oxide fuel pins were also intercompared. In this case it was with some difficulty that a sample of known age was obtained. Comparison using this sample and an older one gave the same value (within ±1%) for the separation date of the unknown sample on three occasions over a three-year period.

  18. COMPARISON OF MACROINVERTEBRATE SAMPLING METHODS FOR NONWADEABLE STREAMS

    EPA Science Inventory

    The bioassessment of nonwadeable streams in the United States is increasing, but methods for these systems are not as well developed as for wadeable streams. In this study, we compared six benthic macroinvertebrate field sampling methods for nonwadeable streams based on those us...

  19. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  1. Comparison of three different sampling methods for canine skin lipids.

    PubMed

    Angelbeck-Schulze, Mandy; Stahl, Jessica; Brodesser, Susanne; Rohn, Karl; Naim, Hassan; Hewicker-Trautwein, Marion; Kietzmann, Manfred; Bäumer, Wolfgang; Mischke, Reinhard

    2013-04-01

    Epidermal lipids are of major interest in dermatological research, especially in canine atopic dermatitis. Owing to the existence of several sampling methods, the interpretation of study results is often complicated. This study aimed to compare three different sampling methods and to establish a minimally invasive method for collecting canine epidermal lipids. Skin samples from five dogs with no obvious skin abnormalities were taken from the caudal back and the inguinal region postmortem. Samples consisted of heat-separated epidermis of three skin biopsies, three scrapes and three skin scrubs. Lipids were analysed by high-performance thin-layer chromatography; the resulting bands were identified by using corresponding standards, retardation factors and mass spectrometry. The influences of the sampling method, the body site and the ceramide standards were investigated. Between body sites, significant differences were found for cholesterol sulphate, cholesteryl esters and triglycerides. Significant differences between sampling methods were detected for all lipid fractions except for cholesterol sulphate and glucosylceramides within the lipid profile, and for at least four ceramide classes within the ceramide profile. The most obvious discrepancies were found between heat-separated epidermis and skin scrub. The reproducibility was high for scraping and skin scrub, but was lowest for heat-separated epidermis. Furthermore, this study revealed a marked influence of ceramide standards on the results regarding the ceramide profile. Scraping and skin scrub are comparably suitable methods for skin lipid sampling, whereas the analysis of heat-separated epidermis may not be the method of first choice. © 2013 The Authors. Veterinary Dermatology © 2013 ESVD and ACVD.

  2. Non-specific interference in the measurement of plasma ammonia: importance of using a sample blank.

    PubMed

    Herrera, Daniel Juan; Hutchin, Tim; Fullerton, Donna; Gray, George

    2010-01-01

    Enzymatic assays using glutamate dehydrogenase (GLDH) to monitor the transformation of NAD(P)H to NAD(P)+ by a spectrophotometric technique are the most common methods to measure plasma ammonia (PA) in routine laboratories worldwide. However, these assays can potentially be subject to interference by substances in plasma able to oxidize NAD(P)H at a substantial rate, thereby providing falsely high results. To study this potential interference, we spiked a plasma pool with a liver homogenate and measured the ammonia concentration using a dry chemistry system (Vitros 250, Ortho Clinical Diagnostic, Raritan, NJ, USA), an enzymatic assay without a sample blanking step (Infinity Ammonia Liquid Stable Reagent, Thermo Fisher Scientific, Waltham, USA) and an enzymatic assay that corrects for the non-specific oxidation of NADPH (Ammonia kit, RANDOX Laboratories Ltd, Crumlin, UK). This experiment shows that the Infinity ammonia reagent kit is subject to a clinically significant interference and explains the discrepancies previously reported between these methods in patients with acute liver failure (ALF). When using enzymatic methods for the assessment of PA, we recommend including a sample blanking correction and this should be mandatory when monitoring patients with ALF.
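
    The correction the authors recommend can be written compactly. For a GLDH-based assay that follows NAD(P)H consumption photometrically at 340 nm, a sample blank run without the enzymatic trigger isolates the ammonia-specific signal (this is a generic form; the exact blanking protocol varies by kit and is not specified in the abstract):

      \[
      [\mathrm{NH_3}] \;\propto\;
      \Bigl(\frac{\Delta A_{340}}{\Delta t}\Bigr)_{\mathrm{complete\ assay}}
      \;-\;
      \Bigl(\frac{\Delta A_{340}}{\Delta t}\Bigr)_{\mathrm{sample\ blank}},
      \]

    where the blank rate captures any non-specific oxidation of NAD(P)H by plasma constituents, the interference demonstrated by the liver-homogenate spiking experiment.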

  3. Statistics in brief: the importance of sample size in the planning and interpretation of medical research.

    PubMed

    Biau, David Jean; Kernéis, Solen; Porcher, Raphaël

    2008-09-01

    The increasing volume of research by the medical community often leads to increasing numbers of contradictory findings and conclusions. Although the differences observed may represent true differences, the results also may differ because of sampling variability as all studies are performed on a limited number of specimens or patients. When planning a study reporting differences among groups of patients or describing some variable in a single group, sample size should be considered because it allows the researcher to control for the risk of reporting a false-negative finding (Type II error) or to estimate the precision his or her experiment will yield. Equally important, readers of medical journals should understand sample size because such understanding is essential to interpret the relevance of a finding with regard to their own patients. At the time of planning, the investigator must establish (1) a justifiable level of statistical significance, (2) the chances of detecting a difference of given magnitude between the groups compared, ie, the power, (3) this targeted difference (ie, effect size), and (4) the variability of the data (for quantitative data). We believe correct planning of experiments is an ethical issue of concern to the entire community.
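
    For the common case of comparing a continuous outcome between two equal-sized groups, these four quantities combine into the standard normal-approximation formula (a textbook result, not taken from this editorial):

      \[
      n_{\mathrm{per\ group}} \;=\; \frac{2\,(z_{1-\alpha/2} + z_{1-\beta})^{2}\,\sigma^{2}}{\Delta^{2}},
      \]

    where α is the significance level, 1−β the power, Δ the targeted difference and σ the standard deviation of the outcome. For example, with α = 0.05 (z = 1.96), 80% power (z = 0.84) and a targeted difference of one standard deviation (Δ = σ), the formula gives roughly 16 patients per group.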

  4. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    PubMed

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty.

  5. A multi-dimensional sampling method for locating small scatterers

    NASA Astrophysics Data System (ADS)

    Song, Rencheng; Zhong, Yu; Chen, Xudong

    2012-11-01

    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built with the most stable part of signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages compared to the conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multi-scatterers. Numerical simulations are presented to show the good performance of the proposed method.
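
    For orientation, the conventional MUSIC indicator that the MDSM builds on can be stated in its standard form (the MDSM's combinatorial sampling nodes and stable-subspace selection are refinements on top of this, not shown here):

      \[
      I(\mathbf{z}) \;=\;
      \Biggl(\;\sum_{j=M+1}^{N} \bigl|\langle \mathbf{u}_j,\, \mathbf{g}_{\mathbf{z}} \rangle\bigr|^{2}\Biggr)^{-1},
      \]

    where the u_j are singular vectors of the multi-static response matrix spanning its noise subspace, M is the dimension of the signal subspace, and g_z is the background Green's function (test) vector for a sampling point z; I(z) peaks when z coincides with a scatterer location.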

  6. Beryllium Wipe Sampling (differing methods - differing exposure potentials)

    SciTech Connect

    Kerr, Kent

    2005-03-09

    This research compared three wipe sampling techniques currently used to test for beryllium contamination on room and equipment surfaces in Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling without a wetting agent, with water-moistened wipe materials, and by methanol-moistened wipes. Analysis indicated that methanol-moistened wipe sampling removed about twice as much beryllium/oil-film surface contamination as water-moistened wipes, which removed about twice as much residue as dry wipes. Criteria at 10 CFR 850.30 and .31 were established on unspecified wipe sampling method(s). The results of this study reveal a need to identify criteria-setting method and equivalency factors. As facilities change wipe sampling methods among the three compared in this study, these results may be useful for approximate correlations. Accurate decontamination decision-making depends on the selection of appropriate wetting agents for the types of residues and surfaces. Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents that provide enhanced removal efficiency such as methanol when surface contamination includes oil mist residue.

  7. Cooperative Nature of Gating Transitions in K+ Channels as seen from Dynamic Importance Sampling Calculations

    PubMed Central

    Denning, Elizabeth J.; Woolf, Thomas B.

    2009-01-01

    The growing dataset of K+ channel x-ray structures provides an excellent opportunity to begin a detailed molecular understanding of voltage-dependent gating. These structures, while differing in sequence, represent either a stable open or closed state. However, an understanding of the molecular details of gating will require models for the transitions and experimentally testable predictions for the gating transition. To explore these ideas, we apply Dynamic Importance Sampling (DIMS) to a set of homology models for the molecular conformations of K+ channels for four different sets of sequences and eight different states. In our results, we highlight the importance of particular residues upstream from the PVP region to the gating transition. This supports growing evidence that the PVP region is important for influencing the flexibility of the S6 helix and thus the opening of the gating domain. The results further suggest how gating on the molecular level depends on intra-subunit motions to influence the cooperative behavior of all four subunits of the K+ channel. We hypothesize that the gating process occurs in steps: first sidechain movement, then inter-subunit S5-S6 motions, and lastly the large-scale domain rearrangements. PMID:19950367

  8. Determining the relative importance of soil sample locations to predict risk of child lead exposure.

    PubMed

    Zahran, Sammy; Mielke, Howard W; McElmurry, Shawn P; Filippelli, Gabriel M; Laidlaw, Mark A S; Taylor, Mark P

    2013-10-01

    Soil lead in urban neighborhoods is a known predictor of child blood lead levels. In this paper, we address the question of where one ought to concentrate soil sample collection efforts to efficiently predict children at risk of soil Pb exposure. Two extensive data sets are combined, including 5467 surface soil samples collected from 286 census tracts, and geo-referenced blood Pb data for 55,551 children in metropolitan New Orleans, USA. Random intercept least squares, random intercept logistic, and quantile regression results indicate that soils collected within 1 m of residential streets most reliably predict child blood Pb levels. Regression decomposition results show that residential street soils account for 39.7% of between-neighborhood explained variation, followed by busy street soils (21.97%), open space soils (20.25%), and home foundation soils (18.71%). Just as the age of housing stock is used as a statistical shortcut for child risk of exposure to lead-based paint, our results indicate that one can shortcut the characterization of child risk of exposure to neighborhood soil Pb by concentrating sampling efforts within 1 m of residential and busy streets, while significantly reducing the total costs of collection and analysis. This efficiency gain can help advance proactive upstream, preventive methods of environmental Pb discovery.

  9. Rapid method for the determination of 226Ra in hydraulic fracturing wastewater samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian K.; Warren, Richard A.; ...

    2016-03-24

    A new method that rapidly preconcentrates and measures 226Ra from hydraulic fracturing wastewater samples was developed in the Savannah River Environmental Laboratory. The method improves the quality of 226Ra measurements using gamma spectrometry by providing up to 100x preconcentration of 226Ra from this difficult sample matrix, which contains very high levels of calcium, barium, strontium, magnesium and sodium. The high chemical yield, typically 80-90%, facilitates a low detection limit, important for lower level samples, and indicates method ruggedness. Ba-133 tracer is used to determine chemical yield and correct for geometry-related counting issues. The 226Ra sample preparation takes < 2 hours.

  10. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods

    DTIC Science & Technology

    2014-07-01

    purification process. Several purification methods were preprogrammed into the instrument, and all of the necessary reagents were supplied as prefilled ...attached to syringes that filter samples as a means of DNA isolation. Some advantages to this kit were that it required very few materials and was...fairly quick, if used with small amounts of sample. However, this kit was ideally used only with a small quantity at one time because of the syringe

  11. Fluidics platform and method for sample preparation and analysis

    SciTech Connect

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analyses can be performed without a user's intervention. The disclosed platform also comprises a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  12. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, Gary J.; Motes, Billy G.; Bird, Susan K.; Kotter, Dale K.

    1996-01-01

    Apparatus for obtaining a whole gas sample, composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method of obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant.

  13. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.

    1996-03-26

    Apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  14. Importance of sampling design and analysis in animal population studies: a comment on Sergio et al

    USGS Publications Warehouse

    Kery, M.; Royle, J. Andrew; Schmid, Hans

    2008-01-01

    1. The use of predators as indicators and umbrellas in conservation has been criticized. In the Trentino region, Sergio et al. (2006; hereafter SEA) counted almost twice as many bird species in quadrats located in raptor territories as in controls. However, SEA detected astonishingly few species. We used contemporary Swiss Breeding Bird Survey data from an adjacent region and a novel statistical model that corrects for overlooked species to estimate the expected number of bird species per quadrat in that region. 2. There are two anomalies in SEA which render their results ambiguous. First, SEA detected on average only 6.8 species, whereas a value of 32 might be expected. Hence, they probably overlooked almost 80% of all species. Secondly, the precision of their mean species counts was greater in two-thirds of cases than in the unlikely case that all quadrats harboured exactly the same number of equally detectable species. This suggests that they detected consistently only a biased, unrepresentative subset of species. 3. Conceptually, expected species counts are the product of true species number and species detectability p. Plenty of factors may affect p, including date, hour, observer, previous knowledge of a site and mobbing behaviour of passerines in the presence of predators. Such differences in p between raptor and control quadrats could have easily created the observed effects. Without a method that corrects for such biases, or without quantitative evidence that species detectability was indeed similar between raptor and control quadrats, the meaning of SEA's counts is hard to evaluate. Therefore, the evidence presented by SEA in favour of raptors as indicator species for enhanced levels of biodiversity remains inconclusive. 4. Synthesis and application. Ecologists should pay greater attention to sampling design and analysis in animal population estimation. Species richness estimation means sampling a community. Samples should be representative for the

  15. RAPID METHOD FOR DETERMINATION OF RADIOSTRONTIUM IN EMERGENCY MILK SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-07-17

    A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that will allow rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  16. Compressive sampling in computed tomography: Method and application

    NASA Astrophysics Data System (ADS)

    Hu, Zhanli; Liang, Dong; Xia, Dan; Zheng, Hairong

    2014-06-01

    Since Donoho and Candes et al. published their groundbreaking work on compressive sampling or compressive sensing (CS), CS theory has attracted a lot of attention and become a hot topic, especially in biomedical imaging. Specifically, some CS based methods have been developed to enable accurate reconstruction from sparse data in computed tomography (CT) imaging. In this paper, we will review the progress in CS based CT from aspects of three fundamental requirements of CS: sparse representation, incoherent sampling and reconstruction algorithm. In addition, some potential applications of compressive sampling in CT are introduced.
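
    The three requirements can be seen working together in a minimal sketch (a toy 1-D problem, not a CT geometry; the matrix sizes and penalty weight are arbitrary choices): a sparse signal, an incoherent random sampling matrix, and iterative soft thresholding (ISTA) as the reconstruction algorithm.

      import numpy as np

      rng = np.random.default_rng(10)

      # Sparse signal, incoherent measurements, and ISTA reconstruction.
      n, m, k = 200, 60, 5
      x_true = np.zeros(n)
      x_true[rng.choice(n, k, replace=False)] = rng.normal(0, 1, k)
      A = rng.normal(0, 1 / np.sqrt(m), (m, n))   # random (incoherent) sampling
      y = A @ x_true                              # undersampled measurements

      lam = 0.01
      step = 1.0 / np.linalg.norm(A, 2) ** 2      # step size from spectral norm
      x = np.zeros(n)
      for _ in range(500):
          x = x - step * (A.T @ (A @ x - y))      # gradient step on the data fit
          x = np.sign(x) * np.maximum(np.abs(x) - lam * step, 0.0)  # soft threshold

      print("relative recovery error:",
            np.linalg.norm(x - x_true) / np.linalg.norm(x_true))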

  17. Comparison of pigment content of paint samples using spectrometric methods.

    PubMed

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-15

    The aim of the paper was to evaluate the influence of pigment concentration and its distribution in the polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples, one containing a red and the other a green pigment, were prepared. Each set consisted of 13 samples differing gradually in pigment concentration. To obtain sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Colour and white pigments were identified on the basis of Raman spectra. The colours of the samples were compared by Vis spectrometry according to colour theory. It was found that the samples are homogeneous (the parameter measuring colour similarity, ΔE, was below 3). The values of ΔE between neighbouring samples in the set followed a decreasing linear function, while those between the first sample and each following one followed a logarithmic function.

  18. Estimation variance bounds of importance sampling simulations in digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
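
    As a hedged illustration of the quantities involved (ours, not the authors' derivation; a Gaussian tail probability stands in for a bit error rate), the following sketch compares the empirical variance of the direct Monte Carlo estimator with that of an importance sampling estimator using a mean-shifted Gaussian. The ratio of the two variances is the improvement ratio that the paper's bounds bracket without running the simulation.

      import numpy as np

      rng = np.random.default_rng(0)

      # Toy problem: estimate p = P(N > t) for N ~ Normal(0, 1), a stand-in
      # for a bit-error-rate tail probability.
      t = 4.0
      n = 100_000

      # Direct Monte Carlo: indicator of the rare event.
      x = rng.standard_normal(n)
      direct = (x > t).astype(float)

      # Importance sampling: draw from the shifted density q(x) = N(t, 1)
      # and weight by the likelihood ratio p(x)/q(x) = exp(-t*x + t^2/2).
      y = rng.standard_normal(n) + t
      weights = np.exp(-t * y + 0.5 * t * t)
      is_est = (y > t) * weights

      for name, s in [("direct MC", direct), ("importance sampling", is_est)]:
          print(f"{name}: estimate={s.mean():.3e}, "
                f"estimator variance={s.var() / n:.3e}")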

  19. Monte Carlo importance sampling for the MCNP{trademark} general source

    SciTech Connect

    Lichtenstein, H.

    1996-01-09

    Research was performed to develop an importance sampling procedure for a radiation source. The procedure was developed for the MCNP radiation transport code, but the approach itself is general and can be adapted to other Monte Carlo codes. The procedure, as adapted to MCNP, relies entirely on existing MCNP capabilities. It has been tested for very complex descriptions of a general source, in the context of the design of spent-reactor-fuel storage casks. Dramatic improvements in calculation efficiency have been observed in some test cases. In addition, the procedure has been found to accelerate convergence to acceptable levels, with the added benefit of quickly identifying user-specified variance reduction in the transport that causes unstable convergence.
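
    The procedure itself is expressed through MCNP's input capabilities, which the abstract does not reproduce. As a generic, hedged sketch of the underlying idea, source biasing draws source points from a modified density q and carries statistical weights p/q so the tally stays unbiased; the densities and toy response below are illustrative assumptions, not the paper's cask model.

      import numpy as np

      rng = np.random.default_rng(1)

      # True (analog) source: uniform on [0, 10]. The toy detector response
      # is only sensitive near x = 9, so we bias sampling toward that region.
      def response(x):
          return np.exp(-((x - 9.0) ** 2))   # toy tally contribution

      n = 50_000
      # Biased source density q(x) = 2x/100 on [0, 10]; inverse-CDF sample.
      u = rng.random(n)
      x = 10.0 * np.sqrt(u)
      p = 1.0 / 10.0           # analog source density
      q = 2.0 * x / 100.0      # biased density
      weights = p / q          # keeps the tally unbiased

      print("biased estimate :", np.mean(weights * response(x)))
      print("analog estimate :", np.mean(response(rng.uniform(0, 10, n))))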

  1. Adaptive importance sampling to accelerate training of a neural probabilistic language model.

    PubMed

    Bengio, Y; Senecal, J S

    2008-04-01

    Previous work on statistical language modeling has shown that it is possible to train a feedforward neural network to approximate probabilities over sequences of words, resulting in significant error reduction when compared to standard baseline models based on n-grams. However, training the neural network model with the maximum-likelihood criterion requires computations proportional to the number of words in the vocabulary. In this paper, we introduce adaptive importance sampling as a way to accelerate training of the model. The idea is to use an adaptive n-gram model to track the conditional distributions produced by the neural network. We show that a very significant speedup can be obtained on standard problems.
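
    A minimal sketch of the core trick (ours for illustration; the paper's proposal distribution is an adaptive n-gram model, replaced here by a fixed unigram, and the full gradient is reduced to a scalar proxy): the sum over the whole vocabulary required by the softmax is replaced by a self-normalized importance-sampled estimate over k sampled words.

      import numpy as np

      rng = np.random.default_rng(2)

      V, d, k = 10_000, 32, 64         # vocabulary, hidden dim, sample count
      W = rng.normal(0, 0.1, (V, d))   # output embeddings (toy model state)
      h = rng.normal(0, 1.0, d)        # hidden state for the current context

      # Proposal q: a fixed unigram standing in for the adaptive n-gram.
      q = rng.random(V)
      q /= q.sum()

      scores = W @ h
      p_exact = np.exp(scores - scores.max())
      p_exact /= p_exact.sum()         # exact softmax needs all V scores

      # Self-normalized IS estimate with k << V samples from q,
      # weights w_i proportional to exp(score_i) / q_i.
      idx = rng.choice(V, size=k, p=q)
      w = np.exp(scores[idx] - scores.max()) / q[idx]
      w /= w.sum()

      # Scalar proxy for the negative term of the log-likelihood gradient.
      print("exact  E_p[score]:", float(p_exact @ scores))
      print("IS estimate      :", float(w @ scores[idx]))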

  2. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    NASA Technical Reports Server (NTRS)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
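
    The construction suggested by the abstract can be sketched as a derivative-based control variate (a hedged one-dimensional toy; the function g, its derivative, and all constants are made up, not the paper's structural model): subtract a first-order Taylor model whose expectation is known exactly, leaving an unbiased estimator with smaller variance.

      import numpy as np

      rng = np.random.default_rng(3)

      # Toy "analysis code": output g(x) of a random input x, with a cheap
      # sensitivity derivative available at the input mean.
      g = lambda x: np.exp(0.5 * x) + 0.1 * x ** 2
      mu, sigma, n = 1.0, 0.4, 10_000
      dg_dmu = 0.5 * np.exp(0.5 * mu) + 0.2 * mu   # derivative of g at mu

      x = rng.normal(mu, sigma, n)
      plain = g(x)                                  # plain Monte Carlo

      # L(x) = g(mu) + dg_dmu*(x - mu) has known mean g(mu), so
      # g(x) - L(x) + g(mu) is unbiased with reduced variance.
      cv = plain - dg_dmu * (x - mu)

      print(f"plain MC: mean={plain.mean():.5f}, "
            f"std err={plain.std() / np.sqrt(n):.2e}")
      print(f"with CV : mean={cv.mean():.5f}, "
            f"std err={cv.std() / np.sqrt(n):.2e}")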

  3. Two-dimensional signal reconstruction: The correlation sampling method

    SciTech Connect

    Roman, H. E.

    2007-12-15

    An accurate approach for reconstructing a time-dependent two-dimensional signal from non-synchronized time series recorded at points located on a grid is discussed. The method, denoted as correlation sampling, improves the standard conditional sampling approach commonly employed in the study of turbulence in magnetoplasma devices. Its implementation is illustrated in the case of an artificial time-dependent signal constructed using a fractal algorithm that simulates a fluctuating surface. A statistical method is also discussed for distinguishing coherent (i.e., collective) from purely random (noisy) behavior for such two-dimensional fluctuating phenomena.

  4. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    SciTech Connect

    Chady, T.

    2004-02-26

    In this paper the magnetic leakage flux and eddy current method were used to evaluate changes of materials' properties caused by stress. Seven samples made of ferromagnetic material with different level of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and utilized to evaluate level of the applied stress. A strong coincidence between amount of the applied stress and the maximum amplitude of the derivative was confirmed.

  5. Recording 2-D Nutation NQR Spectra by Random Sampling Method

    PubMed Central

    Sinyavsky, Nikolaj; Jadzyn, Maciej; Ostafin, Michal; Nogaj, Boleslaw

    2010-01-01

    The method of random sampling was introduced for the first time in the nutation nuclear quadrupole resonance (NQR) spectroscopy where the nutation spectra show characteristic singularities in the form of shoulders. The analytic formulae for complex two-dimensional (2-D) nutation NQR spectra (I = 3/2) were obtained and the condition for resolving the spectral singularities for small values of an asymmetry parameter η was determined. Our results show that the method of random sampling of a nutation interferogram allows significant reduction of time required to perform a 2-D nutation experiment and does not worsen the spectral resolution. PMID:20949121

  6. Recording 2-D Nutation NQR Spectra by Random Sampling Method.

    PubMed

    Glotova, Olga; Sinyavsky, Nikolaj; Jadzyn, Maciej; Ostafin, Michal; Nogaj, Boleslaw

    2010-10-01

    The method of random sampling was introduced for the first time in the nutation nuclear quadrupole resonance (NQR) spectroscopy where the nutation spectra show characteristic singularities in the form of shoulders. The analytic formulae for complex two-dimensional (2-D) nutation NQR spectra (I = 3/2) were obtained and the condition for resolving the spectral singularities for small values of an asymmetry parameter η was determined. Our results show that the method of random sampling of a nutation interferogram allows significant reduction of time required to perform a 2-D nutation experiment and does not worsen the spectral resolution.

  7. A Nonparametric Geostatistical Method For Estimating Species Importance

    Treesearch

    Andrew J. Lister; Rachel Riemann; Michael Hoppus

    2001-01-01

    Parametric statistical methods are not always appropriate for conducting spatial analyses of forest inventory data. Parametric geostatistical methods such as variography and kriging are essentially averaging procedures, and thus can be affected by extreme values. Furthermore, non-normal distributions violate the assumptions of analyses in which test statistics are...

  8. Prey Selection by an Apex Predator: The Importance of Sampling Uncertainty

    PubMed Central

    Davis, Miranda L.; Stephens, Philip A.; Willis, Stephen G.; Bassi, Elena; Marcon, Andrea; Donaggio, Emanuela; Capitani, Claudia; Apollonio, Marco

    2012-01-01

    The impact of predation on prey populations has long been a focus of ecologists, but a firm understanding of the factors influencing prey selection, a key predictor of that impact, remains elusive. High levels of variability observed in prey selection may reflect true differences in the ecology of different communities but might also reflect a failure to deal adequately with uncertainties in the underlying data. Indeed, our review showed that less than 10% of studies of European wolf predation accounted for sampling uncertainty. Here, we relate annual variability in wolf diet to prey availability and examine temporal patterns in prey selection; in particular, we identify how considering uncertainty alters conclusions regarding prey selection. Over nine years, we collected 1,974 wolf scats and conducted drive censuses of ungulates in Alpe di Catenaia, Italy. We bootstrapped scat and census data within years to construct confidence intervals around estimates of prey use, availability and selection. Wolf diet was dominated by boar (61.5±3.90 [SE] % of biomass eaten) and roe deer (33.7±3.61%). Temporal patterns of prey densities revealed that the proportion of roe deer in wolf diet peaked when boar densities were low, not when roe deer densities were highest. Considering only the two dominant prey types, Manly's standardized selection index using all data across years indicated selection for boar (mean = 0.73±0.023). However, sampling error resulted in wide confidence intervals around estimates of prey selection. Thus, despite considerable variation in yearly estimates, confidence intervals for all years overlapped. Failing to consider such uncertainty could lead erroneously to the assumption of differences in prey selection among years. This study highlights the importance of considering temporal variation in relative prey availability and accounting for sampling uncertainty when interpreting the results of dietary studies. PMID:23110122
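
    A sketch of the kind of bootstrap the authors describe (hypothetical counts; the study resampled scats and drive-census data within each year): resample both the diet and the availability data, recompute Manly's standardized index for boar, and read a percentile confidence interval off the bootstrap distribution.

      import numpy as np

      rng = np.random.default_rng(4)

      # Hypothetical one-year data: prey items identified in scats (use)
      # and animals counted in drive censuses (availability).
      use = np.array([120, 60])       # boar, roe deer items in scats
      avail = np.array([200, 300])    # boar, roe deer counted in censuses

      def manly_alpha(use_counts, avail_counts):
          u = use_counts / use_counts.sum()
          a = avail_counts / avail_counts.sum()
          r = u / a
          return r / r.sum()          # standardized index; 0.5 = no selection

      boot = []
      for _ in range(5000):
          # Resample both data sources to propagate sampling uncertainty.
          u_b = rng.multinomial(use.sum(), use / use.sum())
          a_b = rng.multinomial(avail.sum(), avail / avail.sum())
          boot.append(manly_alpha(u_b, a_b)[0])

      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"Manly's alpha for boar: {manly_alpha(use, avail)[0]:.3f} "
            f"(95% CI {lo:.3f}-{hi:.3f}); selection if CI excludes 0.5")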

  9. Prey selection by an apex predator: the importance of sampling uncertainty.

    PubMed

    Davis, Miranda L; Stephens, Philip A; Willis, Stephen G; Bassi, Elena; Marcon, Andrea; Donaggio, Emanuela; Capitani, Claudia; Apollonio, Marco

    2012-01-01

    The impact of predation on prey populations has long been a focus of ecologists, but a firm understanding of the factors influencing prey selection, a key predictor of that impact, remains elusive. High levels of variability observed in prey selection may reflect true differences in the ecology of different communities but might also reflect a failure to deal adequately with uncertainties in the underlying data. Indeed, our review showed that less than 10% of studies of European wolf predation accounted for sampling uncertainty. Here, we relate annual variability in wolf diet to prey availability and examine temporal patterns in prey selection; in particular, we identify how considering uncertainty alters conclusions regarding prey selection. Over nine years, we collected 1,974 wolf scats and conducted drive censuses of ungulates in Alpe di Catenaia, Italy. We bootstrapped scat and census data within years to construct confidence intervals around estimates of prey use, availability and selection. Wolf diet was dominated by boar (61.5 ± 3.90 [SE] % of biomass eaten) and roe deer (33.7 ± 3.61%). Temporal patterns of prey densities revealed that the proportion of roe deer in wolf diet peaked when boar densities were low, not when roe deer densities were highest. Considering only the two dominant prey types, Manly's standardized selection index using all data across years indicated selection for boar (mean = 0.73 ± 0.023). However, sampling error resulted in wide confidence intervals around estimates of prey selection. Thus, despite considerable variation in yearly estimates, confidence intervals for all years overlapped. Failing to consider such uncertainty could lead erroneously to the assumption of differences in prey selection among years. This study highlights the importance of considering temporal variation in relative prey availability and accounting for sampling uncertainty when interpreting the results of dietary studies.

  10. A method to optimize sampling locations for measuring indoor air distributions

    NASA Astrophysics Data System (ADS)

    Huang, Yan; Shen, Xiong; Li, Jianmin; Li, Bingye; Duan, Ran; Lin, Chao-Hsin; Liu, Junjie; Chen, Qingyan

    2015-02-01

    Indoor air distributions, such as the distributions of air temperature, air velocity, and contaminant concentrations, are very important to occupants' health and comfort in enclosed spaces. When point data is collected for interpolation to form field distributions, the sampling locations (the locations of the point sensors) have a significant effect on time invested, labor costs and measuring accuracy on field interpolation. This investigation compared two different sampling methods: the grid method and the gradient-based method, for determining sampling locations. The two methods were applied to obtain point air parameter data in an office room and in a section of an economy-class aircraft cabin. The point data obtained was then interpolated to form field distributions by the ordinary Kriging method. Our error analysis shows that the gradient-based sampling method has 32.6% smaller error of interpolation than the grid sampling method. We acquired the function between the interpolation errors and the sampling size (the number of sampling points). According to the function, the sampling size has an optimal value and the maximum sampling size can be determined by the sensor and system errors. This study recommends the gradient-based sampling method for measuring indoor air distributions.
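
    The interpolation step can be sketched with a minimal ordinary Kriging solver (ours; the exponential variogram, its parameters, and the sensor data below are assumptions rather than the study's fitted model):

      import numpy as np

      rng = np.random.default_rng(5)

      # Assumed exponential semivariogram.
      def variogram(h, sill=1.0, length=2.0):
          return sill * (1.0 - np.exp(-h / length))

      def ordinary_kriging(xy, z, xy0):
          n = len(z)
          d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
          # Kriging system: [[Gamma, 1], [1, 0]] @ [w, mu] = [gamma0, 1]
          A = np.ones((n + 1, n + 1))
          A[:n, :n] = variogram(d)
          A[-1, -1] = 0.0
          b = np.ones(n + 1)
          b[:n] = variogram(np.linalg.norm(xy - xy0, axis=1))
          w = np.linalg.solve(A, b)[:n]
          return float(w @ z)

      # Hypothetical point temperatures at 30 sampling locations in a room.
      xy = rng.uniform(0.0, 5.0, (30, 2))
      z = 20.0 + np.sin(xy[:, 0]) + 0.5 * xy[:, 1] + rng.normal(0.0, 0.1, 30)
      print("interpolated value at (2.5, 2.5):",
            round(ordinary_kriging(xy, z, np.array([2.5, 2.5])), 3))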

  11. A comparison of sampling methods for examining the laryngeal microbiome

    PubMed Central

    Hanshew, Alissa S.; Jetté, Marie E.; Tadayon, Stephanie; Thibeault, Susan L.

    2017-01-01

    Shifts in healthy human microbial communities have now been linked to disease in numerous body sites. Noninvasive swabbing remains the sampling technique of choice in most locations; however, it is not well known if this method samples the entire community, or only those members that are easily removed from the surface. We sought to compare the communities found via swabbing and biopsied tissue in true vocal folds, a location that is difficult to sample without causing potential damage and impairment to tissue function. A secondary aim of this study was to determine if swab sampling of the false vocal folds could be used as proxy for true vocal folds. True and false vocal fold mucosal samples (swabbed and biopsied) were collected from six pigs and used for 454 pyrosequencing of the V3–V5 region of the 16S rRNA gene. Most of the alpha and beta measures of diversity were found to be significantly similar between swabbed and biopsied tissue samples. Similarly, the communities found in true and false vocal folds did not differ considerably. These results suggest that samples taken via swabs are sufficient to assess the community, and that samples taken from the false vocal folds may be used as proxies for the true vocal folds. Assessment of these techniques opens an avenue to less traumatic means to explore the role microbes play in the development of diseases of the vocal folds, and perhaps the rest of the respiratory tract. PMID:28362810

  12. NEW COLUMN SEPARATION METHOD FOR EMERGENCY URINE SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2007-08-28

    The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.

  13. Detecting spatial structures in throughfall data: the effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-04-01

    In the last three decades, an increasing number of studies analyzed spatial patterns in throughfall to investigate the consequences of rainfall redistribution for biogeochemical and hydrological processes in forests. In the majority of cases, variograms were used to characterize the spatial properties of the throughfall data. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and an appropriate layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation methods on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with heavy outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates.
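
    For reference, the method-of-moments (Matheron) estimator evaluated here computes half the mean squared difference of all data pairs falling in each lag bin; a minimal sketch on simulated skewed data (the field, plot size, and bin count are illustrative, and the simulated data carry no spatial correlation):

      import numpy as np

      rng = np.random.default_rng(6)

      # Method-of-moments (Matheron) empirical variogram.
      def empirical_variogram(xy, z, n_bins=10):
          i, j = np.triu_indices(len(z), k=1)          # all data pairs
          d = np.linalg.norm(xy[i] - xy[j], axis=1)    # pair distances
          sq = (z[i] - z[j]) ** 2                      # squared differences
          edges = np.linspace(0.0, d.max(), n_bins + 1)
          which = np.clip(np.digitize(d, edges) - 1, 0, n_bins - 1)
          gamma = np.array([sq[which == b].mean() / 2.0
                            for b in range(n_bins)])
          return 0.5 * (edges[:-1] + edges[1:]), gamma

      # 150 collectors on a 50 m plot, a sample size the study pairs with
      # REML for a good accuracy/efficiency compromise; skewed values.
      xy = rng.uniform(0.0, 50.0, (150, 2))
      z = rng.lognormal(mean=0.0, sigma=0.5, size=150)
      for h, g in zip(*empirical_variogram(xy, z)):
          print(f"lag {h:5.1f} m: gamma = {g:.3f}")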

  14. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous

  15. 19 CFR 19.8 - Examination of goods by importer; sampling; repacking; examination of merchandise by prospective...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Examination of goods by importer; sampling... goods by importer; sampling; repacking; examination of merchandise by prospective purchasers. Importers... conduct of Customs business and no danger to the revenue prospective purchaser may be permitted to examine...

  16. 19 CFR 19.8 - Examination of goods by importer; sampling; repacking; examination of merchandise by prospective...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 1 2011-04-01 2011-04-01 false Examination of goods by importer; sampling... goods by importer; sampling; repacking; examination of merchandise by prospective purchasers. Importers... conduct of Customs business and no danger to the revenue prospective purchaser may be permitted to examine...

  17. 19 CFR 19.8 - Examination of goods by importer; sampling; repacking; examination of merchandise by prospective...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 19 Customs Duties 1 2013-04-01 2013-04-01 false Examination of goods by importer; sampling... goods by importer; sampling; repacking; examination of merchandise by prospective purchasers. Importers... conduct of Customs business and no danger to the revenue prospective purchaser may be permitted to examine...

  18. A molecular method to assess Phytophthora diversity in environmental samples.

    PubMed

    Scibetta, Silvia; Schena, Leonardo; Chimento, Antonio; Cacciola, Santa O; Cooke, David E L

    2012-03-01

    Current molecular detection methods for the genus Phytophthora are specific to a few key species rather than the whole genus and this is a recognized weakness of protocols for ecological studies and international plant health legislation. In the present study a molecular approach was developed to detect Phytophthora species in soil and water samples using novel sets of genus-specific primers designed against the internal transcribed spacer (ITS) regions. Two different rDNA primer sets were tested: one assay amplified a long product including the ITS1, 5.8S and ITS2 regions (LP) and the other a shorter product including the ITS1 only (SP). Both assays specifically amplified products from Phytophthora species without cross-reaction with the related Pythium s. lato, however the SP assay proved the more sensitive and reliable. The method was validated using woodland soil and stream water from Invergowrie, Scotland. On-site use of a knapsack sprayer and in-line water filters proved more rapid and effective than centrifugation at sampling Phytophthora propagules. A total of 15 different Phytophthora phylotypes were identified which clustered within the reported ITS-clades 1, 2, 3, 6, 7 and 8. The range and type of the sequences detected varied from sample to sample and up to three and five different Phytophthora phylotypes were detected within a single sample of soil or water, respectively. The most frequently detected sequences were related to members of ITS-clade 6 (i.e. P. gonapodyides-like). The new method proved very effective at discriminating multiple species in a given sample and can also detect as yet unknown species. The reported primers and methods will prove valuable for ecological studies, biosecurity and commercial plant, soil or water (e.g. irrigation water) testing as well as the wider metagenomic sampling of this fascinating component of microbial pathogen diversity.

  19. RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-08-27

    The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2008. A new rapid column separation method was used for analysis of actinides and 90Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved, less than 3 hours for determination of 90Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times from NRIP 2007 and a ~100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and 90Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples and rugged methods are essential. Extremely high levels of 210Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced 210Po removal step, which will be described.

  20. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... independent laboratory shall also include with the retained sample the test result for benzene as...

  1. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... independent laboratory shall also include with the retained sample the test result for benzene as...

  2. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-01

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  3. Proteome Analysis of Human Perilymph using an Intraoperative Sampling Method.

    PubMed

    Schmitt, Heike Andrea; Pich, Andreas; Schröder, Anke; Scheper, Verena; Lilli, Giorgio; Reuter, Günter; Lenarz, Thomas

    2017-03-10

    The knowledge about the etiology and pathophysiology of sensorineural hearing loss (SNHL) is still very limited. The project aims to improve the understanding of different types of SNHL through proteome analysis of human perilymph. Sampling of perilymph has been established during inner ear surgeries (cochlear implant and vestibular schwannoma surgeries), and the safety of the sampling method was determined by pure tone audiometry. An in-depth shotgun proteomics approach was performed to identify cochlear proteins and the individual proteome in the perilymph of patients. This method enables the identification and quantification of the protein composition of perilymph. The proteome of 41 collected perilymph samples with volumes of 1-12 µl was analyzed by data-dependent acquisition, resulting in 878 detected protein groups overall. At least 203 protein groups were identified solely in perilymph, not in reference samples (serum, cerebrospinal fluid), displaying a specific protein pattern for perilymph. Samples were grouped according to patient age and type of surgery, leading to the identification of some proteins specific to particular subgroups. Proteins with different abundances between sample groups were subjected to classification by gene ontology annotations. The identified proteins might be used to develop tools for non-invasive inner ear diagnostics and to elucidate molecular profiles of SNHL.

  4. Importance of long-time simulations for rare event sampling in zinc finger proteins.

    PubMed

    Godwin, Ryan; Gmeiner, William; Salsbury, Freddie R

    2016-01-01

    Molecular dynamics (MD) simulation methods have seen significant improvement since their inception in the late 1950s. Constraints of simulation size and duration that once impeded the field have lessened with the advent of better algorithms, faster processors, and parallel computing. With newer techniques and hardware available, MD simulations of more biologically relevant timescales can now sample a broader range of conformational and dynamical changes including rare events. One concern in the literature has been under which circumstances it is sufficient to perform many shorter timescale simulations and under which circumstances fewer longer simulations are necessary. Herein, our simulations of the zinc finger NEMO (2JVX) using multiple simulations of length 15, 30, 1000, and 3000 ns are analyzed to provide clarity on this point.

  5. The importance of measuring and accounting for potential biases in respondent-driven samples.

    PubMed

    Rudolph, Abby E; Fuller, Crystal M; Latkin, Carl

    2013-07-01

    Respondent-driven sampling (RDS) is often viewed as a superior method for recruiting hard-to-reach populations disproportionately burdened with poor health outcomes. As an analytic approach, it has been praised for its ability to generate unbiased population estimates via post-stratified weights which account for non-random recruitment. However, population estimates generated with RDSAT (RDS Analysis Tool) are sensitive to variations in degree weights. Several assumptions are implicit in the degree weight and are not routinely assessed. Failure to meet these assumptions could result in inaccurate degree measures and consequently result in biased population estimates. We highlight potential biases associated with violating the assumptions implicit in degree weights for the RDSAT estimator and propose strategies to measure and possibly correct for biases in the analysis.
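
    The role of the degree weight can be made concrete with a small sketch (hypothetical data; this is an RDS-II/Volz-Heckathorn-style weighting shown for illustration, not the RDSAT implementation): each respondent is weighted by the inverse of the reported network degree, so uniform misreporting cancels out of the point estimate while group-differential misreporting does not.

      import numpy as np

      rng = np.random.default_rng(7)

      degrees = rng.integers(1, 50, size=300)   # self-reported network degrees
      outcome = rng.random(300) < 0.3           # e.g., infection status

      naive = outcome.mean()
      w = 1.0 / degrees                         # inverse-degree weights
      weighted = (w * outcome).sum() / w.sum()

      print(f"naive sample proportion  : {naive:.3f}")
      print(f"degree-weighted estimate : {weighted:.3f}")
      # If every respondent doubled their reported degree, all weights would
      # scale by 1/2 and the estimate would be unchanged; misreporting that
      # differs between groups, however, biases it -- the failure mode the
      # authors propose measuring.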

  6. The importance of measuring and accounting for potential biases in respondent-driven samples

    PubMed Central

    Rudolph, Abby E.; Fuller, Crystal M.; Latkin, Carl

    2013-01-01

    Respondent-driven sampling (RDS) is often viewed as a superior method for recruiting hard-to-reach populations disproportionately burdened with poor health outcomes. As an analytic approach, it has been praised for its ability to generate unbiased population estimates via post-stratified weights which account for non-random recruitment. However, population estimates generated with RDSAT (RDS Analysis Tool) are sensitive to variations in degree weights. Several assumptions are implicit in the degree weight and are not routinely assessed. Failure to meet these assumptions could result in inaccurate degree measures and consequently result in biased population estimates. We highlight potential biases associated with violating the assumptions implicit in degree weights for the RDSAT estimator and propose strategies to measure and possibly correct for biases in the analysis. PMID:23515641

  7. Blue noise sampling method based on mixture distance

    NASA Astrophysics Data System (ADS)

    Qin, Hongxing; Hong, XiaoYang; Xiao, Bin; Zhang, Shaoting; Wang, Guoyin

    2014-11-01

    Blue noise sampling is a core component for a large number of computer graphic applications such as imaging, modeling, animation, and rendering. However, most existing methods are concentrated on preserving spatial domain properties like density and anisotropy, while ignoring feature preserving. In order to solve the problem, we present a new distance metric called mixture distance for blue noise sampling, which is a combination of geodesic and feature distances. Based on mixture distance, the blue noise property and features can be preserved by controlling the ratio of the geodesic distance to the feature distance. With the intention of meeting different requirements from various applications, an adaptive adjustment for parameters is also proposed to achieve a balance between the preservation of features and spatial properties. Finally, implementation on a graphic processing unit is introduced to improve the efficiency of computation. The efficacy of the method is demonstrated by the results of image stippling, surface sampling, and remeshing.
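
    A dart-throwing sketch of the idea (ours; the feature field and blend weight alpha are made up, Euclidean distance stands in for the paper's geodesic distance, and the GPU implementation is not reproduced): a candidate point is accepted only if its mixture distance to every accepted sample exceeds a minimum radius.

      import numpy as np

      rng = np.random.default_rng(8)

      # Toy scalar feature field over the unit square.
      feature = lambda p: np.sin(3 * p[0]) * np.cos(3 * p[1])

      def mixture_dist(p, q, alpha=0.7):
          d_geo = np.linalg.norm(p - q)             # spatial term
          d_feat = abs(feature(p) - feature(q))     # feature term
          return alpha * d_geo + (1 - alpha) * d_feat

      # Dart throwing: accept candidates that keep the minimum separation.
      samples, r_min, tries = [], 0.15, 20_000
      for _ in range(tries):
          cand = rng.random(2)
          if all(mixture_dist(cand, s) >= r_min for s in samples):
              samples.append(cand)

      print(f"accepted {len(samples)} blue-noise samples")

    Raising alpha toward 1 recovers plain spatial blue noise; lowering it concentrates samples where the feature varies, which is the trade-off the adaptive parameter adjustment in the paper targets.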

  8. Source sampling and analysis guidance: A methods directory

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Baughman, K.W.; James, R.H.; Spafford, R.B.

    1991-01-01

    Sampling and analytical methodologies are needed by EPA and industry for testing stationary sources for specific organic compounds such as those listed under the Resource Conservation and Recovery Act (RCRA) Appendix 8 and Appendix 9 and the Clean Air Act of 1990. A computerized directory, the Problem POHC Reference Directory, has been developed that supplies information on available field sampling and analytical methodology for each compound in those lists. Existing EPA methods are referenced if applicable, along with their validation status. At present, the database is strongly oriented toward combustion sources. The database may be searched on several parameters including name, Chemical Abstracts Service (CAS) number, physical properties, thermal stability, combustion rank, or general problem areas in sampling or analysis. The methods directory is menu driven and requires no programming ability; however, some familiarity with dBASE III+ would be helpful.

  9. Method and apparatus for sampling low-yield wells

    DOEpatents

    Last, George V.; Lanigan, David C.

    2003-04-15

    An apparatus and method for collecting a sample from a low-yield well or perched aquifer includes a pump and a controller responsive to water level sensors for filling a sample reservoir. The controller activates the pump to fill the reservoir when the water level in the well reaches a high level as indicated by the sensor. The controller deactivates the pump when the water level reaches a lower level as indicated by the sensors. The controller continues to activate and deactivate the pump until the sample reservoir is filled with a desired volume, as indicated by a reservoir sensor. At the beginning of each activation cycle, the controller can optionally purge an initial quantity of water prior to filling the sample reservoir. The reservoir can be substantially devoid of air and the pump is a low volumetric flow rate pump. Both the pump and the reservoir can be located either inside or outside the well.
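
    The described control cycle reads as a simple state machine; a hedged sketch driven by a toy simulation of well recharge (all thresholds, rates, and volumes are made up, and real sensor and pump I/O is not shown):

      # Simulated version of the patent's pump/sensor control cycle.
      water = 0.0            # simulated water level in the well (m)
      reservoir = 0.0        # volume collected so far (L)
      HIGH, LOW, TARGET = 1.0, 0.2, 2.0
      purge_done = False     # optional purge of the first draw

      while reservoir < TARGET:
          water += 0.05                  # slow recharge of the low-yield well
          if water >= HIGH:              # high-level sensor trips: pump on
              drawn = water - LOW        # pump down to the low-level sensor
              water = LOW                # low-level sensor trips: pump off
              if not purge_done:
                  purge_done = True      # discard the initial quantity
              else:
                  reservoir += drawn     # route later draws to the reservoir

      print(f"collected {reservoir:.2f} L over repeated pump cycles")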

  10. The impact of particle size selective sampling methods on occupational assessment of airborne beryllium particulates.

    PubMed

    Sleeth, Darrah K

    2013-05-01

    In 2010, the American Conference of Governmental Industrial Hygienists (ACGIH) formally changed its Threshold Limit Value (TLV) for beryllium from a 'total' particulate sample to an inhalable particulate sample. This change may have important implications for workplace air sampling of beryllium. A history of particle size-selective sampling methods, with a special focus on beryllium, will be provided. The current state of the science on inhalable sampling will also be presented, including a look to the future at what new methods or technology may be on the horizon. This includes new sampling criteria focused on particle deposition in the lung, proposed changes to the existing inhalable convention, as well as how the issues facing beryllium sampling may help drive other changes in sampling technology.

  11. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Representative Sampling Methods I Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Pt. 261, App. I Appendix I to Part...

  12. A General Linear Method for Equating with Small Samples

    ERIC Educational Resources Information Center

    Albano, Anthony D.

    2015-01-01

    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…
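
    The baseline that the general method subsumes is classical linear observed-score equating, which maps a score on one form to the scale of another by matching means and standard deviations; a small sketch with made-up score data:

      import numpy as np

      # Hypothetical observed scores on two test forms.
      form_x = np.array([12, 15, 18, 20, 22, 25, 27, 30])
      form_y = np.array([10, 14, 17, 19, 23, 24, 28, 29])

      def linear_equate(x, ref, new):
          # Map a form-"new" score onto the form-"ref" scale by matching
          # the first two moments: l(x) = mu_ref + (sd_ref/sd_new)(x - mu_new).
          slope = ref.std(ddof=1) / new.std(ddof=1)
          return ref.mean() + slope * (x - new.mean())

      for score in (15, 20, 25):
          print(f"form-X score {score} -> form-Y scale "
                f"{linear_equate(score, form_y, form_x):.2f}")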

  13. METHODS FOR THE ANALYSIS OF CARPET SAMPLES FOR ASBESTOS

    EPA Science Inventory

    Assessing asbestos fiber contamination in a carpet is complicated by the nature of the carpeting – because of the pile’s rough surface and thickness, samples cannot be collected directly from carpet for analysis by TEM. Two indirect methods are currently used by laboratories when...

  15. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Method of sample analysis. 58.245 Section 58.245..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating...

  16. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 3 2011-01-01 2011-01-01 false Methods of sample analysis. 58.812 Section 58.812..., GENERAL SPECIFICATIONS FOR APPROVED PLANTS AND STANDARDS FOR GRADES OF DAIRY PRODUCTS 1 General Specifications for Dairy Plants Approved for USDA Inspection and Grading Service 1 Operations and Operating...

  18. Tracking workforce diversity in dentistry: importance, methods, and challenges.

    PubMed

    Mertz, Elizabeth; Wides, Cynthia; Cooke, Alexis; Gates, Paul E

    2016-01-01

    The objectives of this paper are to describe sources of data on underrepresented minority (URM) dental providers and to perform a structured critique of primary survey research on African American (AA), Hispanic/Latino (HL), and American Indian/Alaska Native (AI/AN) dentists. A national sample survey was conducted between October 2012 and March 2013, and secondary datasets were assessed for comparability. The survey used 21 sampling frames, with censuses of AI/AN and nonurban dentists, and assessed demographics, education, practice history, patient population, volunteerism, experiences with discrimination, and opinions on issues in dentistry. The survey was developed with constituent input, pilot-tested, and distributed online and through US mail with three reminder postcards, phone, and email follow-up. Continuing education credit and entry to a prize drawing were provided for participation. Existing data sources cannot answer critical research questions about URM dentists. Using best practices, the survey received a 34 percent adjusted response rate. Selection likelihood and measurable response bias were adjusted for using base and poststratification weights. The survey design was consistent with best practices, and our response analytics provide high confidence that the survey produced data representative of the URM dentist population. Enhanced study design, content, and response rates of existing survey efforts would be needed to provide a more robust body of knowledge on URM providers, perspectives, and practices. © 2015 American Association of Public Health Dentistry.

  19. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... include with the retained sample the test result for benzene as conducted pursuant to § 80.46(e). (b... sample the test result for benzene as conducted pursuant to § 80.47....

  20. 40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Practice for Manual Sampling of Petroleum and Petroleum Products.” (ii) Samples collected under the... present that could affect the sulfur test result. (2) Automatic sampling of petroleum products in..., entitled “Standard Practice for Automatic Sampling of Petroleum and Petroleum Products.” (c) Test...

  1. [Methods of the elaboration of data of the cardiological importance].

    PubMed

    Marchesi, C; Taddei, A; Varanini, M

    1987-12-01

    This paper deals with some introductory topics of signal processing and decision making in cardiology. In both cases the material is organized around general schemes suited to hosting different applications. Signal processing is divided into phases: acquisition, storage, and analysis, and each is described with applications to specific signals. In a similar manner, the methods for decision making have been simplified to a scheme including a "knowledge base" and an "inference method". The scheme is used to classify various implementations. Bayes analysis and expert systems are introduced in some detail.

  2. Universal nucleic acids sample preparation method for cells, spores and their mixture

    DOEpatents

    Bavykin, Sergei [Darien, IL

    2011-01-18

    The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types, including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e. spores). Unlike prior art methods, which are focused on extracting nucleic acids from vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample; isolates, labels, and fragments the nucleic acids; and purifies the labeled samples from the excess of dye.

  3. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected was related to relative length of larval and breeding periods and tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  4. Bayesian Methods for Determining the Importance of Effects

    USDA-ARS?s Scientific Manuscript database

    Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from the Fisher Significance Test and Hypothesis Test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...

  5. A Modified Trap for Adult Sampling of Medically Important Flies (Insecta: Diptera)

    PubMed Central

    Akbarzadeh, Kamran; Rafinejad, Javad; Nozari, Jamasb; Rassi, Yavar; Sedaghat, Mohammad Mehdi; Hosseini, Mostafa

    2012-01-01

    Background: Bait-trapping appears to be a generally useful method of studying fly populations. The aim of this study was to construct a new adult flytrap by making some modifications to former versions and to evaluate its applicability in a subtropical zone in southern Iran. Methods: The traps were constructed by adding some equipment to a polyethylene container (18 × 20 × 33 cm) with a lid. Fresh sheep meat was used as bait. In total, 27 modified adult traps were made and tested for their efficacy in attracting adult flies. The experiment was carried out in a range of different topographic areas of Fars Province during June 2010. Results: The traps were able to attract various groups of adult flies belonging to the families Calliphoridae, Sarcophagidae, Muscidae, and Fanniidae. The species Calliphora vicina (Diptera: Calliphoridae), Sarcophaga argyrostoma (Diptera: Sarcophagidae) and Musca domestica (Diptera: Muscidae) made up the majority of the flies collected by this sheep-meat baited trap. Conclusion: This adult flytrap can be recommended for routine field sampling to study the diversity and population dynamics of flies where daily collection is difficult. PMID:23378969

  6. Sample Selected Averaging Method for Analyzing the Event Related Potential

    NASA Astrophysics Data System (ADS)

    Taguchi, Akira; Ono, Youhei; Kimura, Tomoaki

    The event-related potential (ERP) is often measured through the oddball task, in which subjects are presented with a “rare stimulus” and a “frequent stimulus”. Measured ERPs are analyzed by the averaging technique; the amplitude of the P300 component becomes large when the rare stimulus is given. However, the measured ERPs include trials that lack the original features of the ERP, so it is necessary to reject unsuitable trials when using the averaging technique. In this paper, we propose a rejection method for unsuitable measured ERPs for use with the averaging technique. Moreover, we combine the proposed method with Woody's adaptive filter method.
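    As an illustration of the trial-rejection idea described above, here is a minimal sketch assuming trials are stored as a 2-D array in microvolts; the peak-to-peak amplitude criterion and its threshold are illustrative assumptions, not the authors' specific rejection rule.

```python
import numpy as np

def averaged_erp(trials, reject_uv=100.0):
    """Average ERP trials after discarding artifact-contaminated ones.

    trials    : array of shape (n_trials, n_samples), in microvolts
    reject_uv : peak-to-peak threshold (assumed); trials exceeding it,
                e.g. due to eye blinks, are excluded from the average
    """
    ptp = trials.max(axis=1) - trials.min(axis=1)  # peak-to-peak per trial
    keep = ptp <= reject_uv
    if not keep.any():
        raise ValueError("all trials rejected; loosen the threshold")
    return trials[keep].mean(axis=0), keep

# Usage: erp, kept = averaged_erp(rare_stimulus_trials)
```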

  7. Comparison of DNA preservation methods for environmental bacterial community samples

    USGS Publications Warehouse

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.

    2013-01-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  8. Comparison of aquatic macroinvertebrate samples collected using different field methods

    USGS Publications Warehouse

    Lenz, Bernard N.; Miller, Michael A.

    1996-01-01

    Government agencies, academic institutions, and volunteer monitoring groups in the State of Wisconsin collect aquatic macroinvertebrate data to assess water quality. Sampling methods differ among agencies, reflecting the differences in the sampling objectives of each agency. Lack of information about data comparability impedes data sharing among agencies, which can result in duplicated sampling efforts or the underutilization of available information. To address these concerns, comparisons were made of macroinvertebrate samples collected from wadeable streams in Wisconsin by personnel from the U.S. Geological Survey-National Water Quality Assessment Program (USGS-NAWQA), the Wisconsin Department of Natural Resources (WDNR), the U.S. Department of Agriculture-Forest Service (USDA-FS), and volunteers from the Water Action Volunteer-Water Quality Monitoring Program (WAV). This project was part of the Intergovernmental Task Force on Monitoring Water Quality (ITFM) Wisconsin Water Resources Coordination Project. The numbers, types, and environmental tolerances of the organisms collected were analyzed to determine if the four different field methods that were used by the different agencies and volunteer groups provide comparable results. Additionally, this study compared the results of samples taken from different locations and habitats within the same streams.

  9. Methods for assessing relative importance in preference based outcome measures.

    PubMed

    Kaplan, R M; Feeny, D; Revicki, D A

    1993-12-01

    This paper reviews issues relevant to preference assessment for utility-based measures of health-related quality of life. Cost/utility studies require a common measurement of health outcome, such as the quality-adjusted life year (QALY). A key element in the QALY methodology is the measure of preference that estimates subjective health quality. Economists and psychologists differ on their preferred approach to preference measurement. Economists rely on utility assessment methods that formally consider economic trades. These methods include the standard gamble, time trade-off, and person trade-off. However, some evidence suggests that many of the assumptions that underlie economic measurements of choice are open to challenge, because human information processors do poorly at integrating complex probability information when making decisions that involve risk. Further, economic analysis assumes that choices accurately correspond to the way rational humans use information. Psychology experiments suggest that methods commonly used for economic analysis do not represent the underlying true preference continuum, and some evidence supports the use of simple rating scales. More recent research by economists attempts to integrate cognitive models, while contemporary research by psychologists considers economic models of choice. The review also suggests that differences in preference between different social groups tend to be small.
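    To make the QALY bookkeeping behind such cost/utility comparisons concrete, here is a hedged toy calculation; the utility weights, life-years, and cost are invented for illustration and do not come from the review.

```python
# A QALY scales life-years by a preference weight (0 = death, 1 = full health).
def qalys(utility_weight: float, years: float) -> float:
    return utility_weight * years

# Hypothetical treatment: 4 years at utility 0.7, versus 5 years at
# utility 0.4 without it (all numbers invented for illustration).
gain = qalys(0.7, 4) - qalys(0.4, 5)    # 2.8 - 2.0 = 0.8 QALYs gained
cost_per_qaly = 24_000 / gain           # 30,000 currency units per QALY
print(f"{cost_per_qaly:,.0f} per QALY gained")
```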

  10. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    SciTech Connect

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  11. New Methods of Sample Preparation for Atom Probe Specimens

    NASA Technical Reports Server (NTRS)

    Kuhlman, Kimberly R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to contain manganese and aluminum. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, analyze samples of interest to geomicrobiology, and support the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples. Many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. The dicing saw, commonly used to section semiconductor wafers into individual devices following manufacture, is a time-effective method for preparing high-aspect-ratio posts of poorly conducting materials. Femtosecond laser micromachining is also suitable for preparing posts. The FIB time required is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated using the dicing saw.

  13. A method for sampling microbial aerosols using high altitude balloons.

    PubMed

    Bryan, N C; Stewart, M; Granger, D; Guzik, T G; Christner, B C

    2014-12-01

    Owing to the challenges posed to microbial aerosol sampling at high altitudes, very little is known about the abundance, diversity, and extent of microbial taxa in the Earth-atmosphere system. To directly address this knowledge gap, we designed, constructed, and tested a system that passively samples aerosols during ascent through the atmosphere while tethered to a helium-filled latex sounding balloon. The sampling payload weighs ~2.7 kg and comprises an electronics box and three sampling chambers (one serving as a procedural control). Each chamber is sealed with retractable doors that can be commanded to open and close at designated altitudes. The payload is deployed together with radio beacons that transmit GPS coordinates (latitude, longitude, and altitude) in real time for tracking and recovery. A cut mechanism separates the payload string from the balloon at any desired altitude, returning all equipment safely to the ground on a parachute. When the chambers are opened, aerosol sampling is performed using the Rotorod® collection method (40 rods per chamber), with each rod passing through 0.035 m3 per km of altitude sampled. Based on quality control measurements, the collection of ~100 cells rod(-1) provided a 3-sigma confidence level of detection. The payload system described can be mated with any type of balloon platform and provides a tool for characterizing the vertical distribution of microorganisms in the troposphere and stratosphere. Copyright © 2014 Elsevier B.V. All rights reserved.
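    The sampling geometry above implies a simple back-of-envelope detection limit; the sketch below uses only the figures quoted in the abstract, with the ascent interval chosen as an assumed example.

```python
# Figures quoted above: each rod sweeps 0.035 m^3 per km of ascent, and
# ~100 cells per rod are needed for a 3-sigma detection.
volume_per_rod_per_km = 0.035   # m^3 swept by one rod per km of altitude
detection_cells_per_rod = 100   # 3-sigma detection level

ascent_km = 20                  # assumed sampling interval, e.g. 5-25 km
swept = volume_per_rod_per_km * ascent_km     # 0.7 m^3 per rod
min_conc = detection_cells_per_rod / swept    # ~143 cells per m^3
print(f"minimum detectable concentration: {min_conc:.0f} cells/m^3")
```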

  14. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY SOIL SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.; Noyes, G.

    2009-11-09

    A new rapid method for the determination of actinides in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for samples up to 2 grams in emergency response situations. The actinides-in-soil method utilizes a rapid sodium hydroxide fusion method, a lanthanum fluoride soil matrix removal step, and a streamlined column separation process with stacked TEVA, TRU and DGA Resin cartridges. Lanthanum was separated rapidly and effectively from Am and Cm on DGA Resin. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha sources are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency soil samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinide results for the soil samples were reported within 4-5 hours with excellent quality.

  15. On the Importance of Accounting for Competing Risks in Pediatric Brain Cancer: II. Regression Modeling and Sample Size

    SciTech Connect

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-03-15

    Purpose: To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. Methods and Materials: We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. Results: The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Conclusions: Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest.
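    For readers wanting to reproduce the distinction the abstract draws, here is a self-contained sketch of the nonparametric cumulative incidence function under competing risks (the quantity both modeling approaches target); it is an illustrative Aalen-Johansen-style estimator, not the authors' code, and the event coding is assumed.

```python
import numpy as np

def cumulative_incidence(time, event, cause):
    """Cumulative incidence of one event type under competing risks.

    time  : event or censoring time per subject
    event : 0 = censored; 1, 2, ... = competing event type codes (assumed)
    cause : the event type whose cumulative incidence is wanted
    Ties are processed sequentially, which is adequate for a sketch.
    """
    time, event = np.asarray(time, float), np.asarray(event, int)
    order = np.argsort(time, kind="stable")
    time, event = time[order], event[order]
    n = len(time)

    surv, cif = 1.0, 0.0          # overall survival S(t-) and F_cause(t)
    times, values = [], []
    for i in range(n):
        at_risk = n - i
        if event[i] == cause:
            cif += surv / at_risk         # cause-specific hazard x S(t-)
        if event[i] != 0:
            surv *= 1.0 - 1.0 / at_risk   # any event depletes survival
        times.append(time[i])
        values.append(cif)
    return np.array(times), np.array(values)
```

    Unlike a naive Kaplan-Meier analysis that censors competing events (and so overestimates each event's risk), the cause-specific cumulative incidences computed this way sum to 1 - S(t) across event types A, B, and C.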

  16. The laboratory methods of induced polarization measurement of manganese sample

    NASA Astrophysics Data System (ADS)

    Adhiguna, D.; Handayani, G.

    2015-09-01

    Metallic minerals are polarizable, and this property can serve as the basis for metallic mineral exploration. In this study, the induced polarization method was used to observe the polarization phenomena that occur in rocks containing manganese minerals. Induced polarization is a geophysical method based on the principle of the electrical charging and discharging of a capacitor, applied to the rock. Using induced polarization, the chargeability of the rock can be determined; chargeability is one of the important properties of metallic materials. Measurements in this study were made in two different ways in order to compare the induced polarization effects observed with each method.
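    As background for the quantity being measured, here is a minimal sketch of the standard time-domain chargeability integral, M = (1/V0) ∫ Vs(t) dt over a decay window after current shut-off; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def chargeability_ms(t_s, vs_mv, v0_mv):
    """Apparent chargeability in ms from a sampled decay curve.

    t_s   : sample times within the decay window, in seconds
    vs_mv : secondary (decay) voltages after current turn-off, in mV
    v0_mv : primary voltage during the current-on time, in mV
    """
    t = np.asarray(t_s, float)
    vs = np.asarray(vs_mv, float)
    area = np.sum(0.5 * (vs[1:] + vs[:-1]) * np.diff(t))  # trapezoid rule
    return area / v0_mv * 1e3   # (mV*s)/mV -> s, reported in ms
```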

  17. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements...

  18. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements...

  19. Spanish Multicenter Normative Studies (NEURONORMA Project): methods and sample characteristics.

    PubMed

    Peña-Casanova, Jordi; Blesa, Rafael; Aguilar, Miquel; Gramunt-Fombuena, Nina; Gómez-Ansón, Beatriz; Oliva, Rafael; Molinuevo, José Luis; Robles, Alfredo; Barquero, María Sagrario; Antúnez, Carmen; Martínez-Parra, Carlos; Frank-García, Anna; Fernández, Manuel; Alfonso, Verónica; Sol, Josep M

    2009-06-01

    This paper describes the methods and sample characteristics of a series of Spanish normative studies (The NEURONORMA project). The primary objective of our research was to collect normative and psychometric information on a sample of people aged over 49 years. The normative information was based on a series of selected, but commonly used, neuropsychological tests covering attention, language, visuo-perceptual abilities, constructional tasks, memory, and executive functions. A sample of 356 community dwelling individuals was studied. Demographics, socio-cultural, and medical data were collected. Cognitive normality was validated via informants and a cognitive screening test. Norms were calculated for midpoint age groups. Effects of age, education, and sex were determined. The use of these norms should improve neuropsychological diagnostic accuracy in older Spanish subjects. These data may also be of considerable use for comparisons with other normative studies. Limitations of these normative data are also commented on.

  20. 40 CFR 80.1644 - Sampling and testing requirements for producers and importers of certified ethanol denaturant.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... producers and importers of certified ethanol denaturant. 80.1644 Section 80.1644 Protection of Environment... ethanol denaturant. (a) Sample and test each batch of certified ethanol denaturant. (1) Producers and importers of certified ethanol denaturant shall collect a representative sample from each batch of...

  1. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  3. A direct method for e-cigarette aerosol sample collection.

    PubMed

    Olmedo, Pablo; Navas-Acien, Ana; Hess, Catherine; Jarmul, Stephanie; Rule, Ana

    2016-08-01

    E-cigarette use is increasing in populations around the world. Recent evidence has shown that the aerosol produced by e-cigarettes can contain a variety of toxicants. Published studies characterizing toxicants in e-cigarette aerosol have relied on filters, impingers, or sorbent tubes, methods that require diluting or extracting the sample in a solution during collection. We have developed a collection system that directly condenses e-cigarette aerosol samples for chemical and toxicological analyses. The collection system consists of several cut pipette tips connected with short pieces of tubing. The pipette-tip-based collection system can be connected to a peristaltic pump, a vacuum pump, or directly to an e-cigarette user for the e-cigarette aerosol to flow through the system. The system condenses the aerosol produced by the e-cigarette and collects a liquid sample that is ready for analysis without the need for intermediate extraction solutions. We tested a total of 20 e-cigarettes from 5 different brands commercially available in Maryland. The pipette-tip-based collection system condensed between 0.23 and 0.53 mL of post-vaped e-liquid after 150 puffs. The proposed method is highly adaptable, can be used during field work and in experimental settings, and allows collection of aerosol samples from a wide variety of e-cigarette devices, yielding a condensate that closely represents the substance actually delivered to the lungs.

  4. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... applicable. (b) Quality assurance program. The importer must conduct a quality assurance program, as specified in this paragraph (b), for each truck or rail car loading terminal. (1) Quality assurance samples... frequency of the quality assurance sampling and testing must be at least one sample for each 50 of...

  5. Harmonisation of microbial sampling and testing methods for distillate fuels

    SciTech Connect

    Hill, G.C.; Hill, E.C.

    1995-05-01

    Increased incidence of microbial infection in distillate fuels has led to a demand for organisations such as the Institute of Petroleum to propose standards for microbiological quality based on numbers of viable microbial colony-forming units. Variations in quality requirements and in the spoilage significance of contaminating microbes, plus a tendency for temporal and spatial changes in the distribution of microbes, make such standards difficult to implement. The problem is compounded by a diversity in the procedures employed for sampling and testing for microbial contamination and in the interpretation of the data obtained. The following paper reviews these problems and describes the efforts of the Institute of Petroleum Microbiology Fuels Group to address these issues, and in particular to bring about harmonisation of sampling and testing methods. The benefits and drawbacks of available test methods, both laboratory-based and on-site, are discussed.

  6. Methods in Enzymology: “Flexible backbone sampling methods to model and design protein alternative conformations”

    PubMed Central

    Ollikainen, Noah; Smith, Colin A.; Fraser, James S.; Kortemme, Tanja

    2013-01-01

    Sampling alternative conformations is key to understanding how proteins work and engineering them for new functions. However, accurately characterizing and modeling protein conformational ensembles remains experimentally and computationally challenging. These challenges must be met before protein conformational heterogeneity can be exploited in protein engineering and design. Here, as a stepping stone, we describe methods to detect alternative conformations in proteins and strategies to model these near-native conformational changes based on backrub-type Monte Carlo moves in Rosetta. We illustrate how Rosetta simulations that apply backrub moves improve modeling of point mutant side chain conformations, native side chain conformational heterogeneity, functional conformational changes, tolerated sequence space, protein interaction specificity, and amino acid co-variation across protein-protein interfaces. We include relevant Rosetta command lines and RosettaScripts to encourage the application of these types of simulations to other systems. Our work highlights that critical scoring and sampling improvements will be necessary to approximate conformational landscapes. Challenges for the future development of these methods include modeling conformational changes that propagate away from designed mutation sites and modulating backbone flexibility to predictively design functionally important conformational heterogeneity. PMID:23422426

  7. Rock sampling. [method for controlling particle size distribution

    NASA Technical Reports Server (NTRS)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  8. Recent advances in sample preparation techniques for effective bioanalytical methods.

    PubMed

    Kole, Prashant Laxman; Venkatesh, Gantala; Kotecha, Jignesh; Sheshala, Ravi

    2011-01-01

    This paper reviews recent developments in bioanalysis sample preparation techniques and gives an update on basic principles, theory, applications, and possibilities for automation, together with a comparative discussion of the advantages and limitations of each technique. Conventional liquid-liquid extraction (LLE), protein precipitation (PP), and solid-phase extraction (SPE) techniques are now considered methods of the past. The last decade has witnessed a rapid development of novel sample preparation techniques in bioanalysis. Developments in SPE techniques, such as selective sorbents, and new overall approaches to SPE, such as hybrid SPE and molecularly imprinted polymer SPE, are addressed. Considerable literature has been published in the area of solid-phase micro-extraction and its different versions, e.g. stir bar sorptive extraction, and their application in the development of selective and sensitive bioanalytical methods. Techniques such as dispersive solid-phase extraction, disposable pipette extraction, and micro-extraction by packed sorbent offer a variety of extraction phases and provide unique advantages to bioanalytical methods. On-line SPE utilizing column-switching techniques is rapidly gaining acceptance in bioanalytical applications. PP sample preparation techniques such as PP filter plates/tubes offer many advantages, such as the removal of phospholipids and proteins in plasma/serum. Newer approaches to conventional LLE techniques (salting-out LLE) are also covered in this review article.

  9. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state to obtain RQS from random DPDs and random unitary matrices. Next, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, we note an overly fast concentration of measure in the quantum state space that appears in this parametrization.
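    The Ginibre construction mentioned above is short enough to state in full; here is a minimal sketch (positive semidefiniteness and unit trace hold by construction, while the induced measure depends on the distribution chosen for the matrix entries).

```python
import numpy as np

def random_density_matrix(d, rng=None):
    """Random d x d density matrix via the Ginibre construction:
    normalize G @ G^dagger for G with i.i.d. complex Gaussian entries."""
    rng = rng or np.random.default_rng()
    g = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = g @ g.conj().T                 # positive semidefinite by construction
    return rho / np.trace(rho).real      # normalize to unit trace

rho = random_density_matrix(4)
assert np.allclose(rho, rho.conj().T)          # Hermitian
assert abs(np.trace(rho).real - 1.0) < 1e-12   # unit trace
```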

  10. OBSERVATIONAL STUDIES OF PATIENTS IN THE EMERGENCY DEPARTMENT: A COMPARISON OF FOUR SAMPLING METHODS

    PubMed Central

    Valley, Morgan A.; Heard, Kennon J.; Ginde, Adit A.; Lezotte, Dennis C.; Lowenstein, Steven R.

    2012-01-01

    Objectives We evaluated the ability of four sampling methods to generate representative samples of the Emergency Department (ED) population. Methods We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by employing two sample sizes (n = 200, 400) and four sampling methods: 1) true random; 2) random 4-hour time blocks by exact sample size; 3) random 4-hour time blocks by a pre-determined number of blocks; and 4) convenience or “business hours.” For each method and sample size, we obtained 1,000 samples from the population. Using chi-square tests, we measured the number of statistically significant differences between the sample and the population for eight variables (age, gender, race/ethnicity, language, triage acuity, arrival mode, disposition and payer source). Then, for each variable, method and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Results Only the true random samples represented the population with respect to gender, race/ethnicity, triage acuity, mode of arrival, language and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables. However, the magnitude of these differences was not large. Conclusions Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables. However, the potential for bias for these variables appears small. PMID:22401950
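    A hedged sketch of the sample-versus-population comparison at the heart of this design follows; it is not the authors' analysis code, and the variable names are assumptions. Repeating it over many simulated samples and counting how often p < 0.05 reproduces the study's benchmarking logic.

```python
import numpy as np
from scipy.stats import chisquare

def sample_differs(pop_labels, sample_labels, alpha=0.05):
    """Goodness-of-fit test: do the sample's category frequencies for one
    variable (e.g. triage acuity) differ from the full ED population's?"""
    pop_labels = np.asarray(pop_labels)
    sample_labels = np.asarray(sample_labels)
    cats, pop_counts = np.unique(pop_labels, return_counts=True)
    obs = np.array([(sample_labels == c).sum() for c in cats])
    expected = pop_counts / pop_counts.sum() * obs.sum()
    _, p = chisquare(obs, f_exp=expected)
    return p < alpha

# e.g. a true random sample: rng.choice(pop_labels, size=400, replace=False)
```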

  11. Simplified sample preparation method for triclosan and methyltriclosan determination in biota and foodstuff samples.

    PubMed

    Canosa, P; Rodríguez, I; Rubí, E; Ramil, M; Cela, R

    2008-04-25

    An improved method for the determination of triclosan (TCS) and methyltriclosan (MTCS) in fish and foodstuff samples is presented. Analytes were simultaneously extracted and purified using the matrix solid-phase dispersion (MSPD) technique, and then selectively determined by gas chromatography with tandem mass spectrometry (GC-MS/MS). Several combinations of dispersants, clean-up co-sorbents and extraction solvents were tested in order to obtain lipid-free extracts and quantitative recoveries for TCS and MTCS. Under optimised conditions, 0.5 g samples were dispersed using 1.5 g of neutral silica in a mortar with a pestle, and transferred to a polypropylene cartridge containing 3 g of silica impregnated with 10% sulphuric acid (SiO2-H2SO4, 10%, w/w). Analytes were recovered with 10 mL of dichloromethane, whereas lipids were oxidized in the layer of acidic silica. The extract was concentrated to dryness and reconstituted with 1 mL of ethyl acetate. Then, a fraction of 0.5 mL was mixed with 50 microL of N-methyl-N-(tert-butyldimethylsilyl)trifluoroacetamide (MTBSTFA) and injected into the GC-MS/MS system. The developed method provided absolute recoveries between 77 and 120% for different samples spiked at the low ng g(-1) level, quantification limits in the range of 1-2 ng g(-1), and considerable simplicity in comparison with previously developed sample preparation approaches. Experiments carried out by placing sliced food samples in direct contact with TCS-treated kitchenware surfaces showed the capability of the biocide to migrate into foodstuffs.

  12. Sampling Small Mammals in Southeastern Forests: The Importance of Trapping in Trees

    SciTech Connect

    Loeb, S.C.; Chapman, G.L.; Ridley, T.R.

    1999-01-01

    We investigated the effect of sampling methodology on the richness and abundance of small mammal communities in loblolly pine forests. Trapping in trees using Sherman live traps was included along with routine ground trapping using the same device. Estimates of species richness did not differ between samples in which tree traps were included or excluded. However, diversity indices (Shannon-Wiener, Simpson, Shannon, and Brillouin) were strongly affected: the indices were significantly greater when tree samples were included, primarily as a result of flying squirrel captures. Without tree traps, the results suggested that cotton mice dominated the community. We recommend that tree traps be included in sampling.

  13. On the importance of accounting for competing risks in pediatric brain cancer: II. Regression modeling and sample size.

    PubMed

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-03-15

    To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Rapid separation method for actinides in emergency air filter samples.

    PubMed

    Maxwell, Sherrod L; Culligan, Brian K; Noyes, Gary W

    2010-12-01

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides and strontium in air filter method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified (90)Sr fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinide and (90)Sr air filter results were reported in less than 4 h with excellent quality. Copyright 2010 Elsevier Ltd. All rights reserved.

  15. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY AIR FILTER SAMPLES

    SciTech Connect

    Maxwell, S.; Noyes, G.; Culligan, B.

    2010-02-03

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides and strontium in air filter method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified (90)Sr fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinide and (90)Sr air filter results were reported in ~4 hours with excellent quality.

  16. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this... benzene concentration for compliance with the requirements of this subpart. (ii) Independent...

  17. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this..., 2015, to determine its benzene concentration for compliance with the requirements of this...

  18. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this... benzene concentration for compliance with the requirements of this subpart. (ii) Independent...

  19. Bandpass Sampling--An Opportunity to Stress the Importance of In-Depth Understanding

    ERIC Educational Resources Information Center

    Stern, Harold P. E.

    2010-01-01

    Many bandpass signals can be sampled at rates lower than the Nyquist rate, allowing significant practical advantages. Illustrating this phenomenon after discussing (and proving) Shannon's sampling theorem provides a valuable opportunity for an instructor to reinforce the principle that innovation is possible when students strive to have a complete…
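    The phenomenon described above follows from the classical bandpass sampling condition 2·fH/n ≤ fs ≤ 2·fL/(n-1) for integer n up to floor(fH/(fH - fL)); the sketch below enumerates the valid rate ranges and is an illustration of the principle, not material from the article.

```python
import math

def valid_bandpass_rates(f_low, f_high):
    """Ranges of uniform sampling rates fs that alias the band
    [f_low, f_high] onto baseband without spectral overlap."""
    n_max = math.floor(f_high / (f_high - f_low))
    ranges = []
    for n in range(1, n_max + 1):
        lo = 2 * f_high / n
        hi = 2 * f_low / (n - 1) if n > 1 else float("inf")
        if lo <= hi:
            ranges.append((n, lo, hi))
    return ranges

# A 20-25 MHz band can be sampled as slowly as 10 MHz (n = 5),
# far below the 50 MHz Nyquist rate implied by its highest frequency.
for n, lo, hi in valid_bandpass_rates(20e6, 25e6):
    top = "inf" if hi == float("inf") else f"{hi / 1e6:.2f}"
    print(f"n={n}: {lo / 1e6:.2f} to {top} MHz")
```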

  20. Important issues related to using pooled samples for environmental chemical biomonitoring.

    PubMed

    Caudill, Samuel P

    2011-02-28

    Pooling samples for analysis was first proposed in the 1940s to reduce analytical measurement costs associated with screening World War II recruits for syphilis. Later, it progressed to more complex screening strategies, to population prevalence estimation for discrete quantities, and to population mean estimation for continuous quantities. Recently, pooled samples have also been used to provide efficient alternatives for gene microarray analyses, epidemiologic studies of biomarkers of exposure, and characterization of populations regarding environmental chemical exposures. In this study, we address estimation and bias issues related to using pooled-sample variance information from an auxiliary source to augment pooled-sample variance estimates from the study of interest. The findings are illustrated by using pooled samples from the National Health and Nutrition Examination Survey 2001-2002 to assess exposures to perfluorooctanesulfonate and other polyfluoroalkyl compounds in the U.S. population. Published in 2011 by John Wiley & Sons, Ltd.

  1. A Novel Method for Sampling Alpha-Helical Protein Backbones

    DOE R&D Accomplishments Database

    Fain, Boris; Levitt, Michael

    2001-01-01

    We present a novel technique for sampling the configurations of helical proteins. Assuming knowledge of native secondary structure, we employ assembly rules gathered from a database of existing structures to enumerate the geometrically possible 3-D arrangements of the constituent helices. We produce a library of possible folds for 25 helical protein cores. In each case the method finds significant numbers of conformations close to the native structure. In addition we assign coordinates to all atoms for 4 of the 25 proteins. In the context of database-driven exhaustive enumeration our method performs extremely well, yielding significant percentages of structures (0.02%-82%) within 6 Å of the native structure. The method's speed and efficiency make it a valuable contribution towards the goal of predicting protein structure.

  2. A time domain sampling method for inverse acoustic scattering problems

    NASA Astrophysics Data System (ADS)

    Guo, Yukun; Hömberg, Dietmar; Hu, Guanghui; Li, Jingzhi; Liu, Hongyu

    2016-06-01

    This work concerns the inverse scattering problems of imaging unknown/inaccessible scatterers by transient acoustic near-field measurements. Based on the analysis of the migration method, we propose efficient and effective sampling schemes for imaging small and extended scatterers from knowledge of time-dependent scattered data due to incident impulsive point sources. Though the inverse scattering problems are known to be nonlinear and ill-posed, the proposed imaging algorithms are totally "direct", involving only integral calculations on the measurement surface. Theoretical justifications are presented and numerical experiments are conducted to demonstrate the effectiveness and robustness of our methods. In particular, the proposed static imaging functionals enhance the performance of the total focusing method (TFM), and the dynamic imaging functionals show analogous behavior to the time reversal inversion but without solving time-dependent wave equations.

  3. Nonuniform sampling of urodynamic signals: a comparison of different methods.

    PubMed

    Kocjan, T; van Mastrigt, R

    1994-01-01

    Several different techniques for urodynamic signal compression have been proposed in the last few years. Using these techniques it is possible to reduce the requirements for digital storage or transmission, and there are a number of applications in diagnostic and ambulatory urodynamics where such techniques are essential. The purpose of this study is to compare different techniques of urodynamic data compression: the so-called FAN, voltage-triggered, two-point projection, and second-difference methods. The comparison between the methods is based on 65 pressure, 46 uroflow, and 18 surface electromyogram signals. The reduction ratio achieved for different allowable errors between the original and compressed signals is calculated and compared for the different techniques. Results show that it is possible to store urodynamic signals accurately at a low sampling rate, where the FAN and voltage-triggered methods appear to be superior to the rest.
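    For orientation, here is a minimal sketch of the FAN idea for a uniformly sampled signal; it is a generic first-order FAN implementation with an assumed tolerance parameter, not the code evaluated in the study.

```python
import numpy as np

def fan_compress(x, eps):
    """Keep a sample only when the shrinking 'fan' of admissible slopes
    from the last stored sample can no longer cover it within +/- eps.
    Returns indices of retained samples; reconstruct by linear
    interpolation between them (error bounded by about eps)."""
    x = np.asarray(x, float)
    stored = [0]
    anchor = 0
    upper, lower = np.inf, -np.inf     # current fan of admissible slopes
    for k in range(1, len(x)):
        dt = k - anchor
        upper = min(upper, (x[k] + eps - x[anchor]) / dt)
        lower = max(lower, (x[k] - eps - x[anchor]) / dt)
        if lower > upper:              # fan collapsed: anchor previous point
            stored.append(k - 1)
            anchor = k - 1
            upper = x[k] + eps - x[anchor]   # restart fan with dt = 1
            lower = x[k] - eps - x[anchor]
    stored.append(len(x) - 1)          # always keep the final sample
    return stored
```

    The reduction ratio is then len(x) / len(stored); sweeping eps trades reconstruction error against storage, which is the comparison the study performs across methods.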

  4. Impact of blood sample collection and processing methods on glucose levels in community outreach studies.

    PubMed

    Turchiano, Michael; Nguyen, Cuong; Fierman, Arthur; Lifshitz, Mark; Convit, Antonio

    2013-01-01

    Glucose obtained from unprocessed blood samples can decrease by 5%-7% per hour due to glycolysis. This study compared the impact of glucose degradation on measured glucose values by examining two different collection methods. For the first method, blood samples were collected in tubes containing sodium fluoride (NaF), a glycolysis inhibitor. For the second method, blood samples were collected in tubes containing a clot activator and serum gel separator and were centrifuged to separate the serum and plasma 20 minutes after sample collection. The samples used in the two methods were collected during the same blood draw and were assayed by the clinical laboratory 2-4 hours after the samples were obtained. A total of 256 pairs of samples were analyzed. The average glucose reading for the centrifuged tubes was significantly higher than that for the NaF tubes, by 0.196 ± 0.159 mmol/L (P < 0.01), or 4.2%. This study demonstrates the important role collection methods play in accurately assessing the glucose levels of blood samples collected in the field, where the working environment may be suboptimal. Therefore, blood samples collected in the field should be promptly centrifuged before being transported to clinical labs to ensure accurate glucose level measurements.

  5. Novel method for pairing wood samples in choice tests.

    PubMed

    Oberst, Sebastian; Evans, Theodore A; Lai, Joseph C S

    2014-01-01

    Choice tests are a standard method to determine preferences in bio-assays, e.g. for food types and food additives such as bait attractants and toxicants. Choice between food additives can be determined only when the food substrate is sufficiently homogeneous. This is difficult to achieve for wood-eating organisms, as wood is a highly variable biological material, even within a tree species, due to the age of the tree (e.g. sapwood vs. heartwood) and the components therein (sugar, starch, cellulose and lignin). The current practice to minimise variation is to use wood from the same tree, yet the variation can still be large and the quantity of wood from one tree may be insufficient. We used wood samples of identical volume from multiple sources, measured three physical properties (dry weight, moisture absorption and reflected light intensity), then ranked and clustered the samples using fuzzy c-means clustering. A reverse analysis of the clustered samples found a high correlation between their physical properties and their source of origin. The suggested approach provides a quantifiable, consistent, repeatable, simple and quick method to maximize control over the similarity of wood used in choice tests.
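    As an illustration of the clustering step, here is a self-contained fuzzy c-means sketch (the standard algorithm, not the authors' implementation); rows of X would hold each sample's standardized dry weight, moisture absorption, and reflected light intensity, and samples with similar membership vectors would then be paired.

```python
import numpy as np

def fuzzy_cmeans(X, c=3, m=2.0, iters=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means. Returns (centers, U), where U[i, j] is
    the degree of membership of sample i in cluster j (rows sum to 1)."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))        # random memberships
    for _ in range(iters):
        W = U ** m                                    # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)                      # avoid division by zero
        inv = d ** (-2.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=1, keepdims=True)  # membership update
        if np.abs(U_new - U).max() < tol:
            return centers, U_new
        U = U_new
    return centers, U
```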

  6. Novel Method for Pairing Wood Samples in Choice Tests

    PubMed Central

    Oberst, Sebastian; Evans, Theodore A.; Lai, Joseph C. S.

    2014-01-01

    Choice tests are a standard method to determine preferences in bio-assays, e.g. for food types and food additives such as bait attractants and toxicants. Choice between food additives can be determined only when the food substrate is sufficiently homogeneous. This is difficult to achieve for wood-eating organisms, as wood is a highly variable biological material, even within a tree species, due to the age of the tree (e.g. sapwood vs. heartwood) and the components therein (sugar, starch, cellulose and lignin). The current practice to minimise variation is to use wood from the same tree, yet the variation can still be large and the quantity of wood from one tree may be insufficient. We used wood samples of identical volume from multiple sources, measured three physical properties (dry weight, moisture absorption and reflected light intensity), then ranked and clustered the samples using fuzzy c-means clustering. A reverse analysis of the clustered samples found a high correlation between their physical properties and their source of origin. The suggested approach provides a quantifiable, consistent, repeatable, simple and quick method to maximize control over the similarity of wood used in choice tests. PMID:24551173

  7. Hand held sample tube manipulator, system and method

    DOEpatents

    Kenny, Donald V [Liberty Township, OH; Smith, Deborah L [Liberty Township, OH; Severance, Richard A [late of Columbus, OH

    2001-01-01

    A manipulator apparatus, system, and method for measuring analytes present in sample tubes. The manipulator apparatus includes a housing having a central bore with an inlet end and an outlet end; a plunger mechanism with at least a portion thereof slideably disposed for reciprocal movement within the central bore, the plunger mechanism having a tubular gas channel with an inlet end and an outlet end, the gas channel inlet end disposed in the same direction as said inlet end of the central bore, wherein the inlet end of said plunger mechanism is adapted for movement so as to expel a sample tube inserted in the bore at the outlet end of the housing, and the inlet end of the plunger mechanism is adapted for connection to a gas supply; a first seal disposed in the housing for sealing between the central bore and the plunger mechanism; a second seal disposed at the outlet end of the housing for sealing between the central bore and a sample tube; a holder mounted on the housing for holding the sample tube; and a biasing mechanism for returning the plunger mechanism to a starting position.

  8. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirements apply to importers who transport motor vehicle diesel fuel, NRLM diesel fuel, or ECA marine fuel...; Nonroad, Locomotive, and Marine Diesel Fuel; and ECA Marine Fuel Sampling and Testing § 80.583 What... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the...

  9. Evaluating Service Quality from Patients' Perceptions: Application of Importance-performance Analysis Method.

    PubMed

    Mohebifar, Rafat; Hasani, Hana; Barikani, Ameneh; Rafiei, Sima

    2016-08-01

    Providing high service quality is one of the main functions of health systems. Measuring service quality is the basic prerequisite for improving quality. The aim of this study was to evaluate the quality of service in teaching hospitals using the importance-performance analysis matrix. A descriptive-analytic study was conducted through a cross-sectional method in six academic hospitals of Qazvin, Iran, in 2012. A total of 360 patients contributed to the study. The sampling technique was stratified random sampling. Required data were collected based on a standard questionnaire (SERVQUAL). Data analysis was done with SPSS version 18 statistical software and the importance-performance analysis matrix. The results showed a significant gap between importance and performance in all five dimensions of service quality (p < 0.05). In reviewing the gaps, the "reliability" (2.36) and "assurance" (2.24) dimensions had the highest quality gaps and "responsiveness" had the lowest (1.97). Also, according to the findings, reliability and assurance were in Quadrant I, empathy was in Quadrant II, and tangibles and responsiveness were in Quadrant IV of the importance-performance matrix. The negative gap in all dimensions of quality shows that quality improvement is necessary in all dimensions. Using quality and diagnosis measurement instruments such as importance-performance analysis will help hospital managers with planning service quality improvement and achieving long-term goals.
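    The mechanics of an importance-performance matrix are simple enough to sketch; the scores below are invented for illustration (they are not the study's data), and the quadrant labels follow one common convention, so the numbering may differ from the study's.

```python
import numpy as np

dims = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
importance = np.array([4.5, 4.8, 4.4, 4.7, 4.3])   # hypothetical 1-5 means
performance = np.array([2.9, 2.4, 2.5, 2.5, 2.6])  # hypothetical 1-5 means

gap = importance - performance                # quality gap per dimension
hi_imp = importance >= importance.mean()      # split the axes at the means
hi_perf = performance >= performance.mean()
label = {(True, False): "concentrate here",
         (True, True): "keep up the good work",
         (False, False): "low priority",
         (False, True): "possible overkill"}
for k, d in enumerate(dims):
    verdict = label[(bool(hi_imp[k]), bool(hi_perf[k]))]
    print(f"{d:14s} gap={gap[k]:+.2f}  {verdict}")
```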

  10. THE IMPORTANCE OF THE MAGNETIC FIELD FROM AN SMA-CSO-COMBINED SAMPLE OF STAR-FORMING REGIONS

    SciTech Connect

    Koch, Patrick M.; Tang, Ya-Wen; Ho, Paul T. P.; Chen, Huei-Ru Vivien; Liu, Hau-Yu Baobab; Yen, Hsi-Wei; Lai, Shih-Ping; Zhang, Qizhou; Chen, How-Huan; Ching, Tao-Chung; Girart, Josep M.; Frau, Pau; Li, Hua-Bai; Li, Zhi-Yun; Padovani, Marco; Qiu, Keping; Rao, Ramprasad

    2014-12-20

    Submillimeter dust polarization measurements of a sample of 50 star-forming regions, observed with the Submillimeter Array (SMA) and the Caltech Submillimeter Observatory (CSO) covering parsec-scale clouds to milliparsec-scale cores, are analyzed in order to quantify the magnetic field importance. The magnetic field misalignment δ, the local angle between magnetic field and dust emission gradient, is found to be a prime observable, revealing distinct distributions for sources where the magnetic field is preferentially aligned with or perpendicular to the source minor axis. Source-averaged misalignment angles ⟨|δ|⟩ fall into systematically different ranges, reflecting the different source-magnetic field configurations. Possible bimodal ⟨|δ|⟩ distributions are found for the separate SMA and CSO samples. Combining both samples broadens the distribution with a wide maximum peak at small ⟨|δ|⟩ values. Assuming the 50 sources to be representative, the prevailing source-magnetic field configuration is one that statistically prefers small magnetic field misalignments |δ|. When interpreting |δ| together with a magnetohydrodynamics force equation, as developed in the framework of the polarization-intensity gradient method, a sample-based log-linear scaling fits the magnetic field tension-to-gravity force ratio ⟨Σ_B⟩ versus ⟨|δ|⟩ with ⟨Σ_B⟩ = 0.116 · exp(0.047 · ⟨|δ|⟩) ± 0.20 (mean error), providing a way to estimate the relative importance of the magnetic field based only on measurable field misalignments |δ|. The force ratio Σ_B discriminates systems that are collapsible on average (⟨Σ_B⟩ < 1) from other molecular clouds where the magnetic field still provides enough resistance against gravitational collapse (⟨Σ_B⟩ > 1). The sample-wide trend shows a transition around ⟨|δ|⟩ ≈ 45°. Defining an effective gravitational force ∼1 − ⟨Σ_B⟩, the average magnetic-field-reduced star formation efficiency is at least a

  11. Observational studies of patients in the emergency department: a comparison of 4 sampling methods.

    PubMed

    Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R

    2012-08-01

    We evaluate the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours." For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables. However, the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables. However, the potential for bias for these variables appears small. Copyright © 2012. Published by Mosby, Inc.

  12. Evaluation of Methods for Sampling, Recovery, and Enumeration of Bacteria Applied to the Phylloplane

    PubMed Central

    Donegan, Katherine; Matyac, Carl; Seidler, Ramon; Porteous, Arlene

    1991-01-01

    Determining the fate and survival of genetically engineered microorganisms released into the environment requires the development and application of accurate and practical methods of detection and enumeration. Several experiments were performed to examine quantitative recovery methods that are commonly used or that have potential applications. In these experiments, Erwinia herbicola and Enterobacter cloacae were applied in greenhouses to Blue Lake bush beans (Phaseolus vulgaris) and Cayuse oats (Avena sativa). Sampling indicated that the variance in bacterial counts among leaves increased over time and that this increase caused an overestimation of the mean population size by bulk leaf samples relative to single leaf samples. An increase in the number of leaves in a bulk sample, above a minimum number, did not significantly reduce the variance between samples. Experiments evaluating recovery methods demonstrated that recovery of bacteria from leaves was significantly better with stomacher blending than with blending, sonication, or washing, and that the recovery efficiency was constant over a range of sample inoculum densities. Delayed processing of leaf samples, by storage in a freezer, did not significantly lower survival and recovery of microorganisms when storage was short term and leaves were not stored in buffer. The drop plate technique for enumeration of bacteria did not differ significantly from the spread plate method. Results of these sampling, recovery, and enumeration experiments indicate a need for increased development and standardization of the methods used by researchers, as there are significant differences among, and also important limitations to, some of the methods used. PMID:16348404

  13. Fetal blood sampling in baboons (Papio spp.): important procedural aspects and literature review.

    PubMed

    Joy, S D; O'Shaughnessy, R; Schlabritz-Loutsevitch, N; Leland, M M; Frost, P; Fan-Havard, P

    2009-06-01

    Baboons (Papio cynocephalus) are similar to humans in placentation and fetal development. Fetal blood sampling allows investigators to assess fetal condition at a specific point in gestation, as well as transplacental transfer of medications. Unfortunately, assessing fetal status during gestation has been difficult, and fetal instrumentation is associated with a high rate of pregnancy loss. Our objectives are to describe the technique of ultrasound-guided cordocentesis (UGC) in baboons, report post-procedural outcomes, and review existing publications. This is a procedural paper describing the technique of UGC in baboons. After confirming pregnancy and gestational age via ultrasound, animals participating in approved research protocols that required fetal assessment underwent UGC. We successfully performed UGC in four animals (five samples) using this technique. Animals were sampled in the second and third trimesters, with fetal blood sampling achieved by sampling a free cord loop, the placental cord insertion site, or the intrahepatic umbilical vein. All procedures were without complication, and these animals delivered at term. Ultrasound-guided fetal umbilical cord venipuncture is a useful and safe technique to sample the fetal circulation with minimal risk to the fetus or mother. We believe this technique could be used for repeated fetal venous blood sampling in baboons.

  14. Methods for parasitic protozoans detection in the environmental samples.

    PubMed

    Skotarczak, B

    2009-09-01

    The environmental route of transmission of many parasitic protozoa and their potential for producing large numbers of transmissive stages constitute persistent threats to public and veterinary health. Conventional and new immunological and molecular methods make it possible to assess the occurrence, prevalence, levels and sources of waterborne protozoa. Concentration, purification, and detection are the three key steps in all methods that have been approved for routine monitoring of waterborne cysts and oocysts. These steps have been optimized to such an extent that low levels of naturally occurring protozoan (oo)cysts can be efficiently recovered from water. Ten years have passed since the United States Environmental Protection Agency (USEPA) introduced methods 1622 and 1623 and used them to concentrate and detect the oocysts of Cryptosporidium and cysts of Giardia in water samples. Nevertheless, the methods still need study and improvement. Pre-PCR processing procedures have been developed, and they are still being improved to remove or reduce the effects of PCR inhibitors. Progress in molecular methods allows more precise distinction of species and simultaneous detection of several parasites; however, these methods are still not routinely used and need standardization. Standardized methods are required to maximize public health surveillance.

  15. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    PubMed

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics.

  16. Comparing two sampling methods to engage hard-to-reach communities in research priority setting.

    PubMed

    Valerio, Melissa A; Rodriguez, Natalia; Winkler, Paula; Lopez, Jaime; Dennison, Meagen; Liang, Yuanyuan; Turner, Barbara J

    2016-10-28

    Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: 1) snowball sampling, a chain-referral method, or 2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention, as well as priorities from both communities' stakeholders, based on mean ratings of their ideas for importance and feasibility of implementation in their community. Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85%) consented, 52 (95%) attended the first meeting, and 36 (65%) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90%) consented, 36 (58%) attended the first meeting, and 26 (42%) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group, and for city improvements/transportation services (P = 0.004), which was higher for the

  17. Transfer of sampling methods for studies on most-at-risk populations (MARPs) in Brazil.

    PubMed

    Barbosa Júnior, Aristides; Pascom, Ana Roberta Pati; Szwarcwald, Célia Landmann; Kendall, Carl; McFarland, Willi

    2011-01-01

    The objective of this paper was to describe the process of transferring two methods for sampling most-at-risk populations: respondent-driven sampling (RDS) and time-space sampling (TSS). The article describes the steps in the process, the methods used in the 10 pilot studies, and lessons learned. The process was conducted in six steps, from a state-of-the-art seminar to a workshop on writing articles with the results of the pilot studies. The principal investigators reported difficulties in the fieldwork and data analysis, regardless of the pilot sampling method. One of the most important results of the transfer process is that Brazil now has more than 100 researchers able to sample MARPs using RDS or TSS. The process also enabled the construction of baselines for MARPs, thus providing a broader understanding of the dynamics of HIV infection in the country and the use of evidence to plan the national response to the epidemic in these groups.

  18. Well fluid isolation and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.

    1995-01-01

    The present invention permits purging and/or sampling of a well while removing at most about 25% of the fluid volume removed by conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well, over a fluid column height from the bottom of the well to the top of the active portion (lower annulus), is removed. A seal may be positioned above the active portion, thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Purged well fluid is stored in a riser above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.

  19. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    SciTech Connect

    Liu, Bin

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, and thus translate the problem into finding a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then adapt its parameters to make it resemble the posterior as closely as possible. We use the effective sample size (ESS), calculated from the IS draws, to measure the degree of approximation: the larger the ESS, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in adjusting the number of mixture components in the proposal. Brute-force methods simply preset it to a large constant, which inflates the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step, to tailor this number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
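
    The core importance sampling loop this abstract builds on, drawing from a mixture proposal, weighting, estimating the evidence integral, and scoring the proposal by effective sample size, can be sketched compactly. Everything below is an illustrative assumption (a one-dimensional bimodal target and a fixed two-component proposal); the paper's adaptive delete/merge/add tailoring and EM step are omitted:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)

def log_target(x):
    """Unnormalized bimodal 'posterior' (stand-in; true normalizer is 2.0)."""
    return np.logaddexp(norm.logpdf(x, -3.0, 0.5), norm.logpdf(x, 4.0, 1.0))

# Fixed two-component Gaussian mixture proposal (weights, means, std devs).
w, mu, sd = np.array([0.5, 0.5]), np.array([-3.0, 4.0]), np.array([0.7, 1.2])

def sample_proposal(n):
    comp = rng.choice(2, size=n, p=w)
    return rng.normal(mu[comp], sd[comp])

def log_proposal(x):
    return np.logaddexp.reduce(np.log(w) + norm.logpdf(x[:, None], mu, sd), axis=1)

x = sample_proposal(10_000)
logw = log_target(x) - log_proposal(x)        # log importance weights
m = logw.max()                                # subtract max for stability
u = np.exp(logw - m)

Z_hat = np.exp(m) * u.mean()                  # evidence (normalizer) estimate
ess = u.sum() ** 2 / (u ** 2).sum()           # effective sample size of the draws
print(f"Z_hat ~ {Z_hat:.3f} (true 2.0), ESS = {ess:.0f} of {len(x)}")
```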

  20. Miniaturized sample preparation method for determination of amphetamines in urine.

    PubMed

    Nishida, Manami; Namera, Akira; Yashiki, Mikio; Kimura, Kojiro

    2004-07-16

    A simple and miniaturized sample preparation method for the determination of amphetamines in urine was developed using on-column derivatization and gas chromatography-mass spectrometry (GC-MS). Urine was directly applied to an extraction column that was pre-packed with Extrelut and sodium carbonate. Amphetamine (AP) and methamphetamine (MA) in urine were adsorbed on the surface of the Extrelut. AP and MA were then converted to the free base and derivatized to N-propoxycarbonyl derivatives using propylchloroformate on the column. Pentadeuterated MA was used as an internal standard. The recoveries of AP and MA from urine were 100 and 102%, respectively. The calibration curves showed linearity in the range of 0.50-50 microg/mL for AP and MA in urine. When urine samples containing two different concentrations (0.50 and 5.0 microg/mL) of AP and MA were analyzed, the intra-day and inter-day coefficients of variation were 1.4-7.7%. This method was applied to 14 medico-legal cases of MA intoxication, and the results showed good agreement with an HPLC method.

  1. [Sampling methods for PM2.5 from stationary sources: a review].

    PubMed

    Jiang, Jing-Kun; Deng, Jian-Guo; Li, Zhen; Li, Xing-Hua; Duan, Lei; Hao, Ji-Ming

    2014-05-01

    The new China national ambient air quality standard was published in 2012 and will be implemented in 2016. To meet the requirements of this new standard, monitoring and controlling PM2.5 emissions from stationary sources are very important. However, so far there is no national standard method for sampling PM2.5 from stationary sources. Different sampling methods for PM2.5 from stationary sources and relevant international standards were reviewed in this study, including methods for PM2.5 sampling in flue gas and methods for PM2.5 sampling after dilution. The advantages and disadvantages of these sampling methods were discussed. For environmental management, in-flue-gas sampling methods such as impactors and virtual impactors were suggested as a standard for determining filterable PM2.5. To evaluate the environmental and health effects of PM2.5 from stationary sources, a standard dilution method for sampling total PM2.5 should be established.

  2. A novel method for sampling bacteria on plant root and soil surfaces at the microhabitat scale.

    PubMed

    Dennis, Paul G; Miller, Anthony J; Clark, Ian M; Taylor, Richard G; Valsami-Jones, Eugenia; Hirsch, Penny R

    2008-09-01

    This study reports the first method for sampling bacteria at a spatial scale approximating a microhabitat. At the core of this method is the use of tungsten rods with laser-cut tips of known surface area (0.013 mm²). Exposed plant root or soil surfaces were viewed with a dissecting microscope, and micro-sampling rods were guided to sample sites using a micro-manipulator. Bacteria that adhered to the sampling tips were then recovered for microbiological analyses. The efficiency of this method for removing bacteria from root surfaces was similar to that with which bacteria are recovered from dissected root segments using the conventional technique of washing. However, as the surface area of the micro-sampling tips was known, the new method has the advantage of eliminating inaccuracy in estimates of bacterial densities due to inaccurate estimation of the root or soil surface sampled. When used to investigate spatial distributions of rhizoplane bacteria, the new technique revealed trends that were consistent with those reported with existing methods, while providing access to additional information about community structure at a much smaller spatial scale. The spatial scale of this new method is ca. 1000 times smaller than that of other sampling methods involving swabbing. This novel technique represents an important methodological step facilitating microbial ecological investigations at the microhabitat scale.

  3. Vadose Zone Sampling Methods for Detection of Preferential Pesticides Transport

    NASA Astrophysics Data System (ADS)

    Peranginangin, N.; Richards, B. K.; Steenhuis, T. S.

    2003-12-01

    Leaching of agriculturally applied chemicals through the vadose zone is a major cause of the occurrence of agrichemicals in groundwater. Accurate soil water sampling methods are needed to ensure meaningful monitoring results, especially for soils that have significant preferential flow paths. The purpose of this study was to assess the capability and effectiveness of various soil water sampling methods in detecting preferential transport of pesticides in a strongly structured silty clay loam (Hudson series) soil. The soil water sampling devices tested were wick pan and gravity pan lysimeters, tile lines, porous ceramic cups, and pipe lysimeters, all installed 45 to 105 cm below the ground surface. A reasonable worst-case scenario was tested by applying a simulated rain storm soon after pesticides were sprayed at agronomic rates. The herbicides atrazine (6-chloro-N2-ethyl-N4-isopropyl-1,3,5-triazine-2,4-diamine) and 2,4-D (2,4-dichloro-phenoxyacetic acid) were chosen as model compounds. A chloride (KCl) tracer was used to determine the spatial and temporal distribution of non-reactive solute and water, as well as a basis for determining the retardation of pesticide movement. Results show that the observed pesticide mobility was much greater than would be predicted by uniform flow. Under relatively high soil moisture conditions, gravity and wick pan lysimeters had comparably good collection efficiencies, whereas the wick samplers had an advantage over gravity-driven samplers when the soil moisture content was below field capacity. Pipe lysimeters had breakthrough patterns that were similar to those of the pan samplers. At the small plot scale, tile line samplers tended to underestimate solute concentrations because of water dilution around the samplers. The porous cup samplers performed poorly because of their sensitivity to local profile characteristics: only by chance can they intercept and sample the preferential flow paths that are critical to transport. The wick sampler had the least

  4. Method optimization for fecal sample collection and fecal DNA extraction.

    PubMed

    Mathay, Conny; Hamot, Gael; Henry, Estelle; Georges, Laura; Bellora, Camille; Lebrun, Laura; de Witt, Brian; Ammerlaan, Wim; Buschart, Anna; Wilmes, Paul; Betsou, Fay

    2015-04-01

    This is the third in a series of publications presenting formal method validation for biospecimen processing in the context of accreditation in laboratories and biobanks. We report here the optimization of a stool processing protocol validated for fitness-for-purpose in terms of downstream DNA-based analyses. Stool collection was initially optimized in terms of sample input quantity and supernatant volume using canine stool. Three DNA extraction methods (PerkinElmer MSM I®, Norgen Biotek All-In-One®, MoBio PowerMag®) and six collection container types were evaluated with human stool in terms of DNA quantity, quality, yield, and reproducibility, assessed by spectrophotometry, spectrofluorometry, quantitative PCR, DNA purity, the SPUD assay, and 16S rRNA gene sequence-based taxonomic signatures. The optimal MSM I protocol involves a 0.2 g stool sample and 1000 μL supernatant. The MSM I extraction was superior in terms of DNA quantity and quality when compared to the other two methods tested. Optimal results were obtained with plain Sarstedt tubes (without stabilizer, requiring immediate freezing and storage at -20°C or -80°C) and Genotek tubes (with stabilizer and room-temperature storage) in terms of DNA yields (total, human, bacterial, and double-stranded) according to spectrophotometry and spectrofluorometry, with low yield variability and good DNA purity. No inhibitors were identified at 25 ng/μL. The protocol was reproducible in terms of DNA yield among different stool aliquots. We validated a stool collection method suitable for downstream DNA metagenomic analysis. DNA extraction with the MSM I method using Genotek tubes was considered optimal, with simple logistics in terms of collection and shipment, and offers the possibility of automation. Laboratories and biobanks should ensure that protocol conditions are systematically recorded in the scope of accreditation.

  5. Drum plug piercing and sampling device and method

    DOEpatents

    Counts, Kevin T [Aiken, SC

    2011-04-26

    An apparatus and method for piercing a drum plug of a drum in order to sample and/or vent gases that may accumulate in a space of the drum is provided. The drum is not damaged and can be reused since the pierced drum plug can be subsequently replaced. The apparatus includes a frame that is configured for engagement with the drum. A cylinder actuated by a fluid is mounted to the frame. A piercer is placed into communication with the cylinder so that actuation of the cylinder causes the piercer to move in a linear direction so that the piercer may puncture the drum plug of the drum.

  6. A GPU code for analytic continuation through a sampling method

    NASA Astrophysics Data System (ADS)

    Nordström, Johan; Schött, Johan; Locht, Inka L. M.; Di Marco, Igor

    We here present a code for performing analytic continuation of fermionic Green's functions and self-energies as well as bosonic susceptibilities on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVidia. Detailed scaling tests are presented, for two different GPUs, in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.

  7. Sample Size for Assessing Agreement between Two Methods of Measurement by Bland-Altman Method.

    PubMed

    Lu, Meng-Jie; Zhong, Wei-Hua; Liu, Yu-Xiu; Miao, Hua-Zhang; Li, Yong-Chang; Ji, Mu-Huo

    2016-11-01

    The Bland-Altman method has been widely used for assessing agreement between two methods of measurement. However, the problem of sample size estimation for this method has remained unsolved. We propose a new method of sample size estimation for Bland-Altman agreement assessment. In the Bland-Altman method, the conclusion on agreement is made based on the width of the confidence interval for the limits of agreement (LOA) in comparison to a predefined clinical agreement limit. Under the theory of statistical inference, formulae for sample size estimation are derived; they depend on the predetermined levels of α and β, the mean and standard deviation of the differences between the two measurements, and the predefined limits. With this new method, sample sizes are calculated under different parameter settings that occur frequently in method comparison studies, and Monte Carlo simulation is used to obtain the corresponding powers. The results of the Monte Carlo simulation showed that the achieved powers coincided with the predetermined power levels, validating the correctness of the method. This method of sample size estimation can be applied in the Bland-Altman method to assess agreement between two methods of measurement.
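
    The abstract does not reproduce the derived formulae, but the same design question can be checked by the Monte Carlo route the authors use for validation. In the sketch below, all parameter values are assumed for illustration, and the standard error of a limit of agreement uses the common approximation s*sqrt(1/n + z^2/(2(n-1))); the procedure estimates the power to conclude agreement at candidate sample sizes:

```python
import numpy as np
from scipy.stats import t

rng = np.random.default_rng(2)

def ba_power(n, mu_d=0.0, sd_d=1.0, delta=2.5, alpha=0.05, reps=5000):
    """Monte Carlo power: probability that the (1 - alpha) confidence intervals
    of both limits of agreement fall inside the clinical limits +/- delta."""
    z = 1.96                                             # LOA multiplier
    se_factor = np.sqrt(1.0 / n + z**2 / (2 * (n - 1)))  # SE(LOA) / s (approx.)
    tcrit = t.ppf(1 - alpha / 2, n - 1)
    hits = 0
    for _ in range(reps):
        d = rng.normal(mu_d, sd_d, size=n)               # paired differences
        m, s = d.mean(), d.std(ddof=1)
        upper_ci = m + z * s + tcrit * se_factor * s     # upper bound, upper LOA
        lower_ci = m - z * s - tcrit * se_factor * s     # lower bound, lower LOA
        hits += (upper_ci < delta) and (lower_ci > -delta)
    return hits / reps

# Smallest n (from a grid) achieving ~80% power under the assumed parameters:
for n in (30, 50, 80, 120, 200):
    print(n, ba_power(n))
```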

  8. Advances in sample preparation in electromigration, chromatographic and mass spectrometric separation methods.

    PubMed

    Gilar, M; Bouvier, E S; Compton, B J

    2001-02-16

    The quality of sample preparation is a key factor in determining the success of analysis. While the analysis of pharmaceutically important compounds in biological matrices has driven the development of sample clean-up procedures forward over the last 20 years, today's chemists face an additional challenge: sample preparation and analysis of complex biochemical samples for characterization of the genotypic or phenotypic information contained in DNA and proteins. This review focuses on various sample pretreatment methods designed to meet the requirements for the analysis of biopolymers and small drugs in complex matrices. We discuss advances in the development of solid-phase extraction (SPE) sorbents, on-line SPE, membrane-based sample preparation, and sample clean-up of biopolymers prior to their analysis by mass spectrometry.

  9. Comparison of individual and pooled sampling methods for detecting bacterial pathogens of fish

    USGS Publications Warehouse

    Mumford, Sonia; Patterson, Chris; Evered, J.; Brunson, Ray; Levine, J.; Winton, J.

    2005-01-01

    Examination of finfish populations for viral and bacterial pathogens is an important component of fish disease control programs worldwide. Two methods are commonly used for collecting tissue samples for bacteriological culture, the currently accepted standards for detection of bacterial fish pathogens. The method specified in the Office International des Epizooties Manual of Diagnostic Tests for Aquatic Animals permits combining renal and splenic tissues from as many as 5 fish into pooled samples. The American Fisheries Society (AFS) Blue Book/US Fish and Wildlife Service (USFWS) Inspection Manual specifies the use of a bacteriological loop for collecting samples from the kidney of individual fish. An alternative would be to more fully utilize the pooled samples taken for virology. If implemented, this approach would provide substantial savings in labor and materials. To compare the relative performance of the AFS/USFWS method and this alternative approach, cultures of Yersinia ruckeri were used to establish low-level infections in groups of rainbow trout (Oncorhynchus mykiss) that were sampled by both methods. Yersinia ruckeri was cultured from 22 of 37 groups by at least 1 method. The loop method yielded 18 positive groups, with 1 group positive in the loop samples but negative in the pooled samples. The pooled samples produced 21 positive groups, with 4 groups positive in the pooled samples but negative in the loop samples. There was statistically significant agreement (Spearman coefficient 0.80, P < 0.001) in the relative ability of the 2 sampling methods to permit detection of low-level bacterial infections of rainbow trout.
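
    The trade-off between individual loop samples and pooled samples can also be explored with a simple probability model. The sketch below is purely illustrative, not the study's data: it assumes a prevalence, a per-fish sensitivity for loop culture, and a per-pool sensitivity, then simulates how often each strategy detects at least one positive in a group:

```python
import numpy as np

rng = np.random.default_rng(3)

def detection_prob(prev, sens_loop, sens_pool, n_fish=60, pool_size=5, reps=20_000):
    """Simulated probability that each strategy detects an infected group:
    loop culture of every fish vs. culture of 5-fish pooled tissue samples.
    All three rate parameters are assumed for illustration."""
    loop_hits = pool_hits = 0
    for _ in range(reps):
        infected = rng.random(n_fish) < prev
        # Individual loops: each infected fish cultures positive with sens_loop.
        loop_hits += (infected & (rng.random(n_fish) < sens_loop)).any()
        # Pools: a pool is truly positive if any of its members is infected.
        pools = infected.reshape(-1, pool_size).any(axis=1)
        pool_hits += (pools & (rng.random(len(pools)) < sens_pool)).any()
    return loop_hits / reps, pool_hits / reps

loop_p, pool_p = detection_prob(prev=0.05, sens_loop=0.6, sens_pool=0.8)
print(f"loop detection:   {loop_p:.3f}")
print(f"pooled detection: {pool_p:.3f}")
```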

  10. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, J.F.

    1996-10-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol, since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength, and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants. 2 figs.

  11. Eigenvector method for umbrella sampling enables error analysis

    NASA Astrophysics Data System (ADS)

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-08-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.
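
    The central step, combining the window data through an eigenproblem, can be illustrated schematically. The sketch below is not the authors' estimator or its error analysis; it assumes a double-well target, harmonic biases, and a crude Metropolis sampler, and recovers relative window weights as the stationary left eigenvector of an overlap matrix F whose entry F_ij averages psi_j / sum_k psi_k over window i's samples:

```python
import numpy as np

rng = np.random.default_rng(4)

centers = np.linspace(-2.0, 2.0, 9)     # window centers (assumed)
kappa = 4.0                             # harmonic bias strength (assumed)

def psi(x):
    """Unnormalized bias functions evaluated at samples x; shape (L, n)."""
    return np.exp(-0.5 * kappa * (x[None, :] - centers[:, None]) ** 2)

def log_pi(x):
    """Unnormalized double-well target (illustrative)."""
    return -((x ** 2 - 1.0) ** 2) / 0.5

def sample_window(i, n=2000, step=0.3):
    """Crude Metropolis chain targeting pi(x) * psi_i(x)."""
    x = np.empty(n)
    x[0] = centers[i]
    for t in range(1, n):
        prop = x[t - 1] + step * rng.standard_normal()
        logr = (log_pi(prop) - log_pi(x[t - 1])
                - 0.5 * kappa * ((prop - centers[i]) ** 2
                                 - (x[t - 1] - centers[i]) ** 2))
        x[t] = prop if np.log(rng.random()) < logr else x[t - 1]
    return x

L = len(centers)
F = np.zeros((L, L))
for i in range(L):
    p = psi(sample_window(i))
    F[i] = (p / p.sum(axis=0)).mean(axis=1)   # rows of F sum to 1

# Window weights: stationary left eigenvector of the stochastic matrix F.
vals, vecs = np.linalg.eig(F.T)
z = np.real(vecs[:, np.argmax(np.real(vals))])
z = np.abs(z) / np.abs(z).sum()
print("relative window weights:", np.round(z, 3))
```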

  12. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, John F.

    1996-01-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol, since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength, and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants.

  13. Eigenvector method for umbrella sampling enables error analysis

    PubMed Central

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-01-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence. PMID:27586912

  14. Eigenvector method for umbrella sampling enables error analysis.

    PubMed

    Thiede, Erik H; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R

    2016-08-28

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.

  15. Personal sampling of airborne particles: method performance and data quality.

    PubMed

    Janssen, N A; Hoek, G; Harssema, H; Brunekreef, B

    1998-01-01

    A study of personal exposure to respirable particles (PM10) and fine particles (FP) was conducted in groups of 50-70-year-old adults and primary school children in the Netherlands. Four to eight personal measurements per subject were conducted, on weekdays only. The averaging time was 24 hours. Method performance was evaluated regarding compliance, flow, the weighing procedure, field blanks, and co-located operation of the personal samplers with stationary methods. Furthermore, the possibility that subjects change their behavior due to wearing personal sampling equipment was studied by comparing time activity on days of personal sampling with time activity on other weekdays. Compliance was high: 95% of the subjects who agreed to continue participating after the first measurement successfully completed the study, and, except for the first two days of FP sampling, over 90% of all personal measurements were successful. All pre- and post-sampling flow readings were within 10% of the required flow rate of 4 L/min. For PM10, the precision of the gravimetric analyses was 2.8 micrograms/m3 and 0.7 micrograms/m3 for filters weighed on an analytical balance and a micro-balance, respectively; the detection limits were 10.8 micrograms/m3 and 8.6 micrograms/m3, respectively. For FP, the weighing precision was 0.4 micrograms/m3 and the detection limit was 5.3 micrograms/m3. All measurements were above the detection limit. Co-located operation of the personal sampler with stationary samplers gave highly correlated concentrations (R > 0.90). Outdoor PM10 concentrations measured with the personal sampler were on average 4% higher compared to a Sierra Anderson (SA) inlet and 9% higher compared to a PM10 Harvard Impactor (HI). With the FP cyclone, 6% higher classroom concentrations were measured compared to a PM2.5 HI. Adults spent significantly less time outdoors (0.5 hour) and more time at home (0.9 hour) on days of personal sampling compared to other weekdays. For children, no significant differences in time

  16. Report: EPA Can Better Reduce Risks From Illegal Pesticides by Effectively Identifying Imports for Inspection and Sampling

    EPA Pesticide Factsheets

    Report #17-P-0412, September 28, 2017. Low rates of inspections and sampling can create a risk that the EPA may not be identifying and deterring the import of pesticides harmful to people or the environment.

  17. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large-order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls-related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large-order systems, with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following capabilities that are not generally available: 1) reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) solution of the generalized eigenvalue problem with balancing and grading, 3) computation of all zeros of the determinant of a matrix of polynomials, 4) matrix exponentiation and the evaluation of integrals involving the matrix exponential, with the option to first block diagonalize, 5) root locus and frequency response for single-variable transfer functions in the S, Z, and W domains, 6) several methods of computing zeros for linear systems, and 7) the ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any

  18. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large-order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls-related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large-order systems, with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following capabilities that are not generally available: 1) reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) solution of the generalized eigenvalue problem with balancing and grading, 3) computation of all zeros of the determinant of a matrix of polynomials, 4) matrix exponentiation and the evaluation of integrals involving the matrix exponential, with the option to first block diagonalize, 5) root locus and frequency response for single-variable transfer functions in the S, Z, and W domains, 6) several methods of computing zeros for linear systems, and 7) the ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any

  19. Photothermal method using a pyroelectric sensor for thermophysical characterization of agricultural and biological samples

    NASA Astrophysics Data System (ADS)

    Frandas, A.; Dadarlat, Dorin; Chirtoc, Mihai; Jalink, Henk; Bicanic, Dane D.; Paris, D.; Antoniow, Jean S.; Egee, Michel; Ungureanu, Costica

    1998-07-01

    The photopyroelectric method, in different experimental configurations, was used for the thermophysical characterization of agricultural and biological samples. The study is important because thermal parameters relate to the quality of foodstuffs (connected to their preservation, storage and adulteration), migration profiles in biodegradable packages, and the mechanism of desiccation tolerance of seeds. Results are presented on the measurement of thermal parameters and their dependence on temperature and water content for samples such as honey, starch, and seeds.

  20. BMAA extraction of cyanobacteria samples: which method to choose?

    PubMed

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition, and according to the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from the cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080-2.5797 μg g⁻¹ DW).

  1. Methods to maximise recovery of environmental DNA from water samples

    PubMed Central

    Gleeson, Dianne; Lintermans, Mark

    2017-01-01

    The environmental DNA (eDNA) method is a detection technique that is rapidly gaining credibility as a sensitive tool useful in the surveillance and monitoring of invasive and threatened species. Because eDNA analysis often deals with small quantities of short and degraded DNA fragments, methods that maximize eDNA recovery are required to increase detectability. In this study, we performed experiments at different stages of the eDNA analysis to show which combinations of methods give the best recovery rate for eDNA. Using the Oriental weatherloach (Misgurnus anguillicaudatus) as a study species, we show that various combinations of DNA capture, preservation and extraction methods can significantly affect DNA yield. Filtration using cellulose nitrate filter paper, preserved in ethanol or stored in a -20°C freezer and extracted with the Qiagen DNeasy kit, outperformed other combinations in terms of cost and efficiency of DNA recovery. Our results support the recommendation to filter water samples within 24 hours, but if this is not possible, our results suggest that refrigeration may be a better option than freezing for short-term storage (i.e., 3–5 days). This information is useful in designing eDNA detection of low-density invasive or threatened species, where small variations in DNA recovery can signify the difference between detection success or failure. PMID:28604830

  2. Importance sampling-based Monte Carlo simulation of time-domain optical coherence tomography with embedded objects.

    PubMed

    Periyasamy, Vijitha; Pramanik, Manojit

    2016-04-10

    Monte Carlo simulation of light propagation in biological tissue is widely used to study light-tissue interaction. Simulation for optical coherence tomography (OCT) studies requires handling embedded objects of various shapes. In this work, time-domain OCT simulations for multilayered tissue with embedded objects (such as spheres, cylinders, ellipsoids, and cuboids) were done. Improved importance sampling (IS) was implemented in the proposed OCT simulation for faster speed. First, IS was validated against standard and angular-biased Monte Carlo methods for OCT. Both class I and class II photons were in agreement across all three methods; however, the IS method had more than a tenfold improvement in terms of simulation time. Next, B-scan images were obtained for four types of embedded objects. All four shapes are clearly visible in the B-scan OCT images. With the improved IS, B-scan OCT images of embedded objects can be obtained with reasonable simulation time using a standard desktop computer. The user-friendly, C-based Monte Carlo simulation for tissue layers with embedded objects for OCT (MCEO-OCT) will be very useful for time-domain OCT simulations in many biological applications.

  3. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction (SCWE) [4], aqueous microwave-assisted extraction (MAE), and organic solvent extraction [5]. Similarly, miniaturized analytical space flight instruments under development that require aqueous extracts include microfluidic capillary electrophoresis chips (μCE) [6], liquid chromatography-mass spectrometers (LC-MS) [7], and life marker chips (LMC) [8]. If organics are present on the surface of Mars, they are expected to be present at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. It therefore becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds for natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e., salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  4. A Gibbs sampling method to determine biomarkers for asthma.

    PubMed

    Huang, Zhi-Jian; Shen, Qin-Hai; Wu, Yan-Sheng; Huang, Ya-Li

    2017-04-01

    To identify potential biomarkers and to uncover the mechanisms underlying asthma based on Gibbs sampling. Molecular functions (MFs) containing more than 5 genes were determined using AnnotationMFGO of the BAGS package, and the obtained MFs were then transformed into a Markov chain (MC). Gibbs sampling was conducted to obtain a new MC. Meanwhile, the average probabilities of the MFs were computed via a Markov chain Monte Carlo (MCMC) algorithm, followed by identification of differentially expressed MFs based on MF probabilities greater than 0.6. Moreover, the differentially expressed genes (DEGs) and their correlated genes were screened and merged, together termed co-expressed genes. Pathway enrichment analysis was implemented for the co-expressed genes. In total, 396 MFs with gene sets of more than 5 genes were determined. After Gibbs sampling, 5 differentially expressed MFs were acquired according to alfa.pi > 0.6. The genes in these 5 differentially expressed MFs were merged, and 110 DEGs were identified. Subsequently, 338 co-expressed genes were obtained. At P < 0.01, the co-expressed genes were significantly enriched in 6 pathways. Among these, ubiquitin-mediated proteolysis contained the largest number of co-expressed genes (35), and cell cycle was enriched with the second-largest number (11). The identified pathways, such as ubiquitin-mediated proteolysis and cell cycle, might play important roles in the development of asthma and may be useful for developing credible approaches for the diagnosis and treatment of asthma in the future. Copyright © 2017 Elsevier Ltd. All rights reserved.
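
    Gibbs sampling itself, redrawing each variable from its full conditional given the others, is easy to illustrate outside this pipeline. The sketch below is a generic minimal example on a bivariate normal with an assumed correlation, not the BAGS-based asthma analysis:

```python
import numpy as np

rng = np.random.default_rng(5)

def gibbs_bivariate_normal(rho=0.8, n_iter=10_000, burn=1_000):
    """Minimal Gibbs sampler for a standard bivariate normal with correlation rho.
    Each full conditional is x | y ~ N(rho * y, 1 - rho**2), and vice versa."""
    x = y = 0.0
    draws = []
    for t in range(n_iter):
        x = rng.normal(rho * y, np.sqrt(1.0 - rho ** 2))
        y = rng.normal(rho * x, np.sqrt(1.0 - rho ** 2))
        if t >= burn:
            draws.append((x, y))
    return np.asarray(draws)

chain = gibbs_bivariate_normal()
print("empirical correlation:", np.corrcoef(chain.T)[0, 1])  # close to 0.8
```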

  5. RESULTS FROM EPA FUNDED RESEARCH PROGRAMS ON THE IMPORTANCE OF PURGE VOLUME, SAMPLE VOLUME, SAMPLE FLOW RATE AND TEMPORAL VARIATIONS ON SOIL GAS CONCENTRATIONS

    EPA Science Inventory

    Two research studies funded and overseen by EPA have been conducted since October 2006 on soil gas sampling methods and variations in shallow soil gas concentrations with the purpose of improving our understanding of soil gas methods and data for vapor intrusion applications. Al...

  6. RESULTS FROM EPA FUNDED RESEARCH PROGRAMS ON THE IMPORTANCE OF PURGE VOLUME, SAMPLE VOLUME, SAMPLE FLOW RATE AND TEMPORAL VARIATIONS ON SOIL GAS CONCENTRATIONS

    EPA Science Inventory

    Two research studies funded and overseen by EPA have been conducted since October 2006 on soil gas sampling methods and variations in shallow soil gas concentrations with the purpose of improving our understanding of soil gas methods and data for vapor intrusion applications. Al...

  7. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  8. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such a...

  9. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  10. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  11. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  12. [Proportion of aflatoxin B1 contaminated kernels and its concentration in imported peanut samples].

    PubMed

    Hirano, S; Shima, T; Shimada, T

    2001-08-01

    Moldy and split peanut kernels were separated by visual inspection from peanuts exported from Brazil, Sudan, India and Taiwan. The remaining peanuts from Brazil, Sudan and India were roasted lightly and the skins were removed. Stained peanuts were separated from the others. Aflatoxin was detected in moldy and stained peanuts. There was a positive correlation between the percentage of aflatoxin-contaminated peanut kernels and the aflatoxin B1 concentration in whole samples. The aflatoxin concentration of moldy peanuts was higher than that of stained peanut kernels.

  13. Sampling small mammals in southeastern forests: the importance of trapping in trees

    Treesearch

    Susan C. Loeb; Gregg L. Chapman; Theodore R. Ridley

    2001-01-01

    Because estimates of small mammal species richness and diversity are strongly influenced by sampling methodology, 2 or more trap types are often used in studies of small mammal communities. However, in most cases, all traps are placed at ground level. In contrast, we used Sherman live traps placed at 1.5 m in trees in addition to Sherman live traps and Mosby box traps...

  14. Rapid method for the determination of 226Ra in hydraulic fracturing wastewater samples

    SciTech Connect

    Maxwell, Sherrod L.; Culligan, Brian K.; Warren, Richard A.; McAlister, Daniel R.

    2016-03-24

    A new method that rapidly preconcentrates and measures 226Ra from hydraulic fracturing wastewater samples was developed in the Savannah River Environmental Laboratory. The method improves the quality of 226Ra measurements using gamma spectrometry by providing up to 100x preconcentration of 226Ra from this difficult sample matrix, which contains very high levels of calcium, barium, strontium, magnesium and sodium. The high chemical yield, typically 80-90%, facilitates a low detection limit, important for lower level samples, and indicates method ruggedness. Ba-133 tracer is used to determine chemical yield and correct for geometry-related counting issues. The 226Ra sample preparation takes < 2 hours.

  15. Approaches of using the beard testing method to obtain complete length distributions of the original samples

    USDA-ARS?s Scientific Manuscript database

    Fiber testing instruments such as HVI can rapidly measure fiber length by testing a tapered fiber beard of the sample. However, instruments that use the beard testing method report only a limited number of fiber length parameters instead of the complete length distribution that is important fo...

  16. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Mixed-methods research currently lacks methods for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
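
    The first study's setup, clustering small binary code matrices and scoring recovery accuracy, can be sketched as follows. The simulated profiles, code-endorsement probabilities, and sample size are assumptions for illustration (latent class analysis is omitted); hierarchical clustering uses a Jaccard distance suited to binary codes:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import pdist
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(6)

# Simulate 50 "participants" x 20 binary codes drawn from two latent profiles,
# mimicking coded qualitative interviews (all probabilities assumed).
n, k = 50, 20
truth = rng.integers(0, 2, size=n)
p = np.where(truth[:, None] == 0, 0.2, 0.7)   # per-code endorsement probability
X = (rng.random((n, k)) < p).astype(int)

# Hierarchical clustering on a binary-appropriate (Jaccard) distance.
Z = linkage(pdist(X.astype(bool), metric="jaccard"), method="average")
hier_labels = fcluster(Z, t=2, criterion="maxclust")

# K-means applied directly to the 0/1 matrix.
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Adjusted Rand index: 1.0 means perfect recovery of the latent profiles.
print("hierarchical ARI:", adjusted_rand_score(truth, hier_labels))
print("k-means ARI:     ", adjusted_rand_score(truth, km_labels))
```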

  17. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    PubMed Central

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  18. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... by truck or rail car? 80.583 Section 80.583 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... car may comply with the following requirements instead of the requirements to sample and test...

  19. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... by truck or rail car? 80.583 Section 80.583 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... car may comply with the following requirements instead of the requirements to sample and test...

  20. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... by truck or rail car? 80.583 Section 80.583 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... car may comply with the following requirements instead of the requirements to sample and test...

  1. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    PubMed

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method shows biases towards certain families. Information is also provided on which sampling technique would be more appropriate for detecting a particular family.

  2. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    PubMed Central

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569
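
    The quantitation step of an isotope dilution method can be sketched as a linear calibration on the native-to-labeled peak-area ratio, as below. This is a generic form assumed for illustration, not the laboratory's actual calibration model; all names and numbers are placeholders.

    ```python
    def idms_concentration(area_native, area_labeled, slope, intercept=0.0):
        """Concentration (pg/mL) from the native-to-labeled peak-area ratio,
        assuming a linear calibration: ratio = slope * conc + intercept."""
        ratio = area_native / area_labeled
        return (ratio - intercept) / slope

    # Illustrative peak areas and calibration slope
    print(idms_concentration(area_native=5.2e4, area_labeled=1.0e5, slope=0.012))
    ```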

  3. Detection of Acanthamoeba and Toxoplasma in River Water Samples by Molecular Methods in Iran

    PubMed Central

    MAHMOUDI, Mohammad Reza; KAZEMI, Bahram; HAGHIGHI, Ali; KARANIS, Panagiotis

    2015-01-01

    Background: Free-living amoebae such as Acanthamoeba species may act as carriers of Cryptosporidium and Toxoplasma oocysts, and thus may play an important role in the water-borne transmission of these parasites. In the present study, a loop-mediated isothermal amplification (LAMP) method for detection of Toxoplasma and a PCR assay were developed for investigation of Acanthamoeba in environmental water samples. Methods: A total of 34 samples were collected from the surface water in Guilan Province. Water samples were filtrated with membrane filters, followed by DNA extraction. PCR and LAMP methods were used for detection of the protozoan parasites Acanthamoeba and Toxoplasma, respectively. Results: In total, 30 and 2 of the 34 samples were positive for Acanthamoeba and Toxoplasma oocysts, respectively. Two samples were positive for both investigated parasites. Conclusion: The investigated water supplies are contaminated by Toxoplasma and Acanthamoeba (oo)cysts. Acanthamoeba may play an important role in water-borne transmission of Toxoplasma in the study area. For the first time, the LAMP protocol was used effectively for the detection of Toxoplasma in surface water samples in Iran. PMID:26246823

  4. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  5. Passive sampling methods for contaminated sediments: risk assessment and management.

    PubMed

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-04-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and, improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and, evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  6. Improved transition path sampling methods for simulation of rare events.

    PubMed

    Chopra, Manan; Malshe, Rohit; Reddy, Allam S; de Pablo, J J

    2008-04-14

    The free energy surfaces of a wide variety of systems encountered in physics, chemistry, and biology are characterized by the existence of deep minima separated by numerous barriers. One of the central aims of recent research in computational chemistry and physics has been to determine how transitions occur between deep local minima on rugged free energy landscapes, and transition path sampling (TPS) Monte-Carlo methods have emerged as an effective means for numerical investigation of such transitions. Many of the shortcomings of TPS-like approaches generally stem from their high computational demands. Two new algorithms are presented in this work that improve the efficiency of TPS simulations. The first algorithm uses biased shooting moves to render the sampling of reactive trajectories more efficient. The second algorithm is shown to substantially improve the accuracy of the transition state ensemble by introducing a subset of local transition path simulations in the transition state. The system considered in this work consists of a two-dimensional rough energy surface that is representative of numerous systems encountered in applications. When taken together, these algorithms provide gains in efficiency of over two orders of magnitude when compared to traditional TPS simulations.

  7. Importance of closely spaced vertical sampling in delineating chemical and microbiological gradients in groundwater studies

    USGS Publications Warehouse

    Smith, R.L.; Harvey, R.W.; LeBlanc, D.R.

    1991-01-01

    Vertical gradients of selected chemical constituents, bacterial populations, bacterial activity and electron acceptors were investigated for an unconfined aquifer contaminated with nitrate and organic compounds on Cape Cod, Massachusetts, U.S.A. Fifteen-port multilevel sampling devices (MLS's) were installed within the contaminant plume at the source of the contamination, and at 250 and 2100 m downgradient from the source. Depth profiles of specific conductance and dissolved oxygen at the downgradient sites exhibited vertical gradients that were both steep and inversely related. Narrow zones (2-4 m thick) of high N2O and NH4+ concentrations were also detected within the contaminant plume. A 27-fold change in bacterial abundance; a 35-fold change in frequency of dividing cells (FDC), an indicator of bacterial growth; a 23-fold change in 3H-glucose uptake, a measure of heterotrophic activity; and substantial changes in overall cell morphology were evident within a 9-m vertical interval at 250 m downgradient. The existence of these gradients argues for the need for closely spaced vertical sampling in groundwater studies because small differences in the vertical placement of a well screen can lead to incorrect conclusions about the chemical and microbiological processes within an aquifer.

  8. A conditional random fields method for RNA sequence-structure relationship modeling and conformation sampling.

    PubMed

    Wang, Zhiyong; Xu, Jinbo

    2011-07-01

    Accurate tertiary structures are very important for the functional study of non-coding RNA molecules. However, predicting RNA tertiary structures is extremely challenging, because of a large conformation space to be explored and lack of an accurate scoring function differentiating the native structure from decoys. The fragment-based conformation sampling method (e.g. FARNA) bears the shortcoming that the limited size of a fragment library makes it infeasible to represent all possible conformations well. A recent dynamic Bayesian network method, BARNACLE, overcomes the issue of fragment assembly. In addition, neither of these methods makes use of sequence information in sampling conformations. Here, we present a new probabilistic graphical model, conditional random fields (CRFs), to model the RNA sequence-structure relationship, which enables us to accurately estimate the probability of an RNA conformation from sequence. Coupled with a novel tree-guided sampling scheme, our CRF model is then applied to RNA conformation sampling. Experimental results show that our CRF method can model the RNA sequence-structure relationship well and that sequence information is important for conformation sampling. Our method, named TreeFolder, generates a much higher percentage of native-like decoys than FARNA and BARNACLE, although we use the same simple energy function as BARNACLE. Contact: zywang@ttic.edu; j3xu@ttic.edu. Supplementary data are available at Bioinformatics online.

  9. Rectal swab screening assays of public health importance in molecular diagnostics: Sample adequacy control.

    PubMed

    Glisovic, Sanja; Eintracht, Shaun; Longtin, Yves; Oughton, Matthew; Brukner, Ivan

    2017-08-08

    Rectal swabs are routinely used by public health authorities to screen for multi-drug resistant enteric bacteria including vancomycin-resistant enterococci (VRE) and carbapenem-resistant enterobacteriaceae (CRE). Screening sensitivity can be influenced by the quality of the swabbing, whether performed by the patient (self-swabbing) or a healthcare practitioner. One common exclusion criterion for rectal swabs is the absence of "visible soiling" from fecal matter. In our institution, this criterion excludes almost 10% of rectal swabs received in the microbiology laboratory. Furthermore, over 30% of patients in whom rectal swabs are cancelled will not be re-screened within the next 48 h, resulting in delays in removing infection prevention measures. We describe two quantitative polymerase chain reaction (qPCR)-based assays, human RNase P and eubacterial 16S rDNA, which might serve as suitable controls for sampling adequacy. However, lower amounts of amplifiable human DNA make the 16S rDNA assay a better candidate for sample adequacy control. Copyright © 2017. Published by Elsevier Ltd.

  10. Characterization of spinal cord lesions in cattle and horses with rabies: the importance of correct sampling.

    PubMed

    Bassuino, Daniele M; Konradt, Guilherme; Cruz, Raquel A S; Silva, Gustavo S; Gomes, Danilo C; Pavarini, Saulo P; Driemeier, David

    2016-07-01

    Twenty-six cattle and 7 horses were diagnosed with rabies. Samples of brain and spinal cord were processed for hematoxylin and eosin staining and immunohistochemistry (IHC). In addition, refrigerated fragments of brain and spinal cord were tested by the direct fluorescent antibody test and intracerebral inoculation in mice. Statistical analyses and the Fisher exact test were performed with commercial software. Histologic lesions were observed in the spinal cord in all of the cattle and horses. Inflammatory lesions in horses were moderate at the thoracic, lumbar, and sacral levels, and marked at the lumbar enlargement level. Gitter cells were present in large numbers in the lumbar enlargement region. IHC staining intensity ranged from moderate to strong. Inflammatory lesions in cattle were moderate in all spinal cord sections, and gitter cells were present in small numbers. IHC staining intensity was strong in all spinal cord sections. Only 2 horses exhibited lesions in the brain, located mainly in the obex and cerebellum, in contrast to cattle, in which brain lesions were observed in 25 cases. The Fisher exact test showed that the odds of detecting lesions caused by rabies in horses are 3.5 times higher when spinal cord sections are analyzed, as compared to analysis of brain samples alone.

  11. De novo mutations from sporadic schizophrenia cases highlight important signaling genes in an independent sample.

    PubMed

    Kranz, Thorsten M; Harroch, Sheila; Manor, Orly; Lichtenberg, Pesach; Friedlander, Yechiel; Seandel, Marco; Harkavy-Friedman, Jill; Walsh-Messinger, Julie; Dolgalev, Igor; Heguy, Adriana; Chao, Moses V; Malaspina, Dolores

    2015-08-01

    Schizophrenia is a debilitating syndrome with high heritability. Genomic studies reveal more than a hundred genetic variants, largely nonspecific and of small effect size, and not accounting for its high heritability. De novo mutations are one mechanism whereby disease related alleles may be introduced into the population, although these have not been leveraged to explore the disease in general samples. This paper describes a framework to find high impact genes for schizophrenia. This study consists of two different datasets. First, whole exome sequencing was conducted to identify disruptive de novo mutations in 14 complete parent-offspring trios with sporadic schizophrenia from Jerusalem, which identified 5 sporadic cases with de novo gene mutations in 5 different genes (PTPRG, TGM5, SLC39A13, BTK, CDKN3). Next, targeted exome capture of these genes was conducted in 48 well-characterized, unrelated, ethnically diverse schizophrenia cases, recruited and characterized by the same research team in New York (NY sample), which demonstrated extremely rare and potentially damaging variants in three of the five genes (MAF<0.01) in 12/48 cases (25%), including PTPRG (5 cases), SLC39A13 (4 cases) and TGM5 (4 cases), a higher number than usually identified by whole exome sequencing. Cases differed in cognition and illness features based on which mutation-enriched gene they carried. Functional de novo mutations in protein-interaction domains in sporadic schizophrenia can illuminate risk genes that increase the propensity to develop schizophrenia across ethnicities.

  12. Evaluating the performance of sampling plans to detect hypoglycin A in ackee fruit shipments imported into the United States.

    PubMed

    Whitaker, Thomas B; Saltsman, Joyce J; Ware, George M; Slate, Andrew B

    2007-01-01

    Hypoglycin A (HGA) is a toxic amino acid that is naturally produced in unripe ackee fruit. In 1973, the U.S. Food and Drug Administration (FDA) placed a worldwide import alert on ackee fruit, which banned the product from entering the United States. The FDA has considered establishing a regulatory limit for HGA and lifting the ban, which will require development of a monitoring program. The establishment of a regulatory limit for HGA requires the development of a scientifically based sampling plan to detect HGA in ackee fruit imported into the United States. Thirty-three lots of ackee fruit were sampled according to an experimental protocol in which 10 samples, i.e., ten 19 oz cans, were randomly taken from each lot and analyzed for HGA by using liquid chromatography. The total variance was partitioned into sampling and analytical variance components, which were found to be a function of the HGA concentration. Regression equations were developed to predict the total, sampling, and analytical variances as a function of HGA concentration. The observed HGA distribution among the test results for the 10 HGA samples was compared with the normal and lognormal distributions. A computer model based on the lognormal distribution was developed to predict the performance of sampling plan designs to detect HGA in ackee fruit shipments. The performance of several sampling plan designs was evaluated to demonstrate how to manipulate sample size and accept/reject limits to reduce misclassification of ackee fruit lots.
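
    The style of computer model described above can be sketched as a Monte Carlo acceptance-probability calculation under a lognormal distribution of lot test results. The coefficient of variation, decision rule, and limits below are assumptions for illustration; the paper's fitted variance-versus-concentration regressions are not reproduced.

    ```python
    import numpy as np

    def p_accept(true_conc_ppm, n_samples, limit_ppm, cv=0.35,
                 n_lots=20000, seed=1):
        """Probability a lot is accepted: mean of n_samples test results,
        drawn lognormally around the true concentration, falls at or below
        the accept/reject limit."""
        rng = np.random.default_rng(seed)
        sigma = np.sqrt(np.log(1 + cv**2))            # lognormal shape from CV
        mu = np.log(true_conc_ppm) - 0.5 * sigma**2   # preserves the mean
        results = rng.lognormal(mu, sigma, size=(n_lots, n_samples))
        return np.mean(results.mean(axis=1) <= limit_ppm)

    # Operating characteristic at a few hypothetical true concentrations
    for conc in (50, 100, 150, 200):
        print(conc, "ppm ->", p_accept(conc, n_samples=10, limit_ppm=100))
    ```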

  13. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    PubMed

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. Published by Elsevier B.V.
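
    A rough sketch of the pattern analysis named above: non-metric multidimensional scaling on Jaccard dissimilarities between samples' TOC detection profiles, followed by hierarchical clustering of the ordination. The detection matrix here is random stand-in data, not the study's results.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist, squareform
    from sklearn.manifold import MDS

    rng = np.random.default_rng(42)
    detections = rng.random((12, 69)) < 0.3        # 12 samples x 69 TOCs (fake)

    dissim = squareform(pdist(detections, metric="jaccard"))
    coords = MDS(n_components=2, metric=False,             # non-metric MDS
                 dissimilarity="precomputed",
                 random_state=0).fit_transform(dissim)
    groups = fcluster(linkage(pdist(coords), method="ward"),
                      t=3, criterion="maxclust")           # e.g., 3 methods
    print(groups)
    ```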

  14. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    USGS Publications Warehouse

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  15. A simple capacitive method to evaluate ethanol fuel samples

    NASA Astrophysics Data System (ADS)

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.

    2017-02-01

    Ethanol is a biofuel used worldwide. However, the presence of excessive water, introduced either during the distillation process or by fraudulent adulteration, is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel plate capacitor containing a conformal aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range of water concentration, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few.

  16. Martian Radiative Transfer Modeling Using the Optimal Spectral Sampling Method

    NASA Technical Reports Server (NTRS)

    Eluszkiewicz, J.; Cady-Pereira, K.; Uymin, G.; Moncet, J.-L.

    2005-01-01

    The large volume of existing and planned infrared observations of Mars has prompted the development of a new martian radiative transfer model that could be used in the retrievals of atmospheric and surface properties. The model is based on the Optimal Spectral Sampling (OSS) method [1]. The method is a fast and accurate monochromatic technique applicable to a wide range of remote sensing platforms (from microwave to UV) and was originally developed for the real-time processing of infrared and microwave data acquired by instruments aboard the satellites forming part of the next-generation global weather satellite system NPOESS (National Polar-orbiting Operational Environmental Satellite System) [2]. As part of our on-going research related to the radiative properties of the martian polar caps, we have begun the development of a martian OSS model with the goal of using it to perform self-consistent atmospheric corrections necessary to retrieve cap emissivity from the Thermal Emission Spectrometer (TES) spectra. While the caps will provide the initial focus area for applying the new model, it is hoped that the model will be of interest to the wider Mars remote sensing community.

  17. A simple capacitive method to evaluate ethanol fuel samples

    PubMed Central

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.

    2017-01-01

    Ethanol is a biofuel used worldwide. However, the presence of excessive water, introduced either during the distillation process or by fraudulent adulteration, is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel plate capacitor containing a conformal aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range of water concentration, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few. PMID:28240312

  18. MARKOV CHAIN MONTE CARLO POSTERIOR SAMPLING WITH THE HAMILTONIAN METHOD

    SciTech Connect

    K. HANSON

    2001-02-01

    The Markov Chain Monte Carlo technique provides a means for drawing random samples from a target probability density function (pdf). MCMC allows one to assess the uncertainties in a Bayesian analysis described by a numerically calculated posterior distribution. This paper describes the Hamiltonian MCMC technique in which a momentum variable is introduced for each parameter of the target pdf. In analogy to a physical system, a Hamiltonian H is defined as a kinetic energy involving the momenta plus a potential energy φ, where φ is minus the logarithm of the target pdf. Hamiltonian dynamics allows one to move along trajectories of constant H, taking large jumps in the parameter space with relatively few evaluations of φ and its gradient. The Hamiltonian algorithm alternates between picking a new momentum vector and following such trajectories. The efficiency of the Hamiltonian method for multidimensional isotropic Gaussian pdfs is shown to remain constant at around 7% for up to several hundred dimensions. The Hamiltonian method handles correlations among the variables much better than the standard Metropolis algorithm. A new test, based on the gradient of φ, is proposed to measure the convergence of the MCMC sequence.
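
    A compact illustration of the algorithm as described: for an isotropic Gaussian target, φ(x) is minus the log of the pdf, a fresh momentum vector is drawn each iteration, and a leapfrog trajectory of roughly constant H is followed before a Metropolis accept step. The step size and trajectory length below are arbitrary choices, and the paper's convergence test is not reproduced.

    ```python
    import numpy as np

    def hmc(dim=10, n_iter=1000, eps=0.1, n_leapfrog=20, seed=0):
        rng = np.random.default_rng(seed)
        phi = lambda x: 0.5 * np.sum(x**2)          # minus log target pdf
        grad = lambda x: x                          # gradient of phi
        x, samples = np.zeros(dim), []
        for _ in range(n_iter):
            p = rng.standard_normal(dim)            # pick a new momentum vector
            h_old = phi(x) + 0.5 * np.sum(p**2)     # H = potential + kinetic
            x_new, p_new = x.copy(), p.copy()
            p_new -= 0.5 * eps * grad(x_new)        # leapfrog: half momentum step
            for _ in range(n_leapfrog - 1):
                x_new += eps * p_new
                p_new -= eps * grad(x_new)
            x_new += eps * p_new
            p_new -= 0.5 * eps * grad(x_new)        # closing half momentum step
            h_new = phi(x_new) + 0.5 * np.sum(p_new**2)
            if rng.random() < np.exp(h_old - h_new):
                x = x_new                           # Metropolis accept
            samples.append(x.copy())
        return np.array(samples)

    print(hmc().mean(axis=0))                       # should be near zero
    ```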

  19. Understanding the role of Propionibacterium acnes in acne vulgaris: The critical importance of skin sampling methodologies.

    PubMed

    Omer, Hélène; McDowell, Andrew; Alexeyev, Oleg A

    Acne vulgaris is a chronic inflammatory skin condition classified by the Global Burden of Disease Study as the eighth most prevalent disease worldwide. The pathophysiology of the condition has been extensively studied, with an increase in sebum production, abnormal keratinization of the pilosebaceous follicle, and an inflammatory immune response all implicated in its etiology. One of the most disputed points, however, is the role of the gram-positive anaerobic bacterium Propionibacterium acnes in the development of acne, particularly when this organism is also found in normal sebaceous follicles of healthy skin. Against this background, we now describe the different sampling strategies that have been adopted for qualitative and quantitative study of P acnes within intact hair follicles of the skin and discuss the strengths and weaknesses of such methodologies for investigating the role of P acnes in the development of acne. Copyright © 2016 Elsevier Inc. All rights reserved.

  20. Allergic contact dermatitis from exotic woods: importance of patch-testing with patient-provided samples.

    PubMed

    Podjasek, Joshua O; Cook-Norris, Robert H; Richardson, Donna M; Drage, Lisa A; Davis, Mark D P

    2011-01-01

    Exotic woods from tropical and subtropical regions (eg, from South America, south Asia, and Africa) frequently are used occupationally and recreationally by woodworkers and hobbyists. These exotic woods more commonly provoke irritant contact dermatitis reactions, but they also can provoke allergic contact dermatitis reactions. We report three patients seen at Mayo Clinic (Rochester, MN) with allergic contact dermatitis reactions to exotic woods. Patch testing was performed and included patient-provided wood samples. Avoidance of identified allergens was recommended. For all patients, the dermatitis cleared or improved after avoidance of the identified allergens. Clinicians must be aware of the potential for allergic contact dermatitis reactions to compounds in exotic woods. Patch testing should be performed with suspected woods for diagnostic confirmation and allowance of subsequent avoidance of the allergens.

  1. Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples

    SciTech Connect

    Maxwell, Sherrod L.; Hutchison, Jay B.; McAlister, Daniel R.

    2015-02-14

    Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S. Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. The failures illustrate the importance of rugged soil dissolution methods for the accurate measurement of analytes in the sample matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 °C in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. Plutonium can also be separated and measured using this method. Batches of 12 samples can be prepared for measurement in <5 hours.

  2. Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples

    DOE PAGES

    Maxwell, Sherrod L.; Hutchison, Jay B.; McAlister, Daniel R.

    2015-02-14

    Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S. Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. The failures illustrate the importance of rugged soil dissolution methods for the accurate measurement of analytes in the sample matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 °C in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. Plutonium can also be separated and measured using this method. Batches of 12 samples can be prepared for measurement in <5 hours.

  3. Development of active and diffusive sampling methods for determination of 3-methoxybutyl acetate in workplace air.

    PubMed

    Takeuchi, Akito; Takigawa, Tomoko; Kawasumi, Yaeko; Yasugi, Tomojiro; Endo, Yoko; Wang, Da-Hong; Takaki, Jiro; Sakurai, Haruhiko; Ogino, Keiki

    2007-11-01

    Monitoring of the workplace concentration of 3-methoxybutyl acetate (MBA), which is used in printer's ink and thinner for screen-printing and as an organic solvent to dissolve various resins, is important for health reasons. An active and a diffusive sampling method, using a gas chromatograph equipped with a flame ionization detector, were developed for the determination of MBA in workplace air. For the active sampling method using an activated charcoal tube, the overall desorption efficiency was 101%, the overall recovery was 104%, and the recovery after 8 days of storage in a refrigerator was more than 90%. For the diffusive sampling method using the 3M 3500 organic vapor monitor, the MBA sampling rate was 19.89 cm³ min⁻¹. The linear range was from 0.01 to 96.00 µg ml⁻¹, with a correlation coefficient of 0.999, and the detection limits of the active and diffusive samplers were 0.04 and 0.07 µg sample⁻¹, respectively. The geometric means of stationary sampling and personal sampling in a screen-printing factory were 12.61 and 16.52 ppm, respectively, indicating that both methods can be used to measure MBA in workplace air.
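
    The diffusive result converts to an air concentration as mass collected divided by (sampling rate × time); a sketch using the 19.89 cm³ min⁻¹ rate quoted above follows. The collected mass, the shift length, and the molar-mass-based ppm conversion are illustrative assumptions.

    ```python
    def diffusive_conc_mg_m3(mass_ug, minutes, rate_cm3_min=19.89):
        """Air concentration from a diffusive badge: mass / (rate x time)."""
        air_volume_m3 = rate_cm3_min * minutes / 1e6    # cm3 -> m3
        return (mass_ug / 1000.0) / air_volume_m3       # ug -> mg

    mg_m3 = diffusive_conc_mg_m3(mass_ug=55.0, minutes=480)  # assumed 8-h sample
    # ppm conversion assumes ~146 g/mol for MBA at 25 °C and 1 atm
    ppm = mg_m3 * 24.45 / 146.18
    print(f"{mg_m3:.2f} mg/m3 ~ {ppm:.2f} ppm")
    ```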

  4. Chlamydophila pneumoniae diagnostics: importance of methodology in relation to timing of sampling.

    PubMed

    Hvidsten, D; Halvorsen, D S; Berdal, B P; Gutteberg, T J

    2009-01-01

    The diagnostic impact of PCR-based detection was compared to single-serum IgM antibody measurement and IgG antibody seroconversion during an outbreak of Chlamydophila pneumoniae in a military community. Nasopharyngeal swabs for PCR-based detection, and serum, were obtained from 127 conscripts during the outbreak. Serum, drawn many months before the outbreak, provided the baseline antibody status. C. pneumoniae IgM and IgG antibodies were assayed using microimmunofluorescence (MIF), enzyme immunoassay (EIA) and recombinant ELISA (rELISA). Two reference standard tests were applied: (i) C. pneumoniae PCR; and (ii) assay of C. pneumoniae IgM antibodies, defined as positive if ≥2 IgM antibody assays (i.e. rELISA with MIF and/or EIA) were positive. In 33 subjects, of whom two tested negative according to IgM antibody assays and IgG seroconversion, C. pneumoniae DNA was detected by PCR. The sensitivities were 79%, 85%, 88% and 68%, respectively, and the specificities were 86%, 84%, 78% and 93%, respectively, for MIF IgM, EIA IgM, rELISA IgM and PCR. In two subjects, acute infection was diagnosed on the basis of IgG antibody seroconversion alone. The sensitivity of PCR detection was lower than that of any IgM antibody assay. This may be explained by the late sampling, or clearance of the organism following antibiotic treatment. The results of assay evaluation studies are affected not only by the choice of reference standard tests, but also by the timing of sampling for the different test principles used. On the basis of these findings, a combination of nasopharyngeal swabbing for PCR detection and specific single-serum IgM measurement is recommended in cases of acute respiratory C. pneumoniae infection.

  5. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
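
    The idea can be illustrated with a readily available monotone cubic interpolant standing in for the paper's B-spline and rational spline forms: fit the sample quantile function at plotting positions, then evaluate it at uniform random numbers to generate new variates (inverse-CDF sampling).

    ```python
    import numpy as np
    from scipy.interpolate import PchipInterpolator

    rng = np.random.default_rng(3)
    data = np.sort(rng.gamma(shape=2.0, scale=1.5, size=200))   # skewed sample
    probs = (np.arange(1, data.size + 1) - 0.5) / data.size     # plotting positions

    quantile_fn = PchipInterpolator(probs, data)                # Q(p) spline
    # Draw uniforms within the fitted range and push them through Q(p)
    simulated = quantile_fn(rng.uniform(probs[0], probs[-1], size=1000))
    print(simulated.mean(), data.mean())
    ```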

  6. Probing methane hydrate nucleation through the forward flux sampling method.

    PubMed

    Bi, Yuanfei; Li, Tianshu

    2014-11-26

    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, by combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS in studying the formation of methane hydrate, we developed an effective order parameter λ on the basis of the topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integration of the order parameter λ with FFS allows explicitly computing hydrate nucleation rates and obtaining an ensemble of nucleation trajectories under conditions where spontaneous hydrate nucleation becomes too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution of the spontaneously formed hydrate seeds obtained from the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from direct simulation. Analyzing the obtained large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface, and facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate.
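
    Below is a schematic of the FFS control flow shaped after the description above, with a toy one-dimensional random walk standing in for the molecular dynamics and a scalar coordinate standing in for the cage-based order parameter λ. Interface placements, the initial flux, and trial counts are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    def propagate(x):                  # toy stochastic dynamics (stands in for MD)
        return x + rng.normal(0.0, 0.05)

    def order_parameter(x):            # stands in for the cage-counting lambda
        return x

    def ffs_rate(lam_a, interfaces, flux0, n_trials=200):
        """rate = flux0 * prod_i P(reach lambda_{i+1} before lambda_A | lambda_i)."""
        configs = [interfaces[0]] * n_trials
        rate = flux0
        for lam_next in interfaces[1:]:
            reached = []
            for x in configs:
                # run until the trajectory crosses the next interface or
                # falls back past lambda_A
                while lam_a < order_parameter(x) < lam_next:
                    x = propagate(x)
                if order_parameter(x) >= lam_next:
                    reached.append(x)
            if not reached:
                return 0.0
            rate *= len(reached) / len(configs)
            # resample starting configurations for the next interface
            configs = [reached[rng.integers(len(reached))] for _ in range(n_trials)]
        return rate

    print(ffs_rate(lam_a=-0.5, interfaces=[0.0, 0.3, 0.6, 1.0], flux0=1.0))
    ```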

  7. Importance of Numeracy as a Risk Factor for Elder Financial Exploitation in a Community Sample.

    PubMed

    Wood, Stacey A; Liu, Pi-Ju; Hanoch, Yaniv; Estevez-Cores, Sara

    2016-11-01

    To examine the role of numeracy, or comfort with numbers, as a potential risk factor for financial elder exploitation in a community sample. Individually administered surveys were given to 201 independent, community-dwelling adults aged 60 and older. Risk for financial elder exploitation was assessed using the Older Adult Financial Exploitation Measure (OAFEM). Other variables of interest included numeracy, executive functioning, and other risk factors identified from the literature. Assessments were completed individually at the Wood Lab at Scripps College in Claremont, CA and neighboring community centers. After controlling for other variables, including education, lower numeracy was related to higher scores on the OAFEM consistent with higher risk for financial exploitation. Self-reported physical and mental health, male gender, and younger age were also related to increased risk. Results indicated that numeracy is a significant risk factor for elder financial exploitation after controlling for other commonly reported variables. These findings are consistent with the broader literature relating numeracy to wealth and debt levels and extend them to the area of elder financial exploitation. © The Author 2015. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  8. An economic passive sampling method to detect particulate pollutants using magnetic measurements.

    PubMed

    Cao, Liwan; Appel, Erwin; Hu, Shouyun; Ma, Mingming

    2015-10-01

    Identifying particulate matter (PM) emitted from industrial processes into the atmosphere is an important issue in environmental research. This paper presents a passive sampling method using simple artificial samplers that maintains the advantage of bio-monitoring, but overcomes some of its disadvantages. The samplers were tested in a heavily polluted area (Linfen, China) and compared to results from leaf samples. Spatial variations of magnetic susceptibility from artificial passive samplers and leaf samples show very similar patterns. Scanning electron microscopy suggests that the collected PM are mostly in the range of 2-25 μm; the frequent occurrence of spherical shapes indicates that industrial combustion dominates PM emission. Magnetic properties around power plants show different features than around other industrial plants. This sampling method provides a suitable and economic tool for semi-quantifying the temporal and spatial distribution of air quality; the samplers can be installed in a regular grid and calibrated against the weight of PM.

  9. The evaluation methods of sampling rate performance in GNSS receiver

    NASA Astrophysics Data System (ADS)

    Ke, Ting; Hu, Xiulin; Liu, Yuqi; Ran, Yihang

    2009-12-01

    This paper investigates the effect of sampling rate on the time discrimination of PRN codes in GNSS, proposes an innovative performance evaluation criterion for the actual time discrimination of the noncommensurate sampling technique, and then develops an algorithm to quickly obtain this criterion. Computer simulations verify the correctness of the proposed fast algorithm. The proposed algorithm can be adopted in all PRN-code-ranging-based applications to choose the "better" sampling rate, achieving better time discrimination performance at a lower sampling rate.

  10. Importance of sample size for the estimation of repeater F waves in amyotrophic lateral sclerosis.

    PubMed

    Fang, Jia; Liu, Ming-Sheng; Guan, Yu-Zhou; Cui, Bo; Cui, Li-Ying

    2015-02-20

    In amyotrophic lateral sclerosis (ALS), repeater F waves are increased. Accurate assessment of repeater F waves requires an adequate sample size. We studied the F waves of left ulnar nerves in ALS patients. Based on the presence or absence of pyramidal signs in the left upper limb, the ALS patients were divided into two groups: one with pyramidal signs (P group) and one without (NP group). The Index repeating neurons (RN) and Index repeater F waves (Freps) were compared among the P, NP and control groups following 20 and 100 stimuli, respectively. For each group, the Index RN and Index Freps obtained from 20 and 100 stimuli were compared. In the P group, the Index RN (P = 0.004) and Index Freps (P = 0.001) obtained from 100 stimuli were significantly higher than from 20 stimuli. For F waves obtained from 20 stimuli, no significant differences were identified between the P and NP groups for Index RN (P = 0.052) and Index Freps (P = 0.079); the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than in the control group; the Index RN (P = 0.002) of the NP group was significantly higher than in the control group. For F waves obtained from 100 stimuli, the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P group were significantly higher than in the NP group, and the Index RN (P < 0.001) and Index Freps (P < 0.001) of the P and NP groups were significantly higher than in the control group. Increased repeater F waves reflect increased excitability of the motor neuron pool and indicate upper motor neuron dysfunction in ALS. For an accurate evaluation of repeater F waves in ALS patients, especially those with moderate to severe muscle atrophy, 100 stimuli would be required.
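
    Assuming each F wave has been assigned an identity label by waveform matching, the two indices compared above can be computed as sketched below. The definitions used here (repeater F waves as all waveforms that recur; repeating neurons as the distinct waveforms that recur) and the labels are assumptions for illustration, not necessarily the authors' exact scoring rules.

    ```python
    from collections import Counter

    def repeater_indices(f_wave_labels):
        counts = Counter(f_wave_labels)
        total = len(f_wave_labels)
        rn = sum(1 for c in counts.values() if c > 1)      # repeating neurons
        freps = sum(c for c in counts.values() if c > 1)   # repeater F waves
        return rn / total, freps / total                   # Index RN, Index Freps

    labels = ["a", "b", "a", "c", "d", "e", "b", "f"]      # toy identity labels
    index_rn, index_freps = repeater_indices(labels)
    print(index_rn, index_freps)
    ```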

  11. Preparation of samples for leaf architecture studies, a method for mounting cleared leaves

    PubMed Central

    Vasco, Alejandra; Thadeo, Marcela; Conover, Margaret; Daly, Douglas C.

    2014-01-01

    • Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. • Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. • Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration. PMID:25225627

  12. Biocatalytic spectrophotometric method to detect paracetamol in water samples.

    PubMed

    Méndez-Albores, Alia; Tarín, Cristina; Rebollar-Pérez, Georgette; Dominguez-Ramirez, Lenin; Torres, Eduardo

    2015-01-01

    A biocatalytic methodology based on the quantification of laccase inhibition during the oxidation of the standard substrate ABTS (2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid)) for the indirect determination of paracetamol in drinking water has been developed. The method displayed a fast response time (20 s) and high selectivity to paracetamol in the presence of interfering substances such as naproxen, estradiol, ketoprofen, sulfamethoxazole, and diclofenac. The limit of detection (LOD) and limit of quantification (LOQ) were found to be 0.55 µM and 8.3 µM, respectively. By comparing the catalytic constant values KM and kcat for ABTS oxidation in the absence and presence of various concentrations of paracetamol, a competitive-type inhibition was revealed. On the other hand, the close values of Ki and KM indicate similar binding affinity of the enzyme to ABTS and paracetamol, corroborated by docking studies. The methodology was successfully applied to real water samples, presenting an interesting potential for further development of a biosensor for paracetamol detection.
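
    Under the competitive model reported above, v = Vmax·S / (KM·(1 + I/Ki) + S), so a measured rate at a known ABTS concentration can be inverted for the paracetamol concentration I. The sketch below performs that inversion; all kinetic constants are invented placeholders.

    ```python
    def inhibitor_conc(v, vmax, km, s, ki):
        """Solve v = vmax*s / (km*(1 + I/ki) + s) for I (competitive model)."""
        return ki * ((vmax * s / v - s) / km - 1.0)

    # Illustrative values only: rates in arbitrary units, concentrations in uM
    print(inhibitor_conc(v=0.8, vmax=2.0, km=40.0, s=100.0, ki=35.0))
    ```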

  13. Evaluation of Environmental Sample Analysis Methods and Results Reporting in the National Children's Study Vanguard Study.

    PubMed

    Heikkinen, Maire S A; Khalaf, Abdisalam; Beard, Barbara; Viet, Susan M; Dellarco, Michael

    2016-05-03

    During the initial Vanguard phase of the U.S. National Children's Study (NCS), about 2000 tap water, surface wipe, and air samples were collected and analyzed immediately. The shipping conditions, analysis methods, results, and laboratory performance were evaluated to determine the best approaches for use in the NCS Main Study. The main conclusions were (1) to employ established sample analysis methods, when possible, and alternate methodologies only after careful consideration with method validation studies; (2) lot control and prescreening sample collection materials are important quality assurance procedures; (3) packing samples correctly requires careful training and adjustment of shipping conditions to local conditions; (4) trip blanks and spiked samples should be considered for samplers with short expiration times and labile analytes; (5) two study-specific results reports should be required: laboratory electronic data deliverables (EDD) of sample results in a useable electronic format (CSV or SEDD XML/CSV) and a data package with sample results and supporting information in PDF format. These experiences and lessons learned can be applied to any long-term study.

  14. Using implicit association tests in age-heterogeneous samples: The importance of cognitive abilities and quad model processes.

    PubMed

    Wrzus, Cornelia; Egloff, Boris; Riediger, Michaela

    2017-08-01

    Implicit association tests (IATs) are increasingly used to indirectly assess people's traits, attitudes, or other characteristics. In addition to measuring traits or attitudes, IAT scores also reflect differences in cognitive abilities because scores are based on reaction times (RTs) and errors. As cognitive abilities change with age, questions arise concerning the usage and interpretation of IATs for people of different age. To address these questions, the current study examined how cognitive abilities and cognitive processes (i.e., quad model parameters) contribute to IAT results in a large age-heterogeneous sample. Participants (N = 549; 51% female) in an age-stratified sample (range = 12-88 years) completed different IATs and 2 tasks to assess cognitive processing speed and verbal ability. From the IAT data, D2-scores were computed based on RTs, and quad process parameters (activation of associations, overcoming bias, detection, guessing) were estimated from individual error rates. IAT scores and all quad processes except guessing varied substantially with age. Quad processes AC and D predicted D2-scores of the content-specific IAT. Importantly, the effects of cognitive abilities and quad processes on IAT scores were not significantly moderated by participants' age. These findings suggest that IATs seem suitable for age-heterogeneous studies from adolescence to old age when IATs are constructed and analyzed appropriately, for example with D-scores and process parameters. We offer further insight into how D-scoring controls for method effects in IATs and what IAT scores capture in addition to implicit representations of characteristics. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
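
    A bare-bones D-score-style computation from reaction times is sketched below: the difference in mean latency between incompatible and compatible blocks divided by the pooled standard deviation of all trials. Trial data are simulated, and the exact D2 scoring rules (error penalties, latency trimming) are not reproduced.

    ```python
    import numpy as np

    def d_score(rt_compatible, rt_incompatible):
        pooled_sd = np.std(np.concatenate([rt_compatible, rt_incompatible]),
                           ddof=1)
        return (np.mean(rt_incompatible) - np.mean(rt_compatible)) / pooled_sd

    rng = np.random.default_rng(5)
    compat = rng.normal(650, 120, size=40)      # latencies in ms, simulated
    incompat = rng.normal(740, 140, size=40)
    print(round(d_score(compat, incompat), 3))
    ```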

  15. High resolution melting curve analysis of DNA samples isolated by different DNA extraction methods.

    PubMed

    Martín-Núñez, Gracia M; Gómez-Zumaquero, Juan M; Soriguer, Federico; Morcillo, Sonsoles

    2012-01-18

    High resolution melting is a post-PCR method for detecting DNA sequence variation by measuring changes in the melting of a DNA duplex. Melting of double-stranded DNA molecules is influenced by several factors. We evaluated the influence of the DNA isolation method on melting curve analysis for detecting genetic variations. We isolated DNA from the whole blood of 547 subjects by two different methods: the Maxwell 16 Instrument and the DNA FlexiGene Kit. A fragment of 159 bp was amplified and analyzed by high resolution melting. Those samples that showed a different melting curve pattern were sequenced. Of the samples extracted with the Maxwell 16 Instrument, 42% showed variation, compared with 0.18% of the samples extracted with the DNA FlexiGene Kit. Sequencing showed that all the variants detected in samples extracted with the Maxwell 16 Instrument were false positives except one, which coincided with the only sample showing variation among those extracted with the DNA FlexiGene Kit. The method used to extract DNA is thus an important factor to consider in high resolution melting curve analysis, as it may influence the melting behaviour of the samples, giving false-positive results in the detection of genetic variants.
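
    As background on how such curves are typically read, a minimal sketch of first-derivative melting analysis, where the melting temperature Tm is taken at the peak of -dF/dT (the curve below is synthetic, not the study's data):

        import numpy as np

        def melting_temperature(temps, fluorescence):
            """Estimate Tm as the peak of -dF/dT, i.e. the temperature at
            which the fluorescence drops most steeply."""
            dF_dT = np.gradient(np.asarray(fluorescence, dtype=float),
                                np.asarray(temps, dtype=float))
            return temps[np.argmin(dF_dT)]   # most negative slope

        # Synthetic melting curve: sigmoidal fluorescence drop near 82 C
        temps = np.linspace(70.0, 95.0, 251)
        f = 1.0 / (1.0 + np.exp((temps - 82.0) / 0.6))
        print(f"Tm ~= {melting_temperature(temps, f):.1f} C")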

  16. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques

    PubMed Central

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J.; Nobukawa, Kazutoshi; Pan, Christopher S.

    2016-01-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved acceleration rate is around 2,000 to 20,000. In other words, in the accelerated simulations, driving for 1,000 miles will expose the AV to challenging scenarios that would take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs. PMID:27840592
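
    The skewing-and-reweighting idea at the heart of this approach can be illustrated with a minimal cross-entropy importance sampling sketch for a generic rare event (here P(X > 4) for a standard Gaussian, a toy stand-in for a crash-level event; none of this is the paper's vehicle model):

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(1)
        gamma, n, rho = 4.0, 10_000, 0.1   # rare threshold, sample size, elite fraction

        # Cross-entropy iterations: shift the sampling mean toward the rare region.
        mu = 0.0
        for _ in range(20):
            x = rng.normal(mu, 1.0, n)
            gamma_t = min(np.quantile(x, 1 - rho), gamma)  # intermediate level
            mu = x[x >= gamma_t].mean()                    # CE update, Gaussian family
            if gamma_t >= gamma:
                break

        # Importance sampling under the skewed density, reweighted by the
        # likelihood ratio so the probability estimate remains unbiased.
        x = rng.normal(mu, 1.0, n)
        w = norm.pdf(x) / norm.pdf(x, loc=mu)
        p_hat = np.mean((x > gamma) * w)
        print(f"skewed mean {mu:.2f}, estimate {p_hat:.2e}, exact {norm.sf(gamma):.2e}")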

  18. An evaluation of long-term preservation methods for brown bear (Ursus arctos) faecal DNA samples

    USGS Publications Warehouse

    Murphy, M.A.; Waits, L.P.; Kendall, K.C.; Wasser, S.K.; Higbee, J.A.; Bogden, R.

    2002-01-01

    Relatively few large-scale faecal DNA studies have been initiated due to difficulties in amplifying low quality and quantity DNA template. To improve brown bear faecal DNA PCR amplification success rates and to determine post-collection sample longevity, five preservation methods were evaluated: 90% ethanol, DETs buffer, silica-dried, oven-dried stored at room temperature, and oven-dried stored at -20°C. Preservation effectiveness was evaluated for 50 faecal samples by PCR amplification of a mitochondrial DNA (mtDNA) locus (~146 bp) and a nuclear DNA (nDNA) locus (~200 bp) at time points of one week, one month, three months and six months. Preservation method and storage time significantly impacted mtDNA and nDNA amplification success rates. For mtDNA, all preservation methods had ≥75% success at one week, but storage time had a significant impact on the effectiveness of the silica preservation method. Ethanol-preserved samples had the highest success rates for both mtDNA (86.5%) and nDNA (84%). Nuclear DNA amplification success rates ranged from 26 to 88%, and storage time had a significant impact on all methods but ethanol. Preservation method and storage time should be important considerations for researchers planning projects utilizing faecal DNA. We recommend preservation of faecal samples in 90% ethanol when feasible, although when collecting in remote field conditions or for both DNA and hormone assays a dry collection method may be advantageous.

  19. Out-of-Sample Extensions for Non-Parametric Kernel Methods.

    PubMed

    Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang

    2017-02-01

    Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies have been devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to exploit the information in the data, which may characterize the data similarity better. Kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension to out-of-sample data points and thus cannot be applied to inductive learning. In this paper, we show how to make nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper-reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over state-of-the-art parametric kernel methods.
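
    A minimal sketch of the general regression idea (fit the learned kernel's columns as functions of a base kernel, then evaluate those fits at new points); this is a simplified stand-in using ridge regression with an RBF base kernel, not the paper's hyper-RKHS estimator:

        import numpy as np

        def rbf(X, Y, gamma=0.5):
            """Base RBF kernel matrix between the rows of X and Y."""
            d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        def fit_extension(X_train, K_learned, lam=1e-3, gamma=0.5):
            """Ridge-regress each column of the learned (nonparametric) kernel
            matrix on the base kernel: K_learned ~ K_base @ A. Rows of the
            learned kernel at new points are then k_base(x_new, X_train) @ A."""
            K_base = rbf(X_train, X_train, gamma)
            A = np.linalg.solve(K_base + lam * np.eye(len(X_train)), K_learned)
            return lambda X_new: rbf(X_new, X_train, gamma) @ A

        # Hypothetical data; an RBF matrix stands in for a learned kernel matrix
        rng = np.random.default_rng(2)
        X = rng.normal(size=(50, 3))
        extend = fit_extension(X, rbf(X, X, gamma=0.2))
        print(extend(rng.normal(size=(5, 3))).shape)   # (5, 50)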

  20. Nasal swab samples and real-time polymerase chain reaction assays in community-based, longitudinal studies of respiratory viruses: the importance of sample integrity and quality control

    PubMed Central

    2014-01-01

    Background Carefully conducted, community-based, longitudinal studies are required to gain further understanding of the nature and timing of respiratory viruses causing infections in the population. However, such studies pose unique challenges for field specimen collection, including, as we have observed, the appearance of mould in some nasal swab specimens. We therefore investigated the impact of sample collection quality and of the presence of visible mould in samples upon respiratory virus detection by real-time polymerase chain reaction (PCR) assays. Methods Anterior nasal swab samples were collected from infants participating in an ongoing community-based, longitudinal, dynamic birth cohort study. The samples were first collected from each infant shortly after birth and weekly thereafter. They were then mailed to the laboratory, where they were catalogued, stored at -80°C and later screened by PCR for 17 respiratory viruses. The quality of specimen collection was assessed by screening for human deoxyribonucleic acid (DNA) using endogenous retrovirus 3 (ERV3). The impact of ERV3 load upon respiratory virus detection, and the impact of visible mould (observed in a subset of swabs reaching the laboratory) upon both ERV3 loads and respiratory virus detection, were determined. Results In total, 4933 nasal swabs were received in the laboratory. ERV3 load in nasal swabs was associated with respiratory virus detection. Reduced respiratory virus detection (odds ratio 0.35; 95% confidence interval 0.27-0.44) was observed in samples where ERV3 could not be identified. Mould was associated with increased transit time of samples to the laboratory and with reduced ERV3 loads and respiratory virus detection. Conclusion Suboptimal sample collection and high levels of visible mould can impact negatively upon sample quality. Quality control measures, including monitoring human DNA loads using ERV3 as a marker for epithelial cell components in samples, should be undertaken to optimize the
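
    For readers who want to reproduce the style of association statistic quoted above, a minimal sketch of an unadjusted odds ratio with a Wald 95% confidence interval from a 2x2 table (the counts below are hypothetical; the study itself reports model-based estimates):

        import numpy as np

        def odds_ratio_ci(a, b, c, d, z=1.96):
            """Odds ratio and Wald 95% CI for a 2x2 table:
            rows = exposed/unexposed, columns = outcome+/outcome-."""
            or_ = (a * d) / (b * c)
            se_log = np.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
            lo, hi = np.exp(np.log(or_) + np.array([-z, z]) * se_log)
            return or_, lo, hi

        # Hypothetical counts: virus detected vs not, by ERV3-negative vs ERV3-positive
        or_, lo, hi = odds_ratio_ci(a=50, b=450, c=600, d=1900)
        print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")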

  1. Sexual violence and HIV risk behaviors among a nationally representative sample of heterosexual American women: The importance of sexual coercion

    PubMed Central

    Stockman, Jamila K; Campbell, Jacquelyn C; Celentano, David D

    2009-01-01

    Objectives Recent evidence suggests that it is important to consider behaviorally specific sexual violence measures in assessing women’s risk behaviors. This study investigated associations of history and types of sexual coercion with HIV risk behaviors in a nationally representative sample of heterosexually active American women. Methods Analyses were based on 5,857 women aged 18–44 participating in the 2002 National Survey of Family Growth. Types of lifetime sexual coercion included: victim given alcohol or drugs, verbally pressured, threatened with physical injury, and physically injured. Associations with HIV risk behaviors were assessed using logistic regression. Results Of 5,857 heterosexually active women, 16.4% reported multiple sex partners and 15.3% reported substance abuse. A coerced first sexual intercourse experience and coerced sex after sexual debut were independently associated with multiple sex partners and substance abuse; the highest risk was observed for women reporting a coerced first sexual intercourse experience. Among types of sexual coercion, alcohol or drug use at coerced sex was independently associated with multiple sex partners and substance abuse. Conclusions Our findings suggest that public health strategies are needed to address the violent components of heterosexual relationships. Future research should use longitudinal and qualitative methods to characterize the relationship between the continuum of sexual coercion and HIV risk. PMID:19734802

  2. Sampling methods for assessing syrphid biodiversity (Diptera: Syrphidae) in tropical forests.

    PubMed

    Marcos-García, M A; García-López, A; Zumbado, M A; Rotheray, G E

    2012-12-01

    When assessing the species richness of a taxonomic group in a specific area, the choice of sampling method is critical. In this study, the effectiveness of three methods for sampling syrphids (Diptera: Syrphidae) in tropical forests is compared: Malaise trapping, collecting adults with an entomological net, and collecting and rearing immatures. Surveys were made from 2008 to 2011 in six tropical forest sites in Costa Rica. The results revealed significant differences in the composition and richness of the syrphid faunas obtained by each method. Collecting immatures was the most successful method based on numbers of species and individuals, whereas Malaise trapping was the least effective. This pattern of sampling effectiveness was independent of syrphid trophic or functional group and annual season. An advantage of collecting immatures over collecting adults is the quality and quantity of associated biological data obtained by the former method. However, the complementarity between the results of collecting adults and collecting immatures showed that a combined sampling regime obtained the most complete inventory. Differences between these results and those of similar studies in more open Mediterranean habitats suggest that, for an effective inventory, it is important to consider the effects of environmental characteristics on the catchability of syrphids as much as the costs and benefits of different sampling techniques.

  3. Photoacoustic spectroscopy sample array vessels and photoacoustic spectroscopy methods for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.

    2006-02-14

    Methods and apparatus for simultaneous or sequential, rapid analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically positioned near the sample cells. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  4. Statistical methods for detecting differentially abundant features in clinical metagenomic samples.

    PubMed

    White, James Robert; Nagarajan, Niranjan; Pop, Mihai

    2009-04-01

    Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries is computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g. as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets, including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. While designed for metagenomic applications, our software can also be applied
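
    A minimal sketch of the two ingredients named above, Fisher's exact test for sparse features and false-discovery-rate control (here Benjamini-Hochberg, a common choice; Metastats' exact procedure differs in detail), on hypothetical per-group counts:

        import numpy as np
        from scipy.stats import fisher_exact

        def bh_fdr(pvals):
            """Benjamini-Hochberg adjusted p-values (q-values)."""
            p = np.asarray(pvals, dtype=float)
            order = np.argsort(p)
            scaled = p[order] * len(p) / (np.arange(len(p)) + 1)
            q = np.minimum.accumulate(scaled[::-1])[::-1]   # enforce monotonicity
            out = np.empty_like(q)
            out[order] = np.clip(q, 0.0, 1.0)
            return out

        # Hypothetical sparse features: (hits, total reads) in each of two groups
        features = [((5, 1000), (20, 1000)),
                    ((0, 1000), (3, 1000)),
                    ((12, 1000), (11, 1000))]
        pvals = [fisher_exact([[a, na - a], [b, nb - b]])[1]
                 for (a, na), (b, nb) in features]
        print(np.round(bh_fdr(pvals), 3))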

  5. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    PubMed

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a largely automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and the Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the
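
    The normalization step described above is, at its core, a per-sample dilution calculation from the quantitation data. A minimal sketch of that arithmetic (hypothetical target amounts and pipetting limits, not the Normalization Wizard's actual rules):

        def normalization_volumes(conc_ng_per_ul, target_ng, final_ul, min_pipette_ul=1.0):
            """Volumes of DNA extract and diluent that deliver target_ng of
            template in final_ul, given the sample's measured concentration.
            (Very concentrated samples would need an intermediate dilution,
            which is omitted here.)"""
            if conc_ng_per_ul <= target_ng / final_ul:
                return final_ul, 0.0               # too dilute: use neat extract
            sample_ul = max(target_ng / conc_ng_per_ul, min_pipette_ul)
            return sample_ul, final_ul - sample_ul

        # Hypothetical casework quantitation results (ng/uL)
        for conc in (0.02, 0.5, 4.0):
            s, d = normalization_volumes(conc, target_ng=1.0, final_ul=10.0)
            print(f"{conc:4.2f} ng/uL -> {s:.2f} uL sample + {d:.2f} uL diluent")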

  6. A comparison of four gravimetric fine particle sampling methods.

    PubMed

    Yanosky, J D; MacIntosh, D L

    2001-06-01

    A study was conducted to compare four gravimetric methods of measuring fine particle (PM2.5) concentrations in air: the BGI, Inc. PQ200 Federal Reference Method PM2.5 (FRM) sampler; the Harvard-Marple Impactor (HI); the BGI, Inc. GK2.05 KTL Respirable/Thoracic Cyclone (KTL); and the AirMetrics MiniVol (MiniVol). Pairs of FRM, HI, and KTL samplers and one MiniVol sampler were collocated, and 24-hr integrated PM2.5 samples were collected on 21 days from January 6 through April 9, 2000. The mean and standard deviation of PM2.5 levels from the FRM samplers were 13.6 and 6.8 µg/m³, respectively. Significant systematic bias was found between mean concentrations from the FRM and the MiniVol (1.14 µg/m³, p = 0.0007), the HI and the MiniVol (0.85 µg/m³, p = 0.0048), and the KTL and the MiniVol (1.23 µg/m³, p = 0.0078) according to paired t-test analyses. Linear regression on all pairwise combinations of the sampler types was used to evaluate measurements made by the samplers. None of the regression intercepts was significantly different from 0, and only two of the regression slopes were significantly different from 1: that for the FRM and the MiniVol [β1 = 0.91, 95% CI (0.83-0.99)] and that for the KTL and the MiniVol [β1 = 0.88, 95% CI (0.78-0.98)]. Regression R² terms were 0.96 or greater between all pairs of samplers, and regression root mean square error (RMSE) terms were 1.65 µg/m³ or less. These results suggest that the MiniVol will underestimate measurements made by the FRM, the HI, and the KTL by an amount proportional to the PM2.5 concentration. Nonetheless, these results indicate that all of the sampler types are comparable if approximately 10% variation on the mean levels and on individual measurement levels is considered acceptable and the actual concentration is within the range of this study (5-35 µg/m³).
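
    The two checks used above (a paired t-test for systematic bias, and a regression of one sampler on another for slope near 1 and intercept near 0) are easy to reproduce; a minimal sketch on synthetic collocated data mimicking the study design (21 paired 24-hr samples), not the study's measurements:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        ref = rng.uniform(5, 35, 21)                   # reference sampler, ug/m3
        test = 0.91 * ref + rng.normal(0, 1.2, 21)     # collocated test sampler

        t, p = stats.ttest_rel(ref, test)              # paired t-test for bias
        slope, intercept, r, p_slope, se = stats.linregress(ref, test)
        print(f"mean bias {np.mean(ref - test):.2f} ug/m3 (p = {p:.3f}); "
              f"slope {slope:.2f}, intercept {intercept:.2f}, R^2 {r ** 2:.2f}")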

  7. Evaluation of sample preservation methods for space mission

    NASA Technical Reports Server (NTRS)

    Schubert, W.; Rohatgi, N.; Kazarians, G.

    2002-01-01

    For interplanetary spacecraft that will travel to destinations where future life detection experiments may be conducted, or from which samples are to be returned to Earth, we should archive and preserve relevant samples from the spacecraft and cleanrooms for evaluation at a future date.

  8. An automated method of sample preparation of biofluids using pierceable caps to eliminate the uncapping of the sample tubes during sample transfer.

    PubMed

    Teitz, D S; Khan, S; Powell, M L; Jemal, M

    2000-09-11

    Biological samples are normally collected and stored frozen in capped tubes until analysis. To obtain aliquots of biological samples for analysis, the sample tubes have to be thawed, uncapped, samples removed and then recapped for further storage. In this paper, we report an automated method of sample transfer devised to eliminate the uncapping and recapping process. This sampling method was incorporated into an automated liquid-liquid extraction procedure of plasma samples. Using a robotic system, the plasma samples were transferred directly from pierceable capped tubes into microtubes contained in a 96-position block. The aliquoted samples were extracted with methyl-tert-butyl ether in the same microtubes. The supernatant organic layers were transferred to a 96-well collection plate and evaporated to dryness. The dried extracts were reconstituted and injected from the same plate for analysis by liquid chromatography with tandem mass spectrometry.

  9. Acoustically levitated droplets: a contactless sampling method for fluorescence studies.

    PubMed

    Leiterer, Jork; Grabolle, Markus; Rurack, Knut; Resch-Genger, Ute; Ziegler, Jan; Nann, Thomas; Panne, Ulrich

    2008-01-01

    Acoustic levitation is used as a new tool to study concentration-dependent processes in fluorescence spectroscopy. With this technique, small amounts of liquid and solid samples can be measured without the need for sample supports or containers, which often limits signal acquisition and can even alter sample properties due to interactions with the support material. We demonstrate that, because of the small sample volume, fluorescence measurements at high concentrations of an organic dye are possible without the limitation of inner-filter effects, which hamper such experiments in conventional, cuvette-based measurements. Furthermore, we show that acoustic levitation of liquid samples provides an experimentally simple way to study distance-dependent fluorescence modulations in semiconductor nanocrystals. The evaporation of the solvent during levitation leads to a continuous increase of solute concentration and can easily be monitored by laser-induced fluorescence.
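
    For context on the cuvette limitation mentioned above: in conventional right-angle fluorometry the observed signal is often corrected with the textbook absorbance-based inner-filter approximation; a minimal sketch of that correction (not part of this paper's levitation method, and valid only for moderate absorbances in a 1 cm cuvette):

        def inner_filter_correction(f_obs, a_ex, a_em):
            """Primary/secondary inner-filter correction for a 1 cm cuvette:
            F_corr = F_obs * 10 ** ((A_ex + A_em) / 2)."""
            return f_obs * 10 ** ((a_ex + a_em) / 2)

        # Hypothetical reading: 1000 counts at excitation/emission absorbances 0.3/0.1
        print(f"{inner_filter_correction(1000.0, 0.3, 0.1):.0f} corrected counts")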

  10. Validated Test Method 5030C: Purge-and-Trap for Aqueous Samples

    EPA Pesticide Factsheets

    This method describes a purge-and-trap procedure for the analysis of volatile organic compounds in aqueous samples and water-miscible liquid samples. It also describes the analysis of high-concentration soil and waste sample extracts prepared in Method 5035.

  11. Alternative methods for determining the electrical conductivity of core samples.

    PubMed

    Lytle, R J; Duba, A G; Willows, J L

    1979-05-01

    Electrode configurations are described that can be used in measuring the electrical conductivity of a core sample and that do not require access to the core end faces. The use of these configurations eliminates the need for machining the core ends for placement of end electrodes, because the conductivity in the cases described is relatively insensitive to the length of the sample. We validated the measurement technique by comparing mathematical models with actual measurements made perpendicular and parallel to the core axis of granite samples.

  12. A Review of Biological Agent Sampling Methods and ...

    EPA Pesticide Factsheets

    Report: This study was conducted to evaluate current sampling and analytical capabilities, from a time and resource perspective, for a large-scale biological contamination incident. The analysis will be useful for strategically directing future research investment.

  13. A method for time-resolved calorespirometry of terrestrial samples.

    PubMed

    Wadsö, Lars

    2015-04-01

    A new vessel for simultaneous isothermal calorimetry and respirometry (calorespirometry) on terrestrial (non-aqueous) samples has been developed. All types of small (<1 g) biological samples (insects, soil, leaves, fungi, etc.) can be studied. The respirometric measurements are made by opening and closing a valve to a vial, containing a carbon dioxide absorbent, inside the sample ampoule. Typically, a 7 h measurement yields seven measurements of heat production rate, oxygen consumption and carbon dioxide production, which can be used to evaluate how the metabolic activity in a sample changes over time. Results from three experiments, on leaves, a cut vegetable, and mold, are given. As uncertainties, especially in the carbon dioxide production, tend to be quite high, improvements to the technique are also discussed.
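
    The quantities listed above are usually combined into two diagnostic ratios; a minimal sketch follows (hypothetical instrument readings; the ~455 kJ/mol O2 oxycaloric reference value for fully aerobic metabolism is a literature rule of thumb, not a result of this paper):

        def calorespirometric_ratios(heat_uW, o2_nmol_per_s, co2_nmol_per_s):
            """Heat/O2 ratio in kJ per mol O2 consumed, and the respiratory
            quotient RQ = CO2 produced / O2 consumed."""
            kj_per_mol_o2 = (heat_uW * 1e-6) / (o2_nmol_per_s * 1e-9) / 1e3
            return kj_per_mol_o2, co2_nmol_per_s / o2_nmol_per_s

        # Hypothetical readings: 50 uW heat, 0.11 nmol/s O2, 0.10 nmol/s CO2
        q, rq = calorespirometric_ratios(50.0, 0.11, 0.10)
        print(f"{q:.0f} kJ/mol O2 (aerobic reference ~455), RQ = {rq:.2f}")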

  14. Field sampling method for quantifying odorants in humid environments.

    PubMed

    Trabue, Steven L; Scoggin, Kenwood D; Li, Hong; Burns, Robert; Xin, Hongwei

    2008-05-15

    Most air quality studies in agricultural environments use thermal desorption analysis for quantifying semivolatile organic compounds (SVOCs) associated with odor. The objective of this study was to develop a robust sampling technique for measuring SVOCs in humid environments. Test atmospheres were generated at ambient temperature (23 ± 1.5 °C) and 25, 50, and 80% relative humidity (RH). Sorbent materials used included Tenax, graphitized carbon, and carbon molecular sieve (CMS). Sorbent tubes were challenged with 2, 4, 8, 12, and 24 L of air at the various RHs. Sorbent tubes with CMS material performed poorly at both 50 and 80% RH due to excessive sorption of water. Heating of CMS tubes during sampling or dry-purging of CMS tubes after sampling effectively reduced water sorption, with heating of tubes being preferred due to its higher recovery and reproducibility. Tenax tubes showed breakthrough of the more volatile compounds and tended to form artifacts with increasing volumes of air sampled. Graphitized carbon sorbent tubes containing Carbopack X and Carbopack C performed best, with quantitative recovery of all compounds at all RHs and sampling volumes tested. The graphitized carbon tubes were taken to the field for further testing. Field samples taken from inside swine feeding operations showed that butanoic acid, 4-methylphenol, 4-ethylphenol, indole, and 3-methylindole were the compounds most often detected above their odor threshold values. Field samples taken from a poultry facility demonstrated that butanoic acid, 3-methylbutanoic acid, and 4-methylphenol were the compounds most often detected above their odor threshold values. Keywords: relative humidity, CAFO, VOC, SVOC, thermal desorption, swine, poultry, air quality, odor.

  15. Comparison of speed-vacuum method and heat-drying method to measure brain water content of small brain samples.

    PubMed

    Sebastiani, Anne; Hirnet, Tobias; Jahn-Eimermacher, Antje; Thal, Serge C

    2017-01-30

    A reliable measurement of brain water content (wet-to-dry ratio) is an important prerequisite for research on the mechanisms of brain edema formation. The conventionally used oven-drying method suffers from several limitations, especially in small samples. A technically demanding and time-consuming alternative is freeze-drying. Centrifugal vacuum concentrators (e.g., SpeedVac; speed-vacuum drying) combine vacuum-drying and centrifugation, which is used to reduce the boiling temperature. These concentrators have the key advantages of improving the freeze-drying speed and maintaining the integrity of dried samples, thus allowing, for example, DNA analyses. In the present study, we compared the heat-oven and speed-vacuum techniques with regard to their efficacy in removing moisture from water and brain samples, and their effectiveness in distinguishing treatment paradigms after experimental traumatic brain injury (TBI) caused by controlled cortical impact (CCI). Both techniques effectively removed water, oven drying taking 24 h and vacuum drying 48 h. Vacuum drying showed lower variation in small samples (30-45 mg) and was suitable for genomic analysis, as exemplified by sex genotyping. The effect of sodium bicarbonate (NaBic 8.4%) on brain edema formation after CCI was investigated in small samples (2 × 1 mm). Only vacuum drying showed low variation and a significant improvement under NaBic 8.4% treatment. Receiver operating characteristic (ROC) analysis demonstrated that vacuum drying (area under the curve (AUC): 0.867-0.967) was superior to the conventional heat-drying method (AUC: 0.367-0.567). The vacuum method is thus superior for quantifying water content in small samples. In addition, vacuum-dried samples can be used for subsequent analyses, e.g., PCR analysis.
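
    The underlying wet-to-dry arithmetic is the same for either drying method; a minimal sketch with hypothetical sample masses:

        def brain_water_content(wet_mg, dry_mg):
            """Percent water from wet and dry masses: (wet - dry) / wet * 100."""
            return (wet_mg - dry_mg) / wet_mg * 100.0

        # Hypothetical 30-45 mg samples weighed before and after drying
        for wet, dry in [(32.0, 7.0), (41.5, 9.4)]:
            print(f"{brain_water_content(wet, dry):.1f}% water")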

  16. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    EPA Science Inventory

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods w...

  18. Surface Sampling Collection and Culture Methods for Escherichia coli in Household Environments with High Fecal Contamination.

    PubMed

    Exum, Natalie G; Kosek, Margaret N; Davis, Meghan F; Schwab, Kellogg J

    2017-08-22

    Empiric quantification of environmental fecal contamination is an important step toward understanding the impact that water, sanitation, and hygiene interventions have on reducing enteric infections. There is a need to standardize the methods used for surface sampling in field studies that examine fecal contamination in low-income settings. The dry cloth method presented in this manuscript improves upon the more commonly used swabbing technique, which has been shown in the literature to have a low sampling efficiency. The recovery efficiency of a dry electrostatic cloth sampling method was evaluated using Escherichia coli and then applied to household surfaces in Iquitos, Peru, where there is high fecal contamination and enteric infection. Side-by-side measurements were taken from various floor locations within a household at the same time over a three-month period to compare the consistency of quantification of E. coli bacteria. In the laboratory setting, the dry cloth sampling method showed 105% (95% confidence interval: 98%, 113%) E. coli recovery efficiency off of the cloths. The field application demonstrated strong agreement of side-by-side results (Pearson correlation coefficient 0.83 (p < 0.0001) for dirt surfaces and 0.91 (p < 0.0001) for cement surfaces) and moderate agreement between entrance and kitchen samples (Pearson correlation 0.53, p < 0.0001; weighted kappa 0.54, p < 0.0001). Our findings suggest that this method can be utilized in households with high bacterial loads using either continuous (quantitative) or categorical (semi-quantitative) data. The standardization of this low-cost, dry electrostatic cloth sampling method can be used to measure differences between households in intervention and non-intervention arms of randomized trials.
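
    A minimal sketch of the two summary statistics reported above, percent recovery from the laboratory validation and Pearson agreement of side-by-side field measurements (all numbers hypothetical):

        import numpy as np
        from scipy.stats import pearsonr

        def recovery_efficiency(recovered_cfu, inoculated_cfu):
            """Percent recovery: CFU recovered from the cloth relative to
            CFU inoculated onto the test surface."""
            return 100.0 * recovered_cfu / inoculated_cfu

        # Hypothetical side-by-side field measurements (log10 CFU per area)
        rng = np.random.default_rng(4)
        left = rng.uniform(2, 6, 30)
        right = left + rng.normal(0, 0.5, 30)
        r, p = pearsonr(left, right)
        print(f"recovery {recovery_efficiency(420, 400):.0f}%; "
              f"side-by-side r = {r:.2f} (p = {p:.1e})")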

  20. Method for preconcentrating a sample for subsequent analysis

    DOEpatents

    Zaromb, Solomon

    1990-01-01

    A system for analysis of trace concentrations of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample has been collected, the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.