Science.gov

Sample records for importance sampling method

  1. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
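
    The core idea above, drawing more sample points where the process rate matters and reweighting so the estimate stays unbiased, can be sketched in a few lines. This is an illustrative toy (a 1-D Gaussian integrand and a hand-picked proposal, not the SILHS code; all names are invented):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a grid-box average: E_p[f(X)] with X ~ N(0, 1), where the
# "process rate" f is concentrated in a rarely sampled region.
def f(x):
    return np.exp(-(x - 3.0) ** 2)

def p_pdf(x):                      # model density of subgrid variability
    return np.exp(-x ** 2 / 2) / np.sqrt(2 * np.pi)

def q_pdf(x):                      # proposal centered where f is large
    return np.exp(-(x - 3.0) ** 2 / 2) / np.sqrt(2 * np.pi)

n = 10_000

# Plain Monte Carlo: sample directly from p.
x_mc = rng.standard_normal(n)
est_mc = f(x_mc).mean()

# Importance sampling: draw from q, reweight each point by p(x) / q(x).
x_is = rng.normal(3.0, 1.0, n)
w = p_pdf(x_is) / q_pdf(x_is)
est_is = (w * f(x_is)).mean()

# Exact value for this toy integrand: exp(-3) / sqrt(3).
```

Both estimators are unbiased; the importance-sampled one concentrates evaluations where f is non-negligible, which is the same reason SILHS draws extra points from, e.g., the rain-evaporation region.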

  2. A flexible importance sampling method for integrating subgrid processes

    NASA Astrophysics Data System (ADS)

    Raut, E. K.; Larson, V. E.

    2016-01-01

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). The resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.

  3. Importance sampling variance reduction for the Fokker-Planck rarefied gas particle method

    NASA Astrophysics Data System (ADS)

    Collyer, B. S.; Connaughton, C.; Lockerby, D. A.

    2016-11-01

    The Fokker-Planck approximation to the Boltzmann equation, solved numerically by stochastic particle schemes, is used to provide estimates for rarefied gas flows. This paper presents a variance reduction technique for a stochastic particle method that is able to greatly reduce the uncertainty of the estimated flow fields when the characteristic speed of the flow is small in comparison to the thermal velocity of the gas. The method relies on importance sampling, requiring minimal changes to the basic stochastic particle scheme. We test the importance sampling scheme on a homogeneous relaxation, planar Couette flow and a lid-driven-cavity flow, and find that our method is able to greatly reduce the noise of estimated quantities. Significantly, we find that as the characteristic speed of the flow decreases, the variance of the noisy estimators becomes independent of the characteristic speed.

  4. Improved algorithms and coupled neutron-photon transport for auto-importance sampling method

    NASA Astrophysics Data System (ADS)

    Wang, Xin; Li, Jun-Li; Wu, Zhen; Qiu, Rui; Li, Chun-Yan; Liang, Man-Chun; Zhang, Hui; Gang, Zhi; Xu, Hong

    2017-01-01

    The Auto-Importance Sampling (AIS) method is a Monte Carlo variance reduction technique proposed for deep penetration problems, which can significantly improve computational efficiency without pre-calculation of an importance distribution. However, the AIS method has only been validated on several simple examples, and cannot be used for coupled neutron-photon transport. This paper presents improved algorithms for the AIS method, including particle transport, fictitious particle creation and adjustment, fictitious surface geometry, random number allocation and calculation of the estimated relative error. These improvements allow the AIS method to be applied to complicated deep penetration problems with complex geometry and multiple materials. A Completely coupled Neutron-Photon Auto-Importance Sampling (CNP-AIS) method is proposed to solve deep penetration problems of coupled neutron-photon transport using the improved algorithms. The NUREG/CR-6115 PWR benchmark was calculated using CNP-AIS, geometry splitting with Russian roulette, and analog Monte Carlo, respectively. The results of CNP-AIS are in good agreement with those of geometry splitting with Russian roulette and with the benchmark solutions. The computational efficiency of CNP-AIS for both neutrons and photons is much better than that of geometry splitting with Russian roulette in most cases, and several orders of magnitude higher than that of analog Monte Carlo. Supported by the National Science and Technology Major Project of China (2013ZX06002001-007, 2011ZX06004-007) and the National Natural Science Foundation of China (11275110, 11375103).

  5. Fast computation of diffuse reflectance in optical coherence tomography using an importance sampling-based Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Lima, Ivan T., Jr.; Kalra, Anshul; Hernández-Figueroa, Hugo E.; Sherif, Sherif S.

    2012-03-01

    Computer simulations of light transport in multi-layered turbid media are an effective way to theoretically investigate light transport in tissue, which can be applied to the analysis, design and optimization of optical coherence tomography (OCT) systems. We present a computationally efficient method to calculate the diffuse reflectance due to ballistic and quasi-ballistic components of photons scattered in turbid media, which represents the signal in optical coherence tomography systems. Our importance sampling based Monte Carlo method enables the calculation of the OCT signal with less than one hundredth of the computational time required by the conventional Monte Carlo method. It also does not produce a systematic bias in the statistical result that is typically observed in existing methods to speed up Monte Carlo simulations of light transport in tissue. This method can be used to assess and optimize the performance of existing OCT systems, and it can also be used to design novel OCT systems.

  6. Adaptive Importance Sampling for Control and Inference

    NASA Astrophysics Data System (ADS)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within the PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows one to learn feedback controllers using an arbitrary parametrisation. We refer to this method as the path integral cross entropy method or PICE. We illustrate this method for some simple examples. The PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
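
    The cross-entropy idea the abstract builds on can be sketched generically: sample candidate parameters, keep an elite fraction, and refit the sampling distribution to the elites. The quadratic cost and all names below are invented for illustration; this is not the PICE algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical control cost over a 2-D open-loop parameter vector.
target = np.array([1.0, -2.0])

def cost(U):
    return ((U - target) ** 2).sum(axis=1)

mu = np.zeros(2)            # mean of the Gaussian sampling distribution
sigma = np.full(2, 2.0)     # per-dimension standard deviation

for _ in range(40):
    U = mu + sigma * rng.standard_normal((500, 2))   # sample candidates
    elite = U[np.argsort(cost(U))[:50]]              # keep the best 10%
    mu = elite.mean(axis=0)                          # refit the sampler
    sigma = elite.std(axis=0) + 1e-3                 # floor avoids collapse
```

Each refit makes the sampler a better importance distribution for low-cost regions, which is the sense in which cross-entropy updates "learn" a sampler.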

  7. The Importance of Microhabitat for Biodiversity Sampling

    PubMed Central

    Mehrabi, Zia; Slade, Eleanor M.; Solis, Angel; Mann, Darren J.

    2014-01-01

    Responses to microhabitat are often neglected when ecologists sample animal indicator groups. Microhabitats may be particularly influential in non-passive biodiversity sampling methods, such as baited traps or light traps, and for certain taxonomic groups which respond to fine scale environmental variation, such as insects. Here we test the effects of microhabitat on measures of species diversity, guild structure and biomass of dung beetles, a widely used ecological indicator taxon. We demonstrate that choice of trap placement influences dung beetle functional guild structure and species diversity. We found that locally measured environmental variables were unable to fully explain trap-based differences in species diversity metrics or microhabitat specialism of functional guilds. To compare the effects of habitat degradation on biodiversity across multiple sites, sampling protocols must be standardized and scale-relevant. Our work highlights the importance of considering microhabitat scale responses of indicator taxa and designing robust sampling protocols which account for variation in microhabitats during trap placement. We suggest that this can be achieved either through standardization of microhabitat or through better efforts to record relevant environmental variables that can be incorporated into analyses to account for microhabitat effects. This is especially important when rapidly assessing the consequences of human activity on biodiversity loss and associated ecosystem function and services. PMID:25469770

  8. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively.

  9. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    NASA Astrophysics Data System (ADS)

    Cao, Youfang; Liang, Jie

    2013-07-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively.

  10. Annealed Importance Sampling for Neural Mass Models

    PubMed Central

    Penny, Will; Sengupta, Biswa

    2016-01-01

    Neural Mass Models provide a compact description of the dynamical activity of cell populations in neocortical regions. Moreover, models of regional activity can be connected together into networks, and inferences made about the strength of connections, using M/EEG data and Bayesian inference. To date, however, Bayesian methods have been largely restricted to the Variational Laplace (VL) algorithm which assumes that the posterior distribution is Gaussian and finds model parameters that are only locally optimal. This paper explores the use of Annealed Importance Sampling (AIS) to address these restrictions. We implement AIS using proposals derived from Langevin Monte Carlo (LMC) which uses local gradient and curvature information for efficient exploration of parameter space. In terms of the estimation of Bayes factors, VL and AIS agree about which model is best but report different degrees of belief. Additionally, AIS finds better model parameters and we find evidence of non-Gaussianity in their posterior distribution. PMID:26942606
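
    A minimal sketch of generic Annealed Importance Sampling, estimating the normalizing constant of a 1-D unnormalized density, may help fix ideas. This is not the neural-mass implementation; the target, schedule, and Metropolis kernel are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Unnormalized target: Gaussian with mean 2, std 0.5; true Z = 0.5*sqrt(2*pi).
def log_f1(x):
    return -(x - 2.0) ** 2 / (2 * 0.5 ** 2)

# Normalized base density: standard normal.
def log_f0(x):
    return -x ** 2 / 2 - 0.5 * np.log(2 * np.pi)

betas = np.linspace(0.0, 1.0, 50)   # annealing schedule
n_chains = 2000

x = rng.standard_normal(n_chains)   # exact samples from the base
log_w = np.zeros(n_chains)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # Weight update uses the state *before* the transition at level b.
    log_w += (b - b_prev) * (log_f1(x) - log_f0(x))

    # One Metropolis step targeting the bridge f0^(1-b) * f1^b.
    def log_p(z):
        return (1 - b) * log_f0(z) + b * log_f1(z)

    prop = x + 0.5 * rng.standard_normal(n_chains)
    accept = np.log(rng.random(n_chains)) < log_p(prop) - log_p(x)
    x = np.where(accept, prop, x)

# Unbiased estimate of the normalizing constant of f1.
Z_est = np.exp(log_w).mean()
```

The estimate is unbiased regardless of how well the Metropolis kernel mixes; poor mixing only inflates the variance, which is why AIS can report Bayes factors where a Gaussian (VL-style) approximation would be forced to guess.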

  11. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2013-04-16

    The present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit.

  12. State estimation in large-scale open channel networks using sequential Monte Carlo methods: Optimal sampling importance resampling and implicit particle filters

    NASA Astrophysics Data System (ADS)

    Rafiee, Mohammad; Barrau, Axel; Bayen, Alexandre M.

    2013-06-01

    This article investigates the performance of Monte Carlo-based estimation methods for estimation of flow state in large-scale open channel networks. After constructing a state space model of the flow based on the Saint-Venant equations, we implement the optimal sampling importance resampling filter to perform state estimation in a case in which measurements are available at every time step. Considering a case in which measurements become available intermittently, a random-map implementation of the implicit particle filter is applied to estimate the state trajectory in the interval between the measurements. Finally, some heuristics are proposed, which are shown to improve the estimation results and lower the computational cost. In the first heuristic, considering the case in which measurements are available at every time step, we apply the implicit particle filter over time intervals of a desired size while incorporating all the available measurements over the corresponding time interval. As a second heuristic method, we introduce a maximum a posteriori (MAP) method, which does not require sampling. It will be seen, through implementation, that the MAP method provides more accurate results in the case of our application while having a smaller computational cost. All estimation methods are tested on a network of 19 tidally forced subchannels and 1 reservoir, Clifton Court Forebay, in the Sacramento-San Joaquin Delta in California, and numerical results are presented.
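
    The sampling importance resampling step at the heart of such filters can be sketched on a toy scalar state-space model (invented here for illustration; the article's model is built from the Saint-Venant equations):

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy model: x_t = 0.95 x_{t-1} + w_t,  y_t = x_t + v_t.
T, n_part = 100, 1000
q_std, r_std = 0.5, 0.5

x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.95 * x_true[t - 1] + q_std * rng.standard_normal()
y = x_true + r_std * rng.standard_normal(T)

particles = rng.standard_normal(n_part)
est = np.zeros(T)
for t in range(T):
    # Predict: propagate particles through the dynamics (bootstrap proposal).
    particles = 0.95 * particles + q_std * rng.standard_normal(n_part)
    # Update: importance weights from the observation likelihood.
    w = np.exp(-(y[t] - particles) ** 2 / (2 * r_std ** 2))
    w /= w.sum()
    est[t] = np.dot(w, particles)
    # Resample to combat weight degeneracy (multinomial resampling).
    particles = particles[rng.choice(n_part, size=n_part, p=w)]
```

The weight-then-resample cycle is the "sampling importance resampling" of the title; optimal-proposal and implicit-particle variants change how `particles` are proposed, not this overall structure.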

  13. Sampling system and method

    DOEpatents

    Decker, David L.; Lyles, Brad F.; Purcell, Richard G.; Hershey, Ronald Lee

    2017-03-07

    In one embodiment, the present disclosure provides an apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. In various examples, the clamp is external to the tubing bundle or integral with the tubing bundle. According to one method, a tubing bundle and wireline are deployed together and the tubing bundle periodically secured to the wireline using a clamp. In another embodiment, the present disclosure provides an apparatus and method for coupling conduit segments together. A first pump obtains a sample and transmits it through a first conduit to a reservoir accessible by a second pump. The second pump further conducts the sample from the reservoir through a second conduit. In a specific example, one or more clamps are used to connect the first and/or second conduits to an external wireline.

  14. Importance of sampling frequency when collecting diatoms

    PubMed Central

    Wu, Naicheng; Faber, Claas; Sun, Xiuming; Qu, Yueming; Wang, Chao; Ivetic, Snjezana; Riis, Tenna; Ulrich, Uta; Fohrer, Nicola

    2016-01-01

    There has been increasing interest in diatom-based bio-assessment but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To cover this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013–30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1–5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June-July 2013) and 13 days at Cluster 5 (February-April 2014), whereas no specific ASFs were found at Cluster 1 (April-May 2013), 3 (August-November 2013) (>30 days) and Cluster 4 (December 2013 - January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency. PMID:27841310

  15. Importance of sampling frequency when collecting diatoms

    NASA Astrophysics Data System (ADS)

    Wu, Naicheng; Faber, Claas; Sun, Xiuming; Qu, Yueming; Wang, Chao; Ivetic, Snjezana; Riis, Tenna; Ulrich, Uta; Fohrer, Nicola

    2016-11-01

    There has been increasing interest in diatom-based bio-assessment but we still lack a comprehensive understanding of how to capture diatoms’ temporal dynamics with an appropriate sampling frequency (ASF). To cover this research gap, we collected and analyzed daily riverine diatom samples over a 1-year period (25 April 2013–30 April 2014) at the outlet of a German lowland river. The samples were classified into five clusters (1–5) by a Kohonen Self-Organizing Map (SOM) method based on similarity between species compositions over time. ASFs were determined to be 25 days at Cluster 2 (June-July 2013) and 13 days at Cluster 5 (February-April 2014), whereas no specific ASFs were found at Cluster 1 (April-May 2013), 3 (August-November 2013) (>30 days) and Cluster 4 (December 2013 - January 2014) (<1 day). ASFs showed dramatic seasonality and were negatively related to hydrological wetness conditions, suggesting that sampling interval should be reduced with increasing catchment wetness. A key implication of our findings for freshwater management is that long-term bio-monitoring protocols should be developed with the knowledge of tracking algal temporal dynamics with an appropriate sampling frequency.

  16. Improved Sampling Method Reduces Isokinetic Sampling Errors.

    ERIC Educational Resources Information Center

    Karels, Gale G.

    The particulate sampling system currently in use by the Bay Area Air Pollution Control District, San Francisco, California is described in this presentation for the 12th Conference on Methods in Air Pollution and Industrial Hygiene Studies, University of Southern California, April, 1971. The method represents a practical, inexpensive tool that can…

  17. Sampling system and method

    SciTech Connect

    Decker, David L; Lyles, Brad F; Purcell, Richard G; Hershey, Ronald Lee

    2014-05-20

    An apparatus and method for supporting a tubing bundle during installation or removal. The apparatus includes a clamp for securing the tubing bundle to an external wireline. The method includes deploying the tubing bundle and wireline together. The tubing bundle is periodically secured to the wireline using a clamp.

  18. Accelerated nonrigid intensity-based image registration using importance sampling.

    PubMed

    Bhagalia, Roshni; Fessler, Jeffrey A; Kim, Boklye

    2009-08-01

    Nonrigid image registration methods using intensity-based similarity metrics are becoming increasingly common tools to estimate many types of deformations. Nonrigid warps can be very flexible with a large number of parameters and gradient optimization schemes are widely used to estimate them. However, for large datasets, the computation of the gradient of the similarity metric with respect to these many parameters becomes very time consuming. Using a small random subset of image voxels to approximate the gradient can reduce computation time. This work focuses on the use of importance sampling to reduce the variance of this gradient approximation. The proposed importance sampling framework is based on an edge-dependent adaptive sampling distribution designed for use with intensity-based registration algorithms. We compare the performance of registration based on stochastic approximations with and without importance sampling to that using deterministic gradient descent. Empirical results, on simulated magnetic resonance brain data and real computed tomography inhale-exhale lung data from eight subjects, show that a combination of stochastic approximation methods and importance sampling accelerates the registration process while preserving accuracy.
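
    The unbiased-subsampling trick described above can be sketched in 1-D: sample voxels with probability proportional to edge strength, and divide each sampled term by its selection probability so the subset gradient matches the full gradient in expectation. The images and names below are synthetic stand-ins, not the paper's registration pipeline:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic 1-D "images": a smooth profile and a shifted copy.
n = 2000
xs = np.arange(n)
fixed = np.exp(-((xs - 800) / 120.0) ** 2)
moving = np.exp(-((xs - 830) / 120.0) ** 2)
grad_moving = np.gradient(moving)

# Per-voxel contribution to d(SSD)/dt for a translation parameter t at t = 0.
per_voxel = -2.0 * (moving - fixed) * grad_moving
g_full = per_voxel.sum()            # deterministic full-data gradient

# Importance sampling: pick a small voxel subset with probability
# proportional to edge strength, then reweight by 1/q for unbiasedness.
q = np.abs(grad_moving) + 1e-12
q /= q.sum()
m = 200
idx = rng.choice(n, size=m, p=q)    # sampled with replacement
g_est = (per_voxel[idx] / q[idx]).mean()
```

Edge-weighted sampling concentrates the budget on voxels that actually drive the similarity gradient, which is why it reduces the variance of the stochastic approximation relative to uniform voxel sampling.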

  19. Computing ensembles of transitions from stable states: Dynamic importance sampling.

    PubMed

    Perilla, Juan R; Beckstein, Oliver; Denning, Elizabeth J; Woolf, Thomas B

    2011-01-30

    There is an increasing dataset of solved biomolecular structures in more than one conformation and increasing evidence that large-scale conformational change is critical for biomolecular function. In this article, we present our implementation of a dynamic importance sampling (DIMS) algorithm that is directed toward improving our understanding of important intermediate states between experimentally defined starting and ending points. This complements traditional molecular dynamics methods where most of the sampling time is spent in the stable free energy wells defined by these initial and final points. As such, the algorithm creates a candidate set of transitions that provide insights for the much slower and probably most important, functionally relevant degrees of freedom. The method is implemented in the program CHARMM and is tested on six systems of growing size and complexity. These systems, the folding of Protein A and of Protein G, the conformational changes in the calcium sensor S100A6, the glucose-galactose-binding protein, maltodextrin, and lactoferrin, are also compared against other approaches that have been suggested in the literature. The results suggest good sampling on a diverse set of intermediates for all six systems with an ability to control the bias and thus to sample distributions of trajectories for the analysis of intermediate states.

  20. Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle

    SciTech Connect

    Peplow, Douglas E.; Mosher, Scott W; Evans, Thomas M

    2012-08-01

    For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution, and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.

  1. Elaborating transition interface sampling methods

    SciTech Connect

    Erp, Titus S. van. E-mail: bolhuis@science.uva.nl

    2005-05-01

    We review two recently developed efficient methods for calculating rate constants of processes dominated by rare events in high-dimensional complex systems. The first is transition interface sampling (TIS), based on the measurement of effective fluxes through hypersurfaces in phase space. TIS improves efficiency with respect to standard transition path sampling (TPS) rate constant techniques, because it allows a variable path length and is less sensitive to recrossings. The second method is the partial path version of TIS. Developed for diffusive processes, it exploits the loss of long time correlation. We discuss the relation between the new techniques and the standard reactive flux methods in detail. Path sampling algorithms can suffer from ergodicity problems, and we introduce several new techniques to alleviate these problems, notably path swapping, stochastic configurational bias Monte Carlo shooting moves and order-parameter free path sampling. In addition, we give algorithms to calculate other interesting properties from path ensembles besides rate constants, such as activation energies and reaction mechanisms.

  2. A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...

    EPA Pesticide Factsheets

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co

  3. Sequential Importance Sampling for Rare Event Estimation with Computer Experiments

    SciTech Connect

    Williams, Brian J.; Picard, Richard R.

    2012-06-25

    Importance sampling often drastically improves the variance of percentile and quantile estimators of rare events. We propose a sequential strategy for iterative refinement of importance distributions for sampling uncertain inputs to a computer model to estimate quantiles of model output or the probability that the model output exceeds a fixed or random threshold. A framework is introduced for updating a model surrogate to maximize its predictive capability for rare event estimation with sequential importance sampling. Examples of the proposed methodology involving materials strength and nuclear reactor applications will be presented. The conclusions are: (1) Importance sampling improves UQ of percentile and quantile estimates relative to a brute-force approach; (2) Benefits of importance sampling increase as percentiles become more extreme; (3) Iterative refinement improves importance distributions in relatively few iterations; (4) Surrogates are necessary for slow running codes; (5) Sequential design improves surrogate quality in region of parameter space indicated by importance distributions; and (6) Importance distributions and VRFs stabilize quickly, while quantile estimates may converge slowly.
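
    The basic mechanism, shifting the sampling distribution toward the rare region and reweighting by the density ratio, can be sketched for a Gaussian tail probability (a textbook toy, not the sequential surrogate-based strategy of the abstract):

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)

# Rare event: p = P(X > 4) for X ~ N(0, 1), about 3.2e-5, so plain Monte
# Carlo with 10^4 draws typically sees zero exceedances.
n = 10_000
p_mc = (rng.standard_normal(n) > 4.0).mean()

# Importance sampling with the proposal shifted into the rare region:
# q = N(4, 1), so the weight is p(x)/q(x) = exp(-4x + 8).
x = rng.normal(4.0, 1.0, n)
p_is = (np.exp(-4.0 * x + 8.0) * (x > 4.0)).mean()

p_true = 0.5 * (1.0 - erf(4.0 / sqrt(2.0)))   # exact Gaussian tail
```

With the same budget, the shifted proposal turns a hopeless estimate into one accurate to a few percent, and the benefit grows as the event becomes rarer, matching conclusion (2) above.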

  4. 9 CFR 327.11 - Receipts to importers for import product samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Receipts to importers for import product samples. 327.11 Section 327.11 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE... AND VOLUNTARY INSPECTION AND CERTIFICATION IMPORTED PRODUCTS § 327.11 Receipts to importers for...

  5. On the importance of incorporating sampling weights in ...

    EPA Pesticide Factsheets

    Occupancy models are used extensively to assess wildlife-habitat associations and to predict species distributions across large geographic regions. Occupancy models were developed as a tool to properly account for imperfect detection of a species. Current guidelines on survey design requirements for occupancy models focus on the number of sample units and the pattern of revisits to a sample unit within a season. We focus on the sampling design, that is, how the sample units are selected in geographic space (e.g., stratified, simple random, unequal probability, etc.). In a probability design, each sample unit has a sample weight, which quantifies the number of sample units it represents in the finite (oftentimes areal) sampling frame. We demonstrate the importance of including sampling weights in occupancy model estimation when the design is not a simple random sample or equal probability design. We assume a finite areal sampling frame as proposed for a national bat monitoring program. We compare several unequal and equal probability designs and varying sampling intensity within a simulation study. We found the traditional single season occupancy model produced biased estimates of occupancy and lower confidence interval coverage rates compared to occupancy models that accounted for the sampling design. We also discuss how our findings inform the analyses proposed for the nascent North American Bat Monitoring Program and other collaborative synthesis efforts that propose h
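The role of the sampling weights can be sketched with a toy design-weighted (Horvitz-Thompson style) estimate of the occupied fraction of a finite frame; detection error is ignored here and all numbers are illustrative, not from the study.

```python
def weighted_occupancy(detections, weights):
    """Design-weighted estimate of the occupied fraction of a finite
    sampling frame: surveyed unit i with detection y_i in {0, 1}
    represents weights[i] units of the frame.
    """
    total = sum(weights)
    return sum(y * w for y, w in zip(detections, weights)) / total

# Units from a heavily sampled stratum carry small weights; ignoring the
# weights biases the estimate toward that oversampled stratum.
y = [1, 1, 0, 0, 0]
w = [10, 10, 40, 40, 40]             # hypothetical sampling weights
naive = sum(y) / len(y)              # ignores the design
weighted = weighted_occupancy(y, w)  # accounts for the design
```

Here the unweighted proportion is 0.40, while the design-weighted estimate is 20/140 ≈ 0.14, illustrating the bias the simulation study documents for unequal probability designs.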

  6. Duplex sampling apparatus and method

    DOEpatents

    Brown, Paul E.; Lloyd, Robert

    1992-01-01

    An improved apparatus is provided for sampling a gaseous mixture and for measuring mixture components. The apparatus includes two sampling containers connected in series serving as a duplex sampling apparatus. The apparatus is adapted to independently determine the amounts of condensable and noncondensable gases in admixture from a single sample. More specifically, a first container includes a first port capable of selectively connecting to and disconnecting from a sample source and a second port capable of selectively connecting to and disconnecting from a second container. A second container also includes a first port capable of selectively connecting to and disconnecting from the second port of the first container and a second port capable of either selectively connecting to and disconnecting from a differential pressure source. By cooling a mixture sample in the first container, the condensable vapors form a liquid, leaving noncondensable gases either as free gases or dissolved in the liquid. The condensed liquid is heated to drive out dissolved noncondensable gases, and all the noncondensable gases are transferred to the second container. Then the first and second containers are separated from one another in order to separately determine the amount of noncondensable gases and the amount of condensable gases in the sample.

  7. Apparatus and method for handheld sampling

    DOEpatents

    Staab, Torsten A.

    2005-09-20

    The present invention includes an apparatus, and corresponding method, for taking a sample. The apparatus is built around a frame designed to be held in at least one hand. A sample media is used to secure the sample. A sample media adapter for securing the sample media is operated by a trigger mechanism connectively attached within the frame to the sample media adapter.

  8. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, C.V.

    1991-02-05

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method is disclosed for taking a soil sample using the soil sampling device and soil sample containment device while minimizing the loss of any volatile organic compounds contained in the soil sample prior to analysis. The soil sampling device comprises two close-fitting, longitudinal tubular members of suitable length, the inner tube having its outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed-end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus that analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis. 11 figures.

  9. Soil sampling kit and a method of sampling therewith

    DOEpatents

    Thompson, Cyril V.

    1991-01-01

    A soil sampling device and a sample containment device for containing a soil sample are disclosed. In addition, a method is disclosed for taking a soil sample using the soil sampling device and soil sample containment device while minimizing the loss of any volatile organic compounds contained in the soil sample prior to analysis. The soil sampling device comprises two close-fitting, longitudinal tubular members of suitable length, the inner tube having its outward end closed. With the inner closed tube withdrawn a selected distance, the outer tube can be inserted into the ground or other similar soft material to withdraw a sample of material for examination. The inner closed-end tube controls the volume of the sample taken and also serves to eject the sample. The soil sample containment device has a sealing member which is adapted to attach to an analytical apparatus that analyzes the volatile organic compounds contained in the sample. The soil sampling device in combination with the soil sample containment device allows an operator to obtain a soil sample containing volatile organic compounds while minimizing the loss of the volatile organic compounds prior to analysis.

  10. Sampling High-Altitude and Stratified Mating Flights of Red Imported Fire Ant

    Technology Transfer Automated Retrieval System (TEKTRAN)

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens ...

  11. Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks

    NASA Astrophysics Data System (ADS)

    Sun, Wei; Chang, K. C.

    2005-05-01

    Probabilistic inference for Bayesian networks is in general NP-hard using either exact algorithms or approximate methods. However, for very complex networks, only approximate methods such as stochastic sampling can provide a solution under a time constraint. Several simulation methods are currently available. They include logic sampling (the first proposed stochastic method for Bayesian networks), the likelihood weighting algorithm (the most commonly used simulation method because of its simplicity and efficiency), the Markov blanket scoring method, and the importance sampling algorithm. In this paper, we first briefly review and compare these available simulation methods, then we propose an improved importance sampling algorithm, called the linear Gaussian importance sampling algorithm for general hybrid models (LGIS). LGIS is aimed at hybrid Bayesian networks consisting of both discrete and continuous random variables with arbitrary distributions. It uses a linear function and Gaussian additive noise to approximate the true conditional probability distribution of a continuous variable given both its parents and evidence in a Bayesian network. One of the most important features of the newly developed method is that it can adaptively learn the optimal importance function from previous samples. We test the inference performance of LGIS using a 16-node linear Gaussian model and a 6-node general hybrid model. The performance comparison with other well-known methods such as junction tree (JT) and likelihood weighting (LW) shows that LGIS is very promising.
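For reference, the likelihood weighting scheme mentioned above can be sketched on a hypothetical three-node discrete network (Rain -> WetGrass <- Sprinkler; the structure and probabilities are illustrative, not from the paper):

```python
import random

# Illustrative priors and conditional table for the toy network.
P_RAIN, P_SPRINKLER = 0.2, 0.1
P_WET = {(True, True): 0.99, (True, False): 0.8,
         (False, True): 0.9, (False, False): 0.0}

def p_rain_given_wet(n=50_000, seed=0):
    """Estimate P(Rain | WetGrass=True) by likelihood weighting: sample
    the non-evidence nodes from their priors and weight every sample by
    the likelihood of the evidence given its parents."""
    rng = random.Random(seed)
    num = den = 0.0
    for _ in range(n):
        rain = rng.random() < P_RAIN
        sprinkler = rng.random() < P_SPRINKLER
        w = P_WET[(rain, sprinkler)]  # weight = P(WetGrass=True | parents)
        den += w
        if rain:
            num += w
    return num / den
```

No sample is ever rejected for disagreeing with the evidence, which is why likelihood weighting is preferred over plain logic sampling when evidence is unlikely.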

  12. An importance sampling algorithm for estimating extremes of perpetuity sequences

    NASA Astrophysics Data System (ADS)

    Collamore, Jeffrey F.

    2012-09-01

    In a wide class of problems in insurance and financial mathematics, it is of interest to study the extremal events of a perpetuity sequence. This paper addresses the problem of numerically evaluating these rare event probabilities. Specifically, an importance sampling algorithm is described which is efficient in the sense that it exhibits bounded relative error, and which is optimal in an appropriate asymptotic sense. The main idea of the algorithm is to use a "dual" change of measure, which is employed to an associated Markov chain over a randomly-stopped time interval. The algorithm also makes use of the so-called forward sequences generated to the given stochastic recursion, together with elements of Markov chain theory.

  13. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  14. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  15. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  16. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  17. 19 CFR 151.67 - Sampling by importer.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.67 Sampling by... quantities from the packages of wool or hair designated for examination, provided the bales or bags...

  18. Subrandom methods for multidimensional nonuniform sampling

    NASA Astrophysics Data System (ADS)

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics.
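One common seed-free construction, used here only to illustrate the idea of deterministic subrandom point selection (the paper's specific sequences may differ), is the golden-ratio additive recurrence x_k = frac(k * phi):

```python
import math

def subrandom_schedule(n_points, grid_size):
    """Seed-free subrandom sampling: the additive recurrence
    x_k = frac(k * phi), with phi the golden ratio conjugate, is
    low-discrepancy and fully deterministic, so no random seed is needed.

    Returns n_points distinct indices into a grid of grid_size points.
    """
    phi = (math.sqrt(5.0) - 1.0) / 2.0
    chosen, seen, k = [], set(), 1
    while len(chosen) < n_points:
        idx = int((k * phi % 1.0) * grid_size)  # map the fraction onto the grid
        if idx not in seen:
            seen.add(idx)
            chosen.append(idx)
        k += 1
    return chosen
```

Because the sequence is deterministic, two spectrometers running this schedule produce identical sample point sets, removing the seed-number dependence the abstract highlights.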

  19. Method and apparatus for data sampling

    DOEpatents

    Odell, D.M.C.

    1994-04-19

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples is described. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium. 6 figures.
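The sample-then-discriminate idea can be sketched in software (a hypothetical threshold discriminator, not the patented apparatus): digitize every sample, then keep only runs of samples that rise above baseline as detected events.

```python
def find_events(samples, threshold):
    """Discriminate digitized detector samples: group consecutive samples
    above a baseline threshold into detected events and discard the rest.
    Returns one (start_index, peak_value) pair per event.
    """
    events, in_event = [], False
    for i, v in enumerate(samples):
        if v > threshold:
            if not in_event:
                events.append([i, v])   # open a new event at this sample
                in_event = True
            elif v > events[-1][1]:
                events[-1][1] = v       # track the event's peak sample
        else:
            in_event = False            # event ends at the first sub-threshold sample
    return [tuple(e) for e in events]
```

Because the sampler runs fast enough to capture many digital samples per event, each detected event yields a start time and a peak amplitude rather than a single analog pulse.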

  20. Method and apparatus for data sampling

    DOEpatents

    Odell, Daniel M. C.

    1994-01-01

    A method and apparatus for sampling radiation detector outputs and determining event data from the collected samples. The method uses high speed sampling of the detector output, the conversion of the samples to digital values, and the discrimination of the digital values so that digital values representing detected events are determined. The high speed sampling and digital conversion is performed by an A/D sampler that samples the detector output at a rate high enough to produce numerous digital samples for each detected event. The digital discrimination identifies those digital samples that are not representative of detected events. The sampling and discrimination also provides for temporary or permanent storage, either serially or in parallel, to a digital storage medium.

  1. Mixed Methods Sampling: A Typology with Examples

    ERIC Educational Resources Information Center

    Teddlie, Charles; Yu, Fen

    2007-01-01

    This article presents a discussion of mixed methods (MM) sampling techniques. MM sampling involves combining well-established qualitative and quantitative techniques in creative ways to answer research questions posed by MM research designs. Several issues germane to MM sampling are presented including the differences between probability and…

  2. ALARA ASSESSMENT OF SETTLER SLUDGE SAMPLING METHODS

    SciTech Connect

    NELSEN LA

    2009-01-30

    The purpose of this assessment is to compare underwater and above-water settler sludge sampling methods to determine if the added cost of underwater sampling for the sole purpose of worker dose reduction is justified. Initial planning for sludge sampling included container, settler, and knock-out-pot (KOP) sampling. Due to the significantly higher dose consequence of KOP sludge, a decision was made to sample KOP sludge underwater to achieve worker dose reductions. Additionally, initial plans were to utilize the underwater sampling apparatus for settler sludge. Since there are no longer plans to sample KOP sludge, the decision for underwater sampling of settler sludge needs to be revisited. The present sampling plan calls for spending an estimated $2,500,000 to design and construct a new underwater sampling system (per A21 C-PL-001 RevOE). This evaluation compares and contrasts the present method of above-water sampling with the underwater method planned by the Sludge Treatment Project (STP) and determines whether settler samples can be taken using the existing sampling cart (with potentially minor modifications) while maintaining doses to workers As Low As Reasonably Achievable (ALARA), eliminating the need for costly redesigns, testing, and personnel retraining.

  3. Uniform sampling table method and its applications: establishment of a uniform sampling method.

    PubMed

    Chen, Yibin; Chen, Jiaxi; Wang, Wei

    2013-01-01

    A novel uniform sampling method is proposed in this paper. According to the requirements of uniform sampling, we propose the properties that must be met by analyzing the distribution of samples. On this basis, the proposed uniform sampling method is demonstrated and evaluated rigorously by mathematical means such as inference. Uniform sampling tables with respect to Cn(t2) and Cn(t3) are established. Furthermore, a one-dimensional uniform sampling method and a multidimensional method are proposed. The proposed novel uniform sampling method, which is guided by uniform design theory, enjoys the advantages of simple use and good representativeness of the whole sample.

  4. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of a specific participant sample is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study; they include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in the effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
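The contrast between simple random and stratified probability designs can be sketched as follows (an equal-allocation stratified design; the population, strata, and counts are illustrative):

```python
import random

def stratified_sample(population, strata_key, per_stratum, seed=0):
    """Stratified probability sampling sketch: partition the population
    by stratum, then draw a simple random sample of per_stratum units
    from every stratum (equal allocation).
    """
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_key(unit), []).append(unit)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, per_stratum))
    return sample

# 90 outpatients and 10 inpatients: a simple random sample of 10 units
# would often include no inpatients at all; stratification guarantees
# representation of both groups.
patients = ([("outpatient", i) for i in range(90)]
            + [("inpatient", i) for i in range(10)])
chosen = stratified_sample(patients, lambda p: p[0], per_stratum=5)
```

The trade-off is that unequal allocation across strata implies unequal selection probabilities, so downstream estimates must then incorporate sampling weights.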

  5. 40 CFR 80.1349 - Alternative sampling and testing requirements for importers who import gasoline into the United...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirements for importers who import gasoline into the United States by truck. 80.1349 Section 80.1349... FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1349 Alternative sampling and testing requirements for importers who import gasoline into the United States...

  6. Sampling Plant Diversity and Rarity at Landscape Scales: Importance of Sampling Time in Species Detectability

    PubMed Central

    Zhang, Jian; Nielsen, Scott E.; Grainger, Tess N.; Kohler, Monica; Chipchar, Tim; Farr, Daniel R.

    2014-01-01

    Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed for how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter-hectare, time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2-year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. Relative to rare species, time-unlimited surveys had ∼65% higher rare plant detections post-20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys. PMID:24740179

  7. Sampling plant diversity and rarity at landscape scales: importance of sampling time in species detectability.

    PubMed

    Zhang, Jian; Nielsen, Scott E; Grainger, Tess N; Kohler, Monica; Chipchar, Tim; Farr, Daniel R

    2014-01-01

    Documenting and estimating species richness at regional or landscape scales has been a major emphasis for conservation efforts, as well as for the development and testing of evolutionary and ecological theory. Rarely, however, are sampling efforts assessed for how they affect detection and estimates of species richness and rarity. In this study, vascular plant richness was sampled in 356 quarter-hectare, time-unlimited survey plots in the boreal region of northeast Alberta. These surveys consisted of 15,856 observations of 499 vascular plant species (97 considered to be regionally rare) collected by 12 observers over a 2-year period. Average survey time for each quarter-hectare plot was 82 minutes, ranging from 20 to 194 minutes, with a positive relationship between total survey time and total plant richness. When survey time was limited to a 20-minute search, as in other Alberta biodiversity methods, 61 species were missed. Extending the survey time to 60 minutes reduced the number of missed species to 20, while a 90-minute cut-off time resulted in the loss of 8 species. When surveys were separated by habitat type, 60 minutes of search effort sampled nearly 90% of total observed richness for all habitats. Relative to rare species, time-unlimited surveys had ∼65% higher rare plant detections post-20 minutes than during the first 20 minutes of the survey. Although exhaustive sampling was attempted, observer bias was noted when a subsample of plots was re-surveyed by different observers. Our findings suggest that sampling time, combined with sample size and observer effects, should be considered in landscape-scale plant biodiversity surveys.

  8. Dynamic Method for Identifying Collected Sample Mass

    NASA Technical Reports Server (NTRS)

    Carson, John

    2008-01-01

    G-Sample is designed for sample collection missions to identify the presence and quantity of sample material gathered by spacecraft equipped with end effectors. The software method uses a maximum-likelihood estimator to identify the collected sample's mass based on onboard force-sensor measurements, thruster firings, and a dynamics model of the spacecraft. This makes sample mass identification a computation rather than a process requiring additional hardware. Simulation examples of G-Sample are provided for spacecraft model configurations with a sample collection device mounted on the end of an extended boom. In the absence of thrust knowledge errors, the results indicate that G-Sample can identify the amount of collected sample mass to within 10 grams (with 95-percent confidence) by using a force sensor with a noise and quantization floor of 50 micrometers. These results hold even in the presence of realistic parametric uncertainty in actual spacecraft inertia, center-of-mass offset, and first flexibility modes. Thrust profile knowledge is shown to be a dominant sensitivity for G-Sample, entering in a nearly one-to-one relationship with the final mass estimation error. This means thrust profiles should be well characterized with onboard accelerometers prior to sample collection. An overall sample-mass estimation error budget has been developed to approximate the effect of model uncertainty, sensor noise, data rate, and thrust profile error on the expected estimate of collected sample mass.
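Under a drastically simplified rigid, single-axis model with Gaussian sensor noise (assumptions mine; the flight estimator described above also models thruster profiles, center-of-mass offset, and flexible modes), the maximum-likelihood mass estimate reduces to ordinary least squares on F = m·a:

```python
import random

def estimate_mass(forces, accels):
    """For measurements F_i = m * a_i + noise with zero-mean Gaussian
    noise, the maximum-likelihood estimate of m is the least-squares
    solution m_hat = sum(F_i * a_i) / sum(a_i ** 2).
    """
    num = sum(f * a for f, a in zip(forces, accels))
    den = sum(a * a for a in accels)
    return num / den

# Synthetic check: true mass 0.25 kg, noisy force-sensor readings.
rng = random.Random(3)
accels = [0.05 * k for k in range(1, 21)]                     # m/s^2
forces = [0.25 * a + rng.gauss(0.0, 0.001) for a in accels]   # N
m_hat = estimate_mass(forces, accels)
```

This also makes the thrust-profile sensitivity in the abstract intuitive: a systematic error in the assumed accelerations enters the ratio almost one-to-one as a mass estimation error.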

  9. Innovative methods for inorganic sample preparation

    SciTech Connect

    Essling, A.M.; Huff, E.A.; Graczyk, D.G.

    1992-04-01

    Procedures and guidelines are given for the dissolution of a variety of selected materials using fusion, microwave, and Parr bomb techniques. These materials include germanium glass, corium-concrete mixtures, and zeolites. Emphasis is placed on sample-preparation approaches that produce a single master solution suitable for complete multielement characterization of the sample. In addition, data are presented on the soil microwave digestion method approved by the Environmental Protection Agency (EPA). Advantages and disadvantages of each sample-preparation technique are summarized.

  10. Method and apparatus for sampling atmospheric mercury

    DOEpatents

    Trujillo, Patricio E.; Campbell, Evan E.; Eutsler, Bernard C.

    1976-01-20

    A method of simultaneously sampling particulate mercury, organic mercurial vapors, and metallic mercury vapor in the working and occupational environment and determining the amount of mercury derived from each such source in the sampled air. A known volume of air is passed through a sampling tube containing a filter for particulate mercury collection, a first adsorber for the selective adsorption of organic mercurial vapors, and a second adsorber for the adsorption of metallic mercury vapor. Carbon black molecular sieves are particularly useful as the selective adsorber for organic mercurial vapors. The amount of mercury adsorbed or collected in each section of the sampling tube is readily quantitatively determined by flameless atomic absorption spectrophotometry.

  11. The rank product method with two samples.

    PubMed

    Koziol, James A

    2010-11-05

    Breitling et al. (2004) introduced a statistical technique, the rank product method, for detecting differentially regulated genes in replicated microarray experiments. The technique has achieved widespread acceptance and is now used more broadly, in such diverse fields as RNAi analysis, proteomics, and machine learning. In this note, we extend the rank product method to the two sample setting, provide distribution theory attending the rank product method in this setting, and give numerical details for implementing the method.
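The one-sample rank product of Breitling et al. (2004), which this note extends to the two-sample setting, can be sketched as follows (an illustrative implementation, not the authors' code):

```python
import math

def rank_products(replicates):
    """Rank product statistic: for gene g with rank r[g][i] in replicate i
    (rank 1 = strongest signal), RP_g = (prod_i r[g][i]) ** (1 / k).
    Consistently small RP values flag genes near the top in every replicate.
    """
    k, n = len(replicates), len(replicates[0])
    ranks = []
    for rep in replicates:
        order = sorted(range(n), key=lambda g: -rep[g])  # descending signal
        r = [0] * n
        for pos, g in enumerate(order, start=1):
            r[g] = pos
        ranks.append(r)
    return [math.prod(ranks[i][g] for i in range(k)) ** (1.0 / k)
            for g in range(n)]

# Two replicates, three genes: gene 0 is top-ranked in both replicates.
rp = rank_products([[5.0, 1.0, 3.0], [4.0, 0.5, 2.0]])  # [1.0, 3.0, 2.0]
```

Significance is then typically assessed by permuting the expression values within each replicate and recomputing the statistic, since the null distribution of the rank product has no simple closed form.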

  12. 40 CFR 80.1630 - Sampling and testing requirements for refiners, gasoline importers and producers and importers of...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... refiners, gasoline importers and producers and importers of certified ethanol denaturant. 80.1630 Section... refiners, gasoline importers and producers and importers of certified ethanol denaturant. (a) Sample and test each batch of gasoline and certified ethanol denaturant. (1) Refiners and importers shall...

  13. An Importance Sampling EM Algorithm for Latent Regression Models

    ERIC Educational Resources Information Center

    von Davier, Matthias; Sinharay, Sandip

    2007-01-01

    Reporting methods used in large-scale assessments such as the National Assessment of Educational Progress (NAEP) rely on latent regression models. To fit the latent regression model using the maximum likelihood estimation technique, multivariate integrals must be evaluated. In the computer program MGROUP used by the Educational Testing Service for…

  14. Decomposition and (importance) sampling techniques for multi-stage stochastic linear programs

    SciTech Connect

    Infanger, G.

    1993-11-01

    The difficulty of solving large-scale multi-stage stochastic linear programs arises from the sheer number of scenarios associated with numerous stochastic parameters. The number of scenarios grows exponentially with the number of stages, and problems easily get out of hand even for very moderate numbers of stochastic parameters per stage. Our method combines dual (Benders) decomposition with Monte Carlo sampling techniques. We employ importance sampling to efficiently obtain accurate estimates of both expected future costs and the gradients and right-hand sides of cuts. The method enables us to solve practical large-scale problems with many stages and numerous stochastic parameters per stage. We discuss the theory of sharing and adjusting cuts between different scenarios in a stage. We derive probabilistic lower and upper bounds, where we use importance path sampling for the upper bound estimation. Initial numerical results are promising.

  15. Use of enveloping distribution sampling to evaluate important characteristics of biomolecular force fields.

    PubMed

    Huang, Wei; Lin, Zhixiong; van Gunsteren, Wilfred F

    2014-06-19

    The predictive power of biomolecular simulation critically depends on the quality of the force field or molecular model used and on the extent of conformational sampling that can be achieved. Both issues are addressed. First, it is shown that widely used force fields for the simulation of proteins in aqueous solution appear to have rather different propensities to stabilize or destabilize α-, π-, and 3(10)-helical structures, which is an important feature of a biomolecular force field due to the omnipresence of such secondary structure in proteins. Second, the relative stability of secondary structure elements in proteins can only be computationally determined through so-called free-energy calculations, the accuracy of which critically depends on the extent of configurational sampling. It is shown that the method of enveloping distribution sampling is a very efficient method to extensively sample different parts of configurational space.

  16. Actinide recovery method -- Large soil samples

    SciTech Connect

    Maxwell, S.L. III

    2000-04-25

    There is a need to measure actinides in environmental samples with lower and lower detection limits, requiring larger sample sizes. This analysis is adversely affected by sample-matrix interferences, which make analyzing soil samples above five-grams very difficult. A new Actinide-Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides from large-soil samples. Diphonix Resin (Eichrom Industries), a 1994 R and D 100 winner, is used to preconcentrate the actinides from large soil samples, which are bound powerfully to the resin's diphosphonic acid groups. A rapid microwave-digestion technique is used to remove the actinides from the Diphonix Resin, which effectively eliminates interfering matrix components from the soil matrix. The microwave-digestion technique is more effective and less tedious than catalyzed hydrogen peroxide digestions of the resin or digestion of diphosphonic stripping agents such as HEDPA. After resin digestion, the actinides are recovered in a small volume of nitric acid which can be loaded onto small extraction chromatography columns, such as TEVA Resin, U-TEVA Resin or TRU Resin (Eichrom Industries). Small, selective extraction columns do not generate large volumes of liquid waste and provide consistent tracer recoveries after soil matrix elimination.

  17. Constrained sampling method for analytic continuation.

    PubMed

    Sandvik, Anders W

    2016-12-01

    A method for analytic continuation of imaginary-time correlation functions (here obtained in quantum Monte Carlo simulations) to real-frequency spectral functions is proposed. Stochastically sampling a spectrum parametrized by a large number of δ functions, treated as a statistical-mechanics problem, it avoids distortions caused by (as demonstrated here) configurational entropy in previous sampling methods. The key development is the suppression of entropy by constraining the spectral weight to within identifiable optimal bounds and imposing a set number of peaks. As a test case, the dynamic structure factor of the S=1/2 Heisenberg chain is computed. Very good agreement is found with Bethe ansatz results in the ground state (including a sharp edge) and with exact diagonalization of small systems at elevated temperatures.
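
    The constrained sampling idea can be illustrated with a toy version of the procedure: parametrize a trial spectrum as a fixed number of equal-weight δ functions on a bounded frequency interval (the constraints), and Metropolis-sample their positions against the χ² goodness of fit to synthetic imaginary-time data. All parameters below (time grid, noise level, sampling temperature) are illustrative assumptions, not values from the paper:

```python
import math
import random

random.seed(7)

# Synthetic imaginary-time data G(tau) for a single spectral delta at w0 = 1
# (in a real application G would come from quantum Monte Carlo).
taus   = [0.1 * k for k in range(1, 11)]
G_data = [math.exp(-t) for t in taus]
SIGMA  = 1e-3                    # assumed statistical error of the data

ND, WMAX = 25, 5.0               # constraints: fixed delta count, bounded support
omegas = [random.uniform(0.0, WMAX) for _ in range(ND)]
model  = [sum(math.exp(-t * w) for w in omegas) / ND for t in taus]

def chi2(m):
    return sum(((mi - gi) / SIGMA) ** 2 for mi, gi in zip(m, G_data))

THETA = 1.0                      # sampling "temperature" for exp(-chi2/THETA)
c_old = chi2(model)
for _ in range(30000):
    i  = random.randrange(ND)
    wo = omegas[i]
    wn = min(WMAX, max(0.0, wo + random.gauss(0.0, 0.2)))
    # Incremental update of the model: only delta i moved.
    trial = [m + (math.exp(-t * wn) - math.exp(-t * wo)) / ND
             for m, t in zip(model, taus)]
    c_new = chi2(trial)
    if c_new <= c_old or random.random() < math.exp((c_old - c_new) / THETA):
        omegas[i], model, c_old = wn, trial, c_new

mean_w = sum(omegas) / ND        # should sit near the true frequency w0 = 1
```

    With data generated from a single δ function at ω0 = 1, the sampled positions collapse toward that frequency; in the full method, the averaged δ-function density over the constrained ensemble serves as the estimated spectral function.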

  18. Constrained sampling method for analytic continuation

    NASA Astrophysics Data System (ADS)

    Sandvik, Anders W.

    2016-12-01

    A method for analytic continuation of imaginary-time correlation functions (here obtained in quantum Monte Carlo simulations) to real-frequency spectral functions is proposed. Stochastically sampling a spectrum parametrized by a large number of δ functions, treated as a statistical-mechanics problem, it avoids distortions caused by (as demonstrated here) configurational entropy in previous sampling methods. The key development is the suppression of entropy by constraining the spectral weight to within identifiable optimal bounds and imposing a set number of peaks. As a test case, the dynamic structure factor of the S=1/2 Heisenberg chain is computed. Very good agreement is found with Bethe ansatz results in the ground state (including a sharp edge) and with exact diagonalization of small systems at elevated temperatures.

  19. Actinide Recovery Method for Large Soil Samples

    SciTech Connect

    Maxwell, S.L. III; Nichols, S.

    1998-11-01

    A new Actinide Recovery Method has been developed by the Savannah River Site Central Laboratory to preconcentrate actinides in very large soil samples. Diphonix Resin® is used to eliminate soil matrix interferences and preconcentrate actinides after soil leaching or soil fusion. A rapid microwave digestion technique is used to remove the actinides from the Diphonix Resin®. After the resin digestion, the actinides are recovered in a small volume of nitric acid which can be easily loaded onto small extraction-chromatography columns, such as TEVA Resin®, U-TEVA Resin®, or TRU Resin® (Eichrom Industries). This method enables the application of small, selective extraction columns to recover actinides from very large soil samples with high selectivity, consistent tracer recoveries, and minimal liquid waste.

  20. Methods for Sampling of Airborne Viruses

    PubMed Central

    Verreault, Daniel; Moineau, Sylvain; Duchaine, Caroline

    2008-01-01

    Summary: To better understand the underlying mechanisms of aerovirology, accurate sampling of airborne viruses is fundamental. The sampling instruments commonly used in aerobiology have also been used to recover viruses suspended in the air. We reviewed over 100 papers to evaluate the methods currently used for viral aerosol sampling. Differentiating infections caused by direct contact from those caused by airborne dissemination can be a very demanding task given the wide variety of sources of viral aerosols. While epidemiological data can help to determine the source of the contamination, direct data obtained from air samples can provide very useful information for risk assessment purposes. Many types of samplers have been used over the years, including liquid impingers, solid impactors, filters, electrostatic precipitators, and many others. The efficiencies of these samplers depend on a variety of environmental and methodological factors that can affect the integrity of the virus structure. The aerodynamic size distribution of the aerosol also has a direct effect on sampler efficiency. Viral aerosols can be studied under controlled laboratory conditions, using biological or nonbiological tracers and surrogate viruses, which are also discussed in this review. Lastly, general recommendations are made regarding future studies on the sampling of airborne viruses. PMID:18772283

  1. SOIL AND SEDIMENT SAMPLING METHODS

    EPA Pesticide Factsheets

    The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout the United States. Inadequate site characterization and a lack of knowledge of surface and subsurface contaminant distributions hinder EPA's ability to make the best decisions on remediation options and to conduct the most effective cleanup efforts. To assist OSWER, NERL conducts research to improve its capability to characterize Superfund, RCRA, LUST, oil spill, and brownfield sites more accurately, precisely, and efficiently, and to improve its risk-based decision making. Among the many research programs and tasks being performed at ESD-LV, research is being conducted on improving soil and sediment sampling techniques and on improving the sampling and handling of volatile organic compound (VOC) contaminated soils. Under this task, improved sampling approaches and devices will be developed for characterizing the concentration of VOCs in soils. Current approaches and devices can lose up to 99% of the VOCs present in the sample due to inherent weaknesses in the device and improper or inadequate collection techniques. This error generally causes decision makers to markedly underestimate the soil VOC concentrations and, therefore, to greatly underestimate the ecological

  2. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2011-07-05

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional Taqman® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM™, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA™, on the 5' end.

  3. Flow cytometric detection method for DNA samples

    DOEpatents

    Nasarabadi, Shanavaz; Langlois, Richard G.; Venkateswaran, Kodumudi S.

    2006-08-01

    Disclosed herein are two methods for rapid multiplex analysis to determine the presence and identity of target DNA sequences within a DNA sample. Both methods use reporting DNA sequences, e.g., modified conventional Taqman® probes, to combine multiplex PCR amplification with microsphere-based hybridization using flow cytometry means of detection. Real-time PCR detection can also be incorporated. The first method uses a cyanine dye, such as Cy3™, as the reporter linked to the 5' end of a reporting DNA sequence. The second method positions a reporter dye, e.g., FAM, on the 3' end of the reporting DNA sequence and a quencher dye, e.g., TAMRA, on the 5' end.

  4. Evaluation of Common Methods for Sampling Invertebrate Pollinator Assemblages: Net Sampling Out-Perform Pan Traps

    PubMed Central

    Popic, Tony J.; Davila, Yvonne C.; Wardle, Glenda M.

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km2 area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service. PMID:23799127

  5. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    PubMed

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

    Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km(2) area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.

  6. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  7. Well purge and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.; Gustafson, Gregg S.

    1995-01-01

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly with a packer, pump and exhaust, that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. The packer is positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.

  8. Well purge and sample apparatus and method

    DOEpatents

    Schalla, R.; Smith, R.M.; Hall, S.H.; Smart, J.E.; Gustafson, G.S.

    1995-10-24

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly with a packer, pump and exhaust, that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. The packer is positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion. 8 figs.

  9. Field evaluation of a VOST sampling method

    SciTech Connect

    Jackson, M.D.; Johnson, L.D.; Fuerst, R.G.; McGaughey, J.F.; Bursey, J.T.; Merrill, R.G.

    1994-12-31

    The VOST (SW-846 Method 0030) specifies the use of Tenax® and a particular petroleum-based charcoal (SKC Lot 104, or its equivalent) that is no longer commercially available. In field evaluation studies of VOST methodology, a replacement petroleum-based charcoal has been used: candidate replacement sorbents for charcoal were studied, and Anasorb® 747, a carbon-based sorbent, was selected for field testing. The sampling train was modified to use only Anasorb® in the back tube and Tenax® in the two front tubes to avoid analytical difficulties associated with the analysis of the sequential-bed back tube used in the standard VOST train. The standard (SW-846 Method 0030) and the modified VOST methods were evaluated at a chemical manufacturing facility using a quadruple probe system with quadruple trains. In this field test, known concentrations of the halogenated volatile organic compounds listed in the Clean Air Act Amendments of 1990, Title 3, were introduced into the VOST train and the modified VOST train, using the same certified gas cylinder as a source of test compounds. Statistical tests of the comparability of methods were performed on a compound-by-compound basis. For most compounds, the VOST and modified VOST methods were found to be statistically equivalent.
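
    The compound-by-compound comparability tests can be sketched as a paired t-test on recoveries from the simultaneous trains; the recovery values below are hypothetical placeholders, not data from the field study:

```python
import math

# Hypothetical paired recoveries (%) of one compound from simultaneous
# standard-VOST and modified-VOST trains (placeholder data, six runs):
vost     = [92.1, 88.4, 95.0, 90.2, 89.7, 93.3]
modified = [91.0, 89.9, 94.1, 91.5, 88.9, 92.8]

diffs = [a - b for a, b in zip(vost, modified)]
n     = len(diffs)
mean  = sum(diffs) / n
sd    = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
t     = mean / (sd / math.sqrt(n))

# |t| below the two-sided 5% critical value t(0.975, df=5) = 2.571 means
# no detectable difference between the methods for this compound.
equivalent = abs(t) < 2.571
```

    Note that "statistically equivalent" in this sense means a failure to detect a difference at the chosen significance level, which is how such method comparisons are commonly reported.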

  10. A method for sampling waste corn

    USGS Publications Warehouse

    Frederick, R.B.; Klaas, E.E.; Baldassarre, G.A.; Reinecke, K.J.

    1984-01-01

    Corn has become one of the most important wildlife foods in the United States. It is eaten by a wide variety of animals, including white-tailed deer (Odocoileus virginianus), raccoon (Procyon lotor), ring-necked pheasant (Phasianus colchicus), wild turkey (Meleagris gallopavo), and many species of aquatic birds. Damage to unharvested crops has been documented, but many birds and mammals eat waste grain after harvest and do not conflict with agriculture. A good method for measuring waste-corn availability is essential to studies of food density and the food and feeding habits of field-feeding wildlife. Previous methods were developed primarily for approximating losses due to harvest machinery. In this paper, a method is described for estimating the amount of waste corn potentially available to wildlife. Detection of temporal changes in food availability and differences caused by agricultural operations (e.g., recently harvested stubble fields vs. plowed fields) are discussed.

  11. Standard methods for sampling North American freshwater fishes

    USGS Publications Warehouse

    Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.

    2009-01-01

    This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing cold and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.

  12. Sampling and sample preparation methods for determining concentrations of mycotoxins in foods and feeds.

    PubMed

    2012-01-01

    Sample variation is often the largest error in determining concentrations of mycotoxins in food commodities. The worldwide safety evaluation of mycotoxins requires sampling plans that give acceptably accurate values for the levels of contamination in specific batches or lots of a commodity. Mycotoxin concentrations show a skewed or uneven distribution in foods and feeds, especially in whole kernels (or nuts), so it is extremely difficult to collect a sample that accurately represents the mean batch concentration. Sample variance studies and sampling plans have been published for select mycotoxins such as aflatoxin, fumonisin, and deoxynivalenol, emphasizing the importance of sample selection, sample size, and the number of incremental samples. For meaningful data to be generated from surveillance studies, representative samples should be collected from carefully selected populations (batches or lots) of food that, in turn, should be representative of clearly defined locations (e.g. a country, a region within a country). Although sampling variability is unavoidable, it is essential that the precision of the sampling plan be clearly defined and be considered acceptable by those responsible for interpreting and reporting the surveillance data. The factors influencing variability are detailed here, with reference to both major mycotoxins and major commodities. Sampling of large bag stacks, bulk shipments, and domestic supplies is discussed. Sampling plans currently accepted in international trade are outlined. Acceptance sampling plans and the variabilities that affect operating characteristic curves of such plans are also detailed. The constraints and issues related to the sampling of harvested crops within subsistence farming areas are also discussed in this chapter, as are the essential rules of sample labelling and storage. The chapter concludes with a short section on sample preparation methods.
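
    The dominance of sampling error for unevenly distributed toxins can be demonstrated with a toy simulation: when contamination is concentrated in rare, highly contaminated kernels, the variability of the estimated lot mean falls as the number of incremental kernels per composite sample grows. The contamination numbers below are illustrative, not regulatory or survey data:

```python
import random

random.seed(42)

# Toy lot: 1 kernel in 1,000 carries 20,000 ppb of toxin, the rest carry
# none, so the true lot mean is 20 ppb (illustrative numbers only).
def kernel_ppb():
    return 20000.0 if random.random() < 0.001 else 0.0

def sample_mean(n_kernels):
    """Mean concentration of one composite sample of n_kernels kernels."""
    return sum(kernel_ppb() for _ in range(n_kernels)) / n_kernels

def spread(n_kernels, trials=300):
    """Standard deviation of the sample mean over repeated sampling."""
    means = [sample_mean(n_kernels) for _ in range(trials)]
    mu = sum(means) / trials
    return (sum((m - mu) ** 2 for m in means) / trials) ** 0.5

small = spread(500)      # small incremental sample
large = spread(10000)    # large composite sample
```

    With 1 kernel in 1,000 contaminated, a 500-kernel sample frequently misses contamination entirely, while a 10,000-kernel composite gives a far steadier estimate of the 20 ppb lot mean, which is why sample size and incremental sampling dominate these plans.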

  13. Catching Stardust and Bringing it Home: The Astronomical Importance of Sample Return

    NASA Astrophysics Data System (ADS)

    Brownlee, D.

    2002-12-01

    The return of lunar samples by the Apollo program provided the first opportunity to perform detailed laboratory studies of ancient solid materials from a known astronomical body. The highly detailed study of the samples, using the best available laboratory instruments and techniques, revolutionized our understanding of the Moon and provided fundamental insight into the remarkable and violent processes that occur early in the history of moons and terrestrial planets. This type of astronomical paleontology is only possible with samples, and yet the last US sample return was made by Apollo 17, over thirty years ago! The NASA Stardust mission began a new era of sample missions with its 1999 launch to retrieve samples from the short-period comet Wild 2. Genesis (a solar wind collector) was launched in 2001, the Japanese MUSES-C asteroid sample return mission will launch in 2003, and Mars sample return missions are under study. All of these missions will use sophisticated ground-based instrumentation to provide types of information that cannot be obtained by astronomical and spacecraft remote sensing methods. In the case of Stardust, the goal is to determine the fundamental nature of the initial solid building blocks of solar systems at atomic-scale spatial resolution. The samples returned by the mission will come from the Kuiper Belt region and are probably composed of submicron silicate and organic materials of both presolar and nebular origin. The detailed records contained in the elemental, chemical, isotopic, and mineralogical composition of these tiny components can only be appropriately explored with the full power, precision, and flexibility of laboratory instrumentation. Laboratory instrumentation has the advantage that it is state-of-the-art and is not limited by serious considerations of power, mass, cost, or even reliability. 
The comparison of the comet sample, accumulated beyond Neptune, with asteroidal meteorites that accumulated just beyond the

  14. Sampling high-altitude and stratified mating flights of red imported fire ant.

    PubMed

    Fritz, Gary N; Fritz, Ann H; Vander Meer, Robert K

    2011-05-01

    With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens and males during mating flights at altitudinal intervals reaching as high as ~140 m. Our trapping system uses an electric winch and a 1.2-m spindle bolted to a swiveling platform. The winch dispenses up to 183 m of Kevlar-core nylon rope, and the spindle stores 10 panels (0.9 by 4.6 m each) of nylon tulle impregnated with Tangle-Trap. The panels can be attached to the rope at various intervals and hoisted into the air using a 3-m-diameter, helium-filled balloon. Raising or lowering all 10 panels takes approximately 15-20 min. This trap should also be useful for altitudinal sampling of other insects of medical importance.

  15. System and Method for Isolation of Samples

    NASA Technical Reports Server (NTRS)

    Zhang, Ye (Inventor); Wu, Honglu (Inventor)

    2014-01-01

    Systems and methods for isolating samples are provided. The system comprises a first membrane and a second membrane disposed within an enclosure. First and second reservoirs can also be disposed within the enclosure and adapted to contain one or more reagents therein. A first valve can be disposed within the enclosure and in fluid communication with the first reservoir, the second reservoir, or both. The first valve can also be in fluid communication with the first or second membranes or both. The first valve can be adapted to selectively regulate the flow of the reagents from the first reservoir, through at least one of the first and second membranes, and into the second reservoir.

  16. [Progress in sample preparation and analytical methods for trace polar small molecules in complex samples].

    PubMed

    Zhang, Qianchun; Luo, Xialin; Li, Gongke; Xiao, Xiaohua

    2015-09-01

    Small polar molecules such as nucleosides, amines, and amino acids are important analytes in the biological, food, environmental, and other fields. It is necessary to develop efficient sample preparation and sensitive analytical methods for rapid analysis of these polar small molecules in complex matrices. Typical materials used in sample preparation, including silica, polymers, carbon, and boric acid, are introduced in this paper. Meanwhile, the applications and developments of analytical methods for polar small molecules, such as reversed-phase liquid chromatography and hydrophilic interaction chromatography, are also reviewed.

  17. Sample preparation methods for determination of drugs of abuse in hair samples: A review.

    PubMed

    Vogliardi, Susanna; Tucci, Marianna; Stocchero, Giulia; Ferrara, Santo Davide; Favretto, Donata

    2015-02-01

    Hair analysis has assumed increasing importance in the determination of substances of abuse, in both clinical and forensic toxicology investigations. Hair analysis offers particular advantages over other biological matrices (blood and urine), including a larger window of detection, ease of collection, and sample stability. In the present work, an overview of sample preparation techniques for the determination of substances of abuse in hair is provided, specifically regarding the principal steps in hair sample treatment: decontamination, extraction, and purification. For this purpose, a survey of publications found in the MEDLINE database from 2000 to date was conducted. The most widely consumed substances of abuse and psychotropic drugs were considered. Trends in the simplification of hair sample preparation, washing procedures, and cleanup methods are discussed. Alternative sample extraction techniques, such as head-space solid-phase microextraction (HS-SPME), supercritical fluid extraction (SFE), and molecularly imprinted polymers (MIP), are also reported.

  18. A Review of Methods for Detecting Melamine in Food Samples.

    PubMed

    Lu, Yang; Xia, Yinqiang; Liu, Guozhen; Pan, Mingfei; Li, Mengjuan; Lee, Nanju Alice; Wang, Shuo

    2017-01-02

    Melamine is a synthetic chemical used in the manufacture of resins, pigments, and superplasticizers. Human beings can be exposed to melamine through various sources such as migration from related products into foods, pesticide contamination, and illegal addition to foods. Toxicity studies suggest that prolonged consumption of melamine could lead to the formation of kidney stones or even death. Therefore, reliable and accurate detection methods are essential to prevent human exposure to melamine. Sample preparation is of critical importance, since it could directly affect the performance of analytical methods. Some methods for the detection of melamine include instrumental analysis, immunoassays, and sensor methods. In this paper, we have summarized the state-of-the-art methods used for food sample preparation as well as the various detection techniques available for melamine. Combinations of multiple techniques and new materials used in the detection of melamine have also been reviewed. Finally, future perspectives on the applications of microfluidic devices have also been provided.

  19. Efficiency of snake sampling methods in the Brazilian semiarid region.

    PubMed

    Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z

    2013-09-01

    The choice of sampling methods is a crucial step in every field survey in herpetology. In countries where time and financial support are limited, this choice is critical. The methods used to sample snakes often lack objective criteria, and tradition has apparently carried more weight than evidence when making the choice. Consequently, studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area in Brazil. We compared the efficacy of each method based on the cost-benefit relationship regarding the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated and were not complementary to the other methods in terms of species abundance and assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.

  20. Evaluation of Sampling Methods for Bacillus Spore ...

    EPA Pesticide Factsheets

    Journal Article Following a wide area release of biological materials, mapping the extent of contamination is essential for orderly response and decontamination operations. HVAC filters process large volumes of air and therefore collect highly representative particulate samples in buildings. HVAC filter extraction may have great utility in rapidly estimating the extent of building contamination following a large-scale incident. However, until now, no studies have been conducted comparing the two most appropriate sampling approaches for HVAC filter materials: direct extraction and vacuum-based sampling.

  1. Characterizing lentic freshwater fish assemblages using multiple sampling methods

    USGS Publications Warehouse

    Fischer, Jesse R.; Quist, Michael

    2014-01-01

    Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated that appreciable benefits over relatively few gears (e.g., to four) used in optimal seasons were not present. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.
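
    The benefit of combining complementary gears can be sketched with a toy detection model in which each species has its own detection probability per deployment for each gear; the gear names and probabilities below are hypothetical, not values from the study:

```python
import random

random.seed(5)

# Toy assemblage: 20 species, each with its own per-deployment detection
# probability for three gears (names and probabilities are hypothetical).
N_SPECIES, DEPLOYS = 20, 8
gears = {
    "night_electrofishing": [random.uniform(0.0, 0.6) for _ in range(N_SPECIES)],
    "fyke_net":             [random.uniform(0.0, 0.4) for _ in range(N_SPECIES)],
    "benthic_trawl":        [random.uniform(0.0, 0.3) for _ in range(N_SPECIES)],
}

def detections(gear):
    """Set of species detected by one gear over all its deployments."""
    return {s for s, p in enumerate(gears[gear])
            if any(random.random() < p for _ in range(DEPLOYS))}

per_gear = {g: detections(g) for g in gears}
one  = len(per_gear["night_electrofishing"])       # richness from a single gear
all3 = len(set.union(*per_gear.values()))          # richness from the union
```

    Because each gear misses a different subset of species, the union of a few complementary gears approaches total richness much faster than repeated deployments of any single gear.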

  2. The experience sampling method: Investigating students' affective experience

    NASA Astrophysics Data System (ADS)

    Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.

    2013-01-01

    Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).

  3. A new approach to importance sampling for the simulation of false alarms. [in radar systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1987-01-01

In this paper, a modified importance sampling technique for improving the convergence of importance sampling is given. By using this approach to estimate low false alarm rates in radar simulations, the number of Monte Carlo runs can be reduced significantly. For one-dimensional exponential, Weibull, and Rayleigh distributions, a uniformly minimum variance unbiased estimator is obtained. For the Gaussian distribution, the estimator in this approach is uniformly better than that of the previously known importance sampling approach. For a cell averaging system, by combining this technique with group sampling, the reduction in Monte Carlo runs for a reference cell of 20 and a false alarm rate of 1E-6 is on the order of 170 as compared to the previously known importance sampling approach.
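The basic idea of importance sampling for low false-alarm rates can be sketched generically (this is a textbook mean-shifted Gaussian example, not the authors' modified estimator; the threshold and sample count are illustrative): draw the detector statistic from a proposal density centred on the rare region, and reweight each exceedance by the likelihood ratio.

```python
# Minimal importance-sampling sketch for a rare false-alarm probability
# P(X > t) with X ~ N(0, 1), using a mean-shifted proposal N(t, 1).
import math
import random

def false_alarm_is(t, n, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        x = rng.gauss(t, 1.0)            # proposal draw, centred on the threshold
        if x > t:
            # likelihood ratio: N(0,1) density / N(t,1) density
            w = math.exp(-0.5 * x * x + 0.5 * (x - t) ** 2)
            total += w
    return total / n

t = 4.75  # P(X > t) is roughly 1e-6 for a standard Gaussian
est = false_alarm_is(t, 100_000)
print(f"IS estimate of the false-alarm rate: {est:.3g}")
```

With plain Monte Carlo, estimating a 1E-6 probability would need far more than 100,000 runs; here roughly half the proposal draws exceed the threshold and contribute weighted mass, which is the source of the run-count reduction the abstract describes.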

  4. Rapid detection and differentiation of important Campylobacter spp. in poultry samples by dot blot and PCR.

    PubMed

    Fontanot, Marco; Iacumin, Lucilla; Cecchini, Francesca; Comi, Giuseppe; Manzano, Marisa

    2014-10-01

The detection of Campylobacter, the most commonly reported cause of foodborne gastroenteritis in the European Union, is very important for human health. The most commonly recognised risk factor for infection is the handling and/or consumption of undercooked poultry meat. The methods typically applied to evaluate the presence/absence of Campylobacter in food samples are direct plating and/or enrichment culture based on the Horizontal Method for Detection and Enumeration of Campylobacter spp. (ISO 10272-1B: 2006) and PCR. Molecular methods also allow for the detection of cells that are viable but cannot be cultivated on agar media, and they decrease the time required for species identification. The current study proposes the use of two molecular methods for species identification: dot blot and PCR. The dot blot method had a sensitivity of 25 ng for detection of DNA extracted from a pure culture using a digoxigenin-labelled probe for hybridisation; the target DNA was extracted from the enrichment broth at 24 h. PCR was performed using a pair of sensitive and specific primers for the detection of Campylobacter jejuni and Campylobacter coli after 24 h of enrichment in Preston broth. The initial samples were contaminated with 5 × 10^1 C. jejuni cells/g and 1.5 × 10^2 C. coli cells/g; thus the number of cells present in the enrichment broth at 0 h was 1 or 3 cells/g, respectively.

  5. System and method for extracting a sample from a surface

    DOEpatents

    Van Berkel, Gary; Covey, Thomas

    2015-06-23

    A system and method is disclosed for extracting a sample from a sample surface. A sample is provided and a sample surface receives the sample which is deposited on the sample surface. A hydrophobic material is applied to the sample surface, and one or more devices are configured to dispense a liquid on the sample, the liquid dissolving the sample to form a dissolved sample material, and the one or more devices are configured to extract the dissolved sample material from the sample surface.

  6. GeoLab Concept: The Importance of Sample Selection During Long Duration Human Exploration Mission

    NASA Technical Reports Server (NTRS)

    Calaway, M. J.; Evans, C. A.; Bell, M. S.; Graff, T. G.

    2011-01-01

    In the future when humans explore planetary surfaces on the Moon, Mars, and asteroids or beyond, the return of geologic samples to Earth will be a high priority for human spaceflight operations. All future sample return missions will have strict down-mass and volume requirements; methods for in-situ sample assessment and prioritization will be critical for selecting the best samples for return-to-Earth.

  7. Examination of Hydrate Formation Methods: Trying to Create Representative Samples

    SciTech Connect

    Kneafsey, T.J.; Rees, E.V.L.; Nakagawa, S.; Kwon, T.-H.

    2011-04-01

Forming representative gas hydrate-bearing laboratory samples is important so that the properties of these materials may be measured, while controlling the composition and other variables. Natural samples are rare, and have often experienced pressure and temperature changes that may affect the property to be measured [Waite et al., 2008]. Forming methane hydrate samples in the laboratory has been done a number of ways, each having advantages and disadvantages. The ice-to-hydrate method [Stern et al., 1996] contacts melting ice with methane at the appropriate pressure to form hydrate. The hydrate can then be crushed and mixed with mineral grains under controlled conditions, and then compacted to create laboratory samples of methane hydrate in a mineral medium. The hydrate in these samples will be part of the load-bearing frame of the medium. In the excess gas method [Handa and Stupin, 1992], water is distributed throughout a mineral medium (e.g. packed moist sand, drained sand, moistened silica gel, other porous media) and the mixture is brought to hydrate-stable conditions (chilled and pressurized with gas), allowing hydrate to form. This method typically produces grain-cementing hydrate from pendular water in sand [Waite et al., 2004]. In the dissolved gas method [Tohidi et al., 2002], water with sufficient dissolved guest molecules is brought to hydrate-stable conditions where hydrate forms. In the laboratory, this can be done by pre-dissolving the gas of interest in water and then introducing it to the sample under the appropriate conditions. With this method, it is easier to form hydrate from more soluble gases such as carbon dioxide. It is thought that this method more closely simulates the way most natural gas hydrate has formed. Laboratory implementation, however, is difficult, and sample formation is prohibitively time consuming [Minagawa et al., 2005; Spangenberg and Kulenkampff, 2005]. In another version of this technique, a specified quantity of gas

  8. The jigsaw puzzle of sequence phenotype inference: Piecing together Shannon entropy, importance sampling, and Empirical Bayes.

    PubMed

    Shreif, Zeina; Striegel, Deborah A; Periwal, Vipul

    2015-09-07

A nucleotide sequence 35 base pairs long can take 1,180,591,620,717,411,303,424 possible values. One example of a systems biology dataset, protein binding microarrays, contains activity data from about 40,000 such sequences. The discrepancy between the number of possible configurations and the available activities is enormous. Thus, although systems biology datasets are large in absolute terms, they oftentimes require methods developed for rare events, owing to the combinatorial increase in the number of possible configurations of biological systems. A plethora of techniques for handling large datasets, such as Empirical Bayes, or rare events, such as importance sampling, have been developed in the literature, but these cannot always be simultaneously utilized. Here we introduce a principled approach to Empirical Bayes based on importance sampling, information theory, and theoretical physics in the general context of sequence phenotype model induction. We present the analytical calculations that underlie our approach. We demonstrate the computational efficiency of the approach on concrete examples, and demonstrate its efficacy by applying the theory to publicly available protein binding microarray transcription factor datasets and to data on synthetic cAMP-regulated enhancer sequences. As further demonstrations, we find transcription factor binding motifs, predict the activity of new sequences, and extract the locations of transcription factor binding sites. In summary, we present a novel method that is efficient (requiring minimal computational time and reasonable amounts of memory), has high predictive power comparable with that of models with hundreds of parameters, and has a limited number of optimized parameters, proportional to the sequence length.
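The opening count is easy to verify: a 35-base sequence over the alphabet {A, C, G, T} has 4^35 = 2^70 possible values, which is the figure quoted above.

```python
# Verify the sequence count quoted in the abstract: 4**35 == 2**70.
n_sequences = 4 ** 35
print(n_sequences)  # 1180591620717411303424

# The gap the abstract describes: possible sequences per measured one,
# given roughly 40,000 assayed sequences on a protein binding microarray.
print(f"sequences per measurement: {n_sequences / 40_000:.2e}")
```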

  9. Blood Sampling Seasonality as an Important Preanalytical Factor for Assessment of Vitamin D Status

    PubMed Central

    Bonelli, Patrizia; Buonocore, Ruggero; Aloe, Rosalia

    2016-01-01

Background The measurement of vitamin D is now commonplace for preventing osteoporosis and restoring an appropriate concentration effective to counteract the occurrence of other human disorders. The aim of this study was to establish whether blood sampling seasonality may influence total vitamin D concentration in a general population of unselected Italian outpatients. Methods We performed a retrospective search in the laboratory information system of the University Hospital of Parma (Italy, temperate climate), to identify the values of total serum vitamin D (25-hydroxyvitamin D) measured in outpatients aged 18 years and older, who were referred for routine health check-up during the entire year 2014. Results The study population consisted of 11,150 outpatients (median age 62 years; 8592 women and 2558 men). The concentration of vitamin D was consistently lower in samples collected in Winter than in the other three seasons. The frequency of subjects with vitamin D deficiency was approximately double in samples drawn in Winter and Spring than in Summer and Autumn. In the multivariate analysis, the concentration of total vitamin D was found to be independently associated with sex and season of blood testing, but not with the age of the patients. Conclusions According to these findings, blood sampling seasonality should be regarded as an important preanalytical factor in vitamin D assessment. It is also reasonable to suggest that the amount of total vitamin D synthesized during the summer should be high enough to maintain the levels > 50 nmol/L throughout the remaining part of the year. PMID:28356869

  10. Exploration and Sampling Methods for Borrow Areas

    DTIC Science & Technology

    1990-12-01

The principal geomorphic features associated with stratified deposits are kames, eskers, and proglacial outwash material. Kames are moundlike ... deposits without obtaining core samples or geophysical data. Eskers are sinuous ridgelike features that form in channels, tunnels, and crevasses in ... the ice by meltwater transport and deposition of glacial material. Eskers are variable in dimensions but, at their largest, can be up to 100 m high

  11. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline...

  12. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline...

  13. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 17 2014-07-01 2014-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline...

  14. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline...

  15. 40 CFR 80.335 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Sulfur Sampling, Testing and Retention Requirements for Refiners and Importers § 80.335 What gasoline...

  16. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

Information about the sources of supply, trafficking routes, distribution patterns, and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount of methamphetamine seized throughout the country was very small. Therefore, finding links between samples is more important there than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses a gas chromatography-flame ionization detector (GC-FID), a DB-5 column, and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are very complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collected the results into an Excel file and then corrected the retention time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99. When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical.
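The similarity step can be sketched in a few lines (a hedged illustration with invented peak areas, not the authors' VBA modules): once two chromatograms are aligned, their impurity profiles are compared by the Pearson correlation coefficient, with values above 0.99 taken to indicate a common origin.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length peak profiles."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical aligned impurity peak areas for two seized samples.
sample_a = [120.0, 35.5, 410.2, 15.8, 88.0, 5.2]
sample_b = [118.3, 36.1, 405.7, 16.2, 90.4, 5.0]

r = pearson(sample_a, sample_b)
print(f"Pearson r = {r:.4f}")  # r > 0.99 would suggest a common origin
```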

  17. A whole-path importance-sampling scheme for Feynman path integral calculations of absolute partition functions and free energies.

    PubMed

    Mielke, Steven L; Truhlar, Donald G

    2016-01-21

    Using Feynman path integrals, a molecular partition function can be written as a double integral with the inner integral involving all closed paths centered at a given molecular configuration, and the outer integral involving all possible molecular configurations. In previous work employing Monte Carlo methods to evaluate such partition functions, we presented schemes for importance sampling and stratification in the molecular configurations that constitute the path centroids, but we relied on free-particle paths for sampling the path integrals. At low temperatures, the path sampling is expensive because the paths can travel far from the centroid configuration. We now present a scheme for importance sampling of whole Feynman paths based on harmonic information from an instantaneous normal mode calculation at the centroid configuration, which we refer to as harmonically guided whole-path importance sampling (WPIS). We obtain paths conforming to our chosen importance function by rejection sampling from a distribution of free-particle paths. Sample calculations on CH4 demonstrate that at a temperature of 200 K, about 99.9% of the free-particle paths can be rejected without integration, and at 300 K, about 98% can be rejected. We also show that it is typically possible to reduce the overhead associated with the WPIS scheme by sampling the paths using a significantly lower-order path discretization than that which is needed to converge the partition function.
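The rejection step described above — drawing from a chosen importance density by accepting or rejecting draws from a simpler proposal — can be illustrated with a generic one-dimensional sketch (the densities and widths below are invented for illustration; the paper rejects whole free-particle paths against a harmonically guided importance function, not scalars).

```python
# Generic rejection sampling: draw from a narrow "guided" target density
# f = N(0, sigma^2) using a broader "free" proposal g = N(0, 1).
import math
import random

def rejection_sample(n, sigma_target=0.5, seed=0):
    rng = random.Random(seed)
    accepted, tried = [], 0
    # M bounds f(x)/g(x); for sigma < 1 the ratio peaks at x = 0,
    # where it equals 1/sigma.
    M = 1.0 / sigma_target
    while len(accepted) < n:
        tried += 1
        x = rng.gauss(0.0, 1.0)  # proposal draw
        f = math.exp(-x * x / (2 * sigma_target ** 2)) / (sigma_target * math.sqrt(2 * math.pi))
        g = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
        if rng.random() < f / (M * g):  # accept with probability f/(M g)
            accepted.append(x)
    return accepted, tried

samples, tried = rejection_sample(10_000)
print(f"acceptance rate: {10_000 / tried:.2f}")  # expected ~ 1/M = 0.50
```

The acceptance rate is 1/M; the paper's 99.9% rejection at 200 K corresponds to a target density far more concentrated, relative to the free-particle proposal, than this toy case.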

  18. Coalescent: an open-science framework for importance sampling in coalescent theory

    PubMed Central

Tewari, Susanta; Spouge, John L.

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only

  19. Coalescent: an open-science framework for importance sampling in coalescent theory.

    PubMed

    Tewari, Susanta; Spouge, John L

    2015-01-01

    Background. In coalescent theory, computer programs often use importance sampling to calculate likelihoods and other statistical quantities. An importance sampling scheme can exploit human intuition to improve statistical efficiency of computations, but unfortunately, in the absence of general computer frameworks on importance sampling, researchers often struggle to translate new sampling schemes computationally or benchmark against different schemes, in a manner that is reliable and maintainable. Moreover, most studies use computer programs lacking a convenient user interface or the flexibility to meet the current demands of open science. In particular, current computer frameworks can only evaluate the efficiency of a single importance sampling scheme or compare the efficiencies of different schemes in an ad hoc manner. Results. We have designed a general framework (http://coalescent.sourceforge.net; language: Java; License: GPLv3) for importance sampling that computes likelihoods under the standard neutral coalescent model of a single, well-mixed population of constant size over time following infinite sites model of mutation. The framework models the necessary core concepts, comes integrated with several data sets of varying size, implements the standard competing proposals, and integrates tightly with our previous framework for calculating exact probabilities. For a given dataset, it computes the likelihood and provides the maximum likelihood estimate of the mutation parameter. Well-known benchmarks in the coalescent literature validate the accuracy of the framework. The framework provides an intuitive user interface with minimal clutter. For performance, the framework switches automatically to modern multicore hardware, if available. It runs on three major platforms (Windows, Mac and Linux). Extensive tests and coverage make the framework reliable and maintainable. Conclusions. In coalescent theory, many studies of computational efficiency consider only
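As a small related calculation (hedged: this is Watterson's classical moment estimator under the infinite-sites model, not the framework's importance-sampling likelihood machinery), the mutation parameter can be estimated from the number of segregating sites S in a sample of n sequences via E[S] = theta * a_n, with a_n = sum_{i=1}^{n-1} 1/i.

```python
# Watterson's estimator of the scaled mutation rate theta under the
# infinite-sites model: theta_W = S / a_n, a_n = sum_{i=1}^{n-1} 1/i.
def watterson_theta(segregating_sites, sample_size):
    a_n = sum(1.0 / i for i in range(1, sample_size))
    return segregating_sites / a_n

# Hypothetical data: 12 segregating sites in a sample of 10 sequences.
print(f"theta_W = {watterson_theta(12, 10):.3f}")
```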

  20. Modified electrokinetic sample injection method in chromatography and electrophoresis analysis

    DOEpatents

    Davidson, J. Courtney; Balch, Joseph W.

    2001-01-01

    A sample injection method for horizontal configured multiple chromatography or electrophoresis units, each containing a number of separation/analysis channels, that enables efficient introduction of analyte samples. This method for loading when taken in conjunction with horizontal microchannels allows much reduced sample volumes and a means of sample stacking to greatly reduce the concentration of the sample. This reduction in the amount of sample can lead to great cost savings in sample preparation, particularly in massively parallel applications such as DNA sequencing. The essence of this method is in preparation of the input of the separation channel, the physical sample introduction, and subsequent removal of excess material. By this method, sample volumes of 100 nanoliter to 2 microliters have been used successfully, compared to the typical 5 microliters of sample required by the prior separation/analysis method.

  1. Surface Sampling Methods for Bacillus anthracis Spore Contamination

    PubMed Central

    Hein, Misty J.; Taylor, Lauralynn; Curwin, Brian D.; Kinnes, Gregory M.; Seitz, Teresa A.; Popovic, Tanja; Holmes, Harvey T.; Kellum, Molly E.; McAllister, Sigrid K.; Whaley, David N.; Tupin, Edward A.; Walker, Timothy; Freed, Jennifer A.; Small, Dorothy S.; Klusaritz, Brian; Bridges, John H.

    2002-01-01

    During an investigation conducted December 17–20, 2001, we collected environmental samples from a U.S. postal facility in Washington, D.C., known to be extensively contaminated with Bacillus anthracis spores. Because methods for collecting and analyzing B. anthracis spores have not yet been validated, our objective was to compare the relative effectiveness of sampling methods used for collecting spores from contaminated surfaces. Comparison of wipe, wet and dry swab, and HEPA vacuum sock samples on nonporous surfaces indicated good agreement between results with HEPA vacuum and wipe samples. However, results from HEPA vacuum sock and wipe samples agreed poorly with the swab samples. Dry swabs failed to detect spores >75% of the time they were detected by wipe and HEPA vacuum samples. Wipe samples collected after HEPA vacuum samples and HEPA vacuum samples after wipe samples indicated that neither method completely removed spores from the sampled surfaces. PMID:12396930

  2. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  3. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  4. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Methods of sample analysis. 58.812 Section 58.812... Procedures § 58.812 Methods of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL, as issued by the USDA,...

  5. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Method of sample analysis. 58.245 Section 58.245... Procedures § 58.245 Method of sample analysis. Samples shall be tested according to the applicable methods of laboratory analysis contained in either DA Instruction 918-RL as issued by the USDA, Agricultural...

  6. Microfluidic DNA sample preparation method and device

    DOEpatents

    Krulevitch, Peter A.; Miles, Robin R.; Wang, Xiao-Bo; Mariella, Raymond P.; Gascoyne, Peter R. C.; Balch, Joseph W.

    2002-01-01

    Manipulation of DNA molecules in solution has become an essential aspect of genetic analyses used for biomedical assays, the identification of hazardous bacterial agents, and in decoding the human genome. Currently, most of the steps involved in preparing a DNA sample for analysis are performed manually and are time, labor, and equipment intensive. These steps include extraction of the DNA from spores or cells, separation of the DNA from other particles and molecules in the solution (e.g. dust, smoke, cell/spore debris, and proteins), and separation of the DNA itself into strands of specific lengths. Dielectrophoresis (DEP), a phenomenon whereby polarizable particles move in response to a gradient in electric field, can be used to manipulate and separate DNA in an automated fashion, considerably reducing the time and expense involved in DNA analyses, as well as allowing for the miniaturization of DNA analysis instruments. These applications include direct transport of DNA, trapping of DNA to allow for its separation from other particles or molecules in the solution, and the separation of DNA into strands of varying lengths.

  7. Molecular method for the diagnosis of imported pediatric malaria.

    PubMed

    Delhaes Jeanne, L; Berry, A; Dutoit, E; Leclerc, F; Beaudou, J; Leteurtre, S; Camus, D; Benoit-Vical, F

    2010-02-01

    Malaria is a polymorphous disease; it can be life threatening especially for children. We report a case of imported malaria in a boy, illustrating the epidemiological and clinical aspects of severe pediatric malaria. In this case real-time PCR was used to quantify Plasmodium falciparum DNA levels, to monitor the evolution under treatment, and to determine genetic mutations involved in chloroquine resistance. The major epidemiological features of imported malaria, and the difficulty to diagnose childhood severe malaria are described. The contribution of molecular methods for the diagnosis of imported malaria is discussed.

  8. Method for sampling sub-micron particles

    DOEpatents

    Gay, Don D.; McMillan, William G.

    1985-01-01

Apparatus and method steps for collecting sub-micron sized particles include a collection chamber and cryogenic cooling. The cooling is accomplished by coil tubing carrying nitrogen in liquid form, with the liquid nitrogen changing to the gas phase before exiting from the collection chamber in the tubing. Standard filters are used to filter out particles of diameter greater than or equal to 0.3 microns; however the present invention is used to trap particles of less than 0.3 micron in diameter. A blower draws air to said collection chamber through a filter which filters particles with diameters greater than or equal to 0.3 micron. The air is then cryogenically cooled so that moisture and sub-micron sized particles in the air condense into ice on the coil. The coil is then heated so that the ice melts, and the liquid is then drawn off and passed through a Buchner funnel where the liquid is passed through a Nuclepore membrane. A vacuum draws the liquid through the Nuclepore membrane, with the Nuclepore membrane trapping sub-micron sized particles therein. The Nuclepore membrane is then covered on its top and bottom surfaces with sheets of Mylar® and the assembly is then crushed into a pellet. This effectively traps the sub-micron sized particles for later analysis.

  9. [A membrane filter sampling method for determining microbial air pollution].

    PubMed

    Cherneva, P; Kiranova, A

    1996-01-01

The method contributes to exposure assessment and to checking compliance with standards for organic dusts. It covers both the sampling procedure and the analysis technique for determining the concentration of microbial pollution of the air. It is based on filtering a quantity of air through a membrane filter, which is then processed to cultivate microbial colonies on its surface. Results are expressed as the number of microbial colonies per unit volume of air. The method makes it possible to select and vary the filtered air volume, to determine the respirable fraction, to assess personal exposure, and to determine microbial pollution simultaneously with other important parameters of particulate air pollutants (metals, fibres and others).

  10. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W [West Richland, WA; Wise, Barry M [Manson, WA

    2002-01-01

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  11. Methods for characterizing, classifying, and identifying unknowns in samples

    DOEpatents

    Grate, Jay W.; Wise, Barry M.

    2003-08-12

    Disclosed is a method for taking the data generated from an array of responses from a multichannel instrument, and determining the characteristics of a chemical in the sample without the necessity of calibrating or training the instrument with known samples containing the same chemical. The characteristics determined by the method are then used to classify and identify the chemical in the sample. The method can also be used to quantify the concentration of the chemical in the sample.

  12. Sediment sampling and processing methods in Hungary, and possible improvements

    NASA Astrophysics Data System (ADS)

    Tamas, Eniko Anna; Koch, Daniel; Varga, Gyorgy

    2016-04-01

    The importance of monitoring sediment processes is unquestionable: the sediment balance of regulated rivers suffered substantial alterations in the past century, affecting navigation, energy production, fish habitats and floodplain ecosystems alike; infiltration times to our drinking water wells have shortened, exposing them to eventual pollution events and making them vulnerable; and sediment-attached contaminants accumulate in floodplains and reservoirs, threatening our healthy environment. The changes in flood characteristics and rating curves of our rivers are regularly researched and described, involving state-of-the-art measurement methods, modeling tools and traditional statistics. Sediment processes, however, are much less known. Unlike the investigation of flow processes, sediment-related research is scarce, partly due to outdated methodology and a poor database background in this specific field. Sediment-related data, information and analyses form an important and integral part of civil engineering in relation to rivers all over the world. Regarding the second largest river of Europe, the Danube, it is widely known in the expert community, and has long been discussed at expert forums, that the sediment balance of the river has changed drastically over the past century. Sediment monitoring on the Danube started as early as the end of the 19th century, with scattered measurements carried out. Regular sediment sampling was developed in the first half of the 20th century all along the river, with different station densities and monitoring frequencies in different countries. After the first few decades of regular sampling, the concept of (mainly industrial) development changed along the river and data needs changed as well; furthermore, the complicated and inexact methods of sampling bed load on the alluvial reach of the river were not developed further. The frequency of suspended sediment sampling is very low along the river

  13. Post awakening salivary cortisol secretion and trait well-being: The importance of sample timing accuracy.

    PubMed

    Smyth, Nina; Thorn, Lisa; Hucklebridge, Frank; Evans, Phil; Clow, Angela

    2015-08-01

    Indices of post awakening cortisol secretion (PACS) include the rise in cortisol (cortisol awakening response: CAR) and overall cortisol concentrations (e.g., area under the curve with reference to ground: AUCg) in the first 30-45 min. Both are commonly investigated in relation to psychosocial variables. Although sampling within the domestic setting is ecologically valid, participant non-adherence to the required timing protocol results in erroneous measurement of PACS, and this may explain discrepancies in the literature linking these measures to trait well-being (TWB). We have previously shown that delays of little over 5 min (between awakening and the start of sampling) result in erroneous CAR estimates. In this study, we report for the first time on the negative impact of sample timing inaccuracy (verified by electronic monitoring) on the ability to detect significant relationships between PACS and TWB when measured in the domestic setting. Healthy females (N=49, 20.5±2.8 years) selected for differences in TWB collected saliva samples (S1-4) on 4 days at 0, 15, 30, and 45 min post awakening to determine PACS. Adherence to the sampling protocol was objectively monitored using a combination of electronic estimates of awakening (actigraphy) and sampling times (track caps). Relationships between PACS and TWB were found to depend on sample timing accuracy. Lower TWB was associated with higher post awakening cortisol AUCg in proportion to the mean sample timing accuracy (p<.005). There was no association between TWB and the CAR, even taking into account sample timing accuracy. These results highlight the importance of careful electronic monitoring of participant adherence for measurement of PACS in the domestic setting. Mean sample timing inaccuracy, mainly associated with delays of >5 min between awakening and collection of sample 1 (median=8 min delay), negatively impacts the sensitivity of analysis to detect associations between PACS and TWB.
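    The AUCg index named above (area under the curve with reference to ground) is conventionally computed from the four timed samples with the trapezoid rule. A minimal sketch, with hypothetical cortisol values rather than data from the study:

    ```python
    # Area under the curve with respect to ground (AUCg) for post-awakening
    # cortisol, computed with the trapezoid rule.  The sample values below
    # are invented for illustration, not data from the study.

    def aucg(times_min, cortisol):
        """Trapezoidal area under the curve, relative to zero ('ground')."""
        if len(times_min) != len(cortisol):
            raise ValueError("times and values must align")
        area = 0.0
        for i in range(1, len(times_min)):
            dt = times_min[i] - times_min[i - 1]
            area += dt * (cortisol[i] + cortisol[i - 1]) / 2.0
        return area

    # Four samples at 0, 15, 30, 45 min post awakening (nmol/L).
    times = [0, 15, 30, 45]
    values = [12.0, 18.0, 16.0, 13.0]
    print(aucg(times, values))  # area in nmol/L x min
    ```

    A sampling delay shifts every entry of `times`, which is exactly why timing inaccuracy distorts this index.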

  14. Cool walking: a new Markov chain Monte Carlo sampling method.

    PubMed

    Brown, Scott; Head-Gordon, Teresa

    2003-01-15

    Effective relaxation processes for difficult systems like proteins or spin glasses require special simulation techniques that permit barrier crossing to ensure ergodic sampling. Numerous adaptations of the venerable Metropolis Monte Carlo (MMC) algorithm have been proposed to improve its sampling efficiency, including various hybrid Monte Carlo (HMC) schemes, and methods designed specifically for overcoming quasi-ergodicity problems such as Jump Walking (J-Walking), Smart Walking (S-Walking), Smart Darting, and Parallel Tempering. We present an alternative to these approaches that we call Cool Walking, or C-Walking. In C-Walking two Markov chains are propagated in tandem, one at a high (ergodic) temperature and the other at a low temperature. Nonlocal trial moves for the low temperature walker are generated by first sampling from the high-temperature distribution, then performing a statistical quenching process on the sampled configuration to generate a C-Walking jump move. C-Walking needs only one high-temperature walker, satisfies detailed balance, and offers the important practical advantage that the high and low-temperature walkers can be run in tandem with minimal degradation of sampling due to the presence of correlations. To make the C-Walking approach more suitable to real problems we decrease the required number of cooling steps by attempting to jump at intermediate temperatures during cooling. We further reduce the number of cooling steps by utilizing "windows" of states when jumping, which improves acceptance ratios and lowers the average number of cooling steps. We present C-Walking results with comparisons to J-Walking, S-Walking, Smart Darting, and Parallel Tempering on a one-dimensional rugged potential energy surface in which the exact normalized probability distribution is known. C-Walking shows superior sampling as judged by two ergodic measures.
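    The tandem high/low-temperature scheme described above can be sketched in a few dozen lines. This is a simplified illustration on a made-up 1-D rugged potential: it uses a plain Metropolis acceptance for the quenched jump moves, whereas the published method uses a corrected acceptance rule (and "windows" of states) to preserve detailed balance.

    ```python
    # Simplified sketch of a C-Walking-style sampler on a 1-D rugged potential.
    # A high-temperature (ergodic) walker feeds quenched configurations to a
    # low-temperature walker as nonlocal trial moves.  The potential, schedule,
    # and acceptance rule here are illustrative assumptions only.
    import math, random

    random.seed(0)

    def energy(x):
        # Rugged double-well-like potential: two broad wells plus oscillation.
        return 0.05 * x**4 - x**2 + 0.5 * math.sin(8.0 * x)

    def metropolis_step(x, beta, step):
        y = x + random.uniform(-step, step)
        if math.log(random.random() + 1e-300) < -beta * (energy(y) - energy(x)):
            return y
        return x

    beta_lo, beta_hi = 5.0, 0.5   # low-T target, high (ergodic) temperature
    x_lo, x_hi = 3.0, 3.0
    samples = []
    for sweep in range(20000):
        x_hi = metropolis_step(x_hi, beta_hi, 1.0)
        if sweep % 20 == 0:
            # Quench a high-T configuration toward the low-T distribution...
            y = x_hi
            for beta in (1.0, 2.0, 3.5, beta_lo):
                for _ in range(10):
                    y = metropolis_step(y, beta, 0.3)
            # ...and offer it as a nonlocal jump move (plain Metropolis accept;
            # the published method corrects this to keep detailed balance).
            if math.log(random.random() + 1e-300) < -beta_lo * (energy(y) - energy(x_lo)):
                x_lo = y
        else:
            x_lo = metropolis_step(x_lo, beta_lo, 0.2)
        samples.append(x_lo)

    # At beta_lo = 5 the barrier at x = 0 is essentially uncrossable by local
    # moves alone, so visits to both wells (near x ~ -3.2 and x ~ +3.2) come
    # from the jump moves.
    frac_left = sum(1 for s in samples if s < 0) / len(samples)
    print(round(frac_left, 2))
    ```

    Without the jump moves the low-temperature walker stays trapped in its starting well; with them it visits both.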

  15. Photoacoustic sample vessel and method of elevated pressure operation

    DOEpatents

    Autrey, Tom; Yonker, Clement R.

    2004-05-04

    An improved photoacoustic vessel and method of photoacoustic analysis. The photoacoustic sample vessel comprises an acoustic detector, an acoustic couplant, and an acoustic coupler having a chamber for holding the acoustic couplant and a sample. The acoustic couplant is selected from the group consisting of liquid, solid, and combinations thereof. Passing electromagnetic energy through the sample generates an acoustic signal within the sample, whereby the acoustic signal propagates through the sample to and through the acoustic couplant to the acoustic detector.

  16. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    SciTech Connect

    Li, Weixuan; Lin, Guang

    2015-08-01

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes' rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.
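    The first ingredient, a Gaussian-mixture proposal covering a multimodal posterior, can be illustrated with self-normalized importance sampling on a toy bimodal target. Here the mixture is fixed by hand rather than adapted, and no polynomial chaos surrogate is used; all numbers are invented for illustration.

    ```python
    # Self-normalized importance sampling with a Gaussian-mixture proposal for
    # a bimodal 1-D "posterior" -- a toy stand-in for the paper's adaptive
    # scheme (no adaptation, no surrogate model).
    import math, random

    random.seed(1)

    def log_post(x):
        # Unnormalized bimodal target: equal modes near -2 and +2.
        return math.log(math.exp(-0.5 * (x + 2.0)**2) +
                        math.exp(-0.5 * (x - 2.0)**2))

    # GM proposal: weights, means, std devs chosen to cover both modes.
    w, mu, sd = [0.5, 0.5], [-2.0, 2.0], [1.5, 1.5]

    def gm_sample():
        k = 0 if random.random() < w[0] else 1
        return random.gauss(mu[k], sd[k])

    def gm_logpdf(x):
        dens = sum(wk / (sdk * math.sqrt(2 * math.pi)) *
                   math.exp(-0.5 * ((x - muk) / sdk)**2)
                   for wk, muk, sdk in zip(w, mu, sd))
        return math.log(dens)

    xs = [gm_sample() for _ in range(50000)]
    lw = [log_post(x) - gm_logpdf(x) for x in xs]
    m = max(lw)
    wt = [math.exp(l - m) for l in lw]      # stabilized importance weights
    tot = sum(wt)

    post_mean = sum(wi * xi for wi, xi in zip(wt, xs)) / tot
    prob_right = sum(wi for wi, xi in zip(wt, xs) if xi > 0) / tot
    print(round(post_mean, 2), round(prob_right, 2))
    ```

    Because the target is symmetric, the weighted posterior mean should be near 0 and the mass of the right-hand mode near 0.5; a single-Gaussian proposal centered on one mode would miss the other, which is the failure the GM construction avoids.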

  17. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    SciTech Connect

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.

  18. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions

    DOE PAGES

    Li, Weixuan; Lin, Guang

    2015-03-21

    Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.

  19. Systems and methods for self-synchronized digital sampling

    NASA Technical Reports Server (NTRS)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.

  20. Performance evaluation of an importance sampling technique in a Jackson network

    NASA Astrophysics Data System (ADS)

    brahim Mahdipour, E.; Masoud Rahmani, Amir; Setayeshi, Saeed

    2014-03-01

    Importance sampling is a technique that is commonly used to speed up Monte Carlo simulation of rare events. However, little is known regarding the design of efficient importance sampling algorithms in the context of queueing networks. The standard approach, which simulates the system using an a priori fixed change of measure suggested by large deviation analysis, has been shown to fail in even the simplest network settings. Estimating probabilities associated with rare events has been a topic of great importance in queueing theory, and in applied probability at large. In this article, we analyse the performance of an importance sampling estimator for a rare event probability in a Jackson network. The analysis applies strict customer deadlines to a two-node Jackson network with feedback whose arrival and service rates are modulated by an exogenous finite-state Markov process. We estimate the probability of network blocking for various sets of parameters, as well as the probability of customers missing their deadlines under different loads and deadline values. Finally, we show that the probability of total population overflow may be affected by the various deadline values, service rates and arrival rates.
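    The flavor of such estimators can be shown on a much simpler rare event than the modulated two-node network of the article: the probability that an M/M/1-type birth-death chain, started with one customer, reaches a high level N before emptying, using the classical change of measure that swaps arrival and service rates.

    ```python
    # Importance-sampling sketch for a queueing rare event: P(level reaches N
    # before 0 | start at 1) in an M/M/1-type chain.  The change of measure
    # swaps arrival and service rates -- a single-queue illustration, not the
    # two-node Jackson network with deadlines analysed in the article.
    import random

    random.seed(2)

    lam, mu, N = 1.0, 2.0, 10      # arrival rate, service rate, overflow level
    p = lam / (lam + mu)           # P(next event is an arrival), true measure
    p_is = mu / (lam + mu)         # IS measure: arrival/service roles swapped

    def one_run():
        level, lr = 1, 1.0
        while 0 < level < N:
            if random.random() < p_is:        # arrival under the IS measure
                level += 1
                lr *= p / p_is                # likelihood-ratio factor
            else:                             # departure under the IS measure
                level -= 1
                lr *= (1 - p) / (1 - p_is)
        return lr if level == N else 0.0

    runs = 20000
    estimate = sum(one_run() for _ in range(runs)) / runs

    # Gambler's-ruin exact value for comparison: start at 1, up-probability p.
    r = (1 - p) / p
    exact = (r - 1) / (r**N - 1)
    print(estimate, exact)
    ```

    With these rates the exact probability is 1/1023; under the swapped measure every successful path carries the same likelihood ratio, so the estimator's variance comes only from the hit indicator, whereas naive simulation would need millions of runs to see the event at all.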

  1. The Importance of Sample Processing in Analysis of Asbestos Content in Rocks and Soils

    NASA Astrophysics Data System (ADS)

    Neumann, R. D.; Wright, J.

    2012-12-01

    Analysis of asbestos content in rocks and soils using Air Resources Board (ARB) Test Method 435 (M435) involves the processing of samples for subsequent analysis by polarized light microscopy (PLM). The use of different equipment and procedures by commercial laboratories to pulverize rock and soil samples could result in different particle size distributions. It has long been theorized that asbestos-containing samples can be over-pulverized to the point where the particle dimensions of the asbestos no longer meet the required 3:1 length-to-width aspect ratio, or the particles become so small that they can no longer be tested for optical characteristics using PLM, where maximum magnification is typically 400X. Recent work has shed some light on this issue. ARB staff conducted an interlaboratory study to investigate variability in the preparation and analytical procedures used by laboratories performing M435 analysis. With regard to sample processing, ARB staff found that different pulverization equipment and processing procedures produced powders with varying particle size distributions. PLM analysis of the finest powders produced by one laboratory showed all but one of the 12 samples were non-detect or below the PLM reporting limit; in contrast, of the other 36 coarser samples prepared from the same field sample by three other laboratories, 21 were above the reporting limit. The set of 12 exceptionally fine powder samples produced by the same laboratory was re-analyzed by transmission electron microscopy (TEM), and results showed that these samples contained asbestos above the TEM reporting limit. However, the use of TEM as a stand-alone analytical procedure, usually performed at magnifications between 3,000 and 20,000X, also has its drawbacks because of the minuscule mass of sample that this method examines. The small amount of powder analyzed by TEM may not be representative of the field sample.
The actual mass of the sample powder analyzed by

  2. Sampling bee communities using pan traps: alternative methods increase sample size

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Monitoring of the status of bee populations and inventories of bee faunas require systematic sampling. Efficiency and ease of implementation have encouraged the use of pan traps to sample bees. Efforts to find an optimal standardized sampling method for pan traps have focused on pan trap color. Th...

  3. Improved importance sampling technique for efficient simulation of digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, Dingqing; Yao, Kung

    1988-01-01

    A new, improved importance sampling (IIS) approach to simulation is considered. Some basic concepts of IS are introduced, and detailed evaluations of simulation estimation variances for Monte Carlo (MC) and IS simulations are given. The general results obtained from these evaluations are applied to the specific previously known conventional importance sampling (CIS) technique and the new IIS technique. The derivation for a linear system with no signal and no memory is considered in some detail. For the CIS technique, the optimum input scaling parameter is found, while for the IIS technique, the optimum translation parameter is found. The results are generalized to a linear system with memory and signals. Specific numerical and simulation results are given which show the advantages of CIS over MC and of IIS over CIS for simulations of digital communication systems.
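    The translation idea behind IIS can be sketched on the building block of bit-error-rate estimation: a Gaussian tail probability P(noise > A), estimated by sampling the noise from a density whose mean has been shifted to the decision threshold. The parameter values are illustrative assumptions, not the paper's.

    ```python
    # Mean-translation importance sampling (the idea behind IIS) for a
    # Gaussian tail probability P(noise > A), noise ~ N(0, s^2) -- the error
    # probability of antipodal signaling with threshold A.  The proposal
    # shifts the noise mean to the threshold A.
    import math, random

    random.seed(3)

    A, s = 5.0, 1.0
    n = 50000

    total = 0.0
    for _ in range(n):
        x = random.gauss(A, s)               # sample from the shifted density
        if x > A:
            # Likelihood ratio f(x)/g(x) = exp((A^2 - 2*A*x) / (2*s^2))
            total += math.exp((A * A - 2.0 * A * x) / (2.0 * s * s))
    estimate = total / n

    exact = 0.5 * math.erfc(A / (s * math.sqrt(2.0)))   # Q(A/s) ~ 2.9e-7
    print(estimate, exact)
    ```

    Plain Monte Carlo would need on the order of 10^8 samples to see even a handful of threshold crossings at this error rate; under the translated density about half of all samples cross, and the bounded likelihood ratio keeps the variance small.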

  4. Burnout and Engagement: Relative Importance of Predictors and Outcomes in Two Health Care Worker Samples.

    PubMed

    Fragoso, Zachary L; Holcombe, Kyla J; McCluney, Courtney L; Fisher, Gwenith G; McGonagle, Alyssa K; Friebe, Susan J

    2016-06-09

    This study's purpose was twofold: first, to examine the relative importance of job demands and resources as predictors of burnout and engagement, and second, to examine the relative importance of engagement and burnout as predictors of health, depressive symptoms, work ability, organizational commitment, and turnover intentions in two samples of health care workers. Nurse leaders (n = 162) and licensed emergency medical technicians (EMTs; n = 102) completed surveys. In both samples, job demands predicted burnout more strongly than job resources, and job resources predicted engagement more strongly than job demands. Engagement held more weight than burnout for predicting commitment, and burnout held more weight for predicting health outcomes, depressive symptoms, and work ability. Results have implications for the design, evaluation, and effectiveness of workplace interventions to reduce burnout and improve engagement among health care workers. Actionable recommendations for increasing engagement and decreasing burnout in health care organizations are provided.

  5. Passive Samplers for Investigations of Air Quality: Method Description, Implementation, and Comparison to Alternative Sampling Methods

    EPA Science Inventory

    This paper covers the basics of passive sampler design, compares passive samplers to conventional methods of air sampling, and discusses considerations when implementing a passive sampling program. The paper also discusses field sampling and sample analysis considerations to ensu...

  6. DOE methods for evaluating environmental and waste management samples.

    SciTech Connect

    Goheen, S C; McCulloch, M; Thomas, B L; Riley, R G; Sklarew, D S; Mong, G M; Fadeff, S K

    1994-04-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) provides applicable methods in use by the US Department of Energy (DOE) laboratories for sampling and analyzing constituents of waste and environmental samples. The development of DOE Methods is supported by the Laboratory Management Division (LMD) of the DOE. This document contains chapters and methods that are proposed for use in evaluating components of DOE environmental and waste management samples. DOE Methods is a resource intended to support sampling and analytical activities that will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the US Environmental Protection Agency (EPA), or others.

  7. Importance of sample form and surface temperature for analysis by ambient plasma mass spectrometry (PADI).

    PubMed

    Salter, Tara La Roche; Bunch, Josephine; Gilmore, Ian S

    2014-09-16

    Many different types of samples have been analyzed in the literature using plasma-based ambient mass spectrometry sources; however, comprehensive studies of the important parameters for analysis are only just beginning. Here, we investigate the effect of the sample form and surface temperature on the signal intensities in plasma-assisted desorption ionization (PADI). The form of the sample is very important, with powders of all volatilities effectively analyzed. However, for the analysis of thin films at room temperature and using a low plasma power, a vapor pressure of greater than 10^-4 Pa is required to achieve a sufficiently good quality spectrum. Using thermal desorption, we are able to increase the signal intensity of less volatile materials with vapor pressures less than 10^-4 Pa, in thin film form, by between 4 and 7 orders of magnitude. This is achieved by increasing the temperature of the sample up to a maximum of 200 °C. Thermal desorption can also increase the signal intensity for the analysis of powders.

  8. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    2001-01-01

    A method and apparatus for imaging a sample are provided. An electromagnetic radiation source generates excitation radiation which is sized by excitation optics to a line. The line is directed at a sample resting on a support and excites a plurality of regions on the sample. Collection optics collect response radiation reflected from the sample I and image the reflected radiation. A detector senses the reflected radiation and is positioned to permit discrimination between radiation reflected from a certain focal plane in the sample and certain other planes within the sample.

  9. Engineering Study of 500 ML Sample Bottle Transportation Methods

    SciTech Connect

    BOGER, R.M.

    1999-08-25

    This engineering study reviews and evaluates all available methods for transportation of 500-mL grab sample bottles, reviews and evaluates transportation requirements and schedules and analyzes and recommends the most cost-effective method for transporting 500-mL grab sample bottles.

  10. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Pt. 261, App. I Appendix I to Part 261—Representative Sampling Methods The methods and equipment used for sampling waste materials will vary with...

  11. GROUND WATER PURGING AND SAMPLING METHODS: HISTORY VS. HYSTERIA

    EPA Science Inventory

    It has been over 10 years since the low-flow ground water purging and sampling method was initially reported in the literature. The method grew from the recognition that well purging was necessary to collect representative samples, bailers could not achieve well purging, and high...

  12. The importance of a priori sample size estimation in strength and conditioning research.

    PubMed

    Beck, Travis W

    2013-08-01

    The statistical power, or sensitivity of an experiment, is defined as the probability of rejecting a false null hypothesis. Only 3 factors can affect statistical power: (a) the significance level (α), (b) the magnitude or size of the treatment effect (effect size), and (c) the sample size (n). Of these 3 factors, only the sample size can be manipulated by the investigator because the significance level is usually selected before the study, and the effect size is determined by the effectiveness of the treatment. Thus, selection of an appropriate sample size is one of the most important components of research design but is often misunderstood by beginning researchers. The purpose of this tutorial is to describe procedures for estimating sample size for a variety of different experimental designs that are common in strength and conditioning research. Emphasis is placed on selecting an appropriate effect size because this step fully determines sample size when power and the significance level are fixed. There are many different software packages that can be used for sample size estimation. However, I chose to describe the procedures for the G*Power software package (version 3.1.4) because this software is freely downloadable and capable of estimating sample size for many of the different statistical tests used in strength and conditioning research. Furthermore, G*Power provides a number of different auxiliary features that can be useful for researchers when designing studies. It is my hope that the procedures described in this article will be beneficial for researchers in the field of strength and conditioning.
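    The a priori sample-size calculation described above can be sketched with the standard normal approximation for a two-sample t test, n = 2((z_{1-α/2} + z_{power})/d)² per group, where d is Cohen's effect size. This mirrors what G*Power computes (G*Power's exact t-based answer is slightly larger); the stdlib `statistics.NormalDist` supplies the z quantiles.

    ```python
    # A priori sample-size estimate (per group) for a two-sample t test,
    # using the normal approximation n = 2 * ((z_{1-a/2} + z_{power}) / d)^2.
    # The exact noncentral-t calculation (as in G*Power) gives a slightly
    # larger n.
    import math
    from statistics import NormalDist

    def n_per_group(d, alpha=0.05, power=0.80):
        """d is Cohen's effect size; returns the required n per group."""
        z = NormalDist().inv_cdf
        return math.ceil(2.0 * ((z(1 - alpha / 2) + z(power)) / d) ** 2)

    # Medium effect (d = 0.5), alpha = .05, power = .80:
    print(n_per_group(0.5))   # ~63 per group by the normal approximation
    ```

    The quadratic dependence on 1/d is why the effect-size choice dominates the result: halving d quadruples the required sample.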

  13. The Importance of Meteorite Collections to Sample Return Missions: Past, Present, and Future Considerations

    NASA Technical Reports Server (NTRS)

    Welzenbach, L. C.; McCoy, T. J.; Glavin, D. P.; Dworkin, J. P.; Abell, P. A.

    2012-01-01

    turn led to a new wave of Mars exploration that ultimately could lead to sample return focused on evidence for past or present life. This partnership between collections and missions will be increasingly important in the coming decades as we discover new questions to be addressed and identify targets for both robotic and human exploration. Nowhere is this more true than in the ultimate search for the abiotic and biotic processes that produced life. Existing collections also provide the essential materials for developing and testing new analytical schemes to detect the rare markers of life and distinguish them from abiotic processes. Large collections of meteorites, and the new types being identified within these collections, which come to us at a fraction of the cost of a sample return mission, will continue to shape the objectives of future missions and provide new ways of interpreting returned samples.

  14. [Clinical importance and diagnostic methods of minimal hepatic encephalopathy].

    PubMed

    Stawicka, Agnieszka; Zbrzeźniak, Justyna; Świderska, Aleksandra; Kilisińska, Natalia; Świderska, Magdalena; Jaroszewicz, Jerzy; Flisiak, Robert

    2016-02-01

    Minimal hepatic encephalopathy (MHE) encompasses a number of neuropsychological and neurophysiological disorders in patients suffering from liver cirrhosis who do not display abnormalities during a medical interview or physical examination. A negative influence of MHE on the quality of life of patients suffering from liver cirrhosis has been confirmed, including impairment of the ability to operate motor vehicles and disruption of multiple health-related areas as well as functioning in society. The data on the frequency of traffic offences and accidents among patients diagnosed with MHE, in comparison to patients with liver cirrhosis without MHE as well as healthy persons, are alarming. These patients are unaware of their disorder and of their impaired ability to operate vehicles; it is therefore of utmost importance to identify this group. The term minimal hepatic encephalopathy (formerly "subclinical" encephalopathy) erroneously suggested that diagnostic and therapeutic procedures were unnecessary in patients with liver cirrhosis. Diagnosing MHE is an important predictive factor for the occurrence of overt encephalopathy: more than 50% of patients with this diagnosis develop overt encephalopathy within 30 months. Early diagnosis of MHE offers a chance to implement proper treatment that can prevent overt encephalopathy. Owing to a continuing lack of clinical research, there exist no commonly agreed-upon standards for the definition, diagnostics, classification and treatment of hepatic encephalopathy. This article introduces the newest findings regarding the importance of MHE and the relevant scientific recommendations, and provides detailed descriptions of the most valuable diagnostic methods.

  15. Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations

    SciTech Connect

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-10-13

    Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single-medium, single-pass composite: a single cellulose sponge samples multiple coupons with a single pass across each; 2) single-medium, multi-pass composite: a single cellulose sponge samples multiple coupons with multiple passes across each; and 3) multi-medium, post-sample composite: each cellulose sponge samples a single surface, and multiple sponges are then combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2, respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime-coated (dirty) materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared to single-medium compositing for both clean and grime-coated materials. RE with the PSC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wallboard, and stainless steel among clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  16. A Mixed Methods Sampling Methodology for a Multisite Case Study

    ERIC Educational Resources Information Center

    Sharp, Julia L.; Mobley, Catherine; Hammond, Cathy; Withington, Cairen; Drew, Sam; Stringfield, Sam; Stipanovic, Natalie

    2012-01-01

    The flexibility of mixed methods research strategies makes such approaches especially suitable for multisite case studies. Yet the utilization of mixed methods to select sites for these studies is rarely reported. The authors describe their pragmatic mixed methods approach to select a sample for their multisite mixed methods case study of a…

  17. [Respondent-Driven Sampling: a new sampling method to study visible and hidden populations].

    PubMed

    Mantecón, Alejandro; Juan, Montse; Calafat, Amador; Becoña, Elisardo; Román, Encarna

    2008-01-01

    The paper introduces a variant of chain-referral sampling: respondent-driven sampling (RDS). This sampling method shows that methods based on network analysis can be combined with the statistical validity of standard probability sampling methods. In this sense, RDS appears to be a mathematical improvement of snowball sampling oriented to the study of hidden populations. Here, however, we test its validity with populations that are not covered by a sampling frame but can nonetheless be contacted without difficulty. The basics of RDS are explained through our research on young people (aged 14 to 25) who go clubbing, consume alcohol and other drugs, and have sex. Fieldwork was carried out between May and July 2007 in three Spanish regions: Baleares, Galicia and Comunidad Valenciana. The study shows the utility of this type of sampling when the population is accessible but no sampling frame exists. However, the sample obtained is not a statistically representative random sample of the target population. It must be acknowledged that the final sample is representative of a 'pseudo-population' that approximates the target population but is not identical to it.

  18. Method based on bioinspired sample improves autofocusing performances

    NASA Astrophysics Data System (ADS)

    Cao, Jie; Cheng, Yang; Wang, Peng; Peng, Yuxin; Zhang, Kaiyu; Wu, Leina; Xia, Wenze; Yu, Haoyong

    2016-10-01

    To balance fast autofocusing against the high volume of data to be processed, we propose a bioinspired sampling method based on a retina-like structure. We develop retina-like models and analyze the division of the sampling structure. The optimal retina-like sample is obtained by analyzing two key parameters of the retina-like structure (the number of sectors and the radius of the blind area) through experiments. Under typical autofocus functions, including Vollath-4, Laplacian, Tenengrad, spatial frequency, and sum-modified-Laplacian (SML), we carry out comparative experiments on computation time using the retina-like sample and a traditional uniform sample. The results show that the retina-like sample is suitable for these autofocus functions. With the SML autofocus function, the average computation time decreases from 3.5 s with the uniform sample to 2.1 s with the retina-like sample.
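The sum-modified-Laplacian (SML) focus score mentioned above can be sketched as a simple contrast measure: the sum of absolute second differences over the image. This is a generic, minimal illustration on a toy image, not the paper's implementation:

```python
def sum_modified_laplacian(img):
    """SML focus score: sum of absolute second differences in x and y
    over interior pixels; sharper images score higher."""
    h, w = len(img), len(img[0])
    total = 0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            total += abs(2 * img[y][x] - img[y - 1][x] - img[y + 1][x])
            total += abs(2 * img[y][x] - img[y][x - 1] - img[y][x + 1])
    return total

# A sharp checkerboard scores higher than a flat (fully defocused) image.
sharp = [[(x + y) % 2 for x in range(16)] for y in range(16)]
flat = [[1] * 16 for _ in range(16)]
print(sum_modified_laplacian(sharp), sum_modified_laplacian(flat))  # → 784 0
```

An autofocus loop would evaluate this score at each lens position and seek its maximum; the retina-like sample reduces cost by scoring fewer, nonuniformly placed pixels.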

  19. Method for using polarization gating to measure a scattering sample

    DOEpatents

    Baba, Justin S.

    2015-08-04

    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.

  20. A comparison of methods for representing sparsely sampled random quantities.

    SciTech Connect

    Romero, Vicente Jose; Swiler, Laura Painton; Urbina, Angel; Mullins, Joshua

    2013-09-01

    This report discusses the treatment of uncertainties stemming from relatively few samples of random quantities. The importance of this topic extends beyond experimental data uncertainty to situations involving uncertainty in model calibration, validation, and prediction. With very sparse data samples it is not practical to have a goal of accurately estimating the underlying probability density function (PDF). Rather, a pragmatic goal is that the uncertainty representation should be conservative so as to bound a specified percentile range of the actual PDF, say the range between the 0.025 and 0.975 percentiles, with reasonable reliability. A second, opposing objective is that the representation not be overly conservative, i.e., that it only minimally over-estimate the desired percentile range of the actual PDF. The presence of the two opposing objectives makes the sparse-data uncertainty representation problem interesting and difficult. In this report, five uncertainty representation techniques are characterized for their performance on twenty-one test problems (over thousands of trials for each problem) according to these two opposing objectives and other performance measures. Two of the methods, statistical tolerance intervals and a kernel density approach specifically developed for handling sparse data, exhibit significantly better overall performance than the others.
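One of the better-performing techniques above, the statistical tolerance interval, has a well-known distribution-free form due to Wilks. As an illustration (the sample size and coverage below are textbook values, not taken from the report), the confidence that the interval from the sample minimum to the sample maximum covers at least a fraction p of the population is:

```python
def wilks_confidence(n, p):
    """Confidence that [sample min, sample max] from n i.i.d. draws
    covers at least a fraction p of the population (Wilks' formula):
    1 - n*p^(n-1) + (n-1)*p^n."""
    return 1 - n * p ** (n - 1) + (n - 1) * p ** n

# Classic result: with n = 93 samples, the min..max interval bounds the
# central 95% of the distribution with ~95% confidence.
print(round(wilks_confidence(93, 0.95), 2))  # → 0.95
```

With very sparse data (small n) the achievable confidence drops sharply, which is exactly the regime the report's comparison targets.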

  1. [Recent advances in sample preparation methods of plant hormones].

    PubMed

    Wu, Qian; Wang, Lus; Wu, Dapeng; Duan, Chunfeng; Guan, Yafeng

    2014-04-01

    Plant hormones are a group of naturally occurring trace substances that play a crucial role in controlling plant development, growth and environmental response. With the development of chromatography and mass spectrometry techniques, chromatographic analysis has become a widely used method for plant hormone analysis. Among the steps of chromatographic analysis, sample preparation is undoubtedly the most vital one. Thus, a highly selective and efficient sample preparation method is critical for accurate identification and quantification of phytohormones. For the three major kinds of plant hormones, namely acidic and basic plant hormones, brassinosteroids, and plant polypeptides, the sample preparation methods are reviewed in sequence, with emphasis on recently developed methods. The review covers novel methods, devices, extraction materials and derivatization reagents for sample preparation in phytohormone analysis, including some related work of our group. Finally, future developments in this field are discussed.

  2. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments.
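The three interval sampling methods compared in the study can be sketched in a toy simulation. This is a generic illustration of the scoring rules (event parameters are arbitrary, not the study's settings); it reproduces the well-known pattern that partial-interval recording over-estimates and whole-interval recording under-estimates event duration:

```python
import random

def simulate(duration=600, interval=10, event_len=5, n_events=20, seed=1):
    """Place events at random on a timeline, then score every interval
    with momentary time sampling (MTS), partial-interval recording (PIR),
    and whole-interval recording (WIR). Returns proportions."""
    rng = random.Random(seed)
    occupied = [False] * duration
    for _ in range(n_events):
        start = rng.randrange(duration - event_len)
        for t in range(start, start + event_len):
            occupied[t] = True
    true_prop = sum(occupied) / duration
    mts = pir = wir = 0
    n_intervals = duration // interval
    for i in range(n_intervals):
        chunk = occupied[i * interval:(i + 1) * interval]
        mts += chunk[-1]   # momentary: observe only at the interval's end
        pir += any(chunk)  # partial: score if event occurs at any moment
        wir += all(chunk)  # whole: score only if event fills the interval
    return true_prop, mts / n_intervals, pir / n_intervals, wir / n_intervals

true_p, mts, pir, wir = simulate()
# PIR can only over-count and WIR can only under-count occupied time.
print(wir <= true_p <= pir)  # → True
```

Repeating the simulation over many parameter combinations, as the study does, yields error tables for each method.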

  3. [Weighted estimation methods for multistage sampling survey data].

    PubMed

    Hou, Xiao-Yan; Wei, Yong-Yue; Chen, Feng

    2009-06-01

    Multistage sampling techniques are widely applied in cross-sectional epidemiological studies, yet methods based on the independence assumption are still used to analyze such complex survey data. This paper introduces the application of weighted estimation methods to complex survey data. A brief overview of the basic theory is given, and a practical analysis illustrates the weighted estimation algorithm on stratified two-stage cluster sampling data. For multistage sampling survey data, weighted estimation can be used to obtain unbiased point estimates and more reasonable variance estimates, and thus to make proper statistical inference by correcting for the effects of clustering, stratification and unequal selection probabilities.
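The core of weighted estimation is weighting each sampled unit by the inverse of its inclusion probability. A minimal sketch of a Hájek-style weighted mean (the values and probabilities below are hypothetical, for illustration only):

```python
def weighted_mean(values, inclusion_probs):
    """Hájek-style estimate for unequal-probability samples: weight each
    unit by 1/p_i, then take the weight-normalized mean."""
    weights = [1.0 / p for p in inclusion_probs]
    return sum(w * y for w, y in zip(weights, values)) / sum(weights)

# Hypothetical two-stratum sample: two units from an over-sampled stratum
# (p = 0.5) and one from an under-sampled stratum (p = 0.1). The naive
# mean is biased toward the over-sampled stratum; weighting corrects this.
values = [10, 12, 30]
probs = [0.5, 0.5, 0.1]
naive = sum(values) / len(values)
print(round(naive, 2), round(weighted_mean(values, probs), 2))  # → 17.33 24.57
```

Variance estimation under clustering and stratification is more involved (e.g. Taylor linearization), but the same design weights enter those formulas.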

  4. A random spatial sampling method in a rural developing nation

    PubMed Central

    2014-01-01

    Background Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. Methods We describe a stratified random sampling method using geographical information system (GIS) software and global positioning system (GPS) technology for application in a health survey in a rural region of Guatemala, as well as a qualitative study of the enumeration process. Results This method offers an alternative sampling technique that could reduce opportunities for bias in household selection compared to cluster methods. However, its use is subject to issues surrounding survey preparation, technological limitations and in-the-field household selection. Application of this method in remote areas will raise challenges surrounding the boundary delineation process, use and translation of satellite imagery between GIS and GPS, and household selection at each survey point in varying field conditions. This method favors household selection in denser urban areas and in new residential developments. Conclusions Random spatial sampling methodology can be used to survey a random sample of the population in a remote region of a developing nation. Although this method should be further validated and compared with more established methods to determine its utility in social survey applications, it shows promise for use in developing nations with resource-challenged environments where detailed geographic and human census data are less available. PMID:24716473

  5. Sequential sampling: a novel method in farm animal welfare assessment.

    PubMed

    Heath, C A E; Main, D C J; Mullan, S; Haskell, M J; Browne, W J

    2016-02-01

    Lameness in dairy cows is an important welfare issue. As part of a welfare assessment, herd level lameness prevalence can be estimated from scoring a sample of animals, where higher levels of accuracy are associated with larger sample sizes. As the financial cost is related to the number of cows sampled, smaller samples are preferred. Sequential sampling schemes have been used for informing decision making in clinical trials. Sequential sampling involves taking samples in stages, where sampling can stop early depending on the estimated lameness prevalence. When welfare assessment is used for a pass/fail decision, a similar approach could be applied to reduce the overall sample size. The sampling schemes proposed here apply the principles of sequential sampling within a diagnostic testing framework. This study develops three sequential sampling schemes of increasing complexity to classify 80 fully assessed UK dairy farms, each with known lameness prevalence. Using the Welfare Quality herd-size-based sampling scheme, the first 'basic' scheme involves two sampling events. At the first sampling event half the Welfare Quality sample size is drawn, and then depending on the outcome, sampling either stops or is continued and the same number of animals is sampled again. In the second 'cautious' scheme, an adaptation is made to ensure that correctly classifying a farm as 'bad' is done with greater certainty. The third scheme is the only scheme to go beyond lameness as a binary measure and investigates the potential for increasing accuracy by incorporating the number of severely lame cows into the decision. The three schemes are evaluated with respect to accuracy and average sample size by running 100 000 simulations for each scheme, and a comparison is made with the fixed size Welfare Quality herd-size-based sampling scheme. All three schemes performed almost as well as the fixed size scheme but with much smaller average sample sizes. For the third scheme, an overall
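The 'basic' two-stage scheme described above can be sketched as follows. The prevalence threshold, margin, and sample sizes here are illustrative assumptions, not the Welfare Quality values used in the study:

```python
import random

def classify_farm(true_prev, full_n=60, threshold=0.15, margin=0.05, seed=0):
    """Two-stage sequential pass/fail sketch: score half the fixed sample;
    if the stage-1 prevalence estimate is far from the threshold, decide
    early, otherwise score the second half and decide on the full sample.
    Returns (failed, number_of_animals_scored)."""
    rng = random.Random(seed)
    half = full_n // 2
    stage1 = sum(rng.random() < true_prev for _ in range(half))
    est1 = stage1 / half
    if abs(est1 - threshold) > margin:
        return est1 > threshold, half          # early decision
    stage2 = sum(rng.random() < true_prev for _ in range(half))
    return (stage1 + stage2) / full_n > threshold, full_n

# A clearly lame herd (35% prevalence) is usually classified after only
# half the fixed sample size, which is where the average saving comes from.
fail, n_sampled = classify_farm(0.35)
print(fail, n_sampled)
```

Farms near the threshold trigger the second stage and cost the full sample; farms far from it stop early, so the average sample size falls below the fixed scheme's while accuracy stays similar.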

  6. A LITERATURE REVIEW OF WIPE SAMPLING METHODS FOR CHEMICAL WARFARE AGENTS AND TOXIC INDUSTRIAL CHEMICALS

    EPA Science Inventory

    Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, we...

  7. Integration of sample analysis method (SAM) for polychlorinated biphenyls

    SciTech Connect

    Monagle, M.; Johnson, R.C.

    1996-05-01

    A completely integrated Sample Analysis Method (SAM) has been tested as part of the Contaminant Analysis Automation program. The SAM system was tested for polychlorinated biphenyl samples using five Standard Laboratory Modules™: two Soxtec™ modules, a high volume concentrator module, a generic materials handling module, and the gas chromatographic module. With over 300 samples completed within the first phase of the validation, recovery and precision data were comparable to manual methods. Based on experience derived from the first evaluation of the automated system, efforts are underway to improve sample recoveries and integrate a sample cleanup procedure. In addition, initial work in automating the extraction of semivolatile samples using this system will also be discussed.

  8. Method and sample spinning apparatus for measuring the NMR spectrum of an orientationally disordered sample

    DOEpatents

    Pines, Alexander; Samoson, Ago

    1990-01-01

    An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus spins the sample about an axis. The angle of the axis is mechanically varied such that the time averages of two or more Legendre polynomials are zero.

  9. Methods for collection and analysis of water samples

    USGS Publications Warehouse

    Rainwater, Frank Hays; Thatcher, Leland Lincoln

    1960-01-01

    This manual contains methods used by the U.S. Geological Survey to collect, preserve, and analyze water samples. Throughout, the emphasis is on obtaining analytical results that accurately describe the chemical composition of the water in situ. Among the topics discussed are selection of sampling sites, frequency of sampling, field equipment, preservatives and fixatives, analytical techniques of water analysis, and instruments. Seventy-seven laboratory and field procedures are given for determining fifty-three water properties.

  10. Optimized method for dissolved hydrogen sampling in groundwater.

    PubMed

    Alter, Marcus D; Steiof, Martin

    2005-06-01

    Dissolved hydrogen concentrations are used to characterize redox conditions of contaminated aquifers. The currently accepted and recommended bubble strip method for hydrogen sampling (Wiedemeier et al., 1998) requires relatively long sampling times and immediate field analysis. In this study we present methods for optimized sampling and for sample storage. The bubble strip sampling method was examined for various flow rates, bubble sizes (headspace volume in the sampling bulb) and two different H2 concentrations, and the results were compared to a theoretical equilibration model. Turbulent flow in the sampling bulb was optimized for gas transfer by reducing the inlet diameter. Extraction with a 5 mL headspace volume and flow rates higher than 100 mL/min resulted in 95-100% equilibrium within 10-15 min. To investigate sample storage, gas samples from the sampling bulb were kept in headspace vials for varying periods. Hydrogen samples (4.5 ppmv, corresponding to 3.5 nM in the liquid phase) could be stored for up to 48 h and 72 h with recovery rates of 100.1±2.6% and 94.6±3.2%, respectively. These results are promising and demonstrate that samples can be stored for 2-3 days before laboratory analysis. The optimized method was tested at a field site contaminated with chlorinated solvents. Duplicate gas samples were stored in headspace vials and analyzed after 24 h. Concentrations were measured in the range of 2.5-8.0 nM, corresponding to known concentrations in reduced aquifers.

  11. Capillary microextraction: A new method for sampling methamphetamine vapour.

    PubMed

    Nair, M V; Miskelly, G M

    2016-11-01

    Clandestine laboratories pose a serious health risk to first responders, investigators, decontamination companies, and members of the public who may be inadvertently exposed to methamphetamine and other chemicals used in its manufacture. Therefore, there is an urgent need for reliable methods to detect and measure methamphetamine at such sites. The most common method for determining methamphetamine contamination at former clandestine laboratory sites is selected surface wipe sampling, followed by analysis with gas chromatography-mass spectrometry (GC-MS). We are investigating vapour sampling for methamphetamine to complement such wipe sampling. In this study, we report the use of capillary microextraction (CME) devices for sampling airborne methamphetamine and compare their sampling efficiency with a previously reported dynamic SPME method. The CME devices consisted of PDMS-coated glass filter strips inside a glass tube. The devices were used to dynamically sample methamphetamine vapour in the range of 0.42-4.2 μg m⁻³, generated by a custom-built vapour dosing system, for 1-15 min, and methamphetamine was analysed using a GC-MS fitted with a ChromatoProbe thermal desorption unit. The devices showed good reproducibility (RSD < 15%) and a curvilinear pre-equilibrium relationship between sampling time and peak area, which can be utilised for calibration. Under identical sampling conditions, the CME devices were approximately 30 times more sensitive than the dynamic SPME method. The CME devices could be stored for up to 3 days after sampling prior to analysis. Consecutive sampling of methamphetamine and its isotopic substitute, d-9 methamphetamine, showed no competitive displacement. This suggests that CME devices, pre-loaded with an internal standard, could be a feasible method for sampling airborne methamphetamine at former clandestine laboratories.

  12. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    PubMed

    Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W

    2017-02-01

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds that were measured did not consistently meet predetermined quality standards. Methodologies that proved unsuitable for these analytes are discussed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols.

  13. Demonstration Report for Visual Sample Plan (VSP) Verification Sampling Methods at the Navy/DRI Site

    DTIC Science & Technology

    2011-08-01

    STATISTICAL VERIFICATION AND REMEDIATION SAMPLING METHODS (200837), August 2011, Pacific Northwest National Laboratory, Brent Pulsipher. Statistical Verification Sampling Methods in VSP, August 2011: 6.2.1 Transect Survey Design and Parameter Settings

  14. Nominal Weights Mean Equating: A Method for Very Small Samples

    ERIC Educational Resources Information Center

    Babcock, Ben; Albano, Anthony; Raymond, Mark

    2012-01-01

    The authors introduced nominal weights mean equating, a simplified version of Tucker equating, as an alternative for dealing with very small samples. The authors then conducted three simulation studies to compare nominal weights mean equating to six other equating methods under the nonequivalent groups anchor test design with sample sizes of 20,…

  15. Field Evaluation of Personal Sampling Methods for Multiple Bioaerosols

    PubMed Central

    Wang, Chi-Hsun; Chen, Bean T.; Han, Bor-Cheng; Liu, Andrew Chi-Yeu; Hung, Po-Chen; Chen, Chih-Yong; Chao, Hsing Jasmine

    2015-01-01

    Ambient bioaerosols are ubiquitous in the daily environment and can affect health in various ways. However, few studies have been conducted to comprehensively evaluate personal bioaerosol exposure in occupational and indoor environments because of the complex composition of bioaerosols and the lack of standardized sampling/analysis methods. We conducted a study to determine the most efficient collection/analysis method for the personal exposure assessment of multiple bioaerosols. The sampling efficiencies of three filters and four samplers were compared. According to our results, polycarbonate (PC) filters had the highest relative efficiency, particularly for bacteria. Side-by-side sampling was conducted to evaluate the three filter samplers (with PC filters) and the NIOSH Personal Bioaerosol Cyclone Sampler. According to the results, the Button Aerosol Sampler and the IOM Inhalable Dust Sampler had the highest relative efficiencies for fungi and bacteria, followed by the NIOSH sampler. Personal sampling was performed in a pig farm to assess occupational bioaerosol exposure and to evaluate the sampling/analysis methods. The Button and IOM samplers yielded a similar performance for personal bioaerosol sampling at the pig farm. However, the Button sampler is more likely to be clogged at high airborne dust concentrations because of its higher flow rate (4 L/min). Therefore, the IOM sampler is a more appropriate choice for performing personal sampling in environments with high dust levels. In summary, the Button and IOM samplers with PC filters are efficient sampling/analysis methods for the personal exposure assessment of multiple bioaerosols. PMID:25799419

  16. Importance of anthropogenic climate impact, sampling error and urban development in sewer system design.

    PubMed

    Egger, C; Maurer, M

    2015-04-15

    Urban drainage design relying on observed precipitation series neglects the uncertainties associated with current and indeed future climate variability. Urban drainage design is further affected by the large stochastic variability of precipitation extremes and sampling errors arising from the short observation periods of extreme precipitation. Stochastic downscaling addresses anthropogenic climate impact by allowing relevant precipitation characteristics to be derived from local observations and an ensemble of climate models. This multi-climate model approach seeks to reflect the uncertainties in the data due to structural errors of the climate models. An ensemble of outcomes from stochastic downscaling allows for addressing the sampling uncertainty. These uncertainties are clearly reflected in the precipitation-runoff predictions of three urban drainage systems. They were mostly due to the sampling uncertainty. The contribution of climate model uncertainty was found to be of minor importance. Under the applied greenhouse gas emission scenario (A1B) and within the period 2036-2065, the potential for urban flooding in our Swiss case study is slightly reduced on average compared to the reference period 1981-2010. Scenario planning was applied to consider urban development associated with future socio-economic factors affecting urban drainage. The impact of scenario uncertainty was to a large extent found to be case-specific, thus emphasizing the need for scenario planning in every individual case. The results represent a valuable basis for discussions of new drainage design standards aiming specifically to include considerations of uncertainty.

  17. Statistics in brief: the importance of sample size in the planning and interpretation of medical research.

    PubMed

    Biau, David Jean; Kernéis, Solen; Porcher, Raphaël

    2008-09-01

    The increasing volume of research by the medical community often leads to increasing numbers of contradictory findings and conclusions. Although the differences observed may represent true differences, the results also may differ because of sampling variability as all studies are performed on a limited number of specimens or patients. When planning a study reporting differences among groups of patients or describing some variable in a single group, sample size should be considered because it allows the researcher to control for the risk of reporting a false-negative finding (Type II error) or to estimate the precision his or her experiment will yield. Equally important, readers of medical journals should understand sample size because such understanding is essential to interpret the relevance of a finding with regard to their own patients. At the time of planning, the investigator must establish (1) a justifiable level of statistical significance, (2) the chances of detecting a difference of given magnitude between the groups compared, ie, the power, (3) this targeted difference (ie, effect size), and (4) the variability of the data (for quantitative data). We believe correct planning of experiments is an ethical issue of concern to the entire community.
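The four planning quantities listed above (significance level, power, targeted effect size, and data variability) combine into the standard normal-approximation formula for the per-group sample size when comparing two means. This is a textbook sketch, not taken from the article:

```python
import math

def sample_size_two_means(effect, sd, alpha=0.05, power=0.80):
    """Per-group n for comparing two means (normal approximation):
    n = 2 * (z_(1-alpha/2) + z_(1-beta))^2 * (sd / effect)^2."""
    z = {0.025: 1.95996, 0.05: 1.64485, 0.10: 1.28155, 0.20: 0.84162}
    z_alpha = z[alpha / 2]          # two-sided significance level
    z_beta = z[round(1 - power, 2)] # Type II error rate
    n = 2 * (z_alpha + z_beta) ** 2 * (sd / effect) ** 2
    return math.ceil(n)

# Detecting a 5-point difference when sd = 10, at alpha = 0.05, 80% power:
print(sample_size_two_means(5, 10))  # → 63
```

The formula makes the trade-offs in the abstract explicit: halving the targeted effect size quadruples the required sample size, while demanding more power or a stricter significance level increases it through the z terms.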

  18. A cryopreservation method for Pasteurella multocida from wetland samples

    USGS Publications Warehouse

    Moore, Melody K.; Shadduck, D.J.; Goldberg, D.R.; Samuel, M.D.

    1998-01-01

    A cryopreservation method and improved isolation techniques for detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory where they were subsequently thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed for consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed for preservation of the samples until they could be processed in the laboratory.

  19. Probability Sampling Method for a Hidden Population Using Respondent-Driven Sampling: Simulation for Cancer Survivors.

    PubMed

    Jung, Minsoo

    2015-01-01

    When there is no sampling frame within a certain group, or when the group fears that making its membership public would bring social stigma, we say the population is hidden. Such populations are difficult to survey because response rates are low and members tend not to answer honestly when probability sampling is used. The only alternative known to address the problems of earlier approaches such as snowball sampling is respondent-driven sampling (RDS), developed by Heckathorn and his colleagues. RDS is based on a Markov chain and uses the social network information of the respondents. This characteristic allows for probability sampling when surveying a hidden population. We verified through computer simulation whether RDS can be used on a hidden population of cancer survivors. According to the simulation results of this thesis, the bias of RDS's chain-referral sampling tends to shrink as the sample gets bigger, and the sample composition stabilizes as the waves progress. This shows that the final sample can be completely independent of the initial seeds once a certain sample size is secured, even if the initial seeds were selected by convenience sampling. Thus, RDS can be considered an alternative that improves upon both key informant sampling and ethnographic surveys, and it merits application to a variety of domestic cases as well.
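The Markov-chain property that lets the final RDS sample "forget" its seeds can be illustrated with a toy two-group recruitment chain (the transition probabilities below are invented for illustration):

```python
def rds_wave_distribution(transition, seed_dist, waves):
    """Evolve the group composition of recruits across RDS waves.
    transition[i][j] is the probability that a group-i recruiter
    recruits a group-j respondent (row-stochastic matrix)."""
    dist = list(seed_dist)
    for _ in range(waves):
        dist = [sum(dist[i] * transition[i][j] for i in range(len(dist)))
                for j in range(len(dist))]
        total = sum(dist)
        dist = [d / total for d in dist]
    return dist

# Two opposite seed sets converge to the same composition after enough
# waves: the chain's stationary distribution, here (4/7, 3/7).
T = [[0.7, 0.3], [0.4, 0.6]]
a = rds_wave_distribution(T, [1.0, 0.0], 20)
b = rds_wave_distribution(T, [0.0, 1.0], 20)
print([round(x, 3) for x in a], [round(x, 3) for x in b])
# → [0.571, 0.429] [0.571, 0.429]
```

Real RDS estimators additionally weight respondents by their reported network degree; this sketch shows only the seed-independence argument.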

  20. Quality of plasma sampled by different methods for multiple blood sampling in mice.

    PubMed

    Christensen, S D; Mikkelsen, L F; Fels, J J; Bodvarsdóttir, T B; Hansen, A K

    2009-01-01

    For an oral glucose tolerance test (OGTT) in mice, multiple blood samples need to be taken within a few hours from conscious animals. Today, a number of essential parameters may be analysed on very small amounts of plasma, thus reducing the number of animals to be used. It is, however, crucial to obtain high-quality plasma or serum in order to avoid increased data variation and thereby increased group sizes. The aim of this study was to find the most valid and reproducible method of blood sample withdrawal for performing OGTT. Four methods, i.e. amputation of the tail tip, lateral tail incision, puncture of the tail tip and periorbital puncture, were selected for testing at 21 degrees C and 30 degrees C after a pilot study. For each method, four blood samples were drawn from C57BL/6 mice at 30 min intervals. The presence of clots was registered, haemolysis was monitored spectrophotometrically at 430 nm, and it was noted whether 30-50 microL of blood could be obtained. Furthermore, a small amount of extra blood was sampled before and after the four samplings to test whether the sampling itself induced a change in blood glucose over the 90 min test period. All methods yielded acceptable amounts of plasma. Clots were observed in a small number of samples, with no significant differences between the methods. Periorbital puncture did not lead to any haemolysed samples at all, and lateral tail incision resulted in only a few haemolysed samples, while puncture or amputation of the tail tip induced haemolysis in a significant number of samples. All methods, except puncture of the tail tip, influenced blood glucose. Periorbital puncture resulted in a dramatic increase in blood glucose of up to 3.5 mmol/L, indicating that it is stressful. Although lateral tail incision also had some impact on blood glucose, it seems to be the method of choice for OGTT, as it is likely to produce a clot-free, non-haemolysed sample, while periorbital sampling, although producing a

  1. A quantitative sampling method for Oncomelania quadrasi by filter paper.

    PubMed

    Tanaka, H; Santos, M J; Matsuda, H; Yasuraoka, K; Santos, A T

    1975-08-01

    Filter paper was found to attract Oncomelania quadrasi in water in the same way as fallen dried banana leaves, although fewer snails of other species were collected on the former than on the latter. Snails were collected in limited areas using a tube sampler (85 cm² cross-sectional area) and a filter paper sampler (20 × 20 cm). Each sheet of filter paper was placed close to the spot where a tube sample was taken and recovered after 24 hours. At each sampling, 30 samples were taken by each method in an area, and sampling was carried out four times. The correlation between the number of snails collected by the tube and that collected by filter paper was studied. The ratio of snail counts by the tube sampler to those by the filter paper was 1.18. A loose correlation was observed between the snail counts of the two methods, as shown by the correlation coefficient r = 0.6502. The regression lines for three experiments were Y = 0.77X + 1.6 and X = 0.55Y + 1.35, where Y is the number of snails collected by tube sampling and X is the number of snails collected on the sheet of filter paper. The type of snail distribution was studied in the 30 samples taken by each method and was observed to be nearly the same for both sampling methods. All sampling data were found to fit the negative binomial distribution, with the value of the constant k in (q - p)^(-k) varying widely, from 0.5775 to 5.9186. In each experiment, the constant k was always larger for tube sampling than for filter paper sampling. This indicates that the uneven distribution of snails on the soil surface is more conspicuous with filter paper sampling.
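The reported regression lines are ordinary least-squares fits on paired counts from the two samplers. A minimal sketch of how such a line is computed (the paired counts below are hypothetical, not the study's data, which yielded Y = 0.77X + 1.6):

```python
def least_squares(xs, ys):
    """Ordinary least-squares slope and intercept for paired counts:
    slope = S_xy / S_xx, intercept = mean(y) - slope * mean(x)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical paired snail counts: filter paper (X) vs tube (Y).
x = [0, 1, 2, 3, 4, 5, 6, 8]
y = [1, 3, 3, 4, 5, 5, 7, 8]
slope, intercept = least_squares(x, y)
print(round(slope, 2), round(intercept, 2))  # → 0.83 1.48
```

Note that regressing Y on X and X on Y gives two different lines, as in the abstract; they coincide only when the correlation is perfect.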

  2. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for calculated cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the constituent concentrations of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to standard samples. This approach, called the absolute method, allows measurements as accurate as the relative method. The results obtained showed that the absolute method yields values as precise as the relative method, which requires a standard sample for each element to be quantified.
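    The fundamental activation equation referred to above can be illustrated as follows; this is a hedged sketch in which the cross section, flux, half-life, times, and measured activity are all invented example values, not data from the paper.

```python
import math

# Hedged sketch of the fundamental activation equation behind "absolute" NAA:
#   A = N0 * sigma * phi * (1 - exp(-lam * t_irr)) * exp(-lam * t_decay)
# relating the measured activity A (Bq) of the activation product to the
# number of target atoms N0. All numeric values below are illustrative.
sigma = 13.3e-24                    # cm^2, thermal (n,gamma) cross section
phi = 1.0e13                        # n cm^-2 s^-1, thermal neutron flux
lam = math.log(2) / (15.0 * 3600)   # s^-1, for a 15 h product half-life
t_irr, t_decay = 3600.0, 7200.0     # s, irradiation and decay times

A_measured = 2.5e3                  # Bq, hypothetical net activity

# Invert the equation for N0; no standard sample is needed, only sigma, phi.
saturation = 1 - math.exp(-lam * t_irr)
N0 = A_measured / (sigma * phi * saturation * math.exp(-lam * t_decay))
```

This is the sense in which the absolute method trades a standard sample for accurate knowledge of the cross section and the flux.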

  3. Soil separator and sampler and method of sampling

    SciTech Connect

    O'Brien, Barry H; Ritter, Paul D

    2010-02-16

    A soil sampler includes a fluidized bed for receiving a soil sample. The fluidized bed may be in communication with a vacuum for drawing air through the fluidized bed and suspending particulate matter of the soil sample in the air. In a method of sampling, the air may be drawn across a filter, separating the particulate matter. Optionally, a baffle or a cyclone may be included within the fluidized bed for disentrainment, or dedusting, so only the finest particulate matter, including asbestos, will be trapped on the filter. The filter may be removable, and may be tested to determine the content of asbestos and other hazardous particulate matter in the soil sample.

  4. Method and apparatus for imaging a sample on a device

    DOEpatents

    Trulson, Mark; Stern, David; Fiekowsky, Peter; Rava, Richard; Walton, Ian; Fodor, Stephen P. A.

    1996-01-01

    The present invention provides methods and systems for detecting a labeled marker on a sample located on a support. The imaging system comprises a body for immobilizing the support, an excitation radiation source and excitation optics to generate and direct the excitation radiation at the sample. In response, labeled material on the sample emits radiation which has a wavelength that is different from the excitation wavelength, which radiation is collected by collection optics and imaged onto a detector which generates an image of the sample.

  5. System and method for measuring fluorescence of a sample

    DOEpatents

    Riot, Vincent J

    2015-03-24

    The present disclosure provides a system and a method for measuring fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal-amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.
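    The offset-and-integrate scheme can be sketched in discrete time as below; the sampling interval, signal level, and offset are invented values, not the patent's specifications.

```python
# Hedged discrete-time sketch of the offset-and-integrate scheme: the
# integrator accumulates (photodiode output - electronic offset) over a
# fixed window, suppressing a constant background before digitization.
dt = 1e-3                          # s, sampling interval (illustrative)
photodiode = [0.50 + 0.12] * 200   # V: 0.50 V background + 0.12 V signal
offset = 0.50                      # V, offset matched to the background

# Integral over the 0.2 s window reflects only the fluorescence signal.
integral = sum((v - offset) * dt for v in photodiode)
```

Subtracting the offset before integrating keeps the background from consuming the integrator's (and the ADC's) dynamic range.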

  6. Cooperative Nature of Gating Transitions in K+ Channels as seen from Dynamic Importance Sampling Calculations

    PubMed Central

    Denning, Elizabeth J.; Woolf, Thomas B.

    2009-01-01

    The growing dataset of K+ channel x-ray structures provides an excellent opportunity to begin a detailed molecular understanding of voltage-dependent gating. These structures, while differing in sequence, represent either a stable open or closed state. However, an understanding of the molecular details of gating will require models for the transitions and experimentally testable predictions for the gating transition. To explore these ideas, we apply Dynamic Importance Sampling (DIMS) to a set of homology models for the molecular conformations of K+ channels for four different sets of sequences and eight different states. In our results, we highlight the importance of particular residues upstream from the PVP region to the gating transition. This supports growing evidence that the PVP region is important for influencing the flexibility of the S6 helix and thus the opening of the gating domain. The results further suggest how gating on the molecular level depends on intra-subunit motions to influence the cooperative behavior of all four subunits of the K+ channel. We hypothesize that the gating process occurs in steps: first sidechain movement, then inter-subunit S5-S6 motions, and lastly large-scale domain rearrangements. PMID:19950367

  7. Convenient mounting method for electrical measurements of thin samples

    NASA Technical Reports Server (NTRS)

    Matus, L. G.; Summers, R. L.

    1986-01-01

    A method for mounting thin samples for electrical measurements is described. The technique is based on a vacuum chuck concept in which the vacuum chuck simultaneously holds the sample and establishes electrical contact. The mounting plate is composed of a glass-ceramic insulating material, and the surfaces of the plate and vacuum chuck are polished. The operation of the vacuum chuck is examined. The contacts on the sample and mounting plate, which are sputter-deposited through metal masks, are analyzed. The mounting method was utilized for van der Pauw measurements.

  8. DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; McCulloch, M.; Thomas, B.L.; Riley, R.G.; Sklarew, D.S.; Mong, G.M.; Fadeff, S.K.

    1994-10-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a resource intended to support sampling and analytical activities for the evaluation of environmental and waste management samples from U.S. Department of Energy (DOE) sites. DOE Methods is the result of extensive cooperation from all DOE analytical laboratories. All of these laboratories have contributed key information and provided technical reviews as well as significant moral support leading to the success of this document. DOE Methods is designed to encompass methods for collecting representative samples and for determining the radioisotope activity and organic and inorganic composition of a sample. These determinations will aid in defining the type and breadth of contamination and thus determine the extent of environmental restoration or waste management actions needed, as defined by the DOE, the U.S. Environmental Protection Agency, or others. The development of DOE Methods is supported by the Analytical Services Division of DOE. Unique methods or methods consolidated from similar procedures in the DOE Procedures Database are selected for potential inclusion in this document. Initial selection is based largely on DOE needs and procedure applicability and completeness. Methods appearing in this document are one of two types, "Draft" or "Verified". "Draft" methods that have been reviewed internally and show potential for eventual verification are included in this document, but they have not been reviewed externally, and their precision and bias may not be known. "Verified" methods in DOE Methods have been reviewed by volunteers from various DOE sites and private corporations. These methods have delineated measures of precision and accuracy.

  9. Determining the relative importance of soil sample locations to predict risk of child lead exposure.

    PubMed

    Zahran, Sammy; Mielke, Howard W; McElmurry, Shawn P; Filippelli, Gabriel M; Laidlaw, Mark A S; Taylor, Mark P

    2013-10-01

    Soil lead in urban neighborhoods is a known predictor of child blood lead levels. In this paper, we address the question of where one ought to concentrate soil sample collection efforts to efficiently predict children at risk for soil Pb exposure. Two extensive data sets are combined, including 5467 surface soil samples collected from 286 census tracts, and geo-referenced blood Pb data for 55,551 children in metropolitan New Orleans, USA. Random intercept least squares, random intercept logistic, and quantile regression results indicate that soils collected within 1 m of residential streets most reliably predict child blood Pb levels. Regression decomposition results show that residential street soils account for 39.7% of between-neighborhood explained variation, followed by busy street soils (21.97%), open space soils (20.25%), and home foundation soils (18.71%). Just as the age of housing stock is used as a statistical shortcut for child risk of exposure to lead-based paint, our results indicate that one can shortcut the characterization of child risk of exposure to neighborhood soil Pb by concentrating sampling efforts within 1 m of residential and busy streets, while significantly reducing the total costs of collection and analysis. This efficiency gain can help advance proactive, upstream, preventive methods of environmental Pb discovery.

  10. Tests of a comparative method of dating plutonium samples

    NASA Astrophysics Data System (ADS)

    West, D.

    1987-04-01

    Tests of a comparative method of dating plutonium samples have been carried out using 241Pu in aqueous solution. The six samples were of known ages (between 0.25 and 15 yr) and, with one exception, the measured ages, using particular samples as standards, agreed with the stated ages. In one case the agreement was better than 1% in age. Mixed-oxide fuel pins were also intercompared. In this case it was only with some difficulty that a sample of known age was obtained. Comparison using this sample and an older one gave the same value (within ±1%) for the separation date of the unknown sample on three occasions over a three-year period.

  11. Importance of sampling design and analysis in animal population studies: a comment on Sergio et al

    USGS Publications Warehouse

    Kery, M.; Royle, J. Andrew; Schmid, Hans

    2008-01-01

    1. The use of predators as indicators and umbrellas in conservation has been criticized. In the Trentino region, Sergio et al. (2006; hereafter SEA) counted almost twice as many bird species in quadrats located in raptor territories than in controls. However, SEA detected astonishingly few species. We used contemporary Swiss Breeding Bird Survey data from an adjacent region and a novel statistical model that corrects for overlooked species to estimate the expected number of bird species per quadrat in that region. 2. There are two anomalies in SEA which render their results ambiguous. First, SEA detected on average only 6.8 species, whereas a value of 32 might be expected. Hence, they probably overlooked almost 80% of all species. Secondly, the precision of their mean species counts was greater in two-thirds of cases than in the unlikely case that all quadrats harboured exactly the same number of equally detectable species. This suggests that they detected consistently only a biased, unrepresentative subset of species. 3. Conceptually, expected species counts are the product of true species number and species detectability p. Plenty of factors may affect p, including date, hour, observer, previous knowledge of a site and mobbing behaviour of passerines in the presence of predators. Such differences in p between raptor and control quadrats could have easily created the observed effects. Without a method that corrects for such biases, or without quantitative evidence that species detectability was indeed similar between raptor and control quadrats, the meaning of SEA's counts is hard to evaluate. Therefore, the evidence presented by SEA in favour of raptors as indicator species for enhanced levels of biodiversity remains inconclusive. 4. Synthesis and application. Ecologists should pay greater attention to sampling design and analysis in animal population estimation. Species richness estimation means sampling a community. Samples should be representative for the

  12. Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations

    PubMed Central

    Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.

    2016-01-01

    Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm2). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when

  13. Estimation variance bounds of importance sampling simulations in digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
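    The variance-reduction idea underlying IS can be illustrated with a rare-event toy problem; the threshold, sample size, and shifted-Gaussian proposal below are illustrative choices, not the authors' communication-system setup.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)
n, t = 100_000, 4.0          # sample size and rare-event threshold (toy)

# Direct Monte Carlo for p = P(X > t), X ~ N(0,1): exceedances are so rare
# that the plain estimate is extremely noisy at this sample size.
x = rng.standard_normal(n)
p_mc = np.mean(x > t)

# Importance sampling: draw from the shifted proposal N(t, 1) so exceedances
# are common, then reweight each draw by the likelihood ratio
#   w(y) = phi(y) / phi(y - t) = exp(-t*y + t^2/2).
y = rng.standard_normal(n) + t
w = np.exp(-t * y + 0.5 * t * t)
p_is = np.mean((y > t) * w)

p_exact = 0.5 * erfc(t / sqrt(2))   # closed form, roughly 3.2e-5
```

The IS estimate lands within a few percent of the exact tail probability, while the direct estimate at this sample size typically sees only a handful of exceedances or none at all; this gap is what the upper and lower variance bounds in the paper quantify.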

  14. COMPARISON OF MACROINVERTEBRATE SAMPLING METHODS FOR NONWADEABLE STREAMS

    EPA Science Inventory

    The bioassessment of nonwadeable streams in the United States is increasing, but methods for these systems are not as well developed as for wadeable streams. In this study, we compared six benthic macroinvertebrate field sampling methods for nonwadeable streams based on those us...

  15. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  16. A multi-dimensional sampling method for locating small scatterers

    NASA Astrophysics Data System (ADS)

    Song, Rencheng; Zhong, Yu; Chen, Xudong

    2012-11-01

    A multiple signal classification (MUSIC)-like multi-dimensional sampling method (MDSM) is introduced to locate small three-dimensional scatterers using electromagnetic waves. The indicator is built with the most stable part of signal subspace of the multi-static response matrix on a set of combinatorial sampling nodes inside the domain of interest. It has two main advantages compared to the conventional MUSIC methods. First, the MDSM is more robust against noise. Second, it can work with a single incidence even for multi-scatterers. Numerical simulations are presented to show the good performance of the proposed method.

  17. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... independent laboratory shall also include with the retained sample the test result for benzene as...

  18. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... independent laboratory shall also include with the retained sample the test result for benzene as...

  19. Beryllium Wipe Sampling (differing methods - differing exposure potentials)

    SciTech Connect

    Kerr, Kent

    2005-03-09

    This research compared three wipe sampling techniques currently used to test for beryllium contamination on room and equipment surfaces in Department of Energy facilities. Efficiencies of removal of beryllium contamination from typical painted surfaces were tested by wipe sampling without a wetting agent, with water-moistened wipe materials, and with methanol-moistened wipes. Analysis indicated that methanol-moistened wipe sampling removed about twice as much beryllium/oil-film surface contamination as water-moistened wipes, which in turn removed about twice as much residue as dry wipes. The criteria at 10 CFR 850.30 and 850.31 were established on unspecified wipe sampling method(s). The results of this study reveal a need to identify the criteria-setting method and equivalency factors. As facilities change wipe sampling methods among the three compared in this study, these results may be useful for approximate correlations. Accurate decontamination decision-making depends on the selection of wetting agents appropriate to the types of residues and surfaces. Evidence for beryllium sensitization via skin exposure argues in favor of wipe sampling with wetting agents, such as methanol, that provide enhanced removal efficiency when surface contamination includes oil mist residue.

  20. Rapid method for the determination of 226Ra in hydraulic fracturing wastewater samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian K.; Warren, Richard A.; ...

    2016-03-24

    A new method that rapidly preconcentrates and measures 226Ra from hydraulic fracturing wastewater samples was developed in the Savannah River Environmental Laboratory. The method improves the quality of 226Ra measurements using gamma spectrometry by providing up to 100x preconcentration of 226Ra from this difficult sample matrix, which contains very high levels of calcium, barium, strontium, magnesium and sodium. The high chemical yield, typically 80-90%, facilitates a low detection limit, important for lower level samples, and indicates method ruggedness. Ba-133 tracer is used to determine chemical yield and correct for geometry-related counting issues. The 226Ra sample preparation takes < 2 hours.
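    The Ba-133 tracer yield correction described above amounts to a simple ratio; the activities below are invented examples, not the laboratory's measurements.

```python
# Hedged sketch of tracer-based chemical-yield correction: the fraction of
# Ba-133 tracer recovered through the separation is applied to correct the
# 226Ra result measured by gamma spectrometry. Values are illustrative.
ba133_added = 50.0        # Bq spiked into the sample before separation
ba133_recovered = 42.5    # Bq found in the final counting geometry
chem_yield = ba133_recovered / ba133_added       # 0.85, in the 80-90% range

ra226_measured = 1.70     # Bq observed by gamma spectrometry
ra226_corrected = ra226_measured / chem_yield    # yield-corrected activity
```

Because Ba and Ra track through the chemistry together, the tracer simultaneously corrects for chemical losses and flags a failed separation (an anomalously low yield).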

  1. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.

    1996-03-26

    Apparatus for obtaining a whole gas sample, is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.

  2. Self-contained cryogenic gas sampling apparatus and method

    DOEpatents

    McManus, Gary J.; Motes, Billy G.; Bird, Susan K.; Kotter, Dale K.

    1996-01-01

    Apparatus for obtaining a whole gas sample, composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method of obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant.

  3. Fluidics platform and method for sample preparation and analysis

    SciTech Connect

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

    Herein provided are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly. The subsequent isolation, detection, and analysis can be performed without a user's intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  4. Importance of long-time simulations for rare event sampling in zinc finger proteins.

    PubMed

    Godwin, Ryan; Gmeiner, William; Salsbury, Freddie R

    2016-01-01

    Molecular dynamics (MD) simulation methods have seen significant improvement since their inception in the late 1950s. Constraints of simulation size and duration that once impeded the field have lessened with the advent of better algorithms, faster processors, and parallel computing. With newer techniques and hardware available, MD simulations of more biologically relevant timescales can now sample a broader range of conformational and dynamical changes including rare events. One concern in the literature has been under which circumstances it is sufficient to perform many shorter timescale simulations and under which circumstances fewer longer simulations are necessary. Herein, our simulations of the zinc finger NEMO (2JVX) using multiple simulations of length 15, 30, 1000, and 3000 ns are analyzed to provide clarity on this point.

  5. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention requirements... include with the retained sample the test result for benzene as conducted pursuant to § 80.46(e). (b... sample the test result for benzene as conducted pursuant to § 80.47....

  6. 40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Practice for Manual Sampling of Petroleum and Petroleum Products.” (ii) Samples collected under the... present that could affect the sulfur test result. (2) Automatic sampling of petroleum products in..., entitled “Standard Practice for Automatic Sampling of Petroleum and Petroleum Products.” (c) Test...

  7. 40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Practice for Manual Sampling of Petroleum and Petroleum Products.” (ii) Samples collected under the... present that could affect the sulfur test result. (2) Automatic sampling of petroleum products in..., entitled “Standard Practice for Automatic Sampling of Petroleum and Petroleum Products.” (c) Test...

  8. Detecting spatial structures in throughfall data: the effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-04-01

    In the last three decades, an increasing number of studies analyzed spatial patterns in throughfall to investigate the consequences of rainfall redistribution for biogeochemical and hydrological processes in forests. In the majority of cases, variograms were used to characterize the spatial properties of the throughfall data. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and an appropriate layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation methods on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with heavy outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. 
Third, studies relying on method-of-moments based variogram estimation may have to employ at least

  9. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous
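    The method-of-moments variogram estimation discussed in both studies can be sketched as below; the plot size, lag bins, and lognormal field are illustrative assumptions, not the simulated throughfall data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hedged sketch of the Matheron (method-of-moments) variogram estimator on
# synthetic, skewed data; coordinates and values are invented.
n = 150
xy = rng.uniform(0, 50, size=(n, 2))            # coordinates on a 50 m plot
z = rng.lognormal(mean=0.0, sigma=0.5, size=n)  # non-Gaussian "throughfall"

# gamma(h) = mean of 0.5 * (z_i - z_j)^2 over pairs in the lag bin around h
d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
half_sq = 0.5 * (z[:, None] - z[None, :]) ** 2
upper = np.triu(np.ones((n, n), dtype=bool), k=1)   # count each pair once

edges = np.arange(0.0, 30.0, 5.0)               # lag bins 0-5, ..., 20-25 m
gamma = [half_sq[(d > lo) & (d <= hi) & upper].mean()
         for lo, hi in zip(edges[:-1], edges[1:])]
```

Because the estimator averages squared differences, a few heavy outliers can dominate entire lag bins, which is exactly why the studies above compare it against robust and residual-maximum-likelihood alternatives.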

  10. Compressive sampling in computed tomography: Method and application

    NASA Astrophysics Data System (ADS)

    Hu, Zhanli; Liang, Dong; Xia, Dan; Zheng, Hairong

    2014-06-01

    Since Donoho and Candes et al. published their groundbreaking work on compressive sampling or compressive sensing (CS), CS theory has attracted a lot of attention and become a hot topic, especially in biomedical imaging. Specifically, some CS based methods have been developed to enable accurate reconstruction from sparse data in computed tomography (CT) imaging. In this paper, we will review the progress in CS based CT from aspects of three fundamental requirements of CS: sparse representation, incoherent sampling and reconstruction algorithm. In addition, some potential applications of compressive sampling in CT are introduced.
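    A minimal sparse-recovery sketch in the spirit of CS is given below, using Orthogonal Matching Pursuit rather than any CT-specific algorithm from the review; the Gaussian matrix, sizes, and signal are toy assumptions standing in for a CT system matrix.

```python
import numpy as np

rng = np.random.default_rng(7)
n, m, k = 64, 32, 3        # signal length, measurements, sparsity (toy)

# k-sparse test signal and a random Gaussian sensing matrix (illustrative)
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.uniform(1.0, 3.0, size=k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x                  # m < n undersampled measurements

# Orthogonal Matching Pursuit: greedily pick the column most correlated
# with the residual, then re-fit by least squares on the chosen support.
support, resid = [], y.copy()
for _ in range(k):
    support.append(int(np.argmax(np.abs(A.T @ resid))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    resid = y - A[:, support] @ coef

x_hat = np.zeros(n)
x_hat[support] = coef      # recovered sparse signal
```

This illustrates the three ingredients the review organizes CS around: a sparse representation (x itself), incoherent sampling (the random A), and a reconstruction algorithm (here greedy OMP; CT work more often uses total-variation-regularized iterative methods).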

  11. Comparison of pigment content of paint samples using spectrometric methods.

    PubMed

    Trzcińska, Beata; Kowalski, Rafał; Zięba-Palus, Janina

    2014-09-15

    The aim of the paper was to evaluate the influence of pigment concentration and its distribution in the polymer binder on the possibility of colour identification and paint sample comparison. Two sets of paint samples, one containing a red and the other a green pigment, were prepared. Each set consisted of 13 samples differing gradually in pigment concentration. To obtain the sets of various colour shades, white paint was mixed with the appropriate pigment in the form of a concentrated suspension. After solvent evaporation the samples were examined using spectrometric methods. The resin and main filler were identified by the IR method. Colour and white pigments were identified on the basis of Raman spectra. Sample colours were compared by Vis spectrometry according to colour theory. It was found that the samples are homogeneous (colour-similarity parameter ΔE < 3). The values of ΔE between neighbouring samples in a set followed a decreasing linear function, and between the first sample and each following one, a logarithmic function.
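    In the common CIE76 convention, the ΔE comparison above is a Euclidean distance in CIELAB space; a sketch with invented Lab readings (not the paper's measurements):

```python
import math

# Hedged sketch of the CIE76 colour difference behind the ΔE < 3
# homogeneity check: Euclidean distance in CIELAB coordinates.
def delta_e_cie76(lab1, lab2):
    """CIE76 colour difference between two (L*, a*, b*) triples."""
    return math.dist(lab1, lab2)

sample_a = (52.0, 41.3, 20.1)   # (L*, a*, b*) of one spot (invented)
sample_b = (51.2, 39.9, 20.6)   # a neighbouring spot on the same sample
dE = delta_e_cie76(sample_a, sample_b)
homogeneous = dE < 3.0          # threshold used in the paper
```

A ΔE below roughly 3 is commonly treated as a difference at or near the limit of casual visual discrimination, which is why it serves as the homogeneity criterion here.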

  12. RAPID METHOD FOR DETERMINATION OF RADIOSTRONTIUM IN EMERGENCY MILK SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-07-17

A new rapid separation method for radiostrontium in emergency milk samples was developed at the Savannah River Site (SRS) Environmental Bioassay Laboratory (Aiken, SC, USA) that allows rapid separation and measurement of Sr-90 within 8 hours. The new method uses calcium phosphate precipitation, nitric acid dissolution of the precipitate to coagulate residual fat/proteins, and a rapid strontium separation using Sr Resin (Eichrom Technologies, Darien, IL, USA) with vacuum-assisted flow rates. The method is much faster than previous methods that use calcination or cation-exchange pretreatment, has excellent chemical recovery, and effectively removes beta interferences. When a 100 ml sample aliquot is used, the method has a detection limit of 0.5 Bq/L, well below generic emergency action levels.

  13. A method to optimize sampling locations for measuring indoor air distributions

    NASA Astrophysics Data System (ADS)

    Huang, Yan; Shen, Xiong; Li, Jianmin; Li, Bingye; Duan, Ran; Lin, Chao-Hsin; Liu, Junjie; Chen, Qingyan

    2015-02-01

Indoor air distributions, such as the distributions of air temperature, air velocity, and contaminant concentrations, are very important to occupants' health and comfort in enclosed spaces. When point data are collected and interpolated to form field distributions, the sampling locations (the locations of the point sensors) significantly affect the time invested, labor costs, and accuracy of the field interpolation. This investigation compared two methods of determining sampling locations: the grid method and the gradient-based method. The two methods were applied to obtain point air-parameter data in an office room and in a section of an economy-class aircraft cabin. The point data obtained were then interpolated to form field distributions by the ordinary Kriging method. Our error analysis shows that the gradient-based sampling method has a 32.6% smaller interpolation error than the grid sampling method. We derived the relationship between interpolation error and sampling size (the number of sampling points). According to this relationship, the sampling size has an optimal value, and the maximum sampling size can be determined from the sensor and system errors. This study recommends the gradient-based sampling method for measuring indoor air distributions.
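The gradient-based idea can be sketched minimally: place sensors where the estimated spatial gradient of the field is largest, so regions of rapid change are sampled densely. The paper's exact selection rule is not given in the abstract; the 1-D temperature field and all parameters below are invented.

```python
import numpy as np

def gradient_based_locations(field, x, n_samples):
    """Pick sampling locations where the local gradient magnitude is largest."""
    grad = np.abs(np.gradient(field, x))          # finite-difference gradient estimate
    idx = np.argsort(grad)[::-1][:n_samples]      # indices of the steepest points
    return np.sort(idx)

x = np.linspace(0.0, 1.0, 101)
temperature = 20.0 + 5.0 * np.tanh((x - 0.5) / 0.05)   # field with a sharp front at x = 0.5
idx = gradient_based_locations(temperature, x, 10)
print(x[idx])   # selected points cluster around the front
```

A uniform grid of 10 points would place only one sensor inside the front, whereas the gradient-based rule concentrates all 10 there, which is why its interpolation error can be much smaller for the same sampling size.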

  14. A comparison of sampling methods for examining the laryngeal microbiome

    PubMed Central

    Hanshew, Alissa S.; Jetté, Marie E.; Tadayon, Stephanie; Thibeault, Susan L.

    2017-01-01

    Shifts in healthy human microbial communities have now been linked to disease in numerous body sites. Noninvasive swabbing remains the sampling technique of choice in most locations; however, it is not well known if this method samples the entire community, or only those members that are easily removed from the surface. We sought to compare the communities found via swabbing and biopsied tissue in true vocal folds, a location that is difficult to sample without causing potential damage and impairment to tissue function. A secondary aim of this study was to determine if swab sampling of the false vocal folds could be used as proxy for true vocal folds. True and false vocal fold mucosal samples (swabbed and biopsied) were collected from six pigs and used for 454 pyrosequencing of the V3–V5 region of the 16S rRNA gene. Most of the alpha and beta measures of diversity were found to be significantly similar between swabbed and biopsied tissue samples. Similarly, the communities found in true and false vocal folds did not differ considerably. These results suggest that samples taken via swabs are sufficient to assess the community, and that samples taken from the false vocal folds may be used as proxies for the true vocal folds. Assessment of these techniques opens an avenue to less traumatic means to explore the role microbes play in the development of diseases of the vocal folds, and perhaps the rest of the respiratory tract. PMID:28362810

  15. NEW COLUMN SEPARATION METHOD FOR EMERGENCY URINE SAMPLES

    SciTech Connect

Maxwell, S.; Culligan, B.

    2007-08-28

    The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.

  16. On the Exploitation of Sensitivity Derivatives for Improving Sampling Methods

    NASA Technical Reports Server (NTRS)

    Cao, Yanzhao; Hussaini, M. Yousuff; Zang, Thomas A.

    2003-01-01

    Many application codes, such as finite-element structural analyses and computational fluid dynamics codes, are capable of producing many sensitivity derivatives at a small fraction of the cost of the underlying analysis. This paper describes a simple variance reduction method that exploits such inexpensive sensitivity derivatives to increase the accuracy of sampling methods. Three examples, including a finite-element structural analysis of an aircraft wing, are provided that illustrate an order of magnitude improvement in accuracy for both Monte Carlo and stratified sampling schemes.
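One standard way to exploit cheap sensitivity derivatives for variance reduction is as a control variate: subtract the first-order Taylor expansion of the response (whose mean under the input distribution is known exactly) from the sampled values. The paper's precise scheme is not given in the abstract, so the following is an assumed, minimal sketch with a toy integrand standing in for an expensive analysis code.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    return np.sin(x)        # stand-in for an expensive analysis code

def df(x):
    return np.cos(x)        # its cheap sensitivity derivative

mu, sigma, n = 1.0, 0.3, 2000
x = rng.normal(mu, sigma, n)        # uncertain input

plain = f(x).mean()                 # plain Monte Carlo estimate of E[f(X)]

# Control variate: the first-order Taylor expansion of f about mu.
# Under N(mu, sigma^2) its exact mean is f(mu), so the correction is unbiased.
taylor = f(mu) + df(mu) * (x - mu)
corrected = (f(x) - taylor).mean() + f(mu)
```

The residual f(x) - taylor is second order in (x - mu), so its spread, and hence the sampling error of the corrected estimator, is much smaller than that of the plain estimator.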

  17. Two-dimensional signal reconstruction: The correlation sampling method

    SciTech Connect

    Roman, H. E.

    2007-12-15

    An accurate approach for reconstructing a time-dependent two-dimensional signal from non-synchronized time series recorded at points located on a grid is discussed. The method, denoted as correlation sampling, improves the standard conditional sampling approach commonly employed in the study of turbulence in magnetoplasma devices. Its implementation is illustrated in the case of an artificial time-dependent signal constructed using a fractal algorithm that simulates a fluctuating surface. A statistical method is also discussed for distinguishing coherent (i.e., collective) from purely random (noisy) behavior for such two-dimensional fluctuating phenomena.

  18. Evaluation of Stress Loaded Steel Samples Using Selected Electromagnetic Methods

    SciTech Connect

    Chady, T.

    2004-02-26

In this paper the magnetic leakage flux and eddy current methods were used to evaluate changes in material properties caused by stress. Seven samples made of a ferromagnetic material with different levels of applied stress were prepared. First, the leakage magnetic fields were measured by scanning the surface of the specimens with a GMR gradiometer. Next, the same samples were evaluated using an eddy current sensor. A comparison between the results obtained from both methods was carried out. Finally, selected parameters of the measured signal were calculated and used to evaluate the level of the applied stress. A strong correspondence between the amount of applied stress and the maximum amplitude of the signal derivative was confirmed.

  19. A Modified Trap for Adult Sampling of Medically Important Flies (Insecta: Diptera)

    PubMed Central

    Akbarzadeh, Kamran; Rafinejad, Javad; Nozari, Jamasb; Rassi, Yavar; Sedaghat, Mohammad Mehdi; Hosseini, Mostafa

    2012-01-01

Background: Bait trapping appears to be a generally useful method of studying fly populations. The aim of this study was to construct a new adult flytrap by modifying earlier designs and to evaluate its applicability in a subtropical zone in southern Iran. Methods: The traps were constructed by adding equipment to a polyethylene container (18 × 20 × 33 cm) with a lid. Fresh sheep meat was used as bait. In total, 27 modified adult traps were made and tested for their efficacy in attracting adult flies. The experiment was carried out in a range of topographically different areas of Fars Province during June 2010. Results: The traps attracted various groups of adult flies belonging to the families Calliphoridae, Sarcophagidae, Muscidae, and Fanniidae. Calliphora vicina (Diptera: Calliphoridae), Sarcophaga argyrostoma (Diptera: Sarcophagidae), and Musca domestica (Diptera: Muscidae) made up the majority of the flies collected by this sheep-meat-baited trap. Conclusion: This adult flytrap can be recommended for routine field sampling to study the diversity and population dynamics of flies where daily collection is difficult. PMID:23378969

  20. Comparison of sampling methods for radiocarbon dating of carbonyls in air samples via accelerator mass spectrometry

    NASA Astrophysics Data System (ADS)

    Schindler, Matthias; Kretschmer, Wolfgang; Scharf, Andreas; Tschekalinskij, Alexander

    2016-05-01

    Three new methods to sample and prepare various carbonyl compounds for radiocarbon measurements were developed and tested. Two of these procedures utilized the Strecker synthetic method to form amino acids from carbonyl compounds with either sodium cyanide or trimethylsilyl cyanide. The third procedure used semicarbazide to form crystalline carbazones with the carbonyl compounds. The resulting amino acids and semicarbazones were then separated and purified using thin layer chromatography. The separated compounds were then combusted to CO2 and reduced to graphite to determine 14C content by accelerator mass spectrometry (AMS). All of these methods were also compared with the standard carbonyl compound sampling method wherein a compound is derivatized with 2,4-dinitrophenylhydrazine and then separated by high-performance liquid chromatography (HPLC).

  1. A molecular method to assess Phytophthora diversity in environmental samples.

    PubMed

    Scibetta, Silvia; Schena, Leonardo; Chimento, Antonio; Cacciola, Santa O; Cooke, David E L

    2012-03-01

    Current molecular detection methods for the genus Phytophthora are specific to a few key species rather than the whole genus and this is a recognized weakness of protocols for ecological studies and international plant health legislation. In the present study a molecular approach was developed to detect Phytophthora species in soil and water samples using novel sets of genus-specific primers designed against the internal transcribed spacer (ITS) regions. Two different rDNA primer sets were tested: one assay amplified a long product including the ITS1, 5.8S and ITS2 regions (LP) and the other a shorter product including the ITS1 only (SP). Both assays specifically amplified products from Phytophthora species without cross-reaction with the related Pythium s. lato, however the SP assay proved the more sensitive and reliable. The method was validated using woodland soil and stream water from Invergowrie, Scotland. On-site use of a knapsack sprayer and in-line water filters proved more rapid and effective than centrifugation at sampling Phytophthora propagules. A total of 15 different Phytophthora phylotypes were identified which clustered within the reported ITS-clades 1, 2, 3, 6, 7 and 8. The range and type of the sequences detected varied from sample to sample and up to three and five different Phytophthora phylotypes were detected within a single sample of soil or water, respectively. The most frequently detected sequences were related to members of ITS-clade 6 (i.e. P. gonapodyides-like). The new method proved very effective at discriminating multiple species in a given sample and can also detect as yet unknown species. The reported primers and methods will prove valuable for ecological studies, biosecurity and commercial plant, soil or water (e.g. irrigation water) testing as well as the wider metagenomic sampling of this fascinating component of microbial pathogen diversity.

  2. RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.

    2008-08-27

The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May 2008. A new rapid column separation method was used for the analysis of actinides and Sr-90 in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved: less than 3 hours for determination of Sr-90 and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times over NRIP 2007 and a ~100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation, and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinide and Sr-90 analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples, and rugged methods are essential. Extremely high levels of Po-210 were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced Po-210 removal step, which will be described.

  3. The impact of particle size selective sampling methods on occupational assessment of airborne beryllium particulates.

    PubMed

    Sleeth, Darrah K

    2013-05-01

    In 2010, the American Conference of Governmental Industrial Hygienists (ACGIH) formally changed its Threshold Limit Value (TLV) for beryllium from a 'total' particulate sample to an inhalable particulate sample. This change may have important implications for workplace air sampling of beryllium. A history of particle size-selective sampling methods, with a special focus on beryllium, will be provided. The current state of the science on inhalable sampling will also be presented, including a look to the future at what new methods or technology may be on the horizon. This includes new sampling criteria focused on particle deposition in the lung, proposed changes to the existing inhalable convention, as well as how the issues facing beryllium sampling may help drive other changes in sampling technology.

  4. Proteome Analysis of Human Perilymph using an Intraoperative Sampling Method.

    PubMed

    Schmitt, Heike Andrea; Pich, Andreas; Schröder, Anke; Scheper, Verena; Lilli, Giorgio; Reuter, Günter; Lenarz, Thomas

    2017-03-10

Knowledge about the etiology and pathophysiology of sensorineural hearing loss (SNHL) is still very limited. This project aims to improve the understanding of different types of SNHL through proteome analysis of human perilymph. Sampling of perilymph was established during inner ear surgeries (cochlear implant and vestibular schwannoma surgeries), and the safety of the sampling method was assessed by pure-tone audiometry. An in-depth shotgun proteomics approach was used to identify cochlear proteins and the individual proteome in the perilymph of patients. This method enables the identification and quantification of the protein composition of perilymph. The proteomes of 41 collected perilymph samples with volumes of 1-12 µl were analyzed by data-dependent acquisition, resulting in 878 detected protein groups overall. At least 203 protein groups were identified solely in perilymph and not in the reference samples (serum, cerebrospinal fluid), displaying a protein pattern specific to perilymph. Samples were grouped according to patient age and type of surgery, leading to the identification of proteins specific to particular subgroups. Proteins with different abundances between sample groups were classified by gene ontology annotations. The identified proteins might be used to develop tools for non-invasive inner ear diagnostics and to elucidate the molecular profiles of SNHL.

  5. Method and apparatus for sampling low-yield wells

    DOEpatents

    Last, George V.; Lanigan, David C.

    2003-04-15

An apparatus and method for collecting a sample from a low-yield well or perched aquifer includes a pump and a controller responsive to water-level sensors for filling a sample reservoir. The controller activates the pump to fill the reservoir when the water level in the well reaches a high level, as indicated by a sensor. The controller deactivates the pump when the water level reaches a lower level, as indicated by the sensors. The controller repeatedly activates and deactivates the pump until the sample reservoir is filled to the desired volume, as indicated by a reservoir sensor. At the beginning of each activation cycle, the controller can optionally purge an initial quantity of water before filling the sample reservoir. The reservoir can be substantially devoid of air, and the pump is a low-volumetric-flow-rate pump. Both the pump and the reservoir can be located either inside or outside the well.

  6. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention...

  7. 40 CFR 80.1348 - What gasoline sample retention requirements apply to refiners and importers?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false What gasoline sample retention... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Benzene Sampling, Testing and Retention Requirements § 80.1348 What gasoline sample retention...

  8. 40 CFR 80.1644 - Sampling and testing requirements for producers and importers of certified ethanol denaturant.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... producers and importers of certified ethanol denaturant. 80.1644 Section 80.1644 Protection of Environment... ethanol denaturant. (a) Sample and test each batch of certified ethanol denaturant. (1) Producers and importers of certified ethanol denaturant shall collect a representative sample from each batch of...

  9. METHODS FOR THE ANALYSIS OF CARPET SAMPLES FOR ASBESTOS

    EPA Science Inventory

    Assessing asbestos fiber contamination in a carpet is complicated by the nature of the carpeting – because of the pile’s rough surface and thickness, samples cannot be collected directly from carpet for analysis by TEM. Two indirect methods are currently used by laboratories when...

  10. A General Linear Method for Equating with Small Samples

    ERIC Educational Resources Information Center

    Albano, Anthony D.

    2015-01-01

    Research on equating with small samples has shown that methods with stronger assumptions and fewer statistical estimates can lead to decreased error in the estimated equating function. This article introduces a new approach to linear observed-score equating, one which provides flexible control over how form difficulty is assumed versus estimated…
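The trade-off between assuming and estimating form difficulty can be made concrete with two classical observed-score equating functions. The general linear method of the article subsumes such forms, but the specific formulas below (mean equating and linear equating) are standard textbook material, not taken from the article, and the score data are invented.

```python
import statistics

def linear_equate(y, mu_x, sd_x, mu_y, sd_y):
    """Linear equating: map score y on form Y onto the scale of form X,
    estimating both a mean shift and a spread ratio."""
    return mu_x + (sd_x / sd_y) * (y - mu_y)

def mean_equate(y, mu_x, mu_y):
    """Mean equating: assume equal spreads, estimate only the mean shift.
    Fewer estimates means less sampling error with small samples."""
    return y + (mu_x - mu_y)

# Hypothetical small-sample scores on two test forms.
form_x = [12, 15, 17, 20, 22]
form_y = [10, 13, 16, 18, 21]
mx, sx = statistics.mean(form_x), statistics.stdev(form_x)
my, sy = statistics.mean(form_y), statistics.stdev(form_y)

print(linear_equate(16, mx, sx, my, sy))
print(mean_equate(16, mx, my))
```

With tiny samples the estimated spread ratio is noisy, so the stronger equal-spread assumption of mean equating can yield lower total error, which is the motivation the abstract describes.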

  11. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 26 2014-07-01 2014-07-01 false Representative Sampling Methods I Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Pt. 261, App. I Appendix I to Part...

  12. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 26 2011-07-01 2011-07-01 false Representative Sampling Methods I Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Pt. 261, App. I Appendix I to Part...

  13. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 27 2012-07-01 2012-07-01 false Representative Sampling Methods I Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Pt. 261, App. I Appendix I to Part...

  14. 40 CFR Appendix I to Part 261 - Representative Sampling Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 27 2013-07-01 2013-07-01 false Representative Sampling Methods I Appendix I to Part 261 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) SOLID WASTES (CONTINUED) IDENTIFICATION AND LISTING OF HAZARDOUS WASTE Pt. 261, App. I Appendix I to Part...

  15. On the Importance of Accounting for Competing Risks in Pediatric Brain Cancer: II. Regression Modeling and Sample Size

    SciTech Connect

    Tai, Bee-Choo; Grundy, Richard; Machin, David

    2011-03-15

    Purpose: To accurately model the cumulative need for radiotherapy in trials designed to delay or avoid irradiation among children with malignant brain tumor, it is crucial to account for competing events and evaluate how each contributes to the timing of irradiation. An appropriate choice of statistical model is also important for adequate determination of sample size. Methods and Materials: We describe the statistical modeling of competing events (A, radiotherapy after progression; B, no radiotherapy after progression; and C, elective radiotherapy) using proportional cause-specific and subdistribution hazard functions. The procedures of sample size estimation based on each method are outlined. These are illustrated by use of data comparing children with ependymoma and other malignant brain tumors. The results from these two approaches are compared. Results: The cause-specific hazard analysis showed a reduction in hazards among infants with ependymoma for all event types, including Event A (adjusted cause-specific hazard ratio, 0.76; 95% confidence interval, 0.45-1.28). Conversely, the subdistribution hazard analysis suggested an increase in hazard for Event A (adjusted subdistribution hazard ratio, 1.35; 95% confidence interval, 0.80-2.30), but the reduction in hazards for Events B and C remained. Analysis based on subdistribution hazard requires a larger sample size than the cause-specific hazard approach. Conclusions: Notable differences in effect estimates and anticipated sample size were observed between methods when the main event showed a beneficial effect whereas the competing events showed an adverse effect on the cumulative incidence. The subdistribution hazard is the most appropriate for modeling treatment when its effects on both the main and competing events are of interest.
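Why competing events must be modeled explicitly can be seen in a small numerical sketch (not from the paper; the event data are invented): the naive 1 − Kaplan-Meier estimate that treats competing events as censoring overestimates the cumulative incidence of the main event.

```python
import numpy as np

# Hypothetical complete follow-up data: event time and cause
# (1 = main event, e.g. radiotherapy after progression; 2 = competing event).
times  = np.array([2, 3, 3, 4, 5, 6, 7, 8, 9, 10])
causes = np.array([1, 2, 1, 2, 1, 2, 2, 1, 1, 2])
n = len(times)
t_eval = 10

# With complete follow-up, the cumulative incidence of cause 1 is simply
# the fraction of subjects who experienced it by t_eval.
cif1 = np.sum((times <= t_eval) & (causes == 1)) / n

# Naive 1 - Kaplan-Meier, treating competing events as censored:
km = 1.0
for t in np.unique(times):
    at_risk = np.sum(times >= t)
    d = np.sum((times == t) & (causes == 1))
    km *= (1 - d / at_risk)
naive = 1 - km

print(cif1, naive)
```

Here the naive estimate (about 0.78) exceeds the true cumulative incidence (0.5) because subjects removed by competing events can never experience the main event, which is exactly the bias that cause-specific and subdistribution hazard models address in different ways.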

  16. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

To design effective and efficient research and monitoring programs, researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare the amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected were related to the relative lengths of the larval and breeding periods and to tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed, because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  17. Universal nucleic acids sample preparation method for cells, spores and their mixture

    DOEpatents

    Bavykin, Sergei [Darien, IL

    2011-01-18

The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types, including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e., spores). Unlike prior-art methods, which are focused on extracting nucleic acids from either vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types, or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample; isolates, labels, and fragments the nucleic acids; and purifies the labeled samples from excess dye.

  18. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... applicable. (b) Quality assurance program. The importer must conduct a quality assurance program, as specified in this paragraph (b), for each truck or rail car loading terminal. (1) Quality assurance samples... frequency of the quality assurance sampling and testing must be at least one sample for each 50 of...

  19. [Methods of the elaboration of data of the cardiological importance].

    PubMed

    Marchesi, C; Taddei, A; Varanini, M

    1987-12-01

This paper deals with some introductory topics in signal processing and decision making in cardiology. In both cases the material is organized around general schemes well suited to host different applications. Signal processing is divided into phases (acquisition, storage, and analysis), and each is described with applications to specific signals. Similarly, the methods for decision making have been simplified to a scheme comprising a "knowledge base" and an "inference method". The scheme is used to classify various implementations. Bayesian analysis and expert systems are introduced in some detail.

  20. Comparison of DNA preservation methods for environmental bacterial community samples

    USGS Publications Warehouse

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.

    2013-01-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  1. Comparison of aquatic macroinvertebrate samples collected using different field methods

    USGS Publications Warehouse

    Lenz, Bernard N.; Miller, Michael A.

    1996-01-01

Government agencies, academic institutions, and volunteer monitoring groups in the State of Wisconsin collect aquatic macroinvertebrate data to assess water quality. Sampling methods differ among agencies, reflecting the differences in the sampling objectives of each agency. Lack of information about data comparability impedes data sharing among agencies, which can result in duplicated sampling efforts or the underutilization of available information. To address these concerns, comparisons were made of macroinvertebrate samples collected from wadeable streams in Wisconsin by personnel from the U.S. Geological Survey National Water Quality Assessment Program (USGS-NAWQA), the Wisconsin Department of Natural Resources (WDNR), the U.S. Department of Agriculture-Forest Service (USDA-FS), and volunteers from the Water Action Volunteer-Water Quality Monitoring Program (WAV). This project was part of the Intergovernmental Task Force on Monitoring Water Quality (ITFM) Wisconsin Water Resources Coordination Project. The numbers, types, and environmental tolerances of the organisms collected were analyzed to determine whether the four different field methods used by the different agencies and volunteer groups provide comparable results. Additionally, this study compared the results of samples taken from different locations and habitats within the same streams.

  2. Bayesian Methods for Determining the Importance of Effects

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Criticisms have plagued the frequentist null-hypothesis significance testing (NHST) procedure since the day it was created from the Fisher Significance Test and Hypothesis Test of Jerzy Neyman and Egon Pearson. Alternatives to NHST exist in frequentist statistics, but competing methods are also avai...

  3. Sample Selected Averaging Method for Analyzing the Event Related Potential

    NASA Astrophysics Data System (ADS)

    Taguchi, Akira; Ono, Youhei; Kimura, Tomoaki

    The event related potential (ERP) is often measured through the oddball task, in which subjects are given a “rare stimulus” and a “frequent stimulus”. Measured ERPs are analyzed by the averaging technique, and the amplitude of the ERP P300 component becomes large when the “rare stimulus” is given. However, some measured trials do not contain the characteristic features of the ERP, so it is necessary to reject unsuitable measured trials before averaging. In this paper, we propose a rejection method for unsuitable measured ERPs for use with the averaging technique. Moreover, we combine the proposed method with Woody's adaptive filter method.
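    The trial-rejection idea can be sketched as follows. This is an illustrative criterion only (correlation of each trial with the plain ensemble average), not the selection rule proposed in the paper, and all names are hypothetical:

    ```python
    import numpy as np

    def selective_average(trials, threshold=0.2):
        """Average single-trial ERPs, rejecting trials that correlate poorly
        with the plain ensemble average (illustrative criterion only)."""
        trials = np.asarray(trials, dtype=float)
        template = trials.mean(axis=0)          # crude ERP template
        keep = [t for t in trials
                if np.corrcoef(t, template)[0, 1] >= threshold]
        return np.mean(keep, axis=0), len(keep)

    # Nine trials carrying a common P300-like waveform plus one noise-only trial.
    rng = np.random.default_rng(0)
    wave = np.sin(np.linspace(0.0, np.pi, 100))
    good = wave + 0.3 * rng.normal(size=(9, 100))
    bad = 5.0 * rng.normal(size=(1, 100))
    avg, n_kept = selective_average(np.vstack([good, bad]))
    ```

    In practice the template itself is contaminated by the bad trials, which is one reason methods such as Woody's adaptive filter iterate the template estimate.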

  4. New Methods of Sample Preparation for Atom Probe Specimens

    NASA Technical Reports Server (NTRS)

    Kuhlman, Kimberly, R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to contain manganese and aluminum. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, analyze samples of interest for geomicrobiology, and support the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples: many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. The dicing saw, commonly used to section semiconductor wafers into individual devices following manufacture, is a time-effective method for preparing high aspect ratio posts of poorly conducting materials. Femtosecond laser micromachining is also suitable for preparing posts. The FIB time required is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated using the dicing saw.

  5. Transuranic waste characterization sampling and analysis methods manual. Revision 1

    SciTech Connect

    Suermann, J.F.

    1996-04-01

    This Methods Manual provides a unified source of information on the sampling and analytical techniques that enable Department of Energy (DOE) facilities to comply with the requirements established in the current revision of the Transuranic Waste Characterization Quality Assurance Program Plan (QAPP) for the Waste Isolation Pilot Plant (WIPP) Transuranic (TRU) Waste Characterization Program (the Program) and the WIPP Waste Analysis Plan. This Methods Manual includes all of the testing, sampling, and analytical methodologies accepted by DOE for use in implementing the Program requirements specified in the QAPP and the WIPP Waste Analysis Plan. The procedures in this Methods Manual are comprehensive and detailed and are designed to provide the necessary guidance for the preparation of site-specific procedures. With some analytical methods, such as Gas Chromatography/Mass Spectrometry, the Methods Manual procedures may be used directly. With other methods, such as nondestructive characterization, the Methods Manual provides guidance rather than a step-by-step procedure. Sites must meet all of the specified quality control requirements of the applicable procedure. Each DOE site must document the details of the procedures it will use and demonstrate the efficacy of such procedures to the Manager, National TRU Program Waste Characterization, during Waste Characterization and Certification audits.

  6. Sampling Small Mammals in Southeastern Forests: The Importance of Trapping in Trees

    SciTech Connect

    Loeb, S.C.; Chapman, G.L.; Ridley, T.R.

    1999-01-01

    We investigated the effect of sampling methodology on the estimated richness and abundance of small mammal communities in loblolly pine forests. Trapping in trees using Sherman live traps was included along with routine ground trapping using the same device. Estimates of species richness did not differ between samples in which tree traps were included or excluded. However, diversity indices (Shannon-Wiener, Simpson, Shannon, and Brillouin) were strongly affected: the indices were significantly greater when tree samples were included, primarily as a result of flying squirrel captures. Without tree traps, the results suggested that cotton mice dominated the community. We recommend that tree traps be included in sampling.

  7. Methods for assessing relative importance in preference based outcome measures.

    PubMed

    Kaplan, R M; Feeny, D; Revicki, D A

    1993-12-01

    This paper reviews issues relevant to preference assessment for utility-based measures of health-related quality of life. Cost/utility studies require a common measure of health outcome, such as the quality adjusted life year (QALY). A key element in the QALY methodology is the measure of preference that estimates subjective health quality. Economists and psychologists differ in their preferred approach to preference measurement. Economists rely on utility assessment methods that formally consider economic trades, including the standard gamble, time trade-off, and person trade-off. However, some evidence suggests that many of the assumptions that underlie economic measurements of choice are open to challenge, because human information processors do poorly at integrating complex probability information when making decisions that involve risk. Further, economic analysis assumes that choices accurately correspond to the way rational humans use information. Psychology experiments suggest that methods commonly used for economic analysis do not represent the underlying true preference continuum, and some evidence supports the use of simple rating scales. More recent research by economists attempts to integrate cognitive models, while contemporary research by psychologists considers economic models of choice. The review also suggests that differences in preference between social groups tend to be small.

  8. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY SOIL SAMPLES

    SciTech Connect

    Maxwell, S.; Culligan, B.; Noyes, G.

    2009-11-09

    A new rapid method for the determination of actinides in soil and sediment samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used for samples up to 2 grams in emergency response situations. The actinides in soil method utilizes a rapid sodium hydroxide fusion method, a lanthanum fluoride soil matrix removal step, and a streamlined column separation process with stacked TEVA, TRU and DGA Resin cartridges. Lanthanum was separated rapidly and effectively from Am and Cm on DGA Resin. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha sources are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency soil samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinides in soil results were reported within 4-5 hours with excellent quality.

  9. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this... benzene concentration for compliance with the requirements of this subpart. (ii) Independent...

  10. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this..., 2015, to determine its benzene concentration for compliance with the requirements of this...

  11. 40 CFR 80.1347 - What are the sampling and testing requirements for refiners and importers?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Benzene Sampling, Testing and Retention Requirements § 80.1347 What are the sampling and testing... benzene requirements of this subpart, except as modified by paragraphs (a)(2), (a)(3) and (a)(4) of this... benzene concentration for compliance with the requirements of this subpart. (ii) Independent...

  12. Bandpass Sampling--An Opportunity to Stress the Importance of In-Depth Understanding

    ERIC Educational Resources Information Center

    Stern, Harold P. E.

    2010-01-01

    Many bandpass signals can be sampled at rates lower than the Nyquist rate, allowing significant practical advantages. Illustrating this phenomenon after discussing (and proving) Shannon's sampling theorem provides a valuable opportunity for an instructor to reinforce the principle that innovation is possible when students strive to have a complete…
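    The sub-Nyquist rates alluded to above follow from the classical bandpass sampling constraint 2·f_H/n ≤ f_s ≤ 2·f_L/(n−1) for integer n. A minimal sketch that enumerates the admissible rate ranges; the function name and the example band edges are illustrative, not from the article:

    ```python
    import math

    def valid_bandpass_rates(f_low, f_high):
        """Enumerate (n, min_rate, max_rate) tuples satisfying the classical
        bandpass sampling condition 2*f_high/n <= fs <= 2*f_low/(n-1)."""
        bandwidth = f_high - f_low
        n_max = math.floor(f_high / bandwidth)   # largest usable integer n
        ranges = []
        for n in range(1, n_max + 1):
            lo = 2.0 * f_high / n
            hi = math.inf if n == 1 else 2.0 * f_low / (n - 1)
            if lo <= hi:                          # non-empty alias-free range
                ranges.append((n, lo, hi))
        return ranges

    # A 20-25 MHz band: lowpass Nyquist reasoning would demand fs >= 50 MHz,
    # but bandpass sampling admits rates as low as 2 * 25 MHz / 5 = 10 MHz.
    rates = valid_bandpass_rates(20e6, 25e6)
    ```

    The n = 1 entry recovers the ordinary Nyquist requirement, which makes the comparison between the two viewpoints concrete for students.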

  13. Spanish Multicenter Normative Studies (NEURONORMA Project): methods and sample characteristics.

    PubMed

    Peña-Casanova, Jordi; Blesa, Rafael; Aguilar, Miquel; Gramunt-Fombuena, Nina; Gómez-Ansón, Beatriz; Oliva, Rafael; Molinuevo, José Luis; Robles, Alfredo; Barquero, María Sagrario; Antúnez, Carmen; Martínez-Parra, Carlos; Frank-García, Anna; Fernández, Manuel; Alfonso, Verónica; Sol, Josep M

    2009-06-01

    This paper describes the methods and sample characteristics of a series of Spanish normative studies (The NEURONORMA project). The primary objective of our research was to collect normative and psychometric information on a sample of people aged over 49 years. The normative information was based on a series of selected, but commonly used, neuropsychological tests covering attention, language, visuo-perceptual abilities, constructional tasks, memory, and executive functions. A sample of 356 community dwelling individuals was studied. Demographics, socio-cultural, and medical data were collected. Cognitive normality was validated via informants and a cognitive screening test. Norms were calculated for midpoint age groups. Effects of age, education, and sex were determined. The use of these norms should improve neuropsychological diagnostic accuracy in older Spanish subjects. These data may also be of considerable use for comparisons with other normative studies. Limitations of these normative data are also commented on.

  14. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... requirements apply to importers who transport motor vehicle diesel fuel, NRLM diesel fuel, or ECA marine fuel...; Nonroad, Locomotive, and Marine Diesel Fuel; and ECA Marine Fuel Sampling and Testing § 80.583 What... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the...

  15. The laboratory methods of induced polarization measurement of manganese sample

    NASA Astrophysics Data System (ADS)

    Adhiguna, D.; Handayani, G.

    2015-09-01

    Metallic minerals are polarizable, and this property can be used as the basis for metallic mineral exploration. Induced polarization is a geophysical method based on the principle of electrical charging and discharging of a capacitor, as applied to rock. In this study, we used the induced polarization method to observe the polarization phenomena that occur in rocks containing manganese minerals. With induced polarization, the chargeability of the rock can be determined; chargeability is one of the important properties of metallic materials. Measurements in this research were made in two different ways in order to compare the induced polarization responses obtained with each method.

  16. A direct method for e-cigarette aerosol sample collection.

    PubMed

    Olmedo, Pablo; Navas-Acien, Ana; Hess, Catherine; Jarmul, Stephanie; Rule, Ana

    2016-08-01

    E-cigarette use is increasing in populations around the world. Recent evidence has shown that the aerosol produced by e-cigarettes can contain a variety of toxicants. Published studies characterizing toxicants in e-cigarette aerosol have relied on filters, impingers or sorbent tubes, which are methods that require diluting or extracting the sample in a solution during collection. We have developed a collection system that directly condenses e-cigarette aerosol samples for chemical and toxicological analyses. The collection system consists of several cut pipette tips connected with short pieces of tubing. The pipette tip-based collection system can be connected to a peristaltic pump, a vacuum pump, or directly to an e-cigarette user for the e-cigarette aerosol to flow through the system. The pipette tip-based system condenses the aerosol produced by the e-cigarette and collects a liquid sample that is ready for analysis without the need of intermediate extraction solutions. We tested a total of 20 e-cigarettes from 5 different brands commercially available in Maryland. The pipette tip-based collection system condensed between 0.23 and 0.53 mL of post-vaped e-liquid after 150 puffs. The proposed method is highly adaptable, can be used during field work and in experimental settings, and allows collecting aerosol samples from a wide variety of e-cigarette devices, yielding a condensate of what is likely the exact substance being delivered to the lungs.

  17. THE IMPORTANCE OF THE MAGNETIC FIELD FROM AN SMA-CSO-COMBINED SAMPLE OF STAR-FORMING REGIONS

    SciTech Connect

    Koch, Patrick M.; Tang, Ya-Wen; Ho, Paul T. P.; Chen, Huei-Ru Vivien; Liu, Hau-Yu Baobab; Yen, Hsi-Wei; Lai, Shih-Ping; Zhang, Qizhou; Chen, How-Huan; Ching, Tao-Chung; Girart, Josep M.; Frau, Pau; Li, Hua-Bai; Li, Zhi-Yun; Padovani, Marco; Qiu, Keping; Rao, Ramprasad

    2014-12-20

    Submillimeter dust polarization measurements of a sample of 50 star-forming regions, observed with the Submillimeter Array (SMA) and the Caltech Submillimeter Observatory (CSO) covering parsec-scale clouds to milliparsec-scale cores, are analyzed in order to quantify the magnetic field importance. The magnetic field misalignment δ, the local angle between magnetic field and dust emission gradient, is found to be a prime observable, revealing distinct distributions for sources where the magnetic field is preferentially aligned with or perpendicular to the source minor axis. Source-averaged misalignment angles ⟨|δ|⟩ fall into systematically different ranges, reflecting the different source-magnetic field configurations. Possible bimodal ⟨|δ|⟩ distributions are found for the separate SMA and CSO samples. Combining both samples broadens the distribution with a wide maximum peak at small ⟨|δ|⟩ values. Assuming the 50 sources to be representative, the prevailing source-magnetic field configuration is one that statistically prefers small magnetic field misalignments |δ|. When interpreting |δ| together with a magnetohydrodynamics force equation, as developed in the framework of the polarization-intensity gradient method, a sample-based log-linear scaling fits the magnetic field tension-to-gravity force ratio ⟨Σ_B⟩ versus ⟨|δ|⟩ with ⟨Σ_B⟩ = 0.116 · exp(0.047 · ⟨|δ|⟩) ± 0.20 (mean error), providing a way to estimate the relative importance of the magnetic field based only on measurable field misalignments |δ|. The force ratio Σ_B discriminates systems that are collapsible on average (⟨Σ_B⟩ < 1) from other molecular clouds where the magnetic field still provides enough resistance against gravitational collapse (⟨Σ_B⟩ > 1). The sample-wide trend shows a transition around ⟨|δ|⟩ ≈ 45°. Defining an effective gravitational force ∼1 – ⟨Σ_B⟩, the average magnetic-field-reduced star formation efficiency is at least a
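    The quoted log-linear fit can be evaluated directly to locate the collapse threshold. A minimal sketch, assuming ⟨|δ|⟩ is given in degrees as in the abstract; the function name is ours:

    ```python
    import math

    def tension_to_gravity_ratio(mean_misalignment_deg):
        """Sample-based fit quoted in the abstract:
        <Sigma_B> = 0.116 * exp(0.047 * <|delta|>), with <|delta|> in degrees."""
        return 0.116 * math.exp(0.047 * mean_misalignment_deg)

    # The ratio crosses ~1 (collapsible vs. magnetically supported)
    # near 45 degrees, matching the transition reported for the sample.
    ratio_at_transition = tension_to_gravity_ratio(45.0)
    ```

    Note that the fit carries a quoted mean error of ±0.20, so the crossing point is a statistical trend rather than a sharp boundary.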

  18. A Mixed Methods Investigation of Mixed Methods Sampling Designs in Social and Health Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.

    2007-01-01

    A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…

  19. Rock sampling. [method for controlling particle size distribution

    NASA Technical Reports Server (NTRS)

    Blum, P. (Inventor)

    1971-01-01

    A method for sampling rock and other brittle materials and for controlling resultant particle sizes is described. The method involves cutting grooves in the rock surface to provide a grouping of parallel ridges and subsequently machining the ridges to provide a powder specimen. The machining step may comprise milling, drilling, lathe cutting or the like; but a planing step is advantageous. Control of the particle size distribution is effected primarily by changing the height and width of these ridges. This control exceeds that obtainable by conventional grinding.

  20. Recent advances in sample preparation techniques for effective bioanalytical methods.

    PubMed

    Kole, Prashant Laxman; Venkatesh, Gantala; Kotecha, Jignesh; Sheshala, Ravi

    2011-01-01

    This paper reviews recent developments in bioanalytical sample preparation techniques and gives an update on basic principles, theory, applications, and possibilities for automation, with a comparative discussion of the advantages and limitations of each technique. Conventional liquid-liquid extraction (LLE), protein precipitation (PP), and solid-phase extraction (SPE) are now considered methods of the past. The last decade has witnessed a rapid development of novel sample preparation techniques in bioanalysis. Developments in SPE techniques, such as selective sorbents, and in the overall approach to SPE, such as hybrid SPE and molecularly imprinted polymer SPE, are addressed. Considerable literature has been published in the area of solid-phase micro-extraction and its different versions, e.g. stir bar sorptive extraction, and their application in the development of selective and sensitive bioanalytical methods. Techniques such as dispersive solid-phase extraction, disposable pipette extraction, and micro-extraction by packed sorbent offer a variety of extraction phases and provide unique advantages to bioanalytical methods. On-line SPE utilizing column-switching techniques is rapidly gaining acceptance in bioanalytical applications. PP sample preparation techniques such as PP filter plates/tubes offer advantages such as removal of phospholipids and proteins from plasma/serum. Newer approaches to conventional LLE techniques (salting-out LLE) are also covered in this review article.

  1. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    SciTech Connect

    Liu, Bin

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, which translates the problem into finding a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then carefully tailor its parameters to make it resemble the posterior as closely as possible. We use the effective sample size (ESS), calculated from the IS draws, to measure the degree of approximation: the bigger the ESS, the better the proposal resembles the posterior. A difficulty in this tailoring operation lies in adjusting the number of mixture components in the proposal. Brute-force methods simply preset it as a large constant, which increases the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step to tailor this number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
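    The ESS diagnostic described above has a standard form for self-normalized importance weights, ESS = 1/Σ w_i². A minimal sketch (the function name is ours, and the adaptive mixture-tailoring machinery is not shown):

    ```python
    import numpy as np

    def effective_sample_size(log_weights):
        """ESS = 1 / sum(w_i^2) for self-normalized importance weights w_i."""
        lw = np.asarray(log_weights, dtype=float)
        w = np.exp(lw - lw.max())   # subtract max for numerical stability
        w /= w.sum()                # self-normalize
        return 1.0 / np.sum(w ** 2)

    # A proposal matching the target gives near-uniform weights and ESS ~ n;
    # a poor proposal concentrates weight on few draws and ESS collapses to ~1.
    ess_good = effective_sample_size(np.zeros(1000))
    ess_poor = effective_sample_size([0.0, -50.0, -50.0])
    ```

    Working in log space, as above, matters in practice: posterior and proposal densities in many dimensions easily under- or overflow before normalization.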

  2. Random Sampling of Quantum States: a Survey of Methods. And Some Issues Regarding the Overparametrized Method

    NASA Astrophysics Data System (ADS)

    Maziero, Jonas

    2015-12-01

    The numerical generation of random quantum states (RQS) is an important procedure for investigations in quantum information science. Here, we review some methods that may be used for performing that task. We start by presenting a simple procedure for generating random state vectors, for which the main tool is the random sampling of unbiased discrete probability distributions (DPD). Afterwards, the creation of random density matrices is addressed. In this context, we first present the standard method, which consists in using the spectral decomposition of a quantum state for getting RQS from random DPDs and random unitary matrices. In the sequence, the Bloch vector parametrization method is described. This approach, despite being useful in several instances, is not in general convenient for RQS generation. In the last part of the article, we consider the overparametrized method (OPM) and the related Ginibre and Bures techniques. The OPM can be used to create random positive semidefinite matrices with unit trace from randomly produced general complex matrices, in a simple way that is friendly for numerical implementations. We consider a physically relevant issue related to the possible domains that may be used for the real and imaginary parts of the elements of such general complex matrices. Subsequently, we note an overly rapid concentration of measure in the quantum state space that appears with this parametrization.
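    The Ginibre route mentioned above is short enough to sketch: ρ = GG†/Tr(GG†) is Hermitian, positive semidefinite, and unit-trace by construction. A minimal sketch (the function name is ours):

    ```python
    import numpy as np

    def random_density_matrix(dim, seed=None):
        """Random density matrix from a Ginibre matrix G: rho = G G† / Tr(G G†)."""
        rng = np.random.default_rng(seed)
        # General complex matrix with i.i.d. standard-normal real/imaginary parts.
        g = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
        m = g @ g.conj().T              # Hermitian and positive semidefinite
        return m / np.trace(m)          # normalize to unit trace

    rho = random_density_matrix(4, seed=0)
    ```

    Positivity needs no eigenvalue check here, since x†GG†x = ‖G†x‖² ≥ 0 for any vector x, which is what makes this parametrization friendly for numerical work.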

  3. RAPID SEPARATION METHOD FOR ACTINIDES IN EMERGENCY AIR FILTER SAMPLES

    SciTech Connect

    Maxwell, S.; Noyes, G.; Culligan, B.

    2010-02-03

    A new rapid method for the determination of actinides and strontium in air filter samples has been developed at the Savannah River Site Environmental Lab (Aiken, SC, USA) that can be used in emergency response situations. The actinides and strontium in air filter method utilizes a rapid acid digestion method and a streamlined column separation process with stacked TEVA, TRU and Sr Resin cartridges. Vacuum box technology and rapid flow rates are used to reduce analytical time. Alpha emitters are prepared using cerium fluoride microprecipitation for counting by alpha spectrometry. The purified ⁹⁰Sr fractions are mounted directly on planchets and counted by gas flow proportional counting. The method showed high chemical recoveries and effective removal of interferences. This new procedure was applied to emergency air filter samples received in the NRIP Emergency Response exercise administered by the National Institute of Standards and Technology (NIST) in April 2009. The actinide and ⁹⁰Sr in air filter results were reported in ~4 hours with excellent quality.

  4. Hand held sample tube manipulator, system and method

    DOEpatents

    Kenny, Donald V [Liberty Township, OH; Smith, Deborah L [Liberty Township, OH; Severance, Richard A [late of Columbus, OH

    2001-01-01

    A manipulator apparatus, system and method for measuring analytes present in sample tubes. The manipulator apparatus includes a housing having a central bore with an inlet end and outlet end; a plunger mechanism with at least a portion thereof slideably disposed for reciprocal movement within the central bore, the plunger mechanism having a tubular gas channel with an inlet end and an outlet end, the gas channel inlet end disposed in the same direction as said inlet end of the central bore, wherein the inlet end of said plunger mechanism is adapted for movement so as to expel a sample tube inserted in the bore at the outlet end of the housing, the inlet end of the plunger mechanism is adapted for connection to gas supply; a first seal is disposed in the housing for sealing between the central bore and the plunger mechanism; a second seal is disposed at the outlet end of the housing for sealing between the central bore and a sample tube; a holder mounted on the housing for holding the sample tube; and a biasing mechanism for returning the plunger mechanism to a starting position.

  5. Transfer of sampling methods for studies on most-at-risk populations (MARPs) in Brazil.

    PubMed

    Barbosa Júnior, Aristides; Pascom, Ana Roberta Pati; Szwarcwald, Célia Landmann; Kendall, Carl; McFarland, Willi

    2011-01-01

    The objective of this paper was to describe the process of transferring two methods for sampling most-at-risk populations: respondent-driven sampling (RDS) and time-space sampling (TSS). The article describes steps in the process, the methods used in the 10 pilot studies, and lessons learned. The process was conducted in six steps, from a state-of-the-art seminar to a workshop on writing articles with the results of the pilot studies. The principal investigators reported difficulties in the fieldwork and data analysis, independently of the pilot sampling method. One of the most important results of the transfer process is that Brazil now has more than 100 researchers able to sample MARPs using RDS or TSS. The process also enabled the construction of baselines for MARPS, thus providing a broader understanding of the dynamics of HIV infection in the country and the use of evidence to plan the national response to the epidemic in these groups.

  6. A Novel Method for Sampling Alpha-Helical Protein Backbones

    DOE R&D Accomplishments Database

    Fain, Boris; Levitt, Michael

    2001-01-01

    We present a novel technique of sampling the configurations of helical proteins. Assuming knowledge of native secondary structure, we employ assembly rules gathered from a database of existing structures to enumerate the geometrically possible 3-D arrangements of the constituent helices. We produce a library of possible folds for 25 helical protein cores. In each case the method finds significant numbers of conformations close to the native structure. In addition we assign coordinates to all atoms for 4 of the 25 proteins. In the context of database driven exhaustive enumeration our method performs extremely well, yielding significant percentages of structures (0.02%-82%) within 6 Å of the native structure. The method's speed and efficiency make it a valuable contribution towards the goal of predicting protein structure.
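    Judging a conformation to be "within 6 Å of the native structure" presupposes a superposition-based distance between coordinate sets. A minimal sketch of that comparison using the standard Kabsch alignment (illustrative code, not from the paper):

    ```python
    import numpy as np

    def rmsd_after_superposition(p, q):
        """Minimal RMSD between two N x 3 conformations after optimal
        translation and rotation (Kabsch algorithm via SVD)."""
        p = p - p.mean(axis=0)                    # remove translation
        q = q - q.mean(axis=0)
        u, s, vt = np.linalg.svd(p.T @ q)
        d = np.sign(np.linalg.det(u @ vt))        # guard against reflections
        rot = u @ np.diag([1.0, 1.0, d]) @ vt     # optimal rotation
        return float(np.sqrt(np.mean(np.sum((p @ rot - q) ** 2, axis=1))))

    # A rigidly rotated and translated conformation is 0 Å from itself.
    rng = np.random.default_rng(1)
    coords = rng.normal(size=(20, 3))
    c, s = np.cos(0.7), np.sin(0.7)
    rotation = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    moved = coords @ rotation.T + np.array([1.0, -2.0, 3.0])
    ```

    The reflection guard matters: without the sign correction the SVD can return an improper rotation, which would let a mirror image score an artificially low RMSD.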

  7. Well fluid isolation and sample apparatus and method

    DOEpatents

    Schalla, Ronald; Smith, Ronald M.; Hall, Stephen H.; Smart, John E.

    1995-01-01

    The present invention specifically permits purging and/or sampling of a well but only removing, at most, about 25% of the fluid volume compared to conventional methods and, at a minimum, removing none of the fluid volume from the well. The invention is an isolation assembly that is inserted into the well. The isolation assembly is designed so that only a volume of fluid between the outside diameter of the isolation assembly and the inside diameter of the well over a fluid column height from the bottom of the well to the top of the active portion (lower annulus) is removed. A seal may be positioned above the active portion thereby sealing the well and preventing any mixing or contamination of inlet fluid with fluid above the packer. Purged well fluid is stored in a riser above the packer. Ports in the wall of the isolation assembly permit purging and sampling of the lower annulus along the height of the active portion.

  8. A novel method for sampling bacteria on plant root and soil surfaces at the microhabitat scale.

    PubMed

    Dennis, Paul G; Miller, Anthony J; Clark, Ian M; Taylor, Richard G; Valsami-Jones, Eugenia; Hirsch, Penny R

    2008-09-01

    This study reports the first method for sampling bacteria at a spatial scale approximating a microhabitat. At the core of this method is the use of tungsten rods with laser-cut tips of known surface area (0.013 mm²). Exposed plant root or soil surfaces were viewed with a dissecting microscope and micro-sampling rods were guided to sample sites using a micro-manipulator. Bacteria that adhered to the sampling tips were then recovered for microbiological analyses. The efficiency of this method for removing bacteria from root surfaces was similar to that with which bacteria are recovered from dissected root segments using the conventional technique of washing. However, as the surface area of the micro-sampling tips was known, the new method has the advantage of eliminating inaccuracy in estimates of bacterial densities due to inaccurate estimation of the root or soil surface sampled. When used to investigate spatial distributions of rhizoplane bacteria, the new technique revealed trends that were consistent with those reported with existing methods, while providing access to additional information about community structure at a much smaller spatial scale. The spatial scale of this new method is ca. 1000 times smaller than that of other sampling methods involving swabbing. This novel technique represents an important methodological step facilitating microbial ecological investigations at a microhabitat scale.

  9. Methods for parasitic protozoans detection in the environmental samples.

    PubMed

    Skotarczak, B

    2009-09-01

    The environmental route of transmission of many parasitic protozoa and their potential for producing large numbers of transmissive stages constitute persistent threats to public and veterinary health. Conventional and new immunological and molecular methods make it possible to assess the occurrence, prevalence, levels, and sources of waterborne protozoa. Concentration, purification, and detection are the three key steps in all methods that have been approved for routine monitoring of waterborne cysts and oocysts. These steps have been optimized to such an extent that low levels of naturally occurring protozoan (oo)cysts can be efficiently recovered from water. Ten years have passed since the United States Environmental Protection Agency (USEPA) introduced methods 1622 and 1623 and used them to concentrate and detect the oocysts of Cryptosporidium and cysts of Giardia in water samples. Nevertheless, the methods still need study and improvement. Pre-PCR processing procedures have been developed, and are still being improved, to remove or reduce the effects of PCR inhibitors. Progress in molecular methods allows more precise distinction of species and simultaneous detection of several parasites; however, these methods are still not routinely used and need standardization. Standardized methods are required to maximize public health surveillance.

  10. Path Sampling Methods for Enzymatic Quantum Particle Transfer Reactions.

    PubMed

    Dzierlenga, M W; Varga, M J; Schwartz, S D

    2016-01-01

    The mechanisms of enzymatic reactions are studied via a host of computational techniques. While previous methods have been used successfully, many fail to incorporate the full dynamical properties of enzymatic systems. This can lead to misleading results in cases where enzyme motion plays a significant role in the reaction coordinate, which is especially relevant in particle transfer reactions where nuclear tunneling may occur. In this chapter, we outline previous methods, as well as discuss newly developed dynamical methods to interrogate mechanisms of enzymatic particle transfer reactions. These new methods allow for the calculation of free energy barriers and kinetic isotope effects (KIEs) with the incorporation of quantum effects through centroid molecular dynamics (CMD) and the full complement of enzyme dynamics through transition path sampling (TPS). Recent work, summarized in this chapter, applied the method for calculation of free energy barriers to reaction in lactate dehydrogenase (LDH) and yeast alcohol dehydrogenase (YADH). We found that tunneling plays an insignificant role in YADH but plays a more significant role in LDH, though not dominant over classical transfer. Additionally, we summarize the application of a TPS algorithm for the calculation of reaction rates in tandem with CMD to calculate the primary H/D KIE of YADH from first principles. We found that the computationally obtained KIE is within the margin of error of experimentally determined KIEs and corresponds to the KIE of particle transfer in the enzyme. These methods provide new ways to investigate enzyme mechanism with the inclusion of protein and quantum dynamics.

  11. Advances in sample preparation in electromigration, chromatographic and mass spectrometric separation methods.

    PubMed

    Gilar, M; Bouvier, E S; Compton, B J

    2001-02-16

    The quality of sample preparation is a key factor in determining the success of analysis. While the analysis of pharmaceutically important compounds in biological matrices has driven the development of sample clean-up procedures over the last 20 years, today's chemists face an additional challenge: sample preparation and analysis of complex biochemical samples for characterization of genotypic or phenotypic information contained in DNA and proteins. This review focuses on various sample pretreatment methods designed to meet the requirements for the analysis of biopolymers and small drugs in complex matrices. We discuss advances in the development of solid-phase extraction (SPE) sorbents, on-line SPE, membrane-based sample preparation, and sample clean-up of biopolymers prior to their analysis by mass spectrometry.

  12. Miniaturized sample preparation method for determination of amphetamines in urine.

    PubMed

    Nishida, Manami; Namera, Akira; Yashiki, Mikio; Kimura, Kojiro

    2004-07-16

    A simple and miniaturized sample preparation method for determination of amphetamines in urine was developed using on-column derivatization and gas chromatography-mass spectrometry (GC-MS). Urine was directly applied to the extraction column that was pre-packed with Extrelut and sodium carbonate. Amphetamine (AP) and methamphetamine (MA) in urine were adsorbed on the surface of Extrelut. AP and MA were then converted to a free base and derivatized to N-propoxycarbonyl derivatives using propylchloroformate on the column. Pentadeuterated MA was used as an internal standard. The recoveries of AP and MA from urine were 100 and 102%, respectively. The calibration curves showed linearity in the range of 0.50-50 microg/mL for AP and MA in urine. When urine samples containing two different concentrations (0.50 and 5.0 microg/mL) of AP and MA were determined, the intra-day and inter-day coefficients of variation were 1.4-7.7%. This method was applied to 14 medico-legal cases of MA intoxication. The results were compared and a good agreement was obtained with a HPLC method.

  13. Vadose Zone Sampling Methods for Detection of Preferential Pesticides Transport

    NASA Astrophysics Data System (ADS)

    Peranginangin, N.; Richards, B. K.; Steenhuis, T. S.

    2003-12-01

    Leaching of agriculturally applied chemicals through the vadose zone is a major cause of the occurrence of agrichemicals in groundwater. Accurate soil water sampling methods are needed to ensure meaningful monitoring results, especially for soils that have significant preferential flow paths. The purpose of this study was to assess the capability and effectiveness of various soil water sampling methods in detecting preferential transport of pesticides in a strongly structured silty clay loam (Hudson series) soil. Soil water sampling devices tested were wick pan and gravity pan lysimeters, tile lines, porous ceramic cups, and pipe lysimeters, all installed 45 to 105 cm below the ground surface. A reasonable worst-case scenario was tested by applying a simulated rain storm soon after pesticides were sprayed at agronomic rates. The herbicides atrazine (6-chloro-N2-ethyl-N4-isopropyl-1,3,5-triazine-2,4-diamine) and 2,4-D (2,4-dichloro-phenoxyacetic acid) were chosen as model compounds. A chloride (KCl) tracer was used to determine the spatial and temporal distribution of non-reactive solute and water, as well as to provide a basis for determining the retardation of pesticide movement. Results show that observed pesticide mobility was much greater than would be predicted by uniform flow. Under relatively high soil moisture conditions, gravity and wick pan lysimeters had comparably good collection efficiencies, whereas the wick samplers had an advantage over gravity-driven samplers when the soil moisture content was below field capacity. Pipe lysimeters had breakthrough patterns similar to those of the pan samplers. At the small plot scale, tile line samplers tended to underestimate solute concentrations because of water dilution around the samplers. Porous cup samplers performed poorly because of their sensitivity to local profile characteristics: only by chance can they intercept and sample the preferential flow paths that are critical to transport. 
Wick sampler had the least

  14. Importance sampling-based Monte Carlo simulation of time-domain optical coherence tomography with embedded objects.

    PubMed

    Periyasamy, Vijitha; Pramanik, Manojit

    2016-04-10

    Monte Carlo simulation of light propagation in biological tissue is widely used to study light-tissue interaction. Simulation for optical coherence tomography (OCT) studies requires handling of embedded objects of various shapes. In this work, time-domain OCT simulations for multilayered tissue with embedded objects (such as spheres, cylinders, ellipsoids, and cuboids) were performed. Improved importance sampling (IS) was implemented in the proposed OCT simulation for faster speed. First, IS was validated against standard and angular-biased Monte Carlo methods for OCT. Both class I and class II photons were in agreement across all three methods; however, the IS method achieved more than a tenfold improvement in simulation time. Next, B-scan images were obtained for the four types of embedded objects. All four shapes are clearly visible in the B-scan OCT images. With the improved IS, B-scan OCT images of embedded objects can be obtained in reasonable simulation time using a standard desktop computer. This user-friendly, C-based Monte Carlo simulation for tissue layers with embedded objects for OCT (MCEO-OCT) will be very useful for time-domain OCT simulations in many biological applications.
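
The abstract above reports results but no code. As a rough, self-contained illustration of the variance-reduction idea behind importance sampling in general (this is not the MCEO-OCT implementation, and the integrand is invented for the example), the sketch below estimates a sharply peaked integral two ways: with uniform sampling and with a sampling density biased toward the peak, each sample corrected by a likelihood-ratio weight.

```python
import random
import math

random.seed(42)

def f(x):
    # Hypothetical sharply peaked integrand, standing in for a rare
    # event (like detector-bound photon paths) on [0, 1].
    return math.exp(-50.0 * x * x)

N = 100_000

# Plain Monte Carlo: x uniform on [0, 1].
plain = sum(f(random.random()) for _ in range(N)) / N

# Importance sampling: draw x from q(x) = 2*(1 - x) on [0, 1]
# (inverse CDF: x = 1 - sqrt(1 - u)), which concentrates points near
# the peak at x = 0, then weight each sample by 1/q(x) to stay unbiased.
total = 0.0
for _ in range(N):
    u = random.random()
    x = 1.0 - math.sqrt(1.0 - u)
    total += f(x) / (2.0 * (1.0 - x))
biased = total / N

# Closed-form value of the integral for comparison.
exact = 0.5 * math.sqrt(math.pi / 50.0) * math.erf(math.sqrt(50.0))
print(plain, biased, exact)
```

Both estimators converge to the same value; the importance-sampled one simply does so with fewer wasted evaluations, which is the effect the paper exploits for speed.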

  15. Comparison of individual and pooled sampling methods for detecting bacterial pathogens of fish

    USGS Publications Warehouse

    Mumford, Sonia; Patterson, Chris; Evered, J.; Brunson, Ray; Levine, J.; Winton, J.

    2005-01-01

    Examination of finfish populations for viral and bacterial pathogens is an important component of fish disease control programs worldwide. Two methods are commonly used for collecting tissue samples for bacteriological culture, the currently accepted standards for detection of bacterial fish pathogens. The method specified in the Office International des Epizooties Manual of Diagnostic Tests for Aquatic Animals permits combining renal and splenic tissues from as many as 5 fish into pooled samples. The American Fisheries Society (AFS) Blue Book/US Fish and Wildlife Service (USFWS) Inspection Manual specifies the use of a bacteriological loop for collecting samples from the kidney of individual fish. An alternative would be to more fully utilize the pooled samples taken for virology. If implemented, this approach would provide substantial savings in labor and materials. To compare the relative performance of the AFS/USFWS method and this alternative approach, cultures of Yersinia ruckeri were used to establish low-level infections in groups of rainbow trout (Oncorhynchus mykiss) that were sampled by both methods. Yersinia ruckeri was cultured from 22 of 37 groups by at least 1 method. The loop method yielded 18 positive groups, with 1 group positive in the loop samples but negative in the pooled samples. The pooled samples produced 21 positive groups, with 4 groups positive in the pooled samples but negative in the loop samples. There was statistically significant agreement (Spearman coefficient 0.80, P < 0.001) in the relative ability of the 2 sampling methods to permit detection of low-level bacterial infections of rainbow trout.

  16. Drum plug piercing and sampling device and method

    DOEpatents

    Counts, Kevin T.

    2011-04-26

    An apparatus and method for piercing a drum plug of a drum in order to sample and/or vent gases that may accumulate in a space of the drum is provided. The drum is not damaged and can be reused since the pierced drum plug can be subsequently replaced. The apparatus includes a frame that is configured for engagement with the drum. A cylinder actuated by a fluid is mounted to the frame. A piercer is placed into communication with the cylinder so that actuation of the cylinder causes the piercer to move in a linear direction so that the piercer may puncture the drum plug of the drum.

  17. A GPU code for analytic continuation through a sampling method

    NASA Astrophysics Data System (ADS)

    Nordström, Johan; Schött, Johan; Locht, Inka L. M.; Di Marco, Igor

    We here present a code for performing analytic continuation of fermionic Green's functions and self-energies as well as bosonic susceptibilities on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVidia. Detailed scaling tests are presented, for two different GPUs, in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.

  18. RESULTS FROM EPA FUNDED RESEARCH PROGRAMS ON THE IMPORTANCE OF PURGE VOLUME, SAMPLE VOLUME, SAMPLE FLOW RATE AND TEMPORAL VARIATIONS ON SOIL GAS CONCENTRATIONS

    EPA Science Inventory

    Two research studies funded and overseen by EPA have been conducted since October 2006 on soil gas sampling methods and variations in shallow soil gas concentrations with the purpose of improving our understanding of soil gas methods and data for vapor intrusion applications. Al...

  19. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... by truck or rail car? 80.583 Section 80.583 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... car may comply with the following requirements instead of the requirements to sample and test...

  20. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... by truck or rail car? 80.583 Section 80.583 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... car may comply with the following requirements instead of the requirements to sample and test...

  1. 40 CFR 80.583 - What alternative sampling and testing requirements apply to importers who transport motor vehicle...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... by truck or rail car? 80.583 Section 80.583 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... diesel fuel, or ECA marine fuel by truck or rail car? Importers who import diesel fuel subject to the 15... car may comply with the following requirements instead of the requirements to sample and test...

  2. Sample Size for Assessing Agreement between Two Methods of Measurement by Bland-Altman Method.

    PubMed

    Lu, Meng-Jie; Zhong, Wei-Hua; Liu, Yu-Xiu; Miao, Hua-Zhang; Li, Yong-Chang; Ji, Mu-Huo

    2016-11-01

    The Bland-Altman method has been widely used for assessing agreement between two methods of measurement. However, the problem of sample size estimation for this approach has remained unsolved. We propose a new method of sample size estimation for Bland-Altman agreement assessment. In the Bland-Altman method, the conclusion on agreement is based on the width of the confidence interval for the LOAs (limits of agreement) in comparison to a predefined clinical agreement limit. Under the theory of statistical inference, formulae for sample size estimation are derived; they depend on the predetermined levels of α and β, the mean and standard deviation of the differences between the two measurements, and the predefined limits. With this new method, sample sizes are calculated under different parameter settings that occur frequently in method comparison studies, and Monte Carlo simulation is used to obtain the corresponding powers. The results of the Monte Carlo simulation showed that the achieved powers coincide with the predetermined levels, validating the correctness of the method. This method of sample size estimation can be applied in the Bland-Altman method to assess agreement between two methods of measurement.
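
The abstract does not reproduce the authors' formulae. The sketch below, under assumptions of our own (normally distributed differences and Bland and Altman's standard approximation for the standard error of an LOA), illustrates the decision rule described: agreement is claimed when the confidence intervals of both LOAs fall inside the predefined clinical limit, and repeating the study in simulation estimates the power for a given sample size.

```python
import math
import random

def loa_agreement(diffs, delta, z=1.96):
    """True if the approximate 95% CIs of both LOAs lie within +/- delta."""
    n = len(diffs)
    mean = sum(diffs) / n
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    half = z * sd  # LOAs = mean +/- 1.96*sd
    # Approximate standard error of an LOA (Bland & Altman):
    se = sd * math.sqrt(1.0 / n + z * z / (2.0 * (n - 1)))
    upper_ci = mean + half + z * se  # upper bound of the upper LOA's CI
    lower_ci = mean - half - z * se  # lower bound of the lower LOA's CI
    return -delta < lower_ci and upper_ci < delta

def simulated_power(n, mu, sd, delta, reps=2000, seed=1):
    """Fraction of simulated studies of size n that conclude agreement."""
    rng = random.Random(seed)
    hits = sum(
        loa_agreement([rng.gauss(mu, sd) for _ in range(n)], delta)
        for _ in range(reps)
    )
    return hits / reps

# Hypothetical setting: no bias, sd of differences 1, clinical limit 2.5.
print(simulated_power(50, 0.0, 1.0, 2.5))
print(simulated_power(200, 0.0, 1.0, 2.5))
```

Running both calls shows power rising with n, which is the trade-off the derived sample-size formulae resolve analytically.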

  3. Eigenvector method for umbrella sampling enables error analysis.

    PubMed

    Thiede, Erik H; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R

    2016-08-28

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.

  4. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, J.F.

    1996-10-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol, since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength, and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants. 2 figs.

  5. Method for testing earth samples for contamination by organic contaminants

    DOEpatents

    Schabron, John F.

    1996-01-01

    Provided is a method for testing earth samples for contamination by organic contaminants, and particularly for aromatic compounds such as those found in diesel fuel and other heavy fuel oils, kerosene, creosote, coal oil, tars and asphalts. A drying step is provided in which a drying agent is contacted with either the earth sample or a liquid extract phase to reduce the possibility of false indications of contamination that could occur when humic material is present in the earth sample. This is particularly a problem when using relatively safe, non-toxic and inexpensive polar solvents such as isopropyl alcohol, since the humic material tends to be very soluble in those solvents when water is present. Also provided is an ultraviolet spectroscopic measuring technique for obtaining an indication as to whether a liquid extract phase contains aromatic organic contaminants. In one embodiment, the liquid extract phase is subjected to a narrow and discrete band of radiation including a desired wavelength, and the ability of the liquid extract phase to absorb that wavelength of ultraviolet radiation is measured to provide an indication of the presence of aromatic organic contaminants.

  6. Eigenvector method for umbrella sampling enables error analysis

    NASA Astrophysics Data System (ADS)

    Thiede, Erik H.; Van Koten, Brian; Weare, Jonathan; Dinner, Aaron R.

    2016-08-01

    Umbrella sampling efficiently yields equilibrium averages that depend on exploring rare states of a model by biasing simulations to windows of coordinate values and then combining the resulting data with physical weighting. Here, we introduce a mathematical framework that casts the step of combining the data as an eigenproblem. The advantage to this approach is that it facilitates error analysis. We discuss how the error scales with the number of windows. Then, we derive a central limit theorem for averages that are obtained from umbrella sampling. The central limit theorem suggests an estimator of the error contributions from individual windows, and we develop a simple and computationally inexpensive procedure for implementing it. We demonstrate this estimator for simulations of the alanine dipeptide and show that it emphasizes low free energy pathways between stable states in comparison to existing approaches for assessing error contributions. Our work suggests the possibility of using the estimator and, more generally, the eigenvector method for umbrella sampling to guide adaptation of the simulation parameters to accelerate convergence.

  7. Personal sampling of airborne particles: method performance and data quality.

    PubMed

    Janssen, N A; Hoek, G; Harssema, H; Brunekreef, B

    1998-01-01

    A study of personal exposure to respirable particles (PM10) and fine particles (FP) was conducted in groups of 50-70 year-old adults and primary school children in the Netherlands. Four to eight personal measurements per subject were conducted, on weekdays only. Averaging time was 24 hours. Method performance was evaluated regarding compliance, flow, weighing procedure, field blanks and co-located operation of the personal samplers with stationary methods. Furthermore, the possibility that subjects change their behavior due to the wearing of personal sampling equipment was studied by comparing time activity on days of personal sampling with time activity on other weekdays. Compliance was high; 95% of the subjects who agreed to continue participating after the first measurement successfully completed the study, and, except for the first two days of FP sampling, over 90% of all personal measurements were successful. All pre- and post-sampling flow readings were within 10% of the required flow rate of 4 L/min. For PM10, precision of the gravimetric analyses was 2.8 micrograms/m3 and 0.7 micrograms/m3 for filters weighed on an analytical balance and a microbalance, respectively. The detection limit was 10.8 micrograms/m3 and 8.6 micrograms/m3, respectively. For FP, weighing precision was 0.4 micrograms/m3 and the detection limit was 5.3 micrograms/m3. All measurements were above the detection limit. Co-located operation of the personal sampler with stationary samplers gave highly correlated concentrations (R > 0.90). Outdoor PM10 concentrations measured with the personal sampler were on average 4% higher compared to a Sierra Anderson (SA) inlet and 9% higher compared to a PM10 Harvard Impactor (HI). With the FP cyclone, 6% higher classroom concentrations were measured compared to a PM2.5 HI. Adults spent significantly less time outdoors (0.5 hour) and more time at home (0.9 hour) on days of personal sampling compared to other weekdays. 
For children no significant differences in time

  8. Importance of closely spaced vertical sampling in delineating chemical and microbiological gradients in groundwater studies

    USGS Publications Warehouse

    Smith, R.L.; Harvey, R.W.; LeBlanc, D.R.

    1991-01-01

    Vertical gradients of selected chemical constituents, bacterial populations, bacterial activity and electron acceptors were investigated for an unconfined aquifer contaminated with nitrate and organic compounds on Cape Cod, Massachusetts, U.S.A. Fifteen-port multilevel sampling devices (MLS's) were installed within the contaminant plume at the source of the contamination, and at 250 and 2100 m downgradient from the source. Depth profiles of specific conductance and dissolved oxygen at the downgradient sites exhibited vertical gradients that were both steep and inversely related. Narrow zones (2-4 m thick) of high N2O and NH4+ concentrations were also detected within the contaminant plume. A 27-fold change in bacterial abundance; a 35-fold change in frequency of dividing cells (FDC), an indicator of bacterial growth; a 23-fold change in 3H-glucose uptake, a measure of heterotrophic activity; and substantial changes in overall cell morphology were evident within a 9-m vertical interval at 250 m downgradient. The existence of these gradients argues for the need for closely spaced vertical sampling in groundwater studies because small differences in the vertical placement of a well screen can lead to incorrect conclusions about the chemical and microbiological processes within an aquifer.

  9. Evaluating the performance of sampling plans to detect hypoglycin A in ackee fruit shipments imported into the United States.

    PubMed

    Whitaker, Thomas B; Saltsman, Joyce J; Ware, George M; Slate, Andrew B

    2007-01-01

    Hypoglycin A (HGA) is a toxic amino acid that is naturally produced in unripe ackee fruit. In 1973, the U.S. Food and Drug Administration (FDA) placed a worldwide import alert on ackee fruit, which banned the product from entering the United States. The FDA has considered establishing a regulatory limit for HGA and lifting the ban, which will require development of a monitoring program. The establishment of a regulatory limit for HGA requires the development of a scientifically based sampling plan to detect HGA in ackee fruit imported into the United States. Thirty-three lots of ackee fruit were sampled according to an experimental protocol in which 10 samples, i.e., ten 19 oz cans, were randomly taken from each lot and analyzed for HGA by using liquid chromatography. The total variance was partitioned into sampling and analytical variance components, which were found to be a function of the HGA concentration. Regression equations were developed to predict the total, sampling, and analytical variances as a function of HGA concentration. The observed HGA distribution among the test results for the 10 HGA samples was compared with the normal and lognormal distributions. A computer model based on the lognormal distribution was developed to predict the performance of sampling plan designs to detect HGA in ackee fruit shipments. The performance of several sampling plan designs was evaluated to demonstrate how to manipulate sample size and accept/reject limits to reduce misclassification of ackee fruit lots.
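
The paper's actual model fits the sampling and analytical variances by regression against observed data. As a purely hypothetical sketch of the kind of lognormal-based computer model the abstract describes (the fixed per-sample coefficient of variation here is invented for illustration, not the fitted variance function), the code below estimates the probability that a lot at a given true HGA concentration passes a k-sample accept/reject plan, tracing an operating-characteristic curve.

```python
import math
import random

def acceptance_prob(true_conc, k_samples, accept_limit,
                    cv=0.30, reps=5000, seed=7):
    """Probability that a lot at true_conc passes a k-sample plan."""
    rng = random.Random(seed)
    # Lognormal parameterized so each sample's mean equals true_conc
    # with the assumed per-sample coefficient of variation (cv).
    sigma2 = math.log(1.0 + cv * cv)
    mu = math.log(true_conc) - sigma2 / 2.0
    sigma = math.sqrt(sigma2)
    accepted = 0
    for _ in range(reps):
        mean_result = sum(
            rng.lognormvariate(mu, sigma) for _ in range(k_samples)
        ) / k_samples
        if mean_result <= accept_limit:
            accepted += 1
    return accepted / reps

# Lots well below the accept limit should almost always pass and lots
# well above it should almost always fail; increasing k_samples or
# tightening the limit shifts where lots are misclassified.
for conc in (50.0, 100.0, 150.0):
    print(conc, acceptance_prob(conc, k_samples=10, accept_limit=100.0))
```

Sweeping sample size and accept limit in this way is exactly the manipulation the paper evaluates to reduce misclassification of ackee fruit lots.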

  10. SAMSAN- MODERN NUMERICAL METHODS FOR CLASSICAL SAMPLED SYSTEM ANALYSIS

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1994-01-01

    SAMSAN was developed to aid the control system analyst by providing a self-consistent set of computer algorithms that support large-order control system design and evaluation studies, with an emphasis placed on sampled system analysis. Control system analysts have access to a vast array of published algorithms to solve an equally large spectrum of controls-related computational problems. The analyst usually spends considerable time and effort bringing these published algorithms to an integrated operational status and often finds them less general than desired. SAMSAN reduces the burden on the analyst by providing a set of algorithms that have been well tested and documented, and that can be readily integrated for solving control system problems. Algorithm selection for SAMSAN has been biased toward numerical accuracy for large-order systems, with computational speed and portability being considered important but not paramount. In addition to containing relevant subroutines from EISPACK for eigen-analysis and from LINPACK for the solution of linear systems and related problems, SAMSAN contains the following not so generally available capabilities: 1) Reduction of a real non-symmetric matrix to block diagonal form via a real similarity transformation matrix which is well conditioned with respect to inversion, 2) Solution of the generalized eigenvalue problem with balancing and grading, 3) Computation of all zeros of the determinant of a matrix of polynomials, 4) Matrix exponentiation and the evaluation of integrals involving the matrix exponential, with the option to first block diagonalize, 5) Root locus and frequency response for single variable transfer functions in the S, Z, and W domains, 6) Several methods of computing zeros for linear systems, and 7) The ability to generate documentation "on demand". All matrix operations in the SAMSAN algorithms assume non-symmetric matrices with real double precision elements. There is no fixed size limit on any matrix in any
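
Capability 4 in the list above (the matrix exponential and integrals involving it) is the core of zero-order-hold discretization in sampled system analysis. The sketch below is not SAMSAN code; it is a minimal pure-Python illustration, using a truncated Taylor series (adequate only for small, well-scaled matrices), of computing A_d = exp(AT) and B_d = (integral from 0 to T of exp(As) ds) B for a continuous-time system x' = Ax + Bu with sample period T.

```python
import math

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def mat_add(X, Y):
    return [[a + b for a, b in zip(rx, ry)] for rx, ry in zip(X, Y)]

def mat_scale(X, c):
    return [[a * c for a in row] for row in X]

def identity(n):
    return [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]

def discretize(A, B, T, terms=20):
    """Return (A_d, B_d) via truncated series for exp(AT) and its integral."""
    n = len(A)
    Ad = identity(n)               # accumulates sum of (A T)^k / k!
    S = mat_scale(identity(n), T)  # accumulates sum of A^k T^(k+1) / (k+1)!
    power = identity(n)            # holds A^k
    for k in range(1, terms):
        power = mat_mul(power, A)
        Ad = mat_add(Ad, mat_scale(power, T ** k / math.factorial(k)))
        S = mat_add(S, mat_scale(power, T ** (k + 1) / math.factorial(k + 1)))
    return Ad, mat_mul(S, B)

# Double integrator, where the exact answers are known in closed form:
# A_d = [[1, T], [0, 1]] and B_d = [[T^2 / 2], [T]].
A = [[0.0, 1.0], [0.0, 0.0]]
B = [[0.0], [1.0]]
Ad, Bd = discretize(A, B, 0.1)
print(Ad, Bd)
```

Production codes like SAMSAN use more robust schemes (e.g., block diagonalization first, as the abstract notes) rather than a raw Taylor series, which loses accuracy for large or badly scaled matrices.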

  11. De novo mutations from sporadic schizophrenia cases highlight important signaling genes in an independent sample.

    PubMed

    Kranz, Thorsten M; Harroch, Sheila; Manor, Orly; Lichtenberg, Pesach; Friedlander, Yechiel; Seandel, Marco; Harkavy-Friedman, Jill; Walsh-Messinger, Julie; Dolgalev, Igor; Heguy, Adriana; Chao, Moses V; Malaspina, Dolores

    2015-08-01

    Schizophrenia is a debilitating syndrome with high heritability. Genomic studies reveal more than a hundred genetic variants, largely nonspecific and of small effect size, that do not account for its high heritability. De novo mutations are one mechanism whereby disease-related alleles may be introduced into the population, although these have not been leveraged to explore the disease in general samples. This paper describes a framework to find high-impact genes for schizophrenia. This study consists of two different datasets. First, whole exome sequencing was conducted to identify disruptive de novo mutations in 14 complete parent-offspring trios with sporadic schizophrenia from Jerusalem, which identified 5 sporadic cases with de novo gene mutations in 5 different genes (PTPRG, TGM5, SLC39A13, BTK, CDKN3). Next, targeted exome capture of these genes was conducted in 48 well-characterized, unrelated, ethnically diverse schizophrenia cases, recruited and characterized by the same research team in New York (NY sample), which demonstrated extremely rare and potentially damaging variants (MAF<0.01) in three of the five genes in 12/48 cases (25%), including PTPRG (5 cases), SLC39A13 (4 cases) and TGM5 (4 cases), a higher number than usually identified by whole exome sequencing. Cases differed in cognition and illness features based on which mutation-enriched gene they carried. Functional de novo mutations in protein-interaction domains in sporadic schizophrenia can illuminate risk genes that increase the propensity to develop schizophrenia across ethnicities.

  12. BMAA extraction of cyanobacteria samples: which method to choose?

    PubMed

    Lage, Sandra; Burian, Alfred; Rasmussen, Ulla; Costa, Pedro Reis; Annadotter, Heléne; Godhe, Anna; Rydberg, Sara

    2016-01-01

    β-N-Methylamino-L-alanine (BMAA), a neurotoxin reportedly produced by cyanobacteria, diatoms and dinoflagellates, is proposed to be linked to the development of neurological diseases. BMAA has been found in aquatic and terrestrial ecosystems worldwide, both in its phytoplankton producers and in several invertebrate and vertebrate organisms that bioaccumulate it. LC-MS/MS is the most frequently used analytical technique in BMAA research due to its high selectivity, though consensus is lacking as to the best extraction method to apply. This study accordingly surveys the efficiency of three extraction methods regularly used in BMAA research to extract BMAA from cyanobacteria samples. The results obtained provide insights into possible reasons for the BMAA concentration discrepancies in previous publications. In addition and according to the method validation guidelines for analysing cyanotoxins, the TCA protein precipitation method, followed by AQC derivatization and LC-MS/MS analysis, is now validated for extracting protein-bound (after protein hydrolysis) and free BMAA from cyanobacteria matrix. BMAA biological variability was also tested through the extraction of diatom and cyanobacteria species, revealing a high variance in BMAA levels (0.0080-2.5797 μg g(-1) DW).

  13. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    NASA Astrophysics Data System (ADS)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave-assisted extraction, MAE, and organic solvent extraction [5]. Similarly, miniaturized analytical space flight instruments in development that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid chromatography-mass spectrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to occur at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds with natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e. salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  14. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  15. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  16. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  17. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  18. 19 CFR 151.70 - Method of sampling by Customs.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... THE TREASURY (CONTINUED) EXAMINATION, SAMPLING, AND TESTING OF MERCHANDISE Wool and Hair § 151.70... feasible to obtain a representative general sample of the wool or hair in a sampling unit or to test such...

  19. Allergic contact dermatitis from exotic woods: importance of patch-testing with patient-provided samples.

    PubMed

    Podjasek, Joshua O; Cook-Norris, Robert H; Richardson, Donna M; Drage, Lisa A; Davis, Mark D P

    2011-01-01

    Exotic woods from tropical and subtropical regions (eg, from South America, south Asia, and Africa) frequently are used occupationally and recreationally by woodworkers and hobbyists. These exotic woods more commonly provoke irritant contact dermatitis reactions, but they also can provoke allergic contact dermatitis reactions. We report three patients seen at Mayo Clinic (Rochester, MN) with allergic contact dermatitis reactions to exotic woods. Patch testing was performed and included patient-provided wood samples. Avoidance of identified allergens was recommended. For all patients, the dermatitis cleared or improved after avoidance of the identified allergens. Clinicians must be aware of the potential for allergic contact dermatitis reactions to compounds in exotic woods. Patch testing should be performed with suspected woods for diagnostic confirmation and allowance of subsequent avoidance of the allergens.

  20. Chlamydophila pneumoniae diagnostics: importance of methodology in relation to timing of sampling.

    PubMed

    Hvidsten, D; Halvorsen, D S; Berdal, B P; Gutteberg, T J

    2009-01-01

    The diagnostic impact of PCR-based detection was compared to single-serum IgM antibody measurement and IgG antibody seroconversion during an outbreak of Chlamydophila pneumoniae in a military community. Nasopharyngeal swabs for PCR-based detection, and serum, were obtained from 127 conscripts during the outbreak. Serum, drawn many months before the outbreak, provided the baseline antibody status. C. pneumoniae IgM and IgG antibodies were assayed using microimmunofluorescence (MIF), enzyme immunoassay (EIA) and recombinant ELISA (rELISA). Two reference standard tests were applied: (i) C. pneumoniae PCR; and (ii) assay of C. pneumoniae IgM antibodies, defined as positive if ≥2 IgM antibody assays (i.e. rELISA with MIF and/or EIA) were positive. In 33 subjects, of whom two tested negative according to IgM antibody assays and IgG seroconversion, C. pneumoniae DNA was detected by PCR. The sensitivities were 79%, 85%, 88% and 68%, respectively, and the specificities were 86%, 84%, 78% and 93%, respectively, for MIF IgM, EIA IgM, rELISA IgM and PCR. In two subjects, acute infection was diagnosed on the basis of IgG antibody seroconversion alone. The sensitivity of PCR detection was lower than that of any IgM antibody assay. This may be explained by the late sampling, or clearance of the organism following antibiotic treatment. The results of assay evaluation studies are affected not only by the choice of reference standard tests, but also by the timing of sampling for the different test principles used. On the basis of these findings, a combination of nasopharyngeal swabbing for PCR detection and specific single-serum IgM measurement is recommended in cases of acute respiratory C. pneumoniae infection.
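The sensitivity and specificity figures reported above follow directly from 2×2 agreement counts against a reference standard. A minimal sketch of that computation (the counts below are hypothetical, not the study's data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity and specificity from 2x2 confusion counts
    (tp/fn among reference-positives, tn/fp among reference-negatives)."""
    sensitivity = tp / (tp + fn)  # fraction of true cases the assay detects
    specificity = tn / (tn + fp)  # fraction of non-cases the assay clears
    return sensitivity, specificity

# Hypothetical counts for illustration:
sens, spec = sensitivity_specificity(tp=28, fn=5, tn=80, fp=14)
print(round(sens, 2), round(spec, 2))  # → 0.85 0.85
```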

  1. Rapid method for the determination of 226Ra in hydraulic fracturing wastewater samples

    SciTech Connect

    Maxwell, Sherrod L.; Culligan, Brian K.; Warren, Richard A.; McAlister, Daniel R.

    2016-03-24

    A new method that rapidly preconcentrates and measures 226Ra from hydraulic fracturing wastewater samples was developed in the Savannah River Environmental Laboratory. The method improves the quality of 226Ra measurements using gamma spectrometry by providing up to 100x preconcentration of 226Ra from this difficult sample matrix, which contains very high levels of calcium, barium, strontium, magnesium and sodium. The high chemical yield, typically 80-90%, facilitates a low detection limit, important for lower level samples, and indicates method ruggedness. Ba-133 tracer is used to determine chemical yield and correct for geometry-related counting issues. The 226Ra sample preparation takes < 2 hours.
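The role of the Ba-133 tracer above is a generic yield-tracer correction: the fraction of tracer recovered through the chemistry is taken as the chemical yield, and the analyte result is scaled up by that fraction. A simplified sketch of the arithmetic under that assumption (function names and numbers are illustrative, not from the paper):

```python
def yield_corrected_activity(measured_cps, tracer_added_bq,
                             tracer_recovered_bq, counting_efficiency):
    """Correct a gamma measurement for losses during sample preparation.

    The chemical yield is the fraction of added tracer recovered through
    the separation; the analyte activity is divided by it, on the
    assumption that analyte and tracer behave identically in the chemistry.
    """
    chemical_yield = tracer_recovered_bq / tracer_added_bq
    activity_bq = measured_cps / counting_efficiency  # counts -> decays
    return activity_bq / chemical_yield, chemical_yield

# Hypothetical values: 0.034 net cps, 10 Bq tracer added, 8.5 Bq recovered,
# 25% counting efficiency.
activity, y = yield_corrected_activity(0.034, 10.0, 8.5, 0.25)
print(round(activity, 3), y)  # → 0.16 0.85
```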

  2. Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.

    PubMed

    Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D

    2016-04-01

    The study of insect succession in cadavers and the classification of arthropods have mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some of them being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why either sampling method is biased towards certain families. Information is also provided on which sampling technique would be more appropriate for detecting a particular family.

  3. A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*

    PubMed Central

    Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing

    2016-01-01

    Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMs output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569

  4. Detection of Acanthamoeba and Toxoplasma in River Water Samples by Molecular Methods in Iran

    PubMed Central

    MAHMOUDI, Mohammad Reza; KAZEMI, Bahram; HAGHIGHI, Ali; KARANIS, Panagiotis

    2015-01-01

    Background: Free-living amoebae such as Acanthamoeba species may act as carriers of Cryptosporidium and Toxoplasma oocysts, thus, may play an important role in the water-borne transmission of these parasites. In the present study, a loop mediated isothermal amplification (LAMP) method for detection of Toxoplasma and a PCR assay were developed for investigation of Acanthamoeba in environmental water samples. Methods: A total of 34 samples were collected from the surface water in Guilan Province. Water samples were filtrated with membrane filters and followed by DNA extraction. PCR and LAMP methods were used for detection of the protozoan parasites Acanthamoeba and Toxoplasma respectively. Results: In total, 30 and 2 of 34 samples were positive for Acanthamoeba and Toxoplasma oocysts respectively. Two samples were positive for both investigated parasites. Conclusion: The investigated water supplies are contaminated by Toxoplasma and Acanthamoeba (oo)cysts. Acanthamoeba may play an important role in water-borne transmission of Toxoplasma in the study area. For the first time in Iran, the LAMP protocol was used effectively for the detection of Toxoplasma in surface water samples. PMID:26246823

  5. Passive sampling methods for contaminated sediments: Risk assessment and management

    PubMed Central

    Greenberg, Marc S; Chapman, Peter M; Allan, Ian J; Anderson, Kim A; Apitz, Sabine E; Beegan, Chris; Bridges, Todd S; Brown, Steve S; Cargill, John G; McCulloch, Megan C; Menzie, Charles A; Shine, James P; Parkerton, Thomas F

    2014-01-01

    This paper details how activity-based passive sampling methods (PSMs), which provide information on bioavailability in terms of freely dissolved contaminant concentrations (Cfree), can be used to better inform risk management decision making at multiple points in the process of assessing and managing contaminated sediment sites. PSMs can increase certainty in site investigation and management, because Cfree is a better predictor of bioavailability than total bulk sediment concentration (Ctotal) for 4 key endpoints included in conceptual site models (benthic organism toxicity, bioaccumulation, sediment flux, and water column exposures). The use of passive sampling devices (PSDs) presents challenges with respect to representative sampling for estimating average concentrations and other metrics relevant for exposure and risk assessment. These challenges can be addressed by designing studies that account for sources of variation associated with PSMs and considering appropriate spatial scales to meet study objectives. Possible applications of PSMs include: quantifying spatial and temporal trends in bioavailable contaminants, identifying and evaluating contaminant source contributions, calibrating site-specific models, and improving weight-of-evidence based decision frameworks. PSM data can be used to assist in delineating sediment management zones based on likelihood of exposure effects, monitor remedy effectiveness, and evaluate risk reduction after sediment treatment, disposal, or beneficial reuse after management actions. Examples are provided illustrating why PSMs and freely dissolved contaminant concentrations (Cfree) should be incorporated into contaminated sediment investigations and study designs to better focus on and understand contaminant bioavailability, more accurately estimate exposure to sediment-associated contaminants, and better inform risk management decisions. Research and communication needs for encouraging broader use are discussed.

  6. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    PubMed

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
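The clustering step the authors describe, grouping participants by similar binary profiles of qualitative codes, can be sketched with a tiny k-means on 0/1 vectors (a dependency-free stand-in for the hierarchical, K-means, and latent class methods compared in the paper; the data below are synthetic):

```python
def kmeans_binary(profiles, k, iters=20):
    """Tiny k-means for binary code profiles (presence/absence of codes).

    Centers are kept as real-valued code frequencies; squared Euclidean
    distance on 0/1 data orders points consistently with Hamming distance.
    """
    # Deterministic init: the first k distinct profiles become centers.
    centers = []
    for p in profiles:
        if list(p) not in centers:
            centers.append(list(p))
        if len(centers) == k:
            break

    def nearest(p):
        return min(range(k),
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(p, centers[c])))

    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in profiles:
            groups[nearest(p)].append(p)
        for j, g in enumerate(groups):
            if g:  # update center to the mean code frequency of its group
                centers[j] = [sum(col) / len(g) for col in zip(*g)]
    return [nearest(p) for p in profiles]

# Two well-separated synthetic "code profiles" (hypothetical data):
data = [[1, 1, 0, 0]] * 5 + [[0, 0, 1, 1]] * 5
print(kmeans_binary(data, k=2))  # → [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
```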

  7. Improved transition path sampling methods for simulation of rare events.

    PubMed

    Chopra, Manan; Malshe, Rohit; Reddy, Allam S; de Pablo, J J

    2008-04-14

    The free energy surfaces of a wide variety of systems encountered in physics, chemistry, and biology are characterized by the existence of deep minima separated by numerous barriers. One of the central aims of recent research in computational chemistry and physics has been to determine how transitions occur between deep local minima on rugged free energy landscapes, and transition path sampling (TPS) Monte-Carlo methods have emerged as an effective means for numerical investigation of such transitions. Many of the shortcomings of TPS-like approaches generally stem from their high computational demands. Two new algorithms are presented in this work that improve the efficiency of TPS simulations. The first algorithm uses biased shooting moves to render the sampling of reactive trajectories more efficient. The second algorithm is shown to substantially improve the accuracy of the transition state ensemble by introducing a subset of local transition path simulations in the transition state. The system considered in this work consists of a two-dimensional rough energy surface that is representative of numerous systems encountered in applications. When taken together, these algorithms provide gains in efficiency of over two orders of magnitude when compared to traditional TPS simulations.

  8. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods

    USGS Publications Warehouse

    Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.

    2014-01-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  9. Sampling trace organic compounds in water: a comparison of a continuous active sampler to continuous passive and discrete sampling methods.

    PubMed

    Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A

    2014-03-01

    A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.

  10. Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples

    SciTech Connect

    Maxwell, Sherrod L.; Hutchison, Jay B.; McAlister, Daniel R.

    2015-02-14

    Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. The failures illustrate the importance of rugged soil dissolution methods for the accurate measurement of analytes in the sample matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 ºC in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. Plutonium can also be separated and measured using this method. Batches of 12 samples can be prepared for measurement in <5 hours.

  11. Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples

    DOE PAGES

    Maxwell, Sherrod L.; Hutchison, Jay B.; McAlister, Daniel R.

    2015-02-14

    Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. The failures illustrate the importance of rugged soil dissolution methods for the accurate measurement of analytes in the sample matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 ºC in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. Plutonium can also be separated and measured using this method. Batches of 12 samples can be prepared for measurement in <5 hours.

  12. A simple capacitive method to evaluate ethanol fuel samples

    NASA Astrophysics Data System (ADS)

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.

    2017-02-01

    Ethanol is a biofuel used worldwide. However, the presence of excessive water either during the distillation process or by fraudulent adulteration is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel plate capacitor containing a conformal aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range of water concentrations, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few.

  13. A simple capacitive method to evaluate ethanol fuel samples

    PubMed Central

    Vello, Tatiana P.; de Oliveira, Rafael F.; Silva, Gustavo O.; de Camargo, Davi H. S.; Bufon, Carlos C. B.

    2017-01-01

    Ethanol is a biofuel used worldwide. However, the presence of excessive water either during the distillation process or by fraudulent adulteration is a major concern in the use of ethanol fuel. High water levels may cause engine malfunction, in addition to being considered illegal. Here, we describe the development of a simple, fast and accurate platform based on nanostructured sensors to evaluate ethanol samples. The device fabrication is facile, based on standard microfabrication and thin-film deposition methods. The sensor operation relies on capacitance measurements employing a parallel plate capacitor containing a conformal aluminum oxide (Al2O3) thin layer (15 nm). The sensor operates over the full range of water concentrations, i.e., from approximately 0% to 100% vol. of water in ethanol, with water traces being detectable down to 0.5% vol. These characteristics make the proposed device unique with respect to other platforms. Finally, the good agreement between the sensor response and analyses performed by gas chromatography of ethanol biofuel endorses the accuracy of the proposed method. Due to the full operation range, the reported sensor has the technological potential for use as a point-of-care analytical tool at gas stations or in the chemical, pharmaceutical, and beverage industries, to mention a few. PMID:28240312
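Converting a raw capacitance reading into a water percentage would in practice go through a calibration curve measured on reference mixtures. A generic piecewise-linear lookup illustrating that step (all calibration numbers below are hypothetical, not the sensor's actual response):

```python
from bisect import bisect_right

def water_content(cap_pf, calibration):
    """Estimate % water in ethanol from a capacitance reading (pF) by
    linear interpolation over (capacitance, water_%) calibration points."""
    calibration = sorted(calibration)
    caps = [c for c, _ in calibration]
    if cap_pf <= caps[0]:          # clamp below the calibrated range
        return calibration[0][1]
    if cap_pf >= caps[-1]:         # clamp above the calibrated range
        return calibration[-1][1]
    j = bisect_right(caps, cap_pf) - 1
    (c0, w0), (c1, w1) = calibration[j], calibration[j + 1]
    return w0 + (w1 - w0) * (cap_pf - c0) / (c1 - c0)

# Hypothetical calibration points (capacitance rises with water fraction):
cal = [(50, 0), (120, 25), (210, 50), (320, 75), (450, 100)]
print(water_content(165, cal))  # → 37.5
```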

  14. MARKOV CHAIN MONTE CARLO POSTERIOR SAMPLING WITH THE HAMILTONIAN METHOD

    SciTech Connect

    K. HANSON

    2001-02-01

    The Markov Chain Monte Carlo technique provides a means for drawing random samples from a target probability density function (pdf). MCMC allows one to assess the uncertainties in a Bayesian analysis described by a numerically calculated posterior distribution. This paper describes the Hamiltonian MCMC technique in which a momentum variable is introduced for each parameter of the target pdf. In analogy to a physical system, a Hamiltonian H is defined as a kinetic energy involving the momenta plus a potential energy {var_phi}, where {var_phi} is minus the logarithm of the target pdf. Hamiltonian dynamics allows one to move along trajectories of constant H, taking large jumps in the parameter space with relatively few evaluations of {var_phi} and its gradient. The Hamiltonian algorithm alternates between picking a new momentum vector and following such trajectories. The efficiency of the Hamiltonian method for multidimensional isotropic Gaussian pdfs is shown to remain constant at around 7% for up to several hundred dimensions. The Hamiltonian method handles correlations among the variables much better than the standard Metropolis algorithm. A new test, based on the gradient of {var_phi}, is proposed to measure the convergence of the MCMC sequence.
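The algorithm described above, drawing a fresh momentum, following a leapfrog trajectory of approximately constant H, and accepting on the energy error, can be sketched in a few lines for a one-dimensional target (a minimal illustration, not the paper's implementation):

```python
import math
import random

def hmc_sample(grad_phi, phi, x0, n_samples, eps=0.2, n_steps=20, seed=1):
    """Minimal 1D Hamiltonian Monte Carlo.

    phi(x) is minus the log of the target pdf; H = p^2/2 + phi(x).
    Each iteration draws a fresh momentum, integrates Hamilton's equations
    with the leapfrog scheme, and accepts/rejects on the change in H.
    """
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)
        x_new, p_new = x, p
        p_new -= 0.5 * eps * grad_phi(x_new)   # initial half kick
        for _ in range(n_steps - 1):
            x_new += eps * p_new               # full position step
            p_new -= eps * grad_phi(x_new)     # full momentum step
        x_new += eps * p_new
        p_new -= 0.5 * eps * grad_phi(x_new)   # final half kick
        dH = (p_new ** 2 - p ** 2) / 2 + phi(x_new) - phi(x)
        if dH < 0 or rng.random() < math.exp(-dH):
            x = x_new                          # Metropolis acceptance
        samples.append(x)
    return samples

# Target: standard Gaussian, phi(x) = x^2 / 2, so grad_phi(x) = x.
s = hmc_sample(grad_phi=lambda x: x, phi=lambda x: x * x / 2,
               x0=3.0, n_samples=2000)
print(sum(s) / len(s))  # sample mean should be near 0
```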

  15. Spline methods for approximating quantile functions and generating random samples

    NASA Technical Reports Server (NTRS)

    Schiess, J. R.; Matthews, C. G.

    1985-01-01

    Two cubic spline formulations are presented for representing the quantile function (inverse cumulative distribution function) of a random sample of data. Both B-spline and rational spline approximations are compared with analytic representations of the quantile function. It is also shown how these representations can be used to generate random samples for use in simulation studies. Comparisons are made on samples generated from known distributions and a sample of experimental data. The spline representations are more accurate for multimodal and skewed samples and require much less time to generate samples than the analytic representation.
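The inversion idea, fitting the quantile function once and then mapping uniform deviates through it, can be illustrated with a piecewise-linear interpolant standing in for the paper's cubic splines (kept linear here to stay dependency-free):

```python
import random
from bisect import bisect_right

def make_quantile_interpolator(sample):
    """Piecewise-linear approximation to the quantile function of a sample
    (a simple stand-in for a fitted cubic spline)."""
    xs = sorted(sample)
    n = len(xs)
    us = [(i + 0.5) / n for i in range(n)]  # plotting positions in (0, 1)

    def quantile(u):
        if u <= us[0]:
            return xs[0]
        if u >= us[-1]:
            return xs[-1]
        j = bisect_right(us, u) - 1
        t = (u - us[j]) / (us[j + 1] - us[j])
        return xs[j] + t * (xs[j + 1] - xs[j])
    return quantile

rng = random.Random(42)
data = [rng.gauss(10.0, 2.0) for _ in range(5000)]
q = make_quantile_interpolator(data)
# New variates via inversion: feed uniform deviates to the quantile function.
resampled = [q(rng.random()) for _ in range(5000)]
print(sum(resampled) / len(resampled))  # ≈ 10, the mean of the fitted sample
```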

  16. Nasal swab samples and real-time polymerase chain reaction assays in community-based, longitudinal studies of respiratory viruses: the importance of sample integrity and quality control

    PubMed Central

    2014-01-01

    Background Carefully conducted, community-based, longitudinal studies are required to gain further understanding of the nature and timing of respiratory viruses causing infections in the population. However, such studies pose unique challenges for field specimen collection, including as we have observed the appearance of mould in some nasal swab specimens. We therefore investigated the impact of sample collection quality and the presence of visible mould in samples upon respiratory virus detection by real-time polymerase chain reaction (PCR) assays. Methods Anterior nasal swab samples were collected from infants participating in an ongoing community-based, longitudinal, dynamic birth cohort study. The samples were first collected from each infant shortly after birth and weekly thereafter. They were then mailed to the laboratory where they were catalogued, stored at -80°C and later screened by PCR for 17 respiratory viruses. The quality of specimen collection was assessed by screening for human deoxyribonucleic acid (DNA) using endogenous retrovirus 3 (ERV3). The impact of ERV3 load upon respiratory virus detection and the impact of visible mould observed in a subset of swabs reaching the laboratory upon both ERV3 loads and respiratory virus detection was determined. Results In total, 4933 nasal swabs were received in the laboratory. ERV3 load in nasal swabs was associated with respiratory virus detection. Reduced respiratory virus detection (odds ratio 0.35; 95% confidence interval 0.27-0.44) was observed in samples where the ERV3 could not be identified. Mould was associated with increased time of samples reaching the laboratory and reduced ERV3 loads and respiratory virus detection. Conclusion Suboptimal sample collection and high levels of visible mould can impact negatively upon sample quality. Quality control measures, including monitoring human DNA loads using ERV3 as a marker for epithelial cell components in samples should be undertaken to optimize the
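The headline association above (odds ratio 0.35; 95% CI 0.27-0.44) is a standard 2×2-table computation with a Wald interval on the log scale. A generic sketch of that calculation (the counts below are hypothetical, chosen only to reproduce an odds ratio of 0.35, and are not the study's table):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a, b = outcome yes/no in the exposed group;
    c, d = outcome yes/no in the unexposed group.
    """
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts for illustration:
or_, lo, hi = odds_ratio_ci(40, 160, 200, 280)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # → 0.35 0.24 0.52
```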

  17. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques.

    PubMed

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J; Nobukawa, Kazutoshi; Pan, Christopher S

    2016-08-05

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved accelerated rate is around 2000 to 20 000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV with challenging scenarios that will take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs.

  18. Accelerated Evaluation of Automated Vehicles Safety in Lane-Change Scenarios Based on Importance Sampling Techniques

    PubMed Central

    Zhao, Ding; Lam, Henry; Peng, Huei; Bao, Shan; LeBlanc, David J.; Nobukawa, Kazutoshi; Pan, Christopher S.

    2016-01-01

    Automated vehicles (AVs) must be thoroughly evaluated before their release and deployment. A widely used evaluation approach is the Naturalistic-Field Operational Test (N-FOT), which tests prototype vehicles directly on the public roads. Due to the low exposure to safety-critical scenarios, N-FOTs are time-consuming and expensive to conduct. In this paper, we propose an accelerated evaluation approach for AVs. The results can be used to generate motions of the other primary vehicles to accelerate the verification of AVs in simulations and controlled experiments. Frontal collision due to unsafe cut-ins is the target crash type of this paper. Human-controlled vehicles making unsafe lane changes are modeled as the primary disturbance to AVs based on data collected by the University of Michigan Safety Pilot Model Deployment Program. The cut-in scenarios are generated based on skewed statistics of collected human driver behaviors, which generate risky testing scenarios while preserving the statistical information so that the safety benefits of AVs in nonaccelerated cases can be accurately estimated. The cross-entropy method is used to recursively search for the optimal skewing parameters. The frequencies of the occurrences of conflicts, crashes, and injuries are estimated for a modeled AV, and the achieved accelerated rate is around 2000 to 20 000. In other words, in the accelerated simulations, driving for 1000 miles will expose the AV to challenging scenarios that would take about 2 to 20 million miles of real-world driving to encounter. This technique thus has the potential to greatly reduce the development and validation time for AVs. PMID:27840592

  19. An economic passive sampling method to detect particulate pollutants using magnetic measurements.

    PubMed

    Cao, Liwan; Appel, Erwin; Hu, Shouyun; Ma, Mingming

    2015-10-01

    Identifying particulate matter (PM) emitted into the atmosphere from industrial processes is an important issue in environmental research. This paper presents a passive sampling method using simple artificial samplers that maintains the advantages of bio-monitoring but overcomes some of its disadvantages. The samplers were tested in a heavily polluted area (Linfen, China) and compared to results from leaf samples. Spatial variations of magnetic susceptibility from artificial passive samplers and leaf samples show very similar patterns. Scanning electron microscopy suggests that the collected PM is mostly in the range of 2-25 μm; the frequent occurrence of spherical particles indicates that industrial combustion dominates PM emission. Magnetic properties around power plants show different features than those around other industrial plants. This sampling method provides a suitable and economical tool for semi-quantifying the temporal and spatial distribution of air quality; the samplers can be installed in a regular grid and calibrated against the weight of collected PM.

  20. Sexual violence and HIV risk behaviors among a nationally representative sample of heterosexual American women: The importance of sexual coercion

    PubMed Central

    Stockman, Jamila K; Campbell, Jacquelyn C; Celentano, David D

    2009-01-01

    Objectives Recent evidence suggests that it is important to consider behavioral-specific sexual violence measures in assessing women’s risk behaviors. This study investigated associations of history and types of sexual coercion on HIV risk behaviors in a nationally representative sample of heterosexually active American women. Methods Analyses were based on 5,857 women aged 18–44 participating in the 2002 National Survey of Family Growth. Types of lifetime sexual coercion included: victim given alcohol or drugs, verbally pressured, threatened with physical injury, and physically injured. Associations with HIV risk behaviors were assessed using logistic regression. Results Of 5,857 heterosexually active women, 16.4% reported multiple sex partners and 15.3% reported substance abuse. A coerced first sexual intercourse experience and coerced sex after sexual debut were independently associated with multiple sex partners and substance abuse; the highest risk was observed for women reporting a coerced first sexual intercourse experience. Among types of sexual coercion, alcohol or drug use at coerced sex was independently associated with multiple sex partners and substance abuse. Conclusions Our findings suggest that public health strategies are needed to address the violent components of heterosexual relationships. Future research should utilize longitudinal and qualitative research to characterize the relationship between continuums of sexual coercion and HIV risk. PMID:19734802
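
    The associations above are reported as odds ratios with 95% confidence intervals from logistic-regression-style analysis. As a reminder of the underlying arithmetic, here is a minimal sketch computing an odds ratio with a Wald confidence interval from a 2x2 table; the counts are hypothetical, not the survey's data.

```python
# Hedged sketch: odds ratio and 95% Wald confidence interval from a 2x2
# table. The example counts below are hypothetical, not the study's data.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a,b = exposed with/without outcome; c,d = unexposed with/without outcome."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical example: 40/160 exposed vs 25/175 unexposed report the behavior.
or_, lo, hi = odds_ratio_ci(40, 160, 25, 175)
# or_ = 1.75; the interval excludes 1, so the association would be significant.
```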

  1. Probing methane hydrate nucleation through the forward flux sampling method.

    PubMed

    Bi, Yuanfei; Li, Tianshu

    2014-11-26

    Understanding the nucleation of hydrate is the key to developing effective strategies for controlling methane hydrate formation. Here we present a computational study of methane hydrate nucleation, combining the forward flux sampling (FFS) method and the coarse-grained water model mW. To facilitate the application of FFS to the formation of methane hydrate, we developed an effective order parameter λ on the basis of a topological analysis of the tetrahedral network. The order parameter capitalizes on the signature of hydrate structure, i.e., polyhedral cages, and is capable of efficiently distinguishing hydrate from ice and liquid water while allowing the formation of different hydrate phases, i.e., sI, sII, and amorphous. Integration of the order parameter λ with FFS allows hydrate nucleation rates to be computed explicitly and an ensemble of nucleation trajectories to be obtained under conditions where spontaneous hydrate nucleation becomes too slow to occur in direct simulation. The convergence of the obtained hydrate nucleation rate was found to depend crucially on the convergence of the spatial distribution of the spontaneously formed hydrate seeds obtained from the initial sampling of FFS. The validity of the approach is also verified by the agreement between the calculated nucleation rate and that inferred from direct simulation. Analyzing the obtained large ensemble of hydrate nucleation trajectories, we show that hydrate formation at 220 K and 500 bar is initiated by nucleation events occurring in the vicinity of the water-methane interface, and facilitated by a gradual transition from amorphous to crystalline structure. The latter provides direct support for the proposed two-step nucleation mechanism of methane hydrate.
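
    The FFS machinery itself can be illustrated on a toy one-dimensional system: estimate the transition rate as the flux through the first interface times the product of interface-to-interface crossing probabilities. Everything below (the dynamics, the interface positions, the basin definition) is an illustrative assumption, not the paper's hydrate model.

```python
# Hedged sketch of forward flux sampling (FFS) on a toy 1-D walker with a
# weak restoring drift toward x = 0; the "transition" is reaching the last
# interface. All parameters and the basin definition are illustrative.
import random

random.seed(1)

DT, THETA, NOISE = 1.0, 0.05, 1.0

def step(x):
    # toy overdamped dynamics: weak drift back toward x = 0 plus thermal noise
    return x - THETA * x * DT + random.gauss(0.0, NOISE)

def ffs_rate(lambdas, n_steps=100_000, trials=1000):
    """FFS estimate: k = flux(lambda_0) * prod_i P(lambda_{i+1} | lambda_i)."""
    basin_a = lambdas[0] - 1.0          # assumed boundary of basin A
    # Stage 0: count upcrossings of the first interface and store snapshots.
    x, crossings, seeds = 0.0, 0, []
    for _ in range(n_steps):
        prev, x = x, step(x)
        if prev < lambdas[0] <= x:
            crossings += 1
            seeds.append(x)
    rate = crossings / (n_steps * DT)
    # Later stages: fire trials from stored snapshots; a trial succeeds if it
    # reaches the next interface before falling back into basin A.
    for lam_next in lambdas[1:]:
        successes, new_seeds = 0, []
        for _ in range(trials):
            y = random.choice(seeds)
            while basin_a < y < lam_next:
                y = step(y)
            if y >= lam_next:
                successes += 1
                new_seeds.append(y)
        rate *= successes / trials
        seeds = new_seeds
    return rate

k = ffs_rate([1.0, 2.0, 3.0])  # rate of reaching x = 3 from the basin at 0
```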

  2. COMPARISON OF USEPA FIELD SAMPLING METHODS FOR BENTHIC MACROINVERTEBRATE STUDIES

    EPA Science Inventory

    Two U.S. Environmental Protection Agency (USEPA) macroinvertebrate sampling protocols were compared in the Mid-Atlantic Highlands region. The Environmental Monitoring and Assessment Program (EMAP) wadeable streams protocol results in a single composite sample from nine transects...

  3. 19 CFR 19.8 - Examination of goods by importer; sampling; repacking; examination of merchandise by prospective...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 1 2010-04-01 2010-04-01 false Examination of goods by importer; sampling; repacking; examination of merchandise by prospective purchasers. 19.8 Section 19.8 Customs Duties U.S... WAREHOUSES, CONTAINER STATIONS AND CONTROL OF MERCHANDISE THEREIN General Provisions § 19.8 Examination...

  4. Evaluation of Environmental Sample Analysis Methods and Results Reporting in the National Children's Study Vanguard Study.

    PubMed

    Heikkinen, Maire S A; Khalaf, Abdisalam; Beard, Barbara; Viet, Susan M; Dellarco, Michael

    2016-05-03

    During the initial Vanguard phase of the U.S. National Children's Study (NCS), about 2000 tap water, surface wipe, and air samples were collected and analyzed immediately. The shipping conditions, analysis methods, results, and laboratory performance were evaluated to determine the best approaches for use in the NCS Main Study. The main conclusions were (1) to employ established sample analysis methods when possible, and alternate methodologies only after careful consideration and method validation studies; (2) lot control and prescreening of sample collection materials are important quality assurance procedures; (3) packing samples correctly requires careful training and adjustment of shipping conditions to local conditions; (4) trip blanks and spiked samples should be considered for samplers with short expiration times and labile analytes; (5) two study-specific results reports should be required: laboratory electronic data deliverables (EDD) of sample results in a useable electronic format (CSV or SEDD XML/CSV) and a data package with sample results and supporting information in PDF format. These experiences and lessons learned can be applied to any long-term study.

  5. Preparation of samples for leaf architecture studies, a method for mounting cleared leaves

    PubMed Central

    Vasco, Alejandra; Thadeo, Marcela; Conover, Margaret; Daly, Douglas C.

    2014-01-01

    • Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. • Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. • Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration. PMID:25225627

  6. Vegetation Sampling for Wetland Delineation: A Review and Synthesis of Methods and Sampling Issues

    DTIC Science & Technology

    2010-07-01

    ...bryophytes, can be effectively sampled using much smaller plots (Bonham 1989). The most appropriate size for a sample plot depends on the type of... and, when abundant, provide strong indicators of hydrophytic vegetation (Seppelt et al. 2008; Lichvar et al. 2009). Sampling bryophytes presents... of wetland specialist bryophytes (USACE 2007). In some wetland types, bryophytes may contribute significant floristic diversity and canopy coverage

  7. Out-of-Sample Extensions for Non-Parametric Kernel Methods.

    PubMed

    Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang

    2017-02-01

    Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies have been devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information in the data, which may potentially characterize the data similarity better. Kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension to out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper-reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over state-of-the-art parametric kernel methods.
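
    The out-of-sample problem can be sketched with a simpler regression than the paper's hyper-RKHS machinery: regress the learned kernel's columns onto a base kernel, so a new point's kernel row becomes a base-kernel-weighted combination of training rows. Everything here (the base kernel, the training points, the "learned" matrix K) is an illustrative assumption, not the authors' algorithm.

```python
# Hedged sketch: extend a "learned" nonparametric kernel matrix K to a new
# point by ridge regression against a base kernel. The base kernel, points,
# and K below are all assumed for illustration.
import math

def base_kernel(x, y):
    return math.exp(-(x - y) ** 2)

def solve(A, b):
    """Tiny dense Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

train = [0.0, 1.0, 2.0]
K = [[1.0, 0.6, 0.1],          # "learned" nonparametric kernel matrix (assumed)
     [0.6, 1.0, 0.5],
     [0.1, 0.5, 1.0]]
B = [[base_kernel(a, b) for b in train] for a in train]
lam = 1e-6                     # small ridge term for numerical stability

def extend_row(x_new):
    """Kernel values between x_new and each training point, by regression."""
    b_vec = [base_kernel(x_new, t) for t in train]
    A = [[B[i][j] + (lam if i == j else 0.0) for j in range(3)] for i in range(3)]
    alpha = solve(A, b_vec)
    return [sum(alpha[j] * K[j][i] for j in range(3)) for i in range(3)]

row = extend_row(0.0)  # at a training point this should recover K's first row
```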

  8. An evaluation of long-term preservation methods for brown bear (Ursus arctos) faecal DNA samples

    USGS Publications Warehouse

    Murphy, M.A.; Waits, L.P.; Kendall, K.C.; Wasser, S.K.; Higbee, J.A.; Bogden, R.

    2002-01-01

    Relatively few large-scale faecal DNA studies have been initiated due to difficulties in amplifying low quality and quantity DNA template. To improve brown bear faecal DNA PCR amplification success rates and to determine post-collection sample longevity, five preservation methods were evaluated: 90% ethanol, DETs buffer, silica-dried, oven-dried stored at room temperature, and oven-dried stored at -20°C. Preservation effectiveness was evaluated for 50 faecal samples by PCR amplification of a mitochondrial DNA (mtDNA) locus (~146 bp) and a nuclear DNA (nDNA) locus (~200 bp) at time points of one week, one month, three months and six months. Preservation method and storage time significantly impacted mtDNA and nDNA amplification success rates. For mtDNA, all preservation methods had ≥75% success at one week, but storage time had a significant impact on the effectiveness of the silica preservation method. Ethanol-preserved samples had the highest success rates for both mtDNA (86.5%) and nDNA (84%). Nuclear DNA amplification success rates ranged from 26-88%, and storage time had a significant impact on all methods but ethanol. Preservation method and storage time should be important considerations for researchers planning projects utilizing faecal DNA. We recommend preservation of faecal samples in 90% ethanol when feasible, although when collecting in remote field conditions or for both DNA and hormone assays a dry collection method may be advantageous.

  9. Photoacoustic spectroscopy sample array vessels and photoacoustic spectroscopy methods for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.

    2006-02-14

    Methods and apparatus for simultaneous or sequential, rapid analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically positioned near the sample cells. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  10. Sampling methods for assessing syrphid biodiversity (Diptera: Syrphidae) in tropical forests.

    PubMed

    Marcos-García, M A; García-López, A; Zumbado, M A; Rotheray, G E

    2012-12-01

    When assessing the species richness of a taxonomic group in a specific area, the choice of sampling method is critical. In this study, the effectiveness of three methods for sampling syrphids (Diptera: Syrphidae) in tropical forests is compared: Malaise trapping, collecting adults with an entomological net, and collecting and rearing immatures. Surveys were made from 2008 to 2011 in six tropical forest sites in Costa Rica. The results revealed significant differences in the composition and richness of syrphid faunas obtained by each method. Collecting immatures was the most successful method based on numbers of species and individuals, whereas Malaise trapping was the least effective. This pattern of sampling effectiveness was independent of syrphid trophic or functional group and annual season. An advantage of collecting immatures over collecting adults is the quality and quantity of associated biological data obtained by the former method. However, complementarity between the results of collecting adults and collecting immatures showed that a combined sampling regime obtained the most complete inventory. Differences between these results and those of similar studies in more open Mediterranean habitats suggest that, for an effective inventory, it is important to consider the effects of environmental characteristics on the catchability of syrphids as much as the costs and benefits of different sampling techniques.

  11. An automated method of sample preparation of biofluids using pierceable caps to eliminate the uncapping of the sample tubes during sample transfer.

    PubMed

    Teitz, D S; Khan, S; Powell, M L; Jemal, M

    2000-09-11

    Biological samples are normally collected and stored frozen in capped tubes until analysis. To obtain aliquots of biological samples for analysis, the sample tubes have to be thawed, uncapped, samples removed and then recapped for further storage. In this paper, we report an automated method of sample transfer devised to eliminate the uncapping and recapping process. This sampling method was incorporated into an automated liquid-liquid extraction procedure of plasma samples. Using a robotic system, the plasma samples were transferred directly from pierceable capped tubes into microtubes contained in a 96-position block. The aliquoted samples were extracted with methyl-tert-butyl ether in the same microtubes. The supernatant organic layers were transferred to a 96-well collection plate and evaporated to dryness. The dried extracts were reconstituted and injected from the same plate for analysis by liquid chromatography with tandem mass spectrometry.

  12. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    PubMed

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single-source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near completely automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the
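
    The normalization step described above is, at its core, simple pipetting arithmetic: given each extract's measured concentration, compute how much DNA and how much diluent to combine to reach a target template amount in a fixed volume. The sketch below is a generic illustration of that arithmetic, not the Normalization Wizard's actual algorithm; all volumes, targets, and the minimum-pipette constraint are assumptions.

```python
# Hedged sketch of DNA normalization arithmetic: how much extract and how
# much diluent to combine for a target template amount in a fixed final
# volume. All values and the function itself are illustrative assumptions.
def normalize(conc_ng_per_ul, target_ng=1.0, final_ul=10.0, min_pipette_ul=1.0):
    """Return (dna_ul, diluent_ul) aiming for target_ng of DNA in final_ul."""
    if conc_ng_per_ul <= target_ng / final_ul:
        return final_ul, 0.0            # too dilute: use the extract neat
    # Clamp to the robot's minimum pipettable volume; very concentrated
    # extracts therefore over-deliver the target rather than under-pipette.
    dna_ul = max(target_ng / conc_ng_per_ul, min_pipette_ul)
    return dna_ul, final_ul - dna_ul

# e.g. a 0.5 ng/uL extract needs 2 uL of DNA plus 8 uL of diluent
vols = normalize(0.5)
```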

  13. Validated Test Method 5030C: Purge-and-Trap for Aqueous Samples

    EPA Pesticide Factsheets

    This method describes a purge-and-trap procedure for the analysis of volatile organic compounds in aqueous samples and water-miscible liquid samples. It also describes the analysis of high-concentration soil and waste sample extracts prepared in Method 5035.

  14. Evaluation of sample preservation methods for space mission

    NASA Technical Reports Server (NTRS)

    Schubert, W.; Rohatgi, N.; Kazarians, G.

    2002-01-01

    For interplanetary spacecraft that will travel to destinations where future life detection experiments may be conducted, or from which samples are to be returned to Earth, we should archive and preserve relevant samples from the spacecraft and cleanrooms for evaluation at a future date.

  15. The importance of quality control in validating concentrations of contaminants of emerging concern in source and treated drinking water samples.

    EPA Science Inventory

    A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods w...

  16. Alternative methods for determining the electrical conductivity of core samples.

    PubMed

    Lytle, R J; Duba, A G; Willows, J L

    1979-05-01

    Electrode configurations are described that can be used in measuring the electrical conductivity of a core sample and that do not require access to the core end faces. The use of these configurations eliminates the need for machining the core ends for placement of end electrodes, because the conductivity in the cases described is relatively insensitive to the length of the sample. We validated the measurement technique by comparing mathematical models with actual measurements made perpendicular and parallel to the core axis of granite samples.

  17. A method for time-resolved calorespirometry of terrestrial samples.

    PubMed

    Wadsö, Lars

    2015-04-01

    A new vessel for simultaneous isothermal calorimetry and respirometry (calorespirometry) on terrestrial (non-aqueous) samples has been developed. All types of small (<1 g) biological samples (insects, soil, leaves, fungi, etc.) can be studied. The respirometric measurements are made by opening and closing a valve to a vial inside the sample ampoule containing a carbon dioxide absorbent. Typically, a 7 h measurement results in seven measurements of heat production rate, oxygen consumption and carbon dioxide production, which can be used to evaluate how the metabolic activity in a sample changes over time. Results from three experiments on leaves, a cut vegetable, and mold are given. As uncertainties, especially in the carbon dioxide production, tend to be quite high, improvements to the technique are also discussed.

  18. Field sampling method for quantifying odorants in humid environments.

    PubMed

    Trabue, Steven L; Scoggin, Kenwood D; Li, Hong; Burns, Robert; Xin, Hongwei

    2008-05-15

    Most air quality studies in agricultural environments use thermal desorption analysis for quantifying semivolatile organic compounds (SVOCs) associated with odor. The objective of this study was to develop a robust sampling technique for measuring SVOCs in humid environments. Test atmospheres were generated at ambient temperatures (23 +/- 1.5 degrees C) and 25, 50, and 80% relative humidity (RH). Sorbent materials used included Tenax, graphitized carbon, and carbon molecular sieve (CMS). Sorbent tubes were challenged with 2, 4, 8, 12, and 24 L of air at various RHs. Sorbent tubes with CMS material performed poorly at both 50 and 80% RH due to excessive sorption of water. Heating of CMS tubes during sampling or dry-purging of CMS tubes post sampling effectively reduced water sorption, with heating of tubes preferred due to higher recovery and reproducibility. Tenax tubes had breakthrough of the more volatile compounds and tended to form artifacts with increasing volumes of air sampled. Graphitized carbon sorbent tubes containing Carbopack X and Carbopack C performed best, with quantitative recovery of all compounds at all RHs and sampling volumes tested. The graphitized carbon tubes were taken to the field for further testing. Field samples taken from inside swine feeding operations showed that butanoic acid, 4-methylphenol, 4-ethylphenol, indole, and 3-methylindole were the compounds detected most often above their odor threshold values. Field samples taken from a poultry facility demonstrated that butanoic acid, 3-methylbutanoic acid, and 4-methylphenol were the compounds most often detected above their odor threshold values. Keywords: relative humidity, CAFO, VOC, SVOC, thermal desorption, swine, poultry, air quality, odor.

  19. Photoacoustic spectroscopy sample array vessel and photoacoustic spectroscopy method for using the same

    DOEpatents

    Amonette, James E.; Autrey, S. Thomas; Foster-Mills, Nancy S.; Green, David

    2005-03-29

    Methods and apparatus for analysis of multiple samples by photoacoustic spectroscopy are disclosed. Particularly, a photoacoustic spectroscopy sample array vessel including a vessel body having multiple sample cells connected thereto is disclosed. At least one acoustic detector is acoustically coupled with the vessel body. Methods for analyzing the multiple samples in the sample array vessels using photoacoustic spectroscopy are provided.

  20. Analytical instrument with apparatus and method for sample concentrating

    DOEpatents

    Zaromb, S.

    1986-08-04

    A system for analysis of trace concentrations of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  1. Method for preconcentrating a sample for subsequent analysis

    DOEpatents

    Zaromb, Solomon

    1990-01-01

    A system for analysis of trace concentration of contaminants in air includes a portable liquid chromatograph and a preconcentrator for the contaminants to be analyzed. The preconcentrator includes a sample bag having an inlet valve and an outlet valve for collecting an air sample. When the sample is collected the sample bag is connected in series with a sorbing apparatus in a recirculation loop. The sorbing apparatus has an inner gas-permeable container containing a sorbent material and an outer gas-impermeable container. The sample is circulated through the outer container and around the inner container for trapping and preconcentrating the contaminants in the sorbent material. The sorbent material may be a liquid having the same composition as the mobile phase of the chromatograph for direct injection thereinto. Alternatively, the sorbent material may be a porous, solid body, to which mobile phase liquid is added after preconcentration of the contaminants for dissolving the contaminants, the liquid solution then being withdrawn for injection into the chromatograph.

  2. Salmonella spp. contamination in commercial layer hen farms using different types of samples and detection methods.

    PubMed

    Soria, M C; Soria, M A; Bueno, D J; Godano, E I; Gómez, S C; ViaButron, I A; Padin, V M; Rogé, A D

    2017-03-31

    The performance of detection methods (culture methods and a polymerase chain reaction (PCR) assay) and plating media used on the same types of samples was determined, as well as the specificity of the PCR primers used to detect Salmonella spp. contamination in layer hen farms. The association of farm characteristics with Salmonella presence was also evaluated. Environmental samples (feces, feed, drinking water, air, boot-swabs) and eggs were taken from 40 layer hen houses. Salmonella spp. was detected most often in boot-swabs taken around the houses (30% and 35% by isolation and PCR, respectively), followed by fecal samples (15.2% and 13.6% by isolation and PCR, respectively). Eggs, drinking water, and air samples were negative for Salmonella detection. Salmonella Schwarzengrund and S. Enteritidis were the most frequently isolated serotypes. For plating media, relative specificity was 1, and relative sensitivity was greater for EF-18 agar than XLDT agar in feed and fecal samples; however, relative sensitivity was greater for XLDT agar than EF-18 agar for boot-swab samples. Agreement ranged from fair to good depending on the sample, and it was good between isolation and PCR for feces and boot-swabs, with no agreement for feed samples. The Salmonella spp. PCR was positive for all strains, while the S. Typhimurium PCR was negative. The S. Enteritidis PCR used was not specific. Based on multiple logistic regression analyses, categorization by county was significant for Salmonella spp. presence (P-value = 0.010). This study shows the importance of considering different types of samples, plating media and detection methods during a Salmonella spp. monitoring study. In addition, it is important to incorporate sampling of the floors around the layer hen houses to determine whether biosecurity measures should be strengthened to minimize the entry and spread of Salmonella in the houses. Also, the performance of some PCR methods and the S. Enteritidis PCR should be improved, and biosecurity measures in hen farms must be

  3. Sample Selection in Randomized Experiments: A New Method Using Propensity Score Stratified Sampling

    ERIC Educational Resources Information Center

    Tipton, Elizabeth; Hedges, Larry; Vaden-Kiernan, Michael; Borman, Geoffrey; Sullivan, Kate; Caverly, Sarah

    2014-01-01

    Randomized experiments are often seen as the "gold standard" for causal research. Despite the fact that experiments use random assignment to treatment conditions, units are seldom selected into the experiment using probability sampling. Very little research on experimental design has focused on how to make generalizations to well-defined…
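
    The idea named in the title can be sketched as follows: score each population unit on a propensity-like index, stratify the population on that score, and recruit within every stratum so the experimental sample mirrors the population's score distribution. The code below is a hedged illustration with synthetic scores, not the authors' exact procedure.

```python
# Hedged sketch of propensity score stratified sampling: divide a population
# into score strata and sample within each, so no stratum is left out of the
# experiment. The scores and stratum counts here are synthetic assumptions.
import random

random.seed(7)

population = [{"id": i, "score": random.random()} for i in range(1000)]

def stratified_sample(units, n_strata=5, n_total=100):
    units = sorted(units, key=lambda u: u["score"])
    size = len(units) // n_strata
    sample = []
    for s in range(n_strata):
        stratum = units[s * size : (s + 1) * size]
        sample.extend(random.sample(stratum, n_total // n_strata))
    return sample

sample = stratified_sample(population)
# Each score quintile contributes equally, so the sample covers the whole
# score range instead of clustering in the easiest-to-recruit stratum.
```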

  4. Statistical Methods and Tools for Hanford Staged Feed Tank Sampling

    SciTech Connect

    Fountain, Matthew S.; Brigantic, Robert T.; Peterson, Reid A.

    2013-10-01

    This report summarizes work conducted by Pacific Northwest National Laboratory to technically evaluate the current approach to staged feed sampling of high-level waste (HLW) sludge to meet waste acceptance criteria (WAC) for transfer from tank farms to the Hanford Waste Treatment and Immobilization Plant (WTP). The current sampling and analysis approach is detailed in the document titled Initial Data Quality Objectives for WTP Feed Acceptance Criteria, 24590-WTP-RPT-MGT-11-014, Revision 0 (Arakali et al. 2011). The goal of this current work is to evaluate and provide recommendations to support a defensible, technical and statistical basis for the staged feed sampling approach that meets WAC data quality objectives (DQOs).

  5. Implementation of Fowler's method for end-tidal air sampling.

    PubMed

    Di Francesco, F; Loccioni, C; Fioravanti, M; Russo, A; Pioggia, G; Ferro, M; Roehrer, I; Tabucchi, S; Onor, M

    2008-09-01

    The design, realization and testing of a CO2-triggered breath sampler, capable of separate collection of dead space and end-tidal air over multiple breaths, are presented. This sampling procedure has advantages in terms of sample volume, insight into the origin of compounds, increased reproducibility and higher concentrations of compounds. The high quality of the design and the speed of the components ensure a breath-by-breath estimate of dead volume, as well as the comfort and safety of the subject under test. The system represents a valid tool contributing to the development of the standardized sampling protocol needed to compare results obtained by the various groups in this field.

  6. Apparatus and method for centrifugation and robotic manipulation of samples

    NASA Technical Reports Server (NTRS)

    Vellinger, John C. (Inventor); Ormsby, Rachel A. (Inventor); Kennedy, David J. (Inventor); Thomas, Nathan A. (Inventor); Shulthise, Leo A. (Inventor); Kurk, Michael A. (Inventor); Metz, George W. (Inventor)

    2007-01-01

    A device for centrifugation and robotic manipulation of specimen samples, including incubating eggs, and uses thereof are provided. The device may advantageously be used for the incubation of avian, reptilian or any type of vertebrate eggs. The apparatus comprises a mechanism for holding samples individually, rotating them individually, rotating them on a centrifuge collectively, injecting them individually with a fixative or other chemical reagent, and maintaining them at controlled temperature, relative humidity and atmospheric composition. The device is applicable to experiments involving entities other than eggs, such as invertebrate specimens, plants, microorganisms and molecular systems.

  7. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    DOE PAGES

    Wen, Haiming; Lin, Yaojun; Seidman, David N.; ...

    2015-09-09

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  8. An efficient and cost-effective method for preparing transmission electron microscopy samples from powders

    SciTech Connect

    Wen, Haiming; Lin, Yaojun; Seidman, David N.; Schoenung, Julie M.; van Rooyen, Isabella J.; Lavernia, Enrique J.

    2015-09-09

    The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.

  9. HRJCOSY: A three-dimensional NMR method for measuring complex samples in inhomogeneous magnetic fields

    NASA Astrophysics Data System (ADS)

    Huang, Yuqing; Zhang, Zhiyong; Wang, Kaiyu; Cai, Shuhui; Chen, Zhong

    2014-08-01

    Three-dimensional (3D) NMR plays an important role in structural elucidation of complex samples, but difficulties remain in applying it in inhomogeneous fields. Here, we propose an NMR approach based on intermolecular zero-quantum coherences (iZQCs) to obtain high-resolution 3D J-resolved-COSY spectra in inhomogeneous fields. Theoretical analyses are presented to verify the proposed method. Experiments on a simple chemical solution and a complex brain phantom are performed under non-ideal field conditions to show the ability of the proposed method. This method is an application of iZQCs to high-resolution 3D NMR, and is useful for studies of complex samples in inhomogeneous fields.

  10. A Bayesian Method to Improve Sampling in Weapons Testing

    DTIC Science & Technology

    1988-12-01

    New York, 1961. 4. Duncan, Acheson J., Quality Control and Industrial Statistics, Richard D. Irwin, Inc., Homewood, Illinois, 1986. 5. DeGroot, Morris H., Probability and Statistics, Second Edition, Addison-Wesley Publishing Company, Inc., USA, 1986. 6. Manion, Robert B., Number of Samples Needed

  11. Improved sample management in the cylindrical-tube microelectrophoresis method

    NASA Technical Reports Server (NTRS)

    Smolka, A. J. K.

    1980-01-01

    A modification to an analytical microelectrophoresis system is described that improves the manipulation of the sample particles and fluid. The apparatus modification and improved operational procedure should yield more accurate measurements of particle mobilities and permit less skilled operators to use the apparatus.

  12. A rapid wire-based sampling method for DNA profiling.

    PubMed

    Chen, Tong; Catcheside, David E A; Stephenson, Alice; Hefford, Chris; Kirkbride, K Paul; Burgoyne, Leigh A

    2012-03-01

    This paper reports the results of a commission to develop a field deployable rapid short tandem repeat (STR)-based DNA profiling system to enable discrimination between tissues derived from a small number of individuals. Speed was achieved by truncation of sample preparation and field deployability by use of an Agilent 2100 Bioanalyser(TM). Human blood and tissues were stabbed with heated stainless steel wire and the resulting sample dehydrated with isopropanol prior to direct addition to a PCR. Choice of a polymerase tolerant of tissue residues and cycles of amplification appropriate for the amount of template expected yielded useful profiles with a custom-designed quintuplex primer set suitable for use with the Bioanalyser(TM). Samples stored on wires remained amplifiable for months, allowing their transportation unrefrigerated from remote locations to a laboratory for analysis using AmpFlSTR(®) Profiler Plus(®) without further processing. The field system meets the requirements for discrimination of samples from small sets and retains access to full STR profiling when required.

  13. Technical Evaluation of Sample-Processing, Collection, and Preservation Methods

    DTIC Science & Technology

    2014-07-01

    policy document entitled The National Strategy for Biosurveillance was released (White House, July 2012) as part of the National Security Strategy...concept of leveraging existing capabilities to “scan and discern the environment,” which implies the use of current technical biosurveillance ...testing of existing sample-processing technologies are expected to enable in silico evaluations of biosurveillance methodologies, equipment, and

  14. Modern Numerical Methods for Classical Sampled System Analysis-SAMSAN

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.

    1984-01-01

    SAMSAN aids control-system analyst by providing self-consistent set of computer algorithms that support large-order control-system design and evaluation studies, with emphasis placed on sampled system analysis. Program provides set of algorithms readily integrated for solving control-system problems.

  15. Numerical Methods for Classical Sampled-System Analysis

    NASA Technical Reports Server (NTRS)

    Frisch, H. P.; Bauer, F. H.

    1986-01-01

    SAMSAN provides control-system analyst with self-consistent computer algorithms that support large-order control-system design and evaluation studies. Emphasizes sampled-system analysis. SAMSAN reduces burden on analyst by providing set of algorithms well tested and documented and readily integrated for solving control-system problems.

  16. Development of a novel cell sorting method that samples population diversity in flow cytometry.

    PubMed

    Osborne, Geoffrey W; Andersen, Stacey B; Battye, Francis L

    2015-11-01

    Flow cytometry based electrostatic cell sorting is an important tool in the separation of cell populations. Existing instruments can sort single cells into multi-well collection plates and keep track of cell of origin and sorted well location. Currently, however, single sorted cell results reflect the population distribution and fail to capture the population diversity. Software was designed that implements a novel sorting approach, "Slice and Dice Sorting," that links a graphical representation of a multi-well plate to logic that ensures that single cells are sampled and sorted from all areas defined by the sort region(s). Therefore the diversity of the total population is captured, and the more frequently occurring and rarer cell types are all sampled. The sorting approach was tested computationally and using functional cell-based assays. Computationally, we demonstrate that conventional single cell sorting can sample as little as 50% of the population diversity, dependent on the population distribution, and that Slice and Dice sorting samples much more of the variety present within a cell population. We then show, by sorting single cells into wells using the Slice and Dice method, that there are cells sorted using this method that would be either rarely sorted, or not sorted at all, using conventional single cell sorting approaches. The present study demonstrates a novel single cell sorting method that samples much more of the population diversity than current methods. It has implications for clonal selection, stem cell sorting, single cell sequencing and any area where population heterogeneity is of importance.
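    The sampling idea described above can be sketched in a few lines: instead of drawing single cells at random (which mirrors the population distribution), partition the gated region into a grid and take one event from every occupied bin, so rare subpopulations are always represented. The grid size and data below are illustrative assumptions, not the instrument's actual implementation.

    ```python
    # Hedged sketch of grid-based ("slice and dice"-style) single-cell sampling.
    import random

    def grid_sample(events, n_bins=8, lo=0.0, hi=1.0):
        """events: list of (x, y) measurements inside the sort region.
        Return one randomly chosen event from every occupied grid bin."""
        width = (hi - lo) / n_bins
        bins = {}
        for x, y in events:
            key = (min(n_bins - 1, int((x - lo) / width)),
                   min(n_bins - 1, int((y - lo) / width)))
            bins.setdefault(key, []).append((x, y))
        return [random.choice(members) for members in bins.values()]

    random.seed(0)
    # A dense cluster plus two rare events: proportional single-cell sorting
    # would almost always draw from the cluster; binning picks up both rares.
    cluster = [(random.gauss(0.3, 0.02), random.gauss(0.3, 0.02)) for _ in range(500)]
    rare = [(0.9, 0.9), (0.05, 0.95)]
    picked = grid_sample(cluster + rare)
    print(len(picked))  # one representative per occupied bin
    ```

    Because each rare event occupies its own bin, it is guaranteed to appear among the picks, whereas proportional sampling would select it with probability roughly 1/500.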

  17. Post-Decontamination Vapor Sampling and Analytical Test Methods

    DTIC Science & Technology

    2015-08-12

    emission rate) after treatment with a decontamination system (decontaminant and/or applicator) used against CWAs, simulants, NTAs, TICs, or other...Residual liquid testing is addressed in TOP 08-2-061A1*. b. This TOP includes procedures for analyzing the decontamination of equipment and...Residual contaminant in samples from solid sorbent tubes (SSTs), or equivalent. Gas chromatograph (GC), liquid chromatograph (LC), flame

  18. A method for estimating population sex ratio for sage-grouse using noninvasive genetic samples.

    PubMed

    Baumgardt, J A; Goldberg, C S; Reese, K P; Connelly, J W; Musil, D D; Garton, E O; Waits, L P

    2013-05-01

    Population sex ratio is an important metric for wildlife management and conservation, but estimates can be difficult to obtain, particularly for sexually monomorphic species or for species that differ in detection probability between the sexes. Noninvasive genetic sampling (NGS) using polymerase chain reaction (PCR) has become a common method for identifying sex from sources such as hair, feathers or faeces, and is a potential source for estimating sex ratio. If, however, PCR success is sex-biased, naively using NGS could lead to a biased sex ratio estimator. We measured PCR success rates and error rates for amplifying the W and Z chromosomes from greater sage-grouse (Centrocercus urophasianus) faecal samples, examined how success and error rates for sex identification changed in response to faecal sample exposure time, and used simulation models to evaluate precision and bias of three sex assignment criteria for estimating population sex ratio with variable sample sizes and levels of PCR replication. We found PCR success rates were higher for females than males and that choice of sex assignment criteria influenced the bias and precision of corresponding sex ratio estimates. Our simulations demonstrate the importance of considering the interplay between the sex bias of PCR success, number of genotyping replicates, sample size, true population sex ratio and accuracy of assignment rules for designing future studies. Our results suggest that using faecal DNA for estimating the sex ratio of sage-grouse populations has great potential and, with minor adaptations and similar marker evaluations, should be applicable to numerous species.
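    The bias mechanism discussed above can be illustrated with a tiny simulation: if PCR success rates differ by sex, the naive proportion among successfully sexed samples is skewed toward the sex that amplifies better, while dividing each count by its success rate recovers the true ratio. The success rates below are illustrative assumptions, not the study's estimates.

    ```python
    # Hedged sketch of sex-biased PCR success and a rate-corrected estimator.
    import random

    random.seed(1)
    true_prop_female = 0.5
    p_success = {"F": 0.9, "M": 0.6}   # assumed per-sample amplification rates

    sexed = []
    for _ in range(20000):
        sex = "F" if random.random() < true_prop_female else "M"
        if random.random() < p_success[sex]:   # sample only sexed if PCR works
            sexed.append(sex)

    naive = sexed.count("F") / len(sexed)      # biased toward females here
    # Weighting each count by the inverse of its success rate removes the bias:
    f_hat = sexed.count("F") / p_success["F"]
    m_hat = sexed.count("M") / p_success["M"]
    corrected = f_hat / (f_hat + m_hat)
    print(round(naive, 2), round(corrected, 2))
    ```

    With these rates the naive estimate converges to 0.45/0.75 = 0.6 rather than the true 0.5, which is the kind of distortion the authors' simulations quantify for different assignment criteria.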

  19. A new method of snowmelt sampling for water stable isotopes

    USGS Publications Warehouse

    Penna, D.; Ahmad, M.; Birks, S. J.; Bouchaou, L.; Brencic, M.; Butt, S.; Holko, L.; Jeelani, G.; Martinez, D. E.; Melikadze, G.; Shanley, J.B.; Sokratov, S. A.; Stadnyk, T.; Sugimoto, A.; Vreca, P.

    2014-01-01

    We modified a passive capillary sampler (PCS) to collect snowmelt water for isotopic analysis. Past applications of PCSs have been to sample soil water, but the novel aspect of this study was the placement of the PCSs at the ground-snowpack interface to collect snowmelt. We deployed arrays of PCSs at 11 sites in ten partner countries on five continents representing a range of climate and snow cover worldwide. The PCS reliably collected snowmelt at all sites and caused negligible evaporative fractionation effects in the samples. PCS is low-cost, easy to install, and collects a representative integrated snowmelt sample throughout the melt season or at the melt event scale. Unlike snow cores, the PCS collects the water that would actually infiltrate the soil; thus, its isotopic composition is appropriate to use for tracing snowmelt water through the hydrologic cycle. The purpose of this Briefing is to show the potential advantages of PCSs and recommend guidelines for constructing and installing them based on our preliminary results from two snowmelt seasons.

  20. In-syringe reversed dispersive liquid-liquid microextraction for the evaluation of three important bioactive compounds of basil, tarragon and fennel in human plasma and urine samples.

    PubMed

    Barfi, Azadeh; Nazem, Habibollah; Saeidi, Iman; Peyrovi, Moazameh; Afsharzadeh, Maryam; Barfi, Behruz; Salavati, Hossein

    2016-03-20

    In the present study, an efficient and environmentally friendly method (called in-syringe reversed dispersive liquid-liquid microextraction (IS-R-DLLME)) was developed to extract three important components (i.e. para-anisaldehyde, trans-anethole and its isomer estragole) simultaneously from different plant extracts (basil, fennel and tarragon) and from human plasma and urine samples prior to their determination using high-performance liquid chromatography. The importance of choosing these plant extracts as samples stems from the dual roles of their bioactive compounds (trans-anethole and estragole), which can alter different cellular processes positively or negatively, and from the need for a simple and efficient method for extraction and sensitive determination of these compounds in the mentioned samples. Under the optimum conditions (extraction solvent: 120 μL of n-octanol; dispersive solvent: 600 μL of acetone; collecting solvent: 1000 μL of acetone; sample pH 3; no salt), limits of detection (LODs), linear dynamic ranges (LDRs) and recoveries (R) were 79-81 ng mL(-1), 0.26-6.9 μg mL(-1) and 94.1-99.9%, respectively. The obtained results showed that IS-R-DLLME was a simple, fast and sensitive method with low consumption of extraction solvent that provides high recovery under the optimum conditions. The method was applied to investigate the absorbed amounts of the analytes by determining them before (in the plant extracts) and after (in the human plasma and urine samples) consumption, which can establish the toxicity levels of the analytes (on the basis of their dosages) in the extracts.

  1. Capture-recapture and removal methods for sampling closed populations

    USGS Publications Warehouse

    White, Gary C.; Anderson, David R.; Burnham, Kenneth P.; Otis, David L.

    1982-01-01

    The problem of estimating animal abundance is common in wildlife management and environmental impact assessment. Capture-recapture and removal methods are often used to estimate population size. Statistical Inference From Capture Data On Closed Animal Populations, a monograph by Otis et al. (1978), provides a comprehensive synthesis of much of the wildlife and statistical literature on the methods, as well as some extensions of the general theory. In our primer, we focus on capture-recapture and removal methods for trapping studies in which a population is assumed to be closed and do not treat open-population models, such as the Jolly-Seber model, or catch-effort methods in any detail. The primer, written for students interested in population estimation, is intended for use with the more theoretical monograph.
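    The simplest closed-population capture-recapture estimator in this literature is the two-sample Lincoln-Petersen estimator (shown here in Chapman's bias-corrected form; the monograph covers far more general models). A minimal sketch with illustrative counts:

    ```python
    # Chapman's bias-corrected Lincoln-Petersen estimator for a closed population.
    # Counts below are invented for illustration.

    def lincoln_petersen(n1, n2, m2):
        """n1: animals marked and released in the first sample;
        n2: size of the second sample;
        m2: marked animals recaptured in the second sample."""
        return (n1 + 1) * (n2 + 1) / (m2 + 1) - 1

    # Mark 50 animals, later catch 40 of which 10 carry marks:
    print(round(lincoln_petersen(50, 40, 10), 1))  # ≈ 189 animals estimated
    ```

    Intuitively, the fraction of marked animals in the second sample (10/40) estimates the marked fraction of the whole population (50/N), giving N near 200; Chapman's correction reduces the small-sample bias of the raw ratio.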

  2. A sample preparation method for recovering suppressed analyte ions in MALDI TOF MS.

    PubMed

    Lou, Xianwen; de Waal, Bas F M; Milroy, Lech-Gustav; van Dongen, Joost L J

    2015-05-01

    In matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS), analyte signals can be substantially suppressed by other compounds in the sample. In this technical note, we describe a modified thin-layer sample preparation method that significantly reduces the analyte suppression effect (ASE). In our method, analytes are deposited on top of the surface of matrix preloaded on the MALDI plate. To prevent embedding of analyte into the matrix crystals, the sample solutions were prepared without matrix, and care was taken not to re-dissolve the preloaded matrix. The results with model mixtures of peptides, synthetic polymers and lipids show that detection of analyte ions, which were completely suppressed using the conventional dried-droplet method, could be effectively recovered by using our method. Our findings suggest that the incorporation of analytes in the matrix crystals has an important contributory effect on ASE. By reducing ASE, our method should be useful for the direct MALDI MS analysis of multicomponent mixtures.

  3. Rapid methods to detect organic mercury and total selenium in biological samples

    PubMed Central

    2011-01-01

    Background Organic mercury (Hg) is a global pollutant of concern and selenium is believed to afford protection against mercury risk though few approaches exist to rapidly assess both chemicals in biological samples. Here, micro-scale and rapid methods to detect organic mercury (< 1.5 ml total sample volume, < 1.5 hour) and total selenium (Se; < 3.0 ml total volume, < 3 hour) from a range of biological samples (10-50 mg) are described. Results For organic Hg, samples are digested using Tris-HCl buffer (with sequential additions of protease, NaOH, cysteine, CuSO4, acidic NaBr) followed by extraction with toluene and Na2S2O3. The final product is analyzed via commercially available direct/total mercury analyzers. For Se, a fluorometric assay has been developed for microplate readers that involves digestion (HNO3-HClO4 and HCl), conjugation (2,3-diaminonaphthalene), and cyclohexane extraction. Recovery of organic Hg (86-107%) and Se (85-121%) were determined through use of Standard Reference Materials and lemon shark kidney tissues. Conclusions The approaches outlined provide an easy, rapid, reproducible, and cost-effective platform for monitoring organic Hg and total Se in biological samples. Owing to the importance of organic Hg and Se in the pathophysiology of Hg, integration of such methods into established research monitoring efforts (that largely focus on screening total Hg only) will help increase understanding of Hg's true risks. PMID:21232132

  4. Evaluation of sample pretreatment methods for analysis of polonium isotopes in herbal medicines.

    PubMed

    Sreejith, Sathyapriya R; Nair, Madhu G; Rao, D D

    2014-12-01

    Herbal infusions like ayurvedic aristas are widely consumed by the Indian population for good health. With increasing awareness of radiological assessment, an effort was made to assess the radioactivity concentration of naturally occurring radionuclides in herbal medicines. (210)Po is an important alpha particle emitter contributing to the internal dose to man from ingestion. Though (210)Po can be spontaneously deposited on a silver disk for alpha spectrometric measurement with few radiochemical steps, great care has to be taken during the sample pretreatment step owing to the high volatility of polonium even at low temperatures. The aim of the study was to evaluate an appropriate sample pretreatment method for estimation of polonium in herbal medicines. (209)Po was used for radiochemical yield calculation. Conventional open-vessel wet ashing, physical evaporation, freeze-drying and microwave digestion in a Teflon vessel were examined. The recovery ranged between 9 and 79%. The lowest recovery was obtained for the samples that were processed by open-vessel digestion without any volume reduction. The recoveries were comparable for the samples that were freeze-dried and subjected to HNO3 + HClO4 + H2O2 + HF acid digestion and for the microwave-digested samples. (210)Po concentration in the samples ranged from 11.3 to 39.6 mBq/L.

  5. Use of aspiration method for collecting brain samples for rabies diagnosis in small wild animals.

    PubMed

    Iamamoto, K; Quadros, J; Queiroz, L H

    2011-02-01

    In developing countries such as Brazil, where canine rabies is still a considerable problem, samples from wildlife species are infrequently collected and submitted for rabies screening. A collaborative study involving environmental biologists and veterinarians was established for rabies epidemiological research in a specific ecological area of Sao Paulo State, Brazil. The wild animals' brains had to be collected without skull damage because skull measurements are important in identifying the captured animal species. For this purpose, samples from bats and small mammals were collected by an aspiration method, inserting a plastic pipette into the brain through the foramen magnum. Beyond the progressive increase in the use of the plastic pipette technique in various studies, this method could foster collaborative research between wildlife scientists and rabies epidemiologists, thus improving rabies surveillance.

  6. Several methods for concentrating bacteria in fluid samples

    NASA Technical Reports Server (NTRS)

    Thomas, R. R.

    1976-01-01

    The sensitivities of the firefly luciferase-ATP flow system and the luminol flow system were established as 300,000 E. coli per milliliter and 10,000 E. coli per milliliter, respectively. To achieve the detection limit of 1,000 bacteria per milliliter previously established, a method of concentrating microorganisms using a Sartorius membrane filter system is investigated. Catalase in 50% ethanol is found to be a stable luminol standard and can be used for up to 24 hours with only a 10% loss of activity. The luminol reagent is also stable over a 24 hour period. A method of preparing relatively inexpensive luciferase from desiccated firefly tails is developed.

  7. A personal sampling method for the determination of styrene exposure.

    PubMed

    Brown, R H; Saunders, K J; Walkin, K T

    1987-09-01

    A diffusive sampler, which is suitable for the determination of time-weighted average personal and static exposures to styrene, is described. The sampler is based on a commercially available tube design that is amenable to fully-automated thermal desorption. The performance of the sampler has been evaluated using a protocol, developed by the U.K. Health and Safety Executive, involving both laboratory and field experiments. The sampler has been shown to give results that are similar both in magnitude and precision to those obtained using conventional sorbent tube and pump methods. The diffusive method is preferred because of its simplicity and convenience.
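    A diffusive sampler like the one described above yields a time-weighted average (TWA) exposure: the mass collected divided by the sampler's uptake rate times the exposure time. As a hedged sketch, the uptake rate and collected mass below are invented illustrative values, not the published figures for this styrene sampler.

    ```python
    # Minimal sketch of recovering a TWA concentration from a passive sampler.
    # Uptake rate and mass are illustrative assumptions.

    def twa_mg_per_m3(mass_ug, uptake_ml_per_min, minutes):
        """TWA concentration = mass collected / (uptake rate x exposure time),
        converted to mg/m^3 (1 ug/mL = 1000 mg/m^3)."""
        return 1000.0 * mass_ug / (uptake_ml_per_min * minutes)

    # 10 ug of styrene collected over an 8 h shift at an assumed 1.0 mL/min uptake:
    print(round(twa_mg_per_m3(10.0, 1.0, 480.0), 1))  # ≈ 20.8 mg/m^3
    ```

    The uptake rate plays the role of the pump flow rate in active sampling, which is why diffusive and pumped-tube results can be compared directly.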

  8. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  9. 7 CFR 31.400 - Samples for wool and wool top grades; method of obtaining.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Samples for wool and wool top grades; method of... STANDARDS AND STANDARD CONTAINER REGULATIONS PURCHASE OF WOOL AND WOOL TOP SAMPLES § 31.400 Samples for wool and wool top grades; method of obtaining. Samples certified as representative of the...

  10. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of mohair top grades; method of obtaining. Samples certified as representative of the official standards of...

  11. [DOE method for evaluating environmental and waste management samples: Revision 1, Addendum 1

    SciTech Connect

    Goheen, S.C.

    1995-04-01

    The US Department of Energy's (DOE's) environmental and waste management (EM) sampling and analysis activities require that large numbers of samples be analyzed for materials characterization, environmental surveillance, and site-remediation programs. The present document, DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods), is a supplemental resource for analyzing many of these samples.

  12. TESTING METHODS FOR DETECTION OF CRYPTOSPORIDIUM SPP. IN WATER SAMPLES

    EPA Science Inventory

    A large waterborne outbreak of cryptosporidiosis in Milwaukee, Wisconsin, U.S.A. in 1993 prompted a search for ways to prevent large-scale waterborne outbreaks of protozoan parasitoses. Methods for detecting Cryptosporidium parvum play an integral role in strategies that lead to...

  13. Ant colony optimization as a method for strategic genotype sampling.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A simulation study was carried out to develop an alternative method of selecting animals to be genotyped. Simulated pedigrees included 5000 animals, each assigned genotypes for a bi-allelic single nucleotide polymorphism (SNP) based on assumed allelic frequencies of 0.7/ 0.3 and 0.5/0.5. In addition...

  14. COMPARISON OF LARGE RIVER SAMPLING METHOD USING DIATOM METRICS

    EPA Science Inventory

    We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principal Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...

  15. COMPARISON OF LARGE RIVER SAMPLING METHODS ON ALGAL METRICS

    EPA Science Inventory

    We compared the results of four methods used to assess the algal communities at 60 sites distributed among four rivers. Based on Principal Component Analysis of physical habitat data collected concomitantly with the algal data, sites were separated into those with a mean thalweg...

  16. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., AND USE PROHIBITIONS Determining a PCB Concentration for Purposes of Abandonment or Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural...

  17. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., AND USE PROHIBITIONS Determining a PCB Concentration for Purposes of Abandonment or Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural...

  18. 40 CFR 761.243 - Standard wipe sample method and size.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., AND USE PROHIBITIONS Determining a PCB Concentration for Purposes of Abandonment or Disposal of Natural Gas Pipeline: Selecting Sample Sites, Collecting Surface Samples, and Analyzing Standard PCB Wipe Samples § 761.243 Standard wipe sample method and size. (a) Collect a surface sample from a natural...

  19. Estimating the Expected Value of Sample Information Using the Probabilistic Sensitivity Analysis Sample: A Fast, Nonparametric Regression-Based Method.

    PubMed

    Strong, Mark; Oakley, Jeremy E; Brennan, Alan; Breeze, Penny

    2015-07-01

    Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method.
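    A hedged sketch of the regression idea described above, on a deliberately tiny toy model (the decision problem, study design, and cubic-polynomial regression stand-in are all assumptions for illustration, not the authors' implementation): net benefit is regressed on a low-dimensional summary of each simulated data set, and EVSI is the expected maximum of the fitted values minus the maximum of the expected net benefits.

    ```python
    # Hedged sketch of regression-based EVSI estimation from a PSA sample.
    import numpy as np

    rng = np.random.default_rng(7)
    K = 20000
    theta = rng.normal(0.0, 1.0, K)            # PSA draws of the uncertain parameter
    nb = np.column_stack([np.zeros(K),         # decision 0: net benefit 0
                          100.0 * theta])      # decision 1: net benefit 100*theta

    # Proposed study: n = 10 observations with sd 2; summary = sample mean
    n, sd = 10, 2.0
    xbar = theta + rng.normal(0.0, sd / np.sqrt(n), K)

    # Regression stand-in for the nonparametric fit: cubic polynomial of the
    # summary statistic, one fit per decision option.
    fitted = np.column_stack(
        [np.polyval(np.polyfit(xbar, nb[:, d], 3), xbar) for d in range(2)]
    )

    # EVSI = E[max_d E(NB_d | data)] - max_d E(NB_d), estimated from fitted values
    evsi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
    print(round(float(evsi), 1))
    ```

    No inner-loop Bayesian updating or model reruns are needed: the fitted values stand in for the posterior expected net benefits, which is the source of the method's speed.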

  20. RAPID METHOD FOR PLUTONIUM, AMERICIUM AND CURIUM IN VERY LARGE SOIL SAMPLES

    SciTech Connect

    Maxwell, S

    2007-01-08

    The analysis of actinides in environmental soil and sediment samples is very important for environmental monitoring. There is a need to measure actinide isotopes with very low detection limits. A new, rapid actinide separation method has been developed and implemented that allows the measurement of plutonium, americium and curium isotopes in very large soil samples (100-200 g) with high chemical recoveries and effective removal of matrix interferences. This method uses stacked TEVA Resin{reg_sign}, TRU Resin{reg_sign} and DGA-Resin{reg_sign} cartridges from Eichrom Technologies (Darien, IL, USA) that allows the rapid separation of plutonium (Pu), americium (Am), and curium (Cm) using a single multistage column combined with alpha spectrometry. The method combines an acid leach step and innovative matrix removal using cerium fluoride precipitation to remove the difficult soil matrix. This method is unique in that it provides high tracer recoveries and effective removal of interferences with small extraction chromatography columns instead of large ion exchange resin columns that generate large amounts of acid waste. By using vacuum box cartridge technology with rapid flow rates, sample preparation time is minimized.

  1. Fire ant-detecting canines: a complementary method in detecting red imported fire ants.

    PubMed

    Lin, Hui-Min; Chi, Wei-Lien; Lin, Chung-Chi; Tseng, Yu-Ching; Chen, Wang-Ting; Kung, Yu-Ling; Lien, Yi-Yang; Chen, Yang-Yuan

    2011-02-01

    In this investigation, detection dogs are trained and used in identifying red imported fire ants, Solenopsis invicta Buren, and their nests. The methodology could assist in reducing the frequency and scope of chemical treatments for red imported fire ant management and thus reduce labor costs and chemical use as well as improve control and quarantine efficiency. Three dogs previously trained for customs quarantine were retrained to detect the scents of red imported fire ants. After passing tests involving different numbers of live red imported fire ants and three other ant species--Crematogaster rogenhoferi Mayr, Paratrechina longicornis Latreille, and Pheidole megacephala F.--placed in containers, a joint field survey for red imported fire ant nests by detection dogs and bait traps was conducted to demonstrate their use as a supplement to conventional detection methods. The most significant findings in this report are (1) with 10 or more red imported fire ants in scent containers, the dogs had a >98% chance of tracing the red imported fire ants. Upon the introduction of other ant species, the dogs still achieved, on average, a 93% correct red imported fire ant indication rate. Moreover, the dogs demonstrated great competence in pinpointing emerging and smaller red imported fire ant nests in red imported fire ant-infested areas that had been previously confirmed by bait trap stations. (2) Along with the bait trap method, we also discovered that approximately 90% of red imported fire ants foraged within a distance of 14 m from their nests. The results prove detection dogs to be most effective for red imported fire ant control in areas that have been previously treated with pesticides and therefore contain a low density of remaining red imported fire ant nests. Furthermore, as a complement to other red imported fire ant monitoring methods, this strategy will significantly increase the efficacy of red imported fire ant control in cases of individual mound treatment.

  2. Image reconstruction in EIT with unreliable electrode data using random sample consensus method

    NASA Astrophysics Data System (ADS)

    Jeon, Min Ho; Khambampati, Anil Kumar; Kim, Bong Seok; In Kang, Suk; Kim, Kyung Youn

    2017-04-01

    In electrical impedance tomography (EIT), it is important to acquire reliable measurement data through the EIT system in order to achieve a good reconstructed image. To obtain reliable data, various methods for checking and optimizing the EIT measurement system have been studied. However, most of these methods involve additional testing cost, and the measurement setup is usually evaluated only before the experiment. It is therefore useful to have a method that can detect faulty electrode data during the experiment without any additional cost. This paper presents a method based on random sample consensus (RANSAC) to identify data from faulty electrodes in EIT measurements. RANSAC is a robust model-fitting method that removes outlier data from the measurements. The RANSAC method is applied together with the Gauss-Newton (GN) method for image reconstruction of a human thorax with faulty data. Numerical and phantom experiments are performed, and the reconstruction performance of the proposed RANSAC method with GN is compared with that of the conventional GN method. The results show that RANSAC with GN yields better reconstruction performance than the conventional GN method in the presence of faulty electrode data.
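
The RANSAC principle used here can be illustrated with a generic line-fitting sketch (not the EIT reconstruction itself); the synthetic data, tolerance, and iteration count below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "measurement" data: a line with two gross outliers (faulty channels).
x = np.arange(16, dtype=float)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.05, x.size)
y[[4, 11]] += 25.0  # simulate faulty-electrode readings

def ransac_line(x, y, n_iter=200, tol=0.5, rng=rng):
    """Fit y = a*x + b while ignoring points that never fit a sampled model."""
    best_inliers = np.zeros(x.size, dtype=bool)
    for _ in range(n_iter):
        i, j = rng.choice(x.size, size=2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])   # model from a minimal sample
        b = y[i] - a * x[i]
        inliers = np.abs(y - (a * x + b)) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers          # keep the largest consensus set
    # Refit with least squares on the consensus set only.
    a, b = np.polyfit(x[best_inliers], y[best_inliers], 1)
    return a, b, best_inliers

a, b, inliers = ransac_line(x, y)
print(f"slope ~ {a:.2f}, intercept ~ {b:.2f}, outliers at {np.where(~inliers)[0]}")
```

The consensus set excludes the two corrupted points, so the final fit is unaffected by them.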

  3. A sampling method for conducting relocation studies with freshwater mussels

    USGS Publications Warehouse

    Waller, D.L.; Rach, J.J.; Cope, W.G.; Luoma, J.A.

    1993-01-01

    Low recovery of transplanted mussels often prevents accurate estimates of survival. We developed a method that provided high recovery of transplanted mussels and allowed a reliable assessment of mortality. A 3 x 3 m polyvinyl chloride (PVC) pipe grid was secured to the sediment with iron reinforcing bars. The grid was divided into nine 1-m² segments, and each treatment segment was stocked with 100 marked mussels. The recovery of mussels after six months exceeded 80% in all but one treatment group.

  4. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence-inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  5. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence-inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  6. Analytical methods for determination of anticoagulant rodenticides in biological samples.

    PubMed

    Imran, Muhammad; Shafi, Humera; Wattoo, Sardar Ali; Chaudhary, Muhammad Taimoor; Usman, Hafiz Faisal

    2015-08-01

    Anticoagulant rodenticides belong to a heterogeneous group of compounds used to kill rodents. They bind to the enzyme complexes responsible for recycling vitamin K, thus impairing the coagulation process. Rodenticides are among the most common household toxicants and exhibit a wide variety of toxicities in non-target species, especially humans, dogs, and cats. This article reviews published analytical methods reported in the literature for qualitative and quantitative determination of anticoagulant rodenticides in biological specimens. These techniques include high-performance liquid chromatography coupled with ultraviolet and fluorescence detectors, liquid chromatography electrospray ionization tandem mass spectrometry, liquid chromatography with high-resolution tandem mass spectrometry, ultra-performance liquid chromatography mass spectrometry, gas chromatography mass spectrometry, ion chromatography with fluorescence detection, ion chromatography electrospray ionization ion trap mass spectrometry, and ion chromatography electrospray ionization tandem mass spectrometry.

  7. The Importance of Sample Return in Establishing Chemical Evidence for Life on Mars or Other Solar System Bodies

    NASA Technical Reports Server (NTRS)

    Glavin, D. P.; Conrad, P.; Dworkin, J. P.; Eigenbrode, J.; Mahaffy, P. R.

    2011-01-01

    The search for evidence of life on Mars and elsewhere will continue to be one of the primary goals of NASA's robotic exploration program over the next decade. NASA and ESA are currently planning a series of robotic missions to Mars with the goal of understanding its climate, resources, and potential for harboring past or present life. One key goal will be the search for chemical biomarkers, including complex organic compounds important in life on Earth. These include amino acids, the monomer building blocks of proteins and enzymes; nucleobases and sugars, which form the backbone of DNA and RNA; and lipids, the structural components of cell membranes. Many of these organic compounds can also be formed abiotically, as demonstrated by their prevalence in carbonaceous meteorites [1], though their molecular characteristics may distinguish a biological source [2]. It is possible that in situ instruments may reveal such characteristics; however, return of the right sample (i.e., one with biosignatures or a high probability of containing them) to Earth would allow for more intensive laboratory studies using a broad array of powerful instrumentation for bulk characterization, molecular detection, isotopic and enantiomeric compositions, and spatially resolved chemistry that may be required for confirmation of extant or extinct Martian life. Here we will discuss the current analytical capabilities and strategies for the detection of organics on the Mars Science Laboratory (MSL) using the Sample Analysis at Mars (SAM) instrument suite and how sample return missions from Mars and other targets of astrobiological interest will help advance our understanding of chemical biosignatures in the solar system.

  8. Sampling strategies and post-processing methods for increasing the time resolution of organic aerosol measurements requiring long sample-collection times

    NASA Astrophysics Data System (ADS)

    Modini, Rob L.; Takahama, Satoshi

    2016-07-01

    The composition and properties of atmospheric organic aerosols (OAs) change on timescales of minutes to hours. However, some important OA characterization techniques typically require greater than a few hours of sample-collection time (e.g., Fourier transform infrared (FTIR) spectroscopy). In this study we have performed numerical modeling to investigate and compare sample-collection strategies and post-processing methods for increasing the time resolution of OA measurements requiring long sample-collection times. Specifically, we modeled the measurement of hydrocarbon-like OA (HOA) and oxygenated OA (OOA) concentrations at a polluted urban site in Mexico City, and investigated how to construct hourly resolved time series from samples collected for 4, 6, and 8 h. We modeled two sampling strategies - sequential and staggered sampling - and a range of post-processing methods including interpolation and deconvolution. The results indicated that relative to the more sophisticated and costly staggered sampling methods, linear interpolation between sequential measurements is a surprisingly effective method for increasing time resolution. Additional error can be added to a time series constructed in this manner if a suboptimal sequential sampling schedule is chosen. Staggering measurements is one way to avoid this effect. There is little to be gained from deconvolving staggered measurements, except at very low values of random measurement error (< 5 %). Assuming 20 % random measurement error, one can expect average recovery errors of 1.33-2.81 µg m-3 when using 4-8 h-long sequential and staggered samples to measure time series of concentration values ranging from 0.13-29.16 µg m-3. For 4 h samples, 19-47 % of this total error can be attributed to the process of increasing time resolution alone, depending on the method used, meaning that measurement precision would only be improved by 0.30-0.75 µg m-3 if samples could be collected over 1 h instead of 4 h. Devising a
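
The sequential-sampling-plus-interpolation idea can be illustrated with a minimal sketch; the sinusoidal hourly concentration series below is a hypothetical stand-in for real OA data, and the 4 h window is one of the collection times considered above.

```python
import numpy as np

# Hypothetical hourly "true" concentration series (ug/m3) over two days.
hours = np.arange(48)
true_conc = 5.0 + 3.0 * np.sin(2 * np.pi * hours / 24.0)

# Sequential 4 h sampling: each sample is the mean over its collection window.
width = 4
mid = np.arange(width / 2, 48, width)          # window midpoints
samples = true_conc.reshape(-1, width).mean(axis=1)

# Linear interpolation between window midpoints recovers an hourly series.
hourly_est = np.interp(hours, mid, samples)

rmse = np.sqrt(np.mean((hourly_est - true_conc) ** 2))
print(f"recovery RMSE ~ {rmse:.2f} ug/m3")
```

Most of the recovery error in this toy case comes from the window-averaging attenuation and from the clamped extrapolation at the ends of the series.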

  9. Sparsity-weighted outlier FLOODing (OFLOOD) method: Efficient rare event sampling method using sparsity of distribution.

    PubMed

    Harada, Ryuhei; Nakamura, Tomotake; Shigeta, Yasuteru

    2016-03-30

    As an extension of the Outlier FLOODing (OFLOOD) method [Harada et al., J. Comput. Chem. 2015, 36, 763], the sparsity of the outliers defined by a hierarchical clustering algorithm, FlexDice, was considered to achieve an efficient conformational search as sparsity-weighted "OFLOOD." In OFLOOD, FlexDice detects areas of sparse distribution as outliers. The outliers are regarded as candidates that have high potential to promote conformational transitions and are employed as initial structures for conformational resampling by restarting molecular dynamics simulations. When detecting outliers, FlexDice assigns each outlier a rank in the hierarchy, which relates to its sparsity in the distribution. In this study, we define low-rank (first-ranked), medium-rank (second-ranked), and highest-rank (third-ranked) outliers. For instance, the first-ranked outliers are located in regions of conformational space far from the clusters (highly sparse distribution), whereas the third-ranked outliers lie near the clusters (moderately sparse distribution). To perform the conformational search efficiently, resampling from the outliers with a given rank is performed. As demonstrations, this method was applied to several model systems: alanine dipeptide, Met-enkephalin, Trp-cage, T4 lysozyme, and glutamine-binding protein. In each demonstration, the present method successfully reproduced transitions among metastable states. In particular, the first-ranked OFLOOD strongly accelerated the exploration of conformational space by expanding its edges. In contrast, the third-ranked OFLOOD intensively reproduced local transitions among neighboring metastable states. For quantitative evaluation of the sampled snapshots, free energy calculations were performed in combination with umbrella sampling, providing rigorous free energy landscapes of the biomolecules.

  10. Method for sequential injection of liquid samples for radioisotope separations

    DOEpatents

    Egorov, Oleg B.; Grate, Jay W.; Bray, Lane A.

    2000-01-01

    The present invention is a method of separating a short-lived daughter isotope from a longer lived parent isotope, with recovery of the parent isotope for further use. Using a system with a bi-directional pump and one or more valves, a solution of the parent isotope is processed to generate two separate solutions, one of which contains the daughter isotope, from which the parent has been removed with a high decontamination factor, and the other solution contains the recovered parent isotope. The process can be repeated on this solution of the parent isotope. The system with the fluid drive and one or more valves is controlled by a program on a microprocessor executing a series of steps to accomplish the operation. In one approach, the cow solution is passed through a separation medium that selectively retains the desired daughter isotope, while the parent isotope and the matrix pass through the medium. After washing this medium, the daughter is released from the separation medium using another solution. With the automated generator of the present invention, all solution handling steps necessary to perform a daughter/parent radionuclide separation, e.g. Bi-213 from Ac-225 "cow" solution, are performed in a consistent, enclosed, and remotely operated format. Operator exposure and spread of contamination are greatly minimized compared to the manual generator procedure described in U.S. patent application Ser. No. 08/789,973, now U.S. Pat. No. 5,749,042, herein incorporated by reference. Using 16 mCi of Ac-225 there was no detectable external contamination of the instrument components.

  11. [Comparative Analysis of Spectrophotometric Methods of the Protein Measurement in the Pectic Polysaccharide Samples].

    PubMed

    Ponomareva, S A; Golovchenko, V V; Patova, O A; Vanchikova, E V; Ovodov, Y S

    2015-01-01

    To assess the reliability of determining the protein content of pectic polysaccharide samples by absorbance in the ultraviolet and visible regions of the spectrum, eleven techniques were compared: the Flores, Lowry, Bradford, Sedmak, and Ruhemann (ninhydrin reaction) methods, ultraviolet spectrophotometry, the Benedict's reagent method, the Nessler's reagent method, the amido black method, the bicinchoninic acid reagent, and the biuret method. The data obtained show that the insufficient sensitivity of seven of these techniques does not allow their use for determining protein content in pectic polysaccharide samples. The Lowry, Bradford, and Sedmak methods and the Nessler's reagent method, however, may be used for this purpose: the Bradford method is advisable for determining protein contaminant content in pectic polysaccharide samples when the protein content is less than 15%, and the Lowry method when it is more than 15%.

  12. A Method for Selective Enrichment and Analysis of Nitrotyrosine-Containing Peptides in Complex Proteome Samples

    SciTech Connect

    Zhang, Qibin; Qian, Weijun; Knyushko, Tanya V.; Clauss, Therese RW; Purvine, Samuel O.; Moore, Ronald J.; Sacksteder, Colette A.; Chin, Mark H.; Smith, Desmond J.; Camp, David G.; Bigelow, Diana J.; Smith, Richard D.

    2007-06-01

    Elevated levels of protein tyrosine nitration have been found in various neurodegenerative diseases and aging-related pathologies; however, the lack of an efficient enrichment method has prevented the analysis of this important low-level protein modification. We have developed an efficient method for specific enrichment of nitrotyrosine-containing peptides that permits nitrotyrosine peptides and specific nitration sites to be unambiguously identified with LC-MS/MS. The method is based on the derivatization of nitrotyrosine into free sulfhydryl groups, followed by high-efficiency enrichment of sulfhydryl-containing peptides with thiopropyl sepharose beads. The derivatization process starts with acetylation with acetic anhydride to block all primary amines, followed by reduction of nitrotyrosine to aminotyrosine, derivatization of aminotyrosine with N-succinimidyl S-acetylthioacetate (SATA), and finally deprotection of the S-acetyl group of SATA to form free sulfhydryl groups. This method was evaluated using nitrotyrosine-containing peptides, in-vitro-nitrated human histone 1.2, and bovine serum albumin (BSA). 91% and 62% of the identified peptides from enriched histone and BSA samples, respectively, were nitrotyrosine-derivatized peptides, suggesting relatively high specificity of the enrichment method. The application of this method to in-vitro-nitrated mouse brain homogenate resulted in 35% of identified peptides containing nitrotyrosine (compared with only 5.9% observed in the global analysis of the unenriched sample), and a total of 150 unique nitrated peptides covering 102 proteins were identified, with a false discovery rate estimated at 3.3%, from duplicate LC-MS/MS analyses of a single enriched sample.

  13. Surveying immigrants without sampling frames - evaluating the success of alternative field methods.

    PubMed

    Reichel, David; Morales, Laura

    2017-01-01

    This paper evaluates the sampling methods of an international survey, the Immigrant Citizens Survey, which aimed at surveying immigrants from outside the European Union (EU) in 15 cities in seven EU countries. In five countries, no sample frame was available for the target population. Consequently, alternative ways to obtain a representative sample had to be found. In three countries 'location sampling' was employed, while in two countries traditional methods were used with adaptations to reach the target population. The paper assesses the main methodological challenges of carrying out a survey among a group of immigrants for whom no sampling frame exists. The samples of the survey in these five countries are compared to results of official statistics in order to assess the accuracy of the samples obtained through the different sampling methods. It can be shown that alternative sampling methods can provide meaningful results in terms of core demographic characteristics although some estimates differ to some extent from the census results.

  14. A Novel Videography Method for Generating Crack-Extension Resistance Curves in Small Bone Samples

    PubMed Central

    Katsamenis, Orestis L.; Jenkins, Thomas; Quinci, Federico; Michopoulou, Sofia; Sinclair, Ian; Thurner, Philipp J.

    2013-01-01

    Assessment of bone quality is an emerging solution for quantifying the effects of bone pathology or treatment. Perhaps one of the most important parameters characterising bone quality is the toughness behaviour of bone. Particularly, fracture toughness, is becoming a popular means for evaluating bone quality. The method is moving from a single value approach that models bone as a linear-elastic material (using the stress intensity factor, K) towards full crack extension resistance curves (R-curves) using a non-linear model (the strain energy release rate in J-R curves). However, for explanted human bone or small animal bones, there are difficulties in measuring crack-extension resistance curves due to size constraints at the millimetre and sub-millimetre scale. This research proposes a novel “whitening front tracking” method that uses videography to generate full fracture resistance curves in small bone samples where crack propagation cannot typically be observed. Here we present this method on sharp edge notched samples (<1 mm×1 mm×Length) prepared from four human femora tested in three-point bending. Each sample was loaded in a mechanical tester with the crack propagation recorded using videography and analysed using an algorithm to track the whitening (damage) zone. Using the “whitening front tracking” method, full R-curves and J-R curves could be generated for these samples. The curves for this antiplane longitudinal orientation were similar to those found in the literature, being between the published longitudinal and transverse orientations. The proposed technique shows the ability to generate full “crack” extension resistance curves by tracking the whitening front propagation to overcome the small size limitations and the single value approach. PMID:23405186

  15. Predictive Sampling of Rare Conformational Events in Aqueous Solution: Designing a Generalized Orthogonal Space Tempering Method.

    PubMed

    Lu, Chao; Li, Xubin; Wu, Dongsheng; Zheng, Lianqing; Yang, Wei

    2016-01-12

    In aqueous solution, solute conformational transitions are governed by intimate interplays of the fluctuations of solute-solute, solute-water, and water-water interactions. To promote molecular fluctuations and enhance sampling of essential conformational changes, a common strategy is to construct an expanded Hamiltonian through a series of Hamiltonian perturbations and thereby broaden the distribution of certain interactions of focus. Due to a lack of active sampling of the configuration response to Hamiltonian transitions, it is challenging for common expanded-Hamiltonian methods to robustly explore solvent-mediated rare conformational events. The orthogonal space sampling (OSS) scheme, as exemplified by the orthogonal space random walk and orthogonal space tempering methods, provides a general framework for synchronous acceleration of slow configuration responses. To more effectively sample conformational transitions in aqueous solution, in this work we devised a generalized orthogonal space tempering (gOST) algorithm. Specifically, in the Hamiltonian perturbation part, a solvent-accessible-surface-area-dependent term is introduced to implicitly perturb near-solute water-water fluctuations; more importantly, in the orthogonal space response part, the generalized force order parameter is extended to a two-dimensional order parameter set, in which essential solute-solvent and solute-solute components are treated separately. The gOST algorithm is evaluated through a molecular dynamics simulation study of the explicitly solvated deca-alanine (Ala10) peptide. On the basis of a fully automated sampling protocol, the gOST simulation enabled repetitive folding and unfolding of the solvated peptide within a single continuous trajectory and allowed detailed construction of Ala10 folding/unfolding free energy surfaces. The gOST result reveals that solvent cooperative fluctuations play a pivotal role in Ala10 folding/unfolding transitions. In addition, our assessment

  16. Estimation of the Coefficient of Variation with Minimum Risk: A Sequential Method for Minimizing Sampling Error and Study Cost.

    PubMed

    Chattopadhyay, Bhargab; Kelley, Ken

    2016-01-01

    The coefficient of variation is an effect size measure with many potential uses in psychology and related disciplines. We propose a general theory for sequential estimation of the population coefficient of variation that considers both the sampling error and the study cost, importantly without specific distributional assumptions. Fixed sample size planning methods, commonly used in psychology and related fields, cannot simultaneously minimize both the sampling error and the study cost. The procedure we develop is the first sequential sampling procedure for estimating the coefficient of variation. We first present a method of planning a pilot sample size after the research goals are specified by the researcher. Then, after collecting a sample as large as the estimated pilot sample size, a check is performed to assess whether the conditions necessary to stop the data collection have been satisfied. If not, an additional observation is collected and the check is performed again. This process continues, sequentially, until a stopping rule involving a risk function is satisfied. Our method ensures that the sampling error and the study cost are considered simultaneously so that the cost is not higher than necessary for the tolerable sampling error. We also demonstrate a variety of properties of the distribution of the final sample size for five different distributions under a variety of conditions with a Monte Carlo simulation study. In addition, we provide freely available functions via the MBESS package in R to implement the methods discussed.
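
A simplified sketch of such a sequential procedure. The stopping rule below uses an approximate large-sample standard error of the CV rather than the risk function developed in the paper, and the normal population and tolerance are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def sequential_cv(draw, pilot_n=30, tol=0.01, max_n=100000):
    """Sequentially estimate the coefficient of variation (sd/mean).

    Stops when an approximate standard error of the CV estimate falls
    below `tol` -- a simplified precision rule, not the risk function
    from the paper.
    """
    data = list(draw(pilot_n))          # step 1: collect the pilot sample
    while len(data) < max_n:
        x = np.asarray(data)
        cv = x.std(ddof=1) / x.mean()
        # Large-sample s.e. of the CV for roughly normal data:
        se = cv * np.sqrt(1.0 / (2.0 * (len(x) - 1)) + cv**2 / len(x))
        if se < tol:                    # step 2: check the stopping rule
            return cv, len(x)
        data.append(draw(1)[0])         # step 3: otherwise add one observation
    return cv, len(data)

draw = lambda n: rng.normal(50.0, 10.0, n)   # hypothetical population, CV = 0.2
cv_hat, n_final = sequential_cv(draw)
print(f"CV ~ {cv_hat:.3f} after n = {n_final} observations")
```

The final sample size is a random variable here, exactly as studied in the paper's Monte Carlo simulations.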

  17. 7 CFR 32.402 - Samples of mohair top grades; method of obtaining.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... through 20s. (b) Each application for standard samples of mohair top shall be upon an application form... 7 Agriculture 2 2010-01-01 2010-01-01 false Samples of mohair top grades; method of obtaining. 32... STANDARD CONTAINER REGULATIONS PURCHASE OF GREASE MOHAIR AND MOHAIR TOP SAMPLES § 32.402 Samples of...

  18. A Typology of Mixed Methods Sampling Designs in Social Science Research

    ERIC Educational Resources Information Center

    Onwuegbuzie, Anthony J.; Collins, Kathleen M. T.

    2007-01-01

    This paper provides a framework for developing sampling designs in mixed methods research. First, we present sampling schemes that have been associated with quantitative and qualitative research. Second, we discuss sample size considerations and provide sample size recommendations for each of the major research designs for quantitative and…

  19. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 17 2012-07-01 2012-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect samples of gasoline and diesel fuel for purposes of determining compliance with the requirements of this...

  20. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 16 2011-07-01 2011-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect samples of gasoline and diesel fuel for purposes of determining compliance with the requirements of this...

  1. 40 CFR 80.8 - Sampling methods for gasoline and diesel fuel.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 17 2013-07-01 2013-07-01 false Sampling methods for gasoline and... gasoline and diesel fuel. The sampling methods specified in this section shall be used to collect samples of gasoline and diesel fuel for purposes of determining compliance with the requirements of this...

  2. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    PubMed Central

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling method used to construct them. In this paper, a new sequential optimization sampling method is proposed. With this method, metamodels are constructed repeatedly through the addition of sampling points, namely, extrema of the metamodel and minima of a density function; increasingly accurate metamodels result from repeating this procedure. The validity and effectiveness of the proposed sampling method are examined through typical numerical examples. PMID:25133206
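
The sequential enrichment idea can be sketched with a Gaussian RBF metamodel of a cheap 1-D test function. The refinement criterion below (resample at the current model minimiser) is a simplified stand-in for the paper's extrema-plus-density criterion; the test function and shape parameter are assumptions.

```python
import numpy as np

def f(x):
    """Cheap stand-in for an expensive simulation."""
    return (x - 0.3) ** 2 + 0.1 * np.sin(12 * x)

def rbf_fit(xs, ys, eps=3.0):
    """Gaussian RBF interpolant through the points (xs, ys)."""
    phi = np.exp(-(eps * (xs[:, None] - xs[None, :])) ** 2)
    w = np.linalg.solve(phi, ys)
    return lambda x: np.exp(
        -(eps * (np.atleast_1d(x)[:, None] - xs[None, :])) ** 2) @ w

xs = np.linspace(0.0, 1.0, 5)            # initial space-filling design
ys = f(xs)
grid = np.linspace(0.0, 1.0, 201)

for _ in range(5):                       # sequential enrichment loop
    model = rbf_fit(xs, ys)
    x_new = grid[np.argmin(model(grid))]     # refine where the model is lowest
    if np.min(np.abs(xs - x_new)) < 0.01:    # avoid near-duplicate points
        break
    xs = np.append(xs, x_new)
    ys = np.append(ys, f(x_new))

final = rbf_fit(xs, ys)
err = np.max(np.abs(final(grid) - f(grid)))
print(f"{len(xs)} sample points, max model error ~ {err:.4f}")
```

Each added point costs one extra evaluation of the expensive function but concentrates accuracy in the region of interest.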

  3. Differences in Movement Pattern and Detectability between Males and Females Influence How Common Sampling Methods Estimate Sex Ratio

    PubMed Central

    Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco

    2016-01-01

    Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now no study has evaluated how efficient the sampling methods commonly used in biodiversity surveys are at estimating the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population’s sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability for species is important information to guide field studies aiming to understand sex-ratio-related patterns. PMID:27441554
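The detectability effect the authors describe can be reproduced in a few lines: if one sex is captured more readily than the other, the raw catch misstates a perfectly even true sex ratio. The detection probabilities below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy version of the detectability bias: a population with a true 1:1
# sex ratio, but males are detected more often than females, so the raw
# capture data overestimate the proportion of males.
n_males = n_females = 500
p_detect_male, p_detect_female = 0.6, 0.3      # assumed detection probabilities

caught_males = rng.binomial(n_males, p_detect_male)
caught_females = rng.binomial(n_females, p_detect_female)
observed_ratio = caught_males / (caught_males + caught_females)
# expected observed proportion of males: 0.6 / (0.6 + 0.3) = 2/3, not 0.5
```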

  4. Enhanced Sampling in Free Energy Calculations: Combining SGLD with the Bennett's Acceptance Ratio and Enveloping Distribution Sampling Methods.

    PubMed

    König, Gerhard; Miller, Benjamin T; Boresch, Stefan; Wu, Xiongwu; Brooks, Bernard R

    2012-10-09

    One of the key requirements for the accurate calculation of free energy differences is proper sampling of conformational space. Especially in biological applications, molecular dynamics simulations are often confronted with rugged energy surfaces and high energy barriers, leading to insufficient sampling and, in turn, poor convergence of the free energy results. In this work, we address this problem by employing enhanced sampling methods. We explore the possibility of using self-guided Langevin dynamics (SGLD) to speed up the exploration process in free energy simulations. To obtain improved free energy differences from such simulations, it is necessary to account for the effects of the bias due to the guiding forces. We demonstrate how this can be accomplished for the Bennett's acceptance ratio (BAR) and the enveloping distribution sampling (EDS) methods. While BAR is considered among the most efficient methods available for free energy calculations, the EDS method developed by Christ and van Gunsteren is a promising development that reduces the computational costs of free energy calculations by simulating a single reference state. To evaluate the accuracy of both approaches in connection with enhanced sampling, EDS was implemented in CHARMM. For testing, we employ benchmark systems with analytical reference results and the mutation of alanine to serine. We find that SGLD with reweighting can provide accurate results for BAR and EDS where conventional molecular dynamics simulations fail. In addition, we compare the performance of EDS with other free energy methods. We briefly discuss the implications of our results and provide practical guidelines for conducting free energy simulations with SGLD.
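For readers unfamiliar with BAR, a minimal self-consistent estimator can be sketched as follows. This is the generic textbook form of the method (equal numbers of forward and reverse work samples, energies in units of kT), not the CHARMM implementation discussed in the paper; the synthetic Gaussian work distributions are chosen only because they satisfy the Crooks relation with a known answer.

```python
import numpy as np

def bar_delta_f(w_f, w_r, tol=1e-8):
    """Bennett acceptance ratio (BAR) estimate of a free-energy
    difference (in units of kT) from forward and reverse work samples,
    assuming equal sample sizes; solved self-consistently by bisection."""
    fermi = lambda x: 1.0 / (1.0 + np.exp(x))
    # BAR condition: sum fermi(w_f - dF) = sum fermi(w_r + dF).
    # The left side grows and the right side shrinks as dF increases,
    # so g is monotone and bisection finds the unique root.
    g = lambda df: fermi(w_f - df).sum() - fermi(w_r + df).sum()
    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Synthetic check: Gaussian work distributions satisfying the Crooks
# relation with a known free-energy difference of 2 kT.
rng = np.random.default_rng(0)
w_f = rng.normal(2.0 + 0.5, 1.0, 20000)    # forward work, mean dF + sigma^2/2
w_r = rng.normal(-2.0 + 0.5, 1.0, 20000)   # reverse work, mean -dF + sigma^2/2
est = bar_delta_f(w_f, w_r)                # recovers ~2.0
```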

  5. A fully automated plasma protein precipitation sample preparation method for LC-MS/MS bioanalysis.

    PubMed

    Ma, Ji; Shi, Jianxia; Le, Hoa; Cho, Robert; Huang, Judy Chi-jou; Miao, Shichang; Wong, Bradley K

    2008-02-01

    This report describes the development and validation of a robust robotic system that fully integrates all peripheral devices needed for the automated preparation of plasma samples by protein precipitation. The liquid handling system consisted of a Tecan Freedom EVO 200 liquid handling platform equipped with an 8-channel liquid handling arm, two robotic plate-handling arms, and two plate shakers. Important additional components integrated into the platform were a robotic temperature-controlled centrifuge, a plate sealer, and a plate seal piercing station. These enabled unattended operation starting from a stock solution of the test compound, a set of test plasma samples and associated reagents. The stock solution of the test compound was used to prepare plasma calibration and quality control samples. Once calibration and quality control samples were prepared, precipitation of plasma proteins was achieved by addition of three volumes of acetonitrile. Integration of the peripheral devices allowed automated sequential completion of the centrifugation, plate sealing, piercing and supernatant transferral steps. The method produced a sealed, injection-ready 96-well plate of plasma extracts. Accuracy and precision of the automated system were satisfactory for the intended use: intra-day and the inter-day precision were excellent (C.V.<5%), while the intra-day and inter-day accuracies were acceptable (relative error<8%). The flexibility of the platform was sufficient to accommodate pharmacokinetic studies of different numbers of animals and time points. To the best of our knowledge, this represents the first complete automation of the protein precipitation method for plasma sample analysis.

  6. Extended Phase-Space Methods for Enhanced Sampling in Molecular Simulations: A Review

    PubMed Central

    Fujisaki, Hiroshi; Moritsugu, Kei; Matsunaga, Yasuhiro; Morishita, Tetsuya; Maragliano, Luca

    2015-01-01

    Molecular Dynamics simulations are a powerful approach to study biomolecular conformational changes or protein–ligand, protein–protein, and protein–DNA/RNA interactions. Straightforward applications, however, are often hampered by incomplete sampling, since in a typical simulated trajectory the system will spend most of its time trapped by high energy barriers in restricted regions of the configuration space. Over the years, several techniques have been designed to overcome this problem and enhance space sampling. Here, we review a class of methods that rely on the idea of extending the set of dynamical variables of the system by adding extra ones associated to functions describing the process under study. In particular, we illustrate the Temperature Accelerated Molecular Dynamics (TAMD), Logarithmic Mean Force Dynamics (LogMFD), and Multiscale Enhanced Sampling (MSES) algorithms. We also discuss combinations with techniques for searching reaction paths. We show the advantages presented by this approach and how it allows to quickly sample important regions of the free-energy landscape via automatic exploration. PMID:26389113

  7. Monitoring benthic algal communities: A comparison of targeted and coefficient sampling methods

    USGS Publications Warehouse

    Edwards, Matthew S.; Tinker, M. Tim

    2009-01-01

    Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that the two methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; the Targeted Sampling method requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limited, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.

  8. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
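A minimal bootstrap version of sequential importance sampling/resampling for a count-data state-space model might look as follows. This sketch is not the authors' model: the growth rate is treated as known, the kernel smoothing of parameters they describe is omitted, and all numbers are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate 25 years of counts from a stochastic population with a
# known growth rate (the "true" process the filter must recover).
lam, T = 1.02, 25
truth = [200.0]
for _ in range(T - 1):
    truth.append(truth[-1] * lam * np.exp(rng.normal(0, 0.02)))
truth = np.array(truth)
counts = rng.poisson(truth)                      # observed counts

# Bootstrap SISR filter for the latent population size.
P = 2000
particles = rng.uniform(100.0, 300.0, P)         # diffuse initial particles
filtered = []
for y in counts:
    particles *= lam * np.exp(rng.normal(0, 0.02, P))   # propagate (importance sample)
    logw = y * np.log(particles) - particles            # Poisson log-likelihood (up to a constant)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    filtered.append(np.sum(w * particles))              # weighted (filtered) mean
    particles = rng.choice(particles, size=P, p=w)      # resample
filtered = np.array(filtered)
```

The filtered means track the latent population closely even though only noisy counts are observed, which is the behavior the abstract reports for the uninformative-prior case.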

  9. Method for multiresidue determination of halogenated aromatics and PAHs in combustion-related samples.

    PubMed

    Liljelind, Per; Söderström, Gunilla; Hedman, Björn; Karlsson, Stina; Lundin, Lisa; Marklund, Stellan

    2003-08-15

    Flue gas and fly ash samples have a complex composition. Thus, thorough extraction and selective cleanup prior to analysis are essential. This paper presents an evaluated method for determining halogenated dibenzo-p-dioxins (PXDD), halogenated dibenzofurans (PXDF), chlorinated biphenyls (PCB), chlorobenzenes (CBz), chlorophenols (CPh), dibenzo-p-dioxins (DD), dibenzofurans (DF), and polycyclic aromatic hydrocarbons (PAH) in a single sample. Since these combustion byproducts are ubiquitous, harmful environmental contaminants, it is very important to obtain reliable assessments of them, especially of the specific PCDD/F and PCB congeners with Ah-receptor-mediated toxicity. The reported method includes techniques such as solid-phase extraction, Soxhlet-Dean-Stark extraction, cleanup using open liquid chromatographic columns, and finally GC/MS analysis with quantification by the isotope dilution technique. The validation results presented here show good reproducibility for PXDD/F and PCB and satisfactory reproducibility for CPh, CBz, and PAH. An extraction efficiency test revealed that a nonpolar solvent did not completely extract a few analytes, i.e., diCPh and fluorene, which appear to require a more polar extraction agent. To pinpoint and minimize the loss of analytes, specific studies on the reduction of their amounts during sample concentration were performed, showing that traditional rotary evaporation and nitrogen blow-down produce results as good as those of a novel technique.

  10. Isolation of three important types of stem cells from the same samples of banked umbilical cord blood.

    PubMed

    Phuc, Pham Van; Ngoc, Vu Bich; Lam, Dang Hoang; Tam, Nguyen Thanh; Viet, Pham Quoc; Ngoc, Phan Kim

    2012-06-01

    It is known that umbilical cord blood (UCB) is a rich source of stem cells with practical and ethical advantages. Three important types of stem cells which can be harvested from umbilical cord blood and used in disease treatment are hematopoietic stem cells (HSCs), mesenchymal stem cells (MSCs) and endothelial progenitor cells (EPCs). Since these stem cells have shown enormous potential in regenerative medicine, numerous umbilical cord blood banks have been established. In this study, we examined the ability of banked UCB to produce three types of stem cells from the same samples with characteristics of HSCs, MSCs and EPCs. We were able to obtain homogeneous rapidly plastic-adherent cells (with characteristics of MSCs), slowly adherent cells (with characteristics of EPCs) and non-adherent cells (with characteristics of HSCs) from the mononuclear cell fractions of cryopreserved UCB. Using a protocol of 48 h supernatant transfer, we successfully isolated MSCs, which expressed CD13, CD44 and CD90 while being negative for CD34, CD45 and CD133, had a typical fibroblast-like shape, and were able to differentiate into adipocytes; EPCs, which were CD34 and CD90 positive, negative for CD13, CD44, CD45 and CD133, and adherent with a cobblestone-like shape; and HSCs, which formed colonies when cultured in MethoCult medium.

  11. The Five Planets in the Kepler-296 Binary System All Orbit the Primary: An Application of Importance Sampling

    NASA Astrophysics Data System (ADS)

    Barclay, Thomas; Quintana, Elisa; Adams, Fred; Ciardi, David; Huber, Daniel; Foreman-Mackey, Daniel; Montet, Benjamin Tyler; Caldwell, Douglas

    2015-08-01

    Kepler-296 is a binary star system with two M-dwarf components separated by 0.2 arcsec. Five transiting planets have been confirmed to be associated with the Kepler-296 system; given the evidence to date, however, the planets could in principle orbit either star. This ambiguity has made it difficult to constrain both the orbital and physical properties of the planets. Using both statistical and analytical arguments, this paper shows that all five planets are highly likely to orbit the primary star in this system. We performed a Markov chain Monte Carlo simulation using a five-transiting-planet model, assigning uniform priors to the stellar density and dilution. Using importance sampling, we compared the model probabilities under the priors of the planets orbiting either the brighter or the fainter component of the binary. A model where the planets orbit the brighter component, Kepler-296A, is strongly preferred by the data. Combined with our assertion that all five planets orbit the same star, the two outer planets in the system, Kepler-296 Ae and Kepler-296 Af, have radii of 1.53 ± 0.26 and 1.80 ± 0.31 R⊕, respectively, and receive incident stellar fluxes of 1.40 ± 0.23 and 0.62 ± 0.10 times the incident flux the Earth receives from the Sun. This level of irradiation places both planets within or close to the circumstellar habitable zone of their parent star.
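The core importance-sampling trick used here — draw from one distribution, then reweight by a density ratio to answer questions about another — can be illustrated with a toy tail-probability estimate (unrelated to the Kepler-296 analysis itself):

```python
import math

import numpy as np

rng = np.random.default_rng(2)

# Estimate P(X > 3) for X ~ N(0, 1).  Plain Monte Carlo wastes almost
# every draw; importance sampling instead draws from a proposal shifted
# into the tail, N(3, 1), and reweights each draw by the density ratio
# p(x)/q(x), keeping the estimator unbiased for the target.
n = 50_000
x = rng.normal(3.0, 1.0, n)                       # draws from proposal q
log_ratio = -0.5 * x**2 + 0.5 * (x - 3.0)**2      # log p(x) - log q(x)
w = np.exp(log_ratio)                             # importance weights
estimate = np.mean((x > 3.0) * w)

exact = 0.5 * math.erfc(3.0 / math.sqrt(2.0))     # 1 - Phi(3), for reference
```

With the shifted proposal, essentially every draw lands in the region of interest, so the estimate is accurate with far fewer samples than naive Monte Carlo would need.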

  12. Method validation for control determination of mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry.

    PubMed

    Torres, Daiane Placido; Martins-Teixeira, Maristela Braga; Cadore, Solange; Queiroz, Helena Müller

    2015-01-01

    A method for the determination of total mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) has been validated following international foodstuff protocols in order to fulfill the Brazilian National Residue Control Plan. The experimental parameters were previously studied and optimized according to specific legislation on validation and inorganic contaminants in foodstuff. Linearity, sensitivity, specificity, detection and quantification limits, precision (repeatability and within-laboratory reproducibility), robustness and accuracy of the method have been evaluated. Linearity of response was satisfactory for the two concentration ranges available on the TDA AAS equipment, between approximately 25.0 and 200.0 μg kg(-1) (quadratic regression) and 250.0 and 2000.0 μg kg(-1) (linear regression) of mercury. The residuals for both ranges were homoscedastic and independent, with normal distribution. Correlation coefficients obtained for these ranges were higher than 0.995. The limits of quantification (LOQ) and detection of the method (LDM), based on the signal standard deviation (SD) for a low-mercury sample, were 3.0 and 1.0 μg kg(-1), respectively. Repeatability of the method was better than 4%. Within-laboratory reproducibility achieved a relative SD better than 6%. Robustness of the current method was evaluated and identified sample mass as a significant factor. Accuracy (assessed as analyte recovery) was calculated on the basis of the repeatability, and ranged from 89% to 99%. The obtained results showed the suitability of the present method for direct mercury measurement in fresh fish and shrimp samples and the importance of monitoring the analysis conditions for food control purposes. Additionally, the competence of this method was recognized by accreditation under the standard ISO/IEC 17025.
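As a point of reference, detection and quantification limits derived from the standard deviation of replicate signals are conventionally computed with the 3σ/10σ rule. The replicate values below are illustrative, not the paper's data.

```python
import numpy as np

# Conventional detection/quantification limits from the standard
# deviation of replicate measurements of a low-concentration sample
# (numbers invented for illustration).
signals = np.array([0.98, 1.03, 1.01, 0.97, 1.02, 0.99])  # replicate readings

sd = signals.std(ddof=1)   # sample standard deviation
lod = 3 * sd               # limit of detection (3-sigma convention)
loq = 10 * sd              # limit of quantification (10-sigma convention)
```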

  13. Comparison of preprocessing methods and storage times for touch DNA samples

    PubMed Central

    Dong, Hui; Wang, Jing; Zhang, Tao; Ge, Jian-ye; Dong, Ying-qiang; Sun, Qi-fan; Liu, Chao; Li, Cai-xia

    2017-01-01

    Aim To select appropriate preprocessing methods for different substrates by comparing the effects of four different preprocessing methods on touch DNA samples, and to determine the effect of various storage times on the results of touch DNA sample analysis. Method Hand touch DNA samples were used to investigate the detection and inspection results of DNA on different substrates. Four preprocessing methods, including the direct cutting method, the stubbing procedure, the double swab technique, and the vacuum cleaner method, were used in this study. DNA was extracted from mock samples with the four preprocessing methods. The best preprocessing protocol determined from the study was further used to compare performance after various storage times. DNA extracted from all samples was quantified and amplified using standard procedures. Results The amounts of DNA and the numbers of alleles detected on the porous substrates were greater than those on the non-porous substrates. The performance of the four preprocessing methods varied with different substrates. The direct cutting method displayed advantages for porous substrates, and the vacuum cleaner method was advantageous for non-porous substrates. No significant degradation trend was observed as the storage times increased. Conclusion Different substrates require the use of different preprocessing methods in order to obtain the highest DNA amount and allele number from touch DNA samples. This study provides a theoretical basis for exploration of touch DNA samples and may be used as a reference when dealing with touch DNA samples in casework. PMID:28252870

  14. Sampling uncertainty evaluation for data acquisition board based on Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Ge, Leyi; Wang, Zhongyu

    2008-10-01

    Evaluating the sampling uncertainty of a data acquisition board is a difficult problem in the field of signal sampling. This paper first analyzes the sources of data acquisition board sampling uncertainty, then introduces a simulation theory for evaluating data acquisition board sampling uncertainty based on the Monte Carlo method, and puts forward a model relating the sampling uncertainty results, the number of samples and the number of simulation runs. For different sample numbers and different signal scopes, the authors establish a random sampling uncertainty evaluation program for a PCI-6024E data acquisition board to execute the simulation. The results of the proposed Monte Carlo simulation method are in good agreement with the GUM ones, which validates the Monte Carlo method.
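A Monte Carlo evaluation of this kind can be sketched generically: repeatedly simulate the acquisition chain (noise plus quantisation), and take the spread of the resulting sample means as the uncertainty. The noise level and quantisation step below are invented, and the analytic GUM-style value is included only for comparison.

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo evaluation of the standard uncertainty of a digitised
# sample mean: additive Gaussian noise plus uniform quantisation error.
trials, n_samples = 10_000, 100
true_value = 1.0        # volts
noise_sd = 0.01         # board noise, volts (assumed)
lsb = 0.005             # quantisation step, volts (assumed)

means = np.empty(trials)
for k in range(trials):
    v = true_value + rng.normal(0, noise_sd, n_samples)
    v = np.round(v / lsb) * lsb          # quantise to the ADC grid
    means[k] = v.mean()

u_mc = means.std(ddof=1)                 # Monte Carlo uncertainty of the mean

# GUM-style analytic value for comparison: combine the noise variance
# and the uniform quantisation variance lsb^2/12, divided by n.
u_gum = np.sqrt((noise_sd**2 + lsb**2 / 12) / n_samples)
```

Agreement between `u_mc` and `u_gum` is the kind of cross-check the abstract reports between the Monte Carlo and GUM results.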

  15. Magnetic irreversibility: An important amendment in the zero-field-cooling and field-cooling method

    NASA Astrophysics Data System (ADS)

    Teixeira Dias, Fábio; das Neves Vieira, Valdemar; Esperança Nunes, Sabrina; Pureur, Paulo; Schaf, Jacob; Fernanda Farinela da Silva, Graziele; de Paiva Gouvêa, Cristol; Wolff-Fabris, Frederik; Kampert, Erik; Obradors, Xavier; Puig, Teresa; Roa Rovira, Joan Josep

    2016-02-01

    The present work reports on experimental procedures to correct significant deviations of magnetization data, caused by magnetic relaxation, due to small field cycling by sample transport in the inhomogeneous applied magnetic field of commercial magnetometers. The extensively used method of measuring the magnetic irreversibility by first cooling the sample in zero field, switching on a constant applied magnetic field and measuring the magnetization M(T) while slowly warming the sample, and subsequently measuring M(T) while slowly cooling it back in the same field, is very sensitive even to small displacements of the magnetization curve. In our melt-processed YBaCuO superconducting sample we observed displacements of the irreversibility limit of up to 7 K in high fields. Such displacements are detected only on confronting the magnetic irreversibility limit with other measurements, for instance zero resistance, in which the sample remains fixed and so is not affected by such relaxation. We measured the magnetic irreversibility, Tirr(H), using a vibrating sample magnetometer (VSM) from Quantum Design. The zero-resistance data, Tc0(H), were obtained using a PPMS from Quantum Design. On confronting our irreversibility lines with those of zero resistance, we observed that the Tc0(H) data fell several kelvin above the Tirr(H) data, which obviously contradicts the well-known properties of superconductivity. In order to get consistent Tirr(H) data in the H-T plane, it was necessary to perform numerous additional measurements as a function of the amplitude of the sample transport and extrapolate the Tirr(H) data for each applied field to zero amplitude.

  16. Transition Path Sampling Method and Its Application in Argon Phase Transition

    NASA Astrophysics Data System (ADS)

    Li, Bingxi

    Rare events during both physical and chemical transitions are of great significance to understand the evolution of systems from one stable state to another. Solid-solid phase transition is a fundamental problem in this field, and many experimental and theoretical efforts have been made to tackle it. However, Molecular Dynamics simulations in this field encounter the problem that these transitions occur too rarely to be observed within current simulations. The Transition Path Sampling (TPS) method is designed to tackle this issue. The phase transition between the face-centered cubic (fcc) and hexagonal close-packed (hcp) phases in solid argon at 40 K is investigated with the TPS method. TPS is a rare-event sampling methodology that combines Molecular Dynamics and Monte Carlo. Molecular Dynamics is used to generate the whole trajectory from an assigned starting point, based on the time evolution of the system. Monte Carlo is applied to select a structure from the known phase transition trajectories as the starting point. This is an importance sampling process, and the acceptance probability for starting-point selection depends on its equilibrium probability in the ensemble of interest. With the TPS method, the sampling of trajectories can be performed efficiently within the ensemble of phase transition trajectories. The sampling process will yield energetically favorable trajectories. A phase transition trajectory is required to initialize the Molecular Dynamics Transition Path Sampling process. This trajectory can be generated with the Variable Cell Nudged Elastic Band (VCNEB) method, which determines the Ar fcc-to-hcp transition at 0 K. The configuration of the transition state in the VCNEB trajectory is selected as the initial state to start the TPS calculation. An atomistic description of the mechanism of the fcc-to-hcp transformation in solid argon is then obtained from Molecular Dynamics transition path sampling simulations. We show that the transition barrier at 40 K under ambient

  17. Evaluation of micro-colorimetric lipid determination method with samples prepared using sonication and accelerated solvent extraction methods.

    PubMed

    Billa, Nanditha; Hubin-Barrows, Dylan; Lahren, Tylor; Burkhard, Lawrence P

    2014-02-01

    Two common laboratory extraction techniques were evaluated for routine use with the micro-colorimetric lipid determination method developed by Van Handel (1985) [2] and recently validated for small samples by Inouye and Lotufo (2006) [1]. With the accelerated solvent extraction method using chloroform:methanol solvent and the colorimetric lipid determination method, 28 of 30 samples had significant proportional bias (α=1%, determined using standard additions) and 1 of 30 samples had significant constant bias (α=1%, determined using Youden blank measurements). With sonic extraction, 0 of 6 samples had significant proportional bias (α=1%) and 1 of 6 samples had significant constant bias (α=1%). These results demonstrate that the accelerated solvent extraction method with the chloroform:methanol solvent system creates an interference with the colorimetric assay method; without accounting for this bias in the analysis, inaccurate measurements would be obtained.
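The standard-additions check for proportional bias mentioned above amounts to spiking a sample with known analyte amounts and regressing measured against added concentration: a slope far from 1 signals proportional bias, while the intercept reflects the unspiked sample content. All numbers below are made up for illustration.

```python
import numpy as np

# Standard-additions sketch: spike a sample with known analyte amounts,
# then regress measured concentration on added concentration.
added = np.array([0.0, 5.0, 10.0, 15.0, 20.0])      # spike levels
measured = np.array([4.1, 9.2, 14.0, 19.1, 24.2])   # instrument readings (invented)

slope, intercept = np.polyfit(added, measured, 1)
# slope ~ 1 here, i.e. no proportional bias in this made-up data;
# intercept ~ 4.1 estimates the sample's own analyte content.
```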

  18. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... competition or practices. 12.39 Section 12.39 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a) Determinations of...

  19. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... competition or practices. 12.39 Section 12.39 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a) Determinations of...

  20. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... competition or practices. 12.39 Section 12.39 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a) Determinations of...

  1. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... competition or practices. 12.39 Section 12.39 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a) Determinations of...

  2. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... competition or practices. 12.39 Section 12.39 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a) Determinations of...

  3. New approach of a transient ICP-MS measurement method for samples with high salinity.

    PubMed

    Hein, Christina; Sander, Jonas Michael; Kautenburger, Ralf

    2017-03-01

    In the near future it will be necessary to establish a disposal facility for high-level nuclear waste (HLW) in deep and stable geological formations. In Germany typical host rocks are salt or claystone. Suitable clay formations exist in the south and in the north of Germany, and their geochemical conditions differ strongly. In the northern formations, ionic strengths of the pore water up to 5 M are observed. The determination of parameters such as Kd values in sorption experiments of metal ions like uranium or europium (a homologue of the trivalent actinides) onto claystones is very important for long-term safety analysis. The measurement of the low-concentration, non-sorbed analytes is commonly performed by inductively coupled plasma mass spectrometry (ICP-MS). A direct measurement of high-saline samples like seawater, with more than 1% total dissolved salt content, is not possible. Alternatives like sample clean-up, preconcentration or strong dilution have more disadvantages than advantages, for example additional preparation steps or additional and expensive components. With a small modification of the ICP-MS sample introduction system and a home-made reprogramming of the autosampler, a transient measurement method was developed that is suitable for measuring metal ions like europium and uranium in high-saline sample matrices up to 5 M (NaCl). Comparisons at low ionic strength show that the transient measurement performs similarly well to the default measurement. Additionally, no time-consuming sample clean-up or expensive online dilution or matrix removal systems are necessary, and the analysis shows high sensitivity due to data processing based on the peak area.

  4. A Novel Method of Failure Sample Selection for Electrical Systems Using Ant Colony Optimization

    PubMed Central

    Tian, Shulin; Yang, Chenglin; Liu, Cheng

    2016-01-01

    The influence of failure propagation is ignored in failure sample selection based on the traditional testability demonstration experiment method. Traditional failure sample selection generally omits some failures during the selection, and this can create serious risks in use because the omitted failures may trigger severe propagation failures. This paper proposes a new failure sample selection method to solve this problem. First, the method uses a directed graph and ant colony optimization (ACO) to obtain a subsequent failure propagation set (SFPS) based on a failure propagation model; we then propose a new failure sample selection method on the basis of the number of SFPSs. Compared with the traditional sampling plan, this method is able to improve the coverage of tested failure samples, increase the diagnostic capability, and decrease the risk of use. PMID:27738424

  5. A DOE manual: DOE methods for evaluating environmental and waste management samples

    SciTech Connect

    Goheen, S.C.; Fadeff, S.K.; Sklarew, D.S.; McCulloch, M.; Mong, G.M.; Riley, R.G.; Thomas, B.L.

    1994-08-01

    DOE Methods for Evaluating Environmental and Waste Management Samples (DOE Methods) is a guidance/methods document supporting environmental restoration (ER) and waste management (WM) (collectively referred to as EM) sampling and analysis activities at US Department of Energy (DOE) sites. DOE Methods is intended to supplement existing guidance documents (e.g., the US Environmental Protection Agency's Test Methods for Evaluating Solid Waste, SW-846), which apply to low-level or non-radioactive samples, given the complexities of the waste and environmental samples encountered at DOE sites. The document contains quality assurance (QA), quality control (QC), safety, sampling, organic analysis, inorganic analysis, and radio-analytical guidance as well as sampling and analytical methods. It is updated every six months (April and October) with additional methods. As of April 1994, DOE Methods contained 3 sampling and 39 analytical methods. It is anticipated that between 10 and 20 new methods will be added in October 1994. All methods either are peer reviewed and contain performance data, or are included as draft methods.

  6. Importance of mixed methods in pragmatic trials and dissemination and implementation research.

    PubMed

    Albright, Karen; Gechter, Katherine; Kempe, Allison

    2013-01-01

    With increased attention to the importance of translating research to clinical practice and policy, recent years have seen a proliferation of particular types of research, including pragmatic trials and dissemination and implementation research. Such research seeks to understand how and why interventions function in real-world settings, as opposed to highly controlled settings involving conditions not likely to be repeated outside the research study. Because understanding the context in which interventions are implemented is imperative for effective pragmatic trials and dissemination and implementation research, the use of mixed methods is critical to understanding trial results and the success or failure of implementation efforts. This article discusses a number of dimensions of mixed methods research, utilizing at least one qualitative method and at least one quantitative method, that may be helpful when designing projects or preparing grant proposals. Although the strengths and emphases of qualitative and quantitative approaches differ substantially, methods may be combined in a variety of ways to achieve a deeper level of understanding than can be achieved by one method alone. However, researchers must understand when and how to integrate the data as well as the appropriate order, priority, and purpose of each method. The ability to demonstrate an understanding of the rationale for and benefits of mixed methods research is increasingly important in today's competitive funding environment, and many funding agencies now expect applicants to include mixed methods in proposals. The increasing demand for mixed methods research necessitates broader methodological training and deepened collaboration between medical, clinical, and social scientists. Although a number of challenges to conducting and disseminating mixed methods research remain, the potential for insight generated by such work is substantial.

  7. An intercomparison study of analytical methods used for quantification of levoglucosan in ambient aerosol filter samples

    NASA Astrophysics Data System (ADS)

    Yttri, K. E.; Schnelle-Kreiss, J.; Maenhaut, W.; Alves, C.; Bossi, R.; Bjerke, A.; Claeys, M.; Dye, C.; Evtyugina, M.; García-Gacio, D.; Gülcin, A.; Hillamo, R.; Hoffer, A.; Hyder, M.; Iinuma, Y.; Jaffrezo, J.-L.; Kasper-Giebl, A.; Kiss, G.; López-Mahia, P. L.; Pio, C.; Piot, C.; Ramirez-Santa-Cruz, C.; Sciare, J.; Teinilä, K.; Vermeylen, R.; Vicente, A.; Zimmermann, R.

    2014-07-01

    the laboratories the mean PE was within ±10%. For galactosan, the mean PE for the participating laboratories ranged from -84 to 593%, and as for mannosan 33% of the laboratories reported a mean PE within ±10%. The variability of the various analytical methods, as defined by their minimum and maximum PE value, was typically better for levoglucosan than for mannosan and galactosan, ranging from 3.2 to 41% for levoglucosan, from 10 to 67% for mannosan, and from 6 to 364% for galactosan. For the levoglucosan to mannosan ratio, which may be used to assess the relative importance of softwood vs. hardwood burning, the variability only ranged from 3.5 to 24%. To our knowledge, this is the first major intercomparison on analytical methods used to quantify monosaccharide anhydrides in ambient aerosol filter samples conducted and reported in the scientific literature. The results show that for levoglucosan the accuracy is only slightly lower than that reported for analysis of SO42- on filter samples, a constituent that has been analyzed by numerous laboratories for several decades, typically by ion chromatography, and which is considered a fairly easy constituent to measure. Hence, the results obtained for levoglucosan with respect to accuracy are encouraging and suggest that levels of levoglucosan, and to a lesser extent mannosan and galactosan, obtained by most of the analytical methods currently used to quantify monosaccharide anhydrides in ambient aerosol filter samples, are comparable. Finally, the various analytical methods used in the current study should be tested for other aerosol matrices and concentrations as well, the most obvious being summertime aerosol samples affected by wild fires and/or agricultural fires.

  8. An intercomparison study of analytical methods used for quantification of levoglucosan in ambient aerosol filter samples

    NASA Astrophysics Data System (ADS)

    Yttri, K. E.; Schnelle-Kreis, J.; Maenhaut, W.; Abbaszade, G.; Alves, C.; Bjerke, A.; Bonnier, N.; Bossi, R.; Claeys, M.; Dye, C.; Evtyugina, M.; García-Gacio, D.; Hillamo, R.; Hoffer, A.; Hyder, M.; Iinuma, Y.; Jaffrezo, J.-L.; Kasper-Giebl, A.; Kiss, G.; López-Mahia, P. L.; Pio, C.; Piot, C.; Ramirez-Santa-Cruz, C.; Sciare, J.; Teinilä, K.; Vermeylen, R.; Vicente, A.; Zimmermann, R.

    2015-01-01

    laboratories the mean PE was within ±10%. For galactosan, the mean PE for the participating laboratories ranged from -84 to 593%, and as for mannosan 33% of the laboratories reported a mean PE within ±10%. The variability of the various analytical methods, as defined by their minimum and maximum PE value, was typically better for levoglucosan than for mannosan and galactosan, ranging from 3.2 to 41% for levoglucosan, from 10 to 67% for mannosan and from 6 to 364% for galactosan. For the levoglucosan to mannosan ratio, which may be used to assess the relative importance of softwood versus hardwood burning, the variability only ranged from 3.5 to 24%. To our knowledge, this is the first major intercomparison on analytical methods used to quantify monosaccharide anhydrides in ambient aerosol filter samples conducted and reported in the scientific literature. The results show that for levoglucosan the accuracy is only slightly lower than that reported for analysis of SO42- (sulfate) on filter samples, a constituent that has been analysed by numerous laboratories for several decades, typically by ion chromatography and which is considered a fairly easy constituent to measure. Hence, the results obtained for levoglucosan with respect to accuracy are encouraging and suggest that levels of levoglucosan, and to a lesser extent mannosan and galactosan, obtained by most of the analytical methods currently used to quantify monosaccharide anhydrides in ambient aerosol filter samples, are comparable. Finally, the various analytical methods used in the current study should be tested for other aerosol matrices and concentrations as well, the most obvious being summertime aerosol samples affected by wildfires and/or agricultural fires.

  9. Rapid Method for Ra-226 and Ra-228 in Water Samples

    SciTech Connect

    Maxwell, Sherrod, L. III

    2006-02-10

    The measurement of radium isotopes in natural waters is important for oceanographic studies and for public health reasons. Ra-226 (1620 year half-life) is one of the most toxic of the long-lived alpha emitters present in the environment due to its long half-life and its tendency to concentrate in bone, which increases the internal radiation dose of individuals. The analysis of radium-226 and radium-228 in natural waters can be tedious and time-consuming, and different sample preparation methods are often required to prepare Ra-226 and Ra-228 for separate analyses. A rapid method has been developed at the Savannah River Environmental Laboratory that effectively separates both Ra-226 and Ra-228 (via Ac-228) for assay. This method uses MnO{sub 2} Resin from Eichrom Technologies (Darien, IL, USA) to preconcentrate Ra-226 and Ra-228 rapidly from water samples, along with Ba-133 tracer. DGA Resin{reg_sign} (Eichrom) and Ln-Resin{reg_sign} (Eichrom) are employed in tandem to prepare Ra-226 for assay by alpha spectrometry and to determine Ra-228 via the measurement of Ac-228 by gas proportional counting. After preconcentration, the manganese dioxide is dissolved from the resin and passed through stacked Ln-Resin-DGA Resin cartridges that remove uranium and thorium interferences and retain Ac-228 on DGA Resin. The eluate that passed through this column is evaporated, redissolved at a lower acidity and passed through Ln-Resin again to further remove interferences before performing a barium sulfate microprecipitation. The Ac-228 is stripped from the resin, collected using cerium fluoride microprecipitation and counted by gas proportional counting. By using vacuum box cartridge technology with rapid flow rates, sample preparation time is minimized.

  10. Practical method for extraction of PCR-quality DNA from environmental soil samples.

    PubMed

    Fitzpatrick, Kelly A; Kersh, Gilbert J; Massung, Robert F

    2010-07-01

    Methods for the extraction of PCR-quality DNA from environmental soil samples by using pairs of commercially available kits were evaluated. Coxiella burnetii DNA was detected in spiked soil samples at <1,000 genome equivalents per gram of soil and in 12 (16.4%) of 73 environmental soil samples.

  11. METHOD DEVELOPMENT FOR THE DETERMINATION OF FORMALDEHYDE IN SAMPLES OF ENVIRONMENTAL ORIGIN

    EPA Science Inventory

    An analytical method was developed for the determination of formaldehyde in samples of environmental origin. After a review of the current literature, five candidate methods involving chemical derivatization were chosen for evaluation. The five derivatization reagents studied wer...

  12. Global mean estimation using a self-organizing dual-zoning method for preferential sampling.

    PubMed

    Pan, Yuchun; Ren, Xuhong; Gao, Bingbo; Liu, Yu; Gao, YunBing; Hao, Xingyao; Chen, Ziyue

    2015-03-01

    Giving an appropriate weight to each sampling point is essential to global mean estimation. The objective of this paper was to develop a global mean estimation method for preferential samples. The procedure was first to zone the study area using a self-organizing dual-zoning method and then to estimate the mean by stratified sampling. In this method, the spreading of points in both feature and geographical space is considered. The method is tested in a case study on Mn concentrations in Jilin Province, China. Six sample patterns are selected to estimate the global mean, which is compared with the global mean calculated by the direct arithmetic mean, polygon, and cell methods. The results show that the proposed method produces more accurate and stable mean estimates under different feature deviation index (FDI) values and sample sizes. The relative errors of the global mean calculated by the proposed method range from 0.14 to 1.47%, whereas they are the largest (4.83-8.84%) for the direct arithmetic mean method. At the same time, the mean results calculated by the other three methods are sensitive to the FDI values and sample sizes.
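    The stratified estimation step can be illustrated with a minimal sketch. The zones and values below are hypothetical, and the paper's self-organizing dual-zoning is replaced by fixed strata: the global mean is the area-weighted average of per-stratum means, which removes the bias a plain arithmetic mean picks up from preferential oversampling.

```python
def stratified_mean(strata):
    """Global mean as the area-weighted average of per-stratum sample means.
    `strata` maps zone -> (area_fraction, list_of_sample_values)."""
    return sum(w * sum(vals) / len(vals) for w, vals in strata.values())

# Preferential sampling: zone A covers 20% of the area but is oversampled.
strata = {"A": (0.2, [10.0, 11.0, 9.0, 10.0]), "B": (0.8, [2.0, 3.0])}
all_samples = [v for _, vals in strata.values() for v in vals]
naive = sum(all_samples) / len(all_samples)   # biased toward zone A
print(stratified_mean(strata), naive)         # weighted vs naive estimate
```

    Here the naive mean is pulled far above the area-weighted value because two thirds of the points sit in the small high-concentration zone; weighting each point by its stratum corrects exactly this kind of preferential-sampling bias.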

  13. Multivariate Methods for Prediction of Geologic Sample Composition with Laser-Induced Breakdown Spectroscopy

    NASA Technical Reports Server (NTRS)

    Morris, Richard; Anderson, R.; Clegg, S. M.; Bell, J. F., III

    2010-01-01

    Laser-induced breakdown spectroscopy (LIBS) uses pulses of laser light to ablate a material from the surface of a sample and produce an expanding plasma. The optical emission from the plasma produces a spectrum which can be used to classify target materials and estimate their composition. The ChemCam instrument on the Mars Science Laboratory (MSL) mission will use LIBS to rapidly analyze targets remotely, allowing more resource- and time-intensive in-situ analyses to be reserved for targets of particular interest. ChemCam will also be used to analyze samples that are not reachable by the rover's in-situ instruments. Due to these tactical and scientific roles, it is important that ChemCam-derived sample compositions are as accurate as possible. We have compared the results of partial least squares (PLS), multilayer perceptron (MLP) artificial neural networks (ANNs), and cascade correlation (CC) ANNs to determine which technique yields the best estimates of quantitative element abundances in rock and mineral samples. The number of hidden nodes in the MLP ANNs was optimized using a genetic algorithm. The influence of two data preprocessing techniques was also investigated: genetic algorithm feature selection and averaging the spectra for each training sample prior to training the PLS and ANN algorithms. We used a ChemCam-like laboratory stand-off LIBS system to collect spectra of 30 pressed powder geostandards and a diverse suite of 196 geologic slab samples of known bulk composition. We tested the performance of PLS and ANNs on a subset of these samples, choosing to focus on silicate rocks and minerals with a loss on ignition of less than 2 percent. This resulted in a set of 22 pressed powder geostandards and 80 geologic samples. Four of the geostandards were used as a validation set and 18 were used as the training set for the algorithms. 
We found that PLS typically resulted in the lowest average absolute error in its predictions, but that the optimized MLP ANN and
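    As a rough illustration of the PLS approach compared above, here is a toy one-component PLS1 fit in pure Python. The study's models use many components and real LIBS spectra; the two-channel "spectra" and abundances below are invented.

```python
def pls1_fit(X, y):
    """Toy one-component PLS1: project centered predictors onto the single
    direction most covariant with y, then regress y on that score.
    A sketch only; real PLS for spectra uses many latent components."""
    n, p = len(X), len(X[0])
    xm = [sum(r[j] for r in X) / n for j in range(p)]
    ym = sum(y) / n
    Xc = [[r[j] - xm[j] for j in range(p)] for r in X]
    yc = [v - ym for v in y]
    # weight vector w proportional to X^T y, normalized
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(p)]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    # scores t = Xc w, then least-squares coefficient of y on t
    t = [sum(Xc[i][j] * w[j] for j in range(p)) for i in range(n)]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    def predict(row):
        return ym + b * sum((row[j] - xm[j]) * w[j] for j in range(p))
    return predict

# Hypothetical 2-channel "spectra" and matching element abundances.
X = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]
y = [0.0, 2.0, 4.0]
predict = pls1_fit(X, y)
print(predict([3.0, 3.0]))  # extrapolates the linear trend, approx. 6.0
```

    The appeal of PLS for spectra is visible even in this miniature: the weight vector is driven by covariance with the target abundance, so thousands of correlated emission-line channels collapse into a few informative scores.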

  14. Quantifying Responses of Dung Beetles to Fire Disturbance in Tropical Forests: The Importance of Trapping Method and Seasonality

    PubMed Central

    de Andrade, Rafael Barreto; Barlow, Jos; Louzada, Julio; Vaz-de-Mello, Fernando Zagury; Souza, Mateus; Silveira, Juliana M.; Cochrane, Mark A.

    2011-01-01

    Understanding how biodiversity responds to environmental changes is essential to provide the evidence base that underpins conservation initiatives. The present study provides a standardized comparison between unbaited flight intercept traps (FIT) and baited pitfall traps (BPT) for sampling dung beetles. We examine the effectiveness of the two traps for assessing fire disturbance effects and how trap performance is affected by seasonality. The study was carried out in a transitional forest between Cerrado (Brazilian savanna) and Amazon forest. Dung beetles were collected during one wet and one dry sampling season. The two methods sampled different portions of the local beetle assemblage. Both FIT and BPT were sensitive to fire disturbance during the wet season, but only BPT detected community differences during the dry season. Both traps showed similar correlation with environmental factors. Our results indicate that seasonality had a stronger effect than trap type, with BPT more effective and robust under low population numbers, and FIT more sensitive to fine-scale heterogeneity patterns. This study shows the strengths and weaknesses of two commonly used methodologies for sampling dung beetles in tropical forests, as well as highlighting the importance of seasonality in shaping the results obtained by both sampling strategies. PMID:22028831

  15. Best Practices in Using Large, Complex Samples: The Importance of Using Appropriate Weights and Design Effect Compensation

    ERIC Educational Resources Information Center

    Osborne, Jason W.

    2011-01-01

    Large surveys often use probability sampling in order to obtain representative samples, and these data sets are valuable tools for researchers in all areas of science. Yet many researchers are not formally prepared to appropriately utilize these resources. Indeed, users of one popular dataset were generally found "not" to have modeled…

  16. Systems and methods for separating particles and/or substances from a sample fluid

    DOEpatents

    Mariella, Jr., Raymond P.; Dougherty, George M.; Dzenitis, John M.; Miles, Robin R.; Clague, David S.

    2016-11-01

    Systems and methods for separating particles and/or toxins from a sample fluid. A method according to one embodiment comprises simultaneously passing a sample fluid and a buffer fluid through a chamber such that a fluidic interface is formed between the sample fluid and the buffer fluid as the fluids pass through the chamber, the sample fluid having particles of interest therein; applying a force to the fluids for urging the particles of interest to pass through the interface into the buffer fluid; and substantially separating the buffer fluid from the sample fluid.

  17. A standardized method for sampling and extraction methods for quantifying microplastics in beach sand.

    PubMed

    Besley, Aiken; Vijver, Martina G; Behrens, Paul; Bosker, Thijs

    2017-01-15

    Microplastics are ubiquitous in the environment, are frequently ingested by organisms, and may potentially cause harm. A range of studies have found significant levels of microplastics in beach sand. However, there is a considerable amount of methodological variability among these studies. Methodological variation currently limits comparisons as there is no standard procedure for sampling or extraction of microplastics. We identify key sampling and extraction procedures across the literature through a detailed review. We find that sampling depth, sampling location, number of repeat extractions, and settling times are the critical parameters of variation. Next, using a case-study we determine whether and to what extent these differences impact study outcomes. By investigating the common practices identified in the literature with the case-study, we provide a standard operating procedure for sampling and extracting microplastics from beach sand.

  18. 40 CFR 80.1642 - Sampling and testing requirements for producers and importers of denatured fuel ethanol and other...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... producers and importers of denatured fuel ethanol and other oxygenates for use by oxygenate blenders. 80... requirements for producers and importers of denatured fuel ethanol and other oxygenates for use by oxygenate blenders. Beginning January 1, 2017, producers and importers of denatured fuel ethanol (DFE) and...

  19. Analysis of polyethylene microplastics in environmental samples, using a thermal decomposition method.

    PubMed

    Dümichen, Erik; Barthel, Anne-Kathrin; Braun, Ulrike; Bannick, Claus G; Brand, Kathrin; Jekel, Martin; Senz, Rainer

    2015-11-15

    Small polymer particles with a diameter of less than 5 mm, called microplastics, find their way into the environment from polymer debris and industrial production. A method is therefore needed to identify and quantify microplastics in various environmental samples and generate reliable concentration values. Such concentration values, i.e. quantitative results, are necessary for an assessment of microplastic in environmental media. This was achieved by thermal extraction in thermogravimetric analysis (TGA), connected to a solid-phase adsorber. These adsorbers were subsequently analysed by thermal desorption gas chromatography mass spectrometry (TDS-GC-MS). In comparison to other chromatographic methods, such as pyrolysis gas chromatography mass spectrometry (Py-GC-MS), the relatively high sample masses in TGA (about 200 times higher than used in Py-GC-MS) enable the measurement of complex matrices that are not homogeneous on a small scale. Through the characteristic decomposition products known for every kind of polymer it is possible to identify and even to quantify polymer particles in various matrices. Polyethylene (PE), one of the most important representatives of microplastics, was chosen as the example for identification and quantification.

  20. A digital sampling moiré method for two-dimensional displacement measurement

    NASA Astrophysics Data System (ADS)

    Chen, Xinxing; Chang, Chih-Chen

    2015-04-01

    Measuring static and dynamic displacements of in-service structures is important for design validation, performance monitoring and safety assessment. Currently available techniques can be classified into indirect and direct measurement, but these methods have their own problems and limitations. The digital sampling moiré method is a newly developed vision-based technique for direct displacement measurement. It uses one camera to capture digital images containing a grating pattern. The images are subsampled and interpolated to generate moiré patterns whose phase information can then be used to calculate displacements of the grating pattern. Because the moiré patterns magnify the pattern's movement, this technique is expected to provide more accurate displacement measurement than other vision-based approaches. In this study, a digital sampling moiré technique is proposed for measuring two-dimensional structural displacements using a designed grating pattern. The pattern contains two orthogonally inclined gratings and does not have to be perfectly aligned with the image plane. A series of simulation and laboratory tests is conducted to validate the accuracy of the proposed technique. Results show that the technique can achieve accuracy on the order of 10 micrometers in the laboratory. The technique also does not appear to suffer from misalignment between the camera and the pattern, and it exhibits potential for accurate displacement measurement of civil engineering structures.
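    The phase-to-displacement principle behind the moiré technique can be sketched as follows. This is a simplified complex-demodulation stand-in for the full sampling-moiré pipeline (no subsampling/interpolation step), and the grating pitch and sub-pixel shift are invented values.

```python
import cmath, math

def grating_phase(samples, pitch):
    """Phase of the fundamental grating frequency via complex demodulation
    (a simplified stand-in for the moiré phase extraction)."""
    z = sum(v * cmath.exp(-2j * math.pi * x / pitch)
            for x, v in enumerate(samples))
    return cmath.phase(z)

pitch = 10.0   # grating pitch in pixels (hypothetical)
shift = 0.37   # true sub-pixel displacement to recover
n = 200        # an integer number of periods avoids spectral leakage
ref = [math.cos(2 * math.pi * x / pitch) for x in range(n)]
mov = [math.cos(2 * math.pi * (x - shift) / pitch) for x in range(n)]
dphi = grating_phase(mov, pitch) - grating_phase(ref, pitch)
disp = -dphi * pitch / (2 * math.pi)  # phase change -> displacement
print(round(disp, 3))                 # recovers the sub-pixel shift, approx. 0.37
```

    This is why phase-based methods beat direct peak tracking: a displacement far below one pixel produces a cleanly measurable phase change of the fringe pattern.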

  1. Total nitrogen determination of various sample types: a comparison of the Hach, Kjeltec, and Kjeldahl methods.

    PubMed

    Watkins, K L; Veum, T L; Krause, G F

    1987-01-01

    Conventional Kjeldahl analysis with modifications, Kjeltec analysis with block digestion and semiautomated distillation, and the Hach method for determining nitrogen (N) were compared using a wide range of samples. Twenty different sample types were ground and mixed. Each sample type was divided into 5 subsamples which were analyzed for N by each of the 3 methods. Differences (P less than 0.05) among the 3 N determination methods were detected in 5 of the 20 N sources analyzed. The mean N content over all 20 samples was higher with Kjeldahl analysis (P less than 0.05) than with Kjeltec, while Hach analysis produced intermediate results. Results also indicated that the Hach procedure had the greatest ability to detect differences in N content among sample types, being more sensitive than either of the other methods (P less than 0.05).

  2. The Basket Method for Selecting Balanced Samples. Part II. Applications to Price Estimation.

    DTIC Science & Technology

    1981-12-01

    AD-A112 949. Clemson Univ., SC, Dept. of Mathematical Sciences. The Basket Method for Selecting Balanced Samples. Part II: Applications to Price Estimation. Dec 81. ABSTRACT: The "Basket Method" of sampling, a tool designed to achieve…

  3. A Method for Microalgae Proteomics Analysis Based on Modified Filter-Aided Sample Preparation.

    PubMed

    Li, Song; Cao, Xupeng; Wang, Yan; Zhu, Zhen; Zhang, Haowei; Xue, Song; Tian, Jing

    2017-04-11

    With the fast development of microalgal biofuel research, proteomics studies of microalgae have increased quickly. Filter-aided sample preparation (FASP) has been a widely used proteomics sample preparation method since 2009. Here, a method for microalgae proteomics analysis based on modified filter-aided sample preparation (mFASP) is described that accommodates the characteristics of microalgae cells and eliminates the error caused by over-alkylation. Using Chlamydomonas reinhardtii as the model, the prepared sample was tested by standard LC-MS/MS and compared with previous reports. The results showed that mFASP is suitable for most occasions in microalgae proteomics studies.

  4. Purposeful Sampling for Qualitative Data Collection and Analysis in Mixed Method Implementation Research.

    PubMed

    Palinkas, Lawrence A; Horwitz, Sarah M; Green, Carla A; Wisdom, Jennifer P; Duan, Naihua; Hoagwood, Kimberly

    2015-09-01

    Purposeful sampling is widely used in qualitative research for the identification and selection of information-rich cases related to the phenomenon of interest. Although there are several different purposeful sampling strategies, criterion sampling appears to be used most commonly in implementation research. However, combining sampling strategies may be more appropriate to the aims of implementation research and more consistent with recent developments in quantitative methods. This paper reviews the principles and practice of purposeful sampling in implementation research, summarizes types and categories of purposeful sampling strategies and provides a set of recommendations for use of single strategy or multistage strategy designs, particularly for state implementation research.

  5. Optical method for the characterization of laterally-patterned samples in integrated circuits

    DOEpatents

    Maris, Humphrey J.

    2001-01-01

    Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.

  6. Optical method and system for the characterization of laterally-patterned samples in integrated circuits

    DOEpatents

    Maris, Humphrey J.

    2008-03-04

    Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.

  7. [Establishment and assessment of QA/QC method for sampling and analysis of atmosphere background CO2].

    PubMed

    Liu, Li-xin; Zhou, Ling-xi; Xia, Ling-jun; Wang, Hong-yang; Fang, Shuang-xi

    2014-12-01

    To strengthen the scientific management and sharing of greenhouse gas data obtained from atmospheric background stations in China, it is important to standardize the quality assurance and quality control methods for background CO2 sampling and analysis. Based on the greenhouse gas sampling and observation experience of the CMA, and using portable sampling observation with the WS-CRDS analysis technique as an example, this paper systematically introduces the quality assurance measures for atmospheric CO2 sampling and observation at the Waliguan station (Qinghai), the glass-bottle quality assurance measures, the systematic quality control method during sample analysis, the correction method during data processing, the data-grading quality markers, and the data fitting and interpolation method. Finally, using this method, the CO2 sampling and observation data at the atmospheric background stations in 3 typical regions were processed and the concentration variation characteristics were analyzed, indicating that the method can capture the influences of regional and local environmental factors on the observation results and reflect the characteristics of natural and human activities in an objective and accurate way.

  8. Quantitative method of determining beryllium or a compound thereof in a sample

    DOEpatents

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.

    2006-10-31

    A method of determining beryllium or a beryllium compound thereof in a sample, includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.
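    The final quantification step implied by the patent (relating measured fluorescence to the amount of beryllium) would in practice run through a calibration curve. The sketch below fits a least-squares line through hypothetical standards and inverts it for an unknown; the patent text gives no numbers, so every value here is invented.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
             / sum((x - xm) ** 2 for x in xs))
    return slope, ym - slope * xm

# Hypothetical beryllium standards: concentration (ug/L) vs fluorescence counts.
conc = [0.0, 0.5, 1.0, 2.0, 4.0]
signal = [12.0, 61.0, 113.0, 210.0, 412.0]
m, b = fit_line(conc, signal)
unknown = (260.0 - b) / m  # invert the curve for a sample reading of 260 counts
```

    Inverting the fitted line converts a raw fluorescence reading into an estimated beryllium concentration, which is the "determining the amount" step of the claim.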

  9. Quantitative method of determining beryllium or a compound thereof in a sample

    SciTech Connect

    McCleskey, T. Mark; Ehler, Deborah S.; John, Kevin D.; Burrell, Anthony K.; Collis, Gavin E.; Minogue, Edel M.; Warner, Benjamin P.

    2010-08-24

    A method of determining beryllium or a beryllium compound thereof in a sample, includes providing a sample suspected of comprising beryllium or a compound thereof, extracting beryllium or a compound thereof from the sample by dissolving in a solution, adding a fluorescent indicator to the solution to thereby bind any beryllium or a compound thereof to the fluorescent indicator, and determining the presence or amount of any beryllium or a compound thereof in the sample by measuring fluorescence.

  10. Estimation of the sugar cane cultivated area from LANDSAT images using the two phase sampling method

    NASA Technical Reports Server (NTRS)

    Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.; Mendonca, F. J.; Lee, D. C. L.; Shimabukuro, Y. E.

    1982-01-01

    A two-phase sampling method and the optimal sampling segment dimensions for the estimation of sugar cane cultivated area were developed. This technique employs visual interpretation of LANDSAT images and of panchromatic aerial photographs considered as the ground truth. The estimates, as a mean value of 100 simulated samples, represent 99.3% of the true value with a CV of approximately 1%; the relative efficiency of the two-phase design was 157% when compared with a one-phase sample of aerial photographs.
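    The two-phase (double-sampling) estimator can be sketched with a ratio adjustment: the cheap LANDSAT-interpreted mean is scaled by the ground-truth-to-image ratio observed on the small aerial-photograph subsample. All segment values below are invented.

```python
def two_phase_ratio_estimate(phase1_x, phase2_pairs):
    """Double-sampling ratio estimator: scale the large-sample image-based
    mean by the ground-truth/image ratio from the phase-2 subsample."""
    xbar1 = sum(phase1_x) / len(phase1_x)
    xbar2 = sum(x for x, _ in phase2_pairs) / len(phase2_pairs)
    ybar2 = sum(y for _, y in phase2_pairs) / len(phase2_pairs)
    return xbar1 * ybar2 / xbar2

# Phase 1: cane area per segment interpreted from LANDSAT (many segments).
img = [40.0, 55.0, 60.0, 45.0, 50.0, 52.0, 48.0, 50.0]
# Phase 2: subsample also measured on aerial photos as (image, ground truth).
pairs = [(40.0, 44.0), (60.0, 66.0), (50.0, 55.0)]  # imagery under-reads ~10%
print(two_phase_ratio_estimate(img, pairs))
```

    The design pays off when the image interpretation is cheap but biased: the subsample only needs to be large enough to pin down the correction ratio, not the mean itself.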

  11. 7 CFR 51.308 - Methods of sampling and calculation of percentages.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and... weigh ten pounds or less, or in any container where the minimum diameter of the smallest apple does not vary more than 1/2 inch from the minimum diameter of the largest apple, percentages shall be...

  12. 7 CFR 51.308 - Methods of sampling and calculation of percentages.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Grades of Apples Methods of Sampling and Calculation of Percentages § 51.308 Methods of sampling and... weigh ten pounds or less, or in any container where the minimum diameter of the smallest apple does not vary more than 1/2 inch from the minimum diameter of the largest apple, percentages shall be...

  13. Evaluation of beef trim sampling methods for detection of Shiga toxin-producing Escherichia coli (STEC)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Presence of Shiga toxin-producing Escherichia coli (STEC) is a major concern in ground beef. Several methods for sampling beef trim prior to grinding are currently used in the beef industry. The purpose of this study was to determine the efficacy of the sampling methods for detecting STEC in beef ...

  14. THE INFLUENCE OF PHYSICAL FACTORS ON COMPARATIVE PERFORMANCE OF SAMPLING METHODS IN LARGE RIVERS

    EPA Science Inventory

    In 1999, we compared five existing benthic macroinvertebrate sampling methods used in boatable rivers. Each sampling protocol was performed at each of 60 sites distributed among four rivers in the Ohio River drainage basin. Initial comparison of methods using key macroinvertebr...

  15. Application of a Permethrin Immunosorbent Assay Method to Residential Soil and Dust Samples

    EPA Science Inventory

    A low-cost, high throughput bioanalytical screening method was developed for monitoring cis/trans-permethrin in dust and soil samples. The method consisted of a simple sample preparation procedure [sonication with dichloromethane followed by a solvent exchange into methanol:wate...

  16. Non-uniform sampling knife-edge method for camera modulation transfer function measurement

    NASA Astrophysics Data System (ADS)

    Duan, Yaxuan; Xue, Xun; Chen, Yongquan; Tian, Liude; Zhao, Jianke; Gao, Limin

    2016-11-01

The traditional slanted knife-edge method suffers large errors in camera modulation transfer function (MTF) measurement because tilt-angle error in the knife edge produces non-uniform sampling of the edge spread function. To resolve this problem, a non-uniform sampling knife-edge method for camera MTF measurement is proposed. By directly computing the Fourier transform of the derivative of the non-uniformly sampled data, super-sampled camera MTF results are obtained. Theoretical simulations for images with and without noise under different tilt-angle errors demonstrate that the MTF results are insensitive to tilt-angle error. To verify the accuracy of the proposed method, an experimental setup for camera MTF measurement was established. Measurement results show that the proposed method is superior to traditional methods and improves the universality of the slanted knife-edge method for camera MTF measurement.
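The core idea, a direct Fourier transform of the derivative of a non-uniformly sampled edge spread function, can be sketched as follows; the Gaussian edge model, sample count, and frequency grid are illustrative assumptions rather than the paper's implementation:

```python
import numpy as np
from math import erf, sqrt

# Hypothetical non-uniformly sampled edge spread function (ESF): an ideal
# edge blurred by a Gaussian of sigma = 0.5 px (synthetic, not the paper's data).
rng = np.random.default_rng(0)
xpos = np.sort(rng.uniform(-10, 10, 400))   # non-uniform sample positions (px)
sigma = 0.5
esf = np.array([0.5 * (1 + erf(xi / (sigma * sqrt(2)))) for xi in xpos])

# Line spread function (LSF): finite differences of the ESF at interval midpoints.
xm = 0.5 * (xpos[1:] + xpos[:-1])
lsf = np.diff(esf) / np.diff(xpos)
w = np.diff(xpos)                            # quadrature weights (interval widths)

# Direct (non-uniform) Fourier transform of the LSF, normalized at f = 0,
# instead of a uniform-grid FFT that tilt-angle errors would invalidate.
freqs = np.linspace(0.0, 1.0, 50)            # spatial frequency, cycles/px
mtf = np.array([abs(np.sum(lsf * w * np.exp(-2j * np.pi * f * xm)))
                for f in freqs])
mtf /= mtf[0]
# For this Gaussian edge the analytic MTF is exp(-2 * pi^2 * sigma^2 * f^2).
```

Because the transform is evaluated as an explicit sum over the actual sample positions, no resampling onto a uniform grid is needed.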

  17. Evaluation of alternative mosquito sampling methods for malaria vectors in Lowland South - East Zambia

    PubMed Central

    2013-01-01

Background: Sampling malaria vectors and measuring their biting density is of paramount importance for entomological surveys of malaria transmission. The human landing catch (HLC) has traditionally been regarded as the gold-standard method for surveying human exposure to mosquito bites. However, because of the risk of participant exposure to mosquito-borne parasites and viruses, a variety of alternative, exposure-free trapping methods were compared in lowland, south-east Zambia. Methods: The Centers for Disease Control and Prevention miniature light trap (CDC-LT), Ifakara Tent Trap model C (ITT-C), resting boxes (RB) and window exit traps (WET) were compared with HLC using a 3 × 3 Latin-square design replicated in 4 blocks of 3 houses, all fitted with long-lasting insecticidal nets and half also sprayed with a residual deltamethrin formulation; the rotation was repeated for 10 rounds of 3 nights each during both the dry and wet seasons. Results: The mean catches of HLC indoor, HLC outdoor, CDC-LT, ITT-C, WET, RB indoor and RB outdoor were 1.687, 1.004, 3.267, 0.088, 0.004, 0.000 and 0.008, respectively, for Anopheles quadriannulatus Theobald, and 7.287, 6.784, 10.958, 5.875, 0.296, 0.158 and 0.458, respectively, for An. funestus Giles. Indoor CDC-LT was more efficient than indoor HLC in sampling An. quadriannulatus and An. funestus (relative rate [95% confidence interval] = 1.873 [1.653, 2.122] and 1.532 [1.441, 1.628], respectively; P < 0.001 for both). ITT-C was the only alternative other than CDC-LT with sensitivity comparable to indoor HLC for sampling An. funestus (RR = 0.821 [0.765, 0.881], P < 0.001). Conclusions: While the two most sensitive exposure-free techniques primarily capture host-seeking mosquitoes, both have substantial disadvantages for routine community-based surveillance applications: the CDC-LT requires regular recharging of batteries while the bulkiness of ITT-C makes it difficult to move between sampling

  18. Preliminary field evaluation of EPA Method CTM-039 (PM2.5 Stack Sampling Method)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Agricultural operations are encountering difficulties complying with current air pollution regulations for particulate matter (PM). These regulations are based on the National Ambient Air Quality Standards which set maximum concentration limits for ambient air PM. Source sampling for compliance purp...

  19. Update on field evaluation of EPA Method CTM-039 (PM2.5 Stack Sampling Method)

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Agricultural operations are encountering difficulties complying with current air pollution regulations for particulate matter (PM). These regulations are based on the National Ambient Air Quality Standards, which set maximum concentration limits for ambient air PM. Source sampling for compliance pur...

  20. Evaluation of a modified sampling method for molecular analysis of air microflora.

    PubMed

    Lech, T; Ziembinska-Buczynska, A

    2015-04-10

Biodeterioration is a serious threat to the durability of economically and culturally important materials. As a result of this phenomenon, priceless works of art, documents, and old prints undergo decomposition caused by microorganisms. It is therefore important to constantly monitor the presence and diversity of microorganisms in exposition rooms and storage areas of historical objects. In addition, the use of molecular biology tools in conservation studies enables detailed research and reduces the time needed to perform analyses compared with conventional microbiological and conservation methods. The aim of this study was to adapt an indoor air sampling method for direct DNA extraction from microorganisms, including evaluation of the extracted DNA quality and concentration. The obtained DNA was used to study the diversity of mold fungi in indoor air by polymerase chain reaction-denaturing gradient gel electrophoresis in specific archive and museum environments. The research was conducted in 2 storage rooms of the National Archives in Krakow and in 1 exposition room of the Archaeological Museum in Krakow (Poland).

  1. The gas chromatographic determination of volatile fatty acids in wastewater samples: evaluation of experimental biases in direct injection method against thermal desorption method.

    PubMed

    Ullah, Md Ahsan; Kim, Ki-Hyun; Szulejko, Jan E; Cho, Jinwoo

    2014-04-11

The production of short-chained volatile fatty acids (VFAs) by anaerobic bacterial digestion of sewage (wastewater) affords an excellent opportunity for alternative, greener bio-energy fuels (e.g., microbial fuel cells). VFAs in wastewater samples are commonly quantified by direct injection (DI) into a gas chromatograph with a flame ionization detector (GC-FID). In this study, the reliability of VFA analysis by the DI-GC method was examined against a thermal desorption (TD-GC) method. The results indicate that VFA concentrations determined from aliquots of each wastewater sample by the DI-GC method were generally underestimated, with reductions of 7% (acetic acid) to 93.4% (hexanoic acid) relative to the TD-GC method. The observed differences between the two methods suggest a possibly important role of matrix effects in producing the negative biases in DI-GC analysis. To explore this possibility further, an ancillary experiment examined the bias patterns of three DI-GC approaches. The results of the standard addition (SA) method, for instance, confirm the role of the matrix effect when analyzing wastewater samples by DI-GC. More importantly, the biases tend to increase systematically with increasing molecular weight and decreasing VFA concentration. As such, the DI-GC method, if applied to samples with a complicated matrix, needs thorough validation to improve the reliability of data acquisition.

  2. New trial methods for the detection of carbon and nitrogen in quartz samples

    NASA Astrophysics Data System (ADS)

    Kim, K. J.; Trompetter, W. J.; Eastoe, C.; Spilde, M.

    2007-06-01

Common metamorphic quartz samples have abundant fluid inclusions that contain C, N, H2O, etc. Understanding the nitrogen content in quartz is especially important since neutron capture by nitrogen would contribute a significant amount of the total in situ 14C production in exposed quartz by cosmic rays. To determine the carbon and nitrogen content experimentally, we applied nuclear reaction analysis (NRA) using a deuteron beam. The nuclear reactions 14N(d, α0) and 12C(d, p0) were used to estimate the atomic concentrations of N and C, respectively. In our six samples, the nitrogen concentration ranged from 0.008 ± 0.001 to 0.050 ± 0.002 at%, and the carbon content ranged from 0.0005 ± 0.0003 to 0.0017 ± 0.0003 at%. The large errors associated with these estimates are due to low counting statistics from measurements of trace concentrations near the limits of detection. A Costech Elemental Analyzer was also used to determine nitrogen and carbon content. The nitrogen content of the quartz varied from 0.001 to 0.002 wt%, consistent with the NRA measurements, but with poor precision because the concentrations are close to the limit of detection. The carbon concentrations of our samples ranged from 0.0086 to 0.0242 wt%, with better-defined chromatogram peaks than those of nitrogen. Further investigation is needed to develop a reliable method for determining C and N in quartz samples that could be used for in situ 14C analysis in exposure dating applications [D. Lal, A.J.T. Jull, Nucl. Instr. and Meth. B 92 (1994) 291].

  3. Comparability among four invertebrate sampling methods, Fountain Creek Basin, Colorado, 2010-2012

    USGS Publications Warehouse

    Zuellig, Robert E.; Bruce, James F.; Stogner, Robert W.; Brown, Krystal D.

    2014-01-01

The U.S. Geological Survey, in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, designed a study to determine whether sampling method and sample timing result in comparable samples and assessments of biological condition. To accomplish this task, annual invertebrate samples were collected concurrently using four sampling methods at 15 U.S. Geological Survey streamflow gages in the Fountain Creek basin from 2010 to 2012. Collectively, the four methods are used by local (U.S. Geological Survey cooperative monitoring program) and State monitoring programs (Colorado Department of Public Health and Environment) in the Fountain Creek basin to produce two distinct sample types for each program, targeting single and multiple habitats. This study found distinguishable differences between single- and multi-habitat sample types using both community similarities and multi-metric index values, while methods from each program within a sample type were comparable. This indicates that the Colorado Department of Public Health and Environment methods were compatible with the cooperative monitoring program methods within both multi- and single-habitat sample types. Comparisons between September and October samples found distinguishable differences based on community similarities for both sample types, whereas differences were found only for single-habitat samples when multi-metric index values were considered. At one site, differences between September and October index values from single-habitat samples resulted in opposing assessments of biological condition. Direct application of the results to inform the revision of the existing Fountain Creek basin U.S. Geological Survey cooperative monitoring program is discussed.

  4. The importance of Guthrie cards and other medical samples for the direct matching of disaster victims using DNA profiling.

    PubMed

    Hartman, D; Benton, L; Morenos, L; Beyer, J; Spiden, M; Stock, A

    2011-02-25

    The identification of disaster victims through the use of DNA analysis is an integral part of any Disaster Victim Identification (DVI) response, regardless of the scale and nature of the disaster. As part of the DVI response to the 2009 Victorian Bushfires Disaster, DNA analysis was performed to assist in the identification of victims through kinship (familial matching to relatives) or direct (self source sample) matching of DNA profiles. Although most of the DNA identifications achieved were to reference samples from relatives, there were a number of DNA identifications (12) made through direct matching. Guthrie cards, which have been collected in Australia over the past 30 years, were used to provide direct reference samples. Of the 236 ante-mortem (AM) samples received, 21 were Guthrie cards and one was a biopsy specimen; all yielding complete DNA profiles when genotyped. This publication describes the use of such Biobanks and medical specimens as a sample source for the recovery of good quality DNA for comparisons to post-mortem (PM) samples.

  5. Synchronization sampling method based on delta-sigma analog-digital converter for underwater towed array system

    NASA Astrophysics Data System (ADS)

    Jiang, Jia-Jia; Duan, Fa-Jie; Li, Yan-Chao; Hua, Xiang-Ning

    2014-03-01

Synchronization sampling is very important in an underwater towed array system, where every acquisition node (AN) samples analog signals with its own analog-digital converter (ADC). In this paper, a simple and effective synchronization sampling method is proposed to ensure synchronized operation among the different ANs of the underwater towed array system. We first present a master-slave synchronization sampling model and then design a high-accuracy phase-locked loop to synchronize all delta-sigma ADCs to a reference clock. However, when the master-slave synchronization sampling model is used, both the time-delay (TD) of messages traveling along the wired transmission medium and the jitter of the clocks introduce synchronization sampling error (SSE). Therefore, a simple method is proposed to estimate and compensate for the TD of message transmission, and another effective method is presented to overcome the SSE caused by clock jitter. An experimental system with three ANs was set up, and the experimental results verify the validity of the proposed synchronization sampling method.
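The abstract does not spell out the TD estimation scheme; a common round-trip approach for a master-slave link, under a symmetric-link assumption and with purely illustrative timestamps (NTP-style, not necessarily the paper's method), conveys the idea:

```python
# Sketch of one common time-delay (TD) estimation scheme for a master-slave
# link: the master timestamps a request's departure (t1) and the reply's
# arrival (t4); the slave timestamps reception (t2) and transmission (t3).
# Assuming a symmetric wired link, the one-way delay is half the round-trip
# time minus the slave's residence time.

def estimate_one_way_delay(t1, t2, t3, t4):
    """One-way TD and slave clock offset from four timestamps."""
    delay = ((t4 - t1) - (t3 - t2)) / 2.0    # symmetric-link assumption
    offset = ((t2 - t1) + (t3 - t4)) / 2.0   # slave clock minus master clock
    return delay, offset

# Illustrative numbers: master sends at 100.0 us; the slave, whose clock runs
# 3.0 us ahead, stamps reception at 105.0 and reply at 107.0 (slave time);
# the master stamps the reply's arrival at 106.0 (master time).
delay, offset = estimate_one_way_delay(100.0, 105.0, 107.0, 106.0)
print(delay, offset)   # 2.0 us one-way delay, 3.0 us clock offset
```

Once the master knows the delay and offset, it can shift each AN's sampling trigger so all nodes sample on the same reference-clock edge.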

  6. Synchronization sampling method based on delta-sigma analog-digital converter for underwater towed array system.

    PubMed

    Jiang, Jia-Jia; Duan, Fa-Jie; Li, Yan-Chao; Hua, Xiang-Ning

    2014-03-01

Synchronization sampling is very important in an underwater towed array system, where every acquisition node (AN) samples analog signals with its own analog-digital converter (ADC). In this paper, a simple and effective synchronization sampling method is proposed to ensure synchronized operation among the different ANs of the underwater towed array system. We first present a master-slave synchronization sampling model and then design a high-accuracy phase-locked loop to synchronize all delta-sigma ADCs to a reference clock. However, when the master-slave synchronization sampling model is used, both the time-delay (TD) of messages traveling along the wired transmission medium and the jitter of the clocks introduce synchronization sampling error (SSE). Therefore, a simple method is proposed to estimate and compensate for the TD of message transmission, and another effective method is presented to overcome the SSE caused by clock jitter. An experimental system with three ANs was set up, and the experimental results verify the validity of the proposed synchronization sampling method.

  7. Evaluations of the Method to Measure Black Carbon Particles Suspended in Rainwater and Snow Samples

    NASA Astrophysics Data System (ADS)

    Ohata, S.; Moteki, N.; Schwarz, J. P.; Fahey, D. W.; Kondo, Y.

    2012-12-01

The mass concentrations and size distributions of black carbon (BC) particles in rainwater and snow are important parameters for improved understanding of the wet deposition of BC, which is a key process in quantifying the impacts of BC on climate. In this study, we have evaluated a new method to measure these parameters. The approach consists of an ultrasonic nebulizer (USN) used in conjunction with a Single Particle Soot Photometer (SP2). The USN converts sample water into micron-size droplets at a constant rate and then extracts airborne BC particles by dehydrating the water droplets. The mass of individual BC particles is measured by the SP2, based on the laser-induced incandescence technique. The combination of the USN and SP2 enabled the measurement of BC particles using only a small amount of sample water, typically 10 ml (Ohata et al., 2011). However, the loss of BC during the extraction process depends on particle size. We determined the size-dependent extraction efficiency using polystyrene latex spheres (PSLs) with twelve different diameters between 100 and 1050 nm. The PSL concentrations in water were determined by light extinction at 532 nm. The extraction efficiency of the USN showed a broad maximum in the diameter range of 200-500 nm and decreased substantially at larger sizes. The extraction efficiency determined using the PSL standards agreed to within ±40% with that determined using laboratory-generated BC concentration standards. We applied this method to the analysis of rainwater collected in Tokyo and in Okinawa over the East China Sea. Measured BC size distributions in all rainwater samples showed a negligible contribution of BC particles larger than 600 nm to the total BC amount. However, for BC particles in surface snow collected in Greenland and Antarctica, size distributions were sometimes shifted to much larger size ranges.
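A size-dependent extraction efficiency of this kind is typically applied as a bin-wise correction to the measured size distribution; the efficiency curve and counts below are illustrative placeholders, not the paper's calibration data:

```python
import numpy as np

# Hypothetical calibration: extraction efficiency of the nebulizer measured
# at a few PSL diameters, peaking in the mid-size range and falling off at
# larger sizes (shape loosely mimicking the behavior described above).
diam = np.array([100, 200, 300, 500, 700, 1050])   # calibration diameters, nm
eff = np.array([0.5, 0.9, 1.0, 0.9, 0.5, 0.2])     # extraction efficiency

# Hypothetical SP2 measurement binned by BC particle diameter.
meas_d = np.array([150, 250, 400, 600])            # bin-center diameters, nm
meas_counts = np.array([120.0, 300.0, 180.0, 40.0])  # counts per bin

# Interpolate the efficiency onto the measurement bins and undo the
# size-dependent loss to recover the in-water size distribution.
eff_at_meas = np.interp(meas_d, diam, eff)
corrected = meas_counts / eff_at_meas
```

The correction matters most at the large-diameter end, exactly where the efficiency drops and where the snow samples showed their distribution shift.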

  8. Application of ATP-based bioluminescence for bioaerosol quantification: effect of sampling method.

    PubMed

    Han, Taewon; Wren, Melody; DuBois, Kelsey; Therkorn, Jennifer; Mainelis, Gediminas

    2015-12-01

An adenosine triphosphate (ATP)-based bioluminescence assay has the potential to offer a quick and affordable method for quantifying bioaerosol samples. Here we report on our investigation into how different bioaerosol aerosolization parameters and sampling methods affect bioluminescence output per bacterium, and the implications of that effect for bioaerosol research. Bacillus atrophaeus and Pseudomonas fluorescens bacteria were aerosolized using a Collison nebulizer (BGI Inc., Waltham, MA) with a glass or polycarbonate jar and then collected for 15 and 60 min with: (1) a Button Aerosol Sampler (SKC Inc., Eighty Four, PA) with polycarbonate, PTFE, and cellulose nitrate filters, (2) a BioSampler (SKC Inc.) with 5 and 20 mL of collection liquid, and (3) our newly developed Electrostatic Precipitator with Superhydrophobic Surface (EPSS). For all aerosolization and sampling parameters, we compared the ATP bioluminescence output per bacterium relative to that before aerosolization and sampling. In addition, we determined the ATP reagent storage and preparation conditions that do not affect the bioluminescence signal intensity. Our results show that aerosolization by a Collison nebulizer with a polycarbonate jar yields higher bioluminescence output per bacterium than the glass jar. Interestingly, the bioluminescence output of P. fluorescens increased substantially after aerosolization compared to the fresh liquid suspension. For both test microorganisms, the bioluminescence intensity per bacterium after sampling was significantly lower than that before sampling, suggesting a negative effect of sampling stress on bioluminescence output. The decrease in bioluminescence intensity was more pronounced for longer sampling times and depended significantly and substantially on the sampling method. Among the investigated methods, the EPSS was the least injurious for both microorganisms and sampling times. While the ATP-based bioluminescence offers a quick bioaerosol

  9. Application of ATP-based bioluminescence for bioaerosol quantification: effect of sampling method

    PubMed Central

    Han, Taewon; Wren, Melody; DuBois, Kelsey; Therkorn, Jennifer; Mainelis, Gediminas

    2015-01-01

An adenosine triphosphate (ATP)-based bioluminescence assay has the potential to offer a quick and affordable method for quantifying bioaerosol samples. Here we report on our investigation into how different bioaerosol aerosolization parameters and sampling methods affect bioluminescence output per bacterium, and the implications of that effect for bioaerosol research. Bacillus atrophaeus and Pseudomonas fluorescens bacteria were aerosolized using a Collison nebulizer (BGI Inc., Waltham, MA) with a glass or polycarbonate jar and then collected for 15 and 60 min with: (1) a Button Aerosol Sampler (SKC Inc., Eighty Four, PA) with polycarbonate, PTFE, and cellulose nitrate filters, (2) a BioSampler (SKC Inc.) with 5 and 20 mL of collection liquid, and (3) our newly developed Electrostatic Precipitator with Superhydrophobic Surface (EPSS). For all aerosolization and sampling parameters, we compared the ATP bioluminescence output per bacterium relative to that before aerosolization and sampling. In addition, we determined the ATP reagent storage and preparation conditions that do not affect the bioluminescence signal intensity. Our results show that aerosolization by a Collison nebulizer with a polycarbonate jar yields higher bioluminescence output per bacterium than the glass jar. Interestingly, the bioluminescence output of P. fluorescens increased substantially after aerosolization compared to the fresh liquid suspension. For both test microorganisms, the bioluminescence intensity per bacterium after sampling was significantly lower than that before sampling, suggesting a negative effect of sampling stress on bioluminescence output. The decrease in bioluminescence intensity was more pronounced for longer sampling times and depended significantly and substantially on the sampling method. Among the investigated methods, the EPSS was the least injurious for both microorganisms and sampling times. While the ATP-based bioluminescence offers a quick bioaerosol

  10. A stochastic optimisation method to estimate the spatial distribution of a pathogen from a sample.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Sampling is of central importance in plant pathology. It facilitates our understanding of how epidemics develop in space and time and can also be used to inform disease management decisions. Making inferences from a sample is necessary because we rarely have the resources to conduct a complete censu...

  11. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    SciTech Connect

    Adamic, M. L.; Lister, T. E.; Dufek, E. J.; Jenson, D. D.; Olson, J. E.; Vockenhuber, C.; Watrous, M. G.

    2015-03-25

    This paper presents an evaluation of an alternate method for preparing environmental samples for 129I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically-formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Furthermore, precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. Using electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire/electrodeposited silver iodide is much easier to load into a cathode.

  12. Method validation and uncertainty evaluation of organically bound tritium analysis in environmental sample.

    PubMed

    Huang, Yan-Jun; Zeng, Fan; Zhang, Bing; Chen, Chao-Feng; Qin, Hong-Juan; Wu, Lian-Sheng; Guo, Gui-Yin; Yang, Li-Tao; Shang-Guan, Zhi-Hong

    2014-08-01

An analytical method for organically bound tritium (OBT) was developed in our laboratory. Optimized operating conditions and parameters were established for sample drying, special combustion, distillation, and measurement on a liquid scintillation spectrometer (LSC). Selected types of OBT samples, such as rice, corn, rapeseed, fresh lettuce and pork, were analyzed to validate the reproducibility of the recovery rate and the minimum detection concentration, and the uncertainty for a typical low-level environmental sample was evaluated. The combustion water recovery rate of the different dried environmental samples was kept at about 80%, and the minimum detection concentration of OBT ranged from 0.61 to 0.89 Bq/kg (dry weight), depending on the hydrogen content. The results show that this method is suitable for OBT analysis of environmental samples, with a stable recovery rate, and that the combustion water yield of a sample weighing about 40 g provides a sufficient quantity for measurement on the LSC.

  13. Large loop conformation sampling using the activation relaxation technique, ART-nouveau method.

    PubMed

    St-Pierre, Jean-François; Mousseau, Normand

    2012-07-01

We present an adaptation of the ART-nouveau energy surface sampling method to the problem of loop structure prediction. This method, previously used to study protein folding pathways and peptide aggregation, is well suited to sampling the conformation space of large loops because it targets probable folding pathways instead of sampling that space exhaustively. The number of sampled conformations needed by ART-nouveau to find the global energy minimum of a loop was found to scale linearly with loop sequence length for loops between 8 and about 20 amino acids. Given this linear dependence of the cost of sampling new conformations on loop sequence length, we estimate the total computational cost of sampling larger loops to scale quadratically, compared to the exponential scaling of exhaustive search methods.

  14. Comparison of different methods to calculate total runoff and sediment yield based on aliquot sampling from rainfall simulations

    NASA Astrophysics Data System (ADS)

    Tresch, Simon; Fister, Wolfgang; Marzen, Miriam; Kuhn, Nikolaus J.

    2015-04-01

The quality of data obtained from rainfall experiments depends mainly on the quality of the rainfall simulation itself. However, even the best rainfall simulation cannot deliver valuable data if runoff and sediment discharge from the plot are not sampled at a proper interval or if poor interpolation methods are used. The safest way to obtain good results would be to collect all runoff and sediment that comes off the plot in the shortest possible intervals. Unfortunately, high rainfall amounts often coincide with limited transport and analysis capacities. It is therefore in most cases necessary to find a good compromise between sampling frequency, interpolation method, and available analysis capacity. The aim of this study was to compare different methods of calculating total sediment yield based on aliquot sampling intervals. The methods tested were (1) simple extrapolation of one sample until the next sample was collected; (2) averaging between two successive samples; (3) extrapolation of the sediment concentration; and (4) extrapolation using a regression function. The results indicate that all methods could, in principle, be used to calculate total sediment yields, but errors of 10-25% would have to be taken into account when interpreting the data. The highest deviations were always found for the first measurement interval, which shows that it is very important to capture the initial flush of sediment from the plot in order to calculate reliable totals.
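The first two interpolation methods can be compared on a synthetic runoff event; the exponential flux model and sampling interval below are illustrative assumptions (methods 3 and 4 additionally require runoff volumes and a fitted regression, so they are omitted from this sketch):

```python
import math

# Synthetic sediment flux for a runoff event: a strong initial flush that
# decays exponentially (illustrative stand-in for real plot data).
def flux(t):
    """Sediment flux in g/min at time t (minutes)."""
    return 10.0 * math.exp(-t / 12.0)

duration, dt = 60, 0.01   # minute-long event, fine step for the "true" total
true_total = sum(flux(i * dt) * dt for i in range(int(duration / dt)))

times = list(range(0, duration + 1, 10))   # aliquot samples every 10 min
samples = [flux(t) for t in times]

# Method (1): extrapolate each sample forward until the next one is taken.
m1 = sum(samples[i] * (times[i + 1] - times[i]) for i in range(len(times) - 1))
# Method (2): average successive samples (trapezoidal rule).
m2 = sum((samples[i] + samples[i + 1]) / 2 * (times[i + 1] - times[i])
         for i in range(len(times) - 1))

for name, est in (("extrapolate", m1), ("average", m2)):
    err = 100 * (est - true_total) / true_total
    print(f"{name}: {est:.1f} g (error {err:+.1f}%)")
```

On this decaying flux both methods overestimate the total, and forward extrapolation errs far more than averaging, with the first interval (the initial flush) contributing most of the bias, consistent with the study's observation.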

  15. A new enrichment method for isolation of Bacillus thuringiensis from diverse sample types.

    PubMed

    Patel, Ketan D; Bhanshali, Forum C; Chaudhary, Avani V; Ingle, Sanjay S

    2013-05-01

New or more efficient methodologies based on different principles are needed, as no single method is suitable for isolating organisms from samples of diverse types and from various environments. In the present investigation, a growth kinetics study revealed a higher germination rate, a higher growth rate, and maximum sporulation of Bacillus thuringiensis (Bt) compared to other Bacillus species. Considering these facts, a simple and efficient enrichment method was devised that allowed propagation of spores and vegetative cells of Bt and thereby increased the Bt cell population proportionately. The new enrichment method yielded Bt from 44 out of 58 samples. In contrast, Bt was isolated from only 16 and 18 samples by the sodium acetate selection and dry heat pretreatment methods, respectively. Moreover, the percentages of Bt colonies isolated by the enrichment method were comparatively higher. Vegetative whole-cell protein profile analysis indicated isolation of a diverse population of Bt from the various samples. The Bt strains isolated by the enrichment method represented novel serovars and possibly a new cry2 gene.

  16. A modified method for estimation of chemical oxygen demand for samples having high suspended solids.

    PubMed

    Yadvika; Yadav, Asheesh Kumar; Sreekrishnan, T R; Satya, Santosh; Kohli, Sangeeta

    2006-03-01

Determination of the chemical oxygen demand (COD) of samples with high suspended-solids concentrations, such as cattle dung slurry, by the open reflux method of APHA-AWWA-WPCF did not give consistent results. This study presents a modification of the open reflux method (APHA-AWWA-WPCF) to make it suitable for samples with a high percentage of suspended solids. The new method is based on a different technique of sample preparation, modified quantities of reagents, and a longer reflux time than the existing open reflux method. For samples with solids contents of 14.0 g/l or higher, the modified method gave higher COD values with much greater consistency and accuracy than the existing open reflux method.

  17. Melting Temperature Mapping Method: A Novel Method for Rapid Identification of Unknown Pathogenic Microorganisms within Three Hours of Sample Collection.

    PubMed

    Niimi, Hideki; Ueno, Tomohiro; Hayashi, Shirou; Abe, Akihito; Tsurue, Takahiro; Mori, Masashi; Tabata, Homare; Minami, Hiroshi; Goto, Michihiko; Akiyama, Makoto; Yamamoto, Yoshihiro; Saito, Shigeru; Kitajima, Isao

    2015-07-28

    Acquiring the earliest possible identification of pathogenic microorganisms is critical for selecting the appropriate antimicrobial therapy in infected patients. We herein report the novel "melting temperature (Tm) mapping method" for rapidly identifying the dominant bacteria in a clinical sample from sterile sites. Employing only seven primer sets, more than 100 bacterial species can be identified. In particular, using the Difference Value, it is possible to identify samples suitable for Tm mapping identification. Moreover, this method can be used to rapidly diagnose the absence of bacteria in clinical samples. We tested the Tm mapping method using 200 whole blood samples obtained from patients with suspected sepsis, 85% (171/200) of which matched the culture results based on the detection level. A total of 130 samples were negative according to the Tm mapping method, 98% (128/130) of which were also negative based on the culture method. Meanwhile, 70 samples were positive according to the Tm mapping method, and of the 59 suitable for identification, 100% (59/59) exhibited a "match" or "broad match" with the culture or sequencing results. These findings were obtained within three hours of whole blood collection. The Tm mapping method is therefore useful for identifying infectious diseases requiring prompt treatment.

  18. 40 CFR 80.1645 - Sample retention requirements for producers and importers of denaturant designated as suitable...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... producers and importers of denaturant designated as suitable for the manufacture of denatured fuel ethanol... suitable for the manufacture of denatured fuel ethanol meeting federal quality requirements. Beginning January 1, 2017, or on the first day that any producer or importer of ethanol denaturant designates...

  19. Guidance for characterizing explosives contaminated soils: Sampling and selecting on-site analytical methods

    SciTech Connect

    Crockett, A.B.; Craig, H.D.; Jenkins, T.F.; Sisk, W.E.

    1996-09-01

    A large number of defense-related sites are contaminated with elevated levels of secondary explosives. Levels of contamination range from barely detectable to levels above 10% that need special handling due to the detonation potential. Characterization of explosives-contaminated sites is particularly difficult due to the very heterogeneous distribution of contamination in the environment and within samples. To improve site characterization, several options exist, including collecting more samples, providing on-site analytical data to help direct the investigation, compositing samples, improving homogenization of samples, and extracting larger samples. On-site analytical methods are essential to more economical and improved characterization. On-site methods may suffer in terms of precision and accuracy, but this is more than offset by the increased number of samples that can be run. While verification using a standard analytical procedure should be part of any quality assurance program, reducing the number of samples analyzed by the more expensive methods can result in significantly reduced costs. Often 70 to 90% of the soil samples analyzed during an explosives site investigation do not contain detectable levels of contamination. Two basic types of on-site analytical methods are in wide use for explosives in soil: colorimetric and immunoassay. Colorimetric methods generally detect broad classes of compounds such as nitroaromatics or nitramines, while immunoassay methods are more compound specific. Since TNT or RDX is usually present in explosives-contaminated soils, the use of procedures designed to detect only these or similar compounds can be very effective.

  20. A bone sample cleaning method using trypsin for the isolation of DNA.

    PubMed

    Li, Richard; Liriano, Lidissy

    2011-11-01

    Cleaning the surface of bone samples is a necessary step to remove contaminants prior to isolating DNA for forensic DNA analysis. In this study, a simple trypsin method for cleaning bone samples prior to DNA isolation was developed. Cleaning the surface of human bone samples was achieved by the application of trypsin solution. Light microscopy and scanning electron microscopy results indicated that trypsin treatment was effective in removing the outer surface of bone samples. The yield of DNA isolated from trypsin-treated bone samples was sufficient for subsequent short tandem repeat (STR) analysis. STR analysis revealed no adverse effect on the DNA profile after the trypsin treatment. The data suggest that this trypsin method can potentially be an alternative cleaning method to mechanical cleaning methods.

  1. Estimating the Importance of Private Adaptation to Climate Change in Agriculture: A Review of Empirical Methods

    NASA Astrophysics Data System (ADS)

    Moore, F.; Burke, M.

    2015-12-01

    A wide range of studies using a variety of methods strongly suggests that climate change will have a negative impact on agricultural production in many areas. Farmers, though, should be able to learn about a changing climate and adjust what they grow and how they grow it in order to reduce these negative impacts. However, it remains unclear how effective these private (autonomous) adaptations will be, or how quickly they will be adopted. Constraining the uncertainty in this adaptation is important for understanding the impacts of climate change on agriculture. Here we review a number of empirical methods that have been proposed for understanding the rate and effectiveness of private adaptation to climate change. We compare these methods using data on agricultural yields in the United States and western Europe.

  2. The importance of optical methods for non-invasive measurements in the skin care industry

    NASA Astrophysics Data System (ADS)

    Stamatas, Georgios N.

    2010-02-01

    Pharmaceutical and cosmetic industries are concerned with treating skin disease, as well as maintaining and promoting skin health. They are dealing with a unique tissue that defines our body in space. As such, skin not only provides the natural boundary with the environment, inhibiting body dehydration and the penetration of exogenous aggressors, it is also ideally situated for optical measurements. A plurality of spectroscopic and imaging methods is being used to understand skin physiology and pathology and to document the effects of topically applied products on the skin. The obvious advantage of such methods over traditional biopsy techniques is the ability to measure the cutaneous tissue in vivo and non-invasively. In this work, we review applications of various spectroscopic and imaging methods in skin research that are of interest to the cosmetic and pharmaceutical industries. Examples are given of the importance of optical techniques in acquiring new insights into acne pathogenesis and infant skin development.

  3. Targeting prohibited substances in doping control blood samples by means of chromatographic-mass spectrometric methods.

    PubMed

    Thevis, Mario; Thomas, Andreas; Schänzer, Wilhelm

    2013-12-01

    Urine samples have been the predominant matrix for doping controls for several decades. However, owing to the complementary information provided by blood (as well as serum or plasma and dried blood spots (DBS)), the benefits of its analysis have resulted in continuously increasing appreciation by anti-doping authorities. On the one hand, blood samples allow for the detection of various methods of blood doping and the abuse of erythropoiesis-stimulating agents (ESAs) via the Athlete Biological Passport; on the other hand, targeted and non-targeted drug detection by means of chromatographic-mass spectrometric methods represents an important tool to increase doping control frequencies out-of-competition and to determine drug concentrations, particularly in in-competition scenarios. Moreover, blood analysis seldom requires in-depth knowledge of drug metabolism, and the intact substance rather than potentially unknown or assumed metabolic products can be targeted. In this review, the recent developments in human sports drug testing concerning mass spectrometry-based techniques for qualitative and quantitative analyses of therapeutics and emerging drug candidates are summarized and reviewed. The analytical methods include both low and high molecular mass compounds (e.g., anabolic agents, stimulants, metabolic modulators, peptide hormones, and small interfering RNA (siRNA)) determined from serum, plasma, and DBS using state-of-the-art instrumentation such as liquid chromatography-high resolution/high accuracy (tandem) mass spectrometry (LC-HRMS), LC-low resolution tandem mass spectrometry (LC-MS/MS), and gas chromatography-mass spectrometry (GC-MS).

  4. Utility of the microculture method for Leishmania detection in non-invasive samples obtained from a blood bank.

    PubMed

    Ates, Sezen Canim; Bagirova, Malahat; Allahverdiyev, Adil M; Kocazeybek, Bekir; Kosan, Erdogan

    2013-10-01

    In recent years, donor blood has taken an important place in the epidemiology of leishmaniasis. According to the WHO, patients considered symptomatic represent only 5-20% of individuals with leishmaniasis; the remainder are asymptomatic. In this study, for the detection of Leishmania infection in donor blood samples, 343 samples from the Capa Red Crescent Blood Center were obtained and primarily analyzed by microscopic and serological methods. Subsequently, the traditional culture (NNN), immunochromatographic test (ICT), and polymerase chain reaction (PCR) methods were applied to the 21 samples found positive by at least one method. Buffy coat (BC) samples from the 343 blood donors were analyzed: 15 (4.3%) were positive by a microculture method (MCM) and 4 (1.1%) by smear. Of the corresponding sera, 9 (2.6%) were positive by ELISA and 7 (2%) by IFAT. Thus, 21 (6.1%) of the 343 subjects studied by smear, MCM, IFAT, and ELISA were identified as positive for leishmaniasis by at least one technique, and the sensitivity of each method was assessed. According to our data, the sensitivities of the methods were: MCM (71%), smear (19%), IFAT (33%), ELISA (42%), NNN (4%), PCR (14%), and ICT (4%). This study is thus the first to examine the sensitivity of an MCM in blood donors by comparing MCM with the methods used in the diagnosis of leishmaniasis. As a result, MCM was found to be the most sensitive method for the detection of Leishmania parasites in samples obtained from a blood bank. In addition, Leishmania parasites were detected in donor blood in Istanbul, a non-endemic region of Turkey, a result of vital importance for the health of blood recipients.
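    The per-method sensitivities quoted in this record are consistent with simple truncated percentages against the 21-sample composite reference. In the sketch below, the detection counts for PCR, NNN, and ICT are back-calculated for illustration and are not stated in the abstract.

```python
def sensitivity_pct(detected, reference_total):
    """Sensitivity against a composite reference standard, truncated
    to a whole percent (matching the abstract's reporting style)."""
    return int(100 * detected / reference_total)

# 21 donors positive by at least one technique form the reference set.
# MCM, smear, ELISA, IFAT counts are given; PCR, NNN, ICT are inferred.
counts = {"MCM": 15, "smear": 4, "ELISA": 9, "IFAT": 7, "PCR": 3, "NNN": 1, "ICT": 1}
for method, n in counts.items():
    print(method, sensitivity_pct(n, 21))
```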

  5. An effective method based on dynamic sampling for data assimilation in a global wave model

    NASA Astrophysics Data System (ADS)

    Sun, Meng; Yin, Xunqiang; Yang, Yongzeng; Wu, Kejian

    2017-01-01

    The ensemble Kalman filter (EnKF) performs well because the covariance of the background error varies in time, providing a dynamic estimate of the background error that represents its statistical character reasonably well. However, the model ensemble required by the EnKF incurs a high computational cost. In this study, two methods, referred to as the static and dynamic sampling methods, are proposed to retain good performance while reducing the computational cost. The ensemble adjustment Kalman filter (EAKF) is used in a global surface wave model to examine the performance of the EnKF. The 24-h-interval differences of simulated significant wave height (SWH) within 1 year compose the static samples of ensemble errors, and these errors are used to construct the ensemble states each time observations are available. The same method of updating the model states as in the EAKF is then applied to the ensemble states constructed by the static sampling method. The dynamic sampling method constructs the ensemble states similarly, but the period of simulated SWH changes with time; here, the 7 days before and after the observation time are used. To examine the performance of the three schemes (EAKF, static sampling, and dynamic sampling), observations from the satellite Jason-2 in 2014 are assimilated into a global wave model, and observations from the satellite Saral are used for validation. The results indicate that the EAKF performs best, while the static sampling method is relatively worse. The dynamic sampling method improves the assimilation dramatically compared to the static sampling method, and its overall performance is close to that of the EAKF; in low latitudes, the dynamic sampling method even has a slight advantage over the EAKF. With the dynamic or static sampling method, only a single wave model run is required, so the computational cost is reduced sharply. Given the performance of these three methods, the dynamic sampling method offers a good trade-off between accuracy and computational cost.
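    The static-sampling idea — drawing ensemble perturbations from a pre-computed library of 24-h model differences — can be sketched as follows. All array shapes, the random stand-in SWH field, and the centering step are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Illustrative sketch of the static sampling idea: build an error library
# from 24-h differences of a year of simulated significant wave height
# (SWH), then form ensemble states by adding randomly drawn, centered
# library errors to the current model state.
rng = np.random.default_rng(0)
swh_year = rng.random((365, 100))           # stand-in: daily SWH at 100 grid points

error_library = np.diff(swh_year, axis=0)   # 24-h interval differences (364 samples)

def build_ensemble(state, library, n_members, rng):
    idx = rng.choice(len(library), size=n_members, replace=False)
    perturbations = library[idx]                 # fancy indexing copies; library untouched
    perturbations -= perturbations.mean(axis=0)  # center: ensemble mean equals the state
    return state + perturbations

ensemble = build_ensemble(swh_year[-1], error_library, n_members=20, rng=rng)
print(ensemble.shape)  # (20, 100)
```

The centered perturbations keep the ensemble mean at the current model state, so only the spread (the background error estimate) comes from the sample library.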

  6. RAPID FUSION METHOD FOR DETERMINATION OF PLUTONIUM ISOTOPES IN LARGE RICE SAMPLES

    SciTech Connect

    Maxwell, S.

    2013-03-01

    A new rapid fusion method for the determination of plutonium in large rice samples has been developed at the Savannah River National Laboratory (Aiken, SC, USA) that can be used to determine very low levels of plutonium isotopes in rice. The recent accident at the Fukushima Nuclear Power Plant in March 2011 reinforces the need for rapid, reliable radiochemical analyses of radionuclides in environmental and food samples. Public concern regarding foods, particularly foods such as rice in Japan, highlights the need for analytical techniques that allow very large sample aliquots of rice to be used for analysis so that very low levels of plutonium isotopes may be detected. The new method to determine plutonium isotopes in large rice samples utilizes a furnace ashing step, a rapid sodium hydroxide fusion method, a lanthanum fluoride matrix removal step, and a column separation process with TEVA Resin cartridges. The method can be applied to rice sample aliquots as large as 5 kg. Plutonium isotopes can be determined using alpha spectrometry or inductively coupled plasma mass spectrometry (ICP-MS). The method showed high chemical recoveries and effective removal of interferences. The rapid fusion technique is a rugged sample digestion method that ensures that any refractory plutonium particles are effectively digested. The MDA for a 5 kg rice sample using alpha spectrometry is 7E-5 mBq g⁻¹. The method can easily be adapted for use by ICP-MS to allow detection of plutonium isotopic ratios.

  7. Rapid method to determine actinides and 89/90Sr in limestone and marble samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...

    2016-04-12

    A new method for the determination of actinides and radiostrontium in limestone and marble samples has been developed that utilizes a rapid sodium hydroxide fusion to digest the sample. Following rapid pre-concentration steps to remove sample matrix interferences, the actinides and 89/90Sr are separated using extraction chromatographic resins and measured radiometrically. The advantages of sodium hydroxide fusion versus other fusion techniques will be discussed. Lastly, this approach has a sample preparation time for limestone and marble samples of <4 hours.

  8. Melting Temperature Mapping Method: A Novel Method for Rapid Identification of Unknown Pathogenic Microorganisms within Three Hours of Sample Collection

    PubMed Central

    Niimi, Hideki; Ueno, Tomohiro; Hayashi, Shirou; Abe, Akihito; Tsurue, Takahiro; Mori, Masashi; Tabata, Homare; Minami, Hiroshi; Goto, Michihiko; Akiyama, Makoto; Yamamoto, Yoshihiro; Saito, Shigeru; Kitajima, Isao

    2015-01-01

    Acquiring the earliest possible identification of pathogenic microorganisms is critical for selecting the appropriate antimicrobial therapy in infected patients. We herein report the novel “melting temperature (Tm) mapping method” for rapidly identifying the dominant bacteria in a clinical sample from sterile sites. Employing only seven primer sets, more than 100 bacterial species can be identified. In particular, using the Difference Value, it is possible to identify samples suitable for Tm mapping identification. Moreover, this method can be used to rapidly diagnose the absence of bacteria in clinical samples. We tested the Tm mapping method using 200 whole blood samples obtained from patients with suspected sepsis, 85% (171/200) of which matched the culture results based on the detection level. A total of 130 samples were negative according to the Tm mapping method, 98% (128/130) of which were also negative based on the culture method. Meanwhile, 70 samples were positive according to the Tm mapping method, and of the 59 suitable for identification, 100% (59/59) exhibited a “match” or “broad match” with the culture or sequencing results. These findings were obtained within three hours of whole blood collection. The Tm mapping method is therefore useful for identifying infectious diseases requiring prompt treatment. PMID:26218169

  9. The effects of inference method, population sampling, and gene sampling on species tree inferences: an empirical study in slender salamanders (Plethodontidae: Batrachoseps).

    PubMed

    Jockusch, Elizabeth L; Martínez-Solano, Iñigo; Timpe, Elizabeth K

    2015-01-01

    Species tree methods are now widely used to infer the relationships among species from multilocus data sets. Many methods have been developed, which differ in whether gene and species trees are estimated simultaneously or sequentially, and in how gene trees are used to infer the species tree. While these methods perform well on simulated data, less is known about what impacts their performance on empirical data. We used a data set including five nuclear genes and one mitochondrial gene for 22 species of Batrachoseps to compare the effects of method of analysis, within-species sampling and gene sampling on species tree inferences. For this data set, the choice of inference method had the largest effect on the species tree topology. Exclusion of individual loci had large effects in *BEAST and STEM, but not in MP-EST. Different loci carried the greatest leverage in these different methods, showing that the causes of their disproportionate effects differ. Even though substantial information was present in the nuclear loci, the mitochondrial gene dominated the *BEAST species tree. This leverage is inherent to the mtDNA locus and results from its high variation and lower assumed ploidy. This mtDNA leverage may be problematic when mtDNA has undergone introgression, as is likely in this data set. By contrast, the leverage of RAG1 in STEM analyses does not reflect properties inherent to the locus, but rather results from a gene tree that is strongly discordant with all others, and is best explained by introgression between distantly related species. Within-species sampling was also important, especially in *BEAST analyses, as shown by differences in tree topology across 100 subsampled data sets. Despite the sensitivity of the species tree methods to multiple factors, five species groups, the relationships among these, and some relationships within them, are generally consistently resolved for Batrachoseps.

  10. Testing sample stability using four storage methods and the macroalgae Ulva and Gracilaria

    EPA Science Inventory

    Concern over the relative importance of different sample preparation and storage techniques frequently used in stable isotope analysis of particulate nitrogen (δ15N) and carbon (δ13C) prompted an experiment to determine how important such factors were to measured values in marine...

  11. Probe Heating Method for the Analysis of Solid Samples Using a Portable Mass Spectrometer

    PubMed Central

    Kumano, Shun; Sugiyama, Masuyuki; Yamada, Masuyoshi; Nishimura, Kazushige; Hasegawa, Hideki; Morokuma, Hidetoshi; Inoue, Hiroyuki; Hashimoto, Yuichiro

    2015-01-01

    We previously reported on the development of a portable mass spectrometer for the onsite screening of illicit drugs, but our previous sampling system could only be used for liquid samples. In this study, we report on an attempt to develop a probe heating method that also permits solid samples to be analyzed using a portable mass spectrometer. An aluminum rod is used as the sampling probe. The powdered sample is affixed to the sampling probe or a droplet of sample solution is placed on the tip of the probe and dried. The probe is then placed on a heater to vaporize the sample. The vapor is then introduced into the portable mass spectrometer and analyzed. With the heater temperature set to 130°C, the developed system detected 1 ng of methamphetamine, 1 ng of amphetamine, 3 ng of 3,4-methylenedioxymethamphetamine, 1 ng of 3,4-methylenedioxyamphetamine, and 0.3 ng of cocaine. Even from mixtures consisting of clove powder and methamphetamine powder, methamphetamine ions were detected by tandem mass spectrometry. The developed probe heating method provides a simple method for the analysis of solid samples. A portable mass spectrometer incorporating this method would thus be useful for the onsite screening of illicit drugs. PMID:26819909

  12. Probe Heating Method for the Analysis of Solid Samples Using a Portable Mass Spectrometer.

    PubMed

    Kumano, Shun; Sugiyama, Masuyuki; Yamada, Masuyoshi; Nishimura, Kazushige; Hasegawa, Hideki; Morokuma, Hidetoshi; Inoue, Hiroyuki; Hashimoto, Yuichiro

    2015-01-01

    We previously reported on the development of a portable mass spectrometer for the onsite screening of illicit drugs, but our previous sampling system could only be used for liquid samples. In this study, we report on an attempt to develop a probe heating method that also permits solid samples to be analyzed using a portable mass spectrometer. An aluminum rod is used as the sampling probe. The powdered sample is affixed to the sampling probe or a droplet of sample solution is placed on the tip of the probe and dried. The probe is then placed on a heater to vaporize the sample. The vapor is then introduced into the portable mass spectrometer and analyzed. With the heater temperature set to 130°C, the developed system detected 1 ng of methamphetamine, 1 ng of amphetamine, 3 ng of 3,4-methylenedioxymethamphetamine, 1 ng of 3,4-methylenedioxyamphetamine, and 0.3 ng of cocaine. Even from mixtures consisting of clove powder and methamphetamine powder, methamphetamine ions were detected by tandem mass spectrometry. The developed probe heating method provides a simple method for the analysis of solid samples. A portable mass spectrometer incorporating this method would thus be useful for the onsite screening of illicit drugs.

  13. What is important, what needs treating? How GPs perceive older patients’ multiple health problems: a mixed method research study

    PubMed Central

    2012-01-01

    Background: GPs increasingly deal with multiple health problems of their older patients. They have to apply a hierarchical management approach that considers priorities to balance competing needs for treatment. Yet the practice of setting individual priorities in older patients is largely unexplored. This paper analyses GPs’ perceptions of important and unimportant health problems and how these affect their treatment. Methods: GPs appraised the importance of health problems for a purposive sample of their older patients in semi-structured interviews. Prior to the interviews, the GPs had received a list of their patients’ health problems resulting from a geriatric assessment and were asked to rate the importance of each identified problem. In the interviews the GPs then explained why they considered certain health problems important or not and how this affected treatment. Data were analysed using qualitative content analysis and quantitative methods. Results: The problems GPs perceive as important are those that are medical and require active treatment or monitoring, or that induce empathy or awareness but cannot be assisted further. Unimportant problems are well-managed problems that need no further attention, as well as age-related conditions or functional disabilities that provoke fatalism, or problems considered outside the GPs’ responsibility. Statements of professional actions are closely linked to explanations of important problems and relate to physical problems rather than functional and social patient issues. Conclusions: GPs tend to prioritise treatable clinical conditions. Treatment approaches are, however, vague or missing for complex chronic illnesses and disabilities. Here, patient empowerment strategies are of value and need to be developed and implemented. The professional concepts of ageing and disability should not impede but rather foster treatment and care. To this end, GPs need to be able to delegate care to a

  14. Development and Validation of Chronopotentiometric Method for Imidacloprid Determination in Pesticide Formulations and River Water Samples

    PubMed Central

    Đurović, Ana; Stojanović, Zorica; Kravić, Snežana; Grahovac, Nada; Bursić, Vojislava; Vuković, Gorica; Suturović, Zvonimir

    2016-01-01

    A new electrochemical method for the determination of imidacloprid using chronopotentiometry on thin-film mercury and glassy carbon electrodes is presented. The most important experimental parameters of chronopotentiometry were examined and optimized with respect to the imidacloprid analytical signal. Imidacloprid gave a well-defined reduction peak in Britton-Robinson buffer on the thin-film mercury electrode at −1.0 V (versus Ag/AgCl (KCl, 3.5 mol/L)) and on the glassy carbon electrode at −1.2 V (versus Ag/AgCl (KCl, 3.5 mol/L)). The reduction time was linearly proportional to concentration from 0.8 to 30.0 mg/L on the thin-film mercury electrode and from 7.0 to 70.0 mg/L on the glassy carbon electrode. The detection limits were 0.17 mg/L and 0.93 mg/L for the thin-film mercury and glassy carbon electrodes, respectively. The estimation of method precision as a function of repeatability and reproducibility showed relative standard deviation values lower than 3.73%. Recovery values from 97.3 to 98.1% confirmed the accuracy of the proposed method, while the constancy of the transition time under deliberate small changes in the experimental parameters indicated very good robustness. The minor influence of possible interfering compounds proved the good selectivity of the method. The developed method was applied to imidacloprid determination in commercial pesticide formulations and river water samples. PMID:27042181
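    The calibration-and-detection-limit arithmetic behind such figures can be sketched as below. The concentration/transition-time pairs and the blank standard deviation are invented, and the 3s/slope convention is one common choice, not necessarily the one used in the paper.

```python
# Invented calibration data illustrating a least-squares slope and a
# 3*s/slope detection-limit estimate; not the paper's data or
# necessarily its LOD convention.
def fit_line(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

conc = [0.8, 5.0, 10.0, 20.0, 30.0]               # mg/L imidacloprid (invented)
transition_time = [0.41, 2.52, 5.05, 10.1, 15.2]  # s (invented)

slope, intercept = fit_line(conc, transition_time)
blank_sd = 0.028                                  # s, blank signal std. dev. (invented)
lod = 3 * blank_sd / slope                        # detection limit, mg/L
print(round(slope, 3), round(lod, 2))
```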

  15. INVESTIGATION OF THE TOTAL ORGANIC HALOGEN ANALYTICAL METHOD AT THE WASTE SAMPLING CHARACTERIZATION FACILITY (WSCF)

    SciTech Connect

    DOUGLAS JG; MEZNARICH HD, PHD; OLSEN JR; ROSS GA; STAUFFER M

    2008-09-30

    Total organic halogen (TOX) is used as a parameter to screen groundwater samples at the Hanford Site. Trending is done for each groundwater well, and changes in TOX and other screening parameters can lead to costly changes in the monitoring protocol. The Waste Sampling and Characterization Facility (WSCF) analyzes groundwater samples for TOX using United States Environmental Protection Agency (EPA) SW-846 method 9020B (EPA 1996a). Samples from the Soil and Groundwater Remediation Project (S&GRP) are submitted to the WSCF for analysis without information regarding the source of the sample; each sample is in essence a 'blind' sample to the laboratory. Feedback from the S&GRP indicated that some of the WSCF-generated TOX data from groundwater wells had a number of outlier values based on the historical trends (Anastos 2008a). Additionally, analysts at WSCF observed inconsistent TOX results among field sample replicates. The WSCF laboratory therefore investigated the TOX analysis to determine the cause of the outlier data points. Two causes were found to contribute to the out-of-trend TOX data: (1) the presence of inorganic chloride in the groundwater samples: at inorganic chloride concentrations greater than about 10 parts per million (ppm), apparent TOX values increase with increasing chloride concentration, and a parallel observation is the increase in apparent breakthrough of TOX from the first to the second activated-carbon adsorption tube with increasing inorganic chloride concentration; and (2) during the sample preparation step, excessive purging of the adsorption tubes with oxygen pressurization gas after sample loading may cause channeling in the activated-carbon bed. This channeling leads to poor removal of inorganic chloride during the subsequent wash step with aqueous potassium nitrate, and the residual inorganic chloride then produces erroneously high TOX values. Changes in sample preparation were studied to more effectively remove inorganic chloride.

  16. Apparatus and method for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling

    DOEpatents

    Felix, Larry Gordon; Farthing, William Earl; Irvin, James Hodges; Snyder, Todd Robert

    2010-05-11

    A dilution apparatus for diluting a gas sample. The apparatus includes a sample gas conduit having a sample gas inlet end and a diluted sample gas outlet end, and a sample gas flow restricting orifice disposed proximate the sample gas inlet end connected with the sample gas conduit and providing fluid communication between the exterior and the interior of the sample gas conduit. A diluted sample gas conduit is provided within the sample gas conduit having a mixing end with a mixing space inlet opening disposed proximate the sample gas inlet end, thereby forming an annular space between the sample gas conduit and the diluted sample gas conduit. The mixing end of the diluted sample gas conduit is disposed at a distance from the sample gas flow restricting orifice. A dilution gas source connected with the sample gas inlet end of the sample gas conduit is provided for introducing a dilution gas into the annular space, and a filter is provided for filtering the sample gas. The apparatus is particularly suited for diluting heated sample gases containing one or more condensable components.

  17. Methods, compounds and systems for detecting a microorganism in a sample

    DOEpatents

    Colston, Jr, Bill W.; Fitch, J. Patrick; Gardner, Shea N.; Williams, Peter L.; Wagner, Mark C.

    2016-09-06

    Methods to identify a set of probe polynucleotides suitable for detecting a set of targets and in particular methods for identification of primers suitable for detection of target microorganisms related polynucleotides, set of polynucleotides and compositions, and related methods and systems for detection and/or identification of microorganisms in a sample.

  18. A COMPARISON OF SIX BENTHIC MACROINVERTEBRATE SAMPLING METHODS IN FOUR LARGE RIVERS

    EPA Science Inventory

    In 1999, a study was conducted to compare six macroinvertebrate sampling methods in four large (boatable) rivers that drain into the Ohio River. Two methods each were adapted from existing methods used by the USEPA, USGS and Ohio EPA. Drift nets were unable to collect a suffici...

  19. Comparison of dust sampling methods in Estonia and Sweden--a field study.

    PubMed

    Berg, P; Jaakmees, V; Bodin, L

    1999-09-01

    The purpose of this field study was to compare an Estonian dust sampling method, a method also used in other former Eastern Bloc countries, with a Swedish method and to estimate inter-method agreement with statistical analyses. The Estonian standard method (ESM), used to assess exposure in Estonia since the early 1950s, is based on a strategy where air samples are collected for 10 minutes every hour over a full shift. This method was compared to a Swedish standard method (SSM), a modified NIOSH method comparable to international standards, where one air sample is collected over a full shift. The study was carried out at a cement plant that at the beginning of the 1990s was the subject of an epidemiological study, including collection of exposure data. The analysis of 31 clusters of parallel samples from the two methods, collecting dust consisting of Portland cement, showed a relatively weak correlation between the SSM and the ESM, ri = 0.81 (Pearson's intra-class correlation coefficient). A conversion factor between the two methods was estimated, where SSM is 0.69 times ESM and the limits of agreement are 0.25 and 1.84, respectively. These results indicate a substantial inter-method difference. We therefore recommend that measurements obtained from the two methods not be used interchangeably. Because the present study is of limited extent, our findings are confined to the operations studied, and further studies covering other exposure situations will be needed.
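    A conversion factor with ratio-scale limits of agreement of this kind can be computed as sketched below (a Bland-Altman-style analysis on log ratios). The paired concentration values are invented, not the study's data, and the study's exact statistical procedure may differ.

```python
import math

# Illustrative ratio-based agreement analysis (Bland-Altman on log
# ratios); the paired values below are invented, not the study's data.
ssm = [1.2, 3.4, 0.8, 5.1, 2.2, 4.0]   # mg/m3, full-shift method (invented)
esm = [1.9, 4.6, 1.5, 6.8, 3.1, 6.2]   # mg/m3, 10-min/hour method (invented)

log_ratios = [math.log(s / e) for s, e in zip(ssm, esm)]
n = len(log_ratios)
mean_lr = sum(log_ratios) / n
sd_lr = math.sqrt(sum((r - mean_lr) ** 2 for r in log_ratios) / (n - 1))

conversion = math.exp(mean_lr)            # SSM ~= conversion * ESM
lo = math.exp(mean_lr - 1.96 * sd_lr)     # lower limit of agreement (ratio scale)
hi = math.exp(mean_lr + 1.96 * sd_lr)     # upper limit of agreement (ratio scale)
print(round(conversion, 2), round(lo, 2), round(hi, 2))
```

Working on log ratios keeps the limits of agreement multiplicative, which matches how the study reports them (0.25 and 1.84 around a 0.69 conversion factor).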

  20. High-throughput DNA isolation method for detection of Xylella fastidiosa in plant and insect samples.

    PubMed

    Brady, Jeff A; Faske, Jennifer B; Castañeda-Gill, Jessica M; King, Jonathan L; Mitchell, Forrest L

    2011-09-01

    We report an inexpensive, high-throughput method for isolating DNA from insect and plant samples for the purpose of detecting Xylella fastidiosa infection. Existing methods often copurify inhibitors of DNA polymerases, limiting their usefulness for PCR-based detection assays. When compared to commercially available kits, the method provides enhanced pathogen detection at a fraction of the cost.

  1. 40 CFR 80.580 - What are the sampling and testing methods for sulfur?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... methods for sulfur? 80.580 Section 80.580 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... the sampling and testing methods for sulfur? The sulfur content of diesel fuel and diesel fuel... methodology is provided in § 80.330(b). (b) Test method for sulfur—(1) For ECA marine fuel subject to the...

  2. 40 CFR 80.580 - What are the sampling and testing methods for sulfur?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... methods for sulfur? 80.580 Section 80.580 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... the sampling and testing methods for sulfur? The sulfur content of diesel fuel and diesel fuel... methodology is provided in § 80.330(b). (b) Test method for sulfur—(1) For ECA marine fuel subject to the...

  3. 40 CFR 80.580 - What are the sampling and testing methods for sulfur?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... methods for sulfur? 80.580 Section 80.580 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... the sampling and testing methods for sulfur? The sulfur content of diesel fuel and diesel fuel... methodology is provided in § 80.330(b). (b) Test method for sulfur—(1) For ECA marine fuel subject to the...

  4. A comparison of microscopic and spectroscopic identification methods for analysis of microplastics in environmental samples.

    PubMed

    Song, Young Kyoung; Hong, Sang Hee; Jang, Mi; Han, Gi Myung; Rani, Manviri; Lee, Jongmyoung; Shim, Won Joon

    2015-04-15

The analysis of microplastics in various environmental samples requires distinguishing microplastics from natural materials, yet the identification technique lacks a standardized protocol. Herein, stereomicroscope and Fourier transform infrared spectroscopy (FT-IR) identification methods for microplastics (<1 mm) were compared using the same samples from the sea surface microlayer (SML) and beach sand. With the stereomicroscope, fragmented microplastics were significantly (p < 0.05) underestimated and fibers were significantly overestimated, in both the SML and beach samples. The total abundance by FT-IR was higher than by microscope in both the SML and beach samples, but the difference was not significant (p > 0.05). The appropriate identification method should be chosen according to the number of samples and the microplastic size range of interest; selecting a suitable identification method for microplastics is crucial for evaluating microplastic pollution.

  5. Hyperspectral data influenced by sample matrix: the importance of building relevant reference spectral libraries to map materials of interest.

    PubMed

    Dillon, James C K; Bezerra, Leonardo; Del Pilar Sosa Peña, María; Neu-Baker, Nicole M; Brenner, Sara A

    2017-01-31

    Hyperspectral imaging (HSI) and mapping are increasingly used for visualization and identification of nanoparticles (NPs) in a variety of matrices, including aqueous suspensions and biological samples. Reference spectral libraries (RSLs) contain hyperspectral data collected from materials of known composition and are used to detect the known materials in experimental samples through a one-to-one pixel "mapping" process. In some HSI studies, RSLs created from raw NPs were used to map NPs in experimental samples in a different matrix; for example, RSLs created from NPs in suspension to map NPs in biological tissue. Others have utilized RSLs created from NPs in the same matrix. However, few studies have systematically compared hyperspectral data as a function of the matrix in which the NPs are found and its impact on mapping results. The objective of this study is to compare RSLs created from metal oxide NPs in aqueous suspensions to RSLs created from the same NPs in rat tissues following in vivo inhalation exposure, and to investigate the differences in mapping that result from the use of each RSL. Results demonstrate that the spectral profiles of these NPs are matrix dependent: RSLs created from NPs in positive control tissues mapped to experimental tissues more appropriately than RSLs created from NPs in suspension. Aqueous suspension RSLs mapped 0-602 out of 500,424 pixels per tissue image while tissue RSLs mapped 689-18,435 pixels for the same images. This study underscores the need for appropriate positive controls for the creation of RSLs for mapping NPs in experimental samples.

  6. A SIMPLE METHOD FOR DETERMINATION OF CAFFEINE CONTENT IN TEA SAMPLES

    PubMed Central

    Venkatesh, Sama; Swamy, M.M.; Reddy, Y.S.R.; Suresh, B.; Sethuraman, M.

    1994-01-01

The present communication describes a simple, modified colorimetric procedure for estimating the caffeine content of both commercial and locally available tea samples. Comparative data on caffeine content in different brands of tea samples are shown here. The present method is an improved procedure for estimating caffeine content directly from tea extracts. A possible explanation for the variability in caffeine content among samples is offered. PMID:22556672

  7. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.

  8. A unified method to process biosolids samples for the recovery of bacterial, viral, and helminth pathogens.

    PubMed

    Alum, Absar; Rock, Channah; Abbaszadegan, Morteza

    2014-01-01

    For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in residual biosolids. The current EPA methods for detecting these groups of pathogens in biosolids involve discrete steps: a separate sample is processed independently to quantify each group of pathogens. The aim of this study was to develop a unified method for simultaneously processing a single biosolids sample to recover bacterial, viral, and helminth pathogens. In the first stage of developing the simultaneous method, nine eluents were compared for their efficiency in recovering viruses from a spiked 100 g biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments optimized the performance of the glycine-based eluent under various procedural factors, such as solids-to-eluent ratio, stir time, and centrifugation conditions. Finally, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked into duplicate biosolids samples collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method, compared to 34% and 68% by the EPA methods, respectively. The unified sample processing method significantly reduces the time required to process biosolids samples for different groups of pathogens; it is less affected by the intrinsic variability of samples while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.

  9. Application of the van der Pauw method for samples with holes

    NASA Astrophysics Data System (ADS)

    Oh, Donggeon; Ahn, Cheongeung; Kim, Minchul; Park, Eung-Kyu; Kim, Yong-Sang

    2016-12-01

    Modifications of the original van der Pauw relation have been suggested recently. Unlike the original relation, these methods are applicable to samples with a hole, but applying them to samples with highly irregular geometry takes considerable time and effort. In this paper we therefore suggest two generalizations of the van der Pauw method that are applicable to two-dimensional homogeneous systems with a finite number of holes. Both methods rely on obtaining a crucial constant of the system, ν. The first method involves setting the probes far from each other while measuring a sample with a small hole, approximating a relation that gives ν as a linear function of the area ratio of the hole alone. The second method uses an identically shaped sample of known resistivity and thickness to obtain ν, which is believed to depend on the geometrical properties only. Unlike the earlier methods, which entailed complex procedures, the new methods require little in the way of measurement or computation. They will be very useful in experiments and industrial applications that require measuring the resistivity of samples.
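The original van der Pauw relation that these generalizations extend, exp(−πR₁/Rₛ) + exp(−πR₂/Rₛ) = 1, can be solved numerically for the sheet resistance Rₛ from two measured resistances. A minimal stdlib sketch using bisection follows; the function name and tolerance are illustrative, not from the paper:

```python
import math

def sheet_resistance(r1, r2, tol=1e-12):
    """Solve exp(-pi*r1/rs) + exp(-pi*r2/rs) = 1 for the sheet resistance rs.

    f(rs) rises monotonically from -1 (rs -> 0) toward +1 (rs -> inf),
    so a simple bisection brackets the unique root.
    """
    f = lambda rs: math.exp(-math.pi * r1 / rs) + math.exp(-math.pi * r2 / rs) - 1.0
    lo, hi = 1e-6 * max(r1, r2), 1e6 * max(r1, r2)
    while hi - lo > tol * hi:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid  # root lies below mid
        else:
            lo = mid  # root lies above mid
    return 0.5 * (lo + hi)

# Symmetric case: r1 == r2 == 1 ohm.
print(sheet_resistance(1.0, 1.0))
```

For a symmetric sample (R₁ = R₂ = R) this reduces to the well-known closed form Rₛ = πR/ln 2 ≈ 4.53 R.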

  10. A venue-based method for sampling hard-to-reach populations.

    PubMed Central

    Muhib, F. B.; Lin, L. S.; Stueve, A.; Miller, R. L.; Ford, W. L.; Johnson, W. D.; Smith, P. J.

    2001-01-01

    Constructing scientifically sound samples of hard-to-reach populations, also known as hidden populations, is a challenge for many research projects. Traditional sample survey methods, such as random sampling from telephone or mailing lists, can yield low numbers of eligible respondents while non-probability sampling introduces unknown biases. The authors describe a venue-based application of time-space sampling (TSS) that addresses the challenges of accessing hard-to-reach populations. The method entails identifying days and times when the target population gathers at specific venues, constructing a sampling frame of venue, day-time units (VDTs), randomly selecting and visiting VDTs (the primary sampling units), and systematically intercepting and collecting information from consenting members of the target population. This allows researchers to construct a sample with known properties, make statistical inference to the larger population of venue visitors, and theorize about the introduction of biases that may limit generalization of results to the target population. The authors describe their use of TSS in the ongoing Community Intervention Trial for Youth (CITY) project to generate a systematic sample of young men who have sex with men. The project is an ongoing community level HIV prevention intervention trial funded by the Centers for Disease Control and Prevention. The TSS method is reproducible and can be adapted to hard-to-reach populations in other situations, environments, and cultures. PMID:11889287
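The frame-construction and selection steps of time-space sampling described above can be sketched in a few lines of stdlib Python; the venue names, day-time units, and sample size below are hypothetical, not from the CITY project:

```python
import itertools
import random

# Hypothetical venues and day-time units -- illustrative names only.
venues = ["club_a", "park_b", "cafe_c"]
day_times = ["fri_20:00", "sat_14:00", "sat_21:00"]

# Sampling frame: every venue/day-time unit (VDT) where the target
# population is known to gather.
frame = list(itertools.product(venues, day_times))

# Primary sampling units: a simple random sample of VDTs to visit,
# where consenting members of the target population are then intercepted.
random.seed(42)  # fixed seed for a reproducible illustration
selected = random.sample(frame, k=4)
print(selected)
```

Because the VDTs are drawn with known selection probabilities, statistical inference to the larger population of venue visitors is possible, which non-probability intercepts do not allow.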

  11. Development and Evaluation of a Micro- and Nanoscale Proteomic Sample Preparation Method

    SciTech Connect

    Wang, Haixing H.; Qian, Weijun; Mottaz, Heather M.; Clauss, Therese R.W.; Anderson, David J.; Moore, Ronald J.; Camp, David G.; Khan, Arshad H.; Sforza, Daniel M.; Pallavicini, Maria; Smith, Desmond J.; Smith, Richard D.

    2005-10-05

    Efficient and effective sample preparation of micro- and nanoscale (microgram and nanogram) clinical specimens for proteomic applications is often difficult because of losses during processing steps. Herein we describe a simple “single-tube” preparation protocol appropriate for small proteomic samples using the organic co-solvent trifluoroethanol (TFE). TFE facilitates both protein extraction and protein denaturation without requiring a separate cleanup step, thus minimizing sample loss. The performance of the TFE method was initially evaluated by comparison to traditional detergent-based methods for relatively large-scale sample processing using human breast cancer cells and mouse brain tissue. The results demonstrated that for larger (milligram) samples the TFE protocol was comparable to the traditional detergent-based protocols in both sample recovery and peptide/protein identification. The effectiveness of this protocol for micro- and nanoscale sample processing was then evaluated for the extraction of proteins/peptides and shown to be effective for small mouse brain tissue samples (~20 μg total protein content) and for samples of ~5000 human breast cancer MCF-7 cells (~500 ng total protein content), where the detergent-based methods were ineffective due to losses during cleanup and transfer steps.

  12. GROUND WATER MONITORING AND SAMPLING: MULTI-LEVEL VERSUS TRADITIONAL METHODS WHAT'S WHAT?

    EPA Science Inventory

    After years of research and many publications, the question still remains: What is the best method to collect representative ground water samples from monitoring wells? Numerous systems and devices are currently available for obtaining both multi-level samples as well as traditi...

  13. Fast identification of microplastics in complex environmental samples by a thermal degradation method.

    PubMed

    Dümichen, Erik; Eisentraut, Paul; Bannick, Claus Gerhard; Barthel, Anne-Kathrin; Senz, Rainer; Braun, Ulrike

    2017-05-01

    To determine the relevance of microplastic particles in various environmental media, comprehensive investigations are needed. However, no analytical method exists for their fast identification and quantification. At present, optical spectroscopy methods such as IR and Raman imaging are used, but their time-consuming procedures and uncertain extrapolation make reliable monitoring difficult. Py-GC-MS is a standard method for analyzing polymers; however, because its sample amount is limited to about 0.5 mg, it is not suited to complex mixtures such as environmental samples. We therefore developed a new thermoanalytical method as a first step toward identifying microplastics in environmental samples. A sample amount of about 20 mg, which assures sample homogeneity, is subjected to complete thermal decomposition. The specific degradation products of each polymer are adsorbed on a solid-phase adsorber and subsequently analyzed by thermal desorption gas chromatography-mass spectrometry. For reliable identification, the specific degradation products of each polymer were selected first. Real environmental samples from aquatic (three different rivers) and terrestrial (biogas plant) systems were then screened for microplastics. Mainly polypropylene (PP), polyethylene (PE), and polystyrene (PS) were identified in the samples from the biogas plant, and PE and PS in the river samples. This was only the first step, however, and quantification measurements will follow.

  14. Effectiveness of Four Methods of Handling Missing Data Using Samples from a National Database.

    ERIC Educational Resources Information Center

    Witta, E. Lea

    The effectiveness of four methods of handling missing data in reproducing the target sample covariance matrix and mean vector was tested using three levels of incomplete cases: 30%, 50%, and 70%. Data were selected from the National Education Longitudinal Study (NELS) database. Three levels of sample sizes (500, 1000, and 2000) were used. The…
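The abstract does not name the four methods it tested, but two of the most common approaches to missing data, listwise deletion and mean imputation, can be sketched as follows; the toy dataset is illustrative only:

```python
# Toy dataset with missing values (None); rows are cases, columns are variables.
data = [[2.0, 4.0], [3.0, None], [5.0, 6.0], [None, 8.0]]

# Listwise deletion: discard every case with at least one missing value.
complete = [row for row in data if None not in row]

# Mean imputation: replace each missing value with its column's observed mean.
def impute_mean(rows):
    cols = list(zip(*rows))
    means = [sum(v for v in c if v is not None) / sum(v is not None for v in c)
             for c in cols]
    return [[m if v is None else v for v, m in zip(row, means)] for row in rows]

imputed = impute_mean(data)
print(complete)
print(imputed)
```

Listwise deletion shrinks the sample (and can bias it when data are not missing completely at random), while mean imputation preserves the sample size at the cost of understating variance; comparisons like the study above quantify how well each reproduces the target covariance matrix and mean vector.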

  15. Aerostat-lofted instrument and sampling method for determination of emissions from open area sources

    EPA Science Inventory

    An aerostat-borne instrument and sampling method was developed to characterize air samples from area sources, such as emissions from open burning. The 10 kg battery-powered instrument system, termed "the Flyer," is lofted with a helium-filled aerostat of 4 m nominal diameter and ...

  16. The "Closed School-Cluster" Method of Selecting a Probability Sample.

    ERIC Educational Resources Information Center

    Shaycoft, Marion F.

    In some educational research studies--particularly longitudinal studies requiring a probability sample of schools and spanning a wide range of grades--it is desirable to so select the sample that schools at different levels (e.g., elementary and secondary) "correspond." This has often proved unachievable, using standard methods of selecting school…

  17. Method and apparatus for measuring the NMR spectrum of an orientationally disordered sample

    DOEpatents

    Pines, Alexander; Samoson, Ago

    1990-01-01

    An improved NMR probe and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus mechanically varies the orientation of the sample such that the time average of two or more sets of spherical harmonic functions is zero.

  18. Comparison of alternative methods, sample grinds, and fermentation times for determining indigestible neutral detergent fiber

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The objectives of this study were to evaluate the effects of sample grind, fermentation method, and time on the determination of indigestible neutral detergent fiber (iNDF). Samples of: 1) alfalfa hay and silage, 2) corn stalks and silage, and 3) ryegrass and mixed grass hays were ground through 2-m...

  19. Design Study of Methods for Sampling Migrant and Seasonal Farm Workers. Final Report.

    ERIC Educational Resources Information Center

    Kalsbeek, William D.; Parker, Rebecca Robin

    This report describes efforts to develop sampling methods to be used in national or regional studies of migrant and seasonal farm workers (MSFWs). Several facets of the MSFWs' lifestyle create sampling difficulties. One is mobility. Although the dynamic nature of MSFWs' movement is partly understood, it is sufficiently unpredictable to create…

  20. Drying and storage methods affect cyfluthrin concentrations in exposed plant samples

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Standard procedures exist for collection and chemical analyses of pyrethroid insecticides in environmental matrices. However, less detail is given for drying and potential storage methods of plant samples prior to analyses. Due to equipment and financial limitations, immediate sample analysis is n...