Probabilistic inference using linear Gaussian importance sampling for hybrid Bayesian networks
NASA Astrophysics Data System (ADS)
Sun, Wei; Chang, K. C.
2005-05-01
Probabilistic inference for Bayesian networks is in general NP-hard, whether exact algorithms or approximate methods are used. However, for very complex networks, only approximate methods such as stochastic sampling can provide a solution under a given time constraint. Several simulation methods are currently available: logic sampling (the first proposed stochastic method for Bayesian networks), the likelihood weighting algorithm (the most commonly used simulation method because of its simplicity and efficiency), the Markov blanket scoring method, and the importance sampling algorithm. In this paper, we first briefly review and compare these simulation methods, and then propose an improved importance sampling algorithm, the linear Gaussian importance sampling algorithm for general hybrid models (LGIS). LGIS is aimed at hybrid Bayesian networks consisting of both discrete and continuous random variables with arbitrary distributions. It uses a linear function with additive Gaussian noise to approximate the true conditional probability distribution of a continuous variable given both its parents and the evidence in a Bayesian network. One of the most important features of the newly developed method is that it can adaptively learn the optimal importance function from previous samples. We test the inference performance of LGIS using a 16-node linear Gaussian model and a 6-node general hybrid model. A performance comparison with well-known methods such as the junction tree (JT) and likelihood weighting (LW) shows that LGIS is very promising.
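To make the comparison above concrete, the following minimal sketch implements likelihood weighting, the baseline simulation method the abstract singles out, on a hypothetical two-node discrete network. The network, its probabilities, and all names are our own illustration, not taken from the paper; the exact answer here is P(A=1 | B=1) = 0.27/0.41 ≈ 0.659.

```python
import random

# Hypothetical two-node network A -> B with binary variables (numbers invented).
P_A = 0.3                        # P(A = 1)
P_B_GIVEN_A = {1: 0.9, 0: 0.2}   # P(B = 1 | A)

def likelihood_weighting(n_samples, evidence_b=1):
    """Estimate P(A = 1 | B = evidence_b) by likelihood weighting."""
    num = den = 0.0
    for _ in range(n_samples):
        a = 1 if random.random() < P_A else 0        # sample the non-evidence node
        p_b1 = P_B_GIVEN_A[a]                        # evidence node stays fixed;
        w = p_b1 if evidence_b == 1 else 1.0 - p_b1  # its likelihood is the weight
        num += w * a
        den += w
    return num / den

print(likelihood_weighting(100_000))  # ≈ 0.659
```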
Some connections between importance sampling and enhanced sampling methods in molecular dynamics.
Lie, H C; Quer, J
2017-11-21
In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
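As a concrete illustration of the importance-sampling idea that links these method families, the following toy sketch estimates the rare-event probability P(Z > 4) for a standard normal by drawing from a shifted proposal and reweighting by the density ratio. The example is ours, not from either paper cited above.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
a = 4.0  # rare-event threshold; true P(Z > 4) ≈ 3.17e-5

# Naive Monte Carlo: almost no samples land in the tail.
naive = (rng.standard_normal(n) > a).mean()

# Importance sampling: draw from the shifted proposal N(a, 1) and reweight by
# the density ratio w(x) = phi(x) / phi(x - a) = exp(a**2 / 2 - a * x).
x = rng.normal(loc=a, scale=1.0, size=n)
w = np.exp(a**2 / 2 - a * x)
is_est = (w * (x > a)).mean()

print(naive, is_est)
```

The naive estimate is usually exactly zero at this sample size, while the importance-sampling estimate has small relative error; biasing sampling toward the rare event and correcting with weights is the common core of the enhanced sampling methods discussed above.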
Wiant, Harry V., Jr.; Spangler, Michael L.; Baumgras, John E.
2002-01-01
Various taper systems and the centroid method were compared to unbiased volume estimates made by importance sampling for 720 hardwood trees selected throughout the state of West Virginia. Only the centroid method consistently gave volume estimates that did not differ significantly from those made by importance sampling, although some taper equations did well for most...
Tao, Guohua; Miller, William H
2011-07-14
An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be applied generally to sample rare events efficiently while avoiding becoming trapped in a local region of phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
Methodology Series Module 5: Sampling Strategies.
Setia, Maninder Singh
2016-01-01
Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers or flipping a coin); and 2) non-probability sampling, based on the researcher's choice among the accessible and available population. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (for example, by using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
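A minimal sketch of the two probability sampling designs named above, on an invented sampling frame (all identifiers and counts are ours):

```python
import random

# Invented sampling frame: 1,000 subjects across two clinics.
population = [{"id": i, "clinic": "A" if i < 600 else "B"} for i in range(1000)]

# Simple random sample: every subject has an equal chance of selection.
srs = random.sample(population, 100)

# Stratified random sample: sample each clinic in proportion to its size
# (60 from clinic A, 40 from clinic B).
stratified = []
for clinic in ("A", "B"):
    stratum = [p for p in population if p["clinic"] == clinic]
    n = round(100 * len(stratum) / len(population))
    stratified.extend(random.sample(stratum, n))

print(len(srs), len(stratified))  # 100 100
```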
Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering
NASA Astrophysics Data System (ADS)
Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki
2018-03-01
We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
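The multiple importance sampling idea invoked here can be shown in one dimension with Veach's balance heuristic, combining two proposal densities for a single integral. This toy sketch is ours (densities, sample counts, and the integrand are invented) and is not the paper's SSAO pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: x**2                  # integrand on [0, 1]; true integral is 1/3
p1 = lambda x: np.ones_like(x)      # proposal 1: uniform density
p2 = lambda x: 2.0 * x              # proposal 2: linear density
n1 = n2 = 5_000

x1 = rng.uniform(size=n1)           # draws from p1
x2 = np.sqrt(rng.uniform(size=n2))  # inverse-CDF draws from p2

def balance_weight(x, n_self, p_self):
    # Balance heuristic: w_i(x) = n_i p_i(x) / sum_j n_j p_j(x).
    return n_self * p_self(x) / (n1 * p1(x) + n2 * p2(x))

est = (balance_weight(x1, n1, p1) * f(x1) / p1(x1)).sum() / n1 \
    + (balance_weight(x2, n2, p2) * f(x2) / p2(x2)).sum() / n2
print(est)  # ≈ 1/3
```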
Comparison of Two Methods for the Isolation of Salmonellae from Imported Foods
Taylor, W. I.; Hobbs, B. C.; Smith, M. E.
1964-01-01
Two methods for the detection of salmonellae in foods were compared in 179 imported meat and egg samples. The number of positive samples and replications, and the number of strains and kinds of serotypes were statistically comparable by both the direct enrichment method of the Food Hygiene Laboratory in England, and the pre-enrichment method devised for processed foods in the United States. Boneless frozen beef, veal, and horsemeat imported from five countries for consumption in England were found to have salmonellae present in 48 of 116 (41%) samples. Dried egg products imported from three countries were observed to have salmonellae in 10 of 63 (16%) samples. The high incidence of salmonellae isolated from imported foods illustrated the existence of an international health hazard resulting from the continuous introduction of exogenous strains of pathogenic microorganisms on a large scale.
Sampling High-Altitude and Stratified Mating Flights of Red Imported Fire Ant
USDA-ARS?s Scientific Manuscript database
With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens ...
Roh, Min K; Gillespie, Dan T; Petzold, Linda R
2010-11-07
The weighted stochastic simulation algorithm (wSSA) was developed by Kuwahara and Mura [J. Chem. Phys. 129, 165101 (2008)] to efficiently estimate the probabilities of rare events in discrete stochastic systems. The wSSA uses importance sampling to enhance the statistical accuracy in the estimation of the probability of the rare event. The original algorithm biases the reaction selection step with a fixed importance sampling parameter. In this paper, we introduce a novel method where the biasing parameter is state-dependent. The new method features improved accuracy, efficiency, and robustness.
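The flavor of weighted SSA can be conveyed by a minimal birth-death sketch with a fixed (state-independent) bias, i.e., the setting the paper improves on; all rates, the threshold, and the bias value below are invented:

```python
import random

# Birth-death process: 0 -> X at rate k_b; X -> 0 at rate k_d * x.
# Rare event: the population reaches `threshold` before time `t_max`.
k_b, k_d = 1.0, 0.1
threshold, t_max = 30, 10.0
gamma = 1.5   # fixed importance sampling bias favoring the birth reaction

def weighted_trajectory():
    x, t, w = 10, 0.0, 1.0
    while True:
        a_birth, a_death = k_b, k_d * x
        a0 = a_birth + a_death
        t += random.expovariate(a0)            # firing time from true propensities
        if t >= t_max:
            return 0.0                         # no firing before the deadline
        q_birth = gamma * a_birth / (gamma * a_birth + a_death)  # biased selection
        if random.random() < q_birth:
            w *= (a_birth / a0) / q_birth      # likelihood-ratio correction
            x += 1
            if x >= threshold:
                return w                       # rare event reached
        else:
            w *= (a_death / a0) / (1.0 - q_birth)
            x -= 1

n = 20_000
print(sum(weighted_trajectory() for _ in range(n)) / n)  # rare-event probability
```

The estimator is the mean weight of trajectories that reach the threshold; making the bias state-dependent, as the paper proposes, further improves accuracy and robustness.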
Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris
Michael S. Williams; Jeffrey H. Gove
2003-01-01
Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...
A flexible importance sampling method for integrating subgrid processes
Raut, E. K.; Larson, V. E.
2016-01-29
Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). Here, the resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
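A stripped-down sketch of the category-weighted idea (categories, fractions, and rates are invented; the real SILHS draws structured samples within categories rather than using a single rate per category):

```python
import random

# Toy grid box split into four categories with known area fractions and
# (made-up) constant process rates within each category.
categories = {
    "cloudy_rainy": {"frac": 0.05, "rate": 8.0},
    "cloudy_dry":   {"frac": 0.15, "rate": 2.0},
    "clear_rainy":  {"frac": 0.02, "rate": 5.0},
    "clear_dry":    {"frac": 0.78, "rate": 0.1},
}
# Modeler-prescribed sampling densities: oversample rare, high-rate categories.
q = {"cloudy_rainy": 0.4, "cloudy_dry": 0.2, "clear_rainy": 0.3, "clear_dry": 0.1}

def grid_box_average(n_samples):
    names = list(categories)
    total = 0.0
    for _ in range(n_samples):
        c = random.choices(names, weights=[q[m] for m in names])[0]
        cat = categories[c]
        total += (cat["frac"] / q[c]) * cat["rate"]   # importance weight f/q
    return total / n_samples

print(grid_box_average(10_000))  # exact grid-box average: 0.878
```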
Sampling Methods in Cardiovascular Nursing Research: An Overview.
Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie
2014-01-01
Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of specific participant samples is an important part of the research design and process. The sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study; they include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences in sampling techniques may aid nurses in effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
NASA Astrophysics Data System (ADS)
Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl
2018-06-01
In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
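For orientation, here is the classical double-loop Monte Carlo estimator of expected information gain that the paper accelerates, written for a made-up one-dimensional linear-Gaussian model whose analytic answer is known. Nothing below is taken from the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy model: y = theta + noise, theta ~ N(0, s2_prior), noise ~ N(0, s2_noise).
s2_prior, s2_noise = 1.0, 0.1

def log_lik(y, theta):
    return -0.5 * (np.log(2 * np.pi * s2_noise) + (y - theta) ** 2 / s2_noise)

def dlmc_eig(n_outer=2_000, n_inner=2_000):
    """Double-loop Monte Carlo estimate of the expected information gain."""
    theta = rng.normal(0.0, np.sqrt(s2_prior), n_outer)
    y = theta + rng.normal(0.0, np.sqrt(s2_noise), n_outer)
    # Inner loop: log evidence log p(y_i) averaged over fresh prior samples.
    inner = rng.normal(0.0, np.sqrt(s2_prior), n_inner)
    ll = log_lik(y[:, None], inner[None, :])              # (n_outer, n_inner)
    log_evidence = np.logaddexp.reduce(ll, axis=1) - np.log(n_inner)
    return np.mean(log_lik(y, theta) - log_evidence)

# Analytic EIG for this model: 0.5 * log(1 + s2_prior / s2_noise) ≈ 1.199
print(dlmc_eig())
```

The small-sample underflow and cost issues the abstract mentions live in the `log_evidence` line, which is exactly where the paper's Laplace-based importance sampling is applied.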
Gan, Yanjun; Duan, Qingyun; Gong, Wei; ...
2014-01-01
Sensitivity analysis (SA) is a commonly used approach for identifying important parameters that dominate model behaviors. We use a newly developed software package, a Problem Solving environment for Uncertainty Analysis and Design Exploration (PSUADE), to evaluate the effectiveness and efficiency of ten widely used SA methods, including seven qualitative and three quantitative ones. All SA methods are tested using a variety of sampling techniques to screen out the most sensitive (i.e., important) parameters from the insensitive ones. The Sacramento Soil Moisture Accounting (SAC-SMA) model, which has thirteen tunable parameters, is used for illustration. The South Branch Potomac River basin near Springfield, West Virginia in the U.S. is chosen as the study area. The key findings from this study are: (1) For qualitative SA methods, Correlation Analysis (CA), Regression Analysis (RA), and Gaussian Process (GP) screening methods are shown to be not effective in this example. Morris One-At-a-Time (MOAT) screening is the most efficient, needing only 280 samples to identify the most important parameters, but it is the least robust method. Multivariate Adaptive Regression Splines (MARS), Delta Test (DT), and Sum-Of-Trees (SOT) screening methods need about 400–600 samples for the same purpose. Monte Carlo (MC), Orthogonal Array (OA), and Orthogonal Array based Latin Hypercube (OALH) are appropriate sampling techniques for them; (2) For quantitative SA methods, at least 2777 samples are needed for the Fourier Amplitude Sensitivity Test (FAST) to identify parameter main effects. The McKay method needs about 360 samples to evaluate the main effect and more than 1000 samples to assess the two-way interaction effect. OALH and LPτ (LPTAU) sampling techniques are more appropriate for the McKay method. For the Sobol' method, the minimum number of samples needed is 1050 to compute the first-order and total sensitivity indices correctly. These comparisons show that qualitative SA methods are more efficient but less accurate and robust than quantitative ones.
Convergence and Efficiency of Adaptive Importance Sampling Techniques with Partial Biasing
NASA Astrophysics Data System (ADS)
Fort, G.; Jourdain, B.; Lelièvre, T.; Stoltz, G.
2018-04-01
We propose a new Monte Carlo method to efficiently sample a multimodal distribution (known up to a normalization constant). We consider a generalization of the discrete-time Self Healing Umbrella Sampling method, which can also be seen as a generalization of well-tempered metadynamics. The dynamics is based on an adaptive importance sampling technique. The importance function relies on the weights (namely, the relative probabilities) of disjoint sets which form a partition of the space. These weights are unknown but are learnt on the fly, yielding an adaptive algorithm. In the context of computational statistical physics, the logarithm of these weights is, up to an additive constant, the free energy, and the discrete-valued function defining the partition is called the collective variable. The algorithm falls into the general class of Wang-Landau type methods, and is a generalization of the original Self Healing Umbrella Sampling method in two ways: (i) the updating strategy leads to a larger penalization strength of already visited sets, in order to escape more quickly from metastable states, and (ii) the target distribution is biased using only a fraction of the free energy, in order to increase the effective sample size and reduce the variance of importance sampling estimators. We prove the convergence of the algorithm and analyze its efficiency numerically on a toy example.
Cao, Youfang; Liang, Jie
2013-01-01
Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscapes. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm, the adaptively biased sequential importance sampling (ABSIS) method, for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions and can adjust the bias adaptively at different steps of the sampling process, with the bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the recently developed finite buffer discrete chemical master equation (dCME) method to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscapes. PMID:23862966
Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W
2015-06-01
Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods (quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards) to determine the most practical methods for sampling the three most prominent species: Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time-by-sampling-method interaction, indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggest that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods: sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean are walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications when efficiency is not paramount.
Rodrigues, João Fabrício Mota; Coelho, Marco Túlio Pacheco
2016-01-01
Sampling biodiversity is an essential step for conservation, and understanding the efficiency of sampling methods allows us to estimate the quality of our biodiversity data. Sex ratio is an important population characteristic, but until now, no study has evaluated how efficient the sampling methods commonly used in biodiversity surveys are at estimating the sex ratio of populations. We used a virtual ecologist approach to investigate whether active and passive capture methods are able to accurately sample a population's sex ratio and whether differences in movement pattern and detectability between males and females produce biased estimates of sex ratios when using these methods. Our simulation allowed the recognition of individuals, similar to mark-recapture studies. We found that differences in both movement patterns and detectability between males and females produce biased estimates of sex ratios. However, increasing the sampling effort or the number of sampling days improves the ability of passive or active capture methods to properly sample the sex ratio. Thus, prior knowledge regarding movement patterns and detectability for species is important information to guide field studies aiming to understand sex-ratio-related patterns.
Maes, Sharon; Huu, Son Nguyen; Heyndrickx, Marc; Weyenberg, Stephanie van; Steenackers, Hans; Verplaetse, Alex; Vackier, Thijs; Sampers, Imca; Raes, Katleen; Reu, Koen De
2017-12-01
Biofilms are an important source of contamination in food companies, yet the composition of biofilms in practice is still mostly unknown. The chemical and microbiological characterization of surface samples taken after cleaning and disinfection is very important to distinguish free-living bacteria from the attached bacteria in biofilms. In this study, sampling methods that are potentially useful for both chemical and microbiological analyses of surface samples were evaluated. In the manufacturing facilities of eight Belgian food companies, surfaces were sampled after cleaning and disinfection using two sampling methods: the scraper-flocked swab method and the sponge stick method. Microbiological and chemical analyses were performed on these samples to evaluate the suitability of the sampling methods for the quantification of extracellular polymeric substance components and microorganisms originating from biofilms in these facilities. The scraper-flocked swab method was most suitable for chemical analyses of the samples because the material in these swabs did not interfere with determination of the chemical components. For microbiological enumerations, the sponge stick method was slightly but not significantly more effective than the scraper-flocked swab method. In all but one of the facilities, at least 20% of the sampled surfaces had more than 10² CFU/100 cm². Proteins were found in 20% of the chemically analyzed surface samples, and carbohydrates and uronic acids were found in 15 and 8% of the samples, respectively. When chemical and microbiological results were combined, 17% of the sampled surfaces were contaminated with both microorganisms and at least one of the analyzed chemical components; thus, these surfaces were characterized as carrying biofilm. Overall, microbiological contamination in the food industry is highly variable by food sector and even within a facility at various sampling points and sampling times.
Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, we...
Nelson, Jennifer Clark; Marsh, Tracey; Lumley, Thomas; Larson, Eric B; Jackson, Lisa A; Jackson, Michael L
2013-08-01
Estimates of treatment effectiveness in epidemiologic studies using large observational health care databases may be biased owing to inaccurate or incomplete information on important confounders. Study methods that collect and incorporate more comprehensive confounder data on a validation cohort may reduce confounding bias. We applied two such methods, namely imputation and reweighting, to Group Health administrative data (full sample) supplemented by more detailed confounder data from the Adult Changes in Thought study (validation sample). We used influenza vaccination effectiveness (with an unexposed comparator group) as an example and evaluated each method's ability to reduce bias using the control time period before influenza circulation. Both methods reduced, but did not completely eliminate, the bias compared with traditional effectiveness estimates that do not use the validation sample confounders. Although these results support the use of validation sampling methods to improve the accuracy of comparative effectiveness findings from health care database studies, they also illustrate that the success of such methods depends on many factors, including the ability to measure important confounders in a representative and large enough validation sample, the comparability of the full sample and validation sample, and the accuracy with which the data can be imputed or reweighted using the additional validation sample information.
Field efficiency and bias of snag inventory methods
Robert S. Kenning; Mark J. Ducey; John C. Brissette; Jeffery H. Gove
2005-01-01
Snags and cavity trees are important components of forests, but can be difficult to inventory precisely and are not always included in inventories because of limited resources. We tested the application of N-tree distance sampling as a time-saving snag sampling method and compared N-tree distance sampling to fixed-area sampling and modified horizontal line sampling in...
Comparing two sampling methods to engage hard-to-reach communities in research priority setting.
Valerio, Melissa A; Rodriguez, Natalia; Winkler, Paula; Lopez, Jaime; Dennison, Meagen; Liang, Yuanyuan; Turner, Barbara J
2016-10-28
Effective community-partnered and patient-centered outcomes research needs to address community priorities. However, optimal sampling methods to engage stakeholders from hard-to-reach, vulnerable communities to generate research priorities have not been identified. In two similar rural, largely Hispanic communities, a community advisory board guided recruitment of stakeholders affected by chronic pain using a different method in each community: 1) snowball sampling, a chain-referral method, or 2) purposive sampling to recruit diverse stakeholders. In both communities, three groups of stakeholders attended a series of three facilitated meetings to orient, brainstorm, and prioritize ideas (9 meetings/community). Using mixed methods analysis, we compared stakeholder recruitment and retention, as well as priorities from both communities' stakeholders, based on mean ratings of their ideas by importance and feasibility for implementation in their community. Of 65 eligible stakeholders in one community recruited by snowball sampling, 55 (85%) consented, 52 (95%) attended the first meeting, and 36 (65%) attended all 3 meetings. In the second community, the purposive sampling method was supplemented by convenience sampling to increase recruitment. Of 69 stakeholders recruited by this combined strategy, 62 (90%) consented, 36 (58%) attended the first meeting, and 26 (42%) attended all 3 meetings. Snowball sampling recruited more Hispanics and disabled persons (all P < 0.05). Despite differing recruitment strategies, stakeholders from the two communities identified largely similar ideas for research, focusing on non-pharmacologic interventions for management of chronic pain. Ratings on importance and feasibility for community implementation differed only on the importance of massage services (P = 0.045), which was higher for the purposive/convenience sampling group, and on city improvements/transportation services (P = 0.004), which was higher for the snowball sampling group. In each of the two similar hard-to-reach communities, a community advisory board partnered with researchers to implement a different sampling method to recruit stakeholders. The snowball sampling method achieved greater participation, with more Hispanics but also more individuals with disabilities, than the purposive-convenience sampling method. However, priorities for research on chronic pain from both stakeholder groups were similar. Although the snowball sampling method appears to be superior, further research is needed on implementation costs and resources.
NASA Astrophysics Data System (ADS)
Shayanfar, Mohsen Ali; Barkhordari, Mohammad Ali; Roudak, Mohammad Amin
2017-06-01
Monte Carlo simulation (MCS) is a useful tool for computing the probability of failure in reliability analysis. However, the large number of required random samples makes it time-consuming. The response surface method (RSM) is another common method in reliability analysis. Although RSM is widely used for its simplicity, it cannot be trusted in highly nonlinear problems due to its linear nature. In this paper, a new efficient algorithm employing the combination of importance sampling, as a class of MCS, and RSM is proposed. In the proposed algorithm, the analysis starts with importance sampling concepts, using a proposed two-step updating rule for the design point. This part finishes after a small number of samples have been generated. Then RSM starts to work using the Bucher experimental design, with the last design point and a proposed effective length as the center point and radius of Bucher's approach, respectively. Through illustrative numerical examples, the simplicity and efficiency of the proposed algorithm and the effectiveness of the proposed rules are shown.
An improved sampling method of complex network
NASA Astrophysics Data System (ADS)
Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing
2014-12-01
Subnet sampling is an important topic of complex network research. Sampling methods influence the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method can preserve the similarity between the sampled subnet and the original network in degree distribution, connectivity rate, and average shortest path. This method is applicable when prior knowledge about the degree distribution of the original network is insufficient.
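A toy snowball-expansion sketch on an adjacency-list graph (this shows only the snowball ingredient; the paper's RMSC procedure combines random seeding with repeated Cohen-style rounds, and all names and parameters below are ours):

```python
import random

def snowball_sample(graph, n_seeds=2, rounds=2, fanout=1):
    """Randomly seed, then repeatedly expand through sampled neighbors."""
    frontier = set(random.sample(list(graph), n_seeds))   # random seed stage
    sampled = set(frontier)
    for _ in range(rounds):                               # snowball stage
        nxt = set()
        for v in frontier:
            fresh = [u for u in graph[v] if u not in sampled]
            nxt.update(random.sample(fresh, min(fanout, len(fresh))))
        sampled |= nxt
        frontier = nxt
    return sampled

graph = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 4], 3: [1, 4], 4: [2, 3, 5], 5: [4]}
print(snowball_sample(graph))
```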
Importance Sampling Variance Reduction in GRESS ATMOSIM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wakeford, Daniel Tyler
This document is intended to introduce the importance sampling method of variance reduction to a Geant4 user for application to neutral particle Monte Carlo transport through the atmosphere, as implemented in GRESS ATMOSIM.
Annealed importance sampling with constant cooling rate
NASA Astrophysics Data System (ADS)
Giovannelli, Edoardo; Cardini, Gianni; Gellini, Cristina; Pietraperzia, Giangaetano; Chelli, Riccardo
2015-02-01
Annealed importance sampling is a simulation method devised by Neal [Stat. Comput. 11, 125 (2001)] to assign weights to configurations generated by simulated annealing trajectories. In particular, the equilibrium average of a generic physical quantity can be computed by a weighted average exploiting weights and estimates of this quantity associated to the final configurations of the annealed trajectories. Here, we review annealed importance sampling from the perspective of nonequilibrium path-ensemble averages [G. E. Crooks, Phys. Rev. E 61, 2361 (2000)]. The equivalence of Neal's and Crooks' treatments highlights the generality of the method, which goes beyond the mere thermal-based protocols. Furthermore, we show that a temperature schedule based on a constant cooling rate outperforms stepwise cooling schedules and that, for a given elapsed computer time, performances of annealed importance sampling are, in general, improved by increasing the number of intermediate temperatures.
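A compact sketch of annealed importance sampling itself, with a constant-rate (evenly spaced) inverse-temperature schedule; the one-dimensional densities are invented so the answer is known analytically, and the Metropolis kernel is deliberately minimal:

```python
import numpy as np

rng = np.random.default_rng(0)

log_f0 = lambda x: -0.5 * x**2 - 0.5 * np.log(2 * np.pi)  # N(0,1), normalized
log_fN = lambda x: -0.5 * (x - 3.0) ** 2                   # unnormalized target

def ais(n_chains=5_000, n_temps=100, step=0.5):
    betas = np.linspace(0.0, 1.0, n_temps + 1)  # constant-rate schedule
    x = rng.standard_normal(n_chains)           # start from the initial density
    log_w = np.zeros(n_chains)
    for b0, b1 in zip(betas[:-1], betas[1:]):
        # Weight update: f_{b1}(x) / f_{b0}(x) evaluated at the current state.
        log_w += (b1 - b0) * (log_fN(x) - log_f0(x))
        # One Metropolis move leaving f0^(1-b1) * fN^b1 invariant.
        log_p = lambda z: (1.0 - b1) * log_f0(z) + b1 * log_fN(z)
        prop = x + step * rng.standard_normal(n_chains)
        accept = np.log(rng.uniform(size=n_chains)) < log_p(prop) - log_p(x)
        x = np.where(accept, prop, x)
    return np.exp(log_w).mean()  # estimates Z_target; exact: sqrt(2*pi) ≈ 2.507

print(ais())
```

Equilibrium averages follow by weighting estimates from the final configurations with these same weights, exactly as the abstract describes.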
Importance sampling variance reduction for the Fokker–Planck rarefied gas particle method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Collyer, B. S.; Connaughton, C.
The Fokker–Planck approximation to the Boltzmann equation, solved numerically by stochastic particle schemes, is used to provide estimates for rarefied gas flows. This paper presents a variance reduction technique for a stochastic particle method that is able to greatly reduce the uncertainty of the estimated flow fields when the characteristic speed of the flow is small in comparison to the thermal velocity of the gas. The method relies on importance sampling, requiring minimal changes to the basic stochastic particle scheme. We test the importance sampling scheme on a homogeneous relaxation, planar Couette flow and a lid-driven-cavity flow, and find that our method is able to greatly reduce the noise of estimated quantities. Significantly, we find that as the characteristic speed of the flow decreases, the variance of the noisy estimators becomes independent of the characteristic speed.
Li, Longhai; Feng, Cindy X; Qiu, Shi
2017-06-30
An important statistical task in disease mapping problems is to identify divergent regions with unusually high or low risk of disease. Leave-one-out cross-validatory (LOOCV) model assessment is the gold standard for estimating predictive p-values that can flag such divergent regions. However, actual LOOCV is time-consuming because one needs to rerun a Markov chain Monte Carlo analysis for each posterior distribution in which an observation is held out as a test case. This paper introduces a new method, called integrated importance sampling (iIS), for estimating LOOCV predictive p-values with only Markov chain samples drawn from the posterior based on the full data set. The key step in iIS is that we integrate away the latent variables associated with the test observation with respect to their conditional distribution, without reference to the actual observation. By following the general theory for importance sampling, the formula used by iIS can be proved to be equivalent to the LOOCV predictive p-value. We compare iIS with three existing methods in the literature on two disease mapping datasets. Our empirical results show that the predictive p-values estimated with iIS are almost identical to the predictive p-values estimated with actual LOOCV and outperform those given by the three existing methods: posterior predictive checking, ordinary importance sampling, and the ghosting method of Marshall and Spiegelhalter (2003).
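As a point of reference, the ordinary importance sampling approach that iIS improves on can be sketched in a few lines for a made-up normal-mean model (data, prior, and sizes are all invented; scipy is assumed available):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Invented model: y_i ~ N(mu, 1) with a flat prior, so the posterior of mu
# given all data is N(mean(y), 1/n) and we can draw posterior samples directly.
y = rng.normal(1.0, 1.0, size=20)
mu = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=4_000)

def loo_predictive_pvalue(i):
    lik = norm.pdf(y[i], loc=mu, scale=1.0)  # p(y_i | mu) per posterior draw
    w = 1.0 / lik                            # IS weights: posterior -> LOO posterior
    p = norm.cdf(y[i], loc=mu, scale=1.0)    # P(Y_rep <= y_i | mu)
    return np.sum(w * p) / np.sum(w)         # self-normalized estimate

print([round(loo_predictive_pvalue(i), 3) for i in range(3)])
```

The instability of the `1 / lik` weights for surprising observations is the known weakness of this estimator; integrating out the test observation's latent variables, as iIS does, is designed to tame exactly these weights.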
Ribic, C.A.; Miller, T.W.
1998-01-01
We investigated CART performance with a unimodal response curve for one continuous response and four continuous explanatory variables, where two variables were important (i.e., directly related to the response) and the other two were not. We explored performance under three relationship strengths and two explanatory variable conditions: equal importance, and one variable four times as important as the other. We compared CART variable selection performance using three tree-selection rules ('minimum risk', 'minimum risk complexity', 'one standard error') to stepwise polynomial ordinary least squares (OLS) under four sample size conditions. The one-standard-error and minimum-risk-complexity methods performed about as well as stepwise OLS with large sample sizes when the relationship was strong. With weaker relationships, equally important explanatory variables, and larger sample sizes, the one-standard-error and minimum-risk-complexity rules performed better than stepwise OLS. With weaker relationships and explanatory variables of unequal importance, tree-structured methods did not perform as well as stepwise OLS. Comparing performance within the tree-structured methods, the one-standard-error rule was more likely to choose the correct model than were the other tree-selection rules 1) with a strong relationship and equally important explanatory variables; 2) with weaker relationships and equally important explanatory variables; and 3) under all relationship strengths when explanatory variables were of unequal importance and sample sizes were lower.
External Standards or Standard Addition? Selecting and Validating a Method of Standardization
NASA Astrophysics Data System (ADS)
Harvey, David T.
2002-05-01
A common feature of many problem-based laboratories in analytical chemistry is a lengthy independent project involving the analysis of "real-world" samples. Students research the literature, adapting and developing a method suitable for their analyte, sample matrix, and problem scenario. Because these projects encompass the complete analytical process, students must consider issues such as obtaining a representative sample, selecting a method of analysis, developing a suitable standardization, validating results, and implementing appropriate quality assessment/quality control practices. Most textbooks and monographs suitable for an undergraduate course in analytical chemistry, however, provide only limited coverage of these important topics. The need for short laboratory experiments emphasizing important facets of method development, such as selecting a method of standardization, is evident. The experiment reported here, which is suitable for an introductory course in analytical chemistry, illustrates the importance of matrix effects when selecting a method of standardization. Students also learn how a spike recovery is used to validate an analytical method, and obtain a practical experience in the difference between performing an external standardization and a standard addition.
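A worked sketch of the standard addition computation itself, with invented numbers: spike known amounts of analyte into equal aliquots, fit signal versus added concentration, and read the original concentration off the x-intercept.

```python
import numpy as np

added = np.array([0.0, 1.0, 2.0, 3.0])       # added standard, ppm (invented)
signal = np.array([0.20, 0.32, 0.44, 0.56])  # instrument response (invented)

slope, intercept = np.polyfit(added, signal, 1)
c_analyte = intercept / slope                # magnitude of the x-intercept
print(f"estimated concentration: {c_analyte:.2f} ppm")  # 1.67 ppm
```

Because the standards are spiked into the sample itself, the fitted slope carries the matrix effect, which is why standard addition is preferred when external standards fail.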
Tao, Guohua; Miller, William H
2012-09-28
An efficient time-dependent (TD) Monte Carlo (MC) importance sampling method has recently been developed [G. Tao and W. H. Miller, J. Chem. Phys. 135, 024104 (2011)] for the evaluation of time correlation functions using the semiclassical (SC) initial value representation (IVR) methodology. In this TD-SC-IVR method, the MC sampling uses information from both time-evolved phase points and their initial values, and only the "important" trajectories are sampled frequently. Even though the TD-SC-IVR was shown in some benchmark examples to be much more efficient than the traditional time-independent sampling method (which uses only initial conditions), the calculation of the SC prefactor, which is computationally expensive, especially for large systems, is still required for accepted trajectories. In the present work, we present an approximate implementation of the TD-SC-IVR method that is completely prefactor-free; it gives the time correlation function as a classical-like magnitude function multiplied by a phase function. Application of this approach to flux-flux correlation functions (which yield reaction rate constants) for the benchmark H + H₂ system shows very good agreement with exact quantum results. Limitations of the approximate approach are also discussed.
40 CFR 80.330 - What are the sampling and testing requirements for refiners and importers?
Code of Federal Regulations, 2010 CFR
2010-07-01
... an exemption under this section, upon notification from EPA, the refiner's exemption will be void ab initio. (b) Sampling methods. For purposes of paragraph (a) of this section, refiners and importers shall...
Chao, Li-Wei; Szrek, Helena; Peltzer, Karl; Ramlagan, Shandir; Fleming, Peter; Leite, Rui; Magerman, Jesswill; Ngwenya, Godfrey B.; Pereira, Nuno Sousa; Behrman, Jere
2011-01-01
Finding an efficient method for sampling micro- and small-enterprises (MSEs) for research and statistical reporting purposes is a challenge in developing countries, where registries of MSEs are often nonexistent or outdated. This lack of a sampling frame creates an obstacle in finding a representative sample of MSEs. This study uses computer simulations to draw samples from a census of businesses and non-businesses in the Tshwane Municipality of South Africa, using three different sampling methods: the traditional probability sampling method, the compact segment sampling method, and the World Health Organization’s Expanded Programme on Immunization (EPI) sampling method. Three mechanisms by which the methods could differ are tested, the proximity selection of respondents, the at-home selection of respondents, and the use of inaccurate probability weights. The results highlight the importance of revisits and accurate probability weights, but the lesser effect of proximity selection on the samples’ statistical properties. PMID:22582004
Estimating the Effective Sample Size of Tree Topologies from Bayesian Phylogenetic Analyses
Lanfear, Robert; Hua, Xia; Warren, Dan L.
2016-01-01
Bayesian phylogenetic analyses estimate posterior distributions of phylogenetic tree topologies and other parameters using Markov chain Monte Carlo (MCMC) methods. Before making inferences from these distributions, it is important to assess their adequacy. To this end, the effective sample size (ESS) estimates how many truly independent samples of a given parameter the output of the MCMC represents. The ESS of a parameter is frequently much lower than the number of samples taken from the MCMC because sequential samples from the chain can be non-independent due to autocorrelation. Typically, phylogeneticists use a rule of thumb that the ESS of all parameters should be greater than 200. However, we have no method to calculate an ESS of tree topology samples, despite the fact that the tree topology is often the parameter of primary interest and is almost always central to the estimation of other parameters. That is, we lack a method to determine whether we have adequately sampled one of the most important parameters in our analyses. In this study, we address this problem by developing methods to estimate the ESS for tree topologies. We combine these methods with two new diagnostic plots for assessing posterior samples of tree topologies, and compare their performance on simulated and empirical data sets. Combined, the methods we present provide new ways to assess the mixing and convergence of phylogenetic tree topologies in Bayesian MCMC analyses. PMID:27435794
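For a scalar parameter, the autocorrelation-based ESS behind the rule of thumb above can be computed as below (a standard initial-positive-sequence truncation; this is the conventional scalar rule, not the paper's new topology ESS):

```python
import numpy as np

def effective_sample_size(x):
    """ESS = n / (1 + 2 * sum of positive-lag autocorrelations)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[n - 1:] / (x.var() * n)
    s = 0.0
    for rho in acf[1:]:
        if rho <= 0:          # truncate at the first non-positive lag
            break
        s += rho
    return n / (1.0 + 2.0 * s)

# AR(1) chain with autocorrelation 0.9: ESS should be roughly n / 19.
rng = np.random.default_rng(7)
x = np.zeros(10_000)
for t in range(1, len(x)):
    x[t] = 0.9 * x[t - 1] + rng.standard_normal()
print(effective_sample_size(x))
```

The paper's contribution is precisely that no analogue of this computation existed for tree topologies, which are not scalars.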
Reduction of aflatoxin in rice by different cooking methods.
Sani, Ali Mohamadi; Azizi, Eisa Gholampour; Salehi, Esmaeel Ataye; Rahimi, Khadije
2014-07-01
Rice (Oryza sativa Linn.) is one of the basic diets in the north of Iran. The aim of the present study was to detect total aflatoxin (AFT) in domestic and imported rice in Amol (in the north of Iran) and to evaluate the effect of different cooking methods on the levels of the toxin. For this purpose, 42 rice samples were collected from retail stores. The raw samples were analysed by an enzyme-linked immunosorbent assay (ELISA) for toxin assessment and then submitted to two different cooking methods: a traditional local method and a rice cooker. After treatment, AFT was determined. Results show that the average concentration of AFT in domestic and imported samples was 1.08 ± 0.02 and 1.89 ± 0.87 ppb, respectively, which is lower than national and European Union standards. The highest AFT reduction (24.8%) was observed when rice samples were cooked in a rice cooker, but the difference from the local method was not statistically significant (p > 0.05).
An Overview of Conventional and Emerging Analytical Methods for the Determination of Mycotoxins
Cigić, Irena Kralj; Prosen, Helena
2009-01-01
Mycotoxins are a group of compounds produced by various fungi and excreted into the matrices on which they grow, often food intended for human consumption or animal feed. The high toxicity and carcinogenicity of these compounds and their ability to cause various pathological conditions has led to widespread screening of foods and feeds potentially polluted with them. Maximum permissible levels in different matrices have also been established for some toxins. As these are quite low, analytical methods for determination of mycotoxins have to be both sensitive and specific. In addition, an appropriate sample preparation and pre-concentration method is needed to isolate analytes from rather complicated samples. In this article, an overview of methods for analysis and sample preparation published in the last ten years is given for the most often encountered mycotoxins in different samples, mainly in food. Special emphasis is on liquid chromatography with fluorescence and mass spectrometric detection, while in the field of sample preparation various solid-phase extraction approaches are discussed. However, an overview of other analytical and sample preparation methods less often used is also given. Finally, different matrices where mycotoxins have to be determined are discussed with the emphasis on their specific characteristics important for the analysis (human food and beverages, animal feed, biological samples, environmental samples). Various issues important for accurate qualitative and quantitative analyses are critically discussed: sampling and choice of representative sample, sample preparation and possible bias associated with it, specificity of the analytical method and critical evaluation of results. PMID:19333436
NASA Astrophysics Data System (ADS)
Theunissen, Raf; Kadosh, Jesse S.; Allen, Christian B.
2015-06-01
Spatially varying signals are typically sampled by collecting uniformly spaced samples irrespective of the signal content. For signals with inhomogeneous information content, this leads to unnecessarily dense sampling in regions of low interest or insufficient sample density at important features, or both. A new adaptive sampling technique is presented directing sample collection in proportion to local information content, capturing adequately the short-period features while sparsely sampling less dynamic regions. The proposed method incorporates a data-adapted sampling strategy on the basis of signal curvature, sample space-filling, variable experimental uncertainty and iterative improvement. Numerical assessment has indicated a reduction in the number of samples required to achieve a predefined uncertainty level overall while improving local accuracy for important features. The potential of the proposed method has been further demonstrated on the basis of Laser Doppler Anemometry experiments examining the wake behind a NACA0012 airfoil and the boundary layer characterisation of a flat plate.
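The core idea, sampling density proportional to local information content, can be sketched in a few lines. The curvature measure, the baseline weight, and the test function below are assumptions for illustration; the paper's full method additionally handles space-filling, variable experimental uncertainty, and iterative improvement.

```python
import numpy as np

def curvature_weighted_samples(f, a, b, n_coarse=200, n_samples=40):
    """Place sample locations with density proportional to local |curvature|."""
    x = np.linspace(a, b, n_coarse)
    y = f(x)
    curv = np.abs(np.gradient(np.gradient(y, x), x))
    w = curv + 0.05 * curv.max()       # small baseline keeps flat regions covered
    cdf = np.cumsum(w)
    cdf /= cdf[-1]
    u = (np.arange(n_samples) + 0.5) / n_samples
    return np.interp(u, cdf, x)        # invert the weight CDF

# Samples cluster around the sharp transition at x = 0.5:
print(curvature_weighted_samples(lambda t: np.tanh(20 * (t - 0.5)), 0.0, 1.0))
```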
Capel, P.D.; Larson, S.J.
1995-01-01
Minimizing the loss of target organic chemicals from environmental water samples between the time of sample collection and isolation is important to the integrity of an investigation. During this sample holding time, there is a potential for analyte loss through volatilization from the water to the headspace, sorption to the walls and cap of the sample bottle, and transformation through biotic and/or abiotic reactions. This paper presents a chemodynamic-based, generalized approach to estimating the most probable loss processes for individual target organic chemicals. The basic premise is that the investigator must know which loss process(es) are important for a particular analyte, based on its chemodynamic properties, when choosing the appropriate method(s) to prevent loss.
GROUND WATER SAMPLING USING LOW-FLOW TECHNIQUES
Obtaining representative ground water samples is important for site assessment and remedial performance monitoring objectives. The sampling device or method used to collect samples from monitoring or compliance wells can significantly impact data quality and reliability. Low-flo...
[Sampling methods for PM2.5 from stationary sources: a review].
Jiang, Jing-Kun; Deng, Jian-Guo; Li, Zhen; Li, Xing-Hua; Duan, Lei; Hao, Ji-Ming
2014-05-01
The new China national ambient air quality standard was published in 2012 and will be implemented in 2016. To meet the requirements in this new standard, monitoring and controlling PM2.5 emission from stationary sources are very important. However, so far there is no national standard method for sampling PM2.5 from stationary sources. Different sampling methods for PM2.5 from stationary sources and relevant international standards were reviewed in this study. This includes methods for PM2.5 sampling in flue gas and methods for PM2.5 sampling after dilution. Both advantages and disadvantages of these sampling methods were discussed. For environmental management, methods for PM2.5 sampling in flue gas, such as the impactor and virtual impactor, were suggested as a standard to determine filterable PM2.5. To evaluate the environmental and health effects of PM2.5 from stationary sources, a standard dilution method for sampling of total PM2.5 should be established.
Salter, Robert; Holmes, Steven; Legg, David; Coble, Joel; George, Bruce
2012-02-01
Pork tissue samples that tested positive and negative by the Charm II tetracycline test screening method in the slaughter plant laboratory were tested with the modified AOAC International liquid chromatography tandem mass spectrometry (LC-MS-MS) method 995.09 to determine the predictive value of the screening method at detecting total tetracyclines at 10 μg/kg of tissue, in compliance with Russian import regulations. There were 218 presumptive-positive tetracycline samples of 4,195 randomly tested hogs. Of these screening test positive samples, 83% (182) were positive, >10 μg/kg by LC-MS-MS; 12.8% (28) were false violative, greater than limit of detection (LOD) but <10 μg/kg; and 4.2% (8) were not detected at the LC-MS-MS LOD. The 36 false-violative and not-detected samples represent 1% of the total samples screened. Twenty-seven of 30 randomly selected tetracycline screening negative samples tested below the LC-MS-MS LOD, and 3 samples tested <3 μg/kg chlortetracycline. Results indicate that the Charm II tetracycline test is effective at predicting hogs containing >10 μg/kg total tetracyclines in compliance with Russian import regulations.
Zapka, C; Leff, J; Henley, J; Tittl, J; De Nardo, E; Butler, M; Griggs, R; Fierer, N; Edmonds-Wilson, S
2017-03-28
Hands play a critical role in the transmission of microbiota on one's own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples ( P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS ( P < 0.05) and ethanol control ( P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. IMPORTANCE The hand microbiome is a critical area of research for diverse fields, such as public health and forensics. The suitability of culture-independent methods for assessing effects of hygiene products on microbiota has not been demonstrated. This is the first controlled laboratory clinical hand study to have compared traditional hand hygiene test methods with newer culture-independent characterization methods typically used by skin microbiologists. This study resulted in recommendations for hand hygiene product testing, development of methods, and future hand skin microbiome research. It also demonstrated the importance of inclusion of skin physiological metadata in skin microbiome research, which is atypical for skin microbiome studies. Copyright © 2017 Zapka et al.
GROUND WATER SAMPLING FOR VERTICAL PROFILING OF CONTAMINANTS
Accurate delineation of plume boundaries and vertical contaminant distribution are necessary in order to adequately characterize waste sites and determine remedial strategies to be employed. However, it is important to consider the sampling objectives, sampling methods, and sampl...
Adaptive Importance Sampling for Control and Inference
NASA Astrophysics Data System (ADS)
Kappen, H. J.; Ruiz, H. C.
2016-03-01
Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within the PI theory, the question of how to compute becomes a question of importance sampling. Efficient importance samplers are state-feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows feedback controllers to be learned with an arbitrary parametrisation. We refer to this method as the path integral cross entropy method or PICE. We illustrate this method with some simple examples. The PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
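As a toy illustration of the cross-entropy idea behind PICE, reduced to a static rare-event problem rather than a feedback-control one, the sketch below adapts the mean of a Gaussian importance sampler toward a rare set and then forms the importance-weighted estimate. All numbers and the elite-fraction choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def cross_entropy_is(level=4.0, n=20000, iters=6):
    """Adapt the mean of a unit-variance Gaussian proposal to estimate
    P(X > level) for X ~ N(0,1) via the cross-entropy method."""
    mu = 0.0
    for _ in range(iters):
        x = rng.normal(mu, 1.0, n)
        gamma = min(level, np.quantile(x, 0.9))  # cap the elite threshold at level
        mu = x[x >= gamma].mean()                # CE update of the proposal mean
    x = rng.normal(mu, 1.0, n)
    logw = -0.5 * x**2 + 0.5 * (x - mu) ** 2     # log[target / proposal], same sigma
    return np.mean((x > level) * np.exp(logw))

print(cross_entropy_is())   # compare with the exact value 1 - Phi(4) ~ 3.17e-5
```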
Masuyama, Kotoka; Shojo, Hideki; Nakanishi, Hiroaki; Inokuchi, Shota; Adachi, Noboru
2017-01-01
Sex determination is important in archeology and anthropology for the study of past societies, cultures, and human activities. Sex determination is also one of the most important components of individual identification in criminal investigations. We developed a new method of sex determination by detecting a single-nucleotide polymorphism in the amelogenin gene using amplified product-length polymorphisms in combination with sex-determining region Y analysis. We particularly focused on the most common types of postmortem DNA damage in ancient and forensic samples: fragmentation and nucleotide modification resulting from deamination. Amplicon size was designed to be less than 60 bp to make the method more useful for analyzing degraded DNA samples. All DNA samples collected from eight Japanese individuals (four male, four female) were evaluated correctly using our method. The detection limit for accurate sex determination was determined to be 20 pg of DNA. We compared our new method with commercial short tandem repeat analysis kits using DNA samples artificially fragmented by ultraviolet irradiation. Our novel method was the most robust for highly fragmented DNA samples. To deal with allelic dropout resulting from deamination, we adopted “bidirectional analysis,” which analyzed samples from both sense and antisense strands. This new method was applied to 14 Jomon individuals (3500-year-old bone samples) whose sex had been identified morphologically. We could correctly identify the sex of 11 out of 14 individuals. These results show that our method is reliable for the sex determination of highly degenerated samples. PMID:28052096
Volatile organic compounds: sampling methods and their worldwide profile in ambient air.
Kumar, Anuj; Víden, Ivan
2007-08-01
The atmosphere is a particularly difficult analytical system because of the very low levels of substances to be analysed, sharp variations in pollutant levels with time and location, and differences in wind, temperature and humidity. This makes the selection of an efficient sampling technique for air analysis a key step toward reliable results. Generally, methods for volatile organic compound sampling involve either collection of whole air or preconcentration of samples on adsorbents. The methods differ from each other in sampling technique, type of sorbent, extraction method and identification technique. In this review paper we discuss important aspects of sampling volatile organic compounds with both widely used and more advanced methods. Characteristics of various adsorbents used for VOC sampling are also described. Furthermore, this paper comprehensively reviews the concentration levels of volatile organic compounds in major cities of the world, along with the methodology used for their analysis.
van der Elst, Kim C. M.; Span, Lambert F. R.; van Hateren, Kai; Vermeulen, Karin M.; van der Werf, Tjip S.; Greijdanus, Ben; Kosterink, Jos G. W.; Uges, Donald R. A.
2013-01-01
Invasive aspergillosis and candidemia are important causes of morbidity and mortality in immunocompromised and critically ill patients. The triazoles voriconazole, fluconazole, and posaconazole are widely used for the treatment and prophylaxis of these fungal infections. Due to the variability of the pharmacokinetics of the triazoles among and within individual patients, therapeutic drug monitoring is important for optimizing the efficacy and safety of antifungal treatment. A dried blood spot (DBS) analysis was developed and was clinically validated for voriconazole, fluconazole, and posaconazole in 28 patients. Furthermore, a questionnaire was administered to evaluate the patients' opinions of the sampling method. The DBS analytical method showed linearity over the concentration range measured for all triazoles. Results for accuracy and precision were within accepted ranges; samples were stable at room temperature for at least 12 days; and different hematocrit values and blood spot volumes had no significant influence. The ratio of the drug concentration in DBS samples to that in plasma was 1.0 for voriconazole and fluconazole and 0.9 for posaconazole. Sixty percent of the patients preferred DBS analysis as a sampling method; 15% preferred venous blood sampling; and 25% had no preferred method. There was significantly less perception of pain with the DBS sampling method (P = 0.021). In conclusion, DBS analysis is a reliable alternative to venous blood sampling and can be used for therapeutic drug monitoring of voriconazole, fluconazole, and posaconazole. Patients were satisfied with DBS sampling and had less pain than with venous sampling. Most patients preferred DBS sampling to venous blood sampling. PMID:23896473
Recommendations for level-determined sampling in wells
NASA Astrophysics Data System (ADS)
Lerner, David N.; Teutsch, Georg
1995-10-01
Level-determined samples of groundwater are increasingly important for hydrogeological studies. The techniques for collecting them range from the use of purpose drilled wells, sometimes with sophisticated dedicated multi-level samplers in them, to a variety of methods used in open wells. Open, often existing, wells are frequently used on cost grounds, but there are risks of obtaining poor and unrepresentative samples. Alternative approaches to level-determined sampling incorporate seven concepts: depth sampling; packer systems; individual wells; dedicated multi-level systems; separation pumping; baffle systems; multi-port sock samplers. These are outlined and evaluated in terms of the environment to be sampled, and the features and performance of the methods. Recommendations are offered to match methods to sampling problems.
Two research studies funded and overseen by EPA have been conducted since October 2006 on soil gas sampling methods and variations in shallow soil gas concentrations with the purpose of improving our understanding of soil gas methods and data for vapor intrusion applications. Al...
Rapid method for the determination of 226Ra in hydraulic fracturing wastewater samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Sherrod L.; Culligan, Brian K.; Warren, Richard A.
A new method that rapidly preconcentrates and measures 226Ra from hydraulic fracturing wastewater samples was developed in the Savannah River Environmental Laboratory. The method improves the quality of 226Ra measurements using gamma spectrometry by providing up to 100x preconcentration of 226Ra from this difficult sample matrix, which contains very high levels of calcium, barium, strontium, magnesium and sodium. The high chemical yield, typically 80-90%, facilitates a low detection limit, important for lower level samples, and indicates method ruggedness. Ba-133 tracer is used to determine chemical yield and correct for geometry-related counting issues. The 226Ra sample preparation takes < 2 hours.
Testing sample stability using four storage methods and the macroalgae Ulva and Gracilaria
Concern over the relative importance of different sample preparation and storage techniques frequently used in stable isotope analysis of particulate nitrogen (δ15N) and carbon (δ13C) prompted an experiment to determine how important such factors were to measured values in marine...
SAMPL4 & DOCK3.7: lessons for automated docking procedures
NASA Astrophysics Data System (ADS)
Coleman, Ryan G.; Sterling, Teague; Weiss, Dahlia R.
2014-03-01
The SAMPL4 challenges were used to test current automated methods for solvation energy, virtual screening, pose and affinity prediction of the molecular docking pipeline DOCK 3.7. Additionally, first-order models of binding affinity were proposed as milestones for any method predicting binding affinity. Several important discoveries about the molecular docking software were made during the challenge: (1) solvation energies of ligands were five-fold worse than those of any other method used in SAMPL4, including methods that were similarly fast; (2) HIV Integrase is a challenging target, but automated docking on the correct allosteric site performed well in terms of virtual screening and pose prediction (compared to other methods), while affinity prediction, as expected, was very poor; (3) molecular docking grid sizes can be very important: serious errors were discovered with default settings, which have been adjusted for all future work. Overall, lessons from SAMPL4 suggest many changes to molecular docking tools, not just DOCK 3.7, that could improve the state of the art. Future difficulties and projects will be discussed.
Molecular diagnosis of strongyloidiasis in a population of an endemic area through nested-PCR.
Sharifdini, Meysam; Keyhani, Amir; Eshraghian, Mohammad Reza; Beigom Kia, Eshrat
2018-01-01
This study aimed to diagnose and analyze strongyloidiasis in a population of an endemic area of Iran using nested-PCR coupled with parasitological methods. Screening of strongyloidiasis-infected people using reliable diagnostic techniques is essential to decrease the mortality and morbidity associated with this infection. Molecular methods have proved to be highly sensitive and specific for detection of Strongyloides stercoralis in stool samples. A total of 155 fresh single stool samples were randomly collected from residents of the north and northwest of Khouzestan Province, Iran. All samples were examined by parasitological methods, including formalin-ether concentration and nutrient agar plate culture, and by the molecular method of nested-PCR. Infections with S. stercoralis were analyzed according to demographic criteria. Based on the results of the nested-PCR method, 15 cases (9.7%) were strongyloidiasis positive. Nested-PCR was more sensitive than parasitological techniques on single stool sampling. Old age was the most important demographic factor associated with higher rates of S. stercoralis infection. In endemic areas of S. stercoralis, old age should be considered one of the most important risk factors for infection, especially among immunosuppressed individuals.
Standard methods for sampling North American freshwater fishes
Bonar, Scott A.; Hubert, Wayne A.; Willis, David W.
2009-01-01
This important reference book provides standard sampling methods recommended by the American Fisheries Society for assessing and monitoring freshwater fish populations in North America. Methods apply to ponds, reservoirs, natural lakes, and streams and rivers containing cold and warmwater fishes. Range-wide and eco-regional averages for indices of abundance, population structure, and condition for individual species are supplied to facilitate comparisons of standard data among populations. Provides information on converting nonstandard to standard data, statistical and database procedures for analyzing and storing standard data, and methods to prevent transfer of invasive species while sampling.
Evon, Donna M.; Golin, Carol E.; Stoica, Teodora; Jones, Rachel; Willis, Sarah; Galanko, Joseph; Fried, Michael W.
2017-01-01
Background & Objectives Multiple treatment options with direct-acting antivirals (DAAs) are now available for hepatitis C virus (HCV). Study aims were to understand (1) The informational topics patients want to have to make informed treatment decisions; (2) The importance patients place on each topic; and (3) The topics patients prioritize as most important. Methods Mixed methods study utilizing two samples recruited from an academic liver center. Participants were not currently on treatment. Sample I (n=45) free-listed all informational topics deemed important to decision-making. Raw responses were coded into several broad and subcategories. Sample II (n=38) rated the importance of the subcategories from Sample I and ranked their highest priorities on two surveys, one containing topics for which sufficient research existed to inform patients (“static”), and the other containing topics that would require additional research. Results The topics listed by Sample I fell into 6 broad categories with 17 total subcategories. The most oft-cited informational topics were harms of treatment (100%), treatment benefits (62%) and treatment regimen details (84%). Sample II rated 16 of 17 subcategories as “pretty important’ or “extremely important.” Sample II prioritized (1) viral cure, (2) long-term survival, and (3) side effects on the survey of topics requiring additional research, and (1) liver disease, (2) lifestyle changes, and (3) medication details on the second survey of the most important static topics patients needed. Conclusions Patients weighed several informational topics to make an informed decision about HCV treatment. These findings lay the groundwork for future patient-centered outcomes research in HCV and patient-provider communication to enhance patients’ informed decision-making regarding DAA treatment options. PMID:27882509
2014-01-01
Background This study compares the efficiency of identifying the plants in an area of semi-arid Northeast Brazil by methods that a) access the local knowledge used in ethnobotanical studies using semi-structured interviews conducted within the entire community, an inventory interview conducted with two participants using the previously collected vegetation inventory, and a participatory workshop presenting exsiccates and photographs to 32 people and b) inventory the vegetation (phytosociology) in locations with different histories of disturbance using rectangular plots and quadrant points. Methods The proportion of species identified using each method was then compared with Cochran’s Q test. We calculated the use value (UV) of each species using semi-structured interviews; this quantitative index was correlated against values of the vegetation’s structural importance obtained from the sample plot method and point-centered quarter method applied in two areas with different historical usage. The analysis sought to correlate the relative importance of plants to the local community (use value - UV) with the ecological importance of the plants in the vegetation structure (importance value - IV; relative density - RD) by using different sampling methods to analyze the two areas. Results With regard to the methods used for accessing the local knowledge, a difference was observed among the ethnobotanical methods of surveying species (Q = 13.37, df = 2, p = 0.0013): 44 species were identified in the inventory interview, 38 in the participatory workshop and 33 in the semi-structured interviews with the community. There was either no correlation between the UV, relative density (RD) and importance value (IV) of some species, or this correlation was negative. Conclusion It was concluded that the inventory interview was the most efficient method for recording species and their uses, as it allowed more plants to be identified in their original environment. To optimize researchers’ time in future studies, the use of the point-centered quarter method rather than the sample plot method is recommended. PMID:24916833
Representativeness of direct observations selected using a work-sampling equation.
Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas
2015-01-01
Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.
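As a hedged illustration of the sample-size side of work sampling (the paper's exact equation may differ), the standard formula n = z²p(1−p)/e² gives the number of momentary time samples needed for a target absolute precision:

```python
import math

def work_sampling_n(p_est, half_width, z=1.96):
    """Momentary time samples needed so the estimated proportion of time a
    behaviour occurs is within +/- half_width, at ~95% confidence (z = 1.96)."""
    return math.ceil(z**2 * p_est * (1 - p_est) / half_width**2)

# A behaviour estimated to occupy 10% of the observation period, +/-3% wanted:
print(work_sampling_n(0.10, 0.03))   # 385 momentary samples
```

Note how the required n grows quadratically as the precision requirement tightens, which is consistent with the paper's finding that representative samples of low-duration behaviors can require impractically many time samples.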
Monte Carlo sampling in diffusive dynamical systems
NASA Astrophysics Data System (ADS)
Tapias, Diego; Sanders, David P.; Altmann, Eduardo G.
2018-05-01
We introduce a Monte Carlo algorithm to efficiently compute transport properties of chaotic dynamical systems. Our method exploits the importance sampling technique that favors trajectories in the tail of the distribution of displacements, where deviations from a diffusive process are most prominent. We search for initial conditions using a proposal that correlates states in the Markov chain constructed via a Metropolis-Hastings algorithm. We show that our method outperforms the direct sampling method and also Metropolis-Hastings methods with alternative proposals. We test our general method through numerical simulations in 1D (box-map) and 2D (Lorentz gas) systems.
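A minimal sketch of the general mechanism: a Metropolis-Hastings chain over initial conditions, with a locally correlated proposal and a stationary density tilted toward large displacements. The lifted map, the tilt strength beta, and the proposal scale are illustrative assumptions standing in for the paper's box map and Lorentz gas; estimates under the original dynamics would be recovered by reweighting samples with exp(-beta|d|).

```python
import numpy as np

rng = np.random.default_rng(2)

def displacement(x0, steps=50):
    """Net cell displacement of a toy lifted expanding map (an assumption here)."""
    x, cell = x0, 0
    for _ in range(steps):
        y = 3.0 * x - 1.0          # expanding map on [0,1) with jumps between cells
        cell += int(np.floor(y))
        x = y - np.floor(y)
    return cell

def mh_tail_sampler(n=5000, beta=0.5, step=1e-4):
    """MH over initial conditions, favouring trajectories with large |displacement|."""
    x0 = rng.random()
    d = displacement(x0)
    out = []
    for _ in range(n):
        prop = (x0 + rng.normal(0.0, step)) % 1.0   # local, correlated proposal
        dp = displacement(prop)
        if np.log(rng.random()) < beta * (abs(dp) - abs(d)):
            x0, d = prop, dp
        out.append(d)
    return np.array(out)

print(mh_tail_sampler()[-10:])   # the chain dwells in the displacement tail
```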
Comparison of microstickies measurement methods. Part II, Results and discussion
Mahendra R. Doshi; Angeles Blanco; Carlos Negro; Concepcion Monte; Gilles M. Dorris; Carlos C. Castro; Axel Hamann; R. Daniel Haynes; Carl Houtman; Karen Scallon; Hans-Joachim Putz; Hans Johansson; R. A. Venditti; K. Copeland; H.-M. Chang
2003-01-01
In Part I of this article we discussed the sample preparation procedure and described various methods used for the measurement of microstickies. Some of the important features of the different methods are highlighted in Table 1. Temperatures used in the measurement methods vary from room temperature in some cases to 45 °C to 65 °C in others. Sample size ranges from as low as...
Wheeler, Sarah S; Boyce, Walter M; Reisen, William K
2016-01-01
Avian hosts play an important role in the spread, maintenance, and amplification of West Nile virus (WNV). Avian susceptibility to WNV varies from species to species thus surveillance efforts can focus both on birds that survive infection and those that succumb. Here we describe methods for the collection and sampling of live birds for WNV antibodies or viremia, and methods for the sampling of dead birds. Target species and study design considerations are discussed.
Ancestral inference from haplotypes and mutations.
Griffiths, Robert C; Tavaré, Simon
2018-04-25
We consider inference about the history of a sample of DNA sequences, conditional upon the haplotype counts and the number of segregating sites observed at the present time. After deriving some theoretical results in the coalescent setting, we implement rejection sampling and importance sampling schemes to perform the inference. The importance sampling scheme addresses an extension of the Ewens Sampling Formula for a configuration of haplotypes and the number of segregating sites in the sample. The implementations include both constant and variable population size models. The methods are illustrated by two human Y chromosome datasets. Copyright © 2018. Published by Elsevier Inc.
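A minimal rejection-sampling sketch in the same spirit: simulate coalescent trees, drop mutations on them, and keep only simulations matching the observed number of segregating sites, so that accepted quantities approximate their conditional distribution. The haplotype-count conditioning and the importance sampler of the paper are more involved; n, theta, and S_obs below are assumed values.

```python
import numpy as np

rng = np.random.default_rng(8)

def total_branch_length(n):
    """Total branch length of a standard coalescent tree for a sample of size n."""
    k = np.arange(n, 1, -1)                       # lineage counts n, n-1, ..., 2
    times = rng.exponential(2.0 / (k * (k - 1)))  # coalescent interval lengths
    return float(np.sum(k * times))

def branch_length_given_S(n=10, theta=5.0, S_obs=12, trials=50000):
    """Accepted branch lengths approximate their distribution conditional on S."""
    kept = []
    for _ in range(trials):
        L = total_branch_length(n)
        if rng.poisson(theta * L / 2.0) == S_obs:  # mutations dropped on the tree
            kept.append(L)
    return np.mean(kept), len(kept)

print(branch_length_given_S())   # conditional mean and acceptance count
```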
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graf, Peter; Damiani, Rick R.; Dykes, Katherine
2017-01-09
A new adaptive stratified importance sampling (ASIS) method is proposed as an alternative approach for the calculation of the 50 year extreme load under operational conditions, as in design load case 1.1 of the International Electrotechnical Commission design standard. ASIS combines elements of the binning and extrapolation technique, currently described by the standard, and of the importance sampling (IS) method to estimate load probabilities of exceedance (POEs). Whereas a Monte Carlo (MC) approach would lead to the sought level of POE with a daunting number of simulations, IS-based techniques are promising as they target the sampling of the input parameters on the parts of the distributions that are most responsible for the extreme loads, thus reducing the number of runs required. We compared the various methods on select load channels as output from FAST, an aero-hydro-servo-elastic tool for the design and analysis of wind turbines developed by the National Renewable Energy Laboratory (NREL). Our newly devised method, although still in its infancy in terms of tuning of the subparameters, is comparable to the others in terms of load estimation and its variance versus computational cost, and offers great promise going forward due to the incorporation of adaptivity into the already powerful importance sampling concept.
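A stripped-down sketch of the stratification half of the idea: partition an input (here, mean wind speed) into strata, oversample every stratum, and combine conditional exceedance estimates with stratum probabilities. The load model, strata, probabilities and threshold are invented placeholders, not FAST output or the authors' adaptive IS weighting.

```python
import numpy as np

rng = np.random.default_rng(3)

def load_model(wind):
    """Stand-in for an aeroelastic simulation returning a load channel value."""
    return wind**1.5 + rng.normal(0.0, 2.0, wind.shape)

edges = np.array([3.0, 9.0, 15.0, 21.0, 25.0])   # wind-speed strata (m/s), assumed
p_strata = np.array([0.45, 0.35, 0.15, 0.05])    # assumed stratum probabilities
threshold = 90.0                                 # load exceedance threshold

poe = 0.0
for lo, hi, p in zip(edges[:-1], edges[1:], p_strata):
    w = rng.uniform(lo, hi, 4000)                # equal effort in every stratum
    loads = load_model(w)
    poe += p * np.mean(loads > threshold)        # law of total probability
print(poe)
```

Even this naive version concentrates simulation effort on the rare high-wind stratum that drives the extreme loads, which plain MC sampling of the wind distribution would visit only 5% of the time.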
Methods for measuring populations of small, diurnal forest birds.
D.A. Manuwal; A.B. Carey
1991-01-01
Before a bird population is measured, the objectives of the study should be clearly defined. Important factors to be considered in designing a study are study site selection, plot size or transect length, distance between sampling points, duration of counts, and frequency and timing of sampling. Qualified field personnel are especially important. Assumptions applying...
Data in support of the detection of genetically modified organisms (GMOs) in food and feed samples.
Alasaad, Noor; Alzubi, Hussein; Kader, Ahmad Abdul
2016-06-01
Food and feed samples were randomly collected from different sources, including local and imported materials from the Syrian local market. These included maize, barley, soybean, fresh food samples and raw material. GMO detection was conducted by PCR and nested PCR-based techniques using specific primers for the foreign DNA most commonly used in genetic transformation procedures, i.e., the 35S promoter, T-nos, and the epsps, cryIA(b) and nptII genes. The results revealed, for the first time in Syria, the presence of GM foods and feeds carrying the glyphosate-resistance trait, with the P35S promoter and NOS terminator detected at high frequency in the imported soybean samples (5 of the 6 imported soybean samples), whereas tests on the local samples were negative. Tests also revealed GMOs in two imported maize samples, detecting the presence of the 35S promoter and NOS terminator. Nested PCR results using two sets of primers confirmed these data. The methods applied in this data brief are based on DNA analysis by polymerase chain reaction (PCR). This technique is specific, practical, reproducible and sensitive enough to detect GMO content down to 0.1% in food and/or feedstuffs. Furthermore, all of the techniques mentioned are economical and can be applied in Syria and other developing countries. For all these reasons, the DNA-based analysis methods were chosen and preferred over protein-based analysis.
Extending the solvent-free MALDI sample preparation method.
Hanton, Scott D; Parees, David M
2005-01-01
Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is an important technique for characterizing many different materials, including synthetic polymers. MALDI mass spectral data can be used to determine the polymer average molecular weights, repeat units, and end groups. One of the key issues in traditional MALDI sample preparation is making good solutions of the analyte and the matrix. Solvent-free sample preparation methods have been developed to address these issues. Previous results with solvent-free, or dry, prepared samples show some advantages over traditional wet sample preparation methods. Although the published solvent-free sample preparation methods produce excellent mass spectra, we found the method to be very time-consuming, with significant tool cleaning that presents a significant possibility of cross-contamination. To address these issues, we developed an extension of the solvent-free method that replaces mortar-and-pestle grinding with ball milling the sample in a glass vial with two small steel balls. This new method generates mass spectra of quality equal to the previous methods, but has significant advantages in productivity, eliminates cross-contamination, and is applicable to liquid and soft or waxy analytes.
Frosth, Sara; König, Ulrika; Nyman, Ann-Kristin; Aspán, Anna
2017-09-01
Dichelobacter nodosus is the principal cause of ovine footrot, and strain virulence is an important factor in disease severity. Therefore, detection and virulence determination of D. nodosus is important for proper diagnosis of the disease. Today this is possible by real-time PCR analysis. Analysis of large numbers of samples is costly and laborious; therefore, pooling of individual samples is common in surveillance programs. However, pooling can reduce the sensitivity of the method. The aim of this study was to develop a pooling method for real-time PCR analysis that would allow sensitive detection and simultaneous virulence determination of D. nodosus. A total of 225 sheep from 17 flocks were sampled using ESwabs within the Swedish Footrot Control Program in 2014. Samples were first analysed individually and then in pools of five by real-time PCR assays targeting the 16S rRNA and aprV2/B2 genes of D. nodosus. Each pool consisted of four D. nodosus-negative samples and one positive sample containing varying amounts of the bacterium. In the individual analysis, 61 (27.1%) samples were positive in the 16S rRNA and aprV2/B2 PCR assays and 164 (72.9%) samples were negative. All samples positive in the aprV2/B2 PCR assay were of the aprB2 variant. The pooled analysis showed that all 41 pools were also positive for D. nodosus 16S rRNA and the aprB2 variant. The diagnostic sensitivity for pooled and individual samples was therefore similar. Our method includes concentration of the bacteria before DNA extraction, which may account for the maintenance of diagnostic sensitivity. Diagnostic sensitivity in the real-time PCR assays of the pooled samples was comparable to that obtained for individually analysed samples. Even sub-clinical infections could be detected in the pooled PCR samples, which is important for control of the disease. This method may therefore be implemented in footrot control programs, where it can replace analysis of individual samples.
The sleeping beauty kissed awake: new methods in electron microscopy to study cellular membranes.
Chlanda, Petr; Krijnse Locker, Jacomine
2017-03-07
Electron microscopy (EM) of biological samples, developed in the 1940s-1950s, changed our conception of the architecture of eukaryotic cells. It was followed by a period in which EM applied to cell biology had seemingly fallen asleep, even though new methods with important implications for modern EM were developed. Among these was the discovery that samples can be preserved by chemical fixation and, most importantly, by rapid freezing without the formation of crystalline ice, giving birth to the world of cryo-EM. The past 15-20 years have been marked by a tremendous interest in EM, driven by important technological advances. Cryo-EM, in particular, is now capable of revealing structures of proteins at near-atomic resolution owing to improved sample preparation methods, microscopes and cameras. In this review, we focus on the challenges associated with the imaging of membranes by EM and give examples from the field of host-pathogen interactions, in particular of virus-infected cells. Despite the advantages of imaging membranes under native conditions in cryo-EM, conventional EM will remain an important complementary method, in particular if large volumes need to be imaged. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.
Improved sampling and analysis of images in corneal confocal microscopy.
Schaldemose, E L; Fontain, F I; Karlsson, P; Nyengaard, J R
2017-10-01
Corneal confocal microscopy (CCM) is a noninvasive clinical method to analyse and quantify corneal nerve fibres in vivo. Although the CCM technique is in constant progress, there are methodological limitations in terms of sampling of images and objectivity of the nerve quantification. The aim of this study was to present a randomized sampling method for the CCM images and to develop an adjusted area-dependent image analysis. Furthermore, a manual nerve fibre analysis method was compared to a fully automated method. 23 idiopathic small-fibre neuropathy patients were investigated using CCM. Corneal nerve fibre length density (CNFL) and corneal nerve fibre branch density (CNBD) were determined in both a manual and an automatic manner. Differences in CNFL and CNBD between (1) the randomized and the most common sampling method, (2) the adjusted and the unadjusted area and (3) the manual and automated quantification method were investigated. The CNFL values were significantly lower when using the randomized sampling method compared to the most common method (p = 0.01). There was not a statistically significant difference in the CNBD values between the randomized and the most common sampling method (p = 0.85). CNFL and CNBD values were increased when using the adjusted area compared to the standard area. Additionally, the study found a significant increase in the CNFL and CNBD values when using the manual method compared to the automatic method (p ≤ 0.001). The study demonstrated a significant difference in the CNFL values between the randomized and common sampling methods, indicating the importance of clear guidelines for the image sampling. The increase in CNFL and CNBD values when using the adjusted cornea area is not surprising. The observed increases in both CNFL and CNBD values when using the manual method of nerve quantification compared to the automatic method are consistent with earlier findings. This study underlines the importance of improving the analysis of the CCM images in order to obtain more objective corneal nerve fibre measurements. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.
A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.
Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa
2016-05-17
Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. In the sampling-based method, record-pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record-pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using the Fleiss Kappa statistic was 0.601). This method presents a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large project linkages where the number of record pairs produced may be very large, often running into millions.
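A minimal sketch of the sampling idea: clerically review a random sample of record-pairs in each score band (including below the cut-off), then scale the sampled true/false counts up to the full band sizes to estimate precision and recall. Band sizes and the hidden match rates that simulate clerical review below are hypothetical.

```python
import random

random.seed(4)

bands = {              # score band -> (n_pairs_in_band, accepted_as_links?)
    "high":   (100000, True),
    "medium": (20000,  True),
    "low":    (50000,  False),   # below cut-off: contributes false negatives
}
true_match_rate = {"high": 0.995, "medium": 0.90, "low": 0.02}  # unknown in practice

sample_size = 500
tp = fp = fn = 0.0
for band, (n, accepted) in bands.items():
    # clerical review of a sample, simulated here via the (hidden) match rate
    matches = sum(random.random() < true_match_rate[band] for _ in range(sample_size))
    est_matches = n * matches / sample_size   # scale up to the whole band
    if accepted:
        tp += est_matches
        fp += n - est_matches
    else:
        fn += est_matches

print("precision ~", tp / (tp + fp), " recall ~", tp / (tp + fn))
```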
Importance of sample preparation for molecular diagnosis of lyme borreliosis from urine.
Bergmann, A R; Schmidt, B L; Derler, A-M; Aberer, E
2002-12-01
Urine PCR has been used for the diagnosis of Borrelia burgdorferi infection in recent years but has been abandoned because of its low sensitivity and the irreproducibility of the results. Our study aimed to analyze technical details related to sample preparation and detection methods. Crucial for a successful urine PCR were (i) avoidance of the first morning urine sample; (ii) centrifugation at 36,000 x g; and (iii) the extraction method, with only DNAzol of the seven different extraction methods used yielding positive results with patient urine specimens. Furthermore, storage of frozen urine samples at -80 degrees C reduced the sensitivity of a positive urine PCR result obtained with samples from 72 untreated erythema migrans (EM) patients from 85% in the first 3 months to <30% after more than 3 months. Bands were detected at 276 bp on ethidium bromide-stained agarose gels after amplification by a nested PCR. The specificity of bands was proven for 32 of 33 samples by hybridization with a GEN-ETI-K-DEIA kit and for 10 further positive amplicons by sequencing. By using all of these steps to optimize the urine PCR technique, B. burgdorferi infection could be diagnosed by using urine samples from EM patients with a sensitivity (85%) substantially better than that of serological methods (50%). This improved method could be of future importance as an additional laboratory technique for the diagnosis of unclear, unrecognized borrelia infections and diseases possibly related to Lyme borreliosis.
Helmersson-Karlqvist, Johanna; Flodin, Mats; Havelka, Aleksandra Mandic; Xu, Xiao Yan; Larsson, Anders
2016-09-01
Serum/plasma albumin is an important and widely used laboratory marker and it is important that we measure albumin correctly without bias. We had indications that the immunoturbidimetric method on Cobas c 501 and the bromocresol purple (BCP) method on Architect 16000 differed, so we decided to study these methods more closely. A total of 1,951 patient requests with albumin measured with both the Architect BCP and Cobas immunoturbidimetric methods were extracted from the laboratory system. A comparison with fresh plasma samples was also performed that included immunoturbidimetric and BCP methods on Cobas c 501 and analysis of the international protein calibrator ERM-DA470k/IFCC. The median difference between the Abbott BCP and Roche immunoturbidimetric methods was 3.3 g/l and the Roche method overestimated ERM-DA470k/IFCC by 2.2 g/l. The Roche immunoturbidimetric method gave higher values than the Roche BCP method: y = 1.111x - 0.739, R² = 0.971. The Roche immunoturbidimetric albumin method gives clearly higher values than the Abbott and Roche BCP methods when analyzing fresh patient samples. The differences between the two methods were similar at normal and low albumin levels. © 2016 Wiley Periodicals, Inc.
Sampling high-altitude and stratified mating flights of red imported fire ant.
Fritz, Gary N; Fritz, Ann H; Vander Meer, Robert K
2011-05-01
With the exception of an airplane equipped with nets, no method has been developed that successfully samples red imported fire ant, Solenopsis invicta Buren, sexuals in mating/dispersal flights throughout their potential altitudinal trajectories. We developed and tested a method for sampling queens and males during mating flights at altitudinal intervals reaching as high as ~140 m. Our trapping system uses an electric winch and a 1.2-m spindle bolted to a swiveling platform. The winch dispenses up to 183 m of Kevlar-core, nylon rope and the spindle stores 10 panels (0.9 by 4.6 m each) of nylon tulle impregnated with Tangle-Trap. The panels can be attached to the rope at various intervals and hoisted into the air by using a 3-m-diameter, helium-filled balloon. Raising or lowering all 10 panels takes approximately 15-20 min. This trap also should be useful for altitudinal sampling of other insects of medical importance.
Performance of sampling methods to estimate log characteristics for wildlife.
Lisa J. Bate; Torolf R. Torgersen; Michael J. Wisdom; Edward O. Garton
2004-01-01
Accurate estimation of the characteristics of log resources, or coarse woody debris (CWD), is critical to effective management of wildlife and other forest resources. Despite the importance of logs as wildlife habitat, methods for sampling logs have traditionally focused on silvicultural and fire applications. These applications have emphasized estimates of log volume...
Fung, Tak; Keenan, Kevin
2014-01-01
The estimation of population allele frequencies using sample data forms a central component of studies in population genetics. These estimates can be used to test hypotheses on the evolutionary processes governing changes in genetic variation among populations. However, existing studies frequently do not account for sampling uncertainty in these estimates, thus compromising their utility. Incorporation of this uncertainty has been hindered by the lack of a method for constructing confidence intervals containing the population allele frequencies, for the general case of sampling from a finite diploid population of any size. In this study, we address this important knowledge gap by presenting a rigorous mathematical method to construct such confidence intervals. For a range of scenarios, the method is used to demonstrate that for a particular allele, in order to obtain accurate estimates within 0.05 of the population allele frequency with high probability (≥95%), a sample size of >30 is often required. This analysis is augmented by an application of the method to empirical sample allele frequency data for two populations of the checkerspot butterfly (Melitaea cinxia L.), occupying meadows in Finland. For each population, the method is used to derive ≥98.3% confidence intervals for the population frequencies of three alleles. These intervals are then used to construct two joint ≥95% confidence regions, one for the set of three frequencies for each population. These regions are then used to derive a ≥95% confidence interval for Jost's D, a measure of genetic differentiation between the two populations. Overall, the results demonstrate the practical utility of the method with respect to informing sampling design and accounting for sampling uncertainty in studies of population genetics, important for scientific hypothesis-testing and also for risk-based natural resource management.
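An illustrative simulation (not the authors' exact interval construction) of why n > 30 is often only a lower bound: for an allele at frequency 0.3 in a finite diploid population, the probability of landing within 0.05 of the true frequency climbs toward 95% only for considerably larger samples. Population size, allele frequency and replicate counts are assumed values.

```python
import numpy as np

rng = np.random.default_rng(5)

def coverage(pop_size=500, true_freq=0.3, n=30, reps=5000):
    """Fraction of samples of n diploids (2n alleles drawn without replacement
    from a finite population) whose allele frequency is within 0.05 of truth."""
    alleles = np.zeros(2 * pop_size, dtype=int)
    alleles[: round(2 * pop_size * true_freq)] = 1
    hits = 0
    for _ in range(reps):
        sample = rng.choice(alleles, size=2 * n, replace=False)
        hits += abs(sample.mean() - true_freq) <= 0.05
    return hits / reps

for n in (10, 30, 100, 150):
    print(n, coverage(n=n))   # coverage nears 0.95 only well above n = 30 here
```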
Dynamics Sampling in Transition Pathway Space.
Zhou, Hongyu; Tao, Peng
2018-01-09
The minimum energy pathway contains important information describing the transition between two states on a potential energy surface (PES). Chain-of-states methods were developed to efficiently calculate minimum energy pathways connecting two stable states. In the chain-of-states framework, a series of structures is generated and optimized to represent the minimum energy pathway connecting two states. However, multiple pathways may exist connecting two given states and should be identified to obtain a full view of the transitions. Therefore, we developed an enhanced sampling method, named the direct pathway dynamics sampling (DPDS) method, to facilitate exploration of a PES for multiple pathways connecting two stable states as well as additional minima and their associated transition pathways. In the DPDS method, molecular dynamics simulations are carried out on the target PES within a chain-of-states framework to directly sample the transition pathway space. DPDS simulations are regulated by two parameters controlling the distance among states along the pathway and the smoothness of the pathway. One advantage of the chain-of-states framework is that no specific reaction coordinates are necessary to generate the reaction pathway, because such information is implicitly represented by the structures along the pathway. The chain-of-states setup in the DPDS method greatly enhances sampling of the high-energy space between the two end states, such as transition states. By removing the constraint on the end states of the pathway, DPDS will also sample pathways connecting minima on a PES in addition to the end points of the starting pathway. This feature makes DPDS an ideal method to directly explore transition pathway space. Three examples demonstrate the efficiency of the DPDS method in sampling the high-energy areas important for reactions on the PES.
Jaki, Thomas; Allacher, Peter; Horling, Frank
2016-09-05
Detecting and characterizing anti-drug antibodies (ADA) against a protein therapeutic is crucially important for monitoring the unwanted immune response. Usually a multi-tiered approach is employed for testing patient samples for ADA activity, initially rapidly screening for positive samples that are subsequently confirmed in a separate assay. In this manuscript we evaluate the ability of different methods to classify subjects using screening and competition-based confirmatory assays. We find that for the overall performance of the multi-stage process the method used for confirmation is most important, with a t-test performing best when differences are moderate to large. Moreover, we find that when differences between positive and negative samples are not sufficiently large, using a competition-based confirmation step yields poor classification of positive samples. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Yuan, Shenfang; Chen, Jian; Yang, Weibo; Qiu, Lei
2017-08-01
Fatigue crack growth prognosis is important for prolonging service time, improving safety, and reducing maintenance cost in many safety-critical systems, such as aircraft, wind turbines, bridges, and nuclear plants. Combining fatigue crack growth models with the particle filter (PF) method has proved promising for dealing with the uncertainties during fatigue crack growth and reaching a more accurate prognosis. However, research on prognosis methods that integrate on-line crack monitoring with the PF method, as well as experimental verification, is still lacking. Besides, the PF methods adopted so far are almost all sequential importance resampling-based PFs, which usually encounter sample impoverishment problems and hence perform poorly. To solve these problems, in this paper the piezoelectric transducer (PZT)-based active Lamb wave method is adopted for on-line crack monitoring. The deterministic resampling PF (DRPF) is proposed for fatigue crack growth prognosis, as it can overcome the sample impoverishment problem. The proposed method is verified through fatigue tests of attachment lugs, an important type of joint component in aerospace systems.
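For orientation, below is a generic sequential-importance-resampling particle filter for Paris-law crack growth; its multinomial resampling step is precisely the source of the impoverishment the proposed DRPF is designed to avoid (the DRPF itself is not reproduced here). Crack-growth constants, noise levels, and measurements are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Paris law: da/dN = C * (dK)^m with dK = Y * dS * sqrt(pi * a)
C, m, dS, Y = 1e-10, 3.0, 80.0, 1.0
def grow(a, dN=1000):
    dK = Y * dS * np.sqrt(np.pi * a)
    return a + C * dK ** m * dN

n = 2000
particles = rng.uniform(1e-3, 3e-3, n)   # initial crack-size belief (m)
sigma_meas = 2e-4                        # assumed Lamb-wave measurement noise

for z in (2.2e-3, 2.9e-3, 3.8e-3):       # synthetic on-line crack measurements
    particles = grow(particles) + rng.normal(0, 5e-5, n)   # process noise
    w = np.exp(-0.5 * ((z - particles) / sigma_meas) ** 2)
    w /= w.sum()
    # multinomial resampling: duplicates a few heavy particles, which
    # causes the sample impoverishment that the DRPF overcomes
    particles = particles[rng.choice(n, n, p=w)]
    print("posterior mean crack size: %.2e m" % particles.mean())
```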
NASA Astrophysics Data System (ADS)
Raff, L. M.; Malshe, M.; Hagan, M.; Doughan, D. I.; Rockley, M. G.; Komanduri, R.
2005-02-01
A neural network/trajectory approach is presented for the development of accurate potential-energy hypersurfaces that can be utilized to conduct ab initio molecular dynamics (AIMD) and Monte Carlo studies of gas-phase chemical reactions, nanometric cutting, and nanotribology, and of a variety of mechanical properties of importance in potential microelectromechanical systems applications. The method is sufficiently robust that it can be applied to a wide range of polyatomic systems. The overall method integrates ab initio electronic structure calculations with importance sampling techniques that permit the critical regions of configuration space to be determined. The computed ab initio energies and gradients are then accurately interpolated using neural networks (NN) rather than arbitrary parametrized analytical functional forms, moving interpolation, or least-squares methods. The sampling method involves a tight integration of molecular dynamics calculations with neural networks that employ early stopping and regularization procedures to improve network performance and test for convergence. The procedure can be initiated using an empirical potential surface or direct dynamics. The accuracy and interpolation power of the method have been tested for two cases: the global potential surface for vinyl bromide undergoing unimolecular decomposition via four different reaction channels, and nanometric cutting of silicon. The results show that the sampling methods permit the important regions of configuration space to be easily and rapidly identified, that convergence of the NN fit to the ab initio electronic structure database can be easily monitored, and that the interpolation accuracy of the NN fits is excellent, even for systems involving five atoms or more. The method permits a substantial computational speed and accuracy advantage over existing methods, is robust, and is relatively easy to implement.
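A minimal sketch of the fitting stage only, with a scikit-learn MLP using early stopping and L2 regularization in place of the authors' network setup, and a Morse-like one-dimensional surrogate standing in for the ab initio database.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)

# Stand-in for trajectory-sampled configurations and ab initio energies
X = rng.uniform(0.8, 3.0, size=(2000, 1))             # bond length (Angstrom)
E = 4.0 * (1 - np.exp(-1.5 * (X[:, 0] - 1.2))) ** 2   # Morse-like surrogate
E += rng.normal(0, 0.01, E.shape)                     # small noise

# early stopping + L2 regularization, as the abstract describes
net = MLPRegressor(hidden_layer_sizes=(40, 40), alpha=1e-4,
                   early_stopping=True, validation_fraction=0.1,
                   max_iter=5000, random_state=0)
net.fit(X, E)
print("held-out R^2:", net.validation_scores_[-1])
print("E(2.0 A) ~", net.predict([[2.0]])[0])
```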
Migaszewski, Z.M.; Lamothe, P.J.; Crock, J.G.; Galuszka, A.; Dolegowska, S.
2011-01-01
Trace element concentrations in plant bioindicators are often determined to assess the quality of the environment. Instrumental methods used for trace element determination require digestion of samples. There are different methods of sample preparation for trace element analysis, and the selection of the best method should fit the purpose of the study. Our hypothesis is that the method of sample preparation is important for interpretation of the results. Here we compare the results of 36 element determinations performed by ICP-MS on ashed and on acid-digested (HNO3, H2O2) samples of two moss species (Hylocomium splendens and Pleurozium schreberi) collected in Alaska and in south-central Poland. We found that dry ashing of the moss samples prior to analysis resulted in considerably lower detection limits for all the elements examined. We also show that this sample preparation technique facilitated the determination of interregional and interspecies differences in the chemistry of trace elements. Compared to the Polish mosses, the Alaskan mosses displayed more positive correlations of the major rock-forming elements with ash content, reflecting those elements' geogenic origin. Of the two moss species, P. schreberi from both Alaska and Poland also showed a larger number of positive element-pair correlations. The cluster analysis suggests that the more uniform element distribution pattern of the Polish mosses primarily reflects regional air pollution sources. Our study has shown that the method of sample preparation is an important factor in the statistical interpretation of the results of trace element determinations. © 2010 Springer-Verlag.
Crystallographic Characterization of Extraterrestrial Materials by Energy-Scanning X-ray Diffraction
NASA Technical Reports Server (NTRS)
Hagiya, Kenji; Mikouchi, Takashi; Ohsumi, Kazumasa; Terada, Yasuko; Yagi, Naoto; Komatsu, Mutsumi; Yamaguchi, Shoki; Hirata, Arashi; Kurokawa, Ayaka; Zolensky, Michael E. (Principal Investigator)
2016-01-01
We have continued our long-term project using X-ray diffraction to characterize a wide range of extraterrestrial samples. The stationary sample method with polychromatic X-rays is advantageous because the irradiated area of the sample is always the same and fixed, meaning that all diffraction spots occur from the same area of the sample. However, unit cell parameters cannot be directly obtained by this method, even though they are very important for identification of minerals and for determination of crystal structures. In order to obtain the cell parameters even with a stationary sample, we apply energy scanning of a micro-beam of monochromatic synchrotron radiation (SR) at SPring-8.
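The reason energy scanning recovers d-spacings from a stationary sample is Bragg's law: with the geometry fixed, the photon energy is varied until a reflection appears, and the energy at which it appears fixes the lattice spacing. A worked snippet under idealized assumptions (numbers are illustrative):

```python
import numpy as np

HC_KEV_A = 12.39842  # h*c in keV*Angstrom

def d_spacing(energy_kev, two_theta_deg):
    """Bragg's law, n*lambda = 2*d*sin(theta), with lambda = hc/E.
    Scanning E until a spot appears on a fixed detector pixel yields d
    without ever rotating the sample."""
    lam = HC_KEV_A / energy_kev
    return lam / (2.0 * np.sin(np.radians(two_theta_deg) / 2.0))

# a reflection appearing at 15 keV at a scattering angle 2-theta of 20 deg
print(d_spacing(15.0, 20.0), "Angstrom")
```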
Universal nucleic acids sample preparation method for cells, spores and their mixture
Bavykin, Sergei [Darien, IL
2011-01-18
The present invention relates to a method for extracting nucleic acids from biological samples. More specifically, the invention relates to a universal method for extracting nucleic acids from unidentified biological samples. An advantage of the presently invented method is its ability to effectively and efficiently extract nucleic acids from a variety of different cell types, including but not limited to prokaryotic or eukaryotic cells and/or recalcitrant organisms (i.e., spores). Unlike prior art methods, which are focused on extracting nucleic acids from vegetative cells or spores, the present invention effectively extracts nucleic acids from spores, multiple cell types, or mixtures thereof using a single method. Importantly, the invented method has demonstrated an ability to extract nucleic acids from spores and vegetative bacterial cells with similar levels of effectiveness. The invented method employs a multi-step protocol which erodes the cell structure of the biological sample; isolates, labels, and fragments the nucleic acids; and purifies the labeled samples from the excess of dye.
Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett; ...
2016-08-22
Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of the M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. Furthermore, this study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX.
Ullah, Md Ahsan; Kim, Ki-Hyun; Szulejko, Jan E; Cho, Jinwoo
2014-04-11
The production of short-chained volatile fatty acids (VFAs) by the anaerobic bacterial digestion of sewage (wastewater) affords an excellent opportunity to produce viable, greener alternative bio-energy (e.g., in microbial fuel cells). VFAs in wastewater (sewage) samples are commonly quantified through direct injection (DI) into a gas chromatograph with a flame ionization detector (GC-FID). In this study, the reliability of VFA analysis by the DI-GC method has been examined against a thermal desorption (TD-GC) method. The results indicate that the VFA concentrations determined from an aliquot of each wastewater sample by the DI-GC method were generally underestimated, e.g., reductions of 7% (acetic acid) to 93.4% (hexanoic acid) relative to the TD-GC method. The observed differences between the two methods suggest a possibly important role of the matrix effect in giving rise to the negative biases in DI-GC analysis. To further explore this possibility, an ancillary experiment was performed to examine the bias patterns of three DI-GC approaches. For instance, the results of the standard addition (SA) method confirm the definite role of the matrix effect when analyzing wastewater samples by DI-GC. More importantly, the biases tend to increase systematically with increasing molecular weight and decreasing VFA concentration. As such, the use of the DI-GC method, if applied to the analysis of samples with a complicated matrix, needs thorough validation to improve the reliability of data acquisition. Copyright © 2014 Elsevier B.V. All rights reserved.
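The standard addition (SA) approach cited above quantifies an analyte in its own matrix by spiking known amounts and extrapolating the response line to its x-intercept; a matrix effect shows up as a slope differing from a pure-solvent calibration. A minimal sketch with hypothetical numbers:

```python
import numpy as np

# known VFA amounts spiked into the wastewater aliquot (mg/L) and GC areas
added = np.array([0.0, 5.0, 10.0, 20.0])
area = np.array([41.0, 63.0, 85.0, 129.0])   # hypothetical detector response

slope, intercept = np.polyfit(added, area, 1)
c0 = intercept / slope                        # |x-intercept| = native conc.
print(f"estimated native concentration: {c0:.1f} mg/L")
# comparing `slope` with the pure-solvent calibration slope quantifies the
# signal suppression implicated in the DI-GC underestimation
```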
Palumbo, J D; O'Keeffe, T L; Fidelibus, M W
2016-12-01
Identification of populations of Aspergillus section Nigri species in environmental samples using traditional methods is laborious and impractical for large numbers of samples. We developed species-specific primers and probes for quantitative droplet digital PCR (ddPCR) to improve sample throughput and simultaneously detect multiple species in each sample. The ddPCR method was used to distinguish Aspergillus niger, Aspergillus welwitschiae, Aspergillus tubingensis and Aspergillus carbonarius in mixed samples of total DNA. Relative abundance of each species measured by ddPCR agreed with input ratios of template DNAs. Soil samples were collected at six time points over two growing seasons from two raisin vineyards in Fresno County, California. Aspergillus section Nigri strains were detected in these soils in the range of 10²-10⁵ CFU g⁻¹. Relative abundance of each species varied widely among samples, but in 52 of 60 samples, A. niger was the most abundant species, ranging from 38 to 88% of the total population. In combination with total plate counts, this ddPCR method provides a high-throughput method for describing population dynamics of important potential mycotoxin-producing species in environmental samples. This is the first study to demonstrate the utility of ddPCR as a means to quantify species of Aspergillus section Nigri in soil. This method eliminates the need for isolation and sequence identification of individual fungal isolates, and allows for greater throughput in measuring relative population sizes of important (i.e. mycotoxigenic) Aspergillus species within a population of morphologically indistinguishable species. Published 2016. This article is a U.S. Government work and is in the public domain in the USA.
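ddPCR quantification rests on Poisson statistics over droplets: with a fraction p of droplets positive, the mean copy number per droplet is λ = −ln(1 − p). A small sketch, assuming a typical ~0.85 nL droplet volume (a common vendor value, not a figure from this paper):

```python
import numpy as np

def ddpcr_copies_per_ul(n_positive, n_total, droplet_nl=0.85):
    """Poisson-corrected ddPCR concentration: lambda = -ln(1 - p) copies
    per droplet, divided by the droplet volume."""
    p = n_positive / n_total
    lam = -np.log(1.0 - p)
    return lam / (droplet_nl * 1e-3)   # nL -> uL

# e.g. an A. niger-specific probe scoring 4200 of 15000 droplets positive
print(ddpcr_copies_per_ul(4200, 15000), "copies/uL")
```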
Terry, Leann; Kelley, Ken
2012-11-01
Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.
A quantitative evaluation of two methods for preserving hair samples
Roon, David A.; Waits, L.P.; Kendall, K.C.
2003-01-01
Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20°C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.
USDA-ARS?s Scientific Manuscript database
The fiber testing instruments such as HVI can rapidly measure fiber length by testing a tapered fiber beard of the sample. But these instruments that use the beard testing method only report a limited number of fiber length parameters instead of the complete length distribution that is important fo...
Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna
Gunzburger, M.S.
2007-01-01
To design effective and efficient research and monitoring programs, researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and the number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected were related to the relative lengths of the larval and breeding periods and to tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.
Zhang, L; Liu, X J
2016-06-03
With the rapid development of next-generation high-throughput sequencing technology, RNA-seq has become a standard and important technique for transcriptome analysis. For multi-sample RNA-seq data, the existing expression estimation methods usually deal with each RNA-seq sample separately and ignore the fact that read distributions are consistent across multiple samples. In the current study, we propose a structured sparse regression method, SSRSeq, to estimate isoform expression using multi-sample RNA-seq data. SSRSeq uses a non-parametric model to capture the general tendency of non-uniform read distribution for all genes across multiple samples. Additionally, our method adds a structured sparse regularization, which not only incorporates the sparse specificity between a gene and its corresponding isoform expression levels, but also reduces the effects of noisy reads, especially for lowly expressed genes and isoforms. Four real datasets were used to evaluate our method on isoform expression estimation. Compared with other popular methods, SSRSeq reduced the variance between multiple samples, produced more accurate isoform expression estimations, and thus yielded more meaningful biological interpretations.
Baker, Duncan G L; Eddy, Tyler D; McIver, Reba; Schmidt, Allison L; Thériault, Marie-Hélène; Boudreau, Monica; Courtenay, Simon C; Lotze, Heike K
2016-01-01
Coastal ecosystems are among the most productive yet increasingly threatened marine ecosystems worldwide. Particularly vegetated habitats, such as eelgrass (Zostera marina) beds, play important roles in providing key spawning, nursery and foraging habitats for a wide range of fauna. To properly assess changes in coastal ecosystems and manage these critical habitats, it is essential to develop sound monitoring programs for foundation species and associated assemblages. Several survey methods exist, thus understanding how different methods perform is important for survey selection. We compared two common methods for surveying macrofaunal assemblages: beach seine netting and underwater visual census (UVC). We also tested whether assemblages in shallow nearshore habitats commonly sampled by beach seines are similar to those of nearby eelgrass beds often sampled by UVC. Among five estuaries along the Southern Gulf of St. Lawrence, Canada, our results suggest that the two survey methods yield comparable results for species richness, diversity and evenness, yet beach seines yield significantly higher abundance and different species composition. However, sampling nearshore assemblages does not represent those in eelgrass beds despite considerable overlap and close proximity. These results have important implications for how and where macrofaunal assemblages are monitored in coastal ecosystems. Ideally, multiple survey methods and locations should be combined to complement each other in assessing the entire assemblage and full range of changes in coastal ecosystems, thereby better informing coastal zone management.
Yan, Rui; Edwards, Thomas J.; Pankratz, Logan M.; Kuhn, Richard J.; Lanman, Jason K.; Liu, Jun; Jiang, Wen
2015-01-01
Cryo-electron tomography (cryo-ET) is an emerging technique that can elucidate the architecture of macromolecular complexes and cellular ultrastructure in a near-native state. Some important sample parameters, such as thickness and tilt, are needed for 3-D reconstruction. However, these parameters can currently only be determined using trial 3-D reconstructions. The electron inelastic mean free path plays a significant role in modeling the image formation process, which is essential for simulation of electron microscopy images and for model-based iterative 3-D reconstruction methods; however, its value is voltage and sample dependent and has only been experimentally measured for a limited number of sample conditions. Here, we report a computational method, tomoThickness, based on the Beer-Lambert law, to simultaneously determine the sample thickness, tilt and electron inelastic mean free path by solving an overdetermined nonlinear least-squares optimization problem utilizing the strong constraints of tilt relationships. The method has been extensively tested with both stained and cryo datasets. The fitted electron mean free paths are consistent with reported experimental measurements. The accurate thickness estimation eliminates the need for a generous assignment of the Z-dimension size of the tomogram. Interestingly, we have also found that nearly all samples are a few degrees tilted relative to the electron beam. Compensation for the intrinsic sample tilt can result in horizontal structures and a reduced Z-dimension of tomograms. Our fast, pre-reconstruction method can thus provide important sample parameters that can help improve the performance of tomographic reconstruction for a wide range of samples.
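The core relation behind this approach is the Beer–Lambert law along the tilted beam path, ln I(θ) = ln I0 − (t/Λ)/cos(θ + τ). Note that a single tilt series constrains only the ratio t/Λ and the intrinsic tilt τ; separating thickness t from mean free path Λ requires the additional tilt relationships that the paper's overdetermined formulation exploits. A hedged curve-fit sketch on synthetic data:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_intensity(theta_deg, log_i0, t_over_mfp, tilt_off_deg):
    """Beer-Lambert: ln I = ln I0 - (t/Lambda) / cos(theta + tau)."""
    return log_i0 - t_over_mfp / np.cos(np.radians(theta_deg + tilt_off_deg))

theta = np.arange(-60.0, 61.0, 3.0)          # stage tilt angles (degrees)
rng = np.random.default_rng(3)
y = log_intensity(theta, 7.0, 0.67, 3.0) + rng.normal(0, 0.01, theta.size)

popt, _ = curve_fit(log_intensity, theta, y, p0=[6.0, 0.5, 0.0])
print("t/Lambda = %.2f, intrinsic sample tilt = %.1f deg" % (popt[1], popt[2]))
```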
Comparability of river suspended-sediment sampling and laboratory analysis methods
Groten, Joel T.; Johnson, Gregory D.
2018-03-06
Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low, with the difference attributable to laboratory analysis methods being slightly greater than that attributable to field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream; there is less of a difference between grab samples analyzed for TSS and the concentration of fines in SSC. Even though differences are present, the strong correlations between SSC and TSS concentrations provide the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.
Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peplow, Douglas E.; Mosher, Scott W; Evans, Thomas M
2012-08-01
For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution, and one method that biases the source in space, energy, and direction. Seven simple example problems are presented which compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
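The consistency at the heart of CADIS fits in a few lines: with adjoint flux φ† and source q, the response is R = ⟨q, φ†⟩, the biased source is q̂ = qφ†/R, and source particles are born with weight q/q̂ = R/φ†, which coincides with the weight-window centers, so no splitting or rouletting happens at birth. A space-only sketch with a synthetic adjoint (the paper's contribution extends this to energy and angle):

```python
import numpy as np

x = np.linspace(0.0, 10.0, 101)
q = np.exp(-x)                      # physical source density
phi_dag = np.exp(0.3 * (x - 10.0))  # synthetic adjoint flux toward a far tally

R = np.trapz(q * phi_dag, x)        # estimated detector response <q, phi+>
qhat = q * phi_dag / R              # CADIS biased source (integrates to 1)
w_birth = R / phi_dag               # birth weights = weight-window centers

print("R =", R)
print("birth weight near source vs near tally:", w_birth[0], w_birth[-1])
```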
Forecasting electricity usage using univariate time series models
NASA Astrophysics Data System (ADS)
Hock-Eam, Lim; Chee-Yin, Yip
2014-12-01
Electricity is one of the most important energy sources. A sufficient supply of electricity is vital to support a country's development and growth. Due to changing socio-economic characteristics and the increasing competition and deregulation of the electricity supply industry, electricity demand forecasting is even more important than before. It is imperative to evaluate and compare the predictive performance of various forecasting methods, as this provides further insight into the weaknesses and strengths of each method. In the literature, there is mixed evidence on the best forecasting method for electricity demand. This paper compares the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance, while the Holt-Winters exponential smoothing method is a good forecasting method for in-sample predictive performance.
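A sketch of the comparison described, using statsmodels' seasonal ARIMA (Box–Jenkins) and Holt–Winters implementations on a synthetic monthly load series standing in for the Malaysian data; the model orders and the 12-month holdout are illustrative choices, not those of the paper.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.holtwinters import ExponentialSmoothing

rng = np.random.default_rng(4)
t = np.arange(132)                                   # Jan 2003 - Dec 2013
y = 10000 + 30 * t + 400 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 150, t.size)
y = pd.Series(y, index=pd.date_range("2003-01", periods=t.size, freq="MS"))

train, test = y[:-12], y[-12:]                       # hold out the final year

bj = ARIMA(train, order=(1, 1, 1), seasonal_order=(1, 0, 1, 12)).fit()
hw = ExponentialSmoothing(train, trend="add", seasonal="add",
                          seasonal_periods=12).fit()

for name, fc in (("Box-Jenkins", bj.forecast(12)),
                 ("Holt-Winters", hw.forecast(12))):
    rmse = np.sqrt(np.mean((fc.values - test.values) ** 2))
    print(f"{name} out-of-sample RMSE: {rmse:.0f}")
```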
Jiang, Jia-Jia; Duan, Fa-Jie; Li, Yan-Chao; Hua, Xiang-Ning
2014-03-01
Synchronization sampling is very important in underwater towed array systems, where every acquisition node (AN) samples analog signals with its own analog-to-digital converter (ADC). In this paper, a simple and effective synchronization sampling method is proposed to ensure synchronized operation among the different ANs of an underwater towed array system. We first present a master-slave synchronization sampling model, and then design a high-accuracy phase-locked loop to synchronize all delta-sigma ADCs to a reference clock. However, when the master-slave synchronization sampling model is used, both the time-delay (TD) of messages traveling along the wired transmission medium and the jitter of the clocks introduce synchronization sampling error (SSE). Therefore, a simple method is proposed to estimate and compensate for the TD of message transmission, and another effective method is presented to overcome the SSE caused by the jitter of the clocks. An experimental system with three ANs is set up, and the related experimental results verify the validity of the synchronization sampling method proposed in this paper.
Rapid fusion method for the determination of refractory thorium and uranium isotopes in soil samples
Maxwell, Sherrod L.; Hutchison, Jay B.; McAlister, Daniel R.
2015-02-14
Recently, approximately 80% of participating laboratories failed to accurately determine uranium isotopes in soil samples in the U.S. Department of Energy Mixed Analyte Performance Evaluation Program (MAPEP) Session 30, due to incomplete dissolution of refractory particles in the samples. Failing laboratories employed acid dissolution methods, including hydrofluoric acid, to recover uranium from the soil matrix. The failures illustrate the importance of rugged soil dissolution methods for the accurate measurement of analytes in the sample matrix. A new rapid fusion method has been developed by the Savannah River National Laboratory (SRNL) to prepare 1-2 g soil sample aliquots very quickly, with total dissolution of refractory particles. Soil samples are fused with sodium hydroxide at 600 ºC in zirconium crucibles to enable complete dissolution of the sample. Uranium and thorium are separated on stacked TEVA and TRU extraction chromatographic resin cartridges, prior to isotopic measurements by alpha spectrometry on cerium fluoride microprecipitation sources. Plutonium can also be separated and measured using this method. Batches of 12 samples can be prepared for measurement in <5 hours.
Genualdi, Susan; Jeong, Nahyun; DeJager, Lowri
2018-04-01
Nitrites and nitrates can be present in dairy products from both endogenous and exogenous sources. In the European Union (EU), 150 mg kg⁻¹ of nitrates are allowed to be added to the cheese milk during the manufacturing process. The CODEX General Standard for Food Additives has a maximum permitted level of 50 mg kg⁻¹ residue in cheese, while in the United States (U.S.) nitrates are unapproved for use as food additives in cheese. To investigate imported cheeses for nitrates intentionally added as preservatives, and the endogenous concentrations of nitrates and nitrites present in cheeses in the U.S. marketplace, a method was developed and validated using ion chromatography with conductivity detection. A market sampling of cheeses purchased in the Washington DC metro area was performed. In 64 samples of cheese, concentrations ranged from below the method detection limit (MDL) to 26 mg kg⁻¹ for nitrates, and no nitrites were found in any of the cheese samples above the MDL of 0.1 mg kg⁻¹. A majority of the samples (93%) had concentrations below 10 mg kg⁻¹, which indicates the presence of endogenous nitrates. The samples with concentrations above 10 mg kg⁻¹ were mainly processed cheese spreads, which can contain additional ingredients, often of plant-based origin. These ingredients are likely the cause of the elevated nitrate concentrations. The analysis of 12 additional cheese samples liable to the intentional addition of nitrates, 9 of which were imported, indicated that in this limited study, concentrations of nitrate in the U.S.-produced cheeses did not differ from those in imported samples.
A guide to large-scale RNA sample preparation.
Baronti, Lorenzo; Karlsson, Hampus; Marušič, Maja; Petzold, Katja
2018-05-01
RNA is becoming more important as an increasing number of functions, both regulatory and enzymatic, are being discovered on a daily basis. As the RNA boom has just begun, most techniques are still in development and changes occur frequently. To understand RNA functions, revealing the structure of RNA is of utmost importance, which requires sample preparation. We review the latest methods to produce and purify a variety of RNA molecules for different purposes, with the main focus on structural biology and biophysics. We present a guide aimed at identifying the most suitable method for your RNA and your biological question, and at highlighting the advantages of different methods.
Chen, Peichen; Liu, Shih-Chia; Liu, Hung-I; Chen, Tse-Wei
2011-01-01
For quarantine sampling, it is of fundamental importance to determine the probability of finding an infestation when a specified number of units are inspected. In general, current sampling procedures assume a 100% (perfect) probability of detecting a pest if it is present within a unit. Ideally, a nematode extraction method should remove all stages of all species with 100% efficiency regardless of season, temperature, or other environmental conditions; in practice, however, no method approaches these criteria. In this study we determined the probability of detecting nematode infestations in quarantine sampling with imperfect extraction efficacy. The required sample size and the risk involved in detecting nematode infestations with imperfect extraction efficacy are also presented. Moreover, we developed a computer program to calculate confidence levels for different scenarios with varying proportions of infestation and efficacy of detection. In addition, a case study, presenting the extraction efficacy of the modified Baermann's funnel method on Aphelenchoides besseyi, is used to exemplify the use of our program to calculate the probability of detecting nematode infestations in quarantine sampling with imperfect extraction efficacy. The result has important implications for quarantine programs and highlights the need for a very large number of samples if perfect extraction efficacy is not achieved in such programs. We believe that the results of the study will be useful for the determination of realistic goals in the implementation of quarantine sampling.
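The underlying arithmetic is elementary: if a unit is infested with probability p and the extraction method detects an infestation within an infested unit with probability e, the chance of at least one detection among n units is 1 − (1 − pe)^n, so the sample size needed for confidence C is ⌈ln(1 − C)/ln(1 − pe)⌉. A small sketch (the 40% efficacy figure is hypothetical):

```python
import math

def detection_probability(n, prevalence, efficacy):
    """P(at least one detection) among n inspected units."""
    return 1.0 - (1.0 - prevalence * efficacy) ** n

def required_sample(prevalence, efficacy, confidence=0.95):
    return math.ceil(math.log(1.0 - confidence)
                     / math.log(1.0 - prevalence * efficacy))

# 5% infestation: perfect vs imperfect extraction (e.g. Baermann funnel)
print(required_sample(0.05, 1.0))   # 59 units with perfect detection
print(required_sample(0.05, 0.4))   # 149 units at 40% extraction efficacy
```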
Adsorptive Stripping Voltammetry of Environmental Indicators: Determination of Zinc in Algae
ERIC Educational Resources Information Center
Collado-Sanchez, C.; Hernandez-Brito, J. J.; Perez-Pena, J.; Torres-Padron, M. E.; Gelado-Caballero, M. D.
2005-01-01
A method for sample preparation and determination of the average zinc content in algae using adsorptive stripping voltammetry is described. The students gain important didactic advantages through metal determination in environmental matrices, which include carrying out clean protocols for sampling and handling, and digesting samples using…
Ryba, Stepan; Kindlmann, Pavel; Titera, Dalibor; Haklova, Marcela; Stopka, Pavel
2012-10-01
American foulbrood, because of its virulence and worldwide spread, is currently one of the most dangerous diseases of honey bees. Quick diagnosis of this disease is therefore vitally important. For its successful eradication, however, all the hives in the region must be tested. This is time consuming and costly. Therefore, a fast and sensitive method of detecting American foulbrood is needed. Here we present a method that significantly reduces the number of tests needed by combining batches of samples from different hives. The results of this method were verified by testing each sample. A simulation study was used to compare the efficiency of the new method with testing all the samples and to develop a decision tool for determining when best to use the new method. The method is suitable for testing large numbers of samples (over 100) when the incidence of the disease is low (10% or less).
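The batching idea can be illustrated with classic Dorfman two-stage pooling, in which one test is run per pooled batch and only the members of positive pools are retested individually; the authors' exact combination scheme may differ. A sketch of the expected testing cost and the best pool size at low disease incidence:

```python
import numpy as np

def expected_tests_per_hive(pool_size, prevalence):
    """Dorfman two-stage pooling: 1/k tests per hive for the pooled stage,
    plus an individual retest whenever the pool comes back positive."""
    p_pool_positive = 1.0 - (1.0 - prevalence) ** pool_size
    return 1.0 / pool_size + p_pool_positive

for p in (0.02, 0.05, 0.10):
    ks = np.arange(2, 26)
    costs = [expected_tests_per_hive(k, p) for k in ks]
    best = ks[int(np.argmin(costs))]
    print(f"incidence {p:.0%}: best pool size {best}, "
          f"{min(costs):.2f} tests/hive vs 1.00 individually")
```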
Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu
2006-11-01
Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include Pearson and Spearman correlation, sample and rank regression, analysis of variance, the Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that the sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to the sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
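The variance-based indices that Sobol's method and FAST target can be estimated directly with the pick-freeze (Saltelli/Jansen) estimators. A self-contained sketch on a toy exposure model with a non-linearity and an interaction (not the SHEDS-based testbed):

```python
import numpy as np

rng = np.random.default_rng(5)

def model(x):
    """Toy model: additive term, non-linearity, and an x0*x2 interaction."""
    return x[:, 0] + x[:, 1] ** 2 + x[:, 0] * x[:, 2]

d, n = 3, 100_000
A, B = rng.uniform(0, 1, (n, d)), rng.uniform(0, 1, (n, d))
yA, yB = model(A), model(B)
var = np.var(np.concatenate([yA, yB]), ddof=1)

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]            # "freeze" all inputs except input i
    yABi = model(ABi)
    S1 = (var - 0.5 * np.mean((yB - yABi) ** 2)) / var   # first-order (Jansen)
    ST = 0.5 * np.mean((yA - yABi) ** 2) / var           # total effect (Jansen)
    print(f"x{i}: first-order {S1:.2f}, total {ST:.2f}")
```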
Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.
Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A
2017-04-01
Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites; however, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for human plasma, and it entailed a single sample preparation method, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single, simple sample preparation method followed by an LC-MS method with a short run time. It therefore provides a useful method for both clinical and research purposes.
HPLC analysis and standardization of Brahmi vati - An Ayurvedic poly-herbal formulation.
Mishra, Amrita; Mishra, Arun K; Tiwari, Om Prakash; Jha, Shivesh
2013-09-01
The aim of the present study was to standardize Brahmi vati (BV) by simultaneous quantitative estimation of Bacoside A3 and Piperine using an HPLC-UV method. BV is an important Ayurvedic poly-herbal formulation used to treat epilepsy and mental disorders, containing thirty-eight ingredients including Bacopa monnieri L. and Piper longum L. An HPLC-UV method was developed for the standardization of BV through simultaneous quantitative estimation of Bacoside A3 and Piperine, the major constituents of B. monnieri L. and P. longum L., respectively. The developed method was validated for parameters including linearity, precision, accuracy and robustness. The HPLC analysis showed a significantly higher amount of Bacoside A3 and Piperine in the in-house sample of BV compared with all three marketed samples of the same. Results showed variations in the amounts of Bacoside A3 and Piperine in different samples, indicating non-uniformity in quality that will lead to differences in therapeutic effect. The outcome of the present investigation underlines the importance of standardization of Ayurvedic formulations. The developed method may be further used to standardize other samples of BV or other formulations containing Bacoside A3 and Piperine.
Chen, Meilian; Lee, Jong-Hyeon; Hur, Jin
2015-10-01
Despite literature evidence suggesting the importance of sampling methods for the properties of sediment pore waters, their effects on the dissolved organic matter (PW-DOM) have been unexplored to date. Here, we compared the effects of two commonly used sampling methods (i.e., centrifuge and Rhizon sampler) on the characteristics of PW-DOM for the first time. The bulk dissolved organic carbon (DOC), ultraviolet-visible (UV-Vis) absorption, and excitation-emission matrices coupled with parallel factor analysis (EEM-PARAFAC) of the PW-DOM samples were compared for the two sampling methods using sediments from minimally to severely contaminated sites. The centrifuged samples were found to have higher average values of DOC, UV absorption, and protein-like EEM-PARAFAC components. The samples collected with the Rhizon sampler, however, exhibited generally more humified characteristics than the centrifuged ones, implying a preferential collection of PW-DOM with respect to the sampling methods. Furthermore, the differences between the two sampling methods seem more pronounced in relatively more polluted sites. Our observations were possibly explained by either the filtration effect resulting from the smaller pore size of the Rhizon sampler or the desorption of DOM molecules loosely bound to minerals during centrifugation, or both. Our study suggests that consistent use of one sampling method is crucial for PW-DOM studies and also that caution should be taken in the comparison of data collected with different sampling methods.
Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E
2014-06-01
Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods that analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered, unequal-probability-of-selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered, unequal-probability-of-selection sample designs.
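The finite population Bayesian bootstrap at the core of this line of work grows the observed sample up to the population size with a Pólya urn: each draw is put back with an extra copy. The sketch below is the unweighted version only; the weighted extension that inverts stratification, clustering, and unequal selection probabilities is the paper's contribution and is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(6)

def polya_urn_population(sample, N):
    """Grow an observed sample of size n into one synthetic population of
    size N by Polya-urn resampling (the unweighted Polya posterior)."""
    urn = list(sample)
    while len(urn) < N:
        urn.append(urn[rng.integers(len(urn))])   # draw, return with a copy
    return np.asarray(urn)

sample = rng.normal(50, 10, size=200)             # stand-in for an SRS
pops = [polya_urn_population(sample, 5000) for _ in range(20)]
# each synthetic population is then analyzed as if fully observed; spread
# across the 20 populations reflects posterior uncertainty
print([round(p.mean(), 1) for p in pops[:5]])
```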
Won, Jonghun; Lee, Gyu Rie; Park, Hahnbeom; Seok, Chaok
2018-06-07
The second extracellular loops (ECL2s) of G-protein-coupled receptors (GPCRs) are often involved in GPCR functions, and their structures have important implications in drug discovery. However, structure prediction of ECL2 is difficult because of its long length and the structural diversity among different GPCRs. In this study, a new ECL2 conformational sampling method involving both template-based and ab initio sampling was developed. Inspired by the observation of similar ECL2 structures of closely related GPCRs, a template-based sampling method employing loop structure templates selected from the structure database was developed. A new metric for evaluating similarity of the target loop to templates was introduced for template selection. An ab initio loop sampling method was also developed to treat cases without highly similar templates. The ab initio method is based on the previously developed fragment assembly and loop closure method. A new sampling component that takes advantage of secondary structure prediction was added. In addition, a conserved disulfide bridge restraining ECL2 conformation was predicted and analytically incorporated into sampling, reducing the effective dimension of the conformational search space. The sampling method was combined with an existing energy function for comparison with previously reported loop structure prediction methods, and the benchmark test demonstrated outstanding performance.
NASA Astrophysics Data System (ADS)
Frandas, A.; Dadarlat, Dorin; Chirtoc, Mihai; Jalink, Henk; Bicanic, Dane D.; Paris, D.; Antoniow, Jean S.; Egee, Michel; Ungureanu, Costica
1998-07-01
The photopyroelectric method in different experimental configurations was used for the thermophysical characterization of agricultural and biological samples. The study is important due to the relation of thermal parameters to the quality of foodstuffs (connected to their preservation, storage, and adulteration), to migration profiles in biodegradable packages, and to the mechanism of desiccation tolerance of seeds. Results are presented on the measurement of thermal parameters and their dependence on temperature and water content for samples such as honey, starch, and seeds.
NASA Astrophysics Data System (ADS)
Wang, Zhen; Cui, Shengcheng; Yang, Jun; Gao, Haiyang; Liu, Chao; Zhang, Zhibo
2017-03-01
We present a novel hybrid scattering-order-dependent variance reduction method to accelerate the convergence rate in both forward and backward Monte Carlo radiative transfer simulations involving highly forward-peaked scattering phase functions. This method is built upon a newly developed theoretical framework that not only unifies both forward and backward radiative transfer in a scattering-order-dependent integral equation, but also generalizes the variance reduction formalism for a wide range of simulation scenarios. In previous studies, variance reduction was achieved either by using the scattering phase function forward truncation technique or the target directional importance sampling technique. Our method combines both of them. A novel feature of our method is that all the tuning parameters used for the phase function truncation and importance sampling techniques at each order of scattering are automatically optimized by scattering order-dependent numerical evaluation experiments. To make such experiments feasible, we present a new scattering order sampling algorithm that remodels the integral radiative transfer kernel for the phase function truncation method. The presented method has been implemented in our Multiple-Scaling-based Cloudy Atmospheric Radiative Transfer (MSCART) model for validation and evaluation. The main advantage of the method is that it greatly improves the trade-off between numerical efficiency and accuracy order by order.
Shakeri, Heman; Volkova, Victoriya; Wen, Xuesong; Deters, Andrea; Cull, Charley; Drouillard, James; Müller, Christian; Moradijamei, Behnaz; Jaberi-Douraki, Majid
2018-05-01
To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types. Also, different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from human urine, fecal, and blood samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the use of the fluoroquinolone ciprofloxacin to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. Data users (e.g., microbiologists and epidemiologists) may then interpret the practical relevance of the difference. IMPORTANCE Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and determination of the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether the observed difference is practically relevant. Copyright © 2018 Shakeri et al.
Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M
2013-01-01
Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.
Popic, Tony J.; Davila, Yvonne C.; Wardle, Glenda M.
2013-01-01
Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km2 area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service. PMID:23799127
Carter, Melissa D.; Crow, Brian S.; Pantazides, Brooke G.; Watson, Caroline M.; deCastro, B. Rey; Thomas, Jerry D.; Blake, Thomas A.; Johnson, Rudolph C.
2017-01-01
A high-throughput prioritization method was developed for use with a validated confirmatory method detecting organophosphorus nerve agent exposure by immunomagnetic separation-HPLC-MS/MS. A ballistic gradient was incorporated into this analytical method in order to profile unadducted butyrylcholinesterase (BChE) in clinical samples. With a Z′-factor (Zhang et al., 1999) of 0.88 ± 0.01 (SD) for control analytes and a Z-factor of 0.25 ± 0.06 (SD) for serum samples, the assay rates as an "excellent assay" for the synthetic peptide controls used and a "doable assay" when used to prioritize clinical samples. Hits, defined as samples containing BChE Ser-198 adducts or no BChE present, were analyzed by a confirmatory method for identification and quantitation of the BChE adduct, if present. The ability to prioritize samples by highest exposure for confirmatory analysis is of particular importance in an exposure to cholinesterase inhibitors such as organophosphorus nerve agents, where a large number of clinical samples may be collected. In an initial blind screen, 67 out of 70 samples were accurately identified, giving an assay accuracy of 96% with no false negatives. The method is the first to provide a high-throughput prioritization assay for profiling adduction of Ser-198 BChE in clinical samples. PMID:23954929
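For reference, the assay-quality statistic used above is the Z′-factor of Zhang et al. (1999), Z′ = 1 − 3(σ_pos + σ_neg)/|μ_pos − μ_neg|, with 0.5 ≤ Z′ < 1 rated "excellent" and 0 < Z′ < 0.5 "doable". A short sketch with made-up control values (not the paper's data):

```python
import numpy as np

def z_prime(pos, neg):
    """Z'-factor of Zhang et al. (1999):
    1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|."""
    pos, neg = np.asarray(pos, float), np.asarray(neg, float)
    return 1 - 3 * (pos.std(ddof=1) + neg.std(ddof=1)) / abs(pos.mean() - neg.mean())

# Illustrative control signals only (hypothetical values, not measurements).
rng = np.random.default_rng(1)
high_controls = rng.normal(100.0, 4.0, 30)   # e.g. unadducted-peptide signal
low_controls = rng.normal(10.0, 3.0, 30)     # e.g. background signal
print(f"Z' = {z_prime(high_controls, low_controls):.2f}")
```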
The hydroxyl-functionalized magnetic particles for purification of glycan-binding proteins.
Sun, Xiuxuan; Yang, Ganglong; Sun, Shisheng; Quan, Rui; Dai, Weiwei; Li, Bin; Chen, Chao; Li, Zheng
2009-12-01
Glycan-protein interactions play important roles in biological processes. Although methods such as glycan arrays can elucidate recognition events between carbohydrates and proteins and screen for important glycan-binding proteins, a simple and effective separation method to purify them from complex samples has been lacking. In proteomics studies, fractionation of samples helps to reduce their complexity and to enrich specific classes of proteins for subsequent downstream analyses. Herein, a rapid and simple method for purification of glycan-binding proteins from proteomic samples was developed using hydroxyl-coated magnetic particles coupled with underivatized carbohydrates. First, epoxy-coated magnetic particles were hydroxyl-functionalized with 4-hydroxybenzhydrazide; the carbohydrates were then efficiently immobilized on the hydroxyl-functionalized surface of the magnetic particles through glycosidic bond formation with the hemiacetal group at the reducing end of the carbohydrate via condensation. All conditions of this method were optimized. The magnetic particle-carbohydrate conjugates were used to purify glycan-binding proteins from human serum, and the fractionated glycan-binding protein population was visualized by SDS-PAGE. The results showed that 1 mg of magnetic particles coupled 10 micromol of mannose in acetate buffer (pH 5.4). The fractionated glycan-binding protein population in human serum could be eluted from the magnetic particle-mannose conjugates with 0.1% SDS. The methodology can work together with glycan microarrays for screening and purification of important GBPs from complex protein samples.
SPR based immunosensor for detection of Legionella pneumophila in water samples
NASA Astrophysics Data System (ADS)
Enrico, De Lorenzis; Manera, Maria G.; Montagna, Giovanni; Cimaglia, Fabio; Chiesa, Maurizio; Poltronieri, Palmiro; Santino, Angelo; Rella, Roberto
2013-05-01
Detection of legionellae by water sampling is an important factor in epidemiological investigations of Legionnaires' disease and its prevention. To avoid the labor-intensive problems of conventional methods, an alternative, highly sensitive and simple method is proposed for detecting L. pneumophila in aqueous samples. A compact Surface Plasmon Resonance (SPR) instrumentation prototype, equipped with proper microfluidic tools, was built. The developed immunosensor is capable of dynamically following the binding between antigens and the corresponding antibody molecules immobilized on the SPR sensor surface. The immobilization strategy used in this work includes an efficient step that orients the antibodies on the sensor surface. The feasibility of integrating SPR-based biosensing setups with microfluidic technologies, resulting in a low-cost and portable biosensor, is demonstrated.
Qiu, Junlang; Wang, Fuxin; Zhang, Tianlang; Chen, Le; Liu, Yuan; Zhu, Fang; Ouyang, Gangfeng
2018-01-02
Decreasing the tedious sample preparation time is one of the most important concerns in environmental analytical chemistry, especially for in vivo experiments. However, due to the slow mass diffusion paths of most conventional methods, ultrafast in vivo sampling remains challenging. Herein, for the first time, we report an ultrafast in vivo solid-phase microextraction (SPME) device based on electrosorption enhancement and a novel custom-made CNT@PPY@pNE fiber for in vivo sampling of ionized acidic pharmaceuticals in fish. This sampling device exhibited excellent robustness, reproducibility, matrix effect-resistant capacity, and quantitative ability. Importantly, the extraction kinetics of the targeted ionized pharmaceuticals were significantly accelerated by the device, which improved the sensitivity of the SPME in vivo sampling method (limits of detection ranged from 0.12 ng·g⁻¹ to 0.25 ng·g⁻¹) and shortened the sampling time (only 1 min). The proposed approach was successfully applied to monitor the concentrations of ionized pharmaceuticals in living fish, demonstrating that the device and fiber are suitable for ultrafast in vivo sampling and continuous monitoring. In addition, the bioconcentration factor (BCF) values of the pharmaceuticals were derived in tilapia (Oreochromis mossambicus) for the first time, based on the ultrafast in vivo sampling data. We thus developed and validated an effective, ultrafast SPME sampling device for in vivo sampling of ionized analytes in living organisms; this method provides an alternative technique for future in vivo studies.
Rapid method to determine 226Ra in steel samples
Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...
2017-09-22
The rapid measurement of 226Ra in steel samples is very important in the event of a radiological emergency. 226Ra (T1/2 = 1600 y) is a natural radionuclide present in the environment and a highly toxic alpha-emitter. Due to its long half-life and its tendency to concentrate in bones, 226Ra ingestion or inhalation can lead to a significant committed dose to individuals. A new method for the determination of 226Ra in steel samples has been developed at the Savannah River Environmental Laboratory. The new method employs a rugged acid digestion that includes hydrofluoric acid, followed by a single precipitation step to rapidly preconcentrate the radium and remove most of the dissolved steel sample matrix. Radium is then separated using a combination of cation exchange and extraction chromatography, and 226Ra is measured by alpha spectrometry. This approach has a sample preparation time of ~8 h for steel samples, has a very high tracer yield (>88%), and removes interferences effectively. A 133Ba yield tracer is used so that samples can be counted immediately following the separation, avoiding the lengthy ingrowth times required by other methods.
Spectral feature characterization methods for blood stain detection in crime scene backgrounds
NASA Astrophysics Data System (ADS)
Yang, Jie; Mathew, Jobin J.; Dube, Roger R.; Messinger, David W.
2016-05-01
Blood stains are one of the most important types of evidence in forensic investigation. They contain valuable DNA information, and the pattern of the stains can suggest specifics about the nature of the violence that transpired at the scene. Blood spectral signatures containing unique reflectance or absorption features are important both for forensic on-site investigation and laboratory testing. They can be used for target detection and identification applied to crime scene hyperspectral imagery, and also be utilized to analyze the spectral variation of blood on various backgrounds. Non-blood stains often mislead the detection and can generate false alarms at a real crime scene, especially on dark and red backgrounds. In this work the reflectance of liquid blood and 9 kinds of non-blood samples was measured over the range 350 nm - 2500 nm in various crime scene backgrounds: pure samples contained in petri dishes with various thicknesses, mixed samples on fabrics of different colors and materials, and mixed samples on wood, all of which were examined to provide sub-visual evidence for detecting and recognizing blood against non-blood samples in a realistic crime scene. The spectral differences between blood and non-blood samples are examined, and spectral features such as "peaks" and "depths" of reflectance are selected. Two blood stain detection methods are proposed. The first uses an index defined as the ratio of the "depth" minus the "peak" to the "depth" plus the "peak" within a wavelength range of the reflectance spectrum. The second uses the relative band depth of selected wavelength ranges of the reflectance spectrum. Results show that the index method is able to discriminate blood from non-blood samples in most tested crime scene backgrounds, but not on black felt, whereas the relative band depth method is able to discriminate blood from non-blood samples on all of the tested background material types and colors.
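As a rough illustration of the first method, the sketch below computes the normalized index (depth − peak)/(depth + peak) from the mean reflectance in two feature bands. The band limits and the synthetic spectrum are placeholders, since the paper's selected wavelength ranges are not reproduced here.

```python
import numpy as np

def band_index(wavelengths, reflectance, peak_band, depth_band):
    """Index = (depth - peak) / (depth + peak), where 'peak' and 'depth'
    are mean reflectances in two feature bands of the spectrum."""
    def band_mean(lo, hi):
        mask = (wavelengths >= lo) & (wavelengths <= hi)
        return reflectance[mask].mean()
    peak = band_mean(*peak_band)
    depth = band_mean(*depth_band)
    return (depth - peak) / (depth + peak)

# Synthetic spectrum purely for illustration (350-2500 nm).
wl = np.linspace(350.0, 2500.0, 1076)
refl = 0.4 + 0.1 * np.sin(wl / 300.0)
print(band_index(wl, refl, peak_band=(540, 580), depth_band=(640, 690)))
```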
Jain, Raka; Quraishi, Rizwana; Verma, Arpita
2017-01-01
Assessment of cotinine, a metabolite of nicotine, in body fluids is an important approach for validating self-report among tobacco users. Adapting assays to dried urine spots (DUSs) offers ease of collection and transportation, minimal invasiveness, and a small required sample volume. The aim of the present study was to develop an efficient method for testing cotinine in DUSs and to evaluate its clinical applicability. This involved optimizing the conditions for detection, recovery, and stability of cotinine from dried urine spotted on filter paper. Enzyme-linked immunosorbent assay was used for screening, whereas confirmation was done by gas chromatography. For clinical applicability, urine samples of tobacco users were tested. Water was found to be a more suitable extracting solvent than carbonate-bicarbonate buffer (pH 9.2) or saline. Screening was achieved with two punches taken from a 20 μl (1.3 cm diameter) spotted urine sample, and confirmation was achieved with five complete circles, each of 20 μl sample volume. The recovery was 97% in water. The limit of detection of the method was 100 ng/ml. No signs of significant degradation were found under any of the storage conditions. All urine samples of tobacco users were positive by both the conventional method and DUSs, and the method proved to be efficient. DUS samples are a useful alternative for biological monitoring of recent nicotine use, especially in developing countries where sample logistics can be an important concern.
Development of a novel cell sorting method that samples population diversity in flow cytometry.
Osborne, Geoffrey W; Andersen, Stacey B; Battye, Francis L
2015-11-01
Flow cytometry based electrostatic cell sorting is an important tool in the separation of cell populations. Existing instruments can sort single cells into multi-well collection plates and keep track of cell of origin and sorted well location. However, current single-cell sorting results reflect the population distribution and fail to capture the population diversity. Software was designed that implements a novel sorting approach, "Slice and Dice Sorting," which links a graphical representation of a multi-well plate to logic that ensures that single cells are sampled and sorted from all areas defined by the sort region/s. Therefore the diversity of the total population is captured, and the more frequently occurring and rarer cell types are all sampled. The sorting approach was tested computationally and using functional cell-based assays. Computationally, we demonstrate that conventional single-cell sorting can sample as little as 50% of the population diversity, depending on the population distribution, and that Slice and Dice sorting samples much more of the variety present within a cell population. We then show, by sorting single cells into wells using the Slice and Dice method, that there are cells sorted by this method that would be either rarely sorted or not sorted at all using conventional single-cell sorting approaches. The present study demonstrates a novel single-cell sorting method that samples much more of the population diversity than current methods. It has implications in clonal selection, stem cell sorting, single-cell sequencing, and any area where population heterogeneity is of importance. © 2015 International Society for Advancement of Cytometry.
The Importance of Measuring Mercury in Wood
NASA Astrophysics Data System (ADS)
Yang, Y.; Yanai, R. D.; Driscoll, C. T.; Montesdeoca, M.
2014-12-01
Forests are important receptors of Hg deposition, and biological Hg hotspots occur mainly in forested regions, but few efforts have been made to determine the Hg content of trees. Mercury concentrations in stem tissue are lower than in foliage and bark, so low that they have often been below detection limits, especially in hardwood species. However, because wood is the largest component of forest biomass, it can be a larger Hg pool than the foliage, and thus quantifying concentrations in wood is important to Hg budgets in forests. The objective of our study was to determine the methods necessary to detect Hg in bole wood of four tree species, including two hardwoods and two conifers. We also evaluated the effect of air-drying and oven-drying samples on Hg recovery, compared to freeze-drying samples prior to analysis, which is the standard procedure. Many archived wood samples that were air-dried or oven-dried could be appropriate for Hg analysis if these methods could be validated; few are freeze-dried. We analyzed samples for total Hg using thermal decomposition, catalytic conversion, amalgamation, and atomic absorption spectrophotometry (Method 7473, USEPA 1998). The method detection limit was 1.27 ng g-1, based on apple leaf standards (NIST 1515, 44 ± 4 ng g-1). Concentrations in the hardwood species were 1.48 ± 0.23 ng g-1 for sugar maple and 1.75 ± 0.14 ng g-1 for American beech; concentrations were higher in red spruce and balsam fir. Samples that were analyzed fresh, freeze-dried, or oven-dried at 65 °C were in close agreement after correcting for moisture content. However, Hg concentrations were 34 to 45% too high in the air-dried samples, presumably reflecting absorption from the atmosphere, and 44 to 66% too low in the samples oven-dried at 103 °C, presumably due to volatilization. We recommend that samples be freeze-dried or oven-dried at 65 °C for analysis of Hg in wood; archived samples that have been oven-dried at higher temperatures or stored exposed to the air may not be suitable for Hg analysis.
Importance sampling with imperfect cloning for the computation of generalized Lyapunov exponents
NASA Astrophysics Data System (ADS)
Anteneodo, Celia; Camargo, Sabrina; Vallejos, Raúl O.
2017-12-01
We revisit the numerical calculation of generalized Lyapunov exponents, L(q), in deterministic dynamical systems. The standard method consists of adding noise to the dynamics in order to use importance sampling algorithms. Then L(q) is obtained by taking the limit noise-amplitude → 0 after the calculation. We focus on a particular method that involves periodic cloning and pruning of a set of trajectories. However, instead of considering a noisy dynamics, we implement an imperfect (noisy) cloning. This alternative method is compared with the standard one and, when possible, with analytical results. As a workbench we use the asymmetric tent map, the standard map, and a system of coupled symplectic maps. The general conclusion of this study is that the imperfect-cloning method performs as well as the standard one, with the advantage of preserving the deterministic dynamics.
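A minimal sketch of the cloning-and-pruning estimator, using ordinary (perfect) cloning rather than the paper's noisy variant, for the asymmetric tent map, where L(q) = ln(a^(1-q) + (1-a)^(1-q)) is known exactly and serves as a check. Walker counts and step numbers are arbitrary choices.

```python
import numpy as np

def tent(x, a):
    return np.where(x < a, x / a, (1 - x) / (1 - a))

def tent_slope(x, a):
    return np.where(x < a, 1 / a, 1 / (1 - a))

def gen_lyap_cloning(q, a, n_walkers=10_000, n_steps=200, seed=0):
    """Estimate L(q) = lim (1/n) ln <|Df^n|^q> by weighting each walker
    with |f'(x)|^q per step and resampling (cloning/pruning); the running
    sum of ln(mean weight) gives the exponent."""
    rng = np.random.default_rng(seed)
    x = rng.random(n_walkers)          # uniform = invariant measure
    log_growth = 0.0
    for _ in range(n_steps):
        w = tent_slope(x, a) ** q
        log_growth += np.log(w.mean())
        idx = rng.choice(n_walkers, n_walkers, p=w / w.sum())  # clone/prune
        x = tent(x[idx], a)
    return log_growth / n_steps

q, a = 2.0, 0.3
print("cloning:", gen_lyap_cloning(q, a))
print("exact:  ", np.log(a ** (1 - q) + (1 - a) ** (1 - q)))
```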
Anneken, David; Striebich, Richard; DeWitt, Matthew J; Klingshirn, Christopher; Corporan, Edwin
2015-03-01
Aircraft turbine engines are a significant source of particulate matter (PM) and gaseous emissions in the vicinity of airports and military installations. Hazardous air pollutants (HAPs) (e.g., formaldehyde, benzene, naphthalene and other compounds) associated with aircraft emissions are an environmental concern both in flight and at ground level. Therefore, effective sampling, identification, and accurate measurement of these trace species are important to assess their environmental impact. This effort evaluates two established ambient air sampling and analysis methods, U.S. Environmental Protection Agency (EPA) Method TO-11A and National Institute for Occupational Safety and Health (NIOSH) Method 1501, for potential use to quantify HAPs from aircraft turbine engines. The techniques were used to perform analysis of the exhaust from a T63 turboshaft engine, and were examined using certified gas standards transferred through the heated sampling systems used for engine exhaust gaseous emissions measurements. Test results show that the EPA Method TO-11A (for aldehydes) and NIOSH Method 1501 (for semivolatile hydrocarbons) were effective techniques for the sampling and analysis of most HAPs of interest. Both methods showed reasonable extraction efficiencies of HAP species from the sorbent tubes, with the exception of acrolein, styrene, and phenol, which were not well quantified. Formaldehyde measurements using dinitrophenylhydrazine (DNPH) tubes (EPA method TO-11A) were accurate for gas-phase standards, and compared favorably to measurements using gas-phase Fourier-transform infrared (FTIR) spectroscopy. In general, these two standard methodologies proved to be suitable techniques for field measurement of turbine engine HAPs within a reasonable (5-10 minutes) sampling period. Details of the tests, the analysis methods, calibration procedures, and results from the gas standards and T63 engine tested using a conventional JP-8 jet fuel are provided. HAPs from aviation-related sources are important because of their adverse health and environmental impacts in and around airports and flight lines. Simpler, more convenient techniques to measure the important HAPs, especially aldehydes and volatile organic HAPs, are needed to provide information about their occurrence and assist in the development of engines that emit fewer harmful emissions.
Comparison of individual and pooled sampling methods for detecting bacterial pathogens of fish
Mumford, Sonia; Patterson, Chris; Evered, J.; Brunson, Ray; Levine, J.; Winton, J.
2005-01-01
Examination of finfish populations for viral and bacterial pathogens is an important component of fish disease control programs worldwide. Two methods are commonly used for collecting tissue samples for bacteriological culture, the currently accepted standards for detection of bacterial fish pathogens. The method specified in the Office International des Epizooties Manual of Diagnostic Tests for Aquatic Animals permits combining renal and splenic tissues from as many as 5 fish into pooled samples. The American Fisheries Society (AFS) Blue Book/US Fish and Wildlife Service (USFWS) Inspection Manual specifies the use of a bacteriological loop for collecting samples from the kidney of individual fish. An alternative would be to more fully utilize the pooled samples taken for virology. If implemented, this approach would provide substantial savings in labor and materials. To compare the relative performance of the AFS/USFWS method and this alternative approach, cultures of Yersinia ruckeri were used to establish low-level infections in groups of rainbow trout (Oncorhynchus mykiss) that were sampled by both methods. Yersinia ruckeri was cultured from 22 of 37 groups by at least 1 method. The loop method yielded 18 positive groups, with 1 group positive in the loop samples but negative in the pooled samples. The pooled samples produced 21 positive groups, with 4 groups positive in the pooled samples but negative in the loop samples. There was statistically significant agreement (Spearman coefficient 0.80, P < 0.001) in the relative ability of the 2 sampling methods to permit detection of low-level bacterial infections of rainbow trout.
USDA-ARS?s Scientific Manuscript database
Spatial and temporal dynamics of pest populations is an important aspect of effective pest management. However, absolute sampling of some pest populations such as the ham mite, Tyrophagus putrescentiae (Schrank) (Sarcoptiformes: Acaridae), a serious pest of dry-cured ham, can be difficult. Sampling ...
Using Language Sample Analysis to Assess Spoken Language Production in Adolescents
ERIC Educational Resources Information Center
Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann
2016-01-01
Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…
A practical modification of horizontal line sampling for snag and cavity tree inventory
M. J. Ducey; G. J. Jordan; J. H. Gove; H. T. Valentine
2002-01-01
Snags and cavity trees are important structural features in forests, but they are often sparsely distributed, making efficient inventories problematic. We present a straightforward modification of horizontal line sampling designed to facilitate inventory of these features while remaining compatible with commonly employed sampling methods for the living overstory. The...
Efficiency of snake sampling methods in the Brazilian semiarid region.
Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z
2013-09-01
The choice of sampling methods is a crucial step in every field survey in herpetology. In countries where time and financial support are limited, the choice of methods is critical. The methods used to sample snakes often lack objective criteria, and tradition has apparently been more important in making the choice. Consequently, studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area in Brazil, evaluating the cost-benefit of each method in terms of the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated, and they were not complementary to the other methods in terms of species abundance and assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.
GeoLab Concept: The Importance of Sample Selection During Long Duration Human Exploration Mission
NASA Technical Reports Server (NTRS)
Calaway, M. J.; Evans, C. A.; Bell, M. S.; Graff, T. G.
2011-01-01
In the future when humans explore planetary surfaces on the Moon, Mars, and asteroids or beyond, the return of geologic samples to Earth will be a high priority for human spaceflight operations. All future sample return missions will have strict down-mass and volume requirements; methods for in-situ sample assessment and prioritization will be critical for selecting the best samples for return-to-Earth.
A guide to the use of distance sampling to estimate abundance of Karner blue butterflies
Grundel, Ralph
2015-01-01
This guide is intended to describe the use of distance sampling as a method for evaluating the abundance of Karner blue butterflies at a location. Other methods for evaluating abundance exist, including mark-release-recapture and index counts derived from Pollard-Yates surveys, for example. Although this guide is not intended to be a detailed comparison of the pros and cons of each type of method, there are important preliminary considerations to think about before selecting any method for evaluating the abundance of Karner blue butterflies.
[Comparison of the Conventional Centrifuged and Filtrated Preparations in Urine Cytology].
Sekita, Nobuyuki; Shimosakai, Hirofumi; Nishikawa, Rika; Sato, Hiroaki; Kouno, Hiroyoshi; Fujimura, Masaaki; Mikami, Kazuo
2016-03-01
The urine cytology test is one of the most important tools for the diagnosis of malignant urinary tract tumors. This test is also of great value for predicting malignancy. However, its sensitivity is not high enough to screen for malignant cells. In our laboratory, we were able to attain a high sensitivity of urine cytology tests after changing the preparation method of urine samples. The differences in cytodiagnosis between the two methods are discussed here. From January 2012 to June 2013, 2,031 urine samples were prepared using the conventional centrifuge method (C method); from September 2013 to March 2015, 2,453 urine samples were prepared using the filtration method (F method) for the cytology test. When samples classified in category 4 or 5 were defined as cytologically positive, the sensitivity of the test with samples prepared using the F method was significantly higher than with samples prepared using the C method (72% vs 28%, p<0.001). The number of cells on the glass slides prepared by the F method was significantly higher than that of the samples prepared by the C method (p<0.001). After introduction of the F method, the number of false negative cases in the urine cytology test decreased because a larger number of cells was visible, making atypical or malignant epithelial cells easier to detect. Therefore, the F method has a higher sensitivity than the conventional C method, as the sensitivity of urine cytology tests relies partially on the number of cells visualized in the prepared samples.
A LITERATURE REVIEW OF WIPE SAMPLING METHODS ...
Wipe sampling is an important technique for the estimation of contaminant deposition in buildings, homes, or outdoor surfaces as a source of possible human exposure. Numerous methods of wipe sampling exist, and each method has its own specification for the type of wipe, wetting solvent, and determinative step to be used, depending upon the contaminant of concern. The objective of this report is to concisely summarize the findings of a literature review that was conducted to identify the state-of-the-art wipe sampling techniques for a target list of compounds. This report describes the methods used to perform the literature review; a brief review of wipe sampling techniques in general; an analysis of physical and chemical properties of each target analyte; an analysis of wipe sampling techniques for the target analyte list; and a summary of the wipe sampling techniques for the target analyte list, including existing data gaps. In general, no overwhelming consensus can be drawn from the current literature on how to collect a wipe sample for the chemical warfare agents, organophosphate pesticides, and other toxic industrial chemicals of interest to this study. Different methods, media, and wetting solvents have been recommended and used by various groups and different studies. For many of the compounds of interest, no specific wipe sampling methodology has been established for their collection. Before a wipe sampling method (or methods) can be established for the co
Structural system reliability calculation using a probabilistic fault tree analysis method
NASA Technical Reports Server (NTRS)
Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.
1992-01-01
The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computationally intensive calculations. A computer program has been developed to implement the PFTA.
Graves, Tabitha A.; Royle, J. Andrew; Kendall, Katherine C.; Beier, Paul; Stetz, Jeffrey B.; Macleod, Amy C.
2012-01-01
Using multiple detection methods can increase the number, kind, and distribution of individuals sampled, which may increase accuracy and precision and reduce cost of population abundance estimates. However, when variables influencing abundance are of interest, if individuals detected via different methods are influenced by the landscape differently, separate analysis of multiple detection methods may be more appropriate. We evaluated the effects of combining two detection methods on the identification of variables important to local abundance using detections of grizzly bears with hair traps (systematic) and bear rubs (opportunistic). We used hierarchical abundance models (N-mixture models) with separate model components for each detection method. If both methods sample the same population, the use of either data set alone should (1) lead to the selection of the same variables as important and (2) provide similar estimates of relative local abundance. We hypothesized that the inclusion of 2 detection methods versus either method alone should (3) yield more support for variables identified in single method analyses (i.e. fewer variables and models with greater weight), and (4) improve precision of covariate estimates for variables selected in both separate and combined analyses because sample size is larger. As expected, joint analysis of both methods increased precision as well as certainty in variable and model selection. However, the single-method analyses identified different variables and the resulting predicted abundances had different spatial distributions. We recommend comparing single-method and jointly modeled results to identify the presence of individual heterogeneity between detection methods in N-mixture models, along with consideration of detection probabilities, correlations among variables, and tolerance to risk of failing to identify variables important to a subset of the population. The benefits of increased precision should be weighed against those risks. The analysis framework presented here will be useful for other species exhibiting heterogeneity by detection method.
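The hierarchical structure described here can be made concrete with a toy N-mixture likelihood for a single site surveyed by two detection methods. This is a deliberately simplified sketch (one shared latent abundance N, no covariates on abundance or detection), not the authors' full model; lam, p1, and p2 are illustrative values.

```python
import numpy as np
from scipy.stats import poisson, binom

def site_loglik(y1, y2, lam, p1, p2, n_max=100):
    """Log-likelihood of counts from two methods at one site under a simple
    N-mixture model: N ~ Poisson(lam); y_k | N ~ Binomial(N, p_k).
    The latent N is marginalized by truncated summation."""
    n = np.arange(max(y1, y2), n_max + 1)
    terms = (poisson.pmf(n, lam)
             * binom.pmf(y1, n, p1)
             * binom.pmf(y2, n, p2))
    return np.log(terms.sum())

# e.g. 3 hair-trap detections and 5 bear-rub detections at one site
print(site_loglik(3, 5, lam=8.0, p1=0.4, p2=0.5))
```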
Determination of technetium-99 in environmental samples: a review.
Shi, Keliang; Hou, Xiaolin; Roos, Per; Wu, Wangsuo
2012-01-04
Because technetium has no stable isotope and (99)Tc is both highly mobile and long-lived, (99)Tc is considered one of the most important radionuclides in safety assessment of environmental radioactivity as well as in nuclear waste management. (99)Tc is also an important tracer for oceanographic research due to the high solubility of technetium in seawater as TcO(4)(-). A number of analytical methods, using chemical separation combined with radiometric and mass spectrometric measurement techniques, have been developed over the past decades for determination of (99)Tc in different environmental samples. This article summarizes and compares recently reported chemical separation procedures and measurement methods for determination of (99)Tc. Because of the extremely low concentration of (99)Tc in environmental samples, the sample preparation, pre-concentration, chemical separation and purification steps that remove interferences are the most important issues governing the accurate determination of (99)Tc. These aspects are discussed in detail in this article. The different measurement techniques for (99)Tc are also compared with respect to their advantages and drawbacks. Novel automated analytical methods for rapid determination of (99)Tc, using solid-phase extraction or ion exchange chromatography for separation and employing flow injection or sequential injection approaches, are also discussed. Copyright © 2011 Elsevier B.V. All rights reserved.
Sofaer, Helen R.; Jarnevich, Catherine S.
2017-01-01
Aim: The distributions of exotic species reflect patterns of human-mediated dispersal, species climatic tolerances and a suite of other biotic and abiotic factors. The relative importance of each of these factors will shape how the spread of exotic species is affected by ongoing economic globalization and climate change. However, patterns of trade may be correlated with variation in scientific sampling effort globally, potentially confounding studies that do not account for sampling patterns.
Location: Global.
Time period: Museum records, generally from the 1800s up to 2015.
Major taxa studied: Plant species exotic to the United States.
Methods: We used data from the Global Biodiversity Information Facility (GBIF) to summarize the number of plant species with exotic occurrences in the United States that also occur in each other country world-wide. We assessed the relative importance of trade and climatic similarity for explaining variation in the number of shared species while evaluating several methods to account for variation in sampling effort among countries.
Results: Accounting for variation in sampling effort reversed the relative importance of trade and climate for explaining numbers of shared species. Trade was strongly correlated with the number of exotic plant species shared between the United States and other countries before, but not after, accounting for sampling variation among countries. Conversely, accounting for sampling effort strengthened the relationship between climatic similarity and species sharing. Using the number of records as a measure of sampling effort provided a straightforward approach for the analysis of occurrence data, whereas species richness estimators and rarefaction were less effective at removing sampling bias.
Main conclusions: Our work provides support for broad-scale climatic limitation on the distributions of exotic species, illustrates the need to account for variation in sampling effort in large biodiversity databases, and highlights the difficulty in inferring causal links between the economic drivers of invasion and global patterns of exotic species occurrence.
Olekšáková, Tereza; Žurovcová, Martina; Klimešová, Vanda; Barták, Miroslav; Šuláková, Hana
2018-04-01
Several methods of DNA extraction, coupled with 'DNA barcoding' species identification, were compared using specimens from early developmental stages of forensically important flies from the families Calliphoridae and Sarcophagidae. DNA was extracted at three immature stages - eggs, first instar larvae, and empty pupal cases (puparia) - using four different extraction methods, namely one simple 'homemade' extraction buffer protocol and three commercial kits. The extraction conditions, including the amount of proteinase K and incubation times, were optimized. The simple extraction buffer method was successful for half of the egg and first instar larva samples. The DNA Lego Kit and DEP-25 DNA Extraction Kit were useful for DNA extraction from first instar larva samples, and the DNA Lego Kit was also successful for eggs. The QIAamp DNA mini kit was the most effective; extraction was successful for all sample types - eggs, larvae, and puparia.
Videvall, Elin; Strandh, Maria; Engelbrecht, Anel; Cloete, Schalk; Cornwallis, Charlie K
2017-01-01
The gut microbiome of animals is emerging as an important factor influencing ecological and evolutionary processes. A major bottleneck in obtaining microbiome data from large numbers of samples is the time-consuming laboratory procedures required, specifically the isolation of DNA and generation of amplicon libraries. Recently, direct PCR kits have been developed that circumvent conventional DNA extraction steps, thereby streamlining the laboratory process by reducing preparation time and costs. However, the reliability and efficacy of direct PCR for measuring host microbiomes have not yet been investigated other than in humans with 454 sequencing. Here, we conduct a comprehensive evaluation of the microbial communities obtained with direct PCR and the widely used Mo Bio PowerSoil DNA extraction kit in five distinct gut sample types (ileum, cecum, colon, feces, and cloaca) from 20 juvenile ostriches, using 16S rRNA Illumina MiSeq sequencing. We found that direct PCR was highly comparable over a range of measures to the DNA extraction method in cecal, colon, and fecal samples. However, the two methods significantly differed in samples with comparably low bacterial biomass: cloacal and especially ileal samples. We also sequenced 100 replicate sample pairs to evaluate repeatability during both extraction and PCR stages and found that both methods were highly consistent for cecal, colon, and fecal samples (rs > 0.7) but had low repeatability for cloacal (rs = 0.39) and ileal (rs = -0.24) samples. This study indicates that direct PCR provides a fast, cheap, and reliable alternative to conventional DNA extraction methods for retrieving 16S rRNA data, which can aid future gut microbiome studies. IMPORTANCE The microbial communities of animals can have large impacts on their hosts, and the number of studies using high-throughput sequencing to measure gut microbiomes is rapidly increasing. However, the library preparation procedure in microbiome research is both costly and time-consuming, especially for large numbers of samples. We investigated a cheaper and faster direct PCR method designed to bypass the DNA isolation steps during 16S rRNA library preparation and compared it with a standard DNA extraction method. We used both techniques on five different gut sample types collected from 20 juvenile ostriches and sequenced samples with Illumina MiSeq. The methods were highly comparable and highly repeatable in three sample types with high microbial biomass (cecum, colon, and feces), but larger differences and low repeatability were found in the microbiomes obtained from the ileum and cloaca. These results will help microbiome researchers assess library preparation procedures and plan their studies accordingly.
Abdali, Khadijeh; Soleimani, Marzieh; Khajehei, Marjan; Tabatabaee, Hamid Reza; Komar, Perikala V; Montazer, Nader Riaz
2010-01-01
The Papanicolaou smear is a standard test for cervical cancer screening; however, its most important challenge is a high rate of false negative results. Several factors contribute to this problem, and one of the most important is inappropriate sampling. The aim of this study was to compare the quality of smears obtained with either an anatomical spatula or a spatula-cytobrush. One hundred married women participated in this single-blind clinical trial. After all participants were interviewed, two samples were obtained from each: one with a spatula-cytobrush and another with an anatomical spatula. Slides were prepared and assessed by two pathologists for kappa coefficient analysis. Cell adequacy was 96.1% with the anatomical spatula method and 91.2% with the spatula-cytobrush method (p = 0.016). The rates for endocervical cells and metaplasia cells were 70.6% and 24.5%, respectively, with the anatomical spatula method and 69.6% and 24.5% with the spatula-cytobrush (p<0.001). No participant reported pain, and the rate of bleeding was 38.2% with both methods (p>0.05). In addition, there were no statistically significant differences regarding infection and inflammatory reactions (p>0.05). Based on the findings of this study, sampling with the anatomical spatula gave more acceptable results than spatula-cytobrush sampling.
Computational methods for efficient structural reliability and reliability sensitivity analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.
1993-01-01
This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
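A compact sketch of the recentering idea behind AIS, using a linear limit state in standard normal space so the exact failure probability Φ(−3) is available as a check. The two-stage scheme below (explore, then recenter the sampling density on the failure samples) is far simpler than the adaptive, incremental sampler described in the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal as mvn, norm

def g(x):
    """Limit state: failure when g(x) <= 0 (linear, exact answer Phi(-3))."""
    return 3.0 - (x[:, 0] + x[:, 1]) / np.sqrt(2.0)

rng = np.random.default_rng(0)
d, n = 2, 20_000
f = mvn(mean=np.zeros(d))                      # true standard-normal density

# Stage 1: wide exploratory proposal to locate the failure domain.
explore = mvn(mean=np.zeros(d), cov=4.0 * np.eye(d))
x0 = explore.rvs(n, random_state=rng)
fail0 = x0[g(x0) <= 0]

# Stage 2 (adaptation): recenter the proposal on the failure samples,
# then average the importance-weighted failure indicator.
h = mvn(mean=fail0.mean(axis=0), cov=np.eye(d))
x = h.rvs(n, random_state=rng)
w = f.pdf(x) / h.pdf(x)
pf = np.mean((g(x) <= 0) * w)

print(f"AIS estimate: {pf:.3e}   exact: {norm.cdf(-3.0):.3e}")
```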
Mutation Clusters from Cancer Exome.
Kakushadze, Zura; Yu, Willie
2017-08-15
We apply our statistically deterministic machine learning/clustering algorithm *K-means (recently developed in https://ssrn.com/abstract=2908286) to 10,656 published exome samples for 32 cancer types. A majority of cancer types exhibit a mutation clustering structure. Our results are in-sample stable. They are also out-of-sample stable when applied to 1389 published genome samples across 14 cancer types. In contrast, we find in- and out-of-sample instabilities in cancer signatures extracted from exome samples via nonnegative matrix factorization (NMF), a computationally-costly and non-deterministic method. Extracting stable mutation structures from exome data could have important implications for speed and cost, which are critical for early-stage cancer diagnostics, such as novel blood-test methods currently in development.
Mutation Clusters from Cancer Exome
Kakushadze, Zura; Yu, Willie
2017-01-01
We apply our statistically deterministic machine learning/clustering algorithm *K-means (recently developed in https://ssrn.com/abstract=2908286) to 10,656 published exome samples for 32 cancer types. A majority of cancer types exhibit a mutation clustering structure. Our results are in-sample stable. They are also out-of-sample stable when applied to 1389 published genome samples across 14 cancer types. In contrast, we find in- and out-of-sample instabilities in cancer signatures extracted from exome samples via nonnegative matrix factorization (NMF), a computationally-costly and non-deterministic method. Extracting stable mutation structures from exome data could have important implications for speed and cost, which are critical for early-stage cancer diagnostics, such as novel blood-test methods currently in development. PMID:28809811
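For readers unfamiliar with the clustering step, the sketch below runs ordinary k-means on a toy mutation-spectrum matrix (rows are samples, columns the 96 trinucleotide mutation channels commonly used in signature analysis). Note that the authors' *K-means is a statistically deterministic variant that aggregates many randomized runs; plain scikit-learn KMeans, used here only for illustration, does not reproduce that property.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy stand-in for an exome mutation matrix (synthetic, not real data).
rng = np.random.default_rng(0)
X = rng.poisson(5.0, size=(200, 96)).astype(float)
X /= X.sum(axis=1, keepdims=True)     # normalize to per-sample spectra

km = KMeans(n_clusters=7, n_init=10, random_state=0).fit(X)
print(np.bincount(km.labels_))        # cluster occupancy
```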
William B. Sutton; Yong Wang; Callie J. Schweitzer
2010-01-01
Understanding vertebrate habitat relationships is important to promote management strategies for the long-term conservation of many species. Using a modified drift fence method, we sampled reptiles and compared habitat variables within the William B. Bankhead National Forest (BNF) in Alabama, U.S.A., from April 2005 to June 2006. We captured 226 individual reptiles...
The experience sampling method: Investigating students' affective experience
NASA Astrophysics Data System (ADS)
Nissen, Jayson M.; Stetzer, MacKenzie R.; Shemwell, Jonathan T.
2013-01-01
Improving non-cognitive outcomes such as attitudes, efficacy, and persistence in physics courses is an important goal of physics education. This investigation implemented an in-the-moment surveying technique called the Experience Sampling Method (ESM) [1] to measure students' affective experience in physics. Measurements included: self-efficacy, cognitive efficiency, activation, intrinsic motivation, and affect. Data are presented that show contrasts in students' experiences (e.g., in physics vs. non-physics courses).
Probability Issues in without Replacement Sampling
ERIC Educational Resources Information Center
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
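A worked example of the kind such courses use (this particular one is ours, not the article's): the probability of drawing three aces in three cards without replacement, computed both by the chain rule of conditional probabilities and by the equivalent hypergeometric count.

```python
from fractions import Fraction
from math import comb

# Chain rule: P(ace, ace, ace) = 4/52 * 3/51 * 2/50.
p_chain = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)

# Hypergeometric count: choose 3 of the 4 aces (and 0 of the 48 others)
# out of all 3-card hands.
p_hyper = Fraction(comb(4, 3) * comb(48, 0), comb(52, 3))

assert p_chain == p_hyper
print(p_chain)   # 1/5525
```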
Small-Chamber Measurements of Chemical-Specific Emission Factors for Drywall
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maddalena, Randy; Russell, Marion; Apte, Michael G.
2010-06-01
Imported drywall installed in U.S. homes is suspected of being a source of odorous and potentially corrosive indoor pollutants. To support an investigation of those building materials by the Consumer Products Safety Commission (CPSC), Lawrence Berkeley National Laboratory (LBNL) measured chemical-specific emission factors for 30 samples of drywall materials. Emission factors are reported for 75 chemicals and 30 different drywall samples encompassing both domestic and imported stock and incorporating natural, synthetic, or mixed gypsum core material. CPSC supplied all drywall materials. First the drywall samples were isolated and conditioned in dedicated chambers, then they were transferred to small chambers where emission testing was performed. Four sampling and analysis methods were utilized to assess (1) volatile organic compounds, (2) low molecular weight carbonyls, (3) volatile sulfur compounds, and (4) reactive sulfur gases. LBNL developed a new method that combines the use of solid phase microextraction (SPME) with small emission chambers to measure the reactive sulfur gases, then extended that technique to measure the full suite of volatile sulfur compounds. The testing procedure and analysis methods are described in detail herein. Emission factors were measured under a single set of controlled environmental conditions. The results are compared graphically for each method and in detailed tables for use in estimating indoor exposure concentrations.
Gwida, Mayada M; Al-Ashmawy, Maha A M
2014-01-01
A total of 200 samples of milk and dairy products as well as 120 samples from dairy handlers were randomly collected from different dairy farms and supermarkets in Dakahlia Governorate, Egypt. Conventional culture and serotyping methods for detection of Salmonella in dairy products were applied, and the results were compared with those obtained by a molecular screening assay targeting the ttr sequence. The results revealed that 21% of milk and dairy products (42/200) were positive for Salmonella species using the enrichment culture-based PCR method, while 12% of the dairy samples (24/200) were positive for Salmonella species using the conventional culture methods. Two stool specimens out of 40 apparently healthy dairy handlers were positive by the PCR method. Serotyping of Salmonella isolates revealed that 58.3% (14/24) from different dairy products were contaminated with Salmonella Typhimurium. We conclude that the enrichment culture-based PCR assay has high sensitivity and specificity for detection of Salmonella species in dairy products and handlers. The high incidence of Salmonella Typhimurium in the examined dairy samples highlights the important role played by milk and dairy products as a vehicle in disease prevalence. Great effort should be made to reduce the foodborne risk to consumers.
Out-of-Sample Extensions for Non-Parametric Kernel Methods.
Pan, Binbin; Chen, Wen-Sheng; Chen, Bo; Xu, Chen; Lai, Jianhuang
2017-02-01
Choosing suitable kernels plays an important role in the performance of kernel methods. Recently, a number of studies were devoted to developing nonparametric kernels. Without assuming any parametric form of the target kernel, nonparametric kernel learning offers a flexible scheme to utilize the information of the data, which may potentially characterize the data similarity better. The kernel methods using nonparametric kernels are referred to as nonparametric kernel methods. However, many nonparametric kernel methods are restricted to transductive learning, where the prediction function is defined only over the data points given beforehand. They have no straightforward extension for the out-of-sample data points, and thus cannot be applied to inductive learning. In this paper, we show how to make the nonparametric kernel methods applicable to inductive learning. The key problem of out-of-sample extension is how to extend the nonparametric kernel matrix to the corresponding kernel function. A regression approach in the hyper reproducing kernel Hilbert space is proposed to solve this problem. Empirical results indicate that the out-of-sample performance is comparable to the in-sample performance in most cases. Experiments on face recognition demonstrate the superiority of our nonparametric kernel method over the state-of-the-art parametric kernel methods.
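One plausible reading of "a regression approach" for the out-of-sample extension, sketched below under explicit assumptions: ridge-regress the learned nonparametric kernel matrix onto a parametric (RBF) basis over the training points, then obtain kernel rows for new points through the fitted map. The RBF basis, the ridge parameter, and the stand-in "learned" kernel are all illustrative choices, not the authors' exact estimator.

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_extension(X_train, K_np, lam=1e-3, gamma=0.5):
    """Ridge regression of the learned kernel matrix onto an RBF basis:
    K_np ~= K_rbf @ A, so k_np(x*, X_train) ~= k_rbf(x*, X_train) @ A."""
    K_rbf = rbf(X_train, X_train, gamma)
    return np.linalg.solve(K_rbf + lam * np.eye(len(X_train)), K_np)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
K_np = rbf(X, X, gamma=0.8)      # stand-in for a learned nonparametric kernel
A = fit_extension(X, K_np)

x_new = rng.normal(size=(5, 3))
k_new = rbf(x_new, X) @ A        # out-of-sample kernel rows
print(k_new.shape)               # (5, 50)
```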
Statistical Methods for Detecting Differentially Abundant Features in Clinical Metagenomic Samples
White, James Robert; Nagarajan, Niranjan; Pop, Mihai
2009-01-01
Numerous studies are currently underway to characterize the microbial communities inhabiting our world. These studies aim to dramatically expand our understanding of the microbial biosphere and, more importantly, hope to reveal the secrets of the complex symbiotic relationship between us and our commensal bacterial microflora. An important prerequisite for such discoveries is computational tools that are able to rapidly and accurately compare large datasets generated from complex bacterial communities to identify features that distinguish them. We present a statistical method for comparing clinical metagenomic samples from two treatment populations on the basis of count data (e.g. as obtained through sequencing) to detect differentially abundant features. Our method, Metastats, employs the false discovery rate to improve specificity in high-complexity environments, and separately handles sparsely-sampled features using Fisher's exact test. Under a variety of simulations, we show that Metastats performs well compared to previously used methods, and significantly outperforms other methods for features with sparse counts. We demonstrate the utility of our method on several datasets including a 16S rRNA survey of obese and lean human gut microbiomes, COG functional profiles of infant and mature gut microbiomes, and bacterial and viral metabolic subsystem data inferred from random sequencing of 85 metagenomes. The application of our method to the obesity dataset reveals differences between obese and lean subjects not reported in the original study. For the COG and subsystem datasets, we provide the first statistically rigorous assessment of the differences between these populations. The methods described in this paper are the first to address clinical metagenomic datasets comprising samples from multiple subjects. Our methods are robust across datasets of varied complexity and sampling level. While designed for metagenomic applications, our software can also be applied to digital gene expression studies (e.g. SAGE). A web server implementation of our methods and freely available source code can be found at http://metastats.cbcb.umd.edu/. PMID:19360128
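The sparse-feature branch described above lends itself to a short sketch: Fisher's exact test on pooled per-group counts, with Benjamini-Hochberg adjustment for the false discovery rate. This is a simplified reading of the pipeline (Metastats handles abundant features with a separate permutation-based test, omitted here), and the counts are invented for illustration.

```python
import numpy as np
from scipy.stats import fisher_exact

def bh_fdr(pvals):
    """Benjamini-Hochberg adjusted p-values (step-up procedure)."""
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    adj = np.empty(m)
    running_min = 1.0
    for rank, idx in enumerate(order[::-1]):   # from the largest p-value down
        running_min = min(running_min, p[idx] * m / (m - rank))
        adj[idx] = running_min
    return adj

def sparse_feature_pvalue(count_a, total_a, count_b, total_b):
    """Fisher's exact test on pooled counts for a sparsely sampled feature."""
    table = [[count_a, total_a - count_a],
             [count_b, total_b - count_b]]
    return fisher_exact(table)[1]

# e.g. a feature seen 3 times in 10,000 reads vs 15 times in 12,000 reads
p = sparse_feature_pvalue(3, 10_000, 15, 12_000)
print(p, bh_fdr([p, 0.2, 0.01]))
```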
Room temperature ferromagnetism of tin oxide nanocrystal based on synthesis methods
NASA Astrophysics Data System (ADS)
Sakthiraj, K.; Hema, M.; Balachandrakumar, K.
2016-04-01
The experimental conditions used in the preparation of nanocrystalline oxide materials play an important role in the room temperature ferromagnetism of the product. In the present work, a comparison was made between sol-gel, microwave-assisted sol-gel and hydrothermal methods for preparing tin oxide nanocrystals. X-ray diffraction analysis indicates the formation of a tetragonal rutile phase structure for all the samples. The crystallite size was estimated from the HRTEM images and is around 6-12 nm. Using optical absorbance measurements, the band gap energies of the samples were calculated, revealing the existence of a quantum confinement effect in all the prepared samples. Photoluminescence (PL) spectra confirm that the luminescence process originates from structural defects such as the oxygen vacancies present in the samples. A room temperature hysteresis loop was clearly observed in the M-H curve of all the samples, but the sol-gel-derived sample showed higher saturation magnetization (Ms) and remanence (Mr) values than the other two samples. This study reveals that the sol-gel method is superior to the other two methods for producing room temperature ferromagnetism in tin oxide nanocrystals.
NASA Astrophysics Data System (ADS)
Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino
2013-12-01
Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.
Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino
2013-12-07
Collective variables (CVs) are low-dimensional representations of the state of a complex system, which help us rationalize molecular conformations and sample free energy landscapes with molecular dynamics simulations. Given their importance, there is need for systematic methods that effectively identify CVs for complex systems. In recent years, nonlinear manifold learning has shown its ability to automatically characterize molecular collective behavior. Unfortunately, these methods fail to provide a differentiable function mapping high-dimensional configurations to their low-dimensional representation, as required in enhanced sampling methods. We introduce a methodology that, starting from an ensemble representative of molecular flexibility, builds smooth and nonlinear data-driven collective variables (SandCV) from the output of nonlinear manifold learning algorithms. We demonstrate the method with a standard benchmark molecule, alanine dipeptide, and show how it can be non-intrusively combined with off-the-shelf enhanced sampling methods, here the adaptive biasing force method. We illustrate how enhanced sampling simulations with SandCV can explore regions that were poorly sampled in the original molecular ensemble. We further explore the transferability of SandCV from a simpler system, alanine dipeptide in vacuum, to a more complex system, alanine dipeptide in explicit water.
NASA Astrophysics Data System (ADS)
Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli
2018-01-01
Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of the high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to compare the approximation capability of the proposed approach against three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
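The active learning step, sampling the expensive model where the current surrogate is least certain, can be sketched in a few lines. This is a single-fidelity, maximum-variance illustration with a made-up test function; the full ASM-IHK procedure with hierarchical kriging across two fidelity levels is more involved.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def expensive_hf(x):                       # stand-in for the high-fidelity model
    return np.sin(3 * x) + 0.5 * x

X = rng.uniform(0, 3, (4, 1))              # small initial design
y = expensive_hf(X).ravel()
cand = np.linspace(0, 3, 200)[:, None]     # candidate pool for new samples

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5))
for _ in range(6):                         # sequential (active) sampling loop
    gp.fit(X, y)
    _, std = gp.predict(cand, return_std=True)
    x_new = cand[[np.argmax(std)]]         # sample where the surrogate is least certain
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_hf(x_new).ravel())
print(len(X), "high-fidelity evaluations after refinement")
```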
Methods for sample size determination in cluster randomized trials
Rutterford, Clare; Copas, Andrew; Eldridge, Sandra
2015-01-01
Background: The use of cluster randomized trials (CRTs) is increasing, along with the variety in their design and analysis. The simplest approach for their sample size calculation is to calculate the sample size assuming individual randomization and inflate this by a design effect to account for randomization by cluster. The assumptions of a simple design effect may not always be met; alternative or more complicated approaches are required. Methods: We summarise a wide range of sample size methods available for cluster randomized trials. For those familiar with sample size calculations for individually randomized trials but with less experience in the clustered case, this manuscript provides formulae for a wide range of scenarios with associated explanation and recommendations. For those with more experience, comprehensive summaries are provided that allow quick identification of methods for a given design, outcome and analysis method. Results: We present first those methods applicable to the simplest two-arm, parallel group, completely randomized design followed by methods that incorporate deviations from this design such as: variability in cluster sizes; attrition; non-compliance; or the inclusion of baseline covariates or repeated measures. The paper concludes with methods for alternative designs. Conclusions: There is a large amount of methodology available for sample size calculations in CRTs. This paper gives the most comprehensive description of published methodology for sample size calculation and provides an important resource for those designing these trials. PMID:26174515
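The simplest approach mentioned in the Background, inflating an individually randomized sample size by a design effect, uses DEFF = 1 + (m - 1) x ICC for clusters of average size m. A worked sketch for a continuous outcome with the standard two-arm formula (all numbers illustrative):

```python
import math
from scipy.stats import norm

def n_individual(alpha, power, delta, sd):
    """Per-arm sample size for a two-arm trial with a continuous outcome."""
    za, zb = norm.ppf(1 - alpha / 2), norm.ppf(power)
    return 2 * ((za + zb) * sd / delta) ** 2

def clusters_per_arm(alpha, power, delta, sd, m, icc):
    """Inflate by the design effect 1 + (m - 1) * ICC, then convert to clusters."""
    deff = 1 + (m - 1) * icc
    n = n_individual(alpha, power, delta, sd) * deff
    return math.ceil(n / m)

# detect a 0.5 SD difference at 5% significance and 80% power,
# with clusters of 20 individuals and an ICC of 0.05
print(clusters_per_arm(0.05, 0.80, delta=0.5, sd=1.0, m=20, icc=0.05))  # 7
```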
A sample preparation method for recovering suppressed analyte ions in MALDI TOF MS.
Lou, Xianwen; de Waal, Bas F M; Milroy, Lech-Gustav; van Dongen, Joost L J
2015-05-01
In matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI TOF MS), analyte signals can be substantially suppressed by other compounds in the sample. In this technical note, we describe a modified thin-layer sample preparation method that significantly reduces the analyte suppression effect (ASE). In our method, analytes are deposited on top of the surface of matrix preloaded on the MALDI plate. To prevent embedding of analyte into the matrix crystals, the sample solutions were prepared without matrix and care was taken not to re-dissolve the preloaded matrix. The results with model mixtures of peptides, synthetic polymers and lipids show that detection of analyte ions, which were completely suppressed using the conventional dried-droplet method, could be effectively recovered by using our method. Our findings suggest that the incorporation of analytes in the matrix crystals has an important contributory effect on ASE. By reducing ASE, our method should be useful for the direct MALDI MS analysis of multicomponent mixtures. Copyright © 2015 John Wiley & Sons, Ltd.
Batt, Angela L; Furlong, Edward T; Mash, Heath E; Glassmeyer, Susan T; Kolpin, Dana W
2017-02-01
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs, which allowed us to assess and compare the performance of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not prove suitable for these analytes are reviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. Published by Elsevier B.V.
Soria, M C; Soria, M A; Bueno, D J; Godano, E I; Gómez, S C; ViaButron, I A; Padin, V M; Rogé, A D
2017-08-01
The performance of detection methods (culture methods and a polymerase chain reaction assay) and plating media used on the same types of samples was determined, as well as the specificity of PCR primers used to detect Salmonella spp. contamination in layer hen farms. In addition, the association of farm characteristics with Salmonella presence was evaluated. Environmental samples (feces, feed, drinking water, air, boot-swabs) and eggs were taken from 40 layer hen houses. Salmonella spp. was most frequently detected in boot-swabs taken around the houses (30% and 35% by isolation and PCR, respectively), followed by fecal samples (15.2% and 13.6% by isolation and PCR, respectively). Eggs, drinking water, and air samples were negative for Salmonella detection. Salmonella Schwarzengrund and S. Enteritidis were the most frequently isolated serotypes. For plating media, relative specificity was 1, and relative sensitivity was greater for EF-18 agar than for XLDT agar in feed and fecal samples; however, relative sensitivity was greater for XLDT agar than for EF-18 agar in boot-swab samples. Agreement was between fair and good depending on the sample, and it was good between isolation and PCR (feces and boot-swabs), with no agreement for feed samples. Salmonella spp. PCR was positive for all strains, while S. Typhimurium PCR was negative. The Salmonella Enteritidis PCR used was not specific. Based on the multiple logistic regression analyses, categorization by county was significant for Salmonella spp. presence (P-value = 0.010). This study shows the importance of considering different types of samples, plating media and detection methods during a Salmonella spp. monitoring study. In addition, it is important to incorporate sampling of the floors around the layer hen houses to determine whether biosecurity measures should be strengthened to minimize the entry and spread of Salmonella in the houses. The performance of some PCR methods and of the S. Enteritidis PCR should be improved, and biosecurity measures in hen farms must be reinforced in the region with the highest concentration of layer hen houses to reduce the probability of Salmonella spp. presence. © 2017 Poultry Science Association Inc.
Auditing of chromatographic data.
Mabie, J T
1998-01-01
During a data audit, it is important to ensure that there is clear documentation and an audit trail. The Quality Assurance Unit should review all areas, including the laboratory, during the conduct of the sample analyses. The analytical methodology that is developed should be documented prior to sample analyses. This is an important document for the auditor, as it is the instrumental piece used by the laboratory personnel to maintain integrity throughout the process. It is expected that this document will give insight into the sample analysis, run controls, run sequencing, instrument parameters, and acceptance criteria for the samples. The sample analysis and all supporting documentation should be audited in conjunction with this written analytical method and any supporting Standard Operating Procedures to ensure the quality and integrity of the data.
Vosloo, W; Morris, J; Davis, A; Giles, M; Wang, J; Nguyen, H T T; Kim, P V; Quach, N V; Le, P T T; Nguyen, P H N; Dang, H; Tran, H X; Vu, P P; Hung, V V; Le, Q T; Tran, T M; Mai, T M T; Le, Q T V; Singanallur, N B
2015-10-01
In high-density farming practices, it is important to constantly monitor for infectious diseases, especially diseases that have the potential to spread rapidly between holdings. Pigs are known to amplify foot-and-mouth disease (FMD) by excreting large amounts of virus, and it is therefore important to detect the virus quickly and accurately to minimize the spread of disease. Ropes were used to collect oral fluid samples from pigs, and each sample was compared to saliva samples collected from individual animals by detecting FMD virus RNA using real-time PCR. Two different experiments are described where groups of pigs were infected with different serotypes of FMD virus, either with or without vaccination, and unvaccinated pigs were kept in aerosol contact. The sensitivity of the rope sampling varied between 0.67 and 0.92, and the statistical agreement between this method and individual sampling ranged from substantial to moderate for the two different serotypes. The ease of collecting oral fluids using ropes together with the high sensitivity of subsequent FMD detection through PCR indicates that this could be a useful method to monitor pig populations for FMD virus infection. With further validation of the sensitivity of detection of FMD virus RNA, this can be a cost-effective, non-invasive diagnostic tool. © 2013 Blackwell Verlag GmbH.
Hough, Rachael; Archer, Debra; Probert, Christopher
2018-01-01
Disturbance to the hindgut microbiota can be detrimental to equine health. Metabolomics provides a robust approach to studying the functional aspect of hindgut microorganisms. Sample preparation is an important step towards achieving optimal results in the later stages of analysis, and the preparation of samples is unique to the technique employed and the sample matrix to be analysed. Gas chromatography mass spectrometry (GCMS) is one of the most widely used platforms for the study of metabolomics, yet until now an optimised method had not been developed for equine faeces. The aim of this study was to compare sample preparation methods for extracting volatile organic compounds (VOCs) from equine faeces. Volatile organic compounds were determined by headspace solid phase microextraction gas chromatography mass spectrometry (HS-SPME-GCMS). Factors investigated were the mass of equine faeces, the type of SPME fibre coating, vial volume and storage conditions. The resultant method was unique compared with those developed for other species. Aliquots of 1000 or 2000 mg in 10 ml or 20 ml SPME headspace vials were optimal. Of the fibres tested, extraction of VOCs should ideally be performed using a divinylbenzene-carboxen-polydimethysiloxane (DVB-CAR-PDMS) SPME fibre. Faeces stored for up to 12 months at −80 °C shared a greater percentage of VOCs with a fresh sample than the equivalent stored at −20 °C. An optimised method for extracting VOCs from equine faeces using HS-SPME-GCMS has been developed and will act as a standard to enable comparisons between studies. This work also highlights storage conditions as an important factor to consider in the experimental design of faecal metabolomics studies.
Hoorfar, J.; Hansen, F.; Christensen, J.; Mansdal, S.; Josefsen, M. H.
2016-01-01
Salmonella is recognized as one of the most important foodborne bacteria and has wide health and socioeconomic impacts worldwide. Fresh pork meat is one of the main sources of Salmonella, and efficient and fast methods for detection are therefore necessary. Current methods for Salmonella detection in fresh meat usually include >16 h of culture enrichment, in a few cases <12 h, thus requiring at least two working shifts. Here, we report a rapid (<5 h) and high-throughput method for screening of Salmonella in samples from fresh pork meat, consisting of a 3-h enrichment in standard buffered peptone water and a real-time PCR-compatible sample preparation method based on filtration, centrifugation, and enzymatic digestion, followed by fast-cycling real-time PCR detection. The method was validated in an unpaired comparative study against the Nordic Committee on Food Analysis (NMKL) reference culture method 187. Pork meat samples (n = 140) were either artificially contaminated with Salmonella at 0, 1 to 10, or 10 to 100 CFU/25 g of meat or naturally contaminated. Cohen's kappa for the degree of agreement between the rapid method and the reference was 0.64, and the relative accuracy, sensitivity, and specificity for the rapid method were 81.4, 95.1, and 97.9%, respectively. The 50% limits of detection (LOD50s) were 8.8 CFU/25 g for the rapid method and 7.7 CFU/25 g for the reference method. Implementation of this method will enable faster release of Salmonella low-risk meat, providing savings for meat producers, and it will help contribute to improved food safety. IMPORTANCE While the cost of analysis and hands-on time of the presented rapid method were comparable to those of reference culture methods, the fast product release by this method can provide the meat industry with a competitive advantage. Not only will the abattoirs save costs for work hours and cold storage, but consumers and retailers will also benefit from fresher meat with a longer shelf life. Furthermore, the presented sample preparation might be adjusted for application in the detection of other pathogenic bacteria in different sample types. PMID:27986726
Fachmann, M S R; Löfström, C; Hoorfar, J; Hansen, F; Christensen, J; Mansdal, S; Josefsen, M H
2017-03-01
Salmonella is recognized as one of the most important foodborne bacteria and has wide health and socioeconomic impacts worldwide. Fresh pork meat is one of the main sources of Salmonella, and efficient and fast methods for detection are therefore necessary. Current methods for Salmonella detection in fresh meat usually include >16 h of culture enrichment, in a few cases <12 h, thus requiring at least two working shifts. Here, we report a rapid (<5 h) and high-throughput method for screening of Salmonella in samples from fresh pork meat, consisting of a 3-h enrichment in standard buffered peptone water and a real-time PCR-compatible sample preparation method based on filtration, centrifugation, and enzymatic digestion, followed by fast-cycling real-time PCR detection. The method was validated in an unpaired comparative study against the Nordic Committee on Food Analysis (NMKL) reference culture method 187. Pork meat samples (n = 140) were either artificially contaminated with Salmonella at 0, 1 to 10, or 10 to 100 CFU/25 g of meat or naturally contaminated. Cohen's kappa for the degree of agreement between the rapid method and the reference was 0.64, and the relative accuracy, sensitivity, and specificity for the rapid method were 81.4, 95.1, and 97.9%, respectively. The 50% limits of detection (LOD50s) were 8.8 CFU/25 g for the rapid method and 7.7 CFU/25 g for the reference method. Implementation of this method will enable faster release of Salmonella low-risk meat, providing savings for meat producers, and it will help contribute to improved food safety. IMPORTANCE While the cost of analysis and hands-on time of the presented rapid method were comparable to those of reference culture methods, the fast product release by this method can provide the meat industry with a competitive advantage. Not only will the abattoirs save costs for work hours and cold storage, but consumers and retailers will also benefit from fresher meat with a longer shelf life. Furthermore, the presented sample preparation might be adjusted for application in the detection of other pathogenic bacteria in different sample types. Copyright © 2017 American Society for Microbiology.
Mirante, Clara; Clemente, Isabel; Zambu, Graciette; Alexandre, Catarina; Ganga, Teresa; Mayer, Carlos; Brito, Miguel
2016-09-01
Helminth intestinal parasitoses are responsible for high levels of child mortality and morbidity. Hence, the capacity to diagnose these parasitoses and consequently ensure due treatment represents a factor of great importance. The main objective of this study was to compare two methods of concentration, parasitrap and Kato-Katz, for the diagnosis of intestinal parasitoses in faecal samples. Sample processing made recourse to two different concentration methods: the commercial parasitrap® method and the Kato-Katz method. We collected a total of 610 stool samples from pre-school and school age children. The results demonstrate the presence of helminth parasites in 32.8% or 32.3% of the samples collected, depending on whether the concentration method applied was the parasitrap method or the Kato-Katz method, respectively. We detected a relatively high percentage of samples testing positive for two or more species of helminth parasites. We would highlight that, in searching for larvae, the Kato-Katz method does not prove as appropriate as the parasitrap method. Both techniques prove easily applicable even in field working conditions and return mutually agreeing results. This study concludes in favour of the need for deworming programs and greater public awareness among the rural populations of Angola.
DOT National Transportation Integrated Search
2002-01-01
Analysis of chloride contents in ground concrete samples collected from reinforced concrete bridges and other structures exposed to deicing salts or seawater has become an important part of the inspection of such structures. Such an analysis provides...
Method Development in Forensic Toxicology.
Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona
2017-01-01
In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high-quality analytical methods is thorough method development. This article provides an overview of the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, setting up separation and detection systems, and establishing a versatile sample preparation. Method development is concluded by an optimization process, after which the new method is subject to method validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
NASA Astrophysics Data System (ADS)
Zhang, Linna; Zhang, Shengzhao; Sun, Meixiu; Li, Hongxiao; Li, Yingxin; Fu, Zhigang; Guan, Yang; Li, Gang; Lin, Ling
2017-03-01
Discrimination of human and nonhuman blood is crucial for import-export ports and inspection and quarantine departments. Current methods are usually destructive, complicated and time-consuming. We had previously demonstrated that visible diffuse reflectance spectroscopy combined with the PLS-DA method can successfully discriminate human blood. In that study, the spectra were measured with a fiber probe beneath the surface of the blood samples. However, open sampling may contaminate the blood samples, and virulence factors in blood samples can also endanger inspectors. In this paper, we explored the classification performance when blood samples were measured in their original containers (vacuum blood collection tubes). Furthermore, we studied the impact of different sample conditions, such as coagulation and hemolysis, on the prediction ability of the discrimination model. The calibration model built with blood samples in different conditions displayed satisfactory prediction results. This research demonstrates that visible and near-infrared diffuse reflectance spectroscopy is a potential method for noncontact discrimination of human blood.
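PLS-DA as used here amounts to partial least squares regression on class labels followed by thresholding. A minimal scikit-learn sketch under assumed array shapes (spectra as rows, one binary label per sample); the component count, threshold, and stand-in data are illustrative, not the paper's settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def fit_plsda(X, y, n_components=8):
    """PLS-DA: regress binary class labels on spectra."""
    pls = PLSRegression(n_components=n_components)
    pls.fit(X, y.astype(float))
    return pls

def predict_plsda(pls, X_new, threshold=0.5):
    """Threshold the regression output to recover class labels."""
    return (pls.predict(X_new).ravel() > threshold).astype(int)

# toy usage with stand-in spectra (40 samples x 100 wavelengths)
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 100))
y = (rng.random(40) > 0.5).astype(int)      # 1 = human, 0 = nonhuman (invented)
model = fit_plsda(X, y)
print(predict_plsda(model, X[:5]))
```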
On Bayesian Testing of Additive Conjoint Measurement Axioms Using Synthetic Likelihood
ERIC Educational Resources Information Center
Karabatsos, George
2017-01-01
This article introduces a Bayesian method for testing the axioms of additive conjoint measurement. The method is based on an importance sampling algorithm that performs likelihood-free, approximate Bayesian inference using a synthetic likelihood to overcome the analytical intractability of this testing problem. This new method improves upon…
Comparison of DNA extraction methods for human gut microbial community profiling.
Lim, Mi Young; Song, Eun-Ji; Kim, Sang Ho; Lee, Jangwon; Nam, Young-Do
2018-03-01
The human gut harbors a vast range of microbes that have significant impact on health and disease. Therefore, gut microbiome profiling holds promise for use in early diagnosis and precision medicine development. Accurate profiling of the highly complex gut microbiome requires DNA extraction methods that provide sufficient coverage of the original community as well as adequate quality and quantity. We tested nine different DNA extraction methods using three commercial kits (TianLong Stool DNA/RNA Extraction Kit (TS), QIAamp DNA Stool Mini Kit (QS), and QIAamp PowerFecal DNA Kit (QP)) with or without an additional bead-beating step, using manual or automated methods, and compared them in terms of DNA extraction ability from human fecal samples. All methods produced DNA in sufficient concentration and quality for use in sequencing, and the samples were clustered according to the DNA extraction method. Inclusion of a bead-beating step in particular resulted in higher degrees of microbial diversity and had the greatest effect on gut microbiome composition. Among the samples subjected to the bead-beating method, TS kit samples were more similar to QP kit samples than to QS kit samples. Our results emphasize the importance of the mechanical disruption step for a more comprehensive profiling of the human gut microbiome. Copyright © 2017 The Authors. Published by Elsevier GmbH. All rights reserved.
Nonuniform fast Fourier transform method for numerical diffraction simulation on tilted planes.
Xiao, Yu; Tang, Xiahui; Qin, Yingxiong; Peng, Hao; Wang, Wei; Zhong, Lijing
2016-10-01
The method based on the rotation of the angular spectrum in the frequency domain is generally used for diffraction simulation between tilted planes. Due to the rotation of the angular spectrum, the interval between the sampling points in the Fourier domain is not uniform. For conventional fast Fourier transform (FFT)-based methods, spectrum interpolation is needed to approximate the values at equidistant sampling points. However, due to the numerical error caused by the spectrum interpolation, the calculation accuracy degrades very quickly as the rotation angle increases. Here, the diffraction propagation between tilted planes is transformed into the problem of evaluating a discrete Fourier transform on unevenly spaced sampling points, which can be evaluated effectively and precisely through the nonuniform fast Fourier transform (NUFFT) method. The most important advantage of this method is that the conventional spectrum interpolation is avoided and high calculation accuracy can be guaranteed for different rotation angles, even when the rotation angle is close to π/2. Moreover, its calculation efficiency is comparable to that of conventional FFT-based methods. Numerical examples, as well as a discussion of the calculation accuracy and the sampling method, are presented.
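The core operation, evaluating a Fourier sum on unevenly spaced frequency points, can be written down directly. The sketch below is the naive O(N²) sum that an NUFFT library accelerates; the array shapes and the rotated frequency line are illustrative.

```python
import numpy as np

def nudft2(u, x, y, fx, fy):
    """U[k] = sum_j u[j] * exp(-2*pi*i*(fx[k]*x[j] + fy[k]*y[j])).

    u, x, y: flattened field samples and their grid coordinates;
    fx, fy: arbitrary (unevenly spaced) frequency coordinates.
    """
    phase = np.exp(-2j * np.pi * (np.outer(fx, x) + np.outer(fy, y)))
    return phase @ u

# toy usage: a Gaussian field and frequency samples along a rotated line,
# as arises when the angular spectrum is rotated for a tilted plane
n = 32
xg, yg = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
u = np.exp(-(xg**2 + yg**2) / 0.1).ravel()
theta = np.pi / 3
f = np.linspace(-4, 4, 50)
spectrum = nudft2(u, xg.ravel(), yg.ravel(), f * np.cos(theta), f * np.sin(theta))
print(spectrum.shape)   # (50,)
```

In practice the direct sum is replaced by a fast NUFFT implementation; the point of the sketch is only that no interpolation onto an equidistant grid is involved.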
Sampling in ecology and evolution - bridging the gap between theory and practice
Albert, C.H.; Yoccoz, N.G.; Edwards, T.C.; Graham, C.H.; Zimmermann, N.E.; Thuiller, W.
2010-01-01
Sampling is a key issue for answering most ecological and evolutionary questions. The importance of developing a rigorous sampling design tailored to specific questions has already been discussed in the ecological and sampling literature and has provided useful tools and recommendations to sample and analyse ecological data. However, sampling issues are often difficult to overcome in ecological studies due to apparent inconsistencies between theory and practice, often leading to the implementation of simplified sampling designs that suffer from unknown biases. Moreover, we believe that classical sampling principles which are based on estimation of means and variances are insufficient to fully address many ecological questions that rely on estimating relationships between a response and a set of predictor variables over time and space. Our objective is thus to highlight the importance of selecting an appropriate sampling space and an appropriate sampling design. We also emphasize the importance of using prior knowledge of the study system to estimate models or complex parameters and thus better understand ecological patterns and processes generating these patterns. Using a semi-virtual simulation study as an illustration we reveal how the selection of the space (e.g. geographic, climatic), in which the sampling is designed, influences the patterns that can be ultimately detected. We also demonstrate the inefficiency of common sampling designs to reveal response curves between ecological variables and climatic gradients. Further, we show that response-surface methodology, which has rarely been used in ecology, is much more efficient than more traditional methods. Finally, we discuss the use of prior knowledge, simulation studies and model-based designs in defining appropriate sampling designs. We conclude by a call for development of methods to unbiasedly estimate nonlinear ecologically relevant parameters, in order to make inferences while fulfilling requirements of both sampling theory and field work logistics. © 2010 The Authors.
Kalivas, John H; Georgiou, Constantinos A; Moira, Marianna; Tsafaras, Ilias; Petrakis, Eleftherios A; Mousdis, George A
2014-04-01
Quantitative analysis of food adulterants is an important health and economic issue that needs to be fast and simple. Spectroscopy has significantly reduced analysis time. However, preparing analyte calibration samples matrix-matched to the prediction samples is still needed, and this can be laborious and costly. This paper reports the application of a newly developed pure-component Tikhonov regularization (PCTR) process that requires neither laboratory-prepared calibration samples nor reference analysis methods and is hence a greener calibration method. The PCTR method requires an analyte pure-component spectrum and non-analyte spectra. As a food analysis example, synchronous fluorescence spectra of extra virgin olive oil samples adulterated with sunflower oil are used. Results are shown to be better than those obtained using ridge regression with reference calibration samples. The flexibility of PCTR allows the inclusion of reference samples and is generic for use with other instrumental methods and food products. Copyright © 2013 Elsevier Ltd. All rights reserved.
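Generic Tikhonov regularization, the numerical core the method's name refers to, can be written as an augmented least-squares solve. This sketches only the generic technique, not the authors' specific PCTR construction; the composition of A from a pure-component spectrum plus non-analyte spectra is an assumption for illustration.

```python
import numpy as np

def tikhonov(A, b, lam):
    """Solve min_x ||A x - b||^2 + lam^2 ||x||^2 as an augmented least squares."""
    n = A.shape[1]
    A_aug = np.vstack([A, lam * np.eye(n)])
    b_aug = np.concatenate([b, np.zeros(n)])
    return np.linalg.lstsq(A_aug, b_aug, rcond=None)[0]

# hypothetical use: columns of A hold the analyte pure-component spectrum and
# measured non-analyte spectra; x then estimates their contributions to a mixture b
rng = np.random.default_rng(0)
A = rng.random((200, 3))              # stand-in spectra (200 wavelengths, 3 components)
x_true = np.array([0.7, 0.2, 0.1])
b = A @ x_true + 0.01 * rng.standard_normal(200)
print(tikhonov(A, b, lam=0.1))        # should be close to x_true
```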
NASA Astrophysics Data System (ADS)
Castelletti, Davide; Demir, Begüm; Bruzzone, Lorenzo
2014-10-01
This paper presents a novel semisupervised learning (SSL) technique defined in the context of ɛ-insensitive support vector regression (SVR) to estimate biophysical parameters from remotely sensed images. The proposed SSL method aims to mitigate the problems of small-sized biased training sets without collecting any additional samples with reference measures. This is achieved through two consecutive steps. The first step is devoted to injecting additional prior information into the learning phase of the SVR in order to adapt the importance of each training sample according to the distribution of the unlabeled samples. To this end, a weight is initially associated with each training sample based on a novel strategy that assigns higher weights to samples located in high-density regions of the feature space while giving reduced weights to those that fall into low-density regions. Then, in order to exploit different weights for training samples in the learning phase of the SVR, we introduce a weighted SVR (WSVR) algorithm. The second step is devoted to jointly exploiting labeled and informative unlabeled samples for further improving the definition of the WSVR learning function. To this end, the most informative unlabeled samples that are expected to have accurate target values are initially selected according to a novel strategy that relies on the distribution of the unlabeled samples in the feature space and on the WSVR function estimated in the first step. Then, we introduce a restructured WSVR algorithm that jointly uses labeled and unlabeled samples in the learning phase and tunes their importance by different values of regularization parameters. Experimental results obtained for the estimation of single-tree stem volume show the effectiveness of the proposed SSL method.
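The first step, weighting training samples by the density of the unlabeled pool around them, can be sketched with a kernel density estimate feeding scikit-learn's per-sample weights. Only the weighting idea is shown, under assumed shapes; the paper's specific weighting rule and its second, semisupervised step are not reproduced.

```python
import numpy as np
from sklearn.neighbors import KernelDensity
from sklearn.svm import SVR

def density_weights(X_train, X_unlabeled, bandwidth=0.5):
    """Weight each training sample by the unlabeled-data density around it."""
    kde = KernelDensity(bandwidth=bandwidth).fit(X_unlabeled)
    w = np.exp(kde.score_samples(X_train))       # density at each training point
    return w / w.mean()                          # normalize to mean 1

def weighted_svr(X_train, y_train, X_unlabeled, C=10.0, epsilon=0.1):
    w = density_weights(X_train, X_unlabeled)
    model = SVR(C=C, epsilon=epsilon)
    model.fit(X_train, y_train, sample_weight=w)  # per-sample importance
    return model

# toy usage: unlabeled pool denser near the origin
rng = np.random.default_rng(0)
X_l = rng.uniform(-3, 3, (20, 1)); y_l = np.sin(X_l).ravel()
X_u = rng.normal(0, 1, (500, 1))
print(weighted_svr(X_l, y_l, X_u).predict([[0.5]]))
```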
1983-10-01
sites should be collected and correlated to vegetational response through ordination or factor analysis techniques. 3. If site-specific methods are... specified number of samples is collected. If the... only, although the data collected were sufficient for determination of such parameters as density, cover, frequency, basal area, and importance value
Twenty-four cases of imported zika virus infections diagnosed by molecular methods.
Alejo-Cancho, Izaskun; Torner, Nuria; Oliveira, Inés; Martínez, Ana; Muñoz, José; Jane, Mireia; Gascón, Joaquim; Requena-Méndez, Ana; Vilella, Anna; Marcos, M Ángeles; Pinazo, María Jesús; Gonzalo, Verónica; Rodriguez, Natalia; Martínez, Miguel J
2016-10-01
Zika virus is an emerging flavivirus widely spreading through Latin America. Molecular diagnosis of the infection can be performed using serum, urine and saliva samples, although a well-defined diagnostic algorithm is not yet established. We describe a series of 24 cases of imported zika virus infection into Catalonia (northeastern Spain). Based on our findings, testing of paired serum and urine samples is recommended. Copyright © 2016 Elsevier Inc. All rights reserved.
Active Search on Carcasses versus Pitfall Traps: a Comparison of Sampling Methods.
Zanetti, N I; Camina, R; Visciarelli, E C; Centeno, N D
2016-04-01
The study of insect succession in cadavers and the classification of arthropods has mostly been done by placing a carcass in a cage, protected from vertebrate scavengers, which is then visited periodically. An alternative is to use specific traps. Few studies on carrion ecology and forensic entomology involving the carcasses of large vertebrates have employed pitfall traps. The aims of this study were to compare both sampling methods (active search on a carcass and pitfall trapping) for each coleopteran family, and to establish whether there is a discrepancy (underestimation and/or overestimation) in the presence of each family by either method. A great discrepancy was found for almost all families, with some being more abundant in samples obtained through active search on carcasses and others in samples from traps, whereas two families did not show any bias towards a given sampling method. The fact that families may be underestimated or overestimated by the type of sampling technique highlights the importance of combining both methods, active search on carcasses and pitfall traps, in order to obtain more complete information on decomposition, carrion habitat and cadaveric families or species. Furthermore, a hypothesis is advanced on the reasons why each sampling method shows biases towards certain families. Information about the sampling techniques, indicating which would be more appropriate to detect a particular family, is provided.
Binding, N; Schilder, K; Czeschinski, P A; Witting, U
1998-08-01
The 2,4-dinitrophenylhydrazine (2,4-DNPH) derivatization method mainly used for the determination of airborne formaldehyde was extended for acetaldehyde, acetone, 2-butanone, and cyclohexanone, the next four carbonyl compounds of industrial importance. Sampling devices and sampling conditions were adjusted for the respective limit value regulations. Analytical reliability criteria were established and compared to those of other recommended methods. With a minimum analytical range from one tenth to the 3-fold limit value in all cases and with relative standard deviations below 5%, the adjusted method meets all requirements for the reliable quantification of the four compounds in workplace air as well as in ambient air.
Estimating rare events in biochemical systems using conditional sampling.
Sundar, V S
2017-01-28
The paper focuses on the development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining such probabilities using brute-force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
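The level-by-level structure of subset simulation can be shown on a 1-D toy target: the rare-event probability is accumulated as a product of conditional probabilities, each estimated by MCMC seeded from the previous level. The sketch estimates P(Z > 4) for Z ~ N(0, 1), whose exact value is about 3.2e-5; it is a conceptual illustration, not the paper's biochemical application.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(0)

def g(x):                    # limit-state function; rare event = {g(x) > b_target}
    return x                 # 1-D toy: the far tail of a standard normal

def subset_simulation(b_target, n=2000, p0=0.1, max_levels=20):
    x = rng.standard_normal(n)               # level 0: plain Monte Carlo
    p = 1.0
    for _ in range(max_levels):
        gx = g(x)
        b = np.quantile(gx, 1 - p0)          # intermediate threshold
        if b >= b_target:                    # final level reached
            return p * np.mean(gx > b_target)
        p *= p0
        seeds = x[gx > b]                    # samples already above the level
        chains, steps = [], int(n / len(seeds))
        for s in seeds:                      # modified Metropolis within the level
            cur = s
            for _ in range(steps):
                prop = cur + rng.normal(0.0, 1.0)
                # accept w.r.t. the N(0,1) density, and stay inside {g > b}
                if rng.random() < np.exp(0.5 * (cur**2 - prop**2)) and g(prop) > b:
                    cur = prop
                chains.append(cur)
        x = np.array(chains)
    return p * np.mean(g(x) > b_target)

print("estimate:", subset_simulation(4.0), "exact:", 0.5 * erfc(4 / sqrt(2)))
```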
Data Analysis Recipes: Using Markov Chain Monte Carlo
NASA Astrophysics Data System (ADS)
Hogg, David W.; Foreman-Mackey, Daniel
2018-05-01
Markov Chain Monte Carlo (MCMC) methods for sampling probability density functions (combined with abundant computational resources) have transformed the sciences, especially in performing probabilistic inferences, or fitting models to data. In this primarily pedagogical contribution, we give a brief overview of the most basic MCMC method and some practical advice for the use of MCMC in real inference problems. We give advice on method choice, tuning for performance, methods for initialization, tests of convergence, troubleshooting, and use of the chain output to produce or report parameter estimates with associated uncertainties. We argue that autocorrelation time is the most important test for convergence, as it directly connects to the uncertainty on the sampling estimate of any quantity of interest. We emphasize that sampling is a method for doing integrals; this guides our thinking about how MCMC output is best used.
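A minimal random-walk Metropolis sampler in the article's spirit, targeting a standard normal, using the chain to do an integral, and crudely estimating the integrated autocorrelation time (a deliberately naive estimator; the article discusses better practice):

```python
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    return -0.5 * x**2                     # unnormalized log density of N(0, 1)

def metropolis(n_steps=100_000, step=1.0):
    chain = np.empty(n_steps)
    x, lp = 0.0, log_target(0.0)
    for i in range(n_steps):
        prop = x + step * rng.standard_normal()
        lp_prop = log_target(prop)
        if np.log(rng.random()) < lp_prop - lp:   # Metropolis accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

def integrated_act(c, max_lag=200):
    """Crude integrated autocorrelation time from a truncated ACF sum."""
    c = c - c.mean()
    acf = np.array([np.dot(c[: len(c) - k or None], c[k:]) for k in range(max_lag)])
    acf /= acf[0]
    return 1.0 + 2.0 * acf[1:].sum()

chain = metropolis()
print("E[x^2] ≈", np.mean(chain**2))       # sampling as integration: should be ~1
print("autocorrelation time ≈", integrated_act(chain))
```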
NASA Astrophysics Data System (ADS)
Breeson, Andrew C.; Sankar, Gopinathan; Goh, Gregory K. L.; Palgrave, Robert G.
2017-11-01
A method of quantitative phase analysis using valence band X-ray photoelectron spectra is presented and applied to the analysis of TiO2 anatase-rutile mixtures. The valence band spectra of pure TiO2 polymorphs were measured, and these spectral shapes used to fit valence band spectra from mixed phase samples. Given the surface sensitive nature of the technique, this yields a surface phase fraction. Mixed phase samples were prepared from high and low surface area anatase and rutile powders. In the samples studied here, the surface phase fraction of anatase was found to be linearly correlated with photocatalytic activity of the mixed phase samples, even for samples with very different anatase and rutile surface areas. We apply this method to determine the surface phase fraction of P25 powder. This method may be applied to other systems where a surface phase fraction is an important characteristic.
SOURCES OF VARIABILITY IN COLLECTION AND PREPARATION OF PAINT AND LEAD-COATING SAMPLES
Chronic exposure of children to lead can result in permanent physiologic impairment. Since surfaces coated with lead-containing paints and varnishes are potential sources of exposure, it is extremely important that reliable methods for sampling and analysis be available. The so...
Enhancing Important Fluctuations: Rare Events and Metadynamics from a Conceptual Viewpoint
NASA Astrophysics Data System (ADS)
Valsson, Omar; Tiwary, Pratyush; Parrinello, Michele
2016-05-01
Atomistic simulations play a central role in many fields of science. However, their usefulness is often limited by the fact that many systems are characterized by several metastable states separated by high barriers, leading to kinetic bottlenecks. Transitions between metastable states are thus rare events that occur on significantly longer timescales than one can simulate in practice. Numerous enhanced sampling methods have been introduced to alleviate this timescale problem, including methods based on identifying a few crucial order parameters or collective variables and enhancing the sampling of these variables. Metadynamics is one such method that has proven successful in a great variety of fields. Here we review the conceptual and theoretical foundations of metadynamics. As demonstrated, metadynamics is not just a practical tool but can also be considered an important development in the theory of statistical mechanics.
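A conceptual toy version of metadynamics on a 1-D double well, using Monte Carlo moves in place of molecular dynamics for brevity: Gaussian hills are periodically deposited at the current value of the collective variable, progressively filling the occupied well until barrier crossings become frequent. All parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

def U(x):
    return 5.0 * (x**2 - 1.0) ** 2          # double well with a barrier at x = 0

def metadynamics_mc(n_steps=50_000, w=0.2, sigma=0.2, stride=100, beta=5.0):
    x, hills = -1.0, []                      # start in the left well
    traj = np.empty(n_steps)

    def bias(y):                             # history-dependent repulsive bias
        if not hills:
            return 0.0
        return w * np.exp(-0.5 * ((y - np.array(hills)) / sigma) ** 2).sum()

    for i in range(n_steps):
        prop = x + 0.1 * rng.standard_normal()
        dE = (U(prop) + bias(prop)) - (U(x) + bias(x))
        if np.log(rng.random()) < -beta * dE:    # Metropolis step on the biased energy
            x = prop
        if (i + 1) % stride == 0:
            hills.append(x)                      # deposit a Gaussian hill
        traj[i] = x
    return traj

traj = metadynamics_mc()
print("fraction of steps in the right-hand well:", np.mean(traj > 0))
```

Without the bias, the 25 kT barrier in this toy potential makes transitions vanishingly rare; with hill deposition the walker escapes and eventually diffuses over both wells.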
Solid-Phase Extraction (SPE): Principles and Applications in Food Samples.
Ötles, Semih; Kartal, Canan
2016-01-01
Solid-Phase Extraction (SPE) is a sample preparation method that is practised in numerous application fields due to its many advantages over other traditional methods. SPE was invented as an alternative to liquid/liquid extraction and eliminated multiple disadvantages, such as the use of large amounts of solvent, extended operation times and procedure steps, potential sources of error, and high cost. Moreover, SPE can optionally be combined with other analytical methods and sample preparation techniques. The SPE technique is a useful tool for many purposes through its versatility: isolation, concentration, purification and clean-up are the main approaches in the practice of this method. Food structures represent a complicated matrix and can take different physical states, such as solid, viscous or liquid. Therefore, the sample preparation step plays a particularly important role in the determination of specific compounds in foods. SPE offers many opportunities not only for the analysis of a large diversity of food samples but also for optimization and advances. This review aims to provide a comprehensive overview of the basic principles of SPE and its applications for many analytes in food matrices.
On the use of transition matrix methods with extended ensembles.
Escobedo, Fernando A; Abreu, Charlles R A
2006-03-14
Different extended ensemble schemes for non-Boltzmann sampling (NBS) of a selected reaction coordinate lambda were formulated so that they employ (i) "variable" sampling window schemes (including the "successive umbrella sampling" method) to comprehensively explore the lambda domain and (ii) transition matrix methods to iteratively obtain the underlying free-energy eta landscape (or "importance" weights) associated with lambda. The connection between "acceptance ratio" and transition matrix methods was first established to form the basis of the approach for estimating eta(lambda). The validity and performance of the different NBS schemes were then assessed using as the lambda coordinate the configurational energy of the Lennard-Jones fluid. For the cases studied, it was found that the convergence rate in the estimation of eta is little affected by the use of data from high-order transitions, while it is noticeably improved by the use of a broader sampling window in the variable window methods. Finally, it is shown how an "elastic" sampling window can be used to effectively enact (nonuniform) preferential sampling over the lambda domain, and how to stitch the weights from separate one-dimensional NBS runs to produce an eta surface over a two-dimensional domain.
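The transition-matrix idea, estimating the weights eta(lambda) from accumulated acceptance probabilities rather than visit histograms, can be illustrated on a discrete toy landscape. This is plain transition-matrix Monte Carlo with single-bin moves, not the paper's extended-ensemble schemes; detailed balance yields eta differences between adjacent bins from the ratio of estimated transition probabilities.

```python
import numpy as np

rng = np.random.default_rng(2)
K = 20
U = 0.02 * (np.arange(K) - 12.0) ** 2        # toy free-energy profile over lambda bins

def tmmc(n_steps=200_000):
    C = np.zeros((K, K))                      # collection matrix
    lam = 12
    for _ in range(n_steps):
        nxt = lam + rng.choice((-1, 1))
        if nxt < 0 or nxt >= K:               # reject moves past the boundaries
            C[lam, lam] += 1.0
            continue
        acc = min(1.0, np.exp(U[lam] - U[nxt]))   # Metropolis acceptance probability
        C[lam, nxt] += acc                    # record the probability, not the outcome
        C[lam, lam] += 1.0 - acc
        if rng.random() < acc:
            lam = nxt
    T = C / C.sum(axis=1, keepdims=True)
    # detailed balance: eta[i+1] - eta[i] = -ln(T[i, i+1] / T[i+1, i])
    eta = np.zeros(K)
    for i in range(K - 1):
        eta[i + 1] = eta[i] - np.log(T[i, i + 1] / T[i + 1, i])
    return eta

eta = tmmc()
print(np.round(eta - eta.min(), 2))           # should track U - U.min()
```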
NASA Astrophysics Data System (ADS)
Houben, Georg J.; Koeniger, Paul; Schloemer, Stefan; Gröger-Trampe, Jens; Sültenfuß, Jürgen
2018-02-01
Depth-specific sampling of groundwater is important for a variety of hydrogeological applications. Several sampling methods are available, but comparatively little is known about how their results compare. Therefore, samples from regular observation wells (short screen), micro-filters and direct push were compared for two sites with differing hydrogeological conditions and land use, both located in the Fuhrberger Feld, Germany. The encountered hydrochemical zonation requires a high resolution of 1 m or better, which the small number of available regular observation wells could only roughly mirror. Because the three methods employ significantly different pumping rates and therefore have different spatial origins of the sample, individual concentrations at similar depths may differ significantly. In a hydrologically and chemically dynamic environment such as the agricultural site, this effect becomes more pronounced than at the more stable forest site. The micro-filters are probably the most depth-specific, but showed distinctly lower concentrations of dissolved gases than the other two methods, due to degassing during sampling. They should thus not be used for any method that relies on dissolved gas analysis.
NASA Astrophysics Data System (ADS)
WANG, P. T.
2015-12-01
Groundwater modeling requires assigning hydrogeological properties to every numerical grid cell. Due to the lack of detailed information and the inherent spatial heterogeneity, geological properties can be treated as random variables. A hydrogeological property is assumed to follow a multivariate distribution with spatial correlations. By sampling random numbers from a given statistical distribution and assigning a value to each grid cell, a random field for modeling can be completed. Therefore, statistical sampling plays an important role in the efficiency of the modeling procedure. Latin Hypercube Sampling (LHS) is a stratified random sampling procedure that provides an efficient way to sample variables from their multivariate distributions. This study combines the stratified random procedure from LHS with simulation by LU decomposition to form LULHS. Both conditional and unconditional simulations of LULHS were developed. The simulation efficiency and spatial correlation of LULHS are compared to three other simulation methods. The results show that, for both conditional and unconditional simulation, the LULHS method is more efficient in terms of computational effort. Fewer realizations are required to achieve the required statistical accuracy and spatial correlation.
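The two building blocks named in the abstract, stratified (Latin hypercube) sampling of standard normals and imposing spatial correlation through a matrix factorization of the covariance, can be sketched for an unconditional 1-D random field. Cholesky factorization stands in for the LU step; grid size, correlation length, and realization count are illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)

def lhs_standard_normal(n_real, dim):
    """One stratified standard-normal draw per probability stratum, per column."""
    ranks = np.argsort(rng.random((n_real, dim)), axis=0)  # random permutations
    u = (ranks + rng.random((n_real, dim))) / n_real       # stratified uniforms
    return norm.ppf(u)

n_real, n_grid = 100, 50
xs = np.linspace(0.0, 10.0, n_grid)
cov = np.exp(-np.abs(xs[:, None] - xs[None, :]) / 2.0)     # exponential covariance
L = np.linalg.cholesky(cov + 1e-10 * np.eye(n_grid))       # lower-triangular factor
fields = lhs_standard_normal(n_real, n_grid) @ L.T         # correlated realizations
print(fields.shape)                                        # (100, 50) random fields
```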
Strategy to increase Barangan Banana production in Kabupaten Deli Serdang
NASA Astrophysics Data System (ADS)
Adhany, I.; Chalil, D.; Ginting, R.
2018-02-01
This study was conducted to analyze internal and external factors in increasing Barangan Banana production in Kabupaten Deli Serdang. Samples were determined by the snowball sampling technique and the purposive sampling method. Using the SWOT analysis method, this study found 6 internal strategic factors and 9 external strategic factors. Among these, support for production facilities appears as the most important internal strategic factor, while the demand for Barangan Banana appears as the most important external strategic factor. Based on the importance and existing condition of these strategic factors, the following strategies are discussed and proposed to increase Barangan Banana productivity in Kabupaten Deli Serdang: using support for production facilities and realizing supporting facilities with farming experience (strength-opportunity, SO); organizing mentoring to meet the demand for Barangan Banana (weakness-opportunity, WO); making use of funding support and subsidies to widen the land and using tissue culture seeds, facilities and infrastructure (strength-threat, ST); and increasing funding support to widen the land, use tissue culture seeds, and improve facilities and infrastructure (weakness-threat, WT).
Ponce, Camille; Kaczorowski, Flora; Perpoint, Thomas; Miailhes, Patrick; Sigal, Alain; Javouhey, Etienne; Gillet, Yves; Jacquin, Laurent; Douplat, Marion; Tazarourte, Karim; Potinet, Véronique; Simon, Bruno; Lavoignat, Adeline; Bonnot, Guillaume; Sow, Fatimata; Bienvenu, Anne-Lise; Picot, Stéphane
2017-01-01
Background: Sensitive and easy-to-perform methods for the diagnosis of malaria are not yet available. Improving the limit of detection and following the requirements for certification are issues to be addressed in both endemic and non-endemic settings. The aim of this study was to test whether loop-mediated isothermal amplification of DNA (LAMP) may be an alternative to microscopy or real-time PCR for the screening of imported malaria cases in a non-endemic area. Results: 310 blood samples associated with 829 suspected cases of imported malaria were tested during a one-year period. Microscopy (thin and thick stained blood slides, reference standard) was used for the diagnosis. Real-time PCR was used as a standard of truth, and LAMP (Meridian Malaria Plus) was used as an index test in a prospective study conducted following the Standards for Reporting Diagnostic Accuracy Studies. In the 83 positive samples, species identification was P. falciparum (n = 66), P. ovale (n = 9), P. vivax (n = 3), P. malariae (n = 3) and 2 co-infections with P. falciparum + P. malariae. Using LAMP methods, 93 samples gave positive results, including 4 false-positives. Sensitivity, specificity, positive predictive value and negative predictive value for LAMP tests were 100%, 98.13%, 95.51%, and 100% compared to PCR. Conclusion: The high negative predictive value and limit of detection suggest that LAMP can be used for screening of imported malaria cases in non-endemic countries when expert microscopists are not immediately available. However, the rare occurrence of non-valid results and the need for species identification and quantification of positive samples preclude the use of LAMP as a single reference method. PMID:29251261
Mass load estimation errors utilizing grab sampling strategies in a karst watershed
Fogle, A.W.; Taraba, J.L.; Dinger, J.S.
2003-01-01
Developing a mass load estimation method appropriate for a given stream and constituent is difficult due to inconsistencies in hydrologic and constituent characteristics. The difficulty may be increased in flashy flow conditions such as karst. Many projects undertaken are constrained by budget and manpower and do not have the luxury of sophisticated sampling strategies. The objectives of this study were to: (1) examine two grab sampling strategies with varying sampling intervals and determine the error in mass load estimates, and (2) determine the error that can be expected when a grab sample is collected at a time of day when the diurnal variation is most divergent from the daily mean. Results show grab sampling with continuous flow to be a viable data collection method for estimating mass load in the study watershed. Comparing weekly, biweekly, and monthly grab sampling, monthly sampling produces the best results with this method. However, the time of day the sample is collected is important. Failure to account for diurnal variability when collecting a grab sample may produce unacceptable error in mass load estimates. The best time to collect a sample is when the diurnal cycle is nearest the daily mean.
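Both the quantity being estimated, mass load as the flow-weighted integral of concentration, and the diurnal-timing effect of objective (2) can be illustrated with synthetic data (all values and cycles invented for illustration):

```python
import numpy as np

t = np.arange(0, 365, 1 / 24)                      # one year of hourly time steps (days)
q = 1.0 + 0.5 * np.sin(2 * np.pi * t / 365)        # flow, m^3/s (seasonal cycle)
c = 10.0 + 2.0 * np.sin(2 * np.pi * t)             # concentration, mg/L (diurnal cycle)
dt = 3600.0                                        # seconds per hourly step

# instantaneous load: mg/L x m^3/s = g/s, so summing c*q*dt gives grams
true_load = np.sum(c * q * dt) / 1000.0            # kg per year

def grab_load(hour):
    """Monthly grab sample at a fixed hour of day, paired with continuous flow."""
    idx = np.arange(hour, len(t), 24 * 30)         # one grab roughly per month
    c_hat = np.mean(c[idx])                        # constant-concentration estimate
    return np.sum(c_hat * q * dt) / 1000.0

for hour in (6, 12, 18):
    err = 100 * (grab_load(hour) - true_load) / true_load
    print(f"grab at {hour:02d}:00 -> {err:+.1f}% load error")
```

Sampling at 06:00 or 18:00, where this synthetic diurnal cycle peaks or bottoms out, biases the annual load by roughly ±20%, which is the kind of timing effect the study quantifies for real data.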
MontePython 3: Parameter inference code for cosmology
NASA Astrophysics Data System (ADS)
Brinckmann, Thejs; Lesgourgues, Julien; Audren, Benjamin; Benabed, Karim; Prunet, Simon
2018-05-01
MontePython 3 provides numerous ways to explore parameter space using Monte Carlo Markov Chain (MCMC) sampling, including Metropolis-Hastings, Nested Sampling, Cosmo Hammer, and a Fisher sampling method. This improved version of the Monte Python (ascl:1307.002) parameter inference code for cosmology offers new ingredients that improve the performance of Metropolis-Hastings sampling, speeding up convergence and offering significant time improvement in difficult runs. Additional likelihoods and plotting options are available, as are post-processing algorithms such as importance sampling and the addition of derived parameters.
Waist Circumference and Objectively Measured Sedentary Behavior in Rural School Adolescents
ERIC Educational Resources Information Center
Machado-Rodrigues, Aristides M.; Coelho e Silva, Manuel J.; Ribeiro, Luís P.; Fernandes, Romulo; Mota, Jorge; Malina, Robert M.
2016-01-01
Background: Research on relationships between lifestyle behaviors and adiposity in school youth is potentially important for identifying subgroups at risk. This study evaluates the associations between waist circumference (WC) and objective measures of sedentary behavior (SB) in a sample of rural school adolescents. Methods: The sample included…
Cleveland, Danielle; Brumbaugh, William G.; MacDonald, Donald D.
2017-01-01
Evaluations of sediment quality conditions are commonly conducted using whole-sediment chemistry analyses but can be enhanced by evaluating multiple lines of evidence, including measures of the bioavailable forms of contaminants. In particular, porewater chemistry data provide information that is directly relevant for interpreting sediment toxicity data. Various methods for sampling porewater for trace metals and dissolved organic carbon (DOC), which is an important moderator of metal bioavailability, have been employed. The present study compares the peeper, push point, centrifugation, and diffusive gradients in thin films (DGT) methods for the quantification of 6 metals and DOC. The methods were evaluated at low and high concentrations of metals in 3 sediments having different concentrations of total organic carbon and acid volatile sulfide and different particle-size distributions. At low metal concentrations, centrifugation and push point sampling resulted in up to 100 times higher concentrations of metals and DOC in porewater compared with peepers and DGTs. At elevated metal levels, the measured concentrations were in better agreement among the 4 sampling techniques. The results indicate that there can be marked differences among operationally different porewater sampling methods, and it is unclear if there is a definitive best method for sampling metals and DOC in porewater.
ERIC Educational Resources Information Center
Lahey, Benjamin B.; D'Onofrio, Brian M.; Waldman, Irwin D.
2009-01-01
Epidemiology uses strong sampling methods and study designs to test refutable hypotheses regarding the causes of important health, mental health, and social outcomes. Epidemiologic methods are increasingly being used to move developmental psychopathology from studies that catalogue correlates of child and adolescent mental health to designs that…
NASA Astrophysics Data System (ADS)
Lang-Yona, N.; Dannemiller, K.; Yamamoto, N.; Burshtein, N.; Peccia, J.; Yarden, O.; Rudich, Y.
2012-03-01
Airborne fungal spores are an important fraction of atmospheric particulate matter and are major causative agents of allergenic and infectious diseases. Predicting the variability and species of allergy-causing fungal spores requires detailed and reliable methods for identification and quantification. There are diverse methods for their detection in the atmosphere and in indoor environments; yet, it is important to optimize suitable methods for characterization of fungal spores in atmospheric samples. In this study we sampled and characterized total and specific airborne fungal spores from PM10 samples collected in Rehovot, Israel over an entire year. The total fungal spore concentrations varied throughout the year, although the species variability was nearly the same. Seasonal equivalent spore concentrations analyzed by real-time quantitative-PCR-based methods were fall > winter > spring > summer. Reported concentrations based on ergosterol analysis for the same samples were fall > spring > winter > summer. Correlation between the two analytical methods was found only for the spring season. These poor associations may be due to per-spore ergosterol variations that arise from both varying production rates and molecular degradation of ergosterol. While conversion of genome copies to spore concentration is not yet straightforward, the potential for improving this conversion and the ability of qPCR to identify groups of fungi or specific species make this method preferable for environmental spore quantification. Tools for establishing the relation between the presence of species and the actual ability to induce allergies are still needed in order to predict the effect on human health.
NASA Astrophysics Data System (ADS)
Lang-Yona, N.; Dannemiller, K.; Yamamoto, N.; Burshtein, N.; Peccia, J.; Yarden, O.; Rudich, Y.
2011-10-01
Airborne fungal spores are an important fraction of atmospheric particulate matter and are major causative agents of allergenic and infectious diseases. Predicting the variability and species of allergy-causing fungal spores requires detailed and reliable methods for identification and quantification. There are diverse methods for their detection in the atmosphere and in indoor environments; yet, it is important to optimize suitable methods for characterization of fungal spores in atmospheric samples. In this study we sampled and characterized total and specific airborne fungal spores from PM10 samples collected in Rehovot, Israel over an entire year. The total fungal spore concentrations varied throughout the year, although the species variability was nearly the same. Seasonal equivalent spore concentrations analyzed by real-time quantitative-PCR-based methods were fall > winter > spring > summer. Reported concentrations based on ergosterol analysis for the same samples were fall > spring > winter > summer. Correlation between the two analytical methods was found only for the spring season. These poor associations may be due to per-spore ergosterol variations that arise from both varying production rates and molecular degradation of ergosterol. While conversion of genome copies to spore concentration is not yet straightforward, the potential for improving this conversion and the ability of qPCR to identify groups of fungi or specific species make this method preferable for environmental spore quantification. Tools for establishing the relation between the presence of species and the actual ability to induce allergies are still needed in order to predict the effect on human health.
Rennie, Robert P; Turnbull, Lee-Ann; Gauchier-Pitts, Kaylee; Bennett, Tracy; Dyrland, Debbie; Blonski, Susan
2016-08-01
The ability to isolate and identify causative agents of urinary tract infections relies primarily on the quality of the urine sample that is submitted to the microbiology laboratory. The most important factors are the method of collection, the maintenance of viability of the potential pathogens during transport, and standardization of the culturing of the urine sample. This report is a composite of several investigations: a comparison of collection and transport on urine culture paddles with a preservative urine sponge (Uriswab), and a comparison of Uriswab with the BD preservative transport tube as methods of preservation of urinary pathogens. Primary studies showed that Uriswab maintained significantly more urinary pathogens than the urine culture paddle, with fewer mixed or contaminated cultures. The two preservative transport systems were comparable for maintenance of viability of the pathogens, but there were fewer mixed cultures when samples were collected with Uriswab. This study confirms the importance of a standard volume of 1 μL of urine for culture. Copyright © 2016 Elsevier Inc. All rights reserved.
Neural Network and Nearest Neighbor Algorithms for Enhancing Sampling of Molecular Dynamics.
Galvelis, Raimondas; Sugita, Yuji
2017-06-13
The free energy calculations of complex chemical and biological systems with molecular dynamics (MD) are inefficient due to multiple local minima separated by high-energy barriers. The minima can be escaped using an enhanced sampling method such as metadynamics, which applies a bias (i.e., importance sampling) along a set of collective variables (CVs), but the maximum number of CVs (or dimensions) is severely limited. We propose a high-dimensional bias potential method (NN2B) based on two machine learning algorithms: the nearest neighbor density estimator (NNDE) and the artificial neural network (ANN) for the bias potential approximation. The bias potential is constructed iteratively from short biased MD simulations, accounting for correlation among CVs. Our method is capable of achieving ergodic sampling and calculating the free energy of polypeptides with up to 8-dimensional bias potentials.
Ruple-Czerniak, A; Bolte, D S; Burgess, B A; Morley, P S
2014-07-01
Nosocomial salmonellosis is an important problem in veterinary hospitals that treat horses and other large animals. Detection and mitigation of outbreaks and prevention of healthcare-associated infections often require detection of Salmonella enterica in the hospital environment. To compare 2 previously published methods for detecting environmental contamination with S. enterica in a large animal veterinary teaching hospital. Hospital-based comparison of environmental sampling techniques. A total of 100 pairs of environmental samples were collected from stalls used to house large animal cases (horses, cows or New World camelids) that were confirmed to be shedding S. enterica by faecal culture. Stalls were cleaned and disinfected prior to sampling, and the same areas within each stall were sampled for the paired samples. One method of detection used sterile, premoistened sponges that were cultured using thioglycolate enrichment before plating on XLT-4 agar. The other method used electrostatic wipes that were cultured using buffered peptone water, tetrathionate and Rappaport-Vassiliadis R10 broths before plating on XLT-4 agar. Salmonella enterica was recovered from 14% of samples processed using the electrostatic wipe sampling and culture procedure, whereas S. enterica was recovered from only 4% of samples processed using the sponge sampling and culture procedure. There was test agreement for 85 pairs of culture-negative samples and 3 pairs of culture-positive samples. However, the remaining 12 pairs of samples with discordant results created significant disagreement between the 2 detection methods (P<0.01). Persistence of Salmonella in the environment of veterinary hospitals can occur even with rigorous cleaning and disinfection. Use of sensitive methods for detection of environmental contamination is critical when detecting and mitigating this problem in veterinary hospitals. These results suggest that the electrostatic wipe sampling and culture method was more sensitive than the sponge sampling and culture method. © 2013 EVJ Ltd.
Observational studies of patients in the emergency department: a comparison of 4 sampling methods.
Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R
2012-08-01
We evaluate the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours." For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables. However, the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables. However, the potential for bias for these variables appears small. Copyright © 2012. Published by Mosby, Inc.
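A hedged sketch of the kind of comparison the study performs: simulate an ED population whose payer mix varies with arrival hour, draw a true random sample and a "business hours" sample, and test each against the population with a chi-square test. The population model, sample size, and variable below are invented for illustration (scipy assumed available).

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(2)

# Hypothetical ED population: arrival hour plus one categorical variable
# (payer) whose distribution shifts for evening/night arrivals.
n = 21_662
hour = rng.integers(0, 24, size=n)
public = rng.uniform(size=n) < (0.3 + 0.2 * (hour >= 18))

def sample_differs(idx, alpha=0.05):
    # Chi-square test of the sample's payer mix against the population's.
    table = np.array([
        [public[idx].sum(), (~public[idx]).sum()],
        [public.sum(), (~public).sum()],
    ])
    _, p, _, _ = chi2_contingency(table)
    return p < alpha

random_idx = rng.choice(n, size=400, replace=False)       # true random
day_idx = rng.choice(np.where((hour >= 8) & (hour < 17))[0],
                     size=400, replace=False)             # "business hours"

print("true random sample differs:   ", sample_differs(random_idx))
print("business-hours sample differs:", sample_differs(day_idx))
```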
Anomaly detection in reconstructed quantum states using a machine-learning technique
NASA Astrophysics Data System (ADS)
Hara, Satoshi; Ono, Takafumi; Okamoto, Ryo; Washio, Takashi; Takeuchi, Shigeki
2014-02-01
The accurate detection of small deviations in given density matrices is important for quantum information processing. Here we propose a method based on the concept of data mining. We demonstrate that the proposed method can more accurately detect small erroneous deviations in reconstructed density matrices, which contain intrinsic fluctuations due to the limited number of samples, than a naive method of checking the trace distance from the average of the given density matrices. This method has the potential to be a key tool in broad areas of physics where the detection of small deviations of quantum states reconstructed using a limited number of samples is essential.
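For context, the naive baseline the authors compare against rests on the trace distance between density matrices, which can be computed directly from the eigenvalues of their difference; a minimal sketch follows, with the single-qubit states chosen purely for illustration.

```python
import numpy as np

def trace_distance(rho, sigma):
    # T(rho, sigma) = 0.5 * ||rho - sigma||_1: half the sum of the absolute
    # eigenvalues of the Hermitian difference matrix.
    return 0.5 * np.sum(np.abs(np.linalg.eigvalsh(rho - sigma)))

# Illustrative single-qubit states: |0><0| and a slightly depolarized copy.
rho = np.array([[1.0, 0.0], [0.0, 0.0]])
eps = 0.02
sigma = (1 - eps) * rho + eps * np.eye(2) / 2

print(trace_distance(rho, sigma))   # eps/2 = 0.01 for this pair
```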
Random Numbers and Monte Carlo Methods
NASA Astrophysics Data System (ADS)
Scherer, Philipp O. J.
Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages, Monte Carlo methods, which sample the integration volume at randomly chosen points, are very useful. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with a given probability distribution, which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by preferentially sampling the important configurations. Finally, the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
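A minimal numerical illustration of the importance sampling idea described above: estimating the mean of a function that is large only in a region the target distribution rarely visits, first by naive sampling from the target and then by sampling from a proposal concentrated near the peak and reweighting by the density ratio. The specific densities are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

f = lambda x: np.exp(-0.5 * (x - 3.0) ** 2)     # large only near x = 3
log_p = lambda x: -0.5 * x**2                   # target: standard normal
log_q = lambda x: -0.5 * (x - 3.0) ** 2         # proposal: centred on the peak
# (equal normalizing constants, so they cancel in the weight ratio)

n = 100_000
x_p = rng.normal(size=n)                        # naive: sample the target
naive = np.mean(f(x_p))

x_q = rng.normal(loc=3.0, size=n)               # importance: sample proposal
w = np.exp(log_p(x_q) - log_q(x_q))             # weights p/q
importance = np.mean(w * f(x_q))

# Both estimate E_p[f] ~ 0.0745; the importance estimate has lower variance.
print(naive, importance)
```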
Scanning electron microscopy of bone.
Boyde, Alan
2012-01-01
This chapter describes methods for scanning electron microscope (SEM) imaging of bone and bone cells. Backscattered electron (BSE) imaging is by far the most useful in the bone field, followed by secondary electron (SE) and energy-dispersive X-ray (EDX) analytical modes. This chapter considers preparing and imaging samples of unembedded bone having 3D detail in a 3D surface; topography-free, polished or micromilled, resin-embedded block surfaces; and resin casts of space in bone matrix. The chapter considers methods for fixation, drying, looking at undersides of bone cells, and coating. Maceration with alkaline bacterial pronase, hypochlorite, hydrogen peroxide, and sodium or potassium hydroxide to remove cells and unmineralised matrix is described in detail. Particular attention is given to methods for 3D BSE SEM imaging of bone samples, and recommendations are given for the types of resin embedding of bone for BSE imaging. Correlated confocal and SEM imaging of PMMA-embedded bone requires the use of glycerol to coverslip. Cathodoluminescence (CL) mode SEM imaging is an alternative for visualising fluorescent mineralising-front labels such as calcein and tetracyclines. Making spatial casts from PMMA or other resin-embedded samples is an important use of this material. Correlation with other imaging means, including microradiography and microtomography, is important. Shipping wet bone samples between labs is best done in glycerol. Environmental SEM (ESEM, controlled vacuum mode) is valuable in eliminating "charging" problems, which are common with complex, cancellous bone samples.
Preparation of samples for leaf architecture studies, a method for mounting cleared leaves.
Vasco, Alejandra; Thadeo, Marcela; Conover, Margaret; Daly, Douglas C
2014-09-01
Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. • Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. • The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration.
Identifying High-Rate Flows Based on Sequential Sampling
NASA Astrophysics Data System (ADS)
Zhang, Yu; Fang, Binxing; Luo, Hao
We consider the problem of fast identification of high-rate flows in backbone links with possibly millions of flows. Accurate identification of high-rate flows is important for active queue management, traffic measurement, and network security, such as the detection of distributed denial of service attacks. It is difficult to identify high-rate flows directly in backbone links because tracking the possible millions of flows requires correspondingly large high-speed memories. To reduce the measurement overhead, the deterministic 1-out-of-k sampling technique, which is also implemented in Cisco routers (NetFlow), is adopted. Ideally, a high-rate flow identification method should have short identification time, low memory cost, and low processing cost. Most importantly, it should allow the identification accuracy to be specified. We develop two such methods. The first method is based on the fixed sample size test (FSST), which is able to identify high-rate flows with user-specified identification accuracy. However, since FSST has to record every sampled flow during the measurement period, it is not memory efficient. Therefore a second, novel method based on the truncated sequential probability ratio test (TSPRT) is proposed. Through sequential sampling, TSPRT is able to remove low-rate flows and identify high-rate flows at an early stage, which reduces the memory cost and identification time, respectively. According to the way the parameters in TSPRT are determined, two versions of TSPRT are proposed: TSPRT-M, which is suitable when low memory cost is preferred, and TSPRT-T, which is suitable when short identification time is preferred. The experimental results show that TSPRT requires less memory and identification time in identifying high-rate flows while satisfying the accuracy requirement as compared to previously proposed methods.
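A sketch of the underlying sequential machinery, Wald's sequential probability ratio test for Bernoulli observations; the rates, error levels, and stopping thresholds below are illustrative and are not the TSPRT-M/TSPRT-T parameterizations from the paper.

```python
import math, random

def sprt(observations, p0=0.01, p1=0.05, alpha=0.01, beta=0.01):
    """Wald's SPRT: classify a Bernoulli stream as rate p0 vs rate p1."""
    upper = math.log((1 - beta) / alpha)    # cross it -> decide "high-rate"
    lower = math.log(beta / (1 - alpha))    # cross it -> decide "low-rate"
    llr, n = 0.0, 0
    for x in observations:
        n += 1
        # Log-likelihood ratio increment for one Bernoulli observation.
        llr += math.log(p1 / p0) if x else math.log((1 - p1) / (1 - p0))
        if llr >= upper:
            return "high-rate", n
        if llr <= lower:
            return "low-rate", n
    return "undecided", n

random.seed(4)
stream = (random.random() < 0.05 for _ in range(100_000))  # a genuinely high-rate flow
print(sprt(stream))   # typically decides "high-rate" well before the stream ends
```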
Nuclear Ensemble Approach with Importance Sampling.
Kossoski, Fábris; Barbatti, Mario
2018-06-12
We show that the importance sampling technique can effectively augment the range of problems where the nuclear ensemble approach can be applied. A sampling probability distribution function initially determines the collection of initial conditions for which calculations are performed, as usual. Then, results for a distinct target distribution are computed by introducing compensating importance sampling weights for each sampled point. This mapping between the two probability distributions can be performed whenever they are both explicitly constructed. Perhaps most notably, this procedure allows for the computation of temperature-dependent observables. As a test case, we investigated the UV absorption spectra of phenol, which have been shown to have a marked temperature dependence. Application of the proposed technique to a temperature range covering 500 K provides results that converge to those obtained with conventional sampling. We further show that an overall improved rate of convergence is obtained when sampling is performed at intermediate temperatures. The comparison between calculated and available measured cross sections is very satisfactory, as the main features of the spectra are correctly reproduced. As a second test case, one of Tully's classical models was revisited, and we show that the computation of dynamical observables also profits from the importance sampling technique. In summary, the strategy developed here can be employed to assess the role of temperature for any property calculated within the nuclear ensemble method, at the same computational cost as doing so for a single temperature.
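The reweighting step described above is generic whenever both distributions are explicit. A minimal sketch for the temperature case: configurations sampled at one temperature are reweighted by the ratio of target to sampling Boltzmann factors. The mock "energies" below are not drawn from a real ensemble; only the weighting formula is the point.

```python
import numpy as np

rng = np.random.default_rng(5)

# Mock per-configuration energies "sampled" at T_s; we want averages at T_t.
k_B = 1.0                                   # energies in units of k_B*T
T_s, T_t = 400.0, 300.0
energies = rng.normal(loc=10.0, scale=1.0, size=50_000)
observable = energies**2                    # any per-configuration property

# Importance weight of each sample: target / sampling Boltzmann factor,
# i.e. exp(-E/(k_B*T_t)) / exp(-E/(k_B*T_s)), normalized over the ensemble.
log_w = -energies * (1.0 / (k_B * T_t) - 1.0 / (k_B * T_s))
log_w -= log_w.max()                        # stabilize before exponentiating
w = np.exp(log_w)
w /= w.sum()

print("reweighted <O> at T_t:", float(np.sum(w * observable)))
```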
Wen, Tingxi; Zhang, Zhongnan; Qiu, Ming; Zeng, Ming; Luo, Weizhen
2017-01-01
The computer mouse is an important human-computer interaction device. But patients with physical finger disability are unable to operate this device. Surface EMG (sEMG) can be monitored by electrodes on the skin surface and is a reflection of the neuromuscular activities. Therefore, we can control limbs auxiliary equipment by utilizing sEMG classification in order to help the physically disabled patients to operate the mouse. To develop a new a method to extract sEMG generated by finger motion and apply novel features to classify sEMG. A window-based data acquisition method was presented to extract signal samples from sEMG electordes. Afterwards, a two-dimensional matrix image based feature extraction method, which differs from the classical methods based on time domain or frequency domain, was employed to transform signal samples to feature maps used for classification. In the experiments, sEMG data samples produced by the index and middle fingers at the click of a mouse button were separately acquired. Then, characteristics of the samples were analyzed to generate a feature map for each sample. Finally, the machine learning classification algorithms (SVM, KNN, RBF-NN) were employed to classify these feature maps on a GPU. The study demonstrated that all classifiers can identify and classify sEMG samples effectively. In particular, the accuracy of the SVM classifier reached up to 100%. The signal separation method is a convenient, efficient and quick method, which can effectively extract the sEMG samples produced by fingers. In addition, unlike the classical methods, the new method enables to extract features by enlarging sample signals' energy appropriately. The classical machine learning classifiers all performed well by using these features.
Seroprevalence and risk factors associated with bovine brucellosis in the Potohar Plateau, Pakistan.
Ali, Shahzad; Akhter, Shamim; Neubauer, Heinrich; Melzer, Falk; Khan, Iahtasham; Abatih, Emmanuel Nji; El-Adawy, Hosny; Irfan, Muhammad; Muhammad, Ali; Akbar, Muhammad Waqas; Umar, Sajid; Ali, Qurban; Iqbal, Muhammad Naeem; Mahmood, Abid; Ahmed, Haroon
2017-01-28
The seroprevalence and risk factors of bovine brucellosis were studied at animal and herd level using a combination of culture, serological and molecular methods. The study was conducted in 253 randomly selected cattle herds of the Potohar plateau, Pakistan from which a total of 2709 serum (1462 cattle and 1247 buffaloes) and 2330 milk (1168 cattle and 1162 buffaloes) samples were collected. Data on risk factors associated with seroprevalence of brucellosis were collected through interviews using questionnaires. Univariable and multivariable random effects logistic regression models were used for identifying important risk factors at animal and herd levels. One hundred and seventy (6.3%) samples and 47 (18.6%) herds were seropositive for brucellosis by Rose Bengal Plate test. Variations in seroprevalence were observed across the different sampling sites. At animal level, sex, species and stock replacement were found to be potential risk factors for brucellosis. At herd level, herd size (≥9 animals) and insemination method used were important risk factors. The presence of Brucella DNA was confirmed with a real-time polymerase chain reaction assay (qRT-PCR) in 52.4% out of 170 serological positive samples. In total, 156 (6.7%) milk samples were positive by milk ring test. B. abortus biovar 1 was cultured from 5 positive milk samples. This study shows that the seroprevalence of bovine brucellosis is high in some regions in Pakistan. Prevalence was associated with herd size, abortion history, insemination methods used, age, sex and stock replacement methods. The infected animal may act as source of infection for other animals and for humans. The development of control strategies for bovine brucellosis through implementation of continuous surveillance and education programs in Pakistan is warranted.
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMs output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population NHANES 2013-2014 cycle. PMID:26949569
Enhanced sampling techniques in molecular dynamics simulations of biological systems.
Bernardi, Rafael C; Melo, Marcelo C R; Schulten, Klaus
2015-05-01
Molecular dynamics has emerged as an important research methodology covering systems to the level of millions of atoms. However, insufficient sampling often limits its application. The limitation is due to rough energy landscapes, with many local minima separated by high-energy barriers, which govern biomolecular motion. In the past few decades methods have been developed that address the sampling problem, such as replica-exchange molecular dynamics, metadynamics and simulated annealing. Here we present an overview of these sampling methods in an attempt to shed light on which should be selected depending on the type of system property studied. Enhanced sampling methods have been employed for a broad range of biological systems, and the choice of a suitable method is connected to biological and physical characteristics of the system, in particular system size. While metadynamics and replica-exchange molecular dynamics are the most adopted sampling methods to study biomolecular dynamics, simulated annealing is well suited to characterize very flexible systems. The use of annealing methods was long restricted to the simulation of small proteins; however, a variant of the method, generalized simulated annealing, can be employed at a relatively low computational cost for large macromolecular complexes. Molecular dynamics trajectories frequently do not reach all relevant conformational substates, for example those connected with biological function, a problem that can be addressed by employing enhanced sampling algorithms. This article is part of a Special Issue entitled Recent developments of molecular dynamics. Copyright © 2014 Elsevier B.V. All rights reserved.
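As a concrete anchor for one of the methods reviewed, the replica-exchange swap move reduces to a single Metropolis test; a minimal sketch, with the unit convention and energies assumed purely for illustration:

```python
import math, random

def attempt_swap(beta_i, beta_j, U_i, U_j):
    # Accept the temperature swap with probability min(1, exp(delta)),
    # where delta = (beta_i - beta_j) * (U_i - U_j) and beta = 1/(k_B*T).
    delta = (beta_i - beta_j) * (U_i - U_j)
    return delta >= 0 or random.random() < math.exp(delta)

random.seed(6)
k_B = 0.0019872                      # kcal/(mol K); an assumed unit choice
beta = lambda T: 1.0 / (k_B * T)

# Cold replica (300 K) holds U_i; the hot replica (500 K) happens to hold a
# lower-energy configuration U_j, so the swap is always accepted.
print(attempt_swap(beta(300.0), beta(500.0), U_i=-100.0, U_j=-120.0))
```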
DOE Office of Scientific and Technical Information (OSTI.GOV)
Black, Stuart; Ferrell, Jack R.
Carbonyl compounds present in bio-oils are known to be responsible for bio-oil property changes upon storage and during upgrading. As such, carbonyl content has previously been used as a method of tracking bio-oil aging and condensation reactions with less variability than viscosity measurements. Given the importance of carbonyls in bio-oils, accurate analytical methods for their quantification are very important for the bio-oil community. Potentiometric titration methods based on carbonyl oximation have long been used for the determination of carbonyl content in pyrolysis bio-oils. Here in this study, we present a modification of the traditional carbonyl oximation procedures that results in less reaction time, smaller sample size, higher precision, and more accurate carbonyl determinations. Some compounds such as carbohydrates are not measured by the traditional method (modified Nicolaides method), resulting in low estimations of the carbonyl content. Furthermore, we have shown that reaction completion for the traditional method can take up to 300 hours. The new method presented here (the modified Faix method) reduces the reaction time to 2 hours, uses triethanolamine (TEA) in the place of pyridine, and requires a smaller sample size for the analysis. Carbonyl contents determined using this new method are consistently higher than when using the traditional titration methods.
Black, Stuart; Ferrell, Jack R.
2016-01-06
Carbonyl compounds present in bio-oils are known to be responsible for bio-oil property changes upon storage and during upgrading. As such, carbonyl content has previously been used as a method of tracking bio-oil aging and condensation reactions with less variability than viscosity measurements. Given the importance of carbonyls in bio-oils, accurate analytical methods for their quantification are very important for the bio-oil community. Potentiometric titration methods based on carbonyl oximation have long been used for the determination of carbonyl content in pyrolysis bio-oils. Here in this study, we present a modification of the traditional carbonyl oximation procedures that results in less reaction time, smaller sample size, higher precision, and more accurate carbonyl determinations. Some compounds such as carbohydrates are not measured by the traditional method (modified Nicolaides method), resulting in low estimations of the carbonyl content. Furthermore, we have shown that reaction completion for the traditional method can take up to 300 hours. The new method presented here (the modified Faix method) reduces the reaction time to 2 hours, uses triethanolamine (TEA) in the place of pyridine, and requires a smaller sample size for the analysis. Carbonyl contents determined using this new method are consistently higher than when using the traditional titration methods.
Laser-Induced Breakdown Spectroscopy Based Protein Assay for Cereal Samples.
Sezer, Banu; Bilge, Gonca; Boyaci, Ismail Hakki
2016-12-14
Protein content is an important quality parameter in terms of price, nutritional value, and labeling of various cereal samples. However, conventional analysis methods, namely Kjeldahl and Dumas, have major drawbacks such as long analysis time, titration mistakes, and dependence on carrier gas of high purity. For this reason, there is an urgent need for rapid, reliable, and environmentally friendly technologies for protein analysis. The present study aims to develop a new method for protein analysis in wheat flour and whole meal by using laser-induced breakdown spectroscopy (LIBS), which is a multielemental, fast, and simple spectroscopic method. Unlike the Kjeldahl and Dumas methods, it has the potential to analyze a large number of samples in a considerably shorter time. In the study, nitrogen peaks in LIBS spectra of wheat flour and whole meal samples with different protein contents were correlated with results of the standard Dumas method with the aid of chemometric methods. A calibration graph showed good linearity for protein content between 7.9 and 20.9% and a 0.992 coefficient of determination (R²). The limit of detection was calculated as 0.26%. The results indicated that LIBS is a promising and reliable method, with high sensitivity, for routine protein analysis in wheat flour and whole meal samples.
NASA Astrophysics Data System (ADS)
Bártová, H.; Kučera, J.; Musílek, L.; Trojek, T.; Gregorová, E.
2017-11-01
Knowledge of the content of natural radionuclides in bricks can be important in dosimetry and in applications of ionizing radiation. Dosimetry of naturally occurring radioactive material (NORM) in general is one such case; another, related to radiation protection, is radon exposure evaluation; finally, such knowledge is needed for the thermoluminescence (TL) dating method. The internal dose rate inside bricks is caused mostly by contributions of the natural radionuclides 238U, 232Th, radionuclides of their decay chains, and 40K. The decay chain of 235U is usually much less important. The concentrations of 238U, 232Th and 40K were measured by several methods, namely gamma-ray spectrometry, X-ray fluorescence analysis (XRF), and neutron activation analysis (NAA), with NAA used as the reference method. These methods were compared from the point of view of accuracy, limit of detection (LOD), amount of sample needed, sample handling, time demands, and instrument availability.
Shaffer, Patrick; Valsson, Omar; Parrinello, Michele
2016-01-01
The capabilities of molecular simulations have been greatly extended by a number of widely used enhanced sampling methods that facilitate escaping from metastable states and crossing large barriers. Despite these developments there are still many problems which remain out of reach for these methods which has led to a vigorous effort in this area. One of the most important problems that remains unsolved is sampling high-dimensional free-energy landscapes and systems that are not easily described by a small number of collective variables. In this work we demonstrate a new way to compute free-energy landscapes of high dimensionality based on the previously introduced variationally enhanced sampling, and we apply it to the miniprotein chignolin. PMID:26787868
Adaptive control of theophylline therapy: importance of blood sampling times.
D'Argenio, D Z; Khakmahd, K
1983-10-01
A two-observation protocol for estimating theophylline clearance during a constant-rate intravenous infusion is used to examine the importance of blood sampling schedules with regard to the information content of resulting concentration data. Guided by a theory for calculating maximally informative sample times, population simulations are used to assess the effect of specific sampling times on the precision of resulting clearance estimates and subsequent predictions of theophylline plasma concentrations. The simulations incorporated noise terms for intersubject variability, dosing errors, sample collection errors, and assay error. Clearance was estimated using Chiou's method, least squares, and a Bayesian estimation procedure. The results of these simulations suggest that clinically significant estimation and prediction errors may result when using the above two-point protocol for estimating theophylline clearance if the time separating the two blood samples is less than one population mean elimination half-life.
Müller, David; Cattaneo, Stefano; Meier, Florian; Welz, Roland; de Vries, Tjerk; Portugal-Cohen, Meital; Antonio, Diana C; Cascio, Claudia; Calzolai, Luigi; Gilliland, Douglas; de Mello, Andrew
2016-04-01
We demonstrate the use of inverse supercritical carbon dioxide (scCO2) extraction as a novel method of sample preparation for the analysis of complex nanoparticle-containing samples, in our case a model sunscreen agent with titanium dioxide nanoparticles. The sample was prepared for analysis in a simplified process using a lab scale supercritical fluid extraction system. The residual material was easily dispersed in an aqueous solution and analyzed by Asymmetrical Flow Field-Flow Fractionation (AF4) hyphenated with UV- and Multi-Angle Light Scattering detection. The obtained results allowed an unambiguous determination of the presence of nanoparticles within the sample, with almost no background from the matrix itself, and showed that the size distribution of the nanoparticles is essentially maintained. These results are especially relevant in view of recently introduced regulatory requirements concerning the labeling of nanoparticle-containing products. The novel sample preparation method is potentially applicable to commercial sunscreens or other emulsion-based cosmetic products and has important ecological advantages over currently used sample preparation techniques involving organic solvents. Copyright © 2016 Elsevier B.V. All rights reserved.
Li, Hongzhi; Yang, Wei
2007-03-21
An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed, which is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.
Ultraviolet resonance Raman spectroscopy for the detection of cocaine in oral fluid
NASA Astrophysics Data System (ADS)
D'Elia, Valentina; Montalvo, Gemma; Ruiz, Carmen García; Ermolenkov, Vladimir V.; Ahmed, Yasmine; Lednev, Igor K.
2018-01-01
Detecting and quantifying cocaine in oral fluid is of significant importance for practical forensics. To date, mainly destructive methods or biochemical tests have been used, while spectroscopic methods have only been applied to pretreated samples. In this work, the possibility of using resonance Raman spectroscopy to detect cocaine in oral fluid without pretreating samples was tested. It was found that ultraviolet resonance Raman spectroscopy with 239-nm excitation allows for the detection of cocaine in oral fluid at the 10 μg/mL level. Further method development will be needed to reach practically useful levels of cocaine detection.
Optimal model-based sensorless adaptive optics for epifluorescence microscopy.
Pozzi, Paolo; Soloviev, Oleg; Wilding, Dean; Vdovin, Gleb; Verhaegen, Michel
2018-01-01
We report on a universal sample-independent sensorless adaptive optics method, based on modal optimization of the second moment of the fluorescence emission from a point-like excitation. Our method employs a sample-independent precalibration, performed only once for the particular system, to establish the direct relation between the image quality and the aberration. The method is potentially applicable to any form of microscopy with epifluorescence detection, including the practically important case of incoherent fluorescence emission from a three dimensional object, through minor hardware modifications. We have applied the technique successfully to a widefield epifluorescence microscope and to a multiaperture confocal microscope.
de Andrade, Rafael Barreto; Barlow, Jos; Louzada, Julio; Vaz-de-Mello, Fernando Zagury; Souza, Mateus; Silveira, Juliana M; Cochrane, Mark A
2011-01-01
Understanding how biodiversity responds to environmental changes is essential to provide the evidence-base that underpins conservation initiatives. The present study provides a standardized comparison between unbaited flight intercept traps (FIT) and baited pitfall traps (BPT) for sampling dung beetles. We examine the effectiveness of the two to assess fire disturbance effects and how trap performance is affected by seasonality. The study was carried out in a transitional forest between Cerrado (Brazilian Savanna) and Amazon Forest. Dung beetles were collected during one wet and one dry sampling season. The two methods sampled different portions of the local beetle assemblage. Both FIT and BPT were sensitive to fire disturbance during the wet season, but only BPT detected community differences during the dry season. Both traps showed similar correlation with environmental factors. Our results indicate that seasonality had a stronger effect than trap type, with BPT more effective and robust under low population numbers, and FIT more sensitive to fine scale heterogeneity patterns. This study shows the strengths and weaknesses of two commonly used methodologies for sampling dung beetles in tropical forests, as well as highlighting the importance of seasonality in shaping the results obtained by both sampling strategies.
Mori, Toshifumi; Hamers, Robert J; Pedersen, Joel A; Cui, Qiang
2014-07-17
Motivated by specific applications and the recent work of Gao and co-workers on integrated tempering sampling (ITS), we have developed a novel sampling approach referred to as integrated Hamiltonian sampling (IHS). IHS is straightforward to implement and complementary to existing methods for free energy simulation and enhanced configurational sampling. The method carries out sampling using an effective Hamiltonian constructed by integrating the Boltzmann distributions of a series of Hamiltonians. By judiciously selecting the weights of the different Hamiltonians, one achieves rapid transitions among the energy landscapes that underlie different Hamiltonians and therefore an efficient sampling of important regions of the conformational space. Along this line, IHS shares similar motivations as the enveloping distribution sampling (EDS) approach of van Gunsteren and co-workers, although the ways that distributions of different Hamiltonians are integrated are rather different in IHS and EDS. Specifically, we report efficient ways for determining the weights using a combination of histogram flattening and weighted histogram analysis approaches, which make it straightforward to include many end-state and intermediate Hamiltonians in IHS so as to enhance its flexibility. Using several relatively simple condensed phase examples, we illustrate the implementation and application of IHS as well as potential developments for the near future. The relation of IHS to several related sampling methods such as Hamiltonian replica exchange molecular dynamics and λ-dynamics is also briefly discussed.
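The integrated-Hamiltonian construction can be illustrated compactly: the effective potential is U_eff(x) = -kT ln Σ_k w_k exp(-U_k(x)/kT), so the low-energy regions of every component Hamiltonian remain accessible to a single sampler. In the sketch below, the two component potentials and the uniform weights are stand-ins; the paper determines the weights iteratively via histogram flattening and weighted histogram analysis.

```python
import numpy as np

kT = 1.0
x = np.linspace(-4.0, 4.0, 801)

U1 = 0.5 * (x + 2.0) ** 2            # two stand-in component Hamiltonians
U2 = 0.5 * (x - 2.0) ** 2 + 1.0      # with displaced minima
weights = np.array([0.5, 0.5])       # uniform here; tuned iteratively in IHS

# U_eff(x) = -kT * ln( sum_k w_k * exp(-U_k(x)/kT) ), evaluated with a
# log-sum-exp shift for numerical stability.
stack = np.stack([U1, U2])
m = np.min(stack, axis=0)
U_eff = m - kT * np.log(np.sum(weights[:, None] * np.exp(-(stack - m) / kT), axis=0))

# Both component minima stay low on the effective surface (x = -2 and x = +2).
print(float(U_eff[200]), float(U_eff[600]))
```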
Aase, Audun; Hajdusek, Ondrej; Øines, Øivind; Quarsten, Hanne; Wilhelmsson, Peter; Herstad, Tove K; Kjelland, Vivian; Sima, Radek; Jalovecka, Marie; Lindgren, Per-Eric; Aaberge, Ingeborg S
2016-01-01
A modified microscopy protocol (the LM-method) was used to demonstrate what was interpreted as Borrelia spirochetes and later also Babesia sp., in peripheral blood from patients. The method gained much publicity, but was not validated prior to publication, which became the purpose of this study using appropriate scientific methodology, including a control group. Blood from 21 patients previously interpreted as positive for Borrelia and/or Babesia infection by the LM-method and 41 healthy controls without known history of tick bite were collected, blinded and analysed for these pathogens by microscopy in two laboratories by the LM-method and conventional method, respectively, by PCR methods in five laboratories and by serology in one laboratory. Microscopy by the LM-method identified structures claimed to be Borrelia- and/or Babesia in 66% of the blood samples of the patient group and in 85% in the healthy control group. Microscopy by the conventional method for Babesia only did not identify Babesia in any samples. PCR analysis detected Borrelia DNA in one sample of the patient group and in eight samples of the control group; whereas Babesia DNA was not detected in any of the blood samples using molecular methods. The structures interpreted as Borrelia and Babesia by the LM-method could not be verified by PCR. The method was, thus, falsified. This study underlines the importance of doing proper test validation before new or modified assays are introduced.
Marinelli, L; Cottarelli, A; Solimini, A G; Del Cimmuto, A; De Giusti, M
2017-01-01
In this study we investigated the presence of viable but non-culturable (VBNC) Legionella species in hospital water networks. We also evaluated the time to appearance and the load of Legionella in samples found negative using the standard culture method. A total of 42 samples was obtained from the tap water of five hospital buildings. The samples were tested for Legionella by the standard culture method and were monitored for up to 12 months for the appearance of VBNC Legionella. All 42 samples were negative at the time of collection. Seven of the 42 samples (17.0%) became positive for Legionella at different times of monitoring. The time to the appearance of VBNC Legionella was extremely variable, from 15 days to 9 months from sampling. The most frequent Legionella species observed were Legionella spp. and L. anisa, and only one sample contained L. pneumophila serogroup 1. Our study confirms the presence of VBNC Legionella in samples found negative using the standard culture method and highlights the variable time to its appearance, which can occur several months after sampling. The results are important for risk assessment and risk management of engineered water systems.
Metadynamics for training neural network model chemistries: A competitive assessment
NASA Astrophysics Data System (ADS)
Herr, John E.; Yao, Kun; McIntyre, Ryker; Toth, David W.; Parkhill, John
2018-06-01
Neural network model chemistries (NNMCs) promise to facilitate the accurate exploration of chemical space and simulation of large reactive systems. One important path to improving these models is to add layers of physical detail, especially long-range forces. At short range, however, these models are data driven and data limited. Little is systematically known about how data should be sampled, and "test data" chosen randomly from some sampling techniques can provide poor information about generality. If the sampling method is narrow, "test error" can appear encouragingly tiny while the model fails catastrophically elsewhere. In this manuscript, we competitively evaluate two common sampling methods, molecular dynamics (MD) and normal-mode sampling, and one uncommon alternative, metadynamics (MetaMD), for preparing training geometries. We show that MD is an inefficient sampling method in the sense that additional samples do not improve generality. We also show that MetaMD is easily implemented in any NNMC software package with a cost that scales linearly with the number of atoms in a sample molecule. MetaMD is a black-box way to ensure samples always reach out to new regions of chemical space, while remaining relevant to chemistry near k_BT. It is a cheap tool to address the issue of generalization.
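A toy version of the MetaMD idea, with a Monte Carlo walker standing in for the molecular dynamics: Gaussian hills deposited at visited values of the collective variable progressively flatten the landscape, so the sampler keeps escaping to new regions. The potential, hill size, and temperature below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

U = lambda s: 4.0 * (s**2 - 1.0) ** 2      # double well, barrier at s = 0
hills = []                                  # (centre, height) of deposited hills

def bias(s):
    # Sum of Gaussian hills of width 0.1 deposited so far.
    return sum(h * np.exp(-(s - c) ** 2 / (2 * 0.1**2)) for c, h in hills)

s, kT, traj = -1.0, 0.2, []
for step in range(20_000):
    prop = s + rng.normal(scale=0.05)
    dE = (U(prop) + bias(prop)) - (U(s) + bias(s))
    if dE <= 0 or rng.uniform() < np.exp(-dE / kT):
        s = prop
    if step % 200 == 0:
        hills.append((s, 0.05))             # deposit a small hill where we are
    traj.append(s)

# Without the bias the walker stays in the left well (barrier >> kT);
# with it, both wells are eventually visited.
print("fraction of steps with s > 0:", float(np.mean(np.array(traj) > 0.0)))
```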
Reflexion on linear regression trip production modelling method for ensuring good model quality
NASA Astrophysics Data System (ADS)
Suprayitno, Hitapriya; Ratnasari, Vita
2017-11-01
Transport modelling is important. In certain cases the conventional model still has to be used, and having a good trip production model is then essential. A good model can only be obtained from a good sample. Two basic principles of good sampling are that the sample must be capable of representing the population characteristics and of producing an acceptable error at a given confidence level. These principles do not yet seem to be well understood or applied in trip production modelling. It is therefore necessary to investigate trip production modelling practice in Indonesia and to formulate a better modelling method that ensures model quality. The results are as follows. Statistics provides a method for calculating the span of a predicted value at a given confidence level for linear regression, called the confidence interval of the predicted value. Common modelling practice uses R² as the principal quality measure, while sampling practice varies and does not always conform to sampling principles. An experiment indicates that a small sample can already give an excellent R² value and that sample composition can significantly change the model. Hence a good R² value does not, in fact, always mean good model quality. This leads to three basic ideas for ensuring good model quality: reformulating the quality measure, the calculation procedure, and the sampling method. The quality measure is defined as having both a good R² value and a good confidence interval of the predicted value. The calculation procedure must incorporate statistical calculation methods and the appropriate statistical tests. A good sampling method must use random, well-distributed, stratified sampling with a certain minimum number of samples. These three ideas need to be developed and tested further.
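A minimal sketch of the proposed quality measure, the confidence interval of the predicted value for simple linear regression: ŷ(x₀) ± t_{α/2, n−2} · s · √(1/n + (x₀ − x̄)²/Sₓₓ). The synthetic trip data below are illustrative (scipy assumed available).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
x = rng.uniform(0.0, 10.0, size=15)                   # e.g. household-size proxy
y = 2.0 + 0.8 * x + rng.normal(scale=1.0, size=15)    # observed trip production

n = x.size
Sxx = np.sum((x - x.mean()) ** 2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx    # least-squares slope
b0 = y.mean() - b1 * x.mean()                         # least-squares intercept
resid = y - (b0 + b1 * x)
s = np.sqrt(np.sum(resid**2) / (n - 2))               # residual standard error

x0 = 5.0
y0 = b0 + b1 * x0
half = stats.t.ppf(0.975, n - 2) * s * np.sqrt(1.0 / n + (x0 - x.mean()) ** 2 / Sxx)
print(f"predicted trips at x0={x0}: {y0:.2f} +/- {half:.2f} (95% confidence)")
```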
First Report of the new root-lesion nematode Pratylenchus sp. on soybean in North Dakota
USDA-ARS?s Scientific Manuscript database
Root-lesion nematodes (Pratylenchus spp.) are important nematode pests on soybean. In 2015, two soil samples were collected from a soybean field in Richland County, ND. Nematodes were extracted from soil using a sugar centrifugal flotation method, revealing these two samples contained 125 and 350 ro...
USDA-ARS?s Scientific Manuscript database
Raman spectroscopy technique has proven to be a reliable method for detection of chemical contaminants in food ingredients and products. To detect each contaminant particle in a food sample, it is important to determine the effective depth of penetration of laser through the food sample and the corr...
USDA-ARS?s Scientific Manuscript database
Accurate and timely spatial predictions of vegetation cover from remote imagery are an important data source for natural resource management. High-quality in situ data are needed to develop and validate these products. Point-intercept sampling techniques are a common method for obtaining quantitativ...
USDA-ARS?s Scientific Manuscript database
Campylobacter jejuni (C. jejuni) is one of the most common causes of gastroenteritis in the world. Given the potential risks to human, animal and environmental health the development and optimization of methods to quantify this important pathogen in environmental samples is essential. Two of the mos...
First report of a new, unnamed lesion nematode Pratylenchus sp. infecting soybean in North Dakota
USDA-ARS?s Scientific Manuscript database
Lesion nematodes (Pratylenchus spp.) are important pests on soybean. In 2015 and 2016, nematodes in 11 soil samples from a soybean field in Richland County, ND were extracted by sugar centrifugal flotation method (Jenkins 1964). Ten samples contained lesion nematodes from 150 to 875/kg of soil. One ...
ERIC Educational Resources Information Center
Begeny, John C.; Krouse, Hailey E.; Brown, Kristina G.; Mann, Courtney M.
2011-01-01
Teacher judgments about students' academic abilities are important for instructional decision making and potential special education entitlement decisions. However, the small number of studies evaluating teachers' judgments are limited methodologically (e.g., sample size, procedural sophistication) and have yet to answer important questions…
Technology for return of planetary samples
NASA Technical Reports Server (NTRS)
1975-01-01
The problem of returning a Mars sample to Earth was considered. The model ecosystem concept was advanced as the most reliable, sensitive method for assessing the biohazard from the Mars sample before it is permitted on Earth. Two approaches to ecosystem development were studied. In the first approach, the Mars sample would be introduced into the ecosystem and exposed to conditions which are as similar to the Martian environment as the constituent terrestrial organisms can tolerate. In the second approach, the Mars sample would be tested to determine its effects on important terrestrial cellular functions. In addition, efforts were directed toward establishing design considerations for a Mars Planetary Receiving Laboratory. The problems encountered with the Lunar Receiving Laboratory were evaluated in this context. A questionnaire was developed to obtain information regarding important experiments to be conducted in the Planetary Receiving Laboratory.
Recent developments in detection and enumeration of waterborne bacteria: a retrospective minireview.
Deshmukh, Rehan A; Joshi, Kopal; Bhand, Sunil; Roy, Utpal
2016-12-01
Waterborne diseases have emerged as global health problems and their rapid and sensitive detection in environmental water samples is of great importance. Bacterial identification and enumeration in water samples is significant as it helps to maintain safe drinking water for public consumption. Culture-based methods are laborious, time-consuming, and yield false-positive results, whereas viable but nonculturable (VBNC) microorganisms cannot be recovered. Hence, numerous methods have been developed for rapid detection and quantification of waterborne pathogenic bacteria in water. These rapid methods can be classified into nucleic acid-based, immunology-based, and biosensor-based detection methods. This review summarizes the principle and current state of rapid methods for the monitoring and detection of waterborne bacterial pathogens. Rapid methods outlined are polymerase chain reaction (PCR), digital droplet PCR, real-time PCR, multiplex PCR, DNA microarray, next-generation sequencing (pyrosequencing, Illumina technology and genomics), and fluorescence in situ hybridization, which are categorized as nucleic acid-based methods. Enzyme-linked immunosorbent assay (ELISA) and immunofluorescence are classified as immunology-based methods. Optical, electrochemical, and mass-based biosensors are grouped into biosensor-based methods. Overall, these methods are sensitive, specific, time-effective, and important in prevention and diagnosis of waterborne bacterial diseases. © 2016 The Authors. MicrobiologyOpen published by John Wiley & Sons Ltd.
The importance of quality control in validating concentrations ...
A national-scale survey of 247 contaminants of emerging concern (CECs), including organic and inorganic chemical compounds, and microbial contaminants, was conducted in source and treated drinking water samples from 25 treatment plants across the United States. Multiple methods were used to determine these CECs, including six analytical methods to measure 174 pharmaceuticals, personal care products, and pesticides. A three-component quality assurance/quality control (QA/QC) program was designed for the subset of 174 CECs which allowed us to assess and compare performances of the methods used. The three components included: 1) a common field QA/QC protocol and sample design, 2) individual investigator-developed method-specific QA/QC protocols, and 3) a suite of 46 method comparison analytes that were determined in two or more analytical methods. Overall method performance for the 174 organic chemical CECs was assessed by comparing spiked recoveries in reagent, source, and treated water over a two-year period. In addition to the 247 CECs reported in the larger drinking water study, another 48 pharmaceutical compounds measured did not consistently meet predetermined quality standards. Methodologies that did not seem suitable for these analytes are overviewed. The need to exclude analytes based on method performance demonstrates the importance of additional QA/QC protocols. This paper compares the method performance of six analytical methods used to measure 174 emerging contaminants.
Leung, Elvis M K; Chan, Wan
2014-02-01
Creatinine is an important biomarker for diagnosing renal function and for normalizing variations in urinary drug/metabolite concentrations. Quantification of creatinine in biological fluids such as urine and plasma is important for clinical diagnosis as well as in biomonitoring programs and urinary metabolomics/metabonomics research. Current methods for creatinine determination either are nonselective or involve the use of expensive mass spectrometers. In this paper, a novel reversed-phase high-performance liquid chromatography (HPLC) method for the determination of the highly hydrophilic creatinine by pre-column derivatization with ethyl chloroformate is presented. N-Ethyloxycarbonylation significantly enhanced the hydrophobicity of creatinine, facilitating its chromatographic retention as well as its quantification by HPLC. Factors governing the derivatization reaction were studied and optimized. The developed method was validated and applied to the determination of creatinine in rat urine samples. Comparative studies with an isotope-dilution mass spectrometric method revealed that the two methods do not yield systematically different creatinine concentrations, indicating that the HPLC method is suitable for the determination of creatinine in urine samples.
Azemard, Sabine; Vassileva, Emilia
2015-06-01
In this paper, we present a simple, fast and cost-effective method for the determination of methyl mercury (MeHg) in marine samples. All important parameters influencing the sample preparation process were investigated and optimized. Full validation of the method was performed in accordance with the ISO-17025 (ISO/IEC, 2005) and Eurachem guidelines. Blanks, selectivity, working range (0.09-3.0 ng), recovery (92-108%), intermediate precision (1.7-4.5%), traceability, limit of detection (0.009 ng), limit of quantification (0.045 ng) and expanded uncertainty (15%, k=2) were assessed. The uncertainty contribution of each parameter was estimated, and the traceability of the measurement results was demonstrated. Furthermore, the selectivity of the method was studied by analyzing the same sample extracts by advanced mercury analyzer (AMA) and gas chromatography-atomic fluorescence spectrometry (GC-AFS). Additional validation of the proposed procedure was carried out through participation in the IAEA-461 worldwide inter-laboratory comparison exercise. Copyright © 2014 Elsevier Ltd. All rights reserved.
Renoux, Lance P; Dolan, Maureen C; Cook, Courtney A; Smit, Nico J; Sikkel, Paul C
2017-08-01
Apicomplexan parasites are obligate parasites of many species of vertebrates. To date, there is very limited understanding of these parasites in the most diverse group of vertebrates, the actinopterygian fishes. While DNA barcoding targeting the eukaryotic 18S small subunit rRNA gene sequence has been useful in identifying apicomplexans in tetrapods, identification of apicomplexans infecting fishes has relied solely on morphological identification by microscopy. In this study, a DNA barcoding method targeting the 18S rRNA gene was developed for identifying apicomplexans parasitizing certain actinopterygian fishes. A lead primer set was selected that showed no cross-reactivity to the overwhelmingly abundant host DNA and successfully confirmed 37 of the 41 (90.2%) microscopically verified parasitized fish blood samples analyzed in this study. Furthermore, this DNA barcoding method identified 4 additional samples that had screened negative for parasitemia, suggesting that this molecular method may provide improved sensitivity over morphological characterization by microscopy. In addition, this PCR screening method for fish apicomplexans, using Whatman FTA-preserved DNA, was tested in efforts leading to simpler field collection, transport, and sample storage, as well as streamlined sample processing, which is important for DNA barcoding of large sample sets.
Selection of species and sampling areas: The importance of inference
Paul Stephen Corn
2009-01-01
Inductive inference, the process of drawing general conclusions from specific observations, is fundamental to the scientific method. Platt (1964) termed conclusions obtained through rigorous application of the scientific method "strong inference" and noted the following basic steps: generating alternative hypotheses; devising experiments, the...
Li, C T; Shi, C H; Wu, J G; Xu, H M; Zhang, H Z; Ren, Y L
2004-04-01
The selection of an appropriate sampling strategy and a clustering method is important in the construction of core collections based on predicted genotypic values in order to retain the greatest degree of genetic diversity of the initial collection. In this study, methods of developing rice core collections were evaluated based on the predicted genotypic values for 992 rice varieties with 13 quantitative traits. The genotypic values of the traits were predicted by the adjusted unbiased prediction (AUP) method. Based on the predicted genotypic values, Mahalanobis distances were calculated and employed to measure the genetic similarities among the rice varieties. Six hierarchical clustering methods, including the single linkage, median linkage, centroid, unweighted pair-group average, weighted pair-group average and flexible-beta methods, were combined with random, preferred and deviation sampling to develop 18 core collections of rice germplasm. The results show that the deviation sampling strategy in combination with the unweighted pair-group average method of hierarchical clustering retains the greatest degree of genetic diversity of the initial collection. The core collections sampled using predicted genotypic values had more genetic diversity than those based on phenotypic values.
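The clustering step described above can be sketched briefly. Below is a minimal illustration of combining Mahalanobis distances with unweighted pair-group average (UPGMA) clustering, the combination the study found best; the deviation-sampling rule shown (keeping, from each cluster, the accession farthest from its centroid) is an assumed variant, since the abstract does not spell out the exact rule, and all function and variable names are illustrative.

```python
# A minimal sketch, assuming predicted genotypic values are rows of a
# (varieties x traits) array G. The deviation-sampling rule below is a
# plausible stand-in, not the authors' exact procedure.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

def core_collection(G, n_clusters):
    """Return indices of a core collection of size n_clusters."""
    VI = np.linalg.inv(np.cov(G, rowvar=False))   # inverse trait covariance
    d = pdist(G, metric="mahalanobis", VI=VI)     # pairwise Mahalanobis distances
    Z = linkage(d, method="average")              # UPGMA clustering
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")
    core = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        centroid = G[idx].mean(axis=0)
        # "deviation sampling": keep the accession farthest from its centroid
        dev = [np.sqrt((G[i] - centroid) @ VI @ (G[i] - centroid)) for i in idx]
        core.append(idx[int(np.argmax(dev))])
    return sorted(core)
```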
Amino Acid profile as a feasible tool for determination of the authenticity of fruit juices.
Asadpoor, Mostafa; Ansarin, Masoud; Nemati, Mahboob
2014-12-01
Fruit juice is a nutrient-rich food product with a direct connection to public health. The purpose of this research was to determine the amino acid profile of juices and provide a quick and accurate indicator for determining their authenticity. The method of analysis was HPLC with fluorescence detection and pre-column derivatization with ortho-phthaldialdehyde (OPA). Sixty-six samples of fruit juices were analyzed, and fourteen amino acids were identified and quantified in the sampled fruit juices. The fruit samples used for this analysis were apple, orange, cherry, pineapple, mango, apricot, pomegranate, peach and grape. The results showed that 32% of the samples tested in this study had a lower concentrate percentage than stated on their labels and/or other possible authenticity problems in the manufacturing process. The following samples showed probable adulteration: four cherry, two pomegranate, one mango, three grape, four peach, seven orange, two apple and one apricot juice samples. In general, determining the amount of amino acids and comparing sample amino acid profiles with standard values appears to be a useful indicator for quality control. Analytical control of fruit juice composition is becoming an important issue, and HPLC can provide an essential tool for more accurate research as well as for routine analysis, giving regulatory agencies a means to help ensure healthier juice.
Sato, Yuka; Seimiya, Masanori; Yoshida, Toshihiko; Sawabe, Yuji; Hokazono, Eisaku; Osawa, Susumu; Matsushita, Kazuyuki
2017-01-01
Background: The indocyanine green retention rate is important for assessing the severity of liver disorders. In the conventional method, blood needs to be collected twice. In the present study, we developed an automated indocyanine green method that does not require blood sampling before intravenous indocyanine green injection and is applicable to an automated biochemical analyser. Methods: The serum samples of 471 patients, collected before and after intravenous indocyanine green injections and submitted to the clinical laboratory of our hospital, were used as samples. The standard procedure established by the Japan Society of Hepatology was used as the standard method. In the automated indocyanine green method, serum collected after an intravenous indocyanine green injection was mixed with the saline reagent containing a surfactant, and the indocyanine green concentration was measured at a dominant wavelength of 805 nm and a complementary wavelength of 884 nm. Results: The coefficients of variation of the within- and between-run reproducibility of this method were 2% or lower, and dilution linearity passing through the origin was noted up to 10 mg/L indocyanine green. The reagent was stable for four weeks or longer. Haemoglobin, bilirubin and chyle had no impact on the results obtained. The correlation coefficient between the standard method (x) and this method (y) was r=0.995; however, slight divergence was noted in turbid samples. Conclusion: Divergence in turbid samples may have corresponded to false negativity with the standard procedure. Our method may be highly practical because blood sampling before indocyanine green loading is unnecessary and measurements are simple.
Porosity, permeability and 3D fracture network characterisation of dolomite reservoir rock samples
Voorn, Maarten; Exner, Ulrike; Barnhoorn, Auke; Baud, Patrick; Reuschlé, Thierry
2015-01-01
With fractured rocks making up an important part of hydrocarbon reservoirs worldwide, detailed analysis of fractures and fracture networks is essential. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as limited resolution, providing only 2D and no internal structure information, being destructive to the samples and/or not being representative of full fracture networks. In this paper, we therefore explore the use of an additional method – non-destructive 3D X-ray micro-Computed Tomography (μCT) – to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. We process the 3D μCT data in this study by a Hessian-based fracture filtering routine and can successfully extract porosity, fracture aperture, fracture density and fracture orientations – in bulk as well as locally. Additionally, thin sections made from selected plug samples provide 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) towards more realistic reservoir conditions. This study shows that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that although there are limitations, several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Also permeability experiments under confining pressure provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner. PMID:26549935
Porosity, permeability and 3D fracture network characterisation of dolomite reservoir rock samples.
Voorn, Maarten; Exner, Ulrike; Barnhoorn, Auke; Baud, Patrick; Reuschlé, Thierry
2015-03-01
With fractured rocks making up an important part of hydrocarbon reservoirs worldwide, detailed analysis of fractures and fracture networks is essential. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as limited resolution, providing only 2D and no internal structure information, being destructive to the samples and/or not being representative of full fracture networks. In this paper, we therefore explore the use of an additional method - non-destructive 3D X-ray micro-Computed Tomography (μCT) - to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. We process the 3D μCT data in this study by a Hessian-based fracture filtering routine and can successfully extract porosity, fracture aperture, fracture density and fracture orientations - in bulk as well as locally. Additionally, thin sections made from selected plug samples provide 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) towards more realistic reservoir conditions. This study shows that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that although there are limitations, several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Also permeability experiments under confining pressure provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner.
NASA Astrophysics Data System (ADS)
Voorn, Maarten; Barnhoorn, Auke; Exner, Ulrike; Baud, Patrick; Reuschlé, Thierry
2015-04-01
Fractured reservoir rocks make up an important part of the hydrocarbon reservoirs worldwide. A detailed analysis of fractures and fracture networks in reservoir rock samples is thus essential to determine the potential of these fractured reservoirs. However, common analyses on drill core and plug samples taken from such reservoirs (including hand specimen analysis, thin section analysis and laboratory porosity and permeability determination) suffer from various problems, such as having a limited resolution, providing only 2D and no internal structure information, being destructive on the samples and/or not being representative for full fracture networks. In this study, we therefore explore the use of an additional method - non-destructive 3D X-ray micro-Computed Tomography (μCT) - to obtain more information on such fractured samples. Seven plug-sized samples were selected from narrowly fractured rocks of the Hauptdolomit formation, taken from wellbores in the Vienna Basin, Austria. These samples span a range of different fault rocks in a fault zone interpretation, from damage zone to fault core. 3D μCT data is used to extract porosity, fracture aperture, fracture density and fracture orientations - in bulk as well as locally. The 3D analyses are complemented with thin sections made to provide some 2D information with a much higher detail than the μCT data. Finally, gas- and water permeability measurements under confining pressure provide an important link (at least in order of magnitude) of the µCT results towards more realistic reservoir conditions. Our results show that 3D μCT can be applied efficiently on plug-sized samples of naturally fractured rocks, and that several important parameters can be extracted. μCT can therefore be a useful addition to studies on such reservoir rocks, and provide valuable input for modelling and simulations. Also permeability experiments under confining pressure provide important additional insights. Combining these and other methods can therefore be a powerful approach in microstructural analysis of reservoir rocks, especially when applying the concepts that we present (on a small set of samples) in a larger study, in an automated and standardised manner.
Lunar carbon chemistry - Relations to and implications for terrestrial organic geochemistry.
NASA Technical Reports Server (NTRS)
Eglinton, G.; Maxwell, J. R.; Pillinger, C. T.
1972-01-01
Survey of the various ways in which studies of lunar carbon chemistry have beneficially affected terrestrial organic geochemistry. A lunar organic gas-analysis operating system is cited as the most important instrumental development in relation to terrestrial organic geochemistry. Improved methods of analysis and handling of organic samples are cited as another benefit derived from studies of lunar carbon chemistry. The problem of controlling contamination and minimizing organic vapors is considered, as well as the possibility of analyzing terrestrial samples by the techniques developed for lunar samples. A need for new methods of analyzing carbonaceous material which is insoluble in organic solvents is indicated.
Nessen, Merel A; van der Zwaan, Dennis J; Grevers, Sander; Dalebout, Hans; Staats, Martijn; Kok, Esther; Palmblad, Magnus
2016-05-11
Proteomics methodology has seen increased application in food authentication, including tandem mass spectrometry of targeted species-specific peptides in raw, processed, or mixed food products. We have previously described an alternative principle that uses untargeted data acquisition and spectral library matching, essentially spectral counting, to compare and identify samples without the need for genomic sequence information in food species populations. Here, we present an interlaboratory comparison demonstrating how a method based on this principle performs in a realistic context. We also increasingly challenge the method by using data from different types of mass spectrometers, by trying to distinguish closely related and commercially important flatfish, and by analyzing heavily contaminated samples. The method was found to be robust in different laboratories, and 94-97% of the analyzed samples were correctly identified, including all processed and contaminated samples.
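As a rough illustration of the spectral-counting principle the method is based on, the sketch below scores a sample against reference species by cosine similarity between vectors of matched-spectrum counts. This is a simplified stand-in, not the authors' pipeline, and all names are hypothetical.

```python
# A minimal sketch, assuming each sample or reference has already been
# reduced to a vector of counts of matched library spectra per species.
import numpy as np

def cosine_similarity(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(sample_counts, reference_library):
    """reference_library: dict mapping species name -> count vector.
    Returns the best-matching species and all similarity scores."""
    scores = {species: cosine_similarity(sample_counts, counts)
              for species, counts in reference_library.items()}
    return max(scores, key=scores.get), scores
```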
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wen, Haiming; Lin, Yaojun; Seidman, David N.
The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.
Wen, Haiming; Lin, Yaojun; Seidman, David N.; ...
2015-09-09
The preparation of transmission electron microscopy (TEM) samples from powders with particle sizes larger than ~100 nm poses a challenge. The existing methods are complicated and expensive, or have a low probability of success. Herein, we report a modified methodology for preparation of TEM samples from powders, which is efficient, cost-effective, and easy to perform. This method involves mixing powders with an epoxy on a piece of weighing paper, curing the powder–epoxy mixture to form a bulk material, grinding the bulk to obtain a thin foil, punching TEM discs from the foil, dimpling the discs, and ion milling the dimpled discs to electron transparency. Compared with the well-established and robust grinding–dimpling–ion-milling method for TEM sample preparation for bulk materials, our modified approach for preparing TEM samples from powders only requires two additional simple steps. In this article, step-by-step procedures for our methodology are described in detail, and important strategies to ensure success are elucidated. Furthermore, our methodology has been applied successfully for preparing TEM samples with large thin areas and high quality for many different mechanically milled metallic powders.
A sub-sampled approach to extremely low-dose STEM
Stevens, A.; Luzi, L.; Yang, H.; ...
2018-01-22
The inpainting of deliberately and randomly sub-sampled images offers a potential means to image specimens at high resolution and under extremely low-dose conditions (≤1 e⁻/Å²) using a scanning transmission electron microscope. We show that deliberate sub-sampling acquires images at least an order of magnitude faster than conventional low-dose methods for an equivalent electron dose. More importantly, when adaptive sub-sampling is implemented to acquire the images, there is a significant increase in the resolution and sensitivity which accompanies the increase in imaging speed. Lastly, we demonstrate the potential of this method for beam-sensitive materials and in-situ observations by experimentally imaging the node distribution in a metal-organic framework.
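A minimal sketch of the acquisition principle, under stated assumptions: only a random fraction of probe positions is measured, and the unvisited pixels are filled in afterwards. The authors' inpainting is considerably more sophisticated than the simple interpolation used here, which serves only to make the idea concrete.

```python
# A sketch of random sub-sampled acquisition with a simple interpolation
# stand-in for inpainting; griddata is not the authors' reconstruction.
import numpy as np
from scipy.interpolate import griddata

def subsample_and_inpaint(image, fraction=0.1, seed=0):
    """Measure ~`fraction` of pixels at random, then fill in the rest."""
    rng = np.random.default_rng(seed)
    mask = rng.random(image.shape) < fraction      # probe positions actually visited
    points = np.argwhere(mask)                     # (row, col) of measured pixels
    values = image[mask]                           # dose deposited only here
    grid_y, grid_x = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    # cubic interpolation; returns NaN outside the convex hull of the points
    recon = griddata(points, values, (grid_y, grid_x), method="cubic")
    return mask, recon
```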
A sub-sampled approach to extremely low-dose STEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, A.; Luzi, L.; Yang, H.
The inpainting of deliberately and randomly sub-sampled images offers a potential means to image specimens at high resolution and under extremely low-dose conditions (≤1 e⁻/Å²) using a scanning transmission electron microscope. We show that deliberate sub-sampling acquires images at least an order of magnitude faster than conventional low-dose methods for an equivalent electron dose. More importantly, when adaptive sub-sampling is implemented to acquire the images, there is a significant increase in the resolution and sensitivity which accompanies the increase in imaging speed. Lastly, we demonstrate the potential of this method for beam-sensitive materials and in-situ observations by experimentally imaging the node distribution in a metal-organic framework.
Guo, Yan; Li, Xiaoming; Fang, Xiaoyi; Lin, Xiuyun; Song, Yan; Jiang, Shuling; Stanton, Bonita
2011-01-01
Sample representativeness remains one of the challenges in effective HIV/STD surveillance and prevention targeting MSM worldwide. Although convenience samples are widely used in studies of MSM, previous studies suggested that these samples might not be representative of the broader MSM population. This issue becomes even more critical in many developing countries where needed resources for conducting probability sampling are limited. We examined variations in HIV and Syphilis infections and sociodemographic and behavioral factors among 307 young migrant MSM recruited using four different convenience sampling methods (peer outreach, informal social network, Internet, and venue-based) in Beijing, China in 2009. The participants completed a self-administered survey and provided blood specimens for HIV/STD testing. Among the four MSM samples using different recruitment methods, rates of HIV infections were 5.1%, 5.8%, 7.8%, and 3.4%; rates of Syphilis infection were 21.8%, 36.2%, 11.8%, and 13.8%; rates of inconsistent condom use were 57%, 52%, 58%, and 38%. Significant differences were found in various sociodemographic characteristics (e.g., age, migration history, education, income, places of employment) and risk behaviors (e.g., age at first sex, number of sex partners, involvement in commercial sex, and substance use) among samples recruited by different sampling methods. The results confirmed the challenges of obtaining representative MSM samples and underscored the importance of using multiple sampling methods to reach MSM from diverse backgrounds and in different social segments and to improve the representativeness of the MSM samples when the use of probability sampling approach is not feasible. PMID:21711162
Guo, Yan; Li, Xiaoming; Fang, Xiaoyi; Lin, Xiuyun; Song, Yan; Jiang, Shuling; Stanton, Bonita
2011-11-01
Sample representativeness remains one of the challenges in effective HIV/STD surveillance and prevention targeting men who have sex with men (MSM) worldwide. Although convenience samples are widely used in studies of MSM, previous studies suggested that these samples might not be representative of the broader MSM population. This issue becomes even more critical in many developing countries where needed resources for conducting probability sampling are limited. We examined variations in HIV and Syphilis infections and sociodemographic and behavioral factors among 307 young migrant MSM recruited using four different convenience sampling methods (peer outreach, informal social network, Internet, and venue-based) in Beijing, China in 2009. The participants completed a self-administered survey and provided blood specimens for HIV/STD testing. Among the four MSM samples using different recruitment methods, rates of HIV infections were 5.1%, 5.8%, 7.8%, and 3.4%; rates of Syphilis infection were 21.8%, 36.2%, 11.8%, and 13.8%; and rates of inconsistent condom use were 57%, 52%, 58%, and 38%. Significant differences were found in various sociodemographic characteristics (e.g., age, migration history, education, income, and places of employment) and risk behaviors (e.g., age at first sex, number of sex partners, involvement in commercial sex, and substance use) among samples recruited by different sampling methods. The results confirmed the challenges of obtaining representative MSM samples and underscored the importance of using multiple sampling methods to reach MSM from diverse backgrounds and in different social segments and to improve the representativeness of the MSM samples when the use of probability sampling approach is not feasible.
Raats, Dina; Offek, Maya; Minz, Dror; Halpern, Malka
2011-05-01
The impact of refrigeration on raw cow milk bacterial communities in three farm bulk tanks and three dairy plant silo tanks was studied using two methods: DGGE and cloning. Both methods demonstrated that bacterial taxonomic diversity decreased during refrigeration. Gammaproteobacteria, especially Pseudomonadales, dominated the milk after refrigeration. Farm samples and dairy plant samples differed in their microbial community composition, the former showing prevalence of Gram-positive bacteria affiliated with the classes Bacilli, Clostridia and Actinobacteria, the latter showing prevalence of Gram-negative species belonging to the Gammaproteobacteria class. Actinobacteria prevalence in the farm milk samples immediately after collection stood at about 25% of the clones. A previous study had found that psychrotolerant Actinobacteria identified in raw cow milk demonstrated both lipolytic and proteolytic enzymatic activity. Thus, we conclude that although Pseudomonadales play an important role in milk spoilage after long periods of cold incubation, Actinobacteria occurrence may play an important role when assessing the quality of milk arriving at the dairy plant from different farms. As new cooling technologies reduce the initial bacterial counts of milk to very low levels, more sensitive and efficient methods to evaluate the bacterial quality of raw milk are required. The present findings are an important step towards achieving this goal. Copyright © 2010 Elsevier Ltd. All rights reserved.
Baek, Hyun Jae; Shin, JaeWook; Jin, Gunwoo; Cho, Jaegeol
2017-10-24
Photoplethysmographic signals are useful for heart rate variability analysis in practical ambulatory applications. While reducing the sampling rate of signals is an important consideration for modern wearable devices that enable 24/7 continuous monitoring, few studies have investigated how to compensate for the low timing resolution of low-sampling-rate signals in accurate heart rate variability analysis. In this study, we utilized the parabola approximation method and compared it against the conventional cubic spline interpolation method for the time-, frequency-, and nonlinear-domain variables of heart rate variability. For each parameter, the intra-class correlation, standard error of measurement, Bland-Altman 95% limits of agreement and root mean squared relative error are presented. The time taken to compute each interpolation algorithm was also investigated. The results indicated that parabola approximation is a simple, fast, and accurate method for compensating for the low timing resolution of pulse beat intervals, with performance comparable to that of the conventional cubic spline interpolation method. Even though the absolute values of the heart rate variability variables calculated using a signal sampled at 20 Hz did not exactly match those calculated using a reference signal sampled at 250 Hz, the parabola approximation method remains a good interpolation method for assessing trends in HRV measurements in low-power wearable applications.
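The parabola approximation itself is compact enough to sketch. Assuming a detected local maximum at index k in the sampled pulse waveform, a parabola through the three surrounding samples gives a sub-sample estimate of the true peak time; this is the standard three-point vertex formula, shown here as an illustration rather than the authors' exact implementation.

```python
# A minimal sketch of parabola approximation for refining pulse-peak times
# from a low-sampling-rate photoplethysmogram. `signal` is the sampled
# waveform, `k` the index of a detected local maximum, `fs` the rate in Hz.
def parabolic_peak_time(signal, k, fs):
    """Fit a parabola through samples (k-1, k, k+1); return vertex time in s."""
    y0, y1, y2 = signal[k - 1], signal[k], signal[k + 1]
    denom = y0 - 2.0 * y1 + y2
    delta = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom  # sub-sample offset
    return (k + delta) / fs

# Beat-to-beat intervals for HRV then come from differences of successive
# refined peak times, compensating the coarse grid of e.g. a 20 Hz recording.
```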
Xuli, Wu; Weiyi, He; Ji, Kunmei; Wenpu, Wan; Dongsheng, Hu; Hui, Wu; Xinpin, Luo; Zhigang, Liu
2013-03-01
The ingredient declaration on food labels assumes paramount importance in the protection of food-allergic consumers. China has not yet implemented food allergen labeling. In this study, a gold immunochromatography assay (GICA) was developed using two monoclonal antibodies (mAbs) against the milk allergen β-lactoglobulin. The GICA was specific for pure milk samples with a sensitivity of 0.2 ng/mL. Milk protein traces extracted from 110 food products were detected by this method. The labels of 106 were confirmed by our GICA method: 57 food samples originally labeled as containing milk were positive for β-lactoglobulin and 49 food samples labeled as not containing milk were negative for β-lactoglobulin. However, 3 food samples labeled as containing milk were found to contain no β-lactoglobulin, whereas 1 food sample labeled as not containing milk actually contained β-lactoglobulin. First, these negatives could be due to the addition of a casein fraction. Second, some countries require food manufacturers to label all ingredients derived from milk as "containing milk" even though the ingredients contain no milk protein detectable by any method. Our GICA method thus provides a fast, simple, semiquantitative method for the determination of milk allergens such as β-lactoglobulin in foods. © 2013 Institute of Food Technologists®
Illera, Juan-Carlos; Silván, Gema; Cáceres, Sara; Carbonell, Maria-Dolores; Gerique, Cati; Martínez-Fernández, Leticia; Munro, Coralie; Casares, Miguel
2014-01-01
Monitoring ovarian cycles through hormonal analysis is important in order to improve breeding management of captive elephants, and non-invasive collection techniques are particularly interesting for this purpose. However, there are some practical difficulties in collecting proper samples, and easier and more practical methods may be an advantage for some institutions and/or some animals. This study describes the development and validation of an enzyme immunoassay (EIA) for progestins in salivary samples of African elephants, Loxodonta africana. Weekly urinary and salivary samples from five non-pregnant elephant cows aged 7-12 years were obtained for 28 weeks and analyzed using the EIA. Both techniques correlated positively (r = 0.799; P < 0.001), and the cycle characteristics obtained were identical. The results clearly show that ovarian cycles can be monitored by measuring progestins in salivary samples in the African elephant. This is a simple and non-invasive method that may be a practical alternative to other sampling methods used in the species. © 2014 Wiley Periodicals, Inc.
Treating Sample Covariances for Use in Strongly Coupled Atmosphere-Ocean Data Assimilation
NASA Astrophysics Data System (ADS)
Smith, Polly J.; Lawless, Amos S.; Nichols, Nancy K.
2018-01-01
Strongly coupled data assimilation requires cross-domain forecast error covariances; information from ensembles can be used, but limited sampling means that ensemble-derived error covariances are routinely rank deficient and/or ill-conditioned and marred by noise. Thus, they require modification before they can be incorporated into a standard assimilation framework. Here we compare methods for improving the rank and conditioning of multivariate sample error covariance matrices for coupled atmosphere-ocean data assimilation. The first method, reconditioning, alters the matrix eigenvalues directly; this preserves the correlation structures but does not remove sampling noise. We show that it is better to recondition the correlation matrix rather than the covariance matrix, as this prevents small but dynamically important modes from being lost. The second method, model state-space localization via the Schur product, effectively removes sample noise but can dampen small cross-correlation signals. A combination that exploits the merits of each is found to offer an effective alternative.
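A minimal sketch of the two treatments, assuming B is an ensemble-derived covariance matrix. Reconditioning is illustrated by flooring the eigenvalues of the correlation matrix (so the variances are preserved, in line with the abstract's recommendation); localization is an elementwise Schur product with a prescribed localization matrix. The specific eigenvalue-flooring rule is one common variant, not necessarily the one used in the paper.

```python
# A sketch under stated assumptions, not the authors' implementation.
import numpy as np

def recondition_correlation(B, kappa_max):
    """Floor the eigenvalues of the correlation matrix of B so its condition
    number does not exceed kappa_max, then restore the original variances."""
    s = np.sqrt(np.diag(B))
    C = B / np.outer(s, s)                  # correlation matrix
    w, V = np.linalg.eigh(C)
    w = np.maximum(w, w.max() / kappa_max)  # raise small eigenvalues
    C_new = V @ np.diag(w) @ V.T
    return C_new * np.outer(s, s)           # back to covariance

def localize(B, L):
    """Schur (elementwise) product localization; L is e.g. a Gaspari-Cohn
    correlation matrix that damps spurious long-range sample correlations."""
    return B * L
```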
An economic passive sampling method to detect particulate pollutants using magnetic measurements.
Cao, Liwan; Appel, Erwin; Hu, Shouyun; Ma, Mingming
2015-10-01
Identifying particulate matter (PM) emitted from industrial processes into the atmosphere is an important issue in environmental research. This paper presents a passive sampling method using simple artificial samplers that retains the advantages of bio-monitoring but overcomes some of its disadvantages. The samplers were tested in a heavily polluted area (Linfen, China) and compared to results from leaf samples. Spatial variations of magnetic susceptibility from artificial passive samplers and leaf samples show very similar patterns. Scanning electron microscopy suggests that the collected PM is mostly in the range of 2-25 μm; the frequent occurrence of spherical particles indicates that industrial combustion dominates PM emission. Magnetic properties around power plants show different features than those around other industrial facilities. This sampling method provides a suitable and economical tool for semi-quantifying the temporal and spatial distribution of air quality; the samplers can be installed in a regular grid and calibrated against the weight of collected PM. Copyright © 2015 Elsevier Ltd. All rights reserved.
Preparation of samples for leaf architecture studies, a method for mounting cleared leaves
Vasco, Alejandra; Thadeo, Marcela; Conover, Margaret; Daly, Douglas C.
2014-01-01
• Premise of the study: Several recent waves of interest in leaf architecture have shown an expanding range of approaches and applications across a number of disciplines. Despite this increased interest, examination of existing archives of cleared and mounted leaves shows that current methods for mounting, in particular, yield unsatisfactory results and deterioration of samples over relatively short periods. Although techniques for clearing and staining leaves are numerous, published techniques for mounting leaves are scarce. • Methods and Results: Here we present a complete protocol and recommendations for clearing, staining, and imaging leaves, and, most importantly, a method to permanently mount cleared leaves. • Conclusions: The mounting protocol is faster than other methods, inexpensive, and straightforward; moreover, it yields clear and permanent samples that can easily be imaged, scanned, and stored. Specimens mounted with this method preserve well, with leaves that were mounted more than 35 years ago showing no signs of bubbling or discoloration. PMID:25225627
McHale, Michael R.; McChesney, Dennis
2007-01-01
In 2003, a study was conducted to evaluate the accuracy and precision of 10 laboratories that analyze water-quality samples for phosphorus concentrations in the Catskill Mountain region of New York State. Many environmental studies in this region rely on data from these different laboratories for water-quality analyses, and the data may be used in watershed modeling and management decisions. Therefore, it is important to determine whether the data reported by these laboratories are of comparable accuracy and precision. Each laboratory was sent 12 samples for triplicate analysis for total phosphorus, total dissolved phosphorus, and soluble reactive phosphorus. Eight of these laboratories reported results that met comparability criteria for all samples; the remaining two laboratories met comparability criteria for only about half of the analyses. Neither the analytical method used nor the sample concentration ranges appeared to affect the comparability of results. The laboratories whose results were comparable gave consistently comparable results throughout the concentration range analyzed, and the differences among methods did not diminish comparability. All laboratories had high data precision as indicated by sample triplicate results. In addition, the laboratories consistently reported total phosphorus values greater than total dissolved phosphorus values, and total dissolved phosphorus values greater than soluble reactive phosphorus values, as would be expected. The results of this study emphasize the importance of regular laboratory participation in sample-exchange programs.
Duan, Wenjie
2015-01-01
Objective. Relationship, vitality, and conscientiousness are three fundamental virtues that have recently been identified as important individual differences for health, well-being, and positive development. This cross-sectional study explored the relationship between the three constructs and post-traumatic growth (PTG) in three groups: indirect trauma samples without post-traumatic stress disorder (PTSD), direct trauma samples without PTSD, and direct trauma samples with PTSD. Methods. A total of 340 community participants from Sichuan Province, Mainland China were involved in the study, most of whom had experienced the Wenchuan and Lushan earthquakes. Participants completed self-reported questionnaire packages at a single time point to obtain their scores on virtues (Chinese Virtues Questionnaire), PTSD (PTSD Checklist-Specific), and PTG (Post-traumatic Growth Inventory-Chinese). Results. Significant and positive correlations between the three virtues and PTG were identified (r = .39–.56; p < .01). Further stepwise regression analysis revealed that in the indirect trauma samples, vitality explained 32% of the variance in PTG. In the direct trauma sample without PTSD, relationship and conscientiousness together explained 32% of the variance in PTG, whereas in the direct trauma sample with PTSD, only conscientiousness accounted for 31% of the variance in PTG. Conclusion. This cross-sectional investigation partly revealed the roles of different virtues in trauma contexts. The findings suggest important implications for strengths-based treatment. PMID:25870774
Determination of Protein Content by NIR Spectroscopy in Protein Powder Mix Products.
Ingle, Prashant D; Christian, Roney; Purohit, Piyush; Zarraga, Veronica; Handley, Erica; Freel, Keith; Abdo, Saleem
2016-01-01
Protein is a principal component in commonly used dietary supplements and health food products. The analysis of these products, in their consumer package form, is of critical importance for ensuring quality and supporting label claims. A rapid test method was developed using near-infrared (NIR) spectroscopy as a complement to protein determination by the Dumas combustion method. The NIR method was found to be a rapid, low-cost, and green (no use of chemicals and reagents) complementary technique. The protein powder samples analyzed in this study were in the range of 22-90% protein. The samples were prepared as mixtures of soy protein, whey protein, and silicon dioxide ingredients, which are common in commercially sold protein powder drink-mix products on the market. A NIR regression model was developed with 17 samples within the constituent range and was validated with 20 independent samples of known protein levels (85-88%). The results show that the NIR method is capable of predicting the protein content with a bias of ±2% and a maximum bias of 3% between NIR and the external Dumas method.
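A hedged sketch of how such a NIR calibration might be set up. The abstract does not state which regression algorithm was used; partial least squares (PLS) is assumed here as a common choice for NIR spectra, and all names are illustrative.

```python
# A minimal sketch, assuming X_cal holds preprocessed absorbance spectra
# (rows) and y_cal the protein content (%) from the Dumas reference method.
from sklearn.cross_decomposition import PLSRegression

def fit_nir_protein_model(X_cal, y_cal, n_components=5):
    """X_cal: (n_samples, n_wavelengths) spectra; y_cal: reference protein %.
    The number of components would normally be chosen by cross-validation."""
    model = PLSRegression(n_components=n_components)
    model.fit(X_cal, y_cal)
    return model

# Usage: y_pred = fit_nir_protein_model(X_cal, y_cal).predict(X_val).ravel()
# The study reports NIR-vs-Dumas agreement within about +/-2% (max 3%).
```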
Detection of cocaine in cargo containers by high-volume vapor sampling: field test at Port of Miami
NASA Astrophysics Data System (ADS)
Neudorfl, Pavel; Hupe, Michael; Pilon, Pierre; Lawrence, Andre H.; Drolet, Gerry; Su, Chih-Wu; Rigdon, Stephen W.; Kunz, Terry D.; Ulwick, Syd; Hoglund, David E.; Wingo, Jeff J.; Demirgian, Jack C.; Shier, Patrick
1997-02-01
The use of marine containers is a well-known method for smuggling large shipments of drugs. Such containers are an ideal means of smuggling, as their examination is time-consuming, difficult and expensive for the importing community. At present, various methods are being studied for screening containers that would allow innocent cargo to be rapidly distinguished from suspicious cargo. Air sampling is one such method. Air is withdrawn from the inside of containers and analyzed for telltale vapors uniquely associated with the drug. The attractive feature of the technique is that the containers can be sampled without destuffing and opening, since air can be conveniently withdrawn via ventilation ducts. In the present paper, the development of an air sampling methodology for the detection of cocaine hydrochloride is discussed, and the results from a recent field test are presented. The results indicated that vapors of cocaine and its decomposition product, ecgonidine methyl ester, can serve as sensitive indicators of the presence of the drug in containers.
Comparison of DNA extraction methods for meat analysis.
Yalçınkaya, Burhanettin; Yumbul, Eylem; Mozioğlu, Erkan; Akgoz, Muslum
2017-04-15
Preventing adulteration of meat and meat products with less desirable or objectionable meat species is important not only for economic, religious and health reasons but also for fair trade practices; therefore, several methods for the identification of meat and meat products have been developed. In the present study, ten different DNA extraction methods, including the Tris-EDTA method, a modified cetyltrimethylammonium bromide (CTAB) method, the alkaline method, the urea method, the salt method, the guanidinium isothiocyanate (GuSCN) method, and the Wizard, Qiagen, Zymogen and Genespin methods, were examined to determine their relative effectiveness for extracting DNA from meat samples. The results show that the salt method is easy to perform, inexpensive and environmentally friendly. Additionally, it has the highest yield among all the isolation methods tested. We suggest this method as an alternative for DNA isolation from meat and meat products. Copyright © 2016 Elsevier Ltd. All rights reserved.
Caldas, Sergiane Souza; Soares, Bruno Meira; Abreu, Fiamma; Castro, Ítalo Braga; Fillmann, Gilberto; Primel, Ednei Gilberto
2018-03-01
This paper reports the development of an analytical method employing vortex-assisted matrix solid-phase dispersion (MSPD) for the extraction of diuron, Irgarol 1051, TCMTB (2-thiocyanomethylthiobenzothiazole), DCOIT (4,5-dichloro-2-n-octyl-3-(2H)-isothiazolin-3-one), and dichlofluanid from sediment samples. Separation and determination were performed by liquid chromatography-tandem mass spectrometry. Important MSPD parameters, such as sample mass, mass of C18, and type and volume of extraction solvent, were investigated by response surface methodology. Quantitative recoveries were obtained with 2.0 g of sediment sample, 0.25 g of C18 as the solid support, and 10 mL of methanol as the extraction solvent. The MSPD method was suitable for the extraction and determination of antifouling biocides in sediment samples, with recoveries between 61 and 103% and a relative standard deviation lower than 19%. Limits of quantification between 0.5 and 5 ng g-1 were obtained. Vortex-assisted MSPD was shown to be fast and easy to use, with the advantages of low cost and reduced solvent consumption compared to the techniques commonly employed for the extraction of booster biocides from sediment samples. Finally, the developed method was applied to real samples. The results revealed that the developed extraction method is effective and simple, thus allowing the determination of biocides in sediment samples.
A comparison of moment-based methods of estimation for the log Pearson type 3 distribution
NASA Astrophysics Data System (ADS)
Koutrouvelis, I. A.; Canavos, G. C.
2000-06-01
The log Pearson type 3 distribution is a very important model in statistical hydrology, especially for modeling annual flood series. In this paper we compare the various moment-based methods for estimating quantiles of this distribution. Besides the methods of direct and mixed moments, which were found most successful in previous studies, and the well-known indirect method of moments, we develop generalized direct moments and generalized mixed moments methods and a new method of adaptive mixed moments. The last method chooses the orders of two moments for the original observations by utilizing information contained in the sample itself. The results of Monte Carlo experiments demonstrated the superiority of this method in estimating flood events of high return periods when a large sample is available and in estimating flood events of low return periods regardless of the sample size. In addition, a comparison of simulation and asymptotic results shows that the adaptive method may be used for the construction of meaningful confidence intervals for design events based on asymptotic theory, even with small samples. The simulation results also point to the specific members of the class of generalized moments estimates which maintain small values for bias and/or mean square error.
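For orientation, the classical indirect method of moments mentioned above can be sketched as follows: fit a Pearson type 3 distribution to the logarithms of the flood series by matching their sample mean, standard deviation, and skew, then back-transform the quantile. The generalized and adaptive mixed-moment estimators developed in the paper are more involved; the base-10 logarithm below is an assumed convention.

```python
# A minimal sketch of the indirect method of moments for the log Pearson
# type 3 distribution, assuming base-10 logs of the annual flood series.
import numpy as np
from scipy import stats

def lp3_quantile(floods, return_period):
    """Estimate the flood magnitude for a given return period (years)."""
    y = np.log10(np.asarray(floods, dtype=float))
    m, s = y.mean(), y.std(ddof=1)          # mean and std of the logs
    g = stats.skew(y, bias=False)           # sample skew of the logs
    p = 1.0 - 1.0 / return_period           # non-exceedance probability
    yq = stats.pearson3.ppf(p, skew=g, loc=m, scale=s)
    return 10.0 ** yq                       # back-transform to flood magnitude
```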
Yan, Ying; Han, Bingqing; Zeng, Jie; Zhou, Weiyan; Zhang, Tianjiao; Zhang, Jiangtao; Chen, Wenxiang; Zhang, Chuanbao
2017-08-28
Potassium is an important serum ion that is frequently assayed in clinical laboratories. Quality assurance requires reference methods; thus, the establishment of a candidate reference method for serum potassium measurements is important. An inductively coupled plasma mass spectrometry (ICP-MS) method was developed. Serum samples were gravimetrically spiked with an aluminum internal standard, digested with 69% ultrapure nitric acid, and diluted to the required concentration. The 39K/27Al ratios were measured by ICP-MS in hydrogen mode. The method was calibrated using 5% nitric acid matrix calibrators, and the calibration function was established using the bracketing method. The correlation coefficients between the measured 39K/27Al ratios and the analyte concentration ratios were >0.9999. The coefficients of variation were 0.40%, 0.68%, and 0.22% for the three serum samples, and the analytical recovery was 99.8%. The accuracy of the measurement was also verified by measuring the certified reference materials SRM909b and SRM956b. Comparison with the routine ion-selective electrode method and international inter-laboratory comparisons gave satisfactory results. The new ICP-MS method is specific, precise, simple, and low-cost, and it may be used as a candidate reference method for standardizing serum potassium measurements.
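The bracketing calibration step can be sketched simply. Assuming a linear response, the sample's 39K/27Al intensity ratio is converted to concentration using the response factors of the two calibrators measured around it; the plain averaging shown is an illustrative simplification of time interpolation, and all names are illustrative.

```python
# A minimal sketch of bracketing calibration, not the authors' exact code.
def bracketing(ratio_sample, ratio_cal1, conc_cal1, ratio_cal2, conc_cal2):
    """Interpolate the response factor between two bracketing calibrators."""
    f1 = ratio_cal1 / conc_cal1        # response factor of calibrator 1
    f2 = ratio_cal2 / conc_cal2        # response factor of calibrator 2
    f = 0.5 * (f1 + f2)                # averaged (time-interpolated) factor
    return ratio_sample / f            # sample concentration ratio
```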
Winhusen, Theresa; Winstanley, Erin L; Somoza, Eugene; Brigham, Gregory
2012-01-01
Recruitment method can impact the sample composition of a clinical trial and, thus, the generalizability of the results, but the importance of recruitment method in substance use disorder trials has received little attention. The present paper sought to address this research gap by evaluating the association between recruitment method and sample characteristics and treatment outcomes in a substance use disorder trial. In a multi-site trial evaluating Seeking Safety (SS), relative to Women's Health Education (WHE), for women with co-occurring PTSD (either sub-threshold or full PTSD) and substance use disorders, one site assessed the method by which each participant was recruited. Data from this site (n=106), which recruited participants from newspaper advertising and clinic intakes, were analyzed. Participants recruited through advertising, relative to those from the clinic, had significantly higher levels of baseline drug use and higher rates of meeting DSM-IV-TR criteria for full PTSD. Results suggest that the effectiveness of SS in decreasing PTSD symptoms was greater for participants recruited through advertising relative to those recruited from the clinic. Conversely, the results revealed a significant treatment effect in the clinic-recruited participants, not seen in the advertising-recruited participants, with SS, relative to WHE, participants being more likely to report past week drug use during the follow-up phase. Recruitment method may impact sample composition and treatment effects. Replication of this finding would have important implications for substance use disorder efficacy trials which often utilize advertising to recruit participants. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
Fernandes, Richard; Carey, Conn; Hynes, James; Papkovsky, Dmitri
2013-01-01
The importance of food safety has resulted in a demand for a more rapid, high-throughput method for total viable count (TVC). The industry standard for TVC determination (ISO 4833:2003) is widely used but presents users with some drawbacks. The method is materials- and labor-intensive, requiring multiple agar plates per sample. More importantly, the method is slow, with 72 h typically required for a definitive result. Luxcel Biosciences has developed the GreenLight Model 960, a microtiter plate-based assay providing a rapid high-throughput method of aerobic bacterial load assessment through analysis of microbial oxygen consumption. Results are generated in 1-12 h, depending on microbial load. The mix and measure procedure allows rapid detection of microbial oxygen consumption and equates oxygen consumption to microbial load (CFU/g), providing a simple, sensitive means of assessing the microbial contamination levels in foods (1). As bacteria in the test sample grow and respire, they deplete O2, which is detected as an increase in the GreenLight probe signal above the baseline level (2). The time required to reach this increase in signal can be used to calculate the CFU/g of the original sample, based on a predetermined calibration. The higher the initial microbial load, the earlier this threshold is reached (1).
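As a sketch of the final calibration step, growth-based assays of this kind typically assume a linear relation between log10(CFU/g) and the time at which the probe signal crosses its threshold; higher initial loads cross earlier. The coefficients below would come from a predetermined calibration with spiked standards, and all names and the linear model are assumptions, not the vendor's published algorithm.

```python
# A minimal sketch, assuming log10(CFU/g) decreases linearly with
# time-to-threshold; a and b come from a predetermined calibration.
def cfu_from_threshold_time(t_threshold_h, a, b):
    """Convert time-to-threshold (hours) into an estimated load in CFU/g."""
    return 10.0 ** (a - b * t_threshold_h)
```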
Pérez-Zárate, Pamela; Aragón-Piña, Antonio; Soria-Guerra, Ruth Elena; González-Amaro, Ana María; Pérez-Urizar, José; Pérez-González, Luis Fernando; Martinez-Gutierrez, Fidel
2015-11-01
To determine the significance of risk factors associated with the presence of biofilm on catheters of patients attended at a tertiary care hospital. A total of 126 patients were included; data were collected by observing the handling of the CVC and reviewing clinical histories, and three microbiological isolation methods for CVC tips (roll-plate, sonication and scanning electron microscopy) were evaluated. Certain factors, such as the lack of proper hand washing, the use of primary barriers and preparing medications in the same hospital service, showed an important relationship with biofilm formation on CVCs. With the sonication method, most of the samples yielded multispecies isolates (29 samples, 64%); in contrast, with the roll-plate method only one sample (3%) did. The importance of strict aseptic techniques for the insertion and handling of CVCs was highlighted; failure of both techniques was related to biofilm formation and was evidenced using scanning electron microscopy. Since this tool is not available in most hospitals, we present the correlation of this evidence with other standard microbiological methods and risk factors, which is necessary for the sensitive detection of the different steps of biofilm formation on CVCs and their correct interpretation alongside clinical evidence. Copyright © 2015 Elsevier Ltd. All rights reserved.
Preanalytical requirements of urinalysis
Delanghe, Joris; Speeckaert, Marijn
2014-01-01
Urine may be a waste product, but it contains an enormous amount of information. Well-standardized procedures for collection, transport, sample preparation and analysis should become the basis of an effective diagnostic strategy for urinalysis. As reproducibility of urinalysis has been greatly improved due to recent technological progress, preanalytical requirements of urinalysis have gained importance and have become stricter. Since the patients themselves often sample urine specimens, urinalysis is very susceptible to preanalytical issues. Various sampling methods and inappropriate specimen transport can cause important preanalytical errors. The use of preservatives may be helpful for particular analytes. Unfortunately, a universal preservative that allows a complete urinalysis does not (yet) exist. The preanalytical aspects are also of major importance for newer applications (e.g. metabolomics). The present review deals with the current preanalytical problems and requirements for the most common urinary analytes. PMID:24627718
MOHEBALI, Mehdi; ZAREI, Zabiholah; Khanaliha, Khadijeh; KIA, Eshrat Beigom; MOTAVALLI-HAGHI, Afsaneh; DAVOODI, Jaber; REZAEIAN, Tahereh; TARIGHI, Fathemeh; REZAEIAN, Mostafa
2017-01-01
Background: The majority of parasitic infections in rodents have zoonotic importance. This study aimed to determine the frequency and intensity of intestinal protozoa infections in rodents including Meriones persicus, Mus musculus and Cricetulus migratorius. Methods: This survey was conducted in Meshkin Shahr district in northwestern Iran from March to December 2014. Intestinal samples of 204 rodents including M. persicus (n=117), M. musculus (n=63) and C. migratorius (n=24) were parasitologically examined. The formalin-ether concentration method was applied to all rodent stool samples, which were examined by light microscopy. All suspected cases were stained using the trichrome staining method. Cultivation in 2.5% potassium dichromate was carried out for all coccidian-positive samples. Acid-fast and aniline blue staining methods were used for detecting coccidian oocysts and intestinal microsporidial spores, respectively. Results: Overall, 121 (59.3%) of the caught rodents were infected with intestinal protozoa. Entamoeba muris 14 (6.9%), Trichomonas muris 55 (27.0%), Chilomastix betencourtti 17 (8.3%), Giardia muris 19 (9.3%), Eimeria spp. 46 (22.5%), Isospora spp. 4 (2%) and Cryptosporidium spp. 1 (0.5%) were found in the collected rodents. Microsporidian spores were identified in 63 (31%) of the 204 collected rodents using the aniline blue staining method. Conclusion: Since some of these infections are of zoonotic importance, control of rodents could decrease new cases of parasitic zoonoses in humans. PMID:28979348
Spindler, Patrice; Paretti, Nick V.
2007-01-01
The Arizona Department of Environmental Quality (ADEQ) and the U.S. Environmental Protection Agency (USEPA) Ecological Monitoring and Assessment Program (EMAP), use different field methods for collecting macroinvertebrate samples and habitat data for bioassessment purposes. Arizona’s Biocriteria index was developed using a riffle habitat sampling methodology, whereas the EMAP method employs a multi-habitat sampling protocol. There was a need to demonstrate comparability of these different bioassessment methodologies to allow use of the EMAP multi-habitat protocol for both statewide probabilistic assessments for integration of the EMAP data into the national (305b) assessment and for targeted in-state bioassessments for 303d determinations of standards violations and impaired aquatic life conditions. The purpose of this study was to evaluate whether the two methods yield similar bioassessment results, such that the data could be used interchangeably in water quality assessments. In this Regional EMAP grant funded project, a probabilistic survey of 30 sites in the Little Colorado River basin was conducted in the spring of 2007. Macroinvertebrate and habitat data were collected using both ADEQ and EMAP sampling methods, from adjacent reaches within these stream channels.
All analyses indicated that the two macroinvertebrate sampling methods were significantly correlated. ADEQ and EMAP samples were classified into the same scoring categories (meeting, inconclusive, violating the biocriteria standard) 82% of the time. When the ADEQ-IBI was applied to both the ADEQ and EMAP taxa lists, the resulting IBI scores were significantly correlated (r=0.91), even though only 4 of the 7 metrics in the IBI were significantly correlated. The IBI scores from both methods were significantly correlated to the percent of riffle habitat, even though the average percent riffle habitat was only 30% of the stream reach. Multivariate analyses found that the percent riffle was an important attribute for both datasets in classifying IBI scores into assessment categories.
Habitat measurements generated from EMAP and ADEQ methods were also significantly correlated; 13 of 16 habitat measures were significantly correlated (p<0.01). The visual-based percentage estimates of riffle and pool habitat, vegetative cover and percent canopy cover, and the substrate measurements of percent fine substrate and embeddedness were all remarkably similar, given the different field methods used. A multivariate analysis identified substrate and flow conditions, as well as canopy cover, as important combinations of habitat attributes affecting the IBI scores of both methods. These results indicate that similar habitat measures can be obtained using two different field sampling protocols. In addition, similar combinations of these habitat parameters were important to macroinvertebrate community condition in multivariate analyses of both the ADEQ and EMAP datasets.
These results indicate that the two sampling methods for macroinvertebrates and habitat data yield very similar bioassessment results and stressor identifications. While the bioassessment category was not identical for all sites, overall the assessments were significantly correlated, providing similar bioassessment results for the cold water streams used in this study. The findings indicate that ADEQ can utilize either a riffle-based sampling methodology or a multi-habitat sampling approach in cold water streams, as both yield similar results relative to the macroinvertebrate assemblage. These results will allow use of either macroinvertebrate dataset to determine water quality standards compliance with the ADEQ Indexes of Biological Integrity, for which threshold values were recently incorporated into the Arizona Surface Water Quality Standards. While this survey did not include the warm water desert streams of Arizona, we would predict that EMAP and ADEQ sampling methodologies would provide similar bioassessment results there as well, since we have found that the percent riffle habitat in cold and warm water perennial, wadeable streams does not differ significantly. However, a comparison study of sampling methodologies in warm water streams should be conducted to confirm this predicted similarity. ADEQ will continue to implement a monitoring strategy that includes probabilistic monitoring for a statewide ecological assessment of stream conditions. Conclusions from this study will guide decisions regarding the most appropriate sampling methods for future probabilistic monitoring sample plans.
Sample selection via angular distance in the space of the arguments of an artificial neural network
NASA Astrophysics Data System (ADS)
Fernández Jaramillo, J. M.; Mayerle, R.
2018-05-01
In the construction of an artificial neural network (ANN), a proper split of the available samples plays a major role in the training process. The selection of subsets for training, testing, and validation affects the generalization ability of the network, and the number of samples also has an impact on the time required to design and train the ANN. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the time required to calculate the network coefficients while preserving diversity and avoiding overtraining of the ANN due to the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each representing one input of the neural network. When the angle formed between samples is smaller than a defined threshold, only one input is accepted for training. The accepted inputs are scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the outputs are inaccurate and depend on the selection of the first sample, but as the number of inputs increases the accuracy improves and the differences among scenarios with different starting samples are substantially reduced. A comparison with the K-means clustering algorithm shows that, for this application, the proposed method produces a more accurate network from a smaller number of samples.
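A minimal sketch of this style of angular filtering is given below, assuming a samples-by-features array and a hypothetical angle threshold; the greedy single pass also makes visible the dependence on the first sample that the cross-validation in the abstract reports.

```python
import numpy as np

def select_by_angle(X, threshold_deg=5.0):
    """Greedy angular filtering of training samples.

    X: (n_samples, n_features) array of nonzero input vectors.
    threshold_deg: hypothetical minimum pairwise angle; a candidate is
    skipped when its angle to any already-kept sample is smaller.
    Returns the indices of the kept samples.
    """
    cos_thr = np.cos(np.radians(threshold_deg))
    kept_idx, kept_vecs = [], []
    for i, x in enumerate(X):
        v = x / np.linalg.norm(x)
        # accept only if the angle to every kept sample exceeds the
        # threshold, i.e. all cosine similarities stay below cos(threshold)
        if all(np.dot(v, u) < cos_thr for u in kept_vecs):
            kept_idx.append(i)
            kept_vecs.append(v)
    return kept_idx
```

Because the pass is greedy, reordering X (in particular, changing the first sample) can change the selected subset, which matches the behaviour observed for small sample counts.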
Jantzi, Sarah C; Almirall, José R
2011-07-01
A method for the quantitative elemental analysis of surface soil samples using laser-induced breakdown spectroscopy (LIBS) was developed and applied to the analysis of bulk soil samples for discrimination between specimens. The use of a 266 nm laser for LIBS analysis is reported for the first time in forensic soil analysis. Optimization of the LIBS method is discussed, and the results compared favorably to a laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) method previously developed. Precision for both methods was <10% for most elements. LIBS limits of detection were <33 ppm and bias <40% for most elements. In a proof of principle study, the LIBS method successfully discriminated samples from two different sites in Dade County, FL. Analysis of variance, Tukey's post hoc test and Student's t test resulted in 100% discrimination with no type I or type II errors. Principal components analysis (PCA) resulted in clear groupings of the two sites. A correct classification rate of 99.4% was obtained with linear discriminant analysis using leave-one-out validation. Similar results were obtained when the same samples were analyzed by LA-ICP-MS, showing that LIBS can provide similar information to LA-ICP-MS. In a forensic sampling/spatial heterogeneity study, the variation between sites, between sub-plots, between samples and within samples was examined on three similar Dade sites. The closer the sampling locations, the closer the grouping on a PCA plot and the higher the misclassification rate. These results underscore the importance of careful sampling for geographic site characterization.
Interpretation of Blood Microbiology Results - Function of the Clinical Microbiologist.
Kristóf, Katalin; Pongrácz, Júlia
2016-04-01
The proper use and interpretation of blood microbiology results may be one of the most challenging and most important functions of clinical microbiology laboratories. Effective implementation of this function requires careful consideration of specimen collection and processing, pathogen detection techniques, and prompt and precise reporting of identification and susceptibility results. The responsibility of the treating physician is the proper formulation of the analytical request and providing the laboratory with complete and precise patient information, which are indispensable prerequisites of proper testing and interpretation. The clinical microbiologist can offer advice concerning the differential diagnosis, sampling techniques, and detection methods to facilitate diagnosis. Rapid detection methods are essential, since the sooner a pathogen is detected, the better the patient's chance of being cured. Besides the gold-standard blood culture technique, microbiological methods that decrease the time to a relevant result are increasingly utilized today. In the case of certain pathogens, the pathogen can be identified directly from the blood culture bottle after propagation with serological or automated/semi-automated systems, molecular methods, or MALDI-TOF MS (matrix-assisted laser desorption/ionization time-of-flight mass spectrometry). Molecular biology methods are also suitable for the rapid detection and identification of pathogens from aseptically collected blood samples. Another important duty of the microbiology laboratory is to notify the treating physician immediately of all relevant information when a positive sample is detected. The clinical microbiologist may provide important guidance regarding the clinical significance of blood isolates, since one-third to one-half of blood culture isolates are contaminants or isolates of unknown clinical significance. To fully exploit the benefits of blood culture and other (non-culture-based) diagnostics, the microbiologist and the clinician should interact directly.
Online air analysis of reduced sulfur compounds at a swine facility
USDA-ARS's Scientific Manuscript database
Reduced sulfur compounds are emitted from waste management handling and can be important in odor production and atmospheric chemistry. Data on the emissions of these compounds have been obtained using off-line sampling and analysis methods, but on-line methods providing information on temporal chang...
USDA-ARS's Scientific Manuscript database
Importance: Human milk is the subject of many nutrition studies but methods for representative sample collection are not established. Our recently improved, validated methods for analyzing micronutrients in human milk now enable systematic study of factors affecting their concentration. Objective:...
METHOD FOR MEASURING BASE/NEUTRAL AND CARBAMATE PESTICIDES IN PERSONAL DIETARY SAMPLES
Dietary uptake may be a significant pathway of exposure to contaminants. As such, dietary exposure assessments should be considered an important part of the total exposure assessment process. The objective of this work was to develop reliable methods that are applicable to a wide...
Bai, Lu; Guo, Sen; Liu, Qingchao; Cui, Xueqin; Zhang, Xinxin; Zhang, Li; Yang, Xinwen; Hou, Manwei; Ho, Chi-Tang; Bai, Naisheng
2016-04-01
Polyphenols are important bioactive substances in apple. To explore the profiles of nine representative polyphenols in this fruit, a high-performance liquid chromatography method was established and validated. The validated method was successfully applied to the simultaneous characterization and quantification of these nine polyphenols in 11 apple extracts obtained from six cultivars from Shaanxi Province, China. The results showed that only the abscission sample of the Fuji apple was rich in all nine polyphenols, while the polyphenol contents of the other samples varied. Although all the samples were collected in the same region, the contents of the nine polyphenols differed. The proposed method could serve as a prerequisite for quality control of Malus products. Copyright © 2015. Published by Elsevier B.V.
Chen, Wei-Qiang; Obermayr, Philipp; Černigoj, Urh; Vidič, Jana; Panić-Janković, Tanta; Mitulović, Goran
2017-11-01
Classical proteomics approaches involve enzymatic hydrolysis of proteins (either separated on polyacrylamide gels or in solution) followed by peptide identification using LC-MS/MS analysis. This method normally requires more than 16 h to complete. In clinical analysis, it is of the utmost importance to provide fast and reproducible analysis with minimal manual sample handling. Herein we report method development for online protein digestion on immobilized monolithic enzymatic reactors (IMER) to accelerate protein digestion, reduce manual sample handling, and bring reproducibility to the digestion process in the clinical laboratory. An integrated online digestion and separation method using a monolithic immobilized enzymatic reactor was developed and applied to the digestion and separation of in-vitro-fertilization media. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The Vineyard Yeast Microbiome, a Mixed Model Microbial Map
Setati, Mathabatha Evodia; Jacobson, Daniel; Andong, Ursula-Claire; Bauer, Florian
2012-01-01
Vineyards harbour a wide variety of microorganisms that play a pivotal role in pre- and post-harvest grape quality and contribute significantly to the final aromatic properties of wine. The aim of the current study was to investigate the spatial distribution of microbial communities within and between individual vineyard management units. For the first time in such a study, we applied the Theory of Sampling (TOS) to sample grapes from adjacent, well-established commercial vineyards within the same terroir unit and from several sampling points within each individual vineyard. Cultivation-based and molecular data sets were generated to capture the spatial heterogeneity in microbial populations within and between vineyards and analysed with novel mixed-model networks, which combine sample correlations and microbial community distribution probabilities. The data demonstrate that farming systems have a significant impact on fungal diversity but, more importantly, that there is significant species heterogeneity between samples from the same vineyard. Cultivation-based methods confirmed that while the same oxidative yeast species dominated in all vineyards, the least treated vineyard displayed significantly higher species richness, including many yeasts with biocontrol potential. The cultivatable yeast population was not fully representative of the more complex populations seen with molecular methods, and only the molecular data allowed discrimination amongst farming practices with multivariate and network analysis methods. Importantly, yeast species distribution is subject to significant intra-vineyard spatial fluctuations, and the frequently reported heterogeneity of tank samples of grapes harvested from single vineyards at the same stage of ripeness might therefore, at least in part, be due to the differing microbiota in different sections of the vineyard. PMID:23300721
Analysis of Endocrine Disrupting Pesticides by Capillary GC with Mass Spectrometric Detection
Matisová, Eva; Hrouzková, Svetlana
2012-01-01
Endocrine disrupting chemicals, among them many pesticides, alter the normal functioning of the endocrine system of both wildlife and humans at very low concentration levels. Therefore, the importance of method development for their analysis in food and the environment is increasing. This also covers contributions in the field of ultra-trace analysis of multicomponent mixtures of organic pollutants in complex matrices. Accordingly, conventional capillary gas chromatography (CGC) and fast CGC with mass spectrometric detection (MS) have acquired real importance in the analysis of endocrine disrupting pesticide (EDP) residues. This paper provides an overview of GC methods, including sample preparation steps, for the analysis of EDPs in a variety of matrices at ultra-trace concentration levels. Emphasis is put on the separation method, the mode of MS detection and ionization, and the obtained limits of detection and quantification. Analysis time is one of the most important aspects to consider in the choice of analytical methods for routine analysis; the benefits of the developed fast GC methods are therefore important. PMID:23202677
NASA Technical Reports Server (NTRS)
Horai, K.-I.
1981-01-01
A theory of the measurement of the thermal diffusivity of a sample by the modified Angstrom method is developed for the case in which radiative heat loss from the end surface of the sample is not negligible, and applied to measurements performed on lunar samples. Formulas allowing sample thermal diffusivity to be determined from the amplitude decay and phase lag of a temperature wave traveling through the sample are derived for a flat disk sample for which only heat loss from the end surface is important, and a sample of finite diameter and length for which heat loss through the end and side surfaces must be considered. It is noted that in the case of a flat disk, measurements at a single angular frequency of the temperature wave are sufficient, while the sample of finite diameter and length requires measurements at two discrete angular frequencies. Comparison of the values of the thermal diffusivities of two lunar samples of dimensions approximately 1 x 1 x 2 cm derived by the present methods and by the Angstrom theory for a finite bar reveals them to differ by not more than 5%, and indicates that more refined data are required as the measurement theory becomes more complicated.
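For orientation, the classical Angstrom relation for a long rod, in which the product of the amplitude-decay and phase-lag terms cancels the lateral heat-loss contribution, can be evaluated as below; this is the textbook formula, not the modified end-loss expressions derived in the paper for flat disks and short samples.

```python
import numpy as np

def angstrom_diffusivity(length_m, period_s, amp_ratio, phase_lag_rad):
    """Classical Angstrom estimate: alpha = omega * L^2 / (2 * dphi * ln(A1/A2)),
    where L is the distance between the two measurement points, A1/A2 the
    amplitude ratio, and dphi the phase lag of the temperature wave."""
    omega = 2.0 * np.pi / period_s
    return omega * length_m**2 / (2.0 * phase_lag_rad * np.log(amp_ratio))

# e.g. a 2 cm separation, 600 s period, amplitude ratio 3, phase lag 1.1 rad:
# angstrom_diffusivity(0.02, 600.0, 3.0, 1.1)  ->  ~1.7e-6 m^2/s
```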
Alp, Alpaslan; Us, Dürdal; Hasçelik, Gülşen
2004-01-01
Rapid quantitative molecular methods are very important for the diagnosis of human immunodeficiency virus (HIV) infections, assessment of prognosis, and follow-up. The purpose of this study was to compare and evaluate the performance of a conventional manual extraction method and the automated MagNA Pure system for the nucleic acid isolation step, which is the first and most important step in the molecular diagnosis of HIV infections. Plasma samples of 35 patients, in which anti-HIV antibodies were found positive by microparticle enzyme immunoassay and confirmed by immunoblotting, were included in the study. The nucleic acids obtained simultaneously by the manual isolation kit (Cobas Amplicor HIV-1 Monitor Test, version 1.5, Roche Diagnostics) and the automated system (MagNA Pure LC Total Nucleic Acid Isolation Kit, Roche Diagnostics) were amplified and detected on a Cobas Amplicor (Roche Diagnostics) instrument. Twenty-three of 35 samples (65.7%) were found positive and 9 (25.7%) negative by both methods. The agreement between the methods was 91.4% for qualitative results. Viral RNA copies detected by the manual and MagNA Pure isolation methods ranged from 76 to 7,590,000 (mean: 487,143) and from 113 to 20,300,000 (mean: 2,174,097) copies/ml, respectively. When both the overall and the individual results were evaluated, the numbers of RNA copies obtained with the automated system were higher than with the manual method (p<0.05). Three samples with low numbers of nucleic acids (113, 773, and 857 copies/ml, respectively) by MagNA Pure yielded negative results with the manual method. In conclusion, the automated MagNA Pure system was found to be a reliable, rapid, and practical method for the isolation of HIV-RNA.
Sample size for post-marketing safety studies based on historical controls.
Wu, Yu-te; Makuch, Robert W
2010-08-01
As part of a drug's entire life cycle, post-marketing studies are an important part of the identification of rare, serious adverse events. Recently, the US Food and Drug Administration (FDA) has begun to implement new post-marketing safety mandates as a consequence of increased emphasis on safety. The purpose of this research is to provide an exact sample size formula for the proposed hybrid design, based on a two-group cohort study with incorporation of historical external data. An exact sample size formula based on the Poisson distribution is developed, because the detection of rare events is the outcome of interest. The performance of the exact method is compared to its approximate large-sample counterpart. The proposed hybrid design requires a smaller sample size compared to the standard two-group prospective study design. In addition, the exact method reduces the number of subjects required in the treatment group by up to 30% compared to the approximate method for the study scenarios examined. The proposed hybrid design retains the advantages and rationale of the two-group design with generally smaller required sample sizes. 2010 John Wiley & Sons, Ltd.
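As a generic illustration of Poisson-exact sizing (not the authors' hybrid-design formula, which incorporates the historical controls), the smallest cohort in which at least one rare event is observed with a given probability follows directly from the Poisson zero-count term:

```python
import math

def min_subjects_for_one_event(rate_per_subject, power=0.80):
    """Smallest n with P(at least one event) >= power when the event
    count is Poisson with mean n * rate_per_subject; solves
    1 - exp(-n * rate) >= power for n."""
    return math.ceil(-math.log(1.0 - power) / rate_per_subject)

# To observe a 1-in-10,000 adverse event with 80% probability:
# min_subjects_for_one_event(1e-4)  ->  16095 subjects
```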
Raessler, Michael; Wissuwa, Bianka; Breul, Alexander; Unger, Wolfgang; Grimm, Torsten
2008-09-10
The exact and reliable determination of carbohydrates in plant samples of different origin is of great importance with respect to plant physiology. Additionally, the identification and quantification of carbohydrates are necessary for the evaluation of the impact of these compounds on the biogeochemistry of carbon. To attain this goal, it is necessary to analyze a great number of samples with both high sensitivity and selectivity within a limited time frame. This paper presents a rugged and easy method that allows the isocratic chromatographic determination of 12 carbohydrates and sugar alcohols from one sample within 30 min. The method was successfully applied to a variety of plant materials with particular emphasis on perennial ryegrass samples of the species Lolium perenne. The method was easily extended to the analysis of the polysaccharide inulin after its acidic hydrolysis into the corresponding monomers without the need for substantial change of chromatographic conditions or even the use of enzymes. It therefore offers a fundamental advantage for the analysis of the complex mixture of nonstructural carbohydrates often found in plant samples.
Exploring RNA structure and dynamics through enhanced sampling simulations.
Mlýnský, Vojtěch; Bussi, Giovanni
2018-04-01
RNA function is intimately related to its structural dynamics. Molecular dynamics simulations are useful for exploring biomolecular flexibility but are severely limited by the accessible timescale. Enhanced sampling methods allow this timescale to be effectively extended in order to probe biologically relevant conformational changes and chemical reactions. Here, we review the role of enhanced sampling techniques in the study of RNA systems. We discuss the challenges and promises associated with the application of these methods to force-field validation, exploration of conformational landscapes and ion/ligand-RNA interactions, as well as catalytic pathways. Important technical aspects of these methods, such as the choice of the biased collective variables and the analysis of multi-replica simulations, are examined in detail. Finally, a perspective on the role of these methods in the characterization of RNA dynamics is provided. Copyright © 2018 Elsevier Ltd. All rights reserved.
The Direct Lighting Computation in Global Illumination Methods
NASA Astrophysics Data System (ADS)
Wang, Changyaw Allen
1994-01-01
Creating realistic images is a computationally expensive process, but it is very important for applications such as interior design, product design, education, virtual reality, and movie special effects. To generate realistic images, state-of-the-art rendering techniques are employed to simulate global illumination, which accounts for the interreflection of light among objects. In this document, we formalize the global illumination problem as an eight-dimensional integral and discuss various methods that can accelerate the process of approximating this integral. We focus on the direct lighting computation, which accounts for the light reaching the viewer from the emitting sources after exactly one reflection; on Monte Carlo sampling methods; and on light source simplification. Results include a new sample generation method, a framework for predicting the total number of samples used in a solution, and a generalized Monte Carlo approach for computing the direct lighting from an environment, which for the first time makes ray tracing feasible for highly complex environments.
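The direct-lighting estimator described here — sampling points on an emitter and averaging the single-bounce contribution — can be sketched as follows. All geometry names are illustrative rather than taken from the thesis, the BRDF is assumed Lambertian, and occlusion (shadow-ray) testing is omitted for brevity.

```python
import numpy as np

def direct_lighting(x, n_x, corner, edge_u, edge_v, emission,
                    albedo=0.8, n_samples=64, rng=np.random.default_rng(0)):
    """Monte Carlo estimate of direct radiance at point x (normal n_x)
    from a rectangular area light spanned by edge_u, edge_v at corner.
    Uniform area sampling, so 1/pdf equals the light's area."""
    normal_l = np.cross(edge_u, edge_v)
    area = np.linalg.norm(normal_l)
    n_l = normal_l / area
    brdf = albedo / np.pi                      # Lambertian BRDF
    total = 0.0
    for _ in range(n_samples):
        y = corner + rng.random() * edge_u + rng.random() * edge_v
        d = y - x
        r2 = d @ d
        w = d / np.sqrt(r2)
        cos_x = max(w @ n_x, 0.0)              # cosine at the receiver
        cos_y = max(-(w @ n_l), 0.0)           # cosine at the light
        total += emission * brdf * cos_x * cos_y / r2
    return total * area / n_samples
```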
Sample normalization methods in quantitative metabolomics.
Wu, Yiman; Li, Liang
2016-01-22
To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis, including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on the quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature, and comment on their applicability in real-world metabolomics applications. Sample normalization has sometimes been ignored in metabolomics, partially due to the lack of a convenient means of performing it. We show that several methods are now available and that sample normalization should be performed in quantitative metabolomics whenever the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
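Two widely used normalization schemes of the kind surveyed in such reviews, total-sum scaling and probabilistic quotient normalization (PQN), can be sketched for a samples-by-metabolites intensity matrix; this is a generic illustration, not the review's specific recommendation.

```python
import numpy as np

def total_sum_normalize(X):
    """Scale each sample (row) so its summed intensity equals the
    cohort-mean total -- the simplest correction for differences in
    total sample amount."""
    totals = X.sum(axis=1, keepdims=True)
    return X * (totals.mean() / totals)

def pqn_normalize(X):
    """Probabilistic quotient normalization: divide each sample by the
    median fold change relative to the median (reference) spectrum."""
    ref = np.median(X, axis=0)
    ref = np.where(ref == 0, np.finfo(float).eps, ref)  # avoid 0-division
    factors = np.median(X / ref, axis=1, keepdims=True)
    return X / factors
```

PQN is usually applied after an initial total-sum scaling, so that the quotients are taken on comparably scaled spectra.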
ERIC Educational Resources Information Center
Collishaw, Stephan; Pickles, Andrew; Messer, Julie; Rutter, Michael; Shearer, Christina; Maughan, Barbara
2007-01-01
Objective: Child abuse is an important risk for adult psychiatric morbidity. However, not all maltreated children experience mental health problems as adults. The aims of the present study were to address the extent of resilience to adult psychopathology in a representative community sample, and to explore predictors of a good prognosis. Methods:…
ERIC Educational Resources Information Center
Kocman, Andreas; Fischer, Linda; Weber, Germain
2018-01-01
Background: Obtaining employment is among the most important ambitions of people with intellectual disability. Progress towards comprehensive inclusive employment is hampered by numerous barriers. Limited research is available on these barriers and strategies to overcome them. Method: A mixed method approach in a sample of 30 HR-managers was used…
Karpasitou, Katerina; Drago, Francesca; Crespiatico, Loretta; Paccapelo, Cinzia; Truglio, Francesca; Frison, Sara; Scalamogna, Mario; Poli, Francesca
2008-03-01
Traditionally, blood group typing has been performed with serologic techniques, the classical method being the hemagglutination test. Serotyping, however, may present important limitations, such as the scarce availability of rare antisera and difficulties in typing recently transfused patients and those with a positive direct antiglobulin test. Consequently, serologic tests are being complemented with molecular methods. The aim of this study was to develop a low-cost, high-throughput method for large-scale genotyping of red blood cells (RBCs). Single-nucleotide polymorphisms associated with some clinically important blood group antigens, as well as with certain rare blood antigens, were evaluated: Jk(a)/Jk(b), Fy(a)/Fy(b), S/s, K/k, Kp(a)/Kp(b), Js(a)/Js(b), Co(a)/Co(b), and Lu(a)/Lu(b). Polymerase chain reaction (PCR)-amplified targets were detected by direct hybridization to microspheres coupled to allele-specific oligonucleotides. Cutoff values for each genotype were established with phenotyped and/or genotyped samples. The method was validated with a blind panel of 92 blood donor samples. The results were fully concordant with those provided by hemagglutination assays and/or sequence-specific primer (SSP)-PCR. The method was subsequently evaluated with approximately 800 blood donor and patient samples. This study presents a flexible, quick, and economical method for complete genotyping of large donor cohorts for RBC alleles.
Bee (Hymenoptera: Apoidea) Diversity and Sampling Methodology in a Midwestern USA Deciduous Forest.
McCravy, Kenneth W; Ruholl, Jared D
2017-08-04
Forests provide potentially important bee habitat, but little research has been done on forest bee diversity or the relative effectiveness of bee sampling methods in this environment. Bee diversity and sampling methodology were studied in an Illinois, USA upland oak-hickory forest using elevated and ground-level pan traps, Malaise traps, and vane traps. In total, 854 bees representing 55 species were collected. Elevated pan traps collected the greatest number of bees (473), but ground-level pan traps collected greater species diversity (based on Simpson's diversity index) than did elevated pan traps. Elevated and ground-level pan traps collected the greatest bee species richness, with 43 and 39 species, respectively. An estimated sample size increase of over 18-fold would be required to approach minimum asymptotic richness using ground-level pan traps. Among pan trap colors/elevations, elevated yellow pan traps collected the greatest number of bees (266) but the lowest diversity. Malaise traps were relatively ineffective, collecting only 17 bees. Vane traps collected relatively low species richness (14 species), and the Chao1 and abundance coverage estimators suggested that minimum asymptotic species richness was approached for that method. Bee species composition differed significantly between elevated pan traps, ground-level pan traps, and vane traps. Indicator species were significantly associated with each of these trap types, as well as with particular pan trap colors/elevations. These results indicate that Midwestern deciduous forests provide important bee habitat and that the performance of common bee sampling methods varies substantially in this environment.
Static versus dynamic sampling for data mining
DOE Office of Scientific and Technical Information (OSTI.GOV)
John, G.H.; Langley, P.
1996-12-31
As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
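A minimal sketch of the dynamic scheme, assuming a hypothetical train_and_score(n) callback that fits the mining tool on n records and returns an accuracy estimate; the sample is declared "probably close enough" when growing it stops paying off.

```python
def dynamic_sample_size(train_and_score, n0=1000, growth=2.0,
                        eps=1e-3, n_max=10**7):
    """Grow the training sample geometrically until the mining tool's
    score improves by no more than eps, then return that sample size.
    train_and_score is an assumed callback, not part of the original
    paper."""
    n, score = n0, train_and_score(n0)
    while n * growth <= n_max:
        n_next = int(n * growth)
        score_next = train_and_score(n_next)
        if score_next - score <= eps:
            return n        # further data adds little: close enough
        n, score = n_next, score_next
    return n
```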
Bittner, Tonya D; Hajek, Ann E; Liebhold, Andrew M; Thistle, Harold
2017-09-01
The goal of this study was to develop effective and practical field sampling methods for quantification of aerial deposition of airborne conidia of Entomophaga maimaiga over space and time. This important fungal pathogen is a major cause of larval death in invasive gypsy moth (Lymantria dispar) populations in the United States. Airborne conidia of this pathogen are relatively large (similar in size to pollen), with unusual characteristics, and require specialized methods for collection and quantification. Initially, dry sampling (settling of spores from the air onto a dry surface) was used to confirm the detectability of E. maimaiga at field sites with L. dispar deaths caused by E. maimaiga, using quantitative PCR (qPCR) methods. We then measured the signal degradation of conidial DNA on dry surfaces under field conditions, ultimately rejecting dry sampling as a reliable method due to rapid DNA degradation. We modified a chamber-style trap commonly used in palynology to capture settling spores in buffer. We tested this wet-trapping method in a large-scale (137-km) spore-trapping survey across gypsy moth outbreak regions in Pennsylvania undergoing epizootics in the summer of 2016. Using 4-day collection periods during the period of late instar and pupal development, we detected variable amounts of target DNA settling from the air. The amounts declined over the season and with distance from the nearest defoliated area, indicating airborne spore dispersal from outbreak areas. IMPORTANCE We report on a method for trapping and quantifying airborne spores of Entomophaga maimaiga, an important fungal pathogen affecting gypsy moth (Lymantria dispar) populations. This method can be used to track dispersal of E. maimaiga from epizootic areas and ultimately to provide critical understanding of the spatial dynamics of gypsy moth-pathogen interactions. Copyright © 2017 American Society for Microbiology.
Jabłońska-Czapla, Magdalena
2015-01-01
Chemical speciation is a very important subject in environmental protection, toxicology, and chemical analytics, because the toxicity, availability, and reactivity of trace elements depend on the chemical forms in which these elements occur. Research on low analyte levels, particularly in complex-matrix samples, requires increasingly advanced and sophisticated analytical methods and techniques. The latest trends in this field concern the so-called hyphenated techniques. Arsenic, antimony, chromium, and the underestimated thallium attract the closest attention of toxicologists and analysts. The properties of these elements depend on the oxidation state in which they occur. The aim of the following paper is to answer the question of why speciation analysis is so important. The paper also provides numerous examples of hyphenated-technique usage (e.g., the application of LC-ICP-MS in the speciation analysis of chromium, antimony, arsenic, or thallium in water and bottom sediment samples). An important issue addressed is the preparation of environmental samples for speciation analysis. PMID:25873962
Bounds on the minimum number of recombination events in a sample history.
Myers, Simon R; Griffiths, Robert C
2003-01-01
Recombination is an important evolutionary factor in many organisms, including humans, and understanding its effects is an important task facing geneticists. Detecting past recombination events is thus important; this article introduces statistics that give a lower bound on the number of recombination events in the history of a sample, on the basis of the patterns of variation in the sample DNA. Such lower bounds are appropriate, since many recombination events in the history are typically undetectable, so the true number of historical recombinations is unobtainable. The statistics can be calculated quickly by computer and improve upon the earlier bound of Hudson and Kaplan (1985). A method is developed to combine bounds on local regions in the data to produce more powerful improved bounds. The method is flexible to different models of recombination occurrence. The approach gives recombination event bounds between all pairs of sites, to help identify regions with more detectable recombinations, and these bounds can be viewed graphically. Under coalescent simulations, there is a substantial improvement over the earlier method (of up to a factor of 2) in the expected number of recombination events detected by one of the new minima, across a wide range of parameter values. The method is applied to data from a region within the lipoprotein lipase gene and the amount of detected recombination is substantially increased. Further, there is strong clustering of detected recombination events in an area near the center of the region. A program implementing these statistics, which was used for this article, is available from http://www.stats.ox.ac.uk/mathgen/programs.html. PMID:12586723
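The earlier Hudson-Kaplan bound that these statistics improve upon can be computed with the four-gamete test; a compact sketch for binary haplotype data follows (the article's sharper combined bounds are not reproduced here).

```python
def hudson_kaplan_rm(haplotypes):
    """Hudson-Kaplan lower bound R_m on recombination events.

    haplotypes: list of equal-length strings over {'0', '1'}.
    A site pair exhibiting all four gametes implies at least one
    recombination between the sites; R_m counts a maximal set of
    disjoint such (open) intervals, found greedily by right endpoint.
    """
    n_sites = len(haplotypes[0])
    intervals = []
    for i in range(n_sites):
        for j in range(i + 1, n_sites):
            gametes = {(h[i], h[j]) for h in haplotypes}
            if len(gametes) == 4:          # four-gamete test fails
                intervals.append((i, j))
    intervals.sort(key=lambda iv: iv[1])
    rm, last_end = 0, -1
    for start, end in intervals:
        if start >= last_end:              # disjoint open intervals
            rm += 1
            last_end = end
    return rm

# hudson_kaplan_rm(["0011", "0101", "1001", "1111"])  ->  2
```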
Linkage-specific sialic acid derivatization for MALDI-TOF-MS profiling of IgG glycopeptides.
de Haan, Noortje; Reiding, Karli R; Haberger, Markus; Reusch, Dietmar; Falck, David; Wuhrer, Manfred
2015-08-18
Glycosylation is a common co- and post-translational protein modification with a large influence on protein properties such as conformation and solubility. Furthermore, glycosylation is an important determinant of the efficacy and clearance of biopharmaceuticals such as immunoglobulin G (IgG). Matrix-assisted laser desorption/ionization (MALDI)-time-of-flight (TOF)-mass spectrometry (MS) shows potential for site-specific glycosylation analysis of IgG at the glycopeptide level. With this approach, however, important information about glycopeptide sialylation is not duly covered because of in-source and metastable decay of the sialylated species. Here, we present a highly repeatable sialic acid derivatization method that allows subclass-specific MALDI-TOF-MS analysis of tryptic IgG glycopeptides. The method, employing dimethylamidation with the carboxylic acid activator 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide (EDC) and the catalyst 1-hydroxybenzotriazole (HOBt), results in different masses for the functionally divergent α2,3- and α2,6-linked sialic acids: their respective lactonization and dimethylamidation lead to direct discrimination in MS. Importantly, both the glycan and peptide moieties reacted in a controlled manner. In addition, stabilization allowed the acquisition of fragmentation spectra informative with respect to glycosylation and peptide sequence, in contrast to fragmentation spectra of underivatized samples, which were dominated by sialic acid loss. The method allowed facile discrimination and relative quantitation of IgG Fc sialylation in therapeutic IgG samples. The method has considerable potential for future site- and sialic acid linkage-specific glycosylation profiling of therapeutic antibodies, as well as for subclass-specific biomarker discovery in clinical IgG samples derived from plasma.
Quantification of underivatised amino acids on dry blood spot, plasma, and urine by HPLC-ESI-MS/MS.
Giordano, Giuseppe; Di Gangi, Iole Maria; Gucciardi, Antonina; Naturale, Mauro
2012-01-01
Enzyme deficiencies in amino acid (AA) metabolism that affect the levels of amino acids and their derivatives in physiological fluids may serve as diagnostically significant biomarkers for one or a group of metabolic disorders. It is therefore important to monitor a wide range of free amino acids simultaneously and to quantify them. This is time-consuming with classical methods, and more so now that many laboratories have introduced newborn screening programs, in which the semiquantitative analysis, detection, and quantification of some amino acids must be performed in a short time to reduce the rate of false positives. We have modified the stable isotope dilution HPLC-electrospray ionization (ESI)-MS/MS method previously described by Qu et al. (Anal Chem 74: 2034-2040, 2002) for more rapid, robust, sensitive, and specific detection and quantification of underivatized amino acids. The modified method reduces the time of analysis to 10 min, with very good reproducibility of retention times and better separation of the metabolites and their isomers. The omission of the derivatization step provides some important advantages: fast and simple sample preparation and the exclusion of artefacts and interferences. The technique is highly sensitive and specific, and it allows monitoring of 40 underivatized amino acids, including the key isomers, and quantification of some of them, covering many diagnostically important intermediates of metabolic pathways. We propose this HPLC-ESI-MS/MS method for underivatized amino acids as a support for newborn screening, as a secondary test using the same dried blood spots for a more accurate and specific examination in cases of suspected metabolic disease. In this way, plasma collection from the patient is avoided, reducing anxiety for the parents and further costs of analysis. The same method was validated and applied also to plasma and urine samples with good reproducibility, accuracy, and precision. The fast run time, feasibility of high sample throughput, and small amount of sample required make this method very suitable for routine analysis in the clinical setting.
de Irala, Jokin; Lopez del Burgo, Cristina; Lopez de Fez, Carmen M; Arredondo, Jorge; Mikolajczyk, Rafael T; Stanford, Joseph B
2007-06-27
Informed consent in family planning includes knowledge of mechanism of action. Some methods of family planning occasionally work after fertilization. Knowing about postfertilization effects may be important to some women before choosing a certain family planning method. The objective of this survey is to explore women's attitudes towards postfertilization effects of family planning methods, and beliefs and characteristics possibly associated with those attitudes. Cross-sectional survey in a sample of 755 potentially fertile women, aged 18-49, from Primary Care Health Centres in Pamplona, Spain. Participants were given a 30-item, self-administered, anonymous questionnaire about family planning methods and medical and surgical abortion. Logistic regression was used to identify variables associated with women's attitudes towards postfertilization effects. The response rate was 80%. The majority of women were married, held an academic degree and had no children. Forty percent of women would not consider using a method that may work after fertilization but before implantation and 57% would not consider using one that may work after implantation. While 35.3% of the sample would stop using a method if they learned that it sometimes works after fertilization, this percentage increased to 56.3% when referring to a method that sometimes works after implantation. Women who believe that human life begins at fertilization and those who consider it is important to distinguish between natural and induced embryo loss were less likely to consider the use of a method with postfertilization effects. Information about potential postfertilization effects of family planning methods may influence women's acceptance and choice of a particular family planning method. Additional studies in other populations are necessary to evaluate whether these beliefs are important to those populations.
SU-G-IeP1-13: Sub-Nyquist Dynamic MRI Via Prior Rank, Intensity and Sparsity Model (PRISM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiang, B; Gao, H
Purpose: Accelerated dynamic MRI is important for MRI-guided radiotherapy. Inspired by compressive sensing (CS), sub-Nyquist dynamic MRI has been an active research area, i.e., sparse sampling in k-t space for accelerated dynamic MRI. This work investigates sub-Nyquist dynamic MRI via a previously developed CS model, namely the Prior Rank, Intensity and Sparsity Model (PRISM). Methods: The proposed method utilizes PRISM with rank minimization and incoherent sampling patterns for sub-Nyquist reconstruction. In PRISM, the low-rank background image, which is automatically calculated by rank minimization, is excluded from the L1 minimization step of the CS reconstruction to further sparsify the residual image, thus allowing for higher acceleration rates. Furthermore, the sampling pattern in k-t space is made more incoherent by sampling a different set of k-space points at different temporal frames. Results: Reconstruction results from the L1-sparsity method and the PRISM method with 30% and 15% undersampled data are compared to demonstrate the power of PRISM for dynamic MRI. Conclusion: A sub-Nyquist MRI reconstruction method based on PRISM is developed, with improved image quality over the L1-sparsity method.
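The low-rank-plus-sparse split at the heart of PRISM can be illustrated in image space, ignoring the k-space sampling operator, with a generic alternating-shrinkage scheme; this is an RPCA-style sketch with illustrative parameter defaults, not the published k-t reconstruction algorithm.

```python
import numpy as np

def low_rank_plus_sparse(M, lam=None, mu=None, n_iter=200):
    """Split a space-time matrix M (pixels x frames) into a low-rank
    background L plus a sparse residual S by coordinate descent on
    0.5*||M-L-S||_F^2 + mu*||L||_* + lam*mu*||S||_1."""
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    mu = mu if mu is not None else 0.25 * np.abs(M).mean()
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    for _ in range(n_iter):
        # low-rank update: singular-value thresholding of M - S
        U, s, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U * np.maximum(s - mu, 0.0)) @ Vt
        # sparse update: entrywise soft-thresholding of M - L
        R = M - L
        S = np.sign(R) * np.maximum(np.abs(R) - lam * mu, 0.0)
    return L, S
```

In the dynamic-MRI picture, L plays the role of the automatically extracted static background and S the sparsified residual that carries the motion.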
Inference from Samples of DNA Sequences Using a Two-Locus Model
Griffiths, Robert C.
2011-01-01
Performing inference on contemporary samples of DNA sequence data is an important and challenging task. Computationally intensive methods such as importance sampling (IS) are attractive because they make full use of the available data, but in the presence of recombination the large state space of genealogies can be prohibitive. In this article, we make progress by developing an efficient IS proposal distribution for a two-locus model of sequence data. We show that the proposal developed here leads to much greater efficiency, outperforming existing IS methods that could be adapted to this model. Among several possible applications, the algorithm can be used to find maximum likelihood estimates for mutation and crossover rates, and to perform ancestral inference. We illustrate the method on previously reported sequence data covering two loci either side of the well-studied TAP2 recombination hotspot. The two loci are themselves largely non-recombining, so we obtain a gene tree at each locus and are able to infer in detail the effect of the hotspot on their joint ancestry. We summarize this joint ancestry by introducing the gene graph, a summary of the well-known ancestral recombination graph. PMID:21210733
Raulf, M; Buters, J; Chapman, M; Cecchi, L; de Blay, F; Doekes, G; Eduard, W; Heederik, D; Jeebhay, M F; Kespohl, S; Krop, E; Moscato, G; Pala, G; Quirce, S; Sander, I; Schlünssen, V; Sigsgaard, T; Walusiak-Skorupa, J; Wiszniewska, M; Wouters, I M; Annesi-Maesano, I
2014-10-01
Exposure to high molecular weight sensitizers of biological origin is an important risk factor for the development of asthma and rhinitis. Most of the causal allergens have been defined based on their reactivity with IgE antibodies, and in many cases, the molecular structure and function of the allergens have been established. Significant information on allergen levels that cause sensitization and allergic symptoms for several major environmental and occupational allergens has been reported. Monitoring of high molecular weight allergens and allergen carrier particles is an important part of the management of allergic respiratory diseases and requires standardized allergen assessment methods for occupational and environmental (indoor and outdoor) allergen exposure. The aim of this EAACI task force was to review the essential points for monitoring environmental and occupational allergen exposure including sampling strategies and methods, processing of dust samples, allergen analysis, and quantification. The paper includes a summary of different methods for sampling and allergen quantification, as well as their pros and cons for various exposure settings. Recommendations are being made for different exposure scenarios. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Krause, Jane; Van Lieshout, Jan; Klomp, Rien; Huntink, Elke; Aakhus, Eivind; Flottorp, Signe; Jaeger, Cornelia; Steinhaeuser, Jost; Godycki-Cwirko, Maciek; Kowalczyk, Anna; Agarwal, Shona; Wensing, Michel; Baker, Richard
2014-08-12
The tailoring of implementation interventions includes the identification of the determinants of, or barriers to, healthcare practice. Different methods for identifying determinants have been used in implementation projects, but which methods are most appropriate to use is unknown. The study was undertaken in five European countries, recommendations for a different chronic condition being addressed in each country: Germany (polypharmacy in multimorbid patients); the Netherlands (cardiovascular risk management); Norway (depression in the elderly); Poland (chronic obstructive pulmonary disease--COPD); and the United Kingdom (UK) (obesity). Using samples of professionals and patients in each country, three methods were compared directly: brainstorming amongst health professionals, interviews of health professionals, and interviews of patients. The additional value of discussion structured through reference to a checklist of determinants in addition to brainstorming, and determinants identified by open questions in a questionnaire survey, were investigated separately. The questionnaire, which included closed questions derived from a checklist of determinants, was administered to samples of health professionals in each country. Determinants were classified according to whether it was likely that they would inform the design of an implementation intervention (defined as plausibly important determinants). A total of 601 determinants judged to be plausibly important were identified. An additional 609 determinants were judged to be unlikely to inform an implementation intervention, and were classified as not plausibly important. Brainstorming identified 194 of the plausibly important determinants, health professional interviews 152, patient interviews 63, and open questions 48. Structured group discussion identified 144 plausibly important determinants in addition to those already identified by brainstorming. Systematic methods can lead to the identification of large numbers of determinants. Tailoring will usually include a process to decide, from all the determinants that are identified, those to be addressed by implementation interventions. There is no best buy of methods to identify determinants, and a combination should be used, depending on the topic and setting. Brainstorming is a simple, low cost method that could be relevant to many tailored implementation projects.
Kaus, Joseph W; Harder, Edward; Lin, Teng; Abel, Robert; McCammon, J Andrew; Wang, Lingle
2015-06-09
Recent advances in improved force fields and sampling methods have made it possible for the accurate calculation of protein–ligand binding free energies. Alchemical free energy perturbation (FEP) using an explicit solvent model is one of the most rigorous methods to calculate relative binding free energies. However, for cases where there are high energy barriers separating the relevant conformations that are important for ligand binding, the calculated free energy may depend on the initial conformation used in the simulation due to the lack of complete sampling of all the important regions in phase space. This is particularly true for ligands with multiple possible binding modes separated by high energy barriers, making it difficult to sample all relevant binding modes even with modern enhanced sampling methods. In this paper, we apply a previously developed method that provides a corrected binding free energy for ligands with multiple binding modes by combining the free energy results from multiple alchemical FEP calculations starting from all enumerated poses, and the results are compared with Glide docking and MM-GBSA calculations. From these calculations, the dominant ligand binding mode can also be predicted. We apply this method to a series of ligands that bind to c-Jun N-terminal kinase-1 (JNK1) and obtain improved free energy results. The dominant ligand binding modes predicted by this method agree with the available crystallography, while both Glide docking and MM-GBSA calculations incorrectly predict the binding modes for some ligands. The method also helps separate the force field error from the ligand sampling error, such that deviations in the predicted binding free energy from the experimental values likely indicate possible inaccuracies in the force field. An error in the force field for a subset of the ligands studied was identified using this method, and improved free energy results were obtained by correcting the partial charges assigned to the ligands. This improved the root-mean-square error (RMSE) for the predicted binding free energy from 1.9 kcal/mol with the original partial charges to 1.3 kcal/mol with the corrected partial charges.
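When per-mode free energies are available, the standard rule for combining mutually exclusive binding modes is a Boltzmann weighting, ΔG = -kT ln Σᵢ exp(-ΔGᵢ/kT); a short sketch follows, with the temperature as an assumed parameter.

```python
import numpy as np

def combine_binding_modes(dG_modes_kcal, T=300.0):
    """Boltzmann-combine per-mode binding free energies (kcal/mol) and
    return the overall free energy plus each mode's population."""
    kT = 0.0019872041 * T                 # Boltzmann constant in kcal/(mol*K)
    dG = np.asarray(dG_modes_kcal, dtype=float)
    w = np.exp(-(dG - dG.min()) / kT)     # shift by the minimum for stability
    dG_total = dG.min() - kT * np.log(w.sum())
    return dG_total, w / w.sum()

# Two modes at -8.0 and -6.5 kcal/mol at 300 K:
# combine_binding_modes([-8.0, -6.5]) -> (~-8.05, populations ~[0.93, 0.07])
```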
Shutdown Dose Rate Analysis Using the Multi-Step CADIS Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Ahmad M.; Peplow, Douglas E.; Peterson, Joshua L.
2015-01-01
The Multi-Step Consistent Adjoint Driven Importance Sampling (MS-CADIS) hybrid Monte Carlo (MC)/deterministic radiation transport method was proposed to speed up the shutdown dose rate (SDDR) neutron MC calculation using an importance function that represents the neutron importance to the final SDDR. This work applied the MS-CADIS method to the ITER SDDR benchmark problem. The MS-CADIS method was also used to calculate the SDDR uncertainty resulting from uncertainties in the MC neutron calculation and to determine the degree of undersampling in SDDR calculations arising from the limited ability of the MC method to tally detailed spatial and energy distributions. The analysis based on the ITER benchmark problem compared the efficiency of the MS-CADIS method to the traditional approach of using global MC variance reduction techniques for speeding up the SDDR neutron MC calculation. Compared to the standard Forward-Weighted-CADIS (FW-CADIS) method, the MS-CADIS method increased the efficiency of the SDDR neutron MC calculation by 69%. The MS-CADIS method also increased the fraction of nonzero scoring mesh tally elements in the space-energy regions of high importance to the final SDDR.
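The principle that CADIS-type methods automate — sampling from a distribution biased toward the regions that matter for the final answer, then reweighting — reduces to ordinary importance sampling; a generic sketch, not the MS-CADIS transport implementation, is shown below. All densities in the example are illustrative.

```python
import numpy as np

def importance_sampled_mean(f, target_pdf, proposal_pdf, sample_proposal,
                            n=100_000, rng=np.random.default_rng(0)):
    """Estimate E_target[f(X)] by drawing from a proposal biased toward
    the high-importance region and reweighting by target/proposal."""
    x = sample_proposal(n, rng)
    w = target_pdf(x) / proposal_pdf(x)
    return np.mean(w * f(x))

# A deep-penetration-style integrand: most of the contribution comes
# from the rare tail of the target distribution.
est = importance_sampled_mean(
    f=lambda x: np.exp(-0.5 * (10.0 - x) ** 2),     # scores near x = 10
    target_pdf=lambda x: np.exp(-x),                # Exp(1) target, x >= 0
    proposal_pdf=lambda x: 0.1 * np.exp(-0.1 * x),  # heavier-tailed Exp(0.1)
    sample_proposal=lambda n, rng: rng.exponential(10.0, n),
)
```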
Xing, Han-Zhu; Wang, Xia; Chen, Xiang-Feng; Wang, Ming-Lin; Zhao, Ru-Song
2015-05-01
A method combining accelerated solvent extraction with dispersive liquid-liquid microextraction was developed for the first time as a sample pretreatment for the rapid analysis of phenols (including phenol, m-cresol, 2,4-dichlorophenol, and 2,4,6-trichlorophenol) in soil samples. In the accelerated solvent extraction procedure, water was used as an extraction solvent, and phenols were extracted from soil samples into water. The dispersive liquid-liquid microextraction technique was then performed on the obtained aqueous solution. Important accelerated solvent extraction and dispersive liquid-liquid microextraction parameters were investigated and optimized. Under optimized conditions, the new method provided wide linearity (6.1-3080 ng/g), low limits of detection (0.06-1.83 ng/g), and excellent reproducibility (<10%) for phenols. Four real soil samples were analyzed by the proposed method to assess its applicability. Experimental results showed that the soil samples were free of our target compounds, and average recoveries were in the range of 87.9-110%. These findings indicate that accelerated solvent extraction with dispersive liquid-liquid microextraction as a sample pretreatment procedure coupled with gas chromatography and mass spectrometry is an excellent method for the rapid analysis of trace levels of phenols in environmental soil samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Pries, V. V.; Proskuriakov, N. E.
2018-04-01
To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, because of possible failures in the devices and systems of an automatic rotor line, there is always a real probability that defective (incomplete) products enter the output process stream. Therefore, continuous sampling control of product completeness, based on statistical methods, remains an important element in managing the quality of assembly of multi-element mass products on automatic rotor lines. A distinctive feature of continuous sampling control of multi-element product completeness during assembly is its destructive character, which rules out returning component parts to the process stream after control and thus reduces the actual productivity of the assembly equipment. The use of statistical procedures for continuous sampling control of multi-element product completeness during assembly on automatic rotor lines therefore requires sampling plans that ensure a minimum control sample size. Comparison of the limit values of the average outgoing defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 plan provides lower limit values of the average outgoing defect level. The average sample size under the ACSP-1 plan is also smaller than under the CSP-1 plan. Thus, applying statistical methods to the assembly quality management of multi-element products on automatic rotor lines, using the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and to ensure the required quality level of assembled products while minimizing sample size.
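For context, the classical CSP-1 referenced above is Dodge's continuous sampling plan: inspect 100% of units until i consecutive conforming units are found, then inspect a random fraction f, reverting to 100% inspection when a defect is detected. Below is a minimal simulation sketch of CSP-1 under these rules; the parameter values are illustrative, and the authors' ACSP-1 modification is not implemented here.

```python
import random

def simulate_csp1(p, i, f, n_units=1_000_000, seed=1):
    """Simulate Dodge's CSP-1 plan; defectives found on inspection are
    removed from the stream. Returns (average outgoing defect level,
    fraction of units inspected)."""
    rng = random.Random(seed)
    full, run = True, 0          # inspection mode, conforming-run length
    inspected = shipped = shipped_defects = 0
    for _ in range(n_units):
        defective = rng.random() < p
        if full or rng.random() < f:
            inspected += 1
            if defective:
                full, run = True, 0
                continue         # defective unit pulled out, not shipped
            if full:
                run += 1
                if run >= i:
                    full = False # switch to fractional inspection
        shipped += 1
        shipped_defects += defective
    return shipped_defects / shipped, inspected / n_units

aoq, effort = simulate_csp1(p=0.02, i=50, f=0.1)
print(f"average outgoing defect level ~{aoq:.4f}, inspection effort ~{effort:.2f}")
```

Sweeping p over a grid and taking the maximum of the outgoing defect level reproduces the limit value (AOQL) that the abstract compares between plans.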
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
Torres, Daiane Placido; Martins-Teixeira, Maristela Braga; Cadore, Solange; Queiroz, Helena Müller
2015-01-01
A method for the determination of total mercury in fresh fish and shrimp samples by solid sampling thermal decomposition/amalgamation atomic absorption spectrometry (TDA AAS) has been validated following international foodstuff protocols in order to fulfill the Brazilian National Residue Control Plan. The experimental parameters have been previously studied and optimized according to specific legislation on validation and inorganic contaminants in foodstuff. Linearity, sensitivity, specificity, detection and quantification limits, precision (repeatability and within-laboratory reproducibility), robustness as well as accuracy of the method have been evaluated. Linearity of response was satisfactory for the two range concentrations available on the TDA AAS equipment, between approximately 25.0 and 200.0 μg kg(-1) (square regression) and 250.0 and 2000.0 μg kg(-1) (linear regression) of mercury. The residues for both ranges were homoscedastic and independent, with normal distribution. Correlation coefficients obtained for these ranges were higher than 0.995. Limits of quantification (LOQ) and of detection of the method (LDM), based on signal standard deviation (SD) for a low-in-mercury sample, were 3.0 and 1.0 μg kg(-1), respectively. Repeatability of the method was better than 4%. Within-laboratory reproducibility achieved a relative SD better than 6%. Robustness of the current method was evaluated and pointed sample mass as a significant factor. Accuracy (assessed as the analyte recovery) was calculated on basis of the repeatability, and ranged from 89% to 99%. The obtained results showed the suitability of the present method for direct mercury measurement in fresh fish and shrimp samples and the importance of monitoring the analysis conditions for food control purposes. Additionally, the competence of this method was recognized by accreditation under the standard ISO/IEC 17025.
Honest Importance Sampling with Multiple Markov Chains
Tan, Aixin; Doss, Hani; Hobert, James P.
2017-01-01
Importance sampling is a classical Monte Carlo technique in which a random sample from one probability density, π1, is used to estimate an expectation with respect to another, π. The importance sampling estimator is strongly consistent and, as long as two simple moment conditions are satisfied, it obeys a central limit theorem (CLT). Moreover, there is a simple consistent estimator for the asymptotic variance in the CLT, which makes for routine computation of standard errors. Importance sampling can also be used in the Markov chain Monte Carlo (MCMC) context. Indeed, if the random sample from π1 is replaced by a Harris ergodic Markov chain with invariant density π1, then the resulting estimator remains strongly consistent. There is a price to be paid however, as the computation of standard errors becomes more complicated. First, the two simple moment conditions that guarantee a CLT in the iid case are not enough in the MCMC context. Second, even when a CLT does hold, the asymptotic variance has a complex form and is difficult to estimate consistently. In this paper, we explain how to use regenerative simulation to overcome these problems. Actually, we consider a more general setup, where we assume that Markov chain samples from several probability densities, π1, …, πk, are available. We construct multiple-chain importance sampling estimators for which we obtain a CLT based on regeneration. We show that if the Markov chains converge to their respective target distributions at a geometric rate, then under moment conditions similar to those required in the iid case, the MCMC-based importance sampling estimator obeys a CLT. Furthermore, because the CLT is based on a regenerative process, there is a simple consistent estimator of the asymptotic variance. We illustrate the method with two applications in Bayesian sensitivity analysis. The first concerns one-way random effects models under different priors. The second involves Bayesian variable selection in linear regression, and for this application, importance sampling based on multiple chains enables an empirical Bayes approach to variable selection. PMID:28701855
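The iid importance sampling estimator and its CLT-based standard error described above fit in a few lines; the toy below uses a normal target π and a wider normal proposal π1, both chosen purely for illustration, and does not touch the paper's actual contribution (regenerative standard errors for MCMC-based, multiple-chain estimators).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(0.0, 2.0, size=n)              # draws from pi1 = N(0, 2^2)

def log_normal_pdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

# Estimate E_pi[X^2] for pi = N(0, 1) (true value 1).
w = np.exp(log_normal_pdf(x, 0, 1) - log_normal_pdf(x, 0, 2))   # pi/pi1
g = w * x**2
est = g.mean()                                # importance sampling estimator
se = g.std(ddof=1) / np.sqrt(n)               # iid CLT standard error
print(f"estimate {est:.4f} +/- {se:.4f}")
```

When the draws come from a Markov chain instead of an iid sample, this naive standard error is no longer valid, which is exactly the problem the regenerative construction in the paper addresses.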
Beknazarova, Meruyert; Millsteed, Shelby; Robertson, Gemma; Whiley, Harriet; Ross, Kirstin
2017-06-09
Strongyloides stercoralis is a gastrointestinal parasitic nematode with a life cycle that includes free-living and parasitic forms. For both clinical (diagnostic) and environmental evaluation, it is important that we can detect Strongyloides spp. in both human and non-human fecal samples. Real-time PCR is the most feasible method for detecting the parasite in both clinical and environmental samples that have been preserved. However, one of the biggest challenges with PCR detection is DNA degradation during the postage time from rural and remote areas to the laboratory. This study included a laboratory assessment and field validation of DESS (dimethyl sulfoxide, disodium EDTA, and saturated NaCl) preservation of Strongyloides spp. DNA in fecal samples. The laboratory study investigated the capacity of 1:1 and 1:3 sample-to-DESS ratios to preserve Strongyloides ratti in spiked canine feces. It was found that both ratios of DESS significantly prevented DNA degradation compared to the untreated sample. This method was then validated by applying it to field-collected canine feces and detecting Strongyloides DNA using PCR. A total of 37 canine feces samples were collected and preserved in the 1:3 (sample:DESS) ratio and, of these, 17 were positive for Strongyloides spp. The study shows that both 1:1 and 1:3 sample-to-DESS ratios were able to preserve Strongyloides spp. DNA in canine feces samples stored at room temperature for up to 56 days. This DESS preservation method presents the most applicable and feasible method for Strongyloides DNA preservation in field-collected feces.
Kraemer, D; Chen, G
2014-02-01
Accurate measurements of thermal conductivity are of great importance for materials research and development. Steady-state methods determine thermal conductivity directly from the proportionality between heat flow and an applied temperature difference (Fourier Law). Although theoretically simple, in practice, achieving high accuracies with steady-state methods is challenging and requires rather complex experimental setups due to temperature sensor uncertainties and parasitic heat loss. We developed a simple differential steady-state method in which the sample is mounted between an electric heater and a temperature-controlled heat sink. Our method calibrates for parasitic heat losses from the electric heater during the measurement by maintaining a constant heater temperature close to the environmental temperature while varying the heat sink temperature. This enables a large signal-to-noise ratio which permits accurate measurements of samples with small thermal conductance values without an additional heater calibration measurement or sophisticated heater guards to eliminate parasitic heater losses. Additionally, the differential nature of the method largely eliminates the uncertainties of the temperature sensors, permitting measurements with small temperature differences, which is advantageous for samples with high thermal conductance values and/or with strongly temperature-dependent thermal conductivities. In order to accelerate measurements of more than one sample, the proposed method allows for measuring several samples consecutively at each temperature measurement point without adding significant error. We demonstrate the method by performing thermal conductivity measurements on commercial bulk thermoelectric Bi2Te3 samples in the temperature range of 30-150 °C with an error below 3%.
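A minimal sketch of the differential evaluation: with the heater held at a fixed temperature while the sink temperature is varied, parasitic heater losses cancel in the power difference, and the conductivity follows from Fourier's law. The function name and all numbers below are hypothetical.

```python
def thermal_conductivity(q1_W, q2_W, t_sink1_K, t_sink2_K, t_heater_K,
                         length_m, area_m2):
    """k = dQ * L / (A * d(dT)); parasitic losses drop out of dQ because
    the heater temperature (and hence its leak to the environment) is fixed."""
    dq = q2_W - q1_W
    d_dT = (t_heater_K - t_sink2_K) - (t_heater_K - t_sink1_K)
    return dq * length_m / (area_m2 * d_dT)

# Illustrative numbers for a 2 mm x 2 mm x 5 mm bar (not measured data):
k = thermal_conductivity(q1_W=0.100, q2_W=0.112, t_sink1_K=300.0,
                         t_sink2_K=290.0, t_heater_K=303.0,
                         length_m=5e-3, area_m2=4e-6)
print(f"k ~ {k:.2f} W/(m K)")   # ~1.5 W/(m K), typical of Bi2Te3
```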
Performing skin microbiome research: A method to the madness
Kong, Heidi H.; Andersson, Björn; Clavel, Thomas; Common, John E.; Jackson, Scott A.; Olson, Nathan D.; Segre, Julia A.; Traidl-Hoffmann, Claudia
2017-01-01
Growing interest in microbial contributions to human health and disease has increasingly led investigators to examine the microbiome in both healthy skin and cutaneous disorders, including acne, psoriasis and atopic dermatitis. The need for common language, effective study design, and validated methods are critical for high-quality, standardized research. Features, unique to skin, pose particular challenges when conducting microbiome research. This review discusses microbiome research standards and highlights important factors to consider, including clinical study design, skin sampling, sample processing, DNA sequencing, control inclusion, and data analysis. PMID:28063650
Wolkerstorfer, Silviya V; Rosner, Sabine; Hietz, Peter
2012-10-01
The vulnerability of the xylem to cavitation is an important trait in plant drought resistance and has been quantified by several methods. We present a modified method for the simultaneous measurement of cavitations, recorded as ultrasound acoustic emissions (UAEs), and the water potential, measured with a thermocouple psychrometer, in small samples of conifer wood. Analyzing the amplitude of the individual signals showed that a first phase, during which the mean amplitude increased, was followed by a second phase with distinctly lower signal amplitudes. We provide a method to separate the two groups of signals and show that for many samples plausible vulnerability curves require rejecting late low-energy UAEs. These very likely do not result from cavitations. This method was used to analyze the differences between juvenile wood, and early and late mature wood in Picea abies (L.) Karst. Juvenile earlywood was more resistant to cavitation than mature earlywood or latewood, which we relate to the tracheid anatomy of the samples. Copyright © Physiologia Plantarum 2012.
Data for factor analysis of hydro-geochemical characteristics of groundwater resources in Iranshahr.
Biglari, Hamed; Saeidi, Mehdi; Karimyan, Kamaleddin; Narooie, Mohammad Reza; Sharafi, Hooshmand
2018-08-01
Detection of hydrogeological and hydro-geochemical changes affecting the quality of aquifer water is very important. The aim of this study was to perform a factor analysis of the hydro-geochemical characteristics of Iranshahr underground water resources during the warm and cool seasons. In this study, 248 samples (two sampling rounds) of groundwater resources were collected by a cluster-random sampling method during 2017 in the villages of Iranshahr city. After transferring the samples to the laboratory, the concentrations of 13 important chemical parameters were determined according to standard methods for water and wastewater. The results indicated that, in the transition from warm to cold seasons, 45.45% of the correlations between parameters decreased significantly and 55.55% increased significantly. According to the factor analysis method, three factors were identified as influencing the chemical composition of these resources: hydro-geochemical processes of the land, recharge of the resources by surface water and sewage, and human activities. The highest growth in correlation, 0.37, was observed between phosphate and nitrate ions, while the lowest trend, -0.33, was seen between fluoride ion and calcium as well as chloride ions. A significant increase in the correlation between magnesium and nitrate ions from warm to cold seasons also indicates a strong seasonal effect on the relation between these two parameters.
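As an illustration of the factor-analysis step, the sketch below fits a three-factor model with scikit-learn. The 248 x 13 matrix here is random stand-in data, not the study's measurements; in practice each column would be one measured parameter (pH, EC, major ions, nitrate, phosphate, fluoride, and so on).

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
X = rng.normal(size=(248, 13))        # stand-in for samples x parameters

fa = FactorAnalysis(n_components=3, random_state=0)
scores = fa.fit_transform(X)          # per-sample factor scores
loadings = fa.components_             # 3 x 13 loading matrix

# Parameters that load strongly on the same factor are interpreted as
# sharing a common source (e.g. geogenic processes vs. surface/sewage input).
print(np.round(loadings, 2))
```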
Maraschin, Marcelo; Somensi-Zeggio, Amélia; Oliveira, Simone K; Kuhnen, Shirley; Tomazzoli, Maíra M; Raguzzoni, Josiane C; Zeri, Ana C M; Carreira, Rafael; Correia, Sara; Costa, Christopher; Rocha, Miguel
2016-01-22
The chemical composition of propolis is affected by environmental factors and harvest season, making it difficult to standardize its extracts for medicinal usage. By detecting a typical chemical profile associated with propolis from a specific production region or season, certain types of propolis may be used to obtain a specific pharmacological activity. In this study, propolis from three agroecological regions (plain, plateau, and highlands) from southern Brazil, collected over the four seasons of 2010, were investigated through a novel NMR-based metabolomics data analysis workflow. Chemometrics and machine learning algorithms (PLS-DA and RF), including methods to estimate variable importance in classification, were used in this study. The machine learning and feature selection methods permitted construction of models for propolis sample classification with high accuracy (>75%, reaching ∼90% in the best case), better discriminating samples regarding their collection seasons comparatively to the harvest regions. PLS-DA and RF allowed the identification of biomarkers for sample discrimination, expanding the set of discriminating features and adding relevant information for the identification of the class-determining metabolites. The NMR-based metabolomics analytical platform, coupled to bioinformatic tools, allowed characterization and classification of Brazilian propolis samples regarding the metabolite signature of important compounds, i.e., chemical fingerprint, harvest seasons, and production regions.
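A schematic of the random-forest part of such a workflow, with random stand-in data in place of binned NMR spectra; the feature importances play the role of candidate discriminating signals. This shows only the general pattern, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 200))        # 60 samples x 200 NMR buckets (fake)
y = rng.integers(0, 4, size=60)       # harvest-season labels 0..3 (fake)

rf = RandomForestClassifier(n_estimators=500, random_state=0)
rf.fit(X, y)

# Buckets with the largest importances point at the chemical-shift regions,
# and hence metabolites, that drive the classification.
top = np.argsort(rf.feature_importances_)[::-1][:10]
print(top)
```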
de Andrade, Rafael Barreto; Barlow, Jos; Louzada, Julio; Vaz-de-Mello, Fernando Zagury; Souza, Mateus; Silveira, Juliana M.; Cochrane, Mark A.
2011-01-01
Understanding how biodiversity responds to environmental changes is essential to provide the evidence-base that underpins conservation initiatives. The present study provides a standardized comparison between unbaited flight intercept traps (FIT) and baited pitfall traps (BPT) for sampling dung beetles. We examine the effectiveness of the two to assess fire disturbance effects and how trap performance is affected by seasonality. The study was carried out in a transitional forest between Cerrado (Brazilian Savanna) and Amazon Forest. Dung beetles were collected during one wet and one dry sampling season. The two methods sampled different portions of the local beetle assemblage. Both FIT and BPT were sensitive to fire disturbance during the wet season, but only BPT detected community differences during the dry season. Both traps showed similar correlation with environmental factors. Our results indicate that seasonality had a stronger effect than trap type, with BPT more effective and robust under low population numbers, and FIT more sensitive to fine scale heterogeneity patterns. This study shows the strengths and weaknesses of two commonly used methodologies for sampling dung beetles in tropical forests, as well as highlighting the importance of seasonality in shaping the results obtained by both sampling strategies. PMID:22028831
[Wound microbial sampling methods in surgical practice, imprint techniques].
Chovanec, Z; Veverková, L; Votava, M; Svoboda, J; Peštál, A; Doležel, J; Jedlička, V; Veselý, M; Wechsler, J; Čapov, I
2012-12-01
A wound is damage to tissue. The process of healing is influenced by many systemic and local factors. The most crucial and the most discussed local factor in wound healing is infection. Surgical site infection in the wound is caused by micro-organisms. This has been known for many years; however, the conditions leading to the occurrence of infection have not yet been sufficiently described. Correct sampling technique, correct storage, transportation, evaluation, and valid interpretation of these data are very important in clinical practice. There are many methods for microbiological sampling, but the best one has not yet been identified and validated. We aim to discuss the problem with a focus on the imprint technique.
Probabilistic methods for rotordynamics analysis
NASA Technical Reports Server (NTRS)
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
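As a toy version of the eigenvalue criterion, the sketch below estimates the probability of instability of a one-degree-of-freedom system m x'' + c x' + k x = 0 with uncertain damping and stiffness, using plain Monte Carlo; the paper's fast probability integration and adaptive importance sampling are not reproduced, and the parameter distributions are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def unstable(m, c, k):
    # State matrix of m*x'' + c*x' + k*x = 0; the system is unstable if
    # any eigenvalue has a positive real part.
    A = np.array([[0.0, 1.0], [-k / m, -c / m]])
    return np.linalg.eigvals(A).real.max() > 0.0

n = 50_000
c = rng.normal(0.05, 0.03, size=n)    # uncertain damping (can go negative)
k = rng.normal(1.00, 0.10, size=n)    # uncertain stiffness
p = np.mean([unstable(1.0, ci, ki) for ci, ki in zip(c, k)])
print(f"P(instability) ~ {p:.4f}")
```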
Estimating TCP Packet Loss Ratio from Sampled ACK Packets
NASA Astrophysics Data System (ADS)
Yamasaki, Yasuhiro; Shimonishi, Hideyuki; Murase, Tutomu
The advent of various quality-sensitive applications has greatly changed the requirements for IP network management and made the monitoring of individual traffic flows more important. Since the processing costs of per-flow quality monitoring are high, especially in high-speed backbone links, packet sampling techniques have been attracting considerable attention. Existing sampling techniques, such as those used in Sampled NetFlow and sFlow, however, focus on the monitoring of traffic volume, and there has been little discussion of the monitoring of such quality indexes as packet loss ratio. In this paper we propose a method for estimating, from sampled packets, packet loss ratios in individual TCP sessions. It detects packet loss events by monitoring duplicate ACK events raised by each TCP receiver. Because sampling reveals only a portion of the actual packet loss, the actual packet loss ratio is estimated statistically. Simulation results show that the proposed method can estimate the TCP packet loss ratio accurately from a 10% sampling of packets.
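A minimal sketch of the scaling idea, under two simplifying assumptions of mine: every loss event produces a burst of d duplicate ACKs, and ACKs are sampled independently with probability p. This illustrates the statistical correction only; it is not the paper's exact estimator.

```python
def estimate_loss_ratio(observed_loss_events, sampled_acks, p, d=3):
    # A dup-ACK burst of length d is noticed in the sampled stream with
    # probability 1 - (1-p)^d, so observed events are scaled up by that,
    # and the sampled ACK volume is scaled up by 1/p.
    detect_prob = 1.0 - (1.0 - p) ** d
    est_losses = observed_loss_events / detect_prob
    est_total_packets = sampled_acks / p
    return est_losses / est_total_packets

# Hypothetical counts from a 10% sampled trace of one TCP session:
print(estimate_loss_ratio(observed_loss_events=12, sampled_acks=5000, p=0.10))
```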
Vargas, S L; Ponce, C; Bustamante, R; Calderón, E; Nevez, G; De Armas, Y; Matos, O; Miller, R F; Gallo, M J
2017-10-01
To understand the epidemiological significance of Pneumocystis detection in a lung tissue sample of non-immunosuppressed individuals, we examined sampling procedures, laboratory methodology, and patient characteristics of autopsy series reported in the literature. Number of tissue specimens, DNA-extraction procedures, age and underlying diagnosis highly influence yield and are critical to understand yield differences of Pneumocystis among reports of pulmonary colonization in immunocompetent individuals.
A Stepwise Test Characteristic Curve Method to Detect Item Parameter Drift
ERIC Educational Resources Information Center
Guo, Rui; Zheng, Yi; Chang, Hua-Hua
2015-01-01
An important assumption of item response theory is item parameter invariance. Sometimes, however, item parameters are not invariant across different test administrations due to factors other than sampling error; this phenomenon is termed item parameter drift. Several methods have been developed to detect drifted items. However, most of the…
USDA-ARS?s Scientific Manuscript database
Hybridization is currently considered to be a frequent and important force in plant evolution; NGS methods offer new possibilities for clade resolution and ambitious sampling of gene genealogies, yet difficulty remains in detecting deep reticulation events using currently available methods. We exami...
Improving Sampling and Response Rates in Children's Health Research through Participatory Methods
ERIC Educational Resources Information Center
Claudio, Luz; Stingone, Jeanette A.
2008-01-01
Background: Children's health is an important indicator of community health because children are especially vulnerable to disease. The school setting is ideal for assessing these vulnerabilities and prevalence of disease, yet the methods that produce high participation among students and their families are not usually described or evaluated. This…
Protocol Improvements for Low Concentration DNA-Based Bioaerosol Sampling and Analysis
Ng, Chun Kiat; Miller, Dana; Cao, Bin
2015-01-01
Introduction As bioaerosol research attracts increasing attention, there is a need for additional efforts that focus on method development to deal with different environmental samples. Bioaerosol environmental samples typically have very low biomass concentrations in the air, which often leaves researchers with limited options in choosing the downstream analysis steps, especially when culture-independent methods are intended. Objectives This study investigates the impacts of three important factors that can influence the performance of culture-independent DNA-based analysis in dealing with bioaerosol environmental samples engaged in this study. The factors are: 1) enhanced high temperature sonication during DNA extraction; 2) effect of sampling duration on DNA recoverability; and 3) an alternative method for concentrating composite samples. In this study, DNA extracted from samples was analysed using the Qubit fluorometer (for direct total DNA measurement) and quantitative polymerase chain reaction (qPCR). Results and Findings The findings suggest that additional lysis from high temperature sonication is crucial: DNA yields from both high and low biomass samples increased up to 600% when the protocol included 30-min sonication at 65°C. Long air sampling duration on a filter media was shown to have a negative impact on DNA recoverability with up to 98% of DNA lost over a 20-h sampling period. Pooling DNA from separate samples during extraction was proven to be feasible with margins of error below 30%. PMID:26619279
García-Blanco, Ana; Peña-Bautista, Carmen; Oger, Camille; Vigor, Claire; Galano, Jean-Marie; Durand, Thierry; Martín-Ibáñez, Nuria; Baquero, Miguel; Vento, Máximo; Cháfer-Pericás, Consuelo
2018-07-01
Lipid peroxidation plays an important role in Alzheimer Disease, so corresponding metabolites found in urine samples could be potential biomarkers. The aim of this work is to develop a reliable ultra-performance liquid chromatography-tandem mass spectrometry analytical method to determine a new set of lipid peroxidation compounds in urine samples. Excellent sensitivity was achieved with limits of detection between 0.08 and 17 nmol L(-1), which renders this method suitable to monitor analyte concentrations in real samples. The method's precision was satisfactory with coefficients of variation around 5-17% (intra-day) and 8-19% (inter-day). The accuracy of the method was assessed by analysis of spiked urine samples, obtaining recoveries between 70% and 120% for most of the analytes. The utility of the described method was tested by analyzing urine samples from patients early diagnosed with mild cognitive impairment or mild dementia Alzheimer Disease following the clinical standard criteria. As preliminary results, some analytes (17(RS)-10-epi-SC-Δ15-11-dihomo-IsoF, PGE2) and total parameters (Neuroprostanes, Isoprostanes, Isofurans) show differences between the control and the clinical groups. So, these analytes could be potential early Alzheimer Disease biomarkers assessing the patients' pro-oxidant condition. Copyright © 2018 Elsevier B.V. All rights reserved.
Do, Van-Khoai; Yamamoto, Masahiko; Taguchi, Shigeo; Takamura, Yuzuru; Surugaya, Naoki; Kuno, Takehiko
2018-06-01
A sensitive analytical method for determination of total cesium (Cs) in highly active liquid waste (HALW) by using modified liquid electrode plasma optical emission spectrometry (LEP-OES) is developed in this study. The instrument is modified to measure radioactive samples in a glove box. The effects of important factors, including pulsed voltage sequence and nitric acid concentration, on the emission of Cs are investigated. The limit of detection (LOD) and limit of quantification (LOQ) are 0.005 mg/L and 0.02 mg/L, respectively. The achieved LOD is one order lower than that of recently developed spectroscopic methods using liquid discharge plasma. The developed method is validated by subjecting a simulated HALW sample to inductively coupled plasma mass spectrometry (ICP-MS). The recoveries obtained from a spike-and-recovery test are 96-102%, implying good accuracy. The method is successfully applied to the quantification of Cs in a real HALW sample at the Tokai reprocessing plant in Japan. Apart from dilution and filtration of the HALW sample, no other pre-treatment process is required. The results agree well with the values obtained using gamma spectrometry. The developed method offers a reliable technique for rapid analysis of total Cs in HALW samples. Copyright © 2018 Elsevier B.V. All rights reserved.
An evaluation of long-term preservation methods for brown bear (Ursus arctos) faecal DNA samples
Murphy, M.A.; Waits, L.P.; Kendall, K.C.; Wasser, S.K.; Higbee, J.A.; Bogden, R.
2002-01-01
Relatively few large-scale faecal DNA studies have been initiated due to difficulties in amplifying low quality and quantity DNA template. To improve brown bear faecal DNA PCR amplification success rates and to determine post-collection sample longevity, five preservation methods were evaluated: 90% ethanol, DETs buffer, silica-dried, oven-dried stored at room temperature, and oven-dried stored at -20 °C. Preservation effectiveness was evaluated for 50 faecal samples by PCR amplification of a mitochondrial DNA (mtDNA) locus (~146 bp) and a nuclear DNA (nDNA) locus (~200 bp) at time points of one week, one month, three months and six months. Preservation method and storage time significantly impacted mtDNA and nDNA amplification success rates. For mtDNA, all preservation methods had ≥75% success at one week, but storage time had a significant impact on the effectiveness of the silica preservation method. Ethanol preserved samples had the highest success rates for both mtDNA (86.5%) and nDNA (84%). Nuclear DNA amplification success rates ranged from 26-88%, and storage time had a significant impact on all methods but ethanol. Preservation method and storage time should be important considerations for researchers planning projects utilizing faecal DNA. We recommend preservation of faecal samples in 90% ethanol when feasible, although when collecting in remote field conditions or for both DNA and hormone assays a dry collection method may be advantageous.
W. Henry McNab
2010-01-01
Site index is the most widely used method for site quality assessment in hardwood forests of the eastern United States. Its application in most oak (Quercus sp. L.) dominated stands is often problematic, however, because available sample trees usually do not meet important underlying assumptions of the method. A prototype method for predicting site index from tree...
Multilevel Mixture Kalman Filter
NASA Astrophysics Data System (ADS)
Guo, Dong; Wang, Xiaodong; Chen, Rong
2004-12-01
The mixture Kalman filter is a general sequential Monte Carlo technique for conditional linear dynamic systems. It generates samples of some indicator variables recursively based on sequential importance sampling (SIS) and integrates out the linear and Gaussian state variables conditioned on these indicators. Due to the marginalization process, the complexity of the mixture Kalman filter is quite high if the dimension of the indicator sampling space is high. In this paper, we address this difficulty by developing a new Monte Carlo sampling scheme, namely, the multilevel mixture Kalman filter. The basic idea is to make use of the multilevel or hierarchical structure of the space from which the indicator variables take values. That is, we draw samples in a multilevel fashion, beginning with sampling from the highest-level sampling space and then draw samples from the associate subspace of the newly drawn samples in a lower-level sampling space, until reaching the desired sampling space. Such a multilevel sampling scheme can be used in conjunction with the delayed estimation method, such as the delayed-sample method, resulting in delayed multilevel mixture Kalman filter. Examples in wireless communication, specifically the coherent and noncoherent 16-QAM over flat-fading channels, are provided to demonstrate the performance of the proposed multilevel mixture Kalman filter.
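The two-stage draw at the heart of the multilevel scheme can be sketched for 16-QAM: first sample a quadrant from aggregated weights, then a symbol within that quadrant. By the chain rule this is distributionally equivalent to a flat 16-way draw; the Kalman filtering and delayed-sample machinery are omitted, and the weights below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
levels = np.array([-3, -1, 1, 3])
symbols = np.array([complex(i, q) for i in levels for q in levels])
quadrant = (symbols.real > 0).astype(int) + 2 * (symbols.imag > 0).astype(int)

def multilevel_draw(weights):
    """Draw a symbol index in two stages: quadrant, then symbol in quadrant."""
    w_quad = np.array([weights[quadrant == q].sum() for q in range(4)])
    q = rng.choice(4, p=w_quad / w_quad.sum())
    idx = np.flatnonzero(quadrant == q)
    return rng.choice(idx, p=weights[idx] / weights[idx].sum())

# Posterior-like weights from a noisy observation of symbols[5]:
y = symbols[5] + 0.3 * (rng.normal() + 1j * rng.normal())
w = np.exp(-np.abs(y - symbols) ** 2 / 0.18)
print(symbols[multilevel_draw(w)])
```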
Comparison of fecal egg counting methods in four livestock species.
Paras, Kelsey L; George, Melissa M; Vidyashankar, Anand N; Kaplan, Ray M
2018-06-15
Gastrointestinal nematode parasites are important pathogens of all domesticated livestock species. Fecal egg counts (FEC) are routinely used for evaluating anthelmintic efficacy and for making targeted anthelmintic treatment decisions. Numerous FEC techniques exist and vary in precision and accuracy. These performance characteristics are especially important when performing fecal egg count reduction tests (FECRT). The objective of this study was to compare the accuracy and precision of three commonly used FEC methods and determine if differences existed among livestock species. In this study, we evaluated the modified-Wisconsin, 3-chamber (high-sensitivity) McMaster, and Mini-FLOTAC methods in cattle, sheep, horses, and llamas in three phases. In the first phase, we performed an egg-spiking study to assess the egg recovery rate and accuracy of the different FEC methods. In the second phase, we examined clinical samples from four different livestock species and completed multiple replicate FEC using each method. In the last phase, we assessed the cheesecloth straining step as a potential source of egg loss. In the egg-spiking study, the Mini-FLOTAC recovered 70.9% of the eggs, which was significantly higher than either the McMaster (P = 0.002) or Wisconsin (P = 0.002). In the clinical samples from ruminants, Mini-FLOTAC consistently yielded the highest EPG, revealing a significantly higher level of egg recovery (P < 0.0001). For horses and llamas, both McMaster and Mini-FLOTAC yielded significantly higher EPG than Wisconsin (P < 0.0001, P < 0.0001, P < 0.001, and P = 0.024). Mini-FLOTAC was the most accurate method and was the most precise test for both species of ruminants. The Wisconsin method was the most precise for horses and McMaster was more precise for llama samples. We compared the Wisconsin and Mini-FLOTAC methods using a modified technique where both methods were performed using either the Mini-FLOTAC sieve or cheesecloth. The differences in the estimated mean EPG on log scale between the Wisconsin and mini-FLOTAC methods when cheesecloth was used (P < 0.0001) and when cheesecloth was excluded (P < 0.0001) were significant, providing strong evidence that the straining step is an important source of error. The high accuracy and precision demonstrated in this study for the Mini-FLOTAC, suggest that this method can be recommended for routine use in all host species. The benefits of Mini-FLOTAC will be especially relevant when high accuracy is important, such as when performing FECRT. Copyright © 2018 Elsevier B.V. All rights reserved.
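Accuracy (recovery against a known spike) and precision (coefficient of variation across replicates) reduce to simple arithmetic, as in this sketch; the replicate EPG values below are hypothetical, not the study's data.

```python
import statistics as st

def accuracy_and_precision(replicate_epgs, spiked_epg):
    mean = st.mean(replicate_epgs)
    recovery = 100.0 * mean / spiked_epg           # accuracy, % of spiked eggs
    cv = 100.0 * st.stdev(replicate_epgs) / mean   # precision as CV%
    return recovery, cv

mini_flotac = [142, 150, 138, 147, 145]   # made-up replicate counts (EPG)
mcmaster = [118, 96, 132, 105, 125]
for name, reps in [("Mini-FLOTAC", mini_flotac), ("McMaster", mcmaster)]:
    r, cv = accuracy_and_precision(reps, spiked_epg=200)
    print(f"{name}: recovery {r:.1f}%, CV {cv:.1f}%")
```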
Laurel J. Haavik; Tom W. Coleman; Mary Louise Flint; Robert C. Venette; Steven J. Seybold
2012-01-01
In recent decades, invasive phloem and wood borers have become important pests in North America. To aid tree sampling and survey efforts for the newly introduced goldspotted oak borer, Agrilus auroguttatus Schaeffer (Coleoptera: Buprestidae), we examined spatial patterns of exit holes on the boles (trunks) of 58 coast live oak, Quercus...
ERIC Educational Resources Information Center
Zakszeski, Brittany N.; Hojnoski, Robin L.; Wood, Brenna K.
2017-01-01
Classroom engagement is important to young children's academic and social development. Accurate methods of capturing this behavior are needed to inform and evaluate intervention efforts. This study compared the accuracy of interval durations (i.e., 5 s, 10 s, 15 s, 20 s, 30 s, and 60 s) of momentary time sampling (MTS) in approximating the…
ERIC Educational Resources Information Center
Southwood, Frenette; Russell, Ann F.
2004-01-01
The spontaneous language sample forms an important part of the language evaluation protocol (M. Dunn, J. Flax, M. Sliwinski, & D. Aram, 1996; J. L. Evans & H. K. Craig, 1992; L. E. Evans & J. Miller, 1999) because of the limitations of standardized language tests and their unavailability in certain languages, such as Afrikaans. This study examined…
Analysis of injuries in taekwondo athletes.
Ji, MinJoon
2016-01-01
[Purpose] The present study aims to provide fundamental information on injuries in taekwondo by investigating the categories of injuries that occur in taekwondo and determining the locations of these injuries. [Subjects and Methods] Data from 512 taekwondo athletes were collected. The sampling method was convenience sampling, a non-probability sampling method. Questionnaire forms were used to obtain the data. [Results] The foot, knee, ankle, thigh, and head were most frequently injured while practicing taekwondo, and contusions, strains, and sprains were the main injuries diagnosed. [Conclusion] It is desirable to decrease the possibility of injuries to the lower extremities in order to extend participation in taekwondo. Other than the lower extremities, injuries of specific body parts including the head or neck could be important factors limiting the duration of participation. Therefore, it is necessary to address these problems before practicing taekwondo.
Mashile, Geaneth Pertunia; Nomngongo, Philiswa N
2017-03-04
Cyanotoxins are toxic and are found in eutrophic, municipal, and residential water supplies. For this reason, their occurrence in drinking water systems has become a global concern. Therefore, monitoring, control, risk assessment, and prevention of these contaminants in the environmental bodies are important subjects associated with public health. Thus, rapid, sensitive, selective, simple, and accurate analytical methods for the identification and determination of cyanotoxins are required. In this paper, the sampling methodologies and applications of solid phase-based sample preparation methods for the determination of cyanotoxins in environmental matrices are reviewed. The sample preparation techniques mainly include solid phase micro-extraction (SPME), solid phase extraction (SPE), and solid phase adsorption toxin tracking technology (SPATT). In addition, advantages and disadvantages and future prospects of these methods have been discussed.
A Hybrid Monte Carlo importance sampling of rare events in Turbulence and in Turbulent Models
NASA Astrophysics Data System (ADS)
Margazoglou, Georgios; Biferale, Luca; Grauer, Rainer; Jansen, Karl; Mesterhazy, David; Rosenow, Tillmann; Tripiccione, Raffaele
2017-11-01
Extreme and rare events are a challenging topic in the field of turbulence. Investigating those instances with traditional numerical tools turns out to be a notoriously difficult task, as such tools fail to systematically sample the fluctuations around them. We propose instead that an importance sampling Monte Carlo method can selectively highlight extreme events in remote areas of the phase space and induce their occurrence. We present a new computational approach, based on the path integral formulation of stochastic dynamics, and employ an accelerated Hybrid Monte Carlo (HMC) algorithm for this purpose. Through the paradigm of the stochastic one-dimensional Burgers equation, subjected to random noise that is white in time and power-law correlated in Fourier space, we prove our concept and benchmark our results against standard CFD methods. Furthermore, we present our first results of constrained sampling around saddle-point instanton configurations (optimal fluctuations). The research leading to these results has received funding from the EU Horizon 2020 research and innovation programme under Grant Agreement No. 642069, and from the EU Seventh Framework Programme (FP7/2007-2013) under ERC Grant Agreement No. 339032.
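The importance-sampling principle behind this program can be shown with a much simpler toy: exponential tilting for a Gaussian tail event. This is only a conceptual stand-in for the paper's Hybrid Monte Carlo sampling of path integrals; the threshold and path length are arbitrary.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, m, a = 100_000, 32, 4.0             # samples, path length, rare threshold

# Plain MC: P(sum of m standard normals > a*sqrt(m)) = Phi_bar(a) ~ 3.2e-5.
plain = np.mean(rng.normal(size=(n, m)).sum(axis=1) > a * np.sqrt(m))

# Tilted sampling: shift each increment by theta = a/sqrt(m), then reweight
# by the likelihood ratio exp(-theta*S + m*theta^2/2).
theta = a / np.sqrt(m)
s = rng.normal(theta, 1.0, size=(n, m)).sum(axis=1)
w = np.exp(-theta * s + m * theta**2 / 2.0)
tilted = np.mean(w * (s > a * np.sqrt(m)))

print(plain, tilted, norm.sf(a))       # tilted estimate matches the exact tail
```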
PlasFlow: predicting plasmid sequences in metagenomic data using genome signatures
Lipinski, Leszek; Dziembowski, Andrzej
2018-01-01
Plasmids are mobile genetic elements that play an important role in the environmental adaptation of microorganisms. Although plasmids are usually analyzed in cultured microorganisms, there is a need for methods that allow for the analysis of pools of plasmids (plasmidomes) in environmental samples. To that end, several molecular biology and bioinformatics methods have been developed; however, they are limited to environments with low diversity and cannot recover large plasmids. Here, we present PlasFlow, a novel tool based on genomic signatures that employs a neural network approach for identification of bacterial plasmid sequences in environmental samples. PlasFlow can recover plasmid sequences from assembled metagenomes without any prior knowledge of the taxonomical or functional composition of samples with an accuracy up to 96%. It can also recover sequences of both circular and linear plasmids and can perform initial taxonomical classification of sequences. Compared to other currently available tools, PlasFlow demonstrated significantly better performance on test datasets. Analysis of two samples from heavy metal-contaminated microbial mats revealed that plasmids may constitute an important fraction of their metagenomes and carry genes involved in heavy-metal homeostasis, proving the pivotal role of plasmids in microorganism adaptation to environmental conditions. PMID:29346586
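The genomic signatures PlasFlow operates on are essentially normalized k-mer frequency vectors; a minimal sketch of that feature extraction follows (the neural-network classifier itself is not shown, and the sequence is a made-up example).

```python
from collections import Counter
from itertools import product

def kmer_signature(seq, k=5):
    """Normalized k-mer frequency vector over the 4**k DNA k-mers."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    total = max(sum(counts[km] for km in kmers), 1)
    return [counts[km] / total for km in kmers]

sig = kmer_signature("ATGCGTACGTTAGCATGCAGT" * 50)
print(len(sig), max(sig))   # 1024 features per sequence
```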
Ait Kaci Azzou, Sadoune; Larribe, Fabrice; Froda, Sorana
2015-01-01
The effective population size over time (demographic history) can be retraced from a sample of contemporary DNA sequences. In this paper, we propose a novel methodology based on importance sampling (IS) for exploring such demographic histories. Our starting point is the generalized skyline plot, with the main difference being that our procedure, the skywis plot, uses a large number of genealogies. The information provided by these genealogies is combined according to the IS weights. Thus, we compute a weighted average of the effective population sizes on specific time intervals (epochs), where the genealogies that agree more with the data are given more weight. We illustrate by a simulation study that the skywis plot correctly reconstructs the recent demographic history under the scenarios most commonly considered in the literature. In particular, our method can capture a change point in the effective population size, and its overall performance is comparable with that of the Bayesian skyline plot. We also introduce the case of serially sampled sequences and illustrate that it is possible to improve the performance of the skywis plot in the case of an exponential expansion of the effective population size. PMID:26300910
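The combination step, a weighted average of per-genealogy effective sizes on each epoch using normalized IS weights, can be sketched as below; the array names and random stand-in inputs are hypothetical.

```python
import numpy as np

def weighted_epoch_sizes(log_is_weights, genealogy_epoch_sizes):
    """log_is_weights: (G,) log IS weight per sampled genealogy.
    genealogy_epoch_sizes: (G, E) effective size implied by each
    genealogy on each of E epochs. Returns the (E,) weighted average."""
    w = np.exp(log_is_weights - log_is_weights.max())   # stable normalization
    w /= w.sum()
    return w @ genealogy_epoch_sizes

rng = np.random.default_rng(0)
lw = rng.normal(size=200)                          # fake log weights
ne = rng.lognormal(mean=9.0, sigma=0.3, size=(200, 10))
print(np.round(weighted_epoch_sizes(lw, ne), 0))
```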
Accuracy or precision: Implications of sample design and methodology on abundance estimation
Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.
2015-01-01
Sampling by spatially replicated counts (point-count) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method, and it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than sample scenarios with few sample units of large area. However, sample scenarios with few sample units of large area provided more precise abundance estimates than those derived from sample scenarios with many sample units of small area. It is important to consider the accuracy and precision of abundance estimates during the sample design process, with study goals and objectives fully recognized; in practice, however, and with consequence, this consideration is often an afterthought that occurs during the data analysis process.
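A minimal scaffold in the spirit of such simulation experiments, assuming unit-level heterogeneity in density (a lognormal random effect) and imperfect detection; it only contrasts the raw variability of the two design choices and does not fit N-mixture models, so it should not be expected to reproduce the study's exact accuracy/precision ordering.

```python
import numpy as np

rng = np.random.default_rng(0)
AREA, DENSITY, DETECT, SIGMA = 100.0, 5.0, 0.6, 0.4

def design_runs(n_units, unit_area, n_trials=5000):
    """Counts are Poisson around density*area*detection, with a unit-level
    lognormal effect (mean-corrected); estimates scale counts to the area."""
    lam = DENSITY * unit_area * DETECT
    eps = rng.lognormal(0.0, SIGMA, size=(n_trials, n_units))
    counts = rng.poisson(lam * eps / np.exp(SIGMA**2 / 2))
    return counts.mean(axis=1) / (unit_area * DETECT) * AREA

for label, n, u in [("few large ", 4, 10.0), ("many small", 40, 1.0)]:
    est = design_runs(n, u)
    print(f"{label}: mean {est.mean():6.1f}, sd {est.std():5.1f} "
          f"(truth {DENSITY * AREA:.0f})")
```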
Modeling variability and scale integration of LAI measurements
Kris Nackaerts; Pol Coppin
2000-01-01
Rapid and reliable estimation of leaf area at various scales is important for research on change detection of leaf area index (LAI) as an indicator of ecosystem condition. It is of utmost importance to know to what extent boundary and illumination conditions, data aggregation method, and sampling scheme influence the relative accuracy of stand-level LAI measurements....
DiMichele, Daniel L; Spradley, M Katherine
2012-09-10
Reliable methods for sex estimation during the development of a biological profile are important to the forensic community in instances when the common skeletal elements used to assess sex are absent or damaged. Sex estimation from the calcaneus is therefore of potentially significant importance for the forensic community. Specifically, measurements of the calcaneus provide an additional reliable method for sex estimation via discriminant function analysis based on a North American forensic population. A modern American sample was chosen in order to develop up-to-date, population-specific discriminant functions for sex estimation. The current study addresses this matter, building upon previous research, and introduces a new measurement, posterior circumference, that promises to advance the accuracy of this single, highly resistant bone in future instances of sex determination from partial skeletal remains. Data were collected from The William Bass Skeletal Collection, housed at The University of Tennessee. The sample includes 320 adult individuals born between the years 1900 and 1985, comprising 136 females and 184 males. Skeletons used for measurements were confined to those with fused diaphyses showing no signs of pathology or damage that may have altered measurements, and that also had accompanying records including information on ancestry, age, and sex. Measurements collected and analyzed include maximum length, load-arm length, load-arm width, and posterior circumference. The sample was used to compute a discriminant function based on all four variables, performed in SAS 9.1.3. The discriminant function obtained an overall cross-validated classification rate of 86.69%. Females were classified correctly in 88.64% of the cases and males in 84.75% of the cases. Given the increasing heterogeneity of current populations, further discussion of this topic includes the importance of re-evaluating past studies against modern forensic populations. Because of secular and microevolutionary changes among populations, the near future must include updating existing methods and examining new ones, both of which should cover a wide population spectrum. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
APPLICATION OF THE ELECTROMAGNETIC BOREHOLE FLOWMETER
Spatial variability of saturated zone hydraulic properties has important implications with regard to sampling wells for water quality parameters, use of conventional methods to estimate transmissivity, and remedial system design. Characterization of subsurface heterogeneity requ...
Zhang, Ning; Chen, Haitao; Sun, Baoguo; Mao, Xueying; Zhang, Yuyu; Zhou, Ying
2016-01-01
To compare the volatile compounds of Chinese black truffle and white truffle from Yunnan province, this study presents the application of a direct solvent extraction/solvent-assisted flavor evaporation (DSE-SAFE) coupled with a comprehensive two-dimensional gas chromatography (GC × GC) high resolution time-of-flight mass spectrometry (HR-TOF/MS) and an electronic nose. Both of the analytical methods could distinguish the aroma profile of the two samples. In terms of the overall profile of truffle samples in this research, more kinds of acids were detected via the method of DSE-SAFE. Besides, compounds identified in black truffle (BT), but not in white truffle (WT), or vice versa, and those detected in both samples at different levels were considered to play an important role in differentiating the two samples. According to the analysis of electronic nose, the two samples could be separated, as well. PMID:27058524
NASA Astrophysics Data System (ADS)
Zhang, Linna; Sun, Meixiu; Wang, Zhennan; Li, Hongxiao; Li, Yingxin; Li, Gang; Lin, Ling
2017-09-01
The inspection and identification of whole blood are crucially important for import-export ports and for inspection and quarantine departments. In our previous research, we showed that near-infrared diffuse transmittance spectroscopy has potential for noninvasively identifying three blood species, including macaque, human and mouse, with samples measured in cuvettes. However, in open sampling cases, inspectors may be endangered by virulence factors in blood samples. In this paper, we explored noncontact measurement for classification, with blood samples measured in vacuum blood vessels. Spatially resolved near-infrared spectroscopy was used to improve the prediction accuracy. Results showed that the prediction accuracy of the model built with nine detection points was more than 90% in identification among all five species, including chicken, goat, macaque, pig and rat, far better than the performance of the model built with single-point spectra. The results fully support the idea that the spatially resolved near-infrared spectroscopy method can improve prediction ability, and they demonstrate the feasibility of this method for noncontact blood species identification in practical applications.
Zarei, Mohammad; Ravanshad, Mehrdad; Bagban, Ashraf; Fallahi, Shahab
2016-07-01
The human immunodeficiency virus (HIV-1) is the etiologic agent of AIDS. The disease can be transmitted via blood in the window period prior to the development of antibodies to the virus. Thus, an appropriate method for the detection of HIV-1 during this window period is very important. This descriptive study proposes a sensitive, efficient, inexpensive, and easy method to detect HIV-1. In this study, 25 serum samples from patients under treatment, along with 10 positive and 10 negative control samples, were studied. The 25 blood samples were obtained from HIV-1-infected individuals receiving treatment at the acquired immune deficiency syndrome (AIDS) research center of Imam Khomeini hospital in Tehran. The identification of HIV-1-positive samples was done by using reverse transcription to produce complementary deoxyribonucleic acid (cDNA) and then optimizing the nested polymerase chain reaction (PCR) method. Two pairs of primers were then designed specifically for the protease gene fragment for the nested real-time PCR (RT-PCR) of the samples. Electrophoresis was used to examine the PCR products. The results were analyzed using statistical tests, including Fisher's exact test, and SPSS17 software. The 325 bp band of the protease gene was observed in all the positive control samples and in none of the negative control samples. The proposed method correctly identified HIV-1 in 23 of the 25 samples. These results suggest that, in comparison with viral cultures, antibody detection by enzyme-linked immunosorbent assay (ELISA), and conventional PCR methods, the proposed method has high sensitivity and specificity for the detection of HIV-1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, Vicente; Bonney, Matthew; Schroeder, Benjamin
When very few samples of a random quantity are available from a source distribution of unknown shape, it is usually not possible to accurately infer the exact distribution from which the data samples come. Under-estimation of important quantities such as response variance and failure probabilities can result. For many engineering purposes, including design and risk analysis, we attempt to avoid under-estimation with a strategy to conservatively estimate (bound) these types of quantities -- without being overly conservative -- when only a few samples of a random quantity are available from model predictions or replicate experiments. This report examines a class of related sparse-data uncertainty representation and inference approaches that are relatively simple, inexpensive, and effective. Tradeoffs between the methods' conservatism, reliability, and risk versus number of data samples (cost) are quantified with multi-attribute metrics used to assess method performance for conservative estimation of two representative quantities: central 95% of response; and 10^-4 probability of exceeding a response threshold in a tail of the distribution. Each method's performance is characterized with 10,000 random trials on a large number of diverse and challenging distributions. The best method and number of samples to use in a given circumstance depends on the uncertainty quantity to be estimated, the PDF character, and the desired reliability of bounding the true value. On the basis of this large data base and study, a strategy is proposed for selecting the method and number of samples for attaining reasonable credibility levels in bounding these types of quantities when sparse samples of random variables or functions are available from experiments or simulations.
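The under-estimation risk motivating the report is easy to see numerically: with only five samples, even the sample maximum usually falls below the true 97.5th percentile. The lognormal source distribution below is purely illustrative, and none of the report's bounding methods are implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_samples = 10_000, 5
true_q = np.exp(1.96)                        # 97.5th pct of lognormal(0, 1)

data = rng.lognormal(0.0, 1.0, size=(n_trials, n_samples))
covered = data.max(axis=1) >= true_q
print(f"max of 5 samples reaches the true 97.5th pct in "
      f"{covered.mean():.1%} of trials")     # ~12%: usually an underestimate
```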
Lewis, Celine; Clotworthy, Margaret; Hilton, Shona; Magee, Caroline; Robertson, Mark J; Stubbins, Lesley J; Corfield, Julie
2013-01-01
Objective A mixed methods study exploring the UK general public's willingness to donate human biosamples (HBSs) for biomedical research. Setting Cross-sectional focus groups followed by an online survey. Participants Twelve focus groups (81 participants) selectively sampled to reflect a range of demographic groups; 1110 survey responders recruited through a stratified sampling method with quotas set on sex, age, geographical location, socioeconomic group and ethnicity. Main outcome measures (1) Identify participants’ willingness to donate HBSs for biomedical research, (2) explore acceptability towards donating different types of HBSs in various settings and (3) explore preferences regarding use and access to HBSs. Results 87% of survey participants thought donation of HBSs was important and 75% wanted to be asked to donate in general. Responders who self-reported having some or good knowledge of the medical research process were significantly more likely to want to donate (p<0.001). Reasons why focus group participants saw donation as important included: it was a good way of reciprocating for the medical treatment received; it was an important way of developing drugs and treatments; residual tissue would otherwise go to waste and they or their family members might benefit. The most controversial types of HBSs to donate included: brain post mortem (29% would donate), eyes post mortem (35%), embryos (44%), spare eggs (48%) and sperm (58%). Regarding the use of samples, there were concerns over animal research (34%), research conducted outside the UK (35%), and research conducted by pharmaceutical companies (56%), although education and discussion were found to alleviate such concerns. Conclusions There is a high level of public support and willingness to donate HBSs for biomedical research. Underlying concerns exist regarding the use of certain types of HBSs and conditions under which they are used. Improved education and more controlled forms of consent for sensitive samples may mitigate such concerns. PMID:23929915
Kosek, Margaret N.; Schwab, Kellogg J.
2017-01-01
Empiric quantification of environmental fecal contamination is an important step toward understanding the impact that water, sanitation, and hygiene interventions have on reducing enteric infections. There is a need to standardize the methods used for surface sampling in field studies that examine fecal contamination in low-income settings. The dry cloth method presented in this manuscript improves upon the more commonly used swabbing technique that has been shown in the literature to have a low sampling efficiency. The recovery efficiency of a dry electrostatic cloth sampling method was evaluated using Escherichia coli and then applied to household surfaces in Iquitos, Peru, where there is high fecal contamination and enteric infection. Side-by-side measurements were taken from various floor locations within a household at the same time over a three-month period to compare for consistency of quantification of E. coli bacteria. The dry cloth sampling method in the laboratory setting showed 105% (95% Confidence Interval: 98%, 113%) E. coli recovery efficiency off of the cloths. The field application demonstrated strong agreement of side-by-side results (Pearson correlation coefficient for dirt surfaces was 0.83 (p < 0.0001) and 0.91 (p < 0.0001) for cement surfaces) and moderate agreement for results between entrance and kitchen samples (Pearson (0.53, p < 0.0001) and weighted Kappa statistic (0.54, p < 0.0001)). Our findings suggest that this method can be utilized in households with high bacterial loads using either continuous (quantitative) or categorical (semi-quantitative) data. The standardization of this low-cost, dry electrostatic cloth sampling method can be used to measure differences between households in intervention and non-intervention arms of randomized trials. PMID:28829392
Rodrigues, Silas Pessini; Ventura, José Aires; Zingali, R B; Fernandes, P M B
2009-01-01
A variety of sample preparation protocols for plant proteomic analysis using two-dimensional gel electrophoresis (2-DE) have been reported. However, they usually have to be adapted and further optimised for the analysis of plant species not previously studied. This work aimed to evaluate different sample preparation protocols for analysing Carica papaya L. leaf proteins through 2-DE. Four sample preparation methods were tested: (1) phenol extraction and methanol-ammonium acetate precipitation; (2) no precipitation fractionation; and the traditional trichloroacetic acid-acetone precipitation either (3) with or (4) without protein fractionation. The samples were analysed for their compatibility with SDS-PAGE (1-DE) and 2-DE. Fifteen selected protein spots were trypsinised and analysed by matrix-assisted laser desorption/ionisation time-of-flight tandem mass spectrometry (MALDI-TOF-MS/MS), followed by a protein search against the NCBInr database to accurately identify all proteins. Methods 3 and 4 yielded large quantities of protein with good 1-DE separation and were chosen for 2-DE analysis. However, only the TCA method without fractionation (method 4) proved to be useful. Gains in spot number and resolution were achieved, including through an additional solubilisation step added to the conventional TCA method. Moreover, most of the theoretical and experimental protein molecular weight and pI values were similar, suggesting good focusing and, most importantly, limited protein degradation. The described sample preparation method allows the proteomic analysis of papaya leaves by 2-DE and mass spectrometry (MALDI-TOF-MS/MS), and can serve as a starting point for optimising sample preparation protocols for other plant species.
Exum, Natalie G; Kosek, Margaret N; Davis, Meghan F; Schwab, Kellogg J
2017-08-22
Empiric quantification of environmental fecal contamination is an important step toward understanding the impact that water, sanitation, and hygiene interventions have on reducing enteric infections. There is a need to standardize the methods used for surface sampling in field studies that examine fecal contamination in low-income settings. The dry cloth method presented in this manuscript improves upon the more commonly used swabbing technique that has been shown in the literature to have a low sampling efficiency. The recovery efficiency of a dry electrostatic cloth sampling method was evaluated using Escherichia coli and then applied to household surfaces in Iquitos, Peru, where there is high fecal contamination and enteric infection. Side-by-side measurements were taken from various floor locations within a household at the same time over a three-month period to compare for consistency of quantification of E. coli bacteria. The dry cloth sampling method in the laboratory setting showed 105% (95% Confidence Interval: 98%, 113%) E. coli recovery efficiency off of the cloths. The field application demonstrated strong agreement of side-by-side results (Pearson correlation coefficient for dirt surfaces was 0.83 (p < 0.0001) and 0.91 (p < 0.0001) for cement surfaces) and moderate agreement for results between entrance and kitchen samples (Pearson (0.53, p < 0.0001) and weighted Kappa statistic (0.54, p < 0.0001)). Our findings suggest that this method can be utilized in households with high bacterial loads using either continuous (quantitative) or categorical (semi-quantitative) data. The standardization of this low-cost, dry electrostatic cloth sampling method can be used to measure differences between households in intervention and non-intervention arms of randomized trials.
Armand, Belal; Solhjoo, Kavous; Shabani-Kordshooli, Manoochehr; Davami, Mohammad Hasan; Sadeghi, Mehdi
2016-01-01
Aim: Toxoplasma gondii has clinical and veterinary importance as it is known to cause congenital disease and abortion in both humans and livestock. Since contaminated lamb is one of the sources of human infection, this study was performed to determine the prevalence of T. gondii in sheep in the south of Iran. Materials and Methods: Sera and tissue samples (diaphragm and heart) were collected from 370 sheep at the slaughterhouse of Jahrom. The samples were taken from both sexes and from animals aged 6 to 60 months. Specific immunoglobulin G antibodies to T. gondii were examined with enzyme-linked immunosorbent assay, and B1 gene nested-polymerase chain reaction detection was performed on the tissue samples. Results: The total prevalence of Toxoplasma infection among sheep was found to be 35.94% and 34.32% based on the serological and molecular methods, respectively. According to the serologic and molecular findings, females were more often positive than males for Toxoplasma; the maximum frequency of positive samples was observed at 24-36 months, and more positive samples were collected in spring than in summer, but no statistical correlation was observed between prevalence rate and the age and sex of animals or season of sampling. Conclusion: T. gondii is widely distributed in sheep in Jahrom, at a rate comparable with other parts of Iran and the world, suggesting widespread exposure of sheep in this region to T. gondii. Thus, consumption of undercooked or raw meat presents a risk of transmission of the parasite, and this should be considered an important public health problem, mainly for high-risk groups such as pregnant women and the immunodeficient. PMID:27651673
Jaccard distance based weighted sparse representation for coarse-to-fine plant species recognition.
Zhang, Shanwen; Wu, Xiaowei; You, Zhuhong
2017-01-01
Leaf-based plant species recognition plays an important role in ecological protection; however, its application to large and modern leaf databases has been a long-standing obstacle due to computational cost and feasibility. Recognizing such limitations, we propose a Jaccard distance based sparse representation (JDSR) method which adopts a two-stage, coarse-to-fine strategy for plant species recognition. In the first stage, we use the Jaccard distance between the test sample and each training sample to coarsely determine the candidate classes of the test sample. The second stage includes a Jaccard distance based weighted sparse representation based classification (WSRC), which aims to approximately represent the test sample in the training space, and classify it by the approximation residuals. Since the training model of our JDSR method involves much fewer but more informative representatives, this method is expected to overcome the limitation of high computational and memory costs in traditional sparse representation based classification. Comparative experimental results on a public leaf image database demonstrate that the proposed method outperforms other existing feature extraction and SRC based plant recognition methods in terms of both accuracy and computational speed.
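The coarse stage is straightforward to prototype. Below is a minimal sketch of that stage only, assuming binarized leaf-feature vectors; the function names and the number of retained candidate classes are illustrative, and the fine-stage WSRC (sparse coding over the retained classes, classified by approximation residual) is not shown.

```python
import numpy as np

def jaccard_distance(x, y):
    """Jaccard distance between two binarized feature vectors."""
    inter = np.logical_and(x, y).sum()
    union = np.logical_or(x, y).sum()
    return 1.0 - inter / union if union > 0 else 1.0

def coarse_candidate_classes(test, train_X, train_y, n_candidates=5):
    """Stage 1: keep the classes whose nearest training sample is
    closest to the test sample in Jaccard distance (n_candidates is
    an illustrative parameter, not from the paper)."""
    dists = np.array([jaccard_distance(test, t) for t in train_X])
    best = {}
    for d, label in zip(dists, train_y):
        best[label] = min(best.get(label, np.inf), d)
    return sorted(best, key=best.get)[:n_candidates]
```

Restricting the second-stage sparse coding to these candidate classes is what keeps the dictionary small, which is the source of the reported speed advantage.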
Terra, Luciana A; Filgueiras, Paulo R; Tose, Lílian V; Romão, Wanderson; de Souza, Douglas D; de Castro, Eustáquio V R; de Oliveira, Mirela S L; Dias, Júlio C M; Poppi, Ronei J
2014-10-07
Negative-ion mode electrospray ionization, ESI(-), with Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) was coupled to a Partial Least Squares (PLS) regression and variable selection methods to estimate the total acid number (TAN) of Brazilian crude oil samples. Generally, ESI(-)-FT-ICR mass spectra present a power of resolution of ca. 500,000 and a mass accuracy less than 1 ppm, producing a data matrix containing over 5700 variables per sample. These variables correspond to heteroatom-containing species detected as deprotonated molecules, [M - H](-) ions, which are identified primarily as naphthenic acids, phenols and carbazole analog species. The TAN values for all samples ranged from 0.06 to 3.61 mg of KOH g(-1). To facilitate the spectral interpretation, three methods of variable selection were studied: variable importance in the projection (VIP), interval partial least squares (iPLS) and elimination of uninformative variables (UVE). The UVE method seems to be more appropriate for selecting important variables, reducing the dimension of the variables to 183 and producing a root mean square error of prediction of 0.32 mg of KOH g(-1). By reducing the size of the data, it was possible to relate the selected variables with their corresponding molecular formulas, thus identifying the main chemical species responsible for the TAN values.
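As an illustration of the variable-selection step, the sketch below computes VIP scores from a fitted scikit-learn PLS model. VIP was one of the three methods compared here; UVE, the method the authors found most appropriate, works differently (it judges regression-coefficient stability against added noise variables) and is not shown. The VIP formula is the standard one, and the VIP > 1 retention rule is a common convention, not taken from this paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def vip_scores(pls):
    """Variable importance in the projection for a fitted PLSRegression."""
    W = pls.x_weights_    # (n_features, n_components)
    T = pls.x_scores_     # (n_samples, n_components)
    Q = pls.y_loadings_   # (n_targets, n_components)
    p, a = W.shape
    # Sum of squares of y explained by each latent component
    ssy = np.array([(T[:, k] ** 2).sum() * (Q[:, k] ** 2).sum() for k in range(a)])
    Wn = W / np.linalg.norm(W, axis=0, keepdims=True)
    return np.sqrt(p * (Wn ** 2 @ ssy) / ssy.sum())

# X: (samples, ~5700 m/z variables), y: TAN values in mg KOH/g
# pls = PLSRegression(n_components=5).fit(X, y)   # component count is an assumption
# keep = vip_scores(pls) > 1.0                    # common rule of thumb
```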
Mohebali, Mehdi; Zarei, Zabiholah; Khanaliha, Khadijeh; Kia, Eshrat Beigom; Motavalli-Haghi, Afsaneh; Davoodi, Jaber; Rezaeian, Tahereh; Tarighi, Fathemeh; Rezaeian, Mostafa
2017-01-01
The majority of parasitic infections in rodents have zoonotic importance. This study aimed to determine the frequency and intensity of intestinal protozoan infections of rodents including Meriones persicus, Mus musculus and Cricetulus migratorius. The survey was conducted in the Meshkin Shahr district in northwestern Iran from March to December 2014. Intestinal samples of 204 rodents including M. persicus (n=117), M. musculus (n=63) and C. migratorius (n=24) were parasitologically examined. The formalin-ether concentration method was applied to all rodent stool samples, which were observed under a light microscope. All suspected cases were stained with the trichrome staining method. Cultivation in 2.5% potassium dichromate was carried out for all coccidian-positive samples. Acid-fast and aniline blue staining methods were used for detecting coccidian oocysts and intestinal microsporidial spores, respectively. Overall, 121 (59.3%) of the caught rodents were infected with intestinal protozoa. Entamoeba muris 14 (6.9%), Trichomonas muris 55 (27.0%), Chilomastix betencourtti 17 (8.3%), Giardia muris 19 (9.3%), Eimeria spp. 46 (22.5%), Isospora spp. 4 (2%) and Cryptosporidium spp. 1 (0.5%) were found in the collected rodents. Microsporidian spores were identified in 63 (31%) of the 204 collected rodents using the aniline blue staining method. Since some of these infections are of zoonotic importance, control of rodents could decrease new cases of parasitic zoonoses in humans.
Lien, Stina K; Kvitvang, Hans Fredrik Nyvold; Bruheim, Per
2012-07-20
GC-MS analysis of silylated metabolites is a sensitive method that covers important metabolite groups such as sugars, amino acids and non-amino organic acids, and it has become one of the most important analytical methods for exploring the metabolome. Absolute quantitative GC-MS analysis of silylated metabolites poses a challenge, as different metabolites have different derivatization kinetics and their silyl derivatives have varying stability. This report describes the development of a targeted GC-MS/MS method for quantification of metabolites. Internal standards for each individual metabolite were obtained by derivatization of a mixture of standards with deuterated N-methyl-N-trimethylsilyltrifluoroacetamide (d9-MSTFA), and spiking this solution into MSTFA-derivatized samples prior to GC-MS/MS analysis. The derivatization and spiking protocol needed optimization to ensure that the behaviour of labelled compound responses in the spiked sample correctly reflected the behaviour of unlabelled compound responses. Using labelled and unlabelled MSTFA in this way enabled normalization of metabolite responses by the response of their deuterated counterparts (i.e. individual correction). Such individual correction of metabolite responses reproducibly resulted in significantly higher precision than traditional data correction strategies when tested on samples both with and without serum and urine matrices. The developed method is thus a valuable contribution to the field of absolute quantitative metabolomics. Copyright © 2012 Elsevier B.V. All rights reserved.
Mass spectrometric detection of ricin and its activity in food and clinical samples.
Kalb, Suzanne R; Barr, John R
2009-03-15
Ricin is a potent toxin capable of inhibiting protein synthesis and causing death or respiratory failure. Because of its high availability and lethality, ricin is considered a likely agent for bioterrorism. Rapidly determining contamination of food products with ricin and human exposure to ricin is therefore an important public health goal. In this work, we report the development of a method that detects ricin and its activity in food or clinical samples. This method involves immunocapture of the toxin, an examination of the activity of the ricin protein upon a DNA substrate that mimics the toxin's natural RNA target, and analysis of tryptic fragments of the toxin itself. It is the combination of these three techniques, all performed on the same sample, which allows for a sensitive and selective analysis of ricin isolated from a food or clinical sample, including a measure of the toxin's activity. The utility of this method was demonstrated on ricin spiked into food and clinical samples consisting of milk, apple juice, serum, and saliva.
Magnetic Stirrer Method for the Detection of Trichinella Larvae in Muscle Samples.
Mayer-Scholl, Anne; Pozio, Edoardo; Gayda, Jennifer; Thaben, Nora; Bahn, Peter; Nöckler, Karsten
2017-03-03
Trichinellosis is a debilitating disease in humans and is caused by the consumption of raw or undercooked meat of animals infected with the nematode larvae of the genus Trichinella. The most important sources of human infections worldwide are game meat and pork or pork products. In many countries, the prevention of human trichinellosis is based on the identification of infected animals by means of the artificial digestion of muscle samples from susceptible animal carcasses. There are several methods based on the digestion of meat but the magnetic stirrer method is considered the gold standard. This method allows the detection of Trichinella larvae by microscopy after the enzymatic digestion of muscle samples and subsequent filtration and sedimentation steps. Although this method does not require special and expensive equipment, internal controls cannot be used. Therefore, stringent quality management should be applied throughout the test. The aim of the present work is to provide detailed handling instructions and critical control points of the method to analysts, based on the experience of the European Union Reference Laboratory for Parasites and the National Reference Laboratory of Germany for Trichinella.
Profiling pleural effusion cells by a diffraction imaging method
NASA Astrophysics Data System (ADS)
Al-Qaysi, Safaa; Hong, Heng; Wen, Yuhua; Lu, Jun Q.; Feng, Yuanming; Hu, Xin-Hua
2018-02-01
Assay of cells in pleural effusion (PE) is an important means of disease diagnosis. Conventional cytology of effusion samples, however, has low sensitivity and depends heavily on the expertise of cytopathologists. We applied a polarization diffraction imaging flow cytometry method to effusion cells to investigate their features. Diffraction imaging of the PE cell samples was performed on 6000 to 12000 cells for each effusion cell sample from three patients. After prescreening to remove images of cellular debris and aggregated non-cellular particles, the image textures were extracted with a gray level co-occurrence matrix (GLCM) algorithm. The distribution of the imaged cells in the GLCM parameter space was analyzed with a Gaussian mixture model (GMM) to determine the number of clusters among the effusion cells. These results yield insight into the textural features of diffraction images and related cellular morphology in effusion samples and can be used toward the development of a label-free method for effusion cell assay.
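A minimal sketch of the processing chain just described, assuming 8-bit grayscale diffraction images: GLCM texture features via scikit-image, then a Gaussian mixture model whose component count is chosen by BIC. The GLCM distances, angles, and properties here are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.mixture import GaussianMixture

def glcm_features(img):
    """Texture features from one 8-bit (0-255) diffraction image.
    Distances/angles/properties are illustrative choices."""
    glcm = graycomatrix(img, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return [graycoprops(glcm, p).mean()
            for p in ("contrast", "correlation", "energy", "homogeneity")]

def cluster_count(feature_rows, max_k=6):
    """Choose the number of cell clusters by minimum BIC over GMM fits."""
    X = np.asarray(feature_rows)
    bics = [GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
            for k in range(1, max_k + 1)]
    return int(np.argmin(bics)) + 1
```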
Use of volatile organic components in scat to identify canid species
Burnham, E.; Bender, L.C.; Eiceman, G.A.; Pierce, K.M.; Prasad, S.
2008-01-01
Identification of wildlife species from indirect evidence can be an important part of wildlife management, and conventional methods can be expensive or have high error rates. We used chemical characterization of the volatile organic constituents (VOCs) in scat as a method to identify 5 species of North American canids from multiple individuals. We sampled vapors of scats in the headspace over a sample using solid-phase microextraction and determined VOC content using gas chromatography with a flame ionization detector. We used linear discriminant analysis to develop models for differentiating species, with bootstrapping to estimate accuracy. Our method correctly classified 82.4% (bootstrapped 95% CI = 68.8-93.8%) of scat samples. Red fox (Vulpes vulpes) scat was most frequently misclassified (25.0% of scats misclassified); red fox was also the most common destination for misclassified samples. Our findings are the first reported identification of animal species using VOCs in vapor emissions from scat and suggest that identification of wildlife species may be plausible through chemical characterization of vapor emissions of scat.
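The analysis pattern (linear discriminant analysis with bootstrapped accuracy) can be sketched as below; the paper does not state its exact resampling scheme, so the out-of-bag scoring used here is an assumption.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def bootstrap_lda_accuracy(X, y, n_boot=1000, seed=0):
    """Refit LDA on each bootstrap resample and score the
    out-of-bag observations; return mean accuracy and a 95% CI."""
    X, y = np.asarray(X), np.asarray(y)
    rng = np.random.default_rng(seed)
    n, accs = len(y), []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)
        oob = np.setdiff1d(np.arange(n), idx)
        if oob.size == 0 or len(set(y[idx])) < 2:
            continue  # degenerate resample; skip
        model = LinearDiscriminantAnalysis().fit(X[idx], y[idx])
        accs.append(model.score(X[oob], y[oob]))
    return np.mean(accs), np.percentile(accs, [2.5, 97.5])
```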
Analysis of the neurotoxin anisatin in star anise by LC-MS/MS.
Mathon, Caroline; Bongard, Benjamin; Duret, Monique; Ortelli, Didier; Christen, Philippe; Bieri, Stefan
2013-01-01
The aim of this work was to develop an analytical method capable of determining the presence of anisatin in star anise. This neurotoxin may induce severe side effects such as epileptic convulsions. It is therefore of prime importance to have rapid and accurate analytical methods able to detect and quantify anisatin in samples that are purportedly edible star anise. The sample preparation combined an automated accelerated solvent extraction with a solid-supported liquid-liquid purification step on EXtrelut®. Samples were analysed on a porous graphitic carbon HPLC column and quantified by tandem mass spectrometry operating in the negative ionisation mode. The quantification range of anisatin was between 0.2 and 8 mg kg⁻¹. The applicability of this validated method was demonstrated by the analysis of several Illicium species and star anise samples purchased on the Swiss market. High levels of anisatin were measured in Illicium lanceolatum, I. majus and I. anisatum, which may cause health concerns if they are misidentified or mixed with edible Illicium verum.
NASA Astrophysics Data System (ADS)
Barnhart, E. P.; Ruppert, L. F.; Orem, W. H.; McIntosh, J. C.; Cunningham, A. B.; Fields, M. W.; Hiebert, R.; Hyatt, R.
2016-12-01
There is an increasing threat that deep aquifers, an important drinking water resource, may be contaminated by the extraction and transport of fossil fuels. This threat increases the need for improved groundwater monitoring and the ability to predict the extent to which microbial activity may remediate such contamination. The characterization of subsurface microbial communities could provide an ideal biomonitoring tool for the assessment of subsurface contamination, owing to prokaryotes' environmental ubiquity, their rapid response to environmental perturbation, and the important role they play in hydrocarbon degradation and bioremediation. New DNA sequencing technologies provide the opportunity to cost-effectively identify the vast subsurface microbial ecosystem, but use of this new technology is restricted due to issues with sampling. Prior subsurface microbiology studies have relied on core samples, which are expensive to obtain and hard to collect aseptically, and/or groundwater samples, which do not reflect in situ microbial densities or activities. The development of down-well incubation of sterile sediment with a Diffusive Microbial Sampler (DMS) has emerged as an alternative method to sample subsurface microbial communities that minimizes the cost and contamination issues associated with traditional methods. We have designed a Subsurface Environment Sampler with a DMS module that could enable the anaerobic transport of the in situ microbial community from the field for laboratory bioremediation studies. This sampler could provide an inexpensive and standard method for subsurface microbial sampling, which would make this tool useful for Federal, State, private and local agencies interested in monitoring contamination or the effectiveness of bioremediation activities in subsurface aquifers.
Chaban, Bonnie; Chu, Shirley; Hendrick, Steven; Waldner, Cheryl; Hill, Janet E.
2012-01-01
The detection and subspeciation of Campylobacter fetus subsp. venerealis (CFV) from veterinary samples is important for both clinical and economic reasons. Campylobacter fetus subsp. venerealis is the causative agent of bovine genital campylobacteriosis, a venereal disease that can lead to serious reproductive problems in cattle, and strict international regulations require animals and animal products to be CFV-free for trade. This study evaluated methods reported in the literature for CFV detection and reports the translation of an extensively tested CFV-specific polymerase chain reaction (PCR) primer set (the VenSF/VenSR primers) to a real-time, quantitative PCR (qPCR) platform using SYBR Green chemistry. Three methods of preputial sample preparation for direct qPCR were evaluated, and a heat lysis DNA extraction method was shown to allow for CFV detection at the level of approximately one cell equivalent per reaction (or 1.0 × 10³ CFU/mL) from prepuce. The optimized sample preparation and qPCR protocols were then used to evaluate 3 western Canadian bull cohorts, comprising 377 bulls, for CFV. The qPCR assay detected 11 positive bulls for the CFV-specific parA gene target. DNA sequence data confirmed the identity of the amplified product and revealed that positive samples comprised 2 sequence types: one identical to previously reported CFV parA gene sequences and one with a 9% sequence divergence. These results add valuable information towards our understanding of an important CFV subspeciation target and offer a significantly improved format for an internationally recognized PCR test. PMID:23277694
Karakavuk, Muhammet; Aykur, Mehmet; Şahar, Esra Atalay; Karakuş, Mehmet; Aldemir, Duygu; Döndüren, Ömer; Özdemir, Hüseyin Gökhan; Can, Hüseyin; Gürüz, Adnan Yüksel; Dağcı, Hande; Döşkaya, Mert
2017-12-01
Acanthamoeba is a free-living amoeba which can be isolated from the environment and is well known as an opportunistic protozoan parasite causing infections in humans and animals. Eyes are extremely important for wild birds, and losing the ability to see due to Acanthamoeba can be dangerous. Studies on Acanthamoeba infection in wild birds are very few, both worldwide and in Turkey; we therefore aimed to screen deceased wild birds found in the İzmir and Manisa provinces of western Turkey using PCR and the non-nutrient agar (NNA) plate method. Cornea samples were obtained from 18 deceased wild birds. During the external examination, signs of keratitis were observed in two Eurasian sparrowhawks (Accipiter nisus). All of the corneal samples were analyzed by two PCR methods and the NNA plate method. According to the results, Acanthamoeba positivity in corneal samples was 16.6% by PCR and 5.5% by the plate method. According to the sequencing data, two of the isolates belonged to genotype T5 and one to genotype T4. In conclusion, Acanthamoeba infection was detected in wild bird cornea samples with/without keratitis for the first time in the world. The results of this study also show that Acanthamoeba can be a cause of keratitis in wild birds of Turkey, and that these predator birds can become targets of other wild animals due to loss of sight. In terms of public health, these results show the importance of wild birds as a source of Acanthamoeba infection in nature. Copyright © 2017 Elsevier Inc. All rights reserved.
Determination of tributyltin in whole water matrices under the European Water Framework Directive.
Richter, Janine; Fettig, Ina; Philipp, Rosemarie; Jakubowski, Norbert; Panne, Ulrich; Fisicaro, Paola; Alasonati, Enrica
2016-08-12
Monitoring of water quality is important to control water pollution. Contamination of the aquatic system has a large effect on human health and the environment. Under the European Water Framework Directive (WFD) 2000/60/EC and the related directive on environmental quality standards (EQS) in the field of water policy 2008/105/EC, the need for sensitive reference methods was highlighted. Since tributyltin (TBT) is one of the WFD-listed priority substances, a method was developed which is capable of qualifying and quantifying the pollutant at the required low WFD EQS of 0.2 ng L(-1) in whole water bodies, i.e. in non-filtered water samples with dissolved organic carbon and suspended particulate matter. Special attention was therefore paid to the interaction of TBT with suspended particulate matter and humic substances to obtain a complete representation of the pollution in surface waters. Different water samples were investigated, varying the content of dissolved organic and suspended matter. Quantification was performed using species-specific isotope dilution (SSID) and gas chromatography with inductively coupled plasma mass spectrometry (GC-ICP-MS). Different sample treatment strategies were evaluated and compared. The process of internal standard addition was investigated and optimized, since equilibrium between the internal standards and the matrix is of primary importance for accurate SSID. Samples spiked at the EQS level were analyzed with recoveries between 95% and 105%. Additionally, real surface water samples were investigated and the TBT concentration for the whole water body was determined and compared with a conventional routine analysis method. Copyright © 2016 Elsevier B.V. All rights reserved.
Karacan, C Özgen; Olea, Ricardo A
2018-03-01
Chemical properties of coal largely determine coal handling, processing, beneficiation methods, and design of coal-fired power plants. Furthermore, these properties impact coal strength, coal blending during mining, as well as coal's gas content, which is important for mining safety. In order for these processes and quantitative predictions to be successful, safer, and economically feasible, it is important to determine and map chemical properties of coals accurately in order to infer these properties prior to mining. Ultimate analysis quantifies principal chemical elements in coal. These elements are C, H, N, S, O, and, depending on the basis, ash, and/or moisture. The basis for the data is determined by the condition of the sample at the time of analysis, with an "as-received" basis being the closest to sampling conditions and thus to the in-situ conditions of the coal. The parts determined or calculated as the result of ultimate analyses are compositions, reported in weight percent, and pose the challenges of statistical analyses of compositional data. The treatment of parts using proper compositional methods may be even more important in mapping them, as most mapping methods carry uncertainty due to partial sampling as well. In this work, we map the ultimate analyses parts of the Springfield coal from an Indiana section of the Illinois basin, USA, using sequential Gaussian simulation of isometric log-ratio transformed compositions. We compare the results with those of direct simulations of compositional parts. We also compare the implications of these approaches in calculating other properties using correlations to identify the differences and consequences. Although the study here is for coal, the methods described in the paper are applicable to any situation involving compositional data and its mapping.
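For readers unfamiliar with the transform, the sketch below shows one standard isometric log-ratio (ilr) construction for a single composition; in the workflow described above, the sequential Gaussian simulation would be run on such ilr coordinates and the results back-transformed to weight-percent parts (the inverse is not shown). The particular balance basis is an illustrative choice.

```python
import numpy as np

def ilr(x):
    """Isometric log-ratio transform of one composition.
    Parts must be strictly positive; returns D-1 unconstrained coordinates."""
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                           # close to the unit simplex
    D = x.size
    z = np.empty(D - 1)
    for i in range(1, D):
        gm = np.exp(np.mean(np.log(x[:i])))   # geometric mean of first i parts
        z[i - 1] = np.sqrt(i / (i + 1)) * np.log(gm / x[i])
    return z

# e.g. a hypothetical coal composition (C, H, N, S, O, ash, moisture) as weight fractions:
print(ilr([0.70, 0.05, 0.01, 0.02, 0.08, 0.09, 0.05]))
```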
Pooley, Hannah B.; de Silva, Kumudika; Purdie, Auriol C.; Begg, Douglas J.; Whittington, Richard J.
2016-01-01
ABSTRACT Determining the viability of bacteria is a key outcome of in vitro cellular infection assays. Currently, this is done by culture, which is problematic for fastidious slow-growing bacteria such as Mycobacterium avium subsp. paratuberculosis, where it can take up to 4 months to confirm growth. This study aimed to identify an assay that can rapidly quantify the number of viable M. avium subsp. paratuberculosis cells in a cellular sample. Three commercially available bacterial viability assays along with a modified liquid culture method coupled with high-throughput quantitative PCR growth detection were assessed. Criteria for assessment included the ability of each assay to differentiate live and dead M. avium subsp. paratuberculosis organisms and their accuracy at low bacterial concentrations. Using the culture-based method, M. avium subsp. paratuberculosis growth was reliably detected and quantified within 2 weeks. There was a strong linear association between the 2-week growth rate and the initial inoculum concentration. The number of viable M. avium subsp. paratuberculosis cells in an unknown sample was quantified based on the growth rate, by using growth standards. In contrast, none of the commercially available viability assays were suitable for use with samples from in vitro cellular infection assays. IMPORTANCE Rapid quantification of the viability of Mycobacterium avium subsp. paratuberculosis in samples from in vitro cellular infection assays is important, as it allows these assays to be carried out on a large scale. In vitro cellular infection assays can function as a preliminary screening tool, for vaccine development or antimicrobial screening, and also to extend findings derived from experimental animal trials. Currently, by using culture, it takes up to 4 months to obtain quantifiable results regarding M. avium subsp. paratuberculosis viability after an in vitro infection assay; however, with the quantitative PCR and liquid culture method developed, reliable results can be obtained at 2 weeks. This method will be important for vaccine and antimicrobial screening work, as it will allow a greater number of candidates to be screened in the same amount of time, which will increase the likelihood that a favorable candidate will be found and subjected to further testing. PMID:27371585
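The quantification step reduces to inverting a standard curve. A minimal sketch, with invented standard values purely for illustration: regress the 2-week growth rate on log10 of known inoculum concentrations, then invert for unknown samples.

```python
import numpy as np

# Hypothetical growth standards: known inocula and their 2-week qPCR growth rates
std_conc = np.array([1e2, 1e3, 1e4, 1e5, 1e6])   # viable cells/mL (assumed values)
std_rate = np.array([0.8, 1.5, 2.3, 3.0, 3.8])   # growth rate (assumed values)

# Strong linear association between growth rate and log10(inoculum)
slope, intercept = np.polyfit(np.log10(std_conc), std_rate, 1)

def viable_count(growth_rate):
    """Estimate viable M. avium subsp. paratuberculosis in an unknown sample."""
    return 10 ** ((growth_rate - intercept) / slope)

print(viable_count(2.0))  # cells/mL implied by a growth rate of 2.0
```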
Relative importance index (RII) in ranking of procrastination factors among university students
NASA Astrophysics Data System (ADS)
Aziz, Nazrina; Zain, Zakiyah; Mafuzi, Raja Muhammad Zahid Raja; Mustapa, Aini Mastura; Najib, Nur Hasibah Mohd; Lah, Nik Fatihah Nik
2016-08-01
Procrastination is the action of delaying or postponing something such as making a decision or starting or completing some tasks or activities. According to previous studies, students who have a strong tendency to procrastinate get low scores in their tests, resulting in poorer academic performance compared to those who do not procrastinate. This study aims to identify the procrastination factors in completing assignments among three groups of undergraduate students. The relative importance of procrastination factors was quantified by the relative importance index (RII) method prior to ranking. A multistage sampling technique was used in selecting the sample. The findings revealed that 'too many works in one time' is one of the top three factors contributing to procrastination in all groups.
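The RII computation itself is simple. A minimal sketch, assuming a 5-point Likert scale (the usual convention for RII; the scale used in this study is not stated in the abstract): RII = sum of ratings / (highest possible rating × number of respondents), with factors then ranked by descending RII.

```python
import numpy as np

def relative_importance_index(ratings, max_scale=5):
    """RII = sum(W) / (A * N): W = each respondent's rating,
    A = highest possible rating, N = number of respondents."""
    w = np.asarray(ratings)
    return w.sum() / (max_scale * w.size)

# Hypothetical ratings for two procrastination factors:
factors = {"too many works in one time": [5, 4, 4, 3, 5],
           "poor time management":       [4, 3, 3, 2, 4]}
ranked = sorted(factors, key=lambda f: relative_importance_index(factors[f]),
                reverse=True)
print(ranked)
```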
Characterizing lentic freshwater fish assemblages using multiple sampling methods
Fischer, Jesse R.; Quist, Michael C.
2014-01-01
Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated no appreciable benefit over relatively few gears (e.g., two to four) used in optimal seasons. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative to monitoring of lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.
Williams, Michael S; Cao, Yong; Ebel, Eric D
2013-07-15
Levels of pathogenic organisms in food and water have steadily declined in many parts of the world. A consequence of this reduction is that the proportion of samples that test positive for the most contaminated product-pathogen pairings has fallen to less than 0.1. While this is unequivocally beneficial to public health, datasets with very few enumerated samples present an analytical challenge because a large proportion of the observations are censored values. One application of particular interest to risk assessors is the fitting of a statistical distribution function to datasets collected at some point in the farm-to-table continuum. The fitted distribution forms an important component of an exposure assessment. A number of studies have compared different fitting methods and proposed lower limits on the proportion of samples where the organisms of interest are identified and enumerated, with the recommended lower limit of enumerated samples being 0.2. This recommendation may not be applicable to food safety risk assessments for a number of reasons, which include the development of new Bayesian fitting methods, the use of highly sensitive screening tests, and the generally larger sample sizes found in surveys of food commodities. This study evaluates the performance of a Markov chain Monte Carlo fitting method when used in conjunction with a screening test and enumeration of positive samples by the Most Probable Number technique. The results suggest that levels of contamination for common product-pathogen pairs, such as Salmonella on poultry carcasses, can be reliably estimated with the proposed fitting method and sample sizes in excess of 500 observations. The results do, however, demonstrate that simple guidelines for this application, such as the proportion of positive samples, cannot be provided. Published by Elsevier B.V.
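To make the censoring issue concrete, here is a simpler frequentist analog of the fitting problem; the study itself uses a Markov chain Monte Carlo method applied to screening and MPN enumeration data, which this sketch does not reproduce. The key point is that non-detects contribute the cumulative probability below the limit of detection to the likelihood rather than being discarded. All inputs are illustrative.

```python
import numpy as np
from scipy import stats, optimize

def fit_lognormal_censored(detects, n_nondetect, lod):
    """ML fit of a lognormal concentration distribution when most
    observations are left-censored at the limit of detection (lod)."""
    detects = np.asarray(detects, dtype=float)

    def nll(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)            # keeps sigma positive
        ll = stats.norm.logpdf(np.log(detects), mu, sigma).sum()
        ll += n_nondetect * stats.norm.logcdf((np.log(lod) - mu) / sigma)
        return -ll

    res = optimize.minimize(nll, x0=[np.log(detects).mean(), 0.0])
    return res.x[0], np.exp(res.x[1])        # mu, sigma on the log scale

# e.g. 5 enumerated samples out of 500, the rest below a hypothetical LOD of 0.3:
mu, sigma = fit_lognormal_censored([0.5, 1.2, 2.0, 0.8, 3.1], 495, 0.3)
```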
Debey-Pascher, Svenja; Hofmann, Andrea; Kreusch, Fatima; Schuler, Gerold; Schuler-Thurner, Beatrice; Schultze, Joachim L.; Staratschek-Jox, Andrea
2011-01-01
Microarray-based transcriptome analysis of peripheral blood as surrogate tissue has become an important approach in clinical implementations. However, application of gene expression profiling in routine clinical settings requires careful consideration of the influence of sample handling and RNA isolation methods on gene expression profile outcome. We evaluated the effect of different sample preservation strategies (eg, cryopreservation of peripheral blood mononuclear cells or freezing of PAXgene-stabilized whole blood samples) on gene expression profiles. Expression profiles obtained from cryopreserved peripheral blood mononuclear cells differed substantially from those of their nonfrozen counterpart samples. Furthermore, expression profiles in cryopreserved peripheral blood mononuclear cell samples were found to undergo significant alterations with increasing storage period, whereas long-term freezing of PAXgene RNA stabilized whole blood samples did not significantly affect stability of gene expression profiles. This report describes important technical aspects contributing toward the establishment of robust and reliable guidance for gene expression studies using peripheral blood and provides a promising strategy for reliable implementation in routine handling for diagnostic purposes. PMID:21704280
Jafari, Mostafa; Ebrahimzadeh, Homeira; Banitaba, Mohamma Hossein
2015-11-01
In this work, a rapid and simple method for creatinine determination in urine and plasma samples has been proposed, based on aqueous derivatization of creatinine and complete vaporization of the sample (as low as 10 µL) followed by ion mobility spectrometry analysis. The effect of four important parameters (extraction temperature, total volume of solution, desorption temperature and extraction time) on the ion mobility signal has been studied. Under the optimized conditions, the quantitative response of ion mobility spectrometry for creatinine was linear in the range of 0-500 mg L(-1) with a detection limit of 0.6 mg L(-1) in urine and 0-250 mg L(-1) with a detection limit of 2.6 mg L(-1) in plasma samples. The limit of quantitation of creatinine was 2.1 mg L(-1) and 8.7 mg L(-1) in urine and plasma samples, respectively. The relative standard deviation of the method was found to be 13%. The method was successfully applied to the analysis of creatinine in biological samples, showing recoveries from 92% to 104% in urine and 101-110% in plasma samples. Copyright © 2015 Elsevier B.V. All rights reserved.
Salo, Hanna; Berisha, Anna-Kaisa; Mäkinen, Joni
2016-03-01
This is the first study seasonally applying Sphagnum papillosum moss bags and vertical snow samples for monitoring atmospheric pollution. Moss bags, exposed in January, were collected together with snow samples by early March 2012 near the Harjavalta Industrial Park in southwest Finland. Magnetic, chemical, scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX), K-means clustering, and Tomlinson pollution load index (PLI) data showed parallel spatial trends of pollution dispersal for both materials. The results strengthen previous findings that concentrate- and slag-handling activities were important (dust) emission sources, while the impact from the Cu-Ni smelter's pipe remained secondary at closer distances. Statistically significant correlations existed between the variables of snow and moss bags. In summary, both methods work well for sampling and are efficient pollutant accumulators. Moss bags can also be used in winter conditions, and they provide a more homogeneous and better controlled sampling method than snow samples. Copyright © 2015. Published by Elsevier B.V.
Forensic Comparison of Soil Samples Using Nondestructive Elemental Analysis.
Uitdehaag, Stefan; Wiarda, Wim; Donders, Timme; Kuiper, Irene
2017-07-01
Soil can play an important role in forensic cases in linking suspects or objects to a crime scene by comparing samples from the crime scene with samples derived from items. This study uses an adapted ED-XRF analysis (sieving instead of grinding to prevent destruction of microfossils) to produce elemental composition data for 20 elements. Different data processing techniques and statistical distances were evaluated using data from 50 samples and the log-LR cost (Cllr). The best performing combination, Canberra distance, relative data, and square root values, is used to construct a discriminative model. Examples of the spatial resolution of the method in crime scenes are shown for three locations, and sampling strategy is discussed. Twelve test cases were analyzed, and results showed that the method is applicable. The study shows how the combination of an analysis technique, a database, and a discriminative model can be used to compare multiple soil samples quickly. © 2016 American Academy of Forensic Sciences.
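The best-performing combination reported (relative data, square-root values, Canberra distance) is easy to reproduce in outline. The sketch below assumes each sample is a vector of 20 elemental concentrations and omits the discriminative model built on top of the distances.

```python
import numpy as np

def preprocess(raw_profile):
    """Relative data followed by square-root values."""
    rel = np.asarray(raw_profile, dtype=float)
    rel = rel / rel.sum()
    return np.sqrt(rel)

def canberra(x, y):
    """Canberra distance between two preprocessed elemental profiles."""
    num = np.abs(x - y)
    den = np.abs(x) + np.abs(y)
    mask = den > 0          # skip elements absent from both samples
    return float((num[mask] / den[mask]).sum())

# d = canberra(preprocess(scene_sample), preprocess(item_sample))
# Smaller d supports a common source; the discriminative model turns d into an LR.
```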
Marques, Sara S.; Magalhães, Luís M.; Tóth, Ildikó V.; Segundo, Marcela A.
2014-01-01
Total antioxidant capacity (TAC) assays are recognized as instrumental to establish the antioxidant status of biological samples; however, the varying experimental conditions result in conclusions that may not be transposable to other settings. After selection of the complexing agent, reagent addition order, and buffer type and concentration, copper reducing assays were adapted to a high-throughput scheme and validated using the model biological antioxidant compounds ascorbic acid, Trolox (a soluble analogue of vitamin E), uric acid and glutathione. A critical comparison was made based on real samples, including the NIST-909c human serum certified sample and five study samples. The validated method provided a linear range up to 100 µM Trolox (limit of detection 2.3 µM; limit of quantification 7.7 µM) with recovery results above 85% and precision <5%. The validated method, with its increased sensitivity, is a sound choice for assessment of TAC in serum samples. PMID:24968275
Use of three-point taper systems in timber cruising
James W. Flewelling; Richard L. Ernst; Lawrence M. Raynes
2000-01-01
Tree volumes and profiles are often estimated as functions of total height and DBH. Alternative estimators include form-class methods, importance sampling, the centroid method, and multi-point profile (taper) estimation systems; all of these require some measurement or estimate of upper stem diameters. The multi-point profile system discussed here allows for upper stem...
ERIC Educational Resources Information Center
Cheng, Anran; Tyne, Rebecca; Kwok, Yu Ting; Rees, Louis; Craig, Lorraine; Lapinee, Chaipat; D'Arcy, Mitch; Weiss, Dominik J.; Salaün, Pascal
2016-01-01
Testing water samples for arsenic contamination has become an important water quality issue worldwide. Arsenic usually occurs in very small concentrations, and a sensitive analytical method is needed. We present here a 1-day laboratory module developed to introduce Earth Sciences and/or Chemistry student undergraduates to key aspects of this…
van den Bosch, Frank; Gottwald, Timothy R.; Alonso Chavez, Vasthi
2017-01-01
The spread of pathogens into new environments poses a considerable threat to human, animal, and plant health, and by extension, human and animal wellbeing, ecosystem function, and agricultural productivity, worldwide. Early detection through effective surveillance is a key strategy to reduce the risk of their establishment. Whilst it is well established that statistical and economic considerations are of vital importance when planning surveillance efforts, it is also important to consider epidemiological characteristics of the pathogen in question—including heterogeneities within the epidemiological system itself. One of the most pronounced realisations of this heterogeneity is seen in the case of vector-borne pathogens, which spread between ‘hosts’ and ‘vectors’—with each group possessing distinct epidemiological characteristics. As a result, an important question when planning surveillance for emerging vector-borne pathogens is where to place sampling resources in order to detect the pathogen as early as possible. We answer this question by developing a statistical function which describes the probability distributions of the prevalences of infection at first detection in both hosts and vectors. We also show how this method can be adapted in order to maximise the probability of early detection of an emerging pathogen within imposed sample size and/or cost constraints, and demonstrate its application using two simple models of vector-borne citrus pathogens. Under the assumption of a linear cost function, we find that sampling costs are generally minimised when either hosts or vectors, but not both, are sampled. PMID:28846676
Wang, Yong; Fujii, Takeshi
2011-01-01
It is important in molecular biological analyses to evaluate contamination by co-extracted humic acids in DNA/RNA extracted from soil. We compared the sensitivity of various methods for the measurement of humic acids, and the influences of DNA/RNA and proteins on the measurement. Considering the results, we give suggestions as to the choice of methods for measuring humic acids in molecular biological analyses.
Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R.; Fernandes, Daryl L.; Satsangi, Jack; Spencer, Daniel I. R.
2015-01-01
Introduction Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. Methods 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup was automated using a Hamilton Microlab STARlet Liquid Handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid (BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. Results There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. Conclusions The three different serum sample tubes processed using the described methods cause minimal intra-individual variation in serum whole N-glycan profile when processed using an automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures. PMID:25831126
NASA Astrophysics Data System (ADS)
Elshall, A. S.; Ye, M.; Niu, G. Y.; Barron-Gafford, G.
2016-12-01
Bayesian multimodel inference is increasingly being used in hydrology. Estimating Bayesian model evidence (BME) is of central importance in many Bayesian multimodel analyses such as Bayesian model averaging and model selection. BME is the overall probability of the model in reproducing the data, accounting for the trade-off between goodness-of-fit and model complexity. Yet estimating BME is challenging, especially for high dimensional problems with complex sampling spaces. Estimating BME using Monte Carlo numerical methods is preferred, as these methods yield higher accuracy than semi-analytical solutions (e.g. Laplace approximations, BIC, KIC, etc.). However, numerical methods are prone to numerical demons arising from underflow and round-off errors. Although a few studies have alluded to this issue, to our knowledge this is the first study that illustrates these numerical demons. We show that finite-precision arithmetic can impose a threshold on likelihood values and the Metropolis acceptance ratio, which results in trimming parameter regions (when the likelihood function is less than the smallest floating point number that a computer can represent) and corrupting the empirical measures of the random states of the MCMC sampler (when using the log-likelihood function). We consider two of the most powerful numerical estimators of BME, the path sampling method of thermodynamic integration (TI) and the importance sampling method of steppingstone sampling (SS). We also consider the two most widely used numerical estimators, the prior sampling arithmetic mean (AM) and the posterior sampling harmonic mean (HM). We investigate the vulnerability of these four estimators to the numerical demons. Interestingly, the most biased estimator, namely the HM, turned out to be the least vulnerable. While it is generally assumed that AM is a bias-free estimator that will always approximate the true BME given sufficient computational effort, we show that arithmetic underflow can hamper AM, resulting in severe underestimation of BME. TI turned out to be the most vulnerable, resulting in BME overestimation. Finally, we show how SS can be made largely invariant to rounding errors, yielding the most accurate and computationally efficient results. These results are useful for Monte Carlo simulations to estimate Bayesian model evidence.
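The underflow mechanism described here is the standard motivation for the log-sum-exp trick. A minimal illustration follows; the log-likelihood values are invented but of a realistic magnitude for high-dimensional problems, and scipy.special.logsumexp offers the same stabilization.

```python
import numpy as np

def log_mean_exp(log_vals):
    """Numerically stable log of the mean of exponentials."""
    m = np.max(log_vals)
    return m + np.log(np.mean(np.exp(log_vals - m)))

# Hypothetical prior-sample log-likelihoods for an arithmetic-mean BME estimator:
loglik = np.random.default_rng(0).normal(-800.0, 5.0, size=10_000)

naive = np.log(np.mean(np.exp(loglik)))   # exp underflows to 0 -> -inf
stable = log_mean_exp(loglik)             # finite estimate of ln(BME)
print(naive, stable)
```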
Distance-limited perpendicular distance sampling for coarse woody debris: theory and field results
Mark J. Ducey; Micheal S. Williams; Jeffrey H. Gove; Steven Roberge; Robert S. Kenning
2013-01-01
Coarse woody debris (CWD) has been identified as an important component in many forest ecosystem processes. Perpendicular distance sampling (PDS) is one of the several efficient new methods that have been proposed for CWD inventory. One drawback of PDS is that the maximum search distance can be very large, especially if CWD diameters are large or the volume factor...
Wood Specific Gravity Variation with Height and Its Implications for Biomass Estimation
Michael C. Wiemann; G. Bruce Williamson
2014-01-01
Wood specific gravity (SG) is widely employed by ecologists as a key variable in estimates of biomass. When it is important to have nondestructive methods for sampling wood for SG measurements, cores are extracted with an increment borer. While boring is a relatively difficult task even at breast height sampling, it is impossible at ground level and arduous at heights...
Su, Xiaoquan; Xu, Jian; Ning, Kang
2012-10-01
It has long intrigued scientists to effectively compare different microbial communities (also referred to as 'metagenomic samples' here) on a large scale: given a set of unknown samples, find similar metagenomic samples from a large repository and examine how similar these samples are. With the metagenomic samples accumulated to date, it is possible to build a database of metagenomic samples of interest. Any metagenomic sample could then be searched against this database to find the most similar metagenomic sample(s). However, on one hand, current databases with a large number of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; and on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we have proposed a novel method, Meta-Storms, that could systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database by a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We have collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on these datasets. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and it could achieve accuracies similar to those of the currently popular significance testing-based methods. The Meta-Storms method would serve as a suitable database management and search system to quickly identify similar metagenomic samples from a large pool of samples. ningkang@qibebt.ac.cn Supplementary data are available at Bioinformatics online.
Numerical analysis of laser-driven reservoir dynamics for shockless loading
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Mu; Zhang Hongping; Sun Chengwei
2011-05-01
Laser-driven plasma loading for shockless compression provides a new approach to study the rapid compression response of materials not attainable in conventional shock experiments. In this method, the strain rate is varied from ~10⁶/s to ~10⁸/s, significantly higher than other shockless compression methods. Thus, this loading process is attractive for research on solid material dynamics and astrophysics. The objective of the current study is to demonstrate the dynamic properties of the jet from the rear surface of the reservoir, and how important parameters such as peak load, rise time, shockless compression depth, and stagnating melt depth in the sample vary with laser intensity, laser pulse length, reservoir thickness, vacuum gap size, and even the sample material. Numerical simulations based on the space-time conservation element and solution element method, together with a bulk ablation model, were used. The dynamics of the reservoir depend on the laser intensity, pulse length, and equation of state, as well as the molecular structure of the reservoir. The critical pressure condition at which the reservoir will unload, similar to a gas or weak plasma, is 40-80 GPa before expansion. The momentum distribution bulges downward near the front of the plasma jet, which is an important characteristic that determines shockless compression. The total energy density is the most important parameter, and has great influence on the jet characteristics, and consequently on the shockless compression characteristics. If the reservoir is a single material irradiated at a given laser condition, the relation between peak load and shockless compression depth is in conflict, and the highest loads correspond to the smallest sample thicknesses. The temperature of the jet front runs up to several electron volts after impacting on the sample, and the heat transfer between the stagnating plasma and the sample is sufficiently significant to induce melting of the sample surface. However, this diffusion heat wave propagates much more slowly than the stress wave, and has minimal effect on the shockless compression progress at deeper positions.
Inglis, Stephen; Melko, Roger G
2013-01-01
We implement a Wang-Landau sampling technique in quantum Monte Carlo (QMC) simulations for the purpose of calculating the Rényi entanglement entropies and associated mutual information. The algorithm converges an estimate for an analog to the density of states for stochastic series expansion QMC, allowing a direct calculation of Rényi entropies without explicit thermodynamic integration. We benchmark results for the mutual information on two-dimensional (2D) isotropic and anisotropic Heisenberg models, a 2D transverse field Ising model, and a three-dimensional Heisenberg model, confirming a critical scaling of the mutual information in cases with a finite-temperature transition. We discuss the benefits and limitations of broad sampling techniques compared to standard importance sampling methods.
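The paper's estimator lives inside stochastic series expansion QMC, which is too involved for a short sketch; the toy example below instead shows the core Wang-Landau flat-histogram mechanics on a classical 1D Ising chain, where the converged ln g(E) plays the role of the density-of-states analog mentioned above. All parameters are illustrative.

```python
import numpy as np
from collections import defaultdict

def wang_landau_ising(L=16, lnf_min=1e-6, flatness=0.8, sweep=10_000, seed=1):
    """Wang-Landau estimate of ln g(E) for a 1D periodic Ising chain
    with E = -sum_i s_i * s_{i+1}."""
    rng = np.random.default_rng(seed)
    s = rng.choice([-1, 1], size=L)
    E = -int(np.sum(s * np.roll(s, 1)))
    log_g = defaultdict(float)   # ln g(E); unvisited energies default to 0
    hist = defaultdict(int)
    ln_f = 1.0
    while ln_f > lnf_min:
        for _ in range(sweep):
            i = int(rng.integers(L))
            dE = 2 * s[i] * (s[i - 1] + s[(i + 1) % L])
            # accept flip with probability min(1, g(E) / g(E + dE))
            if np.log(rng.random()) < log_g[E] - log_g.get(E + dE, 0.0):
                s[i] *= -1
                E += dE
            log_g[E] += ln_f
            hist[E] += 1
        counts = np.array(list(hist.values()))
        if counts.min() > flatness * counts.mean():  # histogram flat enough
            hist.clear()
            ln_f /= 2.0                              # refine modification factor
    return dict(log_g)
```

Because the converged ln g(E) covers the whole energy range, thermal averages at any temperature follow without reweighting, which is the broad-sampling advantage weighed against standard importance sampling in the abstract above.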
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mathews, Irimpan I.; Allison, Kim; Robbins, Thomas
The crystal structure of the trans-acyltransferase (AT) from the disorazole polyketide synthase (PKS) was determined at room temperature to a resolution of 2.5 Å using a new method for sample delivery directly into an X-ray free-electron laser. A novel sample extractor efficiently delivered limited quantities of microcrystals directly from the native crystallization solution into the X-ray beam at room temperature. The AT structure revealed important catalytic features of this core PKS enzyme, including the occurrence of conformational changes around the active site. The implications of these conformational changes on polyketide synthase reaction dynamics are discussed.
Mathews, Irimpan I.; Allison, Kim; Robbins, Thomas; ...
2017-08-23
The crystal structure of the trans-acyltransferase (AT) from the disorazole polyketide synthase (PKS) was determined at room temperature to a resolution of 2.5 Å using a new method for sample delivery directly into an X-ray free-electron laser. A novel sample extractor efficiently delivered limited quantities of microcrystals directly from the native crystallization solution into the X-ray beam at room temperature. The AT structure revealed important catalytic features of this core PKS enzyme, including the occurrence of conformational changes around the active site. The implications of these conformational changes on polyketide synthase reaction dynamics are discussed.
A computational method for estimating the PCR duplication rate in DNA and RNA-seq experiments.
Bansal, Vikas
2017-03-14
PCR amplification is an important step in the preparation of DNA sequencing libraries prior to high-throughput sequencing. PCR amplification introduces redundant reads in the sequence data, and estimating the PCR duplication rate is important to assess the frequency of such reads. Existing computational methods do not distinguish PCR duplicates from "natural" read duplicates that represent independent DNA fragments and therefore overestimate the PCR duplication rate for DNA-seq and RNA-seq experiments. In this paper, we present a computational method to estimate the average PCR duplication rate of high-throughput sequence datasets that accounts for natural read duplicates by leveraging heterozygous variants in an individual genome. Analysis of simulated data and exome sequence data from the 1000 Genomes project demonstrated that our method can accurately estimate the PCR duplication rate on paired-end as well as single-end read datasets that contain a high proportion of natural read duplicates. Further, analysis of exome datasets prepared using the Nextera library preparation method indicated that 45-50% of read duplicates correspond to natural read duplicates, likely due to fragmentation bias. Finally, analysis of RNA-seq datasets from individuals in the 1000 Genomes project demonstrated that 70-95% of read duplicates observed in such datasets correspond to natural duplicates sampled from genes with high expression, and identified outlier samples with a 2-fold greater PCR duplication rate than other samples. The method described here is a useful tool for estimating the PCR duplication rate of high-throughput sequence datasets and for assessing the fraction of read duplicates that correspond to natural read duplicates. An implementation of the method is available at https://github.com/vibansal/PCRduplicates.
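The core idea can be illustrated compactly: among reads flagged as duplicates that cover a heterozygous site, pairs carrying different alleles must come from independent DNA fragments. The sketch below, with a hypothetical data layout and a simple factor-of-two correction for chance allele agreement, illustrates that idea only; it is not Bansal's implementation.

```python
# Minimal sketch: duplicate reads at a heterozygous site that show
# *different* alleles are necessarily natural duplicates (independent
# fragments). Data layout and correction factor are assumptions.
from itertools import combinations

def natural_duplicate_fraction(duplicate_groups):
    """duplicate_groups: one allele list ('A'/'B') per group of reads
    flagged as duplicates at a single heterozygous site."""
    diff = same = 0
    for alleles in duplicate_groups:
        for x, y in combinations(alleles, 2):
            if x == y:
                same += 1
            else:
                diff += 1
    # Two independent fragments agree half the time by chance at a het
    # site, so discordant pairs undercount natural pairs by a factor 2.
    total = same + diff
    return min(1.0, 2.0 * diff / total) if total else 0.0

groups = [["A", "A"], ["A", "A"], ["A", "B"], ["B", "B"]]
print(natural_duplicate_fraction(groups))  # 0.5 of pairs look 'natural'
```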
Liquid Biopsy and its Potential for Management of Hepatocellular Carcinoma.
Zhou, Jian; Huang, Ao; Yang, Xin-Rong
2016-06-01
We summarize recent findings on liquid biopsy in the cancer field and discuss its potential utility in hepatocellular carcinoma (HCC). Literature published in the MEDLINE, EMBASE, and Science Direct electronic databases was searched and reviewed. Liquid biopsy specifically refers to the detection of nucleic acids (circulating cell-free DNA, cfDNA) and circulating tumor cells (CTCs) in the blood of cancer patients. Compared to conventional single-site sampling or biopsy methods, liquid biopsy has advantages such as non-invasiveness, dynamic monitoring and, most important of all, overcoming the limits of spatial and temporal heterogeneity. The genomic information of a cancer can be profiled by genotyping cfDNA/CTCs and subsequently applied to molecular classification, targeted therapy guidance, and unveiling drug resistance mechanisms. The serial sampling feature of liquid biopsy makes it possible to monitor treatment response in real time and to predict tumor metastasis/recurrence in advance. Liquid biopsy is a non-invasive, dynamic, and informative sampling method with important clinical translational significance in cancer research and practice. Much work needs to be done before it is used in the management of HCC.
Minimal Polynomial Method for Estimating Parameters of Signals Received by an Antenna Array
NASA Astrophysics Data System (ADS)
Ermolaev, V. T.; Flaksman, A. G.; Elokhin, A. V.; Kuptsov, V. V.
2018-01-01
The effectiveness of the projection minimal polynomial method for determining the number of signal sources acting on an antenna array (AA) of arbitrary configuration, and their angular directions, has been studied. The method estimates the degree of the minimal polynomial of the correlation matrix (CM) of the input process in the AA on the basis of a statistically validated root-mean-square criterion. Special attention is paid to the case of an ultrashort sample of the input process, in which the number of samples is considerably smaller than the number of AA elements; this case is important for multielement AAs. It is shown that the proposed method is more effective in this case than methods based on Akaike's Information Criterion (AIC) or the minimum description length (MDL) criterion.
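The link between source counting and the minimal polynomial can be made concrete numerically: with m uncorrelated sources in white noise, the exact CM has m distinct signal eigenvalues plus one repeated noise eigenvalue, so its minimal polynomial has degree m + 1. The sketch below uses a crude relative-tolerance grouping of sample eigenvalues in place of the paper's statistically validated root-mean-square criterion; all parameters are illustrative.

```python
# Minimal numerical sketch: number of sources = degree of the CM's
# minimal polynomial minus one. The tolerance-based grouping stands in
# for the paper's statistical criterion, which is not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
N, K, m = 16, 12, 2                      # elements, snapshots (K < N), sources
angles = np.deg2rad([10.0, 35.0])
A = np.exp(1j * np.pi * np.outer(np.arange(N), np.sin(angles)))  # steering
S = rng.normal(size=(m, K)) + 1j * rng.normal(size=(m, K))       # signals
S[1] *= 0.5                              # unequal source powers, for clarity
X = A @ S + 0.1 * (rng.normal(size=(N, K)) + 1j * rng.normal(size=(N, K)))

R = X @ X.conj().T / K                   # sample correlation matrix
lam = np.sort(np.linalg.eigvalsh(R))[::-1]

groups = 1                               # count distinct eigenvalue levels
for a, b in zip(lam, lam[1:]):
    if (a - b) / lam[0] > 0.05:          # assumed tolerance, not the paper's
        groups += 1
print("estimated number of sources:", groups - 1)   # typically 2 here
```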
The interaction between a solid body and viscous fluid by marker-and-cell method
NASA Technical Reports Server (NTRS)
Cheng, R. Y. K.
1976-01-01
A computational method for solving nonlinear problems relating to the impact and penetration of a rigid body into a fluid-type medium is presented. The numerical techniques, based on the Marker-and-Cell method, give the pressure and velocity of the flow field. An important feature of this method is that the force on and displacement of the rigid body interacting with the fluid during the impact and sinking phases are evaluated from the boundary stresses imposed by the fluid on the rigid body. A sample problem of low-velocity penetration of a rigid block into still water is solved by this method. The computed time histories of the acceleration, pressure, and displacement of the block show good agreement with experimental measurements. A sample problem of high-velocity impact of a rigid block into soft clay is also presented.
Determination and prioritizing of addiction prevention factors in delfan city, iran.
Mirzaei, Davod; Zamani, Bibi Eshrat; Mousavi, Sayyed Hojat
2011-01-01
In recent decades, drug abuse has been one of the most important problems of human societies, imposing enormous costs on them. The exposure of addicts to infectious diseases, the harmful social and economic impacts, and the expense and frequent relapse of treatment make drug abuse prevention programs cheaper and more effective than treatment. An important step in prevention is to identify and prioritize prevention methods scientifically. The purpose of this study was to investigate addiction prevention methods for adolescents and teenagers from the viewpoints of addicts, their parents and authorities, and to prioritize the prevention methods based on the analytical hierarchy process (AHP) model in Delfan city, Iran. The statistical sample included 17 authorities, 42 addicts, and 23 parents, selected through purposive sampling. Data collection instruments were structured and semi-structured interviews. Data were analyzed by quantitative and qualitative methods, encoding and categorization. The AHP model was used to prioritize the prevention methods; it is one of the most efficient and comprehensive techniques designed for multi-criteria decision making, as it formulates complex problems as a hierarchy (see the sketch after this abstract). The results indicated that the most important methods of drug abuse prevention were using the media, case studies, planning for leisure time, teaching social skills, integrating drug prevention into religious customs, and respect for teenagers. Among these factors, the media and respect for adolescents, with weights 0.3321 and 0.2389, had the highest priorities for the prevention of drug addiction. Planning for leisure time, with a weight of 0.1349, ranked below the media and respect factors but above the religious customs, dating skills, and learning-from-examples factors. Integrating prevention into religious customs, teaching dating skills, and using case studies, with weights 0.1145, 0.1114, and 0.0680, respectively, had the lowest priorities and can be considered in later settings. The interviewees mentioned respect for teenagers, religious customs, the media, dating skills, learning lessons from examples, and attention to leisure time as the most important addiction prevention methods, among which the media was the most efficient: as a national medium it is available to the public, is not dedicated to any special group or class of people, and can be used by everyone regardless of literacy or education level.
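For readers unfamiliar with AHP, weights like the 0.3321 quoted above come from the principal-eigenvector step of the method, which the following minimal sketch reproduces on an invented pairwise comparison matrix (the matrix is not the study's data):

```python
# Minimal AHP sketch: the priority vector is the normalized principal
# eigenvector of a pairwise comparison matrix. Matrix values invented.
import numpy as np

criteria = ["media", "respect", "leisure", "religion", "case studies"]
# C[i, j] = how much more important criterion i is than j (Saaty 1-9 scale).
C = np.array([
    [1,   2,   3, 3, 5],
    [1/2, 1,   2, 3, 4],
    [1/3, 1/2, 1, 1, 2],
    [1/3, 1/3, 1, 1, 2],
    [1/5, 1/4, 1/2, 1/2, 1],
], dtype=float)

vals, vecs = np.linalg.eig(C)
w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
w /= w.sum()                                  # normalized priority vector

# Consistency ratio; CR < 0.1 is the usual acceptability threshold.
ci = (np.max(np.real(vals)) - len(C)) / (len(C) - 1)
cr = ci / 1.12                                # Saaty random index for n = 5
for name, weight in sorted(zip(criteria, w), key=lambda t: -t[1]):
    print(f"{name:12s} {weight:.4f}")
print("consistency ratio:", round(cr, 3))
```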
Application of micro-Fourier transform infrared spectroscopy to the examination of paint samples
NASA Astrophysics Data System (ADS)
Zięba-Palus, J.
1999-11-01
The examination and identification of automobile paints is an important problem in road accident investigations. Since the available sample is usually very small, only sensitive microtechniques can be applied. The methods of optical microscopy and micro-Fourier transform infrared spectroscopy (MK-FTIR), supported by scanning electron microscopy with X-ray microanalysis (SEM-EDX), allow one to examine each paint layer without any separation procedure. In this paper an attempt is made to discriminate between different automobile paints of the same colour by the use of these methods for criminalistic investigations.
Driven-dissipative quantum Monte Carlo method for open quantum systems
NASA Astrophysics Data System (ADS)
Nagy, Alexandra; Savona, Vincenzo
2018-05-01
We develop a real-time full configuration-interaction quantum Monte Carlo approach to model driven-dissipative open quantum systems with Markovian system-bath coupling. The method enables stochastic sampling of the Liouville-von Neumann time evolution of the density matrix thanks to a massively parallel algorithm, thus providing estimates of observables on the nonequilibrium steady state. We present the underlying theory and introduce an initiator technique and importance sampling to reduce the statistical error. Finally, we demonstrate the efficiency of our approach by applying it to the driven-dissipative two-dimensional XYZ spin-1/2 model on a lattice.
Ait Kaci Azzou, S; Larribe, F; Froda, S
2016-10-01
In Ait Kaci Azzou et al. (2015) we introduced an Importance Sampling (IS) approach for estimating the demographic history of a sample of DNA sequences, the skywis plot. More precisely, we proposed a new nonparametric estimate of a population size that changes over time. We showed on simulated data that the skywis plot can work well in typical situations where the effective population size does not undergo very steep changes. In this paper, we introduce an iterative procedure that extends the previous method and gives good estimates under such rapid variations. In the iterative calibrated skywis plot we approximate the effective population size by a piecewise constant function, whose values are re-estimated at each step. These piecewise constant functions are used to generate the waiting times of nonhomogeneous Poisson processes related to a coalescent process with mutation under a variable population size model. Moreover, the present IS procedure is based on a modified version of the Stephens and Donnelly (2000) proposal distribution. Finally, we apply the iterative calibrated skywis plot method to a simulated data set from a rapidly expanding exponential model, and we show that the method based on this new IS strategy correctly reconstructs the demographic history.
Metzger, E; Viollier, E; Simonucci, C; Prévot, F; Langlet, D; Jézéquel, D
2013-10-01
Constrained DET (Diffusive Equilibration in Thin films) probes equipped with 75 sampling layers of agarose gel (DGT Research©) were used to sample bottom and pore waters in marine sediment at 2 mm vertical resolution. After retrieval, each piece of hydrogel, corresponding to 25 μL, was introduced into 1 mL of a colorimetric reagent (CR) solution consisting of formic acid and bromophenol blue. After the elution/reaction time, the absorbance of the mixture was read at 590 nm and compared to a calibration curve obtained by applying the same protocol to mini DET probes soaked in sodium hydrogen carbonate standard solutions. This method allows rapid alkalinity determination for the small volumes of anoxic pore water entrapped in the gel. The method was assessed on organic-rich coastal marine sediments from Thau lagoon (France). Alkalinity values in the overlying waters were in agreement with data obtained by classical sampling techniques. Pore water data showed a progressive increase of alkalinity in the sediment from 2 to 10 mmol kg⁻¹, corresponding to anaerobic respiration in organic-rich sediments. Moreover, replicate high-resolution DET profiles showed important lateral heterogeneity at the decimeter scale. This underlines the importance of high-resolution spatial methods for alkalinity profiling in coastal marine systems.
Ducat, Giseli; Felsner, Maria L; da Costa Neto, Pedro R; Quináia, Sueli P
2015-06-15
Recently the use of brown sugar has increased due to its nutritional characteristics, requiring more rigorous quality control. The development of a method for water content analysis in soft brown sugar by TG/DTA, with application of different statistical tests, is carried out here for the first time. The results of the optimization study suggest that a heating rate of 5 °C min⁻¹ and an alumina sample holder improve the efficiency of the drying process. The validation study showed that thermogravimetry offers good accuracy and precision for water content analysis in soft brown sugar samples. This technique has advantages over other analytical methods: it does not use toxic and costly reagents or solvents, it does not need any sample preparation, and it allows the identification of the temperature at which water is completely eliminated relative to other volatile degradation products. This is an important advantage over the official method (loss on drying).
Zhao, Yang; Kao, Chun-Pin; Wu, Kun-Chang; Liao, Chi-Ren; Ho, Yu-Ling; Chang, Yuan-Shiun
2014-11-10
This paper describes the development of an HPLC-UV-MS method for the quantitative determination of andrographolide and dehydroandrographolide in Andrographis Herba and the establishment of its chromatographic fingerprint. The method was validated for linearity, limits of detection and quantification, inter- and intra-day precision, repeatability, stability and recovery. All the validation results of the quantitative determination and fingerprinting methods were satisfactory. The developed method was then applied to assay the contents of andrographolide and dehydroandrographolide and to acquire the fingerprints of all the collected Andrographis Herba samples. Furthermore, similarity analysis and principal component analysis were used to reveal the similarities and differences between the samples on the basis of the characteristic peaks. More importantly, the DPPH free radical-scavenging and ferric reducing capacities of the Andrographis Herba samples were assayed. By bivariate correlation analysis, we found that six compounds are positively correlated with DPPH free radical-scavenging and ferric reducing capacities, and four compounds are negatively correlated with them.
Aerosol Production from Charbroiled and Wet-Fried Meats
NASA Astrophysics Data System (ADS)
Niedziela, R. F.; Blanc, L. E.
2012-12-01
Previous work in our laboratory focused on the chemical and optical characterization of aerosols produced during the dry-frying of different meat samples. This method yielded a complex ensemble of particles composed of water and long-chain fatty acids, with the latter dominated by oleic, stearic, and palmitic acids. The present study examines how wet-frying and charbroiling affect the physical and chemical properties of the derived aerosols. Samples of ground beef, salmon, chicken, and pork were subjected to both cooking methods in the laboratory, with their respective aerosols swept into a laminar flow cell where they were optically analyzed in the mid-infrared and collected through a gas chromatography probe for chemical characterization. This presentation will compare and contrast the nature of the aerosols generated by each cooking method, particularly those produced during charbroiling, which exposes the samples, and their drippings, to significantly higher temperatures. Characterization of such cooking-related aerosols is important because of the potential impact of these particles on air quality, particularly in urban areas.
Hasanpour, Foroozan; Hadadzadeh, Hassan; Taei, Masoumeh; Nekouei, Mohsen; Mozafari, Elmira
2016-05-01
The analytical performance of a conventional spectrophotometer was enhanced by coupling an effective dispersive liquid-liquid micro-extraction method with spectrophotometric detection for the ultra-trace determination of cobalt. The method is based on the formation of the Co(II)-alpha-benzoin oxime complex and its extraction by dispersive liquid-liquid micro-extraction. Several important variables, such as pH, ligand concentration, and the amount and type of dispersive and extracting solvents, were optimized. It was found that the crucial factor for Co(II)-alpha-benzoin oxime complex formation is the pH of the alkaline alcoholic medium. Under the optimized conditions, the calibration graph was linear in the range of 1.0-110 μg L⁻¹ with a detection limit (S/N = 3) of 0.5 μg L⁻¹. Preconcentration of 25 mL of sample gave an enhancement factor of 75. The proposed method was applied to the determination of Co(II) in soil samples.
Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong
2017-12-01
Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial one. The raw spectrum consists of the signal from the sample together with scattering and other random disturbances that can critically influence quantitative accuracy. For precise quantitative analysis using THz-TDS, the signal from the sample needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
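As a rough illustration of the approach (not the authors' exact procedure), the sketch below uses SciPy's differential evolution to search a soft inclusion mask over synthetic spectra, scoring each mask by the residual of a least-squares calibration restricted to the selected points; the noisy band mimics a scattering-contaminated region that the optimizer should learn to avoid.

```python
# Hedged sketch of DE-based wavelength selection: points whose mask
# value exceeds 0.5 enter a least-squares calibration; the objective is
# the calibration residual. Synthetic spectra stand in for THz-TDS data.
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
n_wl, n_samp = 40, 20
wl = np.linspace(0.2, 2.0, n_wl)                    # THz axis (illustrative)
s1 = np.exp(-(wl - 0.8) ** 2 / 0.02)                # component spectra
s2 = np.exp(-(wl - 1.4) ** 2 / 0.02)
conc = rng.uniform(0, 1, size=(n_samp, 2))          # true concentrations
noise = 0.3 * (wl > 1.7) + 0.01                     # 'scattering' band
X = conc @ np.vstack([s1, s2]) + rng.normal(scale=noise, size=(n_samp, n_wl))

def calibration_error(mask):
    sel = mask > 0.5
    if sel.sum() < 2:
        return 1e6                                  # need enough points
    coef, *_ = np.linalg.lstsq(X[:, sel], conc, rcond=None)
    return np.linalg.norm(X[:, sel] @ coef - conc)

result = differential_evolution(calibration_error, [(0, 1)] * n_wl,
                                seed=1, maxiter=50, polish=False)
print("selected wavelengths:", wl[result.x > 0.5].round(2))
```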
Kumar, Vineet
2011-12-01
Grain size statistics, commonly derived from the grain map of a material sample, are important microstructure characteristics that greatly influence its properties. The grain map for nanomaterials is usually obtained manually by visual inspection of transmission electron microscope (TEM) micrographs, because automated methods have not performed satisfactorily. While visual inspection provides reliable results, it is labor intensive and prone to human error. In this article, an automated grain mapping method is developed using TEM diffraction patterns, employing wide-angle convergent beam diffraction in the TEM. The automated technique was applied to a platinum thin film sample to obtain the grain map and subsequently derive grain size statistics from it. The grain size statistics obtained with the automated method were found to be in good agreement with those from the visual inspection method.
NASA Astrophysics Data System (ADS)
Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander
2016-04-01
In the last three decades, an increasing number of studies analyzed spatial patterns in throughfall to investigate the consequences of rainfall redistribution for biogeochemical and hydrological processes in forests. In the majority of cases, variograms were used to characterize the spatial properties of the throughfall data. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and an appropriate layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation methods on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with heavy outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the numbers recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes << 200, our current knowledge about throughfall spatial variability stands on shaky ground.
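For reference, the method-of-moments estimator evaluated here is the classical Matheron semivariance, γ(h) = (1/2|N(h)|) Σ_{(i,j)∈N(h)} (z_i − z_j)²; the squared increments are exactly what make it sensitive to heavy outliers. A minimal sketch on synthetic, skewed data follows; the plot size, sample size, and distribution are chosen to echo the study's setup but are otherwise illustrative.

```python
# Minimal sketch of the method-of-moments (Matheron) variogram
# estimator: mean squared increment over point pairs in a distance
# bin, divided by 2. Squaring makes it outlier-sensitive.
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    sq = (values[:, None] - values[None, :]) ** 2
    iu = np.triu_indices(len(values), k=1)          # each pair once
    d, sq = d[iu], sq[iu]
    return np.array([sq[(d >= lo) & (d < hi)].mean() / 2.0
                     for lo, hi in zip(bin_edges[:-1], bin_edges[1:])])

rng = np.random.default_rng(2)
coords = rng.uniform(0, 50, size=(150, 2))          # 50 m plot, n = 150
values = rng.lognormal(mean=0.0, sigma=0.8, size=150)  # skewed, outliers
print(empirical_variogram(coords, values, np.arange(0, 30, 5)))
```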
NASA Astrophysics Data System (ADS)
Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander
2016-09-01
In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes ≪200, currently available data are prone to large uncertainties.
de Vries, W; Wieggers, H J J; Brus, D J
2010-08-05
Element fluxes through forest ecosystems are generally based on measurements of concentrations in soil solution at regular time intervals at plot locations sampled in a regular grid. Here we present spatially averaged annual element leaching fluxes in three Dutch forest monitoring plots using a new sampling strategy in which both sampling locations and sampling times are selected by probability sampling. Locations were selected by stratified random sampling with compact geographical blocks of equal surface area as strata. In each sampling round, six composite soil solution samples were collected, consisting of five aliquots, one per stratum. The plot-mean concentration was estimated by linear regression, so that the bias due to one or more strata being unrepresented in the composite samples is eliminated. The sampling times were selected such that the cumulative precipitation surplus of the time interval between two consecutive sampling times was constant, using an estimated precipitation surplus averaged over the past 30 years. The spatially averaged annual leaching flux was estimated using the modeled daily water flux as an ancillary variable. An important advantage of the new method is that the uncertainty in the estimated annual leaching fluxes due to spatial and temporal variation and the resulting sampling errors can be quantified. Results of this new method were compared with the reference approach, in which daily leaching fluxes were calculated by multiplying daily interpolated element concentrations by daily water fluxes and then aggregating to a year. Results show that the annual fluxes calculated with the reference method for the period 2003-2005, including all plots, elements and depths, lie within the range of the new method's average ±2 standard errors in only 53% of cases. Despite the differences in results, both methods indicate comparable N retention and strong Al mobilization in all plots, with Al leaching nearly equal to the leaching of SO₄ and NO₃ when fluxes are expressed in mol_c ha⁻¹ yr⁻¹. This illustrates that Al release, the clearest signal of soil acidification, is mainly due to the external input of SO₄ and NO₃.
Detector-unit-dependent calibration for polychromatic projections of rock core CT.
Li, Mengfei; Zhao, Yunsong; Zhang, Peng
2017-01-01
Computed tomography (CT) plays an important role in digital rock analysis, a promising new technique for the oil and gas industry. However, artifacts in CT images influence the accuracy of the digital rock model. In this study, we propose and demonstrate a novel method to recover detector-unit-dependent functions for polychromatic projection calibration by scanning simple-shaped reference samples. As long as the attenuation coefficients of the reference samples are similar to those of the scanned object, their size and position need not be exactly known. Both simulated and real data were used to verify the proposed method. The results showed that the new method reduces both beam-hardening artifacts and ring artifacts effectively. Moreover, the method appears to be quite robust.
NASA Astrophysics Data System (ADS)
Gorewoda, Tadeusz; Mzyk, Zofia; Anyszkiewicz, Jacek; Charasińska, Jadwiga
2015-04-01
The purpose of this study was to develop an accurate method for the determination of bromine in polymer materials using X-ray fluorescence spectrometry when the sample is thinner than the bromine critical thickness (tc). This is particularly important for analyzing compliance with the Restriction of Hazardous Substances Directive. Mathematically and experimentally estimated tc values in polyethylene and cellulose matrices were up to several millimeters. Four methods were developed to obtain an accurate result: the addition of an element with a high mass absorption coefficient, the measurement of the total bromine contained in a defined volume of the sample, the exploitation of tube Rayleigh-line intensities, and the use of the Br-Lβ line.
Biofouling development on plasma treated samples versus layers coated samples
NASA Astrophysics Data System (ADS)
Hnatiuc, B.; Exnar, P.; Sabau, A.; Spatenka, P.; Dumitrache, C. L.; Hnatiuc, M.; Ghita, S.
2016-12-01
Biofouling is the most important cause of naval corrosion. In order to reduce biofouling development on naval materials such as steel or resin, several new methods have been tested. These methods could help meet the new IMO environmental regulations, and they could replace a few conventional operations performed before the painting of small ships, reducing maintenance costs. Their action must influence especially the first two steps of biofouling development, called microfouling, which take about 24 hours. This work presents comparative results on biofouling development on two conventional naval materials, steel and resin, for three treated samples immersed in sea water. Non-thermal plasma, produced by GlidArc technology, was applied for 10 minutes to the first sample, called GD. The other two samples, called AE9 and AE10, were covered with hydrophobic layers prepared from a special organic-inorganic sol synthesized by the sol-gel method. Theoretically, because of the hydrophobic properties, biofouling formation should be delayed for AE9 and AE10. The biofouling development on each treated sample was compared with a non-treated control sample. The microbiological analyses were carried out over 24 hours by epifluorescence microscopy, which images a single layer.
NASA Astrophysics Data System (ADS)
Zhang, Linna; Li, Gang; Sun, Meixiu; Li, Hongxiao; Wang, Zhennan; Li, Yingxin; Lin, Ling
2017-11-01
Identifying whole blood as either human or nonhuman is an important responsibility for import-export ports and inspection and quarantine departments. Analytical methods and DNA testing are usually destructive. Previous studies demonstrated that visible diffuse reflectance spectroscopy can discriminate human from nonhuman blood without contact. An appropriate method for calibration set selection is very important for a robust quantitative model; in this paper, the Random Selection (RS) and Kennard-Stone (KS) methods were applied to select samples for the calibration set. Moreover, a proper chemometric method can greatly improve the performance of a classification or quantification model. Partial Least Squares Discriminant Analysis (PLSDA) is commonly used to identify blood species from spectroscopic data, while Least Squares Support Vector Machine (LSSVM) has proved excellent for discriminant analysis. In this research, both PLSDA and LSSVM were used for human blood discrimination. Compared with PLSDA, LSSVM enhanced the performance of the identification models. The overall results showed that LSSVM is the more feasible method for identifying human and animal blood species, and demonstrated that it is a reliable, robust, and more effective and accurate method for human blood identification.
Erchinger, Friedemann; Engjom, Trond; Gudbrandsen, Oddrun Anita; Tjora, Erling; Gilja, Odd H; Dimcevski, Georg
2016-01-01
We have recently evaluated a short endoscopic secretin test for exocrine pancreatic function. Bicarbonate concentration in duodenal juice is an important parameter in this test. Measurement of bicarbonate by back titration, the gold standard method, is time consuming, expensive and technically difficult, so a simplified method is warranted. We aimed to evaluate an automated spectrophotometric method in samples spanning the effective range of bicarbonate concentrations in duodenal juice, and to evaluate whether freezing samples before analysis affects the results. Patients routinely examined with the short endoscopic secretin test, suspected of decreased pancreatic function for various reasons, were included. Bicarbonate in duodenal juice was quantified by back titration and automated spectrophotometry; both fresh and thawed samples were analysed spectrophotometrically. In total, 177 samples from 71 patients were analysed. The correlation coefficient of all measurements was r = 0.98 (p < 0.001); for fresh versus frozen samples analysed by automated spectrophotometry (n = 25), r = 0.96 (p < 0.001). CONCLUSIONS: Measurement of bicarbonate in fresh and thawed samples by automated spectrophotometric analysis correlates excellently with the back-titration gold standard. This is a major simplification of direct pancreatic function testing and allows a wider distribution of bicarbonate testing in duodenal juice. Extreme bicarbonate concentrations obtained by the autoanalyser method have to be interpreted with caution.
Chhillar, Sumit; Acharya, Raghunath; Sodaye, Suparna; Pujari, Pradeep K
2014-11-18
We report simple particle induced gamma-ray emission (PIGE) methods using a 4 MeV proton beam for the simultaneous and nondestructive determination of the isotopic composition of boron (¹⁰B/¹¹B atom ratio) and the total boron concentration in various solid samples with natural isotopic composition and enriched in ¹⁰B. The approach involves measurement of prompt gamma-rays at 429, 718, and 2125 keV from the ¹⁰B(p,αγ)⁷Be, ¹⁰B(p,p'γ)¹⁰B, and ¹¹B(p,p'γ)¹¹B reactions, respectively. The isotopic composition of boron in natural and enriched samples was determined by comparing the peak area ratios corresponding to ¹⁰B and ¹¹B in the samples to those of a natural boric acid standard. An in situ current-normalized PIGE method, using F or Al, was standardized for total B concentration determination. The methods were validated by analyzing stoichiometric boron compounds and applied to samples such as boron carbide, boric acid, carborane, and borosilicate glass. Isotopic compositions of boron in the range of 0.247-2.0, corresponding to ¹⁰B in the range of 19.8-67.0 atom %, and total B concentrations in the range of 5-78 wt % were determined. It has been demonstrated that PIGE offers a simple alternative method for determining both total boron and isotopic composition in boron-based solid samples, including neutron absorbers that are important in nuclear technology.
Comparison of electrical conductivity calculation methods for natural waters
McCleskey, R. Blaine; Nordstrom, D. Kirk; Ryan, Joseph N.
2012-01-01
The capability of eleven methods to calculate the electrical conductivity of a wide range of natural waters from their chemical composition was investigated. A brief summary of each method is presented including equations to calculate the conductivities of individual ions, the ions incorporated, and the method's limitations. The ability of each method to reliably predict the conductivity depends on the ions included, effective accounting of ion pairing, and the accuracy of the equation used to estimate the ionic conductivities. The performances of the methods were evaluated by calculating the conductivity of 33 environmentally important electrolyte solutions, 41 U.S. Geological Survey standard reference water samples, and 1593 natural water samples. The natural waters tested include acid mine waters, geothermal waters, seawater, dilute mountain waters, and river water impacted by municipal waste water. The three most recent conductivity methods predict the conductivity of natural waters better than other methods. Two of the recent methods can be used to reliably calculate the conductivity for samples with pH values greater than about 3 and temperatures between 0 and 40°C. One method is applicable to a variety of natural water types with a range of pH from 1 to 10, temperature from 0 to 95°C, and ionic strength up to 1 m.
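The simplest class of methods compared here sums each ion's contribution, κ ≈ Σ λ_i c_i. The sketch below implements that baseline with standard infinite-dilution molar conductivities at 25 °C; it omits the temperature, ionic-strength, and ion-pairing corrections that distinguish the better-performing methods in the study.

```python
# Hedged sketch of the simplest conductivity calculation: sum each
# ion's concentration times its limiting molar conductivity. Real
# methods add temperature and ionic-strength corrections. The values
# below are standard 25 °C infinite-dilution constants (S cm^2/mol).
LAMBDA0 = {
    "H+": 349.8, "Na+": 50.1, "K+": 73.5, "Ca+2": 119.0, "Mg+2": 106.1,
    "Cl-": 76.3, "HCO3-": 44.5, "SO4-2": 160.0, "NO3-": 71.4,
}

def conductivity_uS_cm(molarities):
    """molarities: {ion: mol/L}; returns conductivity in uS/cm."""
    return sum(LAMBDA0[ion] * c for ion, c in molarities.items()) * 1000.0

# Dilute mountain water, roughly: expect a few tens of uS/cm.
water = {"Na+": 1e-4, "Ca+2": 1e-4, "Cl-": 1e-4, "HCO3-": 2e-4}
print(round(conductivity_uS_cm(water), 1))   # ~33 uS/cm
```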
Santiago, Paula; Jiménez-Belenguer, Ana; García-Hernández, Jorge; Estellés, Rosa Montes; Hernández Pérez, Manuel; Castillo López, M Angeles; Ferrús, María Antonia; Moreno, Yolanda
2018-01-01
Salmonella spp. is one of the most important causal agents of food-borne illness in developed countries, and its presence in irrigation water poses a risk to public health. Its detection in environmental samples is not easy when culture methods are used, and molecular techniques such as PCR or rRNA probe hybridization (Fluorescence in situ Hybridization, FISH) are outstanding alternatives. The aim of this work was to determine the environmental risk due to the presence of Salmonella spp. in wastewater by culture, PCR and FISH. A new Salmonella-specific rDNA probe was designed and its efficiency was compared with that of the other methods. Serotype and antibiotic resistance of the isolated strains were determined. Forty-five wastewater samples, collected from two secondary wastewater treatment plants, were analysed. Salmonella strains were isolated from 24 wastewater samples (53%), two of them after disinfection treatment. Twenty-three Salmonella strains exhibited resistance to one or more antimicrobial agents. PCR yielded positive results for Salmonella in 28 of the 45 wastewater samples (62%), and FISH analysis detected Salmonella in 27 (60%). Using molecular methods, Salmonella was detected in four samples after disinfection treatment. These results show the prevalence of Salmonella in reclaimed wastewater even after UV disinfection, which is a matter of public health concern, the high rates of antibiotic resistance, and the suitability of molecular methods for rapid detection. The FISH method, with the SA23 probe developed and assayed in this work, provides a tool for detecting Salmonella in water within a few hours and with a high rate of effectiveness.
Multimodal manifold-regularized transfer learning for MCI conversion prediction.
Cheng, Bo; Liu, Mingxia; Suk, Heung-Il; Shen, Dinggang; Zhang, Daoqiang
2015-12-01
As the early stage of Alzheimer's disease (AD), mild cognitive impairment (MCI) has a high chance of converting to AD. Effective prediction of such conversion from MCI to AD is of great importance for early diagnosis of AD and for evaluating AD risk pre-symptomatically. Unlike most previous methods, which used only samples from a target domain to train a classifier, in this paper we propose a novel multimodal manifold-regularized transfer learning (M2TL) method that jointly utilizes samples from another domain (e.g., AD vs. normal controls (NC)) as well as unlabeled samples to boost the performance of MCI conversion prediction. Specifically, the proposed M2TL method includes two key components. The first is a kernel-based maximum mean discrepancy criterion, which helps eliminate the potential negative effect induced by the distributional difference between the auxiliary domain (i.e., AD and NC) and the target domain (i.e., MCI converters (MCI-C) and MCI non-converters (MCI-NC)). The second is a semi-supervised multimodal manifold-regularized least squares classification method, in which the target-domain samples, the auxiliary-domain samples, and the unlabeled samples are jointly used to train the classifier. Furthermore, with the integration of a group sparsity constraint into the objective function, M2TL is capable of selecting the informative samples needed to build a robust classifier. Experimental results on the Alzheimer's Disease Neuroimaging Initiative (ADNI) database validate the effectiveness of the proposed method, which significantly improves the classification accuracy for MCI conversion prediction to 80.1% and outperforms the state-of-the-art methods.
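The first component has a simple empirical form: MMD² = E[k(x,x')] + E[k(y,y')] − 2E[k(x,y)]. The sketch below computes this (biased) statistic with an RBF kernel on synthetic feature matrices; it illustrates the criterion only, not the full M2TL objective.

```python
# Minimal sketch of the kernel maximum mean discrepancy (MMD) used to
# penalize auxiliary/target distribution shift. Biased empirical
# estimate with an RBF kernel; feature data are synthetic.
import numpy as np

def rbf(a, b, gamma=0.5):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(x, y, gamma=0.5):
    return (rbf(x, x, gamma).mean() + rbf(y, y, gamma).mean()
            - 2.0 * rbf(x, y, gamma).mean())

rng = np.random.default_rng(3)
aux = rng.normal(0.0, 1.0, size=(100, 5))      # e.g., AD/NC features
tgt = rng.normal(0.5, 1.0, size=(80, 5))       # e.g., MCI features, shifted
print(mmd2(aux, tgt))   # larger value = larger distribution mismatch
```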
Estimation of sample size and testing power (part 5).
Hu, Liang-ping; Bao, Xiao-lei; Guan, Xue; Zhou, Shi-guo
2012-02-01
Estimation of sample size and testing power is an important component of research design. This article introduces methods for estimating the sample size and testing power of difference tests for quantitative and qualitative data under the single-group, paired, and crossover designs. Specifically, it presents the corresponding formulas, their realization based on the formulas and on the POWER procedure of SAS software, and worked examples, which will benefit researchers in implementing the repetition principle.
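For the single-group or paired quantitative case, the underlying normal-approximation formula is n = ((z_{1−α/2} + z_{power})·σ/δ)². The sketch below implements it; the SAS POWER procedure's exact computations (e.g., using the noncentral t) may differ slightly.

```python
# Hedged sketch of the normal-approximation sample-size formula for a
# single-group or paired mean-difference test, rounded up.
import math
from scipy.stats import norm

def n_paired_difference(sigma, delta, alpha=0.05, power=0.80):
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    return math.ceil((z * sigma / delta) ** 2)

# Detect a mean paired difference of 2 units with SD 5:
print(n_paired_difference(sigma=5, delta=2))   # 50 subjects
```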
Gaussian process surrogates for failure detection: A Bayesian experimental design approach
NASA Astrophysics Data System (ADS)
Wang, Hongqiao; Lin, Guang; Li, Jinglai
2016-05-01
An important task of uncertainty quantification is to identify the probability of undesired events, in particular, system failures, caused by various sources of uncertainties. In this work we consider the construction of Gaussian process surrogates for failure detection and failure probability estimation. In particular, we consider the situation that the underlying computer models are extremely expensive, and in this setting, determining the sampling points in the state space is of essential importance. We formulate the problem as an optimal experimental design for Bayesian inferences of the limit state (i.e., the failure boundary) and propose an efficient numerical scheme to solve the resulting optimization problem. In particular, the proposed limit-state inference method is capable of determining multiple sampling points at a time, and thus it is well suited for problems where multiple computer simulations can be performed in parallel. The accuracy and performance of the proposed method is demonstrated by both academic and practical examples.
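A minimal sketch of the baseline surrogate idea follows: fit a Gaussian process to a handful of expensive limit-state evaluations, then run cheap Monte Carlo on the surrogate mean. The limit-state function, kernel, and design are illustrative; the paper's actual contribution, the optimal experimental design selecting the sampling points, is not reproduced here.

```python
# Hedged sketch: GP surrogate for a costly limit state g(x), then
# estimate P[g(x) < 0] by Monte Carlo on the surrogate posterior mean.
import numpy as np

def expensive_limit_state(x):             # stand-in for a costly model
    return 3.0 - np.sum(x ** 2, axis=-1)  # fails when |x| is large

def gp_posterior_mean(X, y, Xs, ell=1.0, noise=1e-8):
    def k(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ell ** 2)
    K = k(X, X) + noise * np.eye(len(X))
    return k(Xs, X) @ np.linalg.solve(K, y)

rng = np.random.default_rng(4)
X_train = rng.normal(size=(30, 2))               # 30 'expensive' runs
y_train = expensive_limit_state(X_train)

X_mc = rng.normal(size=(100_000, 2))             # cheap surrogate calls
p_fail = float(np.mean(gp_posterior_mean(X_train, y_train, X_mc) < 0.0))
print("estimated failure probability:", p_fail)
# True value for comparison: P[x1^2 + x2^2 > 3] = exp(-1.5) ~ 0.223
```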
Importance-sampling computation of statistical properties of coupled oscillators
NASA Astrophysics Data System (ADS)
Gupta, Shamik; Leitão, Jorge C.; Altmann, Eduardo G.
2017-07-01
We introduce and implement an importance-sampling Monte Carlo algorithm to study systems of globally coupled oscillators. Our computational method efficiently obtains estimates of the tails of the distribution of various measures of dynamical trajectories corresponding to states occurring with (exponentially) small probabilities. We demonstrate the general validity of our results by applying the method to two contrasting cases: the driven-dissipative Kuramoto model, a paradigm in the study of spontaneous synchronization; and the conservative Hamiltonian mean-field model, a prototypical system of long-range interactions. We present results for the distribution of the finite-time Lyapunov exponent and a time-averaged order parameter. Among other features, our results show most notably that the distributions exhibit a vanishing standard deviation but a skewness that is increasing in magnitude with the number of oscillators, implying that nontrivial asymmetries and states yielding rare or atypical values of the observables persist even for a large number of oscillators.
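For context, the observable in the Kuramoto case is the time-averaged order parameter r = |N⁻¹ Σ_j e^{iθ_j}|. The sketch below samples its distribution by plain (brute-force) Monte Carlo, the baseline whose exponentially rare tail values the importance-sampling algorithm is designed to reach; the algorithm itself is not reproduced, and all parameters are illustrative.

```python
# Brute-force Monte Carlo sampling of the time-averaged Kuramoto order
# parameter; rare tail values of this observable are what the paper's
# importance-sampling method targets.
import numpy as np

def time_averaged_r(N=50, K=1.5, T=50.0, dt=0.01, rng=None):
    rng = rng if rng is not None else np.random.default_rng()
    omega = rng.normal(size=N)                    # natural frequencies
    theta = rng.uniform(0, 2 * np.pi, size=N)
    r_sum, steps = 0.0, int(T / dt)
    for _ in range(steps):                        # Euler integration of
        z = np.exp(1j * theta).mean()             # the mean-field model
        r, psi = np.abs(z), np.angle(z)
        theta += dt * (omega + K * r * np.sin(psi - theta))
        r_sum += r
    return r_sum / steps

samples = [time_averaged_r(rng=np.random.default_rng(s)) for s in range(20)]
print(np.mean(samples), np.std(samples))  # the tails need importance sampling
```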
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Donald R.; Shutthanandan, Vaithiyalingam
Nano-sized objects are increasingly important as biomaterials, and their surfaces play critical roles in determining their beneficial or deleterious behaviors in biological systems. Important characteristics of nanomaterials that impact their application in many areas are described, with a strong focus on the importance of particle surfaces and surface characterization. Understanding the inherent nature of nano-objects and the important role that surfaces play in these applications is a universal need for any research or product development using such materials in biological applications. The role of surface analysis methods in collecting critical information about the nature of particle surfaces and the physicochemical properties of nano-objects is described, along with the importance of including sample history and analysis results in a record of provenance information for specific batches of nano-objects.
Dotsinsky, Ivan
2005-01-01
Background: Public access defibrillators (PADs) are now available for more efficient and rapid treatment of out-of-hospital sudden cardiac arrest. PADs are normally used by untrained people on the streets and in sports centers, airports, and other public areas. Therefore, automated detection of ventricular fibrillation, or its exclusion, is of high importance. A special case exists at railway stations, where electric power-line frequency interference is significant. Many countries, especially in Europe, use 16.7 Hz AC traction power, which introduces high-level frequency-varying interference that may compromise fibrillation detection. Method: Moving signal averaging is often used for 50/60 Hz interference suppression when its effect on the ECG spectrum is of little importance (no morphological analysis is performed). This approach may also be applied to the railway situation if the interference frequency is continuously measured so as to synchronize the analog-to-digital conversion (ADC) by introducing variable inter-sample intervals. A better solution consists of fixed-rate ADC, software frequency measurement, internal irregular re-sampling according to the interference frequency, and a moving average over a constant number of samples, followed by regular back re-sampling. Results: The proposed method leads to total cancellation of the railway interference, together with suppression of inherent noise, while the peak amplitudes of some sharp complexes are reduced. This reduction has a negligible effect on accurate fibrillation detection. Conclusion: The method is developed in the MATLAB environment and represents a useful tool for real-time railway interference suppression.
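A minimal numerical sketch of the described pipeline, with the frequency measurement assumed done elsewhere, follows: re-sample the fixed-rate signal onto a grid synchronous with the interference, average over exactly one interference period, and re-sample back to the uniform grid. All parameter values are illustrative.

```python
# Hedged sketch of frequency-synchronous moving-average suppression of
# 16.7 Hz railway interference. Frequency estimation is assumed given.
import numpy as np

def suppress_interference(sig, fs, f_int, n_per_period=20):
    t = np.arange(len(sig)) / fs
    # Synchronous grid: exactly n_per_period points per interference period.
    t_sync = np.arange(0, t[-1], 1.0 / (f_int * n_per_period))
    resampled = np.interp(t_sync, t, sig)
    kernel = np.ones(n_per_period) / n_per_period   # spans one full period
    averaged = np.convolve(resampled, kernel, mode="same")
    return np.interp(t, t_sync, averaged)           # back to uniform grid

fs = 500.0                                          # Hz, ECG sampling rate
t = np.arange(0, 4, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t)                   # toy 'ECG'
noisy = ecg + 0.8 * np.sin(2 * np.pi * 16.7 * t)    # railway interference
clean = suppress_interference(noisy, fs, f_int=16.7)
print(np.max(np.abs(clean - ecg)[200:-200]))        # small residual (edges excluded)
```

Because the averaging window covers exactly one interference period on the synchronous grid, the 16.7 Hz component sums to zero regardless of its phase; the slowly varying ECG content passes through with only slight attenuation, consistent with the reduced peak amplitudes noted above.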
NASA Astrophysics Data System (ADS)
Zhang, Hua; Yang, Hui; Li, Hongxing; Huang, Guangnan; Ding, Zheyi
2018-04-01
The attenuation of random noise is important for improving the signal to noise ratio (SNR). However, the precondition for most conventional denoising methods is that the noisy data must be sampled on a uniform grid, making the conventional methods unsuitable for non-uniformly sampled data. In this paper, a denoising method capable of regularizing the noisy data from a non-uniform grid to a specified uniform grid is proposed. Firstly, the denoising method is performed for every time slice extracted from the 3D noisy data along the source and receiver directions, then the 2D non-equispaced fast Fourier transform (NFFT) is introduced in the conventional fast discrete curvelet transform (FDCT). The non-equispaced fast discrete curvelet transform (NFDCT) can be achieved based on the regularized inversion of an operator that links the uniformly sampled curvelet coefficients to the non-uniformly sampled noisy data. The uniform curvelet coefficients can be calculated by using the inversion algorithm of the spectral projected-gradient for ℓ1-norm problems. Then local threshold factors are chosen for the uniform curvelet coefficients for each decomposition scale, and effective curvelet coefficients are obtained respectively for each scale. Finally, the conventional inverse FDCT is applied to the effective curvelet coefficients. This completes the proposed 3D denoising method using the non-equispaced curvelet transform in the source-receiver domain. The examples for synthetic data and real data reveal the effectiveness of the proposed approach in applications to noise attenuation for non-uniformly sampled data compared with the conventional FDCT method and wavelet transformation.
Mauté, Carole; Nibourel, Olivier; Réa, Delphine; Coiteux, Valérie; Grardel, Nathalie; Preudhomme, Claude; Cayuela, Jean-Michel
2014-09-01
Until recently, diagnostic laboratories that wanted to report on the international scale had limited options: they had to align their BCR-ABL1 quantification methods through a sample exchange with a reference laboratory to derive a conversion factor. However, commercial methods calibrated on the World Health Organization genetic reference panel are now available. We report results from a study designed to assess the comparability of the two alignment strategies. Sixty follow-up samples from chronic myeloid leukemia patients were included. Two commercial methods calibrated on the genetic reference panel were compared to two conversion factor methods routinely used at Saint-Louis Hospital, Paris, and at Lille University Hospital. Results were matched against concordance criteria (i.e., obtaining at least two of the three following landmarks: 50, 75 and 90% of the patient samples within a 2-fold, 3-fold and 5-fold range, respectively). Out of the 60 samples, more than 32 were available for comparison. Compared to the conversion factor method, the two commercial methods were within a 2-fold, 3-fold and 5-fold range for 53 and 59%, 89 and 88%, and 100 and 97%, respectively, of the samples analyzed at Saint-Louis. At Lille, the results were 45 and 85%, 76 and 97%, and 100 and 100%, respectively. Agreement between methods was observed in all four comparisons performed. Our data show that the two selected commercial methods are concordant with the conversion factor methods. This study provides proof of principle that alignment on the international scale using the genetic reference panel is compatible with the patient sample exchange procedure. We believe these results are particularly important for diagnostic laboratories wishing to adopt commercial methods.
Trends in tungsten coil atomic spectrometry
NASA Astrophysics Data System (ADS)
Donati, George L.
Renewed interest in electrothermal atomic spectrometric methods based on tungsten coil atomizers is a consequence of a worldwide increase in demand for fast, inexpensive, sensitive, and portable analytical methods for trace analysis. In this work, tungsten coil atomic absorption spectrometry (WCAAS) and tungsten coil atomic emission spectrometry (WCAES) are used to determine several different metals, and even a non-metal, at low levels in different samples. Improvements in instrumentation and new strategies to reduce matrix effects and background signals are presented. Investigation of the main factors affecting both WCAAS and WCAES analytical signals points to the importance of a reducing, high-temperature gas phase in the processes leading to atomic cloud generation. Some more refractory elements, such as V and Ti, were determined for the first time by double tungsten coil atomic emission spectrometry (DWCAES). The higher temperatures provided by two atomizers in DWCAES also allowed the detection of Ag, Cu and Sn emission signals for the first time. Simultaneous determination of several elements by WCAES in relatively complex sample matrices was possible after a simple acid extraction. The results show the potential of this method as an alternative to more traditional, expensive methods for fast, more effective analyses and applications in the field. The development of a new metallic atomization cell is also presented. Lower limits of detection in both WCAAS and WCAES determinations were obtained owing to better control of the background signal and a smaller, more isothermal system that keeps the atomic cloud in the optical path for a longer period of time. Tungsten coil-based methods are especially well suited to applications requiring low sample volume, low cost, sensitivity and portability. Both WCAAS and WCAES have great commercial potential in fields as diverse as archeology and industrial quality control. They are simple, inexpensive, effective methods for trace metal determination in many different samples, representing an important asset in today's analytical chemistry.
Do placebo based validation standards mimic real batch products behaviour? Case studies.
Bouabidi, A; Talbi, M; Bouklouze, A; El Karbane, M; Bourichi, H; El Guezzar, M; Ziemons, E; Hubert, Ph; Rozet, E
2011-06-01
Analytical method validation is a mandatory step in evaluating the ability of developed methods to provide accurate results in routine application. Validation usually involves validation standards or quality control samples that are prepared in placebo or reconstituted matrix made of a mixture of all the ingredients composing the drug product except the active substance or the analyte under investigation. However, one of the main concerns with this approach is that it may miss an important source of variability coming from the manufacturing process. The question that remains at the end of the validation step is whether the quantitative performance transfers from validation standards to real, authentic drug product samples. In this work, this topic is investigated through three case studies. Three analytical methods were validated using the commonly spiked placebo validation standards at several concentration levels, as well as using samples from authentic batches (tablets and syrups). The results showed that, depending on the type of response function used as the calibration curve, there were various degrees of difference in the accuracy of results obtained with the two types of samples. Nonetheless, spiked placebo validation standards were shown to mimic relatively well the quantitative behaviour of the analytical methods with authentic batch samples. Adding authentic batch samples to the validation design may help the analyst select and confirm the most fit-for-purpose calibration curve and thus increase the accuracy and reliability of the results generated by the method in routine application.
NASA Astrophysics Data System (ADS)
Sabatini, Francesca; Lluveras-Tenorio, Anna; Degano, Ilaria; Kuckova, Stepanka; Krizova, Iva; Colombini, Maria Perla
2016-11-01
This study deals with the identification of anthraquinoid molecular markers in standard dyes, reference lakes, and paint model systems using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-ToF-MS), a micro-invasive and nondestructive technique. Red anthraquinoid lakes, such as madder lake, carmine lake, and Indian lac, have been the most widely used for painting purposes since ancient times. From an analytical point of view, identifying lakes in paint samples is challenging, and developing methods that maximize the information achievable while minimizing the amount of sample needed is of paramount importance. The employed method was tested on less than 0.5 mg of reference samples and required minimal sample preparation, entailing a hydrofluoric acid extraction. The method is fast and versatile because the same sample, once spotted on the steel plate, can be re-analyzed in both positive and negative modes within a few minutes. The MALDI mass spectra collected in the two analysis modes were studied and compared with LDI and simulated mass spectra in order to highlight the peculiar behavior of the anthraquinones in the MALDI process. Both ionization modes were assessed for each species. The effect of the different paint binders on dye identification was also evaluated through the analyses of paint model systems. In the end, the method succeeded in detecting madder lake in archeological samples from Greek wall paintings and on an Italian funerary clay vessel, demonstrating its capability to identify dyes in small amounts of highly degraded samples.
Derogis, Priscilla Bento Matos; Sanches, Livia Rentas; de Aranda, Valdir Fernandes; Colombini, Marjorie Paris; Mangueira, Cristóvão Luis Pitangueira; Katz, Marcelo; Faulhaber, Adriana Caschera Leme; Mendes, Claudio Ernesto Albers; Ferreira, Carlos Eduardo Dos Santos; França, Carolina Nunes; Guerra, João Carlos de Campos
2017-01-01
Rivaroxaban is an oral direct factor Xa inhibitor, therapeutically indicated in the treatment of thromboembolic diseases. As with other new oral anticoagulants, routine monitoring of rivaroxaban is not necessary, but it is important in some clinical circumstances. In our study, a high-performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS) method was validated to measure plasma concentrations of rivaroxaban. Our method used simple sample preparation (protein precipitation) and a fast chromatographic run. The resulting method was precise and accurate, with a linear range from 2 to 500 ng/mL and a lower limit of quantification of 4 pg on column. The new method was compared to a reference method (anti-factor Xa activity), and the two presented good correlation (r = 0.98, p < 0.001). In addition, we validated rivaroxaban measurement by HPLC-MS/MS in hemolytic, icteric, and lipemic plasma samples without interference. The chromogenic and HPLC-MS/MS methods were highly correlated and should be used as clinical tools for drug monitoring. The method was applied successfully in a group of 49 real-life patients, allowing accurate determination of rivaroxaban peak and trough levels.
Mohebifar, Rafat; Hasani, Hana; Barikani, Ameneh; Rafiei, Sima
2016-08-01
Providing high service quality is one of the main functions of health systems. Measuring service quality is the basic prerequisite for improving quality. The aim of this study was to evaluate the quality of service in teaching hospitals using an importance-performance analysis matrix. A descriptive-analytic study was conducted through a cross-sectional method in six academic hospitals of Qazvin, Iran, in 2012. A total of 360 patients contributed to the study. The sampling technique was stratified random sampling. Required data were collected based on a standard questionnaire (SERVQUAL). Data analysis was done with SPSS version 18 statistical software and the importance-performance analysis matrix. The results showed a significant gap between importance and performance in all five dimensions of service quality (p < 0.05). The "reliability" (2.36) and "assurance" (2.24) dimensions had the highest quality gaps, and "responsiveness" had the lowest (1.97). According to the findings, reliability and assurance were in Quadrant I, empathy was in Quadrant II, and tangibles and responsiveness were in Quadrant IV of the importance-performance matrix. The negative gap in all dimensions of quality shows that quality improvement is necessary in all dimensions. Using quality measurement and diagnostic instruments such as importance-performance analysis will help hospital managers plan service quality improvements and achieve long-term goals.
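The quadrant assignments reported above follow from comparing each dimension's importance and performance scores against the grand means. A minimal sketch of that logic, with placeholder scores rather than the study's data (quadrant labels follow one common IPA convention):

```python
import numpy as np

# Placeholder importance/performance scores for the five SERVQUAL
# dimensions (not the study's data).
dims = ["tangibles", "reliability", "responsiveness", "assurance", "empathy"]
importance = np.array([4.1, 4.8, 4.2, 4.7, 4.4])
performance = np.array([2.9, 2.4, 2.3, 2.5, 2.6])

gap = importance - performance          # positive gap = quality shortfall
imp_cut, perf_cut = importance.mean(), performance.mean()

for dim, imp, perf, g in zip(dims, importance, performance, gap):
    if imp >= imp_cut and perf < perf_cut:
        quadrant = "I (concentrate here)"
    elif imp >= imp_cut:
        quadrant = "II (keep up the good work)"
    elif perf < perf_cut:
        quadrant = "III (low priority)"
    else:
        quadrant = "IV (possible overkill)"
    print(f"{dim}: gap = {g:.2f}, quadrant {quadrant}")
```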
NASA Technical Reports Server (NTRS)
Hiroi, T.; Sasaki, S.; Noble, S. K.; Pieters, C. M.
2011-01-01
As the most abundant meteorites in our collections, ordinary chondrites potentially have very important implications for the origin and formation of our Solar System. In order to map the distribution of ordinary chondrite-like asteroids through remote sensing, the space weathering effects on ordinary chondrite parent bodies must be addressed through experiments and modeling. Of particular importance is the impact of space weathering on distinguishing the different types (H/L/LL) of ordinary chondrites. In addition, samples of asteroid Itokawa returned by the Hayabusa spacecraft may reveal the mechanism of space weathering on an LL-chondrite parent body. Results of space weathering simulations on ordinary chondrites and implications for Itokawa samples are presented here.
Reinharz, Vladimir; Ponty, Yann; Waldispühl, Jérôme
2013-07-01
The design of RNA sequences folding into predefined secondary structures is a milestone for many synthetic biology and gene therapy studies. Most of the current software uses similar local search strategies (i.e., a random seed is progressively adapted to acquire the desired folding properties) and, more importantly, does not allow the user to explicitly control the nucleotide distribution, such as the GC-content, of the designed sequences. However, the latter is an important criterion for large-scale applications, as it could presumably be used to design sequences with better transcription rates and/or structural plasticity. In this article, we introduce IncaRNAtion, a novel algorithm to design RNA sequences folding into target secondary structures with a predefined nucleotide distribution. IncaRNAtion uses a global sampling approach and weighted sampling techniques. We show that our approach is fast (i.e., running time comparable to or better than that of local search methods), seedless (we remove the bias of the seed in local search heuristics), and successfully generates high-quality (i.e., thermodynamically stable) sequences for any GC-content. To complete this study, we develop a hybrid method combining our global sampling approach with local search strategies. Remarkably, this "glocal" methodology outperforms both local and global approaches for sampling sequences with a specific GC-content and target structure. IncaRNAtion is available at csb.cs.mcgill.ca/incarnation/. Supplementary data are available at Bioinformatics online.
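The weighted-sampling idea can be illustrated in isolation: bias the per-nucleotide draw weights so the expected GC-content matches a target. This toy sketch shows only that weighting step, not IncaRNAtion's structure-aware global sampling:

```python
import random

def sample_sequence(length: int, gc_target: float, seed: int = 0) -> str:
    """Draw an RNA sequence whose expected GC-content equals gc_target by
    weighting G/C draws against A/U draws. Illustrates the weighting idea
    only; IncaRNAtion couples such weights with structure-aware sampling."""
    rng = random.Random(seed)
    weights = [gc_target / 2, gc_target / 2,
               (1 - gc_target) / 2, (1 - gc_target) / 2]
    return "".join(rng.choices("GCAU", weights=weights, k=length))

seq = sample_sequence(60, gc_target=0.4)
gc = (seq.count("G") + seq.count("C")) / len(seq)
print(seq)
print(f"realized GC-content: {gc:.2f}")
```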
Studzińska, Maria Bernadeta; Demkowska-Kutrzepa, Marta; Borecka, Anna; Meisner, Michał; Tomczuk, Krzysztof; Roczeń-Karczmarz, Monika; Kłapeć, Teresa; Abbass, Zahrai; Cholewa, Alicja
2017-09-01
Companion animals are an important part of human life. However, they may also be considered a source of pathogens. Examples of zoonotic parasitoses are toxocarosis and cutaneous larva migrans (CLM). The aim of the study was to detect zoonotic nematodes of dogs living in different areas and to assess the intensity of contamination in parasite-polluted environments that are hazardous to human health. The fecal samples were examined using standard flotation and decantation methods as well as McMaster's quantitative technique. The soil samples in urban and rural areas were examined using a modified flotation method as described by Quinn et al. Statistical analyses were performed with IBM SPSS Statistics Version 23. The overall prevalence of parasites in dogs was 38% (17.02% in urban and 56.60% in rural areas). The percentage of nematodes important for human health (Toxocara canis, Ancylostomatidae, Trichuris vulpis) remained at the same level (16%). Infections with a single parasite species dominated; the most common was T. canis (28.95%). In total, 54.30% of the soil samples were contaminated with parasite eggs. The contamination of urban and rural sandpits was 40% and 60%, respectively. Molecular examination of soil samples using LAMP (loop-mediated isothermal amplification) confirmed the presence of nematode eggs of the species T. canis in all samples previously classified as positive.
Ort, Christoph; Lawrence, Michael G; Rieckermann, Jörg; Joss, Adriano
2010-08-15
The analysis of 87 peer-reviewed journal articles reveals that sampling for pharmaceuticals and personal care products (PPCPs) and illicit drugs in sewers and sewage treatment plant influents is mostly carried out according to existing tradition or standard laboratory protocols. Less than 5% of all studies explicitly consider internationally acknowledged guidelines or methods for the experimental design of monitoring campaigns. In the absence of a proper analysis of the system under investigation, the importance of short-term pollutant variations was typically not addressed. Therefore, due to relatively long sampling intervals, potentially inadequate sampling modes, or insufficient documentation, it remains unclear for the majority of reviewed studies whether observed variations can be attributed to "real" variations or if they simply reflect sampling artifacts. Based on results from previous and current work, the present paper demonstrates that sampling errors can lead to overinterpretation of measured data and ultimately, wrong conclusions. Depending on catchment size, sewer type, sampling setup, substance of interest, and accuracy of analytical method, avoidable sampling artifacts can range from "not significant" to "100% or more" for different compounds even within the same study. However, in most situations sampling errors can be reduced greatly, and sampling biases can be eliminated completely, by choosing an appropriate sampling mode and frequency. This is crucial, because proper sampling will help to maximize the value of measured data for the experimental assessment of the fate of PPCPs as well as for the formulation and validation of mathematical models. The trend from reporting presence or absence of a compound in "clean" water samples toward the quantification of PPCPs in raw wastewater requires not only sophisticated analytical methods but also adapted sampling methods. With increasing accuracy of chemical analyses, inappropriate sampling increasingly represents the major source of inaccuracy. A condensed step-by-step Sampling Guide is proposed as a starting point for future studies.
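The effect of sampling interval on composite-sample accuracy is easy to demonstrate by simulation: a wastewater signal dominated by short pulses is badly represented by sparse composite sampling. The signal model below is invented for illustration and is not taken from the reviewed studies:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(24 * 60)                         # one day at 1-min resolution

# Synthetic influent concentration: a diurnal baseline plus short pulses
# (e.g. individual toilet flushes carrying a pharmaceutical). Purely
# illustrative; not data from any reviewed study.
diurnal = 1.0 + 0.5 * np.sin(2 * np.pi * (t / 1440.0 - 0.25))
pulses = rng.random(t.size) < 0.01
signal = diurnal + 5.0 * pulses

true_mean = signal.mean()
for interval in (5, 30, 120):                  # sampling interval in minutes
    offset = int(rng.integers(interval))
    composite = signal[offset::interval].mean()
    err = 100.0 * (composite - true_mean) / true_mean
    print(f"{interval:4d}-min composite: error {err:+.1f}%")
```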
Johnston, Lisa G; McLaughlin, Katherine R; Rhilani, Houssine El; Latifi, Amina; Toufik, Abdalla; Bennani, Aziza; Alami, Kamal; Elomari, Boutaina; Handcock, Mark S
2015-01-01
Background: Respondent-driven sampling is used worldwide to estimate the population prevalence of characteristics such as HIV/AIDS and associated risk factors in hard-to-reach populations. Estimating the total size of these populations is of great interest to national and international organizations; however, reliable measures of population size often do not exist. Methods: Successive Sampling-Population Size Estimation (SS-PSE), along with network size imputation, allows population size estimates to be made without relying on separate studies or additional data (as in network scale-up, multiplier, and capture-recapture methods), which may be biased. Results: Ten population size estimates were calculated for people who inject drugs, female sex workers, men who have sex with men, and migrants from sub-Saharan Africa in six different cities in Morocco. SS-PSE estimates fell within or very close to the likely values provided by experts and the estimates from previous studies using other methods. Conclusions: SS-PSE is an effective method for estimating the size of hard-to-reach populations that leverages important information within respondent-driven sampling studies. The addition of a network size imputation method helps to smooth network sizes, allowing for more accurate results. However, caution should be used, particularly when there is reason to believe that clustered subgroups may exist within the population of interest or when the sample size is small in relation to the population. PMID:26258908
Reducing acquisition times in multidimensional NMR with a time-optimized Fourier encoding algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Zhiyong; Smith, Pieter E. S.
Speeding up the acquisition of multidimensional nuclear magnetic resonance (NMR) spectra is an important topic in contemporary NMR, with central roles in high-throughput investigations and analyses of marginally stable samples. A variety of fast NMR techniques have been developed, including methods based on non-uniform sampling and Hadamard encoding, that overcome the long sampling times inherent to schemes based on fast-Fourier-transform (FFT) methods. Here, we explore the potential of an alternative fast acquisition method that leverages a priori knowledge to tailor polychromatic pulses and customized time delays for an efficient Fourier encoding of the indirect domain of an NMR experiment. By porting the encoding of the indirect domain to the excitation process, this strategy avoids potential artifacts associated with non-uniform sampling schemes and uses a minimum number of scans equal to the number of resonances present in the indirect dimension. An added convenience is afforded by the fact that a usual 2D FFT can be used to process the generated data. Acquisitions of 2D heteronuclear correlation NMR spectra on quinine and on the anti-inflammatory drug isobutyl propionic phenolic acid illustrate the new method's performance. The method can be readily automated to deal with complex samples such as those occurring in metabolomics, as well as in in-cell and in vivo NMR applications, where speed and temporal stability are often primary concerns.
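The core idea, using a priori knowledge of resonance frequencies so that as many increments as resonances suffice, can be illustrated with a toy linear-algebra sketch; this shows only the general encode-and-solve principle, not the authors' polychromatic-pulse implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# A priori known indirect-dimension frequencies (Hz) -- the knowledge the
# method assumes -- and the unknown amplitudes to recover.
freqs = np.array([120.0, 340.0, 610.0])
true_amps = np.array([1.0, 0.4, 0.7])

# One increment per resonance: evaluate the Fourier kernel at three
# tailored delays (values illustrative).
t = np.arange(len(freqs)) * 1.0e-3             # seconds
A = np.exp(2j * np.pi * np.outer(t, freqs))
signal = A @ true_amps + rng.normal(0.0, 0.01, len(t))

# With as many increments as resonances, the amplitudes follow from a
# single linear solve instead of a long, uniformly sampled FFT grid.
print(np.linalg.solve(A, signal).real.round(3))
```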
High-Resolution Detection of Identity by Descent in Unrelated Individuals
Browning, Sharon R.; Browning, Brian L.
2010-01-01
Detection of recent identity by descent (IBD) in population samples is important for population-based linkage mapping and for highly accurate genotype imputation and haplotype-phase inference. We present a method for detection of recent IBD in population samples. Our method accounts for linkage disequilibrium between SNPs to enable full use of high-density SNP data. We find that our method can detect segments 2 cM in length with moderate power and negligible false discovery rate in Illumina 550K data in Northwestern Europeans. We compare our method with GERMLINE and PLINK, and we show that our method has a level of resolution that is significantly better than these existing methods, thus extending the usefulness of recent IBD in analysis of high-density SNP data. We survey four genomic regions in a sample of UK individuals of European descent and find that on average, at a given location, our method detects IBD in 2.7 per 10,000 pairs of individuals in Illumina 550K data. We also present methodology and results for detection of homozygosity by descent (HBD) and survey the whole genome in a sample of 1373 UK individuals of European descent. We detect HBD in 4.7 individuals per 10,000 on average at a given location. Our methodology is implemented in the freely available BEAGLE software package. PMID:20303063
Fiuzy, Mohammad; Haddadnia, Javad; Mollania, Nasrin; Hashemian, Maryam; Hassanpour, Kazem
2012-01-01
Background: Accurate diagnosis of breast cancer is of prime importance. The Fine Needle Aspiration (FNA) test, which has been used for several years in Europe, is a simple, inexpensive, noninvasive and accurate technique for detecting breast cancer. Selecting suitable features from FNA results is the most important diagnostic problem in the early stages of breast cancer. In this study, we introduce a new algorithm that can detect breast cancer by combining an artificial intelligence system with FNA. Methods: We studied features of the Wisconsin Breast Cancer Database (WDBC), which contains 569 FNA test samples (212 malignant patient samples and 357 benign healthy samples). We combined artificial intelligence approaches, namely an Evolutionary Algorithm (EA) with a Genetic Algorithm (GA), and used an exact classifier system (Fuzzy C-Means, FCM) to separate malignant from benign samples. Furthermore, we examined artificial Neural Networks (NN) to identify the model and structure. This research proposes a new algorithm for accurate diagnosis of breast cancer. Results: In the WDBC database, 62.75% of samples were benign and 37.25% were malignant. After applying the proposed algorithm, we achieved a high detection accuracy of about 96.579% on 205 patients diagnosed with breast cancer. The method had 93% sensitivity, 73% specificity, 65% positive predictive value, and 95% negative predictive value. Conclusion: Such a smart, economical, non-invasive, rapid and accurate system can be introduced as a useful diagnostic system for comprehensive treatment of breast cancer. Another advantage of this method is the possibility of diagnosing breast abnormalities. If done by experts, FNA can be a reliable replacement for open biopsy in palpable breast masses, and evaluation of FNA samples during aspiration can decrease insufficient sampling. FNA can be the first line of diagnosis in women with breast masses, at least in deprived regions, and may increase health standards and clinical supervision of patients. PMID:25352966
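The reported sensitivity, specificity, and predictive values all derive from a 2x2 confusion matrix. A minimal sketch of those computations, with hypothetical counts rather than the study's tallies:

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int) -> dict:
    """Standard diagnostic metrics from a 2x2 confusion matrix."""
    return {
        "sensitivity": tp / (tp + fn),    # true positive rate
        "specificity": tn / (tn + fp),    # true negative rate
        "ppv": tp / (tp + fp),            # positive predictive value
        "npv": tn / (tn + fn),            # negative predictive value
        "accuracy": (tp + tn) / (tp + fp + tn + fn),
    }

# Hypothetical counts for illustration; not the study's actual tallies.
print(diagnostic_metrics(tp=198, fp=25, tn=330, fn=16))
```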
Mansion-de Vries, Elisabeth Maria; Knorr, Nicole; Paduch, Jan-Hendrik; Zinke, Claudia; Hoedemaker, Martina; Krömker, Volker
2014-03-01
Clinical mastitis is one of the most common and expensive diseases of dairy cattle. To make an informed treatment decision, it is important to know the causative pathogen. However, no bacterial growth can be detected in approximately 30% of all clinical cases of mastitis. Before selecting the treatment regimen, it is important to know whether the mastitis-causing pathogen (MCP) is Gram-positive or Gram-negative. The aim of this field study was to investigate whether using two 3M Petrifilm™ products on-farm (which conveys a higher degree of sample freshness but also bears a higher risk of contamination than working in a lab) as a 24-h rapid diagnostic for clinical mastitis achieved results comparable to the conventional microbiological diagnostic method. AerobicCount (AC)-Petrifilm™ and ColiformCount (CC)-Petrifilm™ were used to identify total bacterial counts and Gram-negative bacteria, respectively, in samples from clinical mastitis cases. Absence of growth on both plates was classified as no bacterial detection. Growth only on the AC-Petrifilm™ was assessed as Gram-positive, and growth on both Petrifilm™ plates was assessed as Gram-negative bacterial growth. Additionally, milk samples were analysed by the conventional microbiological diagnostic method on aesculin blood agar as a reference. Overall, 616 samples from clinical mastitis cases were analysed. Using the reference method, Gram-positive and Gram-negative bacteria, mixed bacterial growth, contaminated samples and yeast were determined in 32.6%, 20.0%, 2.5%, 14.1% and 1.1% of the samples, respectively. In 29.7% of the samples, microbiological growth could not be identified. Using the Petrifilm™ concept, bacterial growth was detected in 59% of the culture-negative samples. The sensitivity of the Petrifilm™ for Gram-positive and Gram-negative MCP was 85.2% and 89.9%, respectively. The specificity was 75.4% for Gram-positive and 88.4% for Gram-negative MCP. For the culture-negative samples, sensitivity was 41.0% and specificity was 91.0%. The results indicate that the Petrifilm™ concept is suitable for therapeutic decision-making at the farm level or in veterinary practice. As this concept does not allow any statement about the genus or species of microorganisms, relevant MCP should be assessed periodically at the herd level with conventional microbiological diagnostics. Copyright © 2014 Elsevier B.V. All rights reserved.
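The two-plate decision rule described above reduces to a small lookup, sketched here; the handling of the uncovered case (growth on CC only) is an assumption, not part of the published rule:

```python
def petrifilm_gram_call(growth_ac: bool, growth_cc: bool) -> str:
    """Classify a clinical mastitis milk sample from growth on the
    AerobicCount (AC) and ColiformCount (CC) Petrifilm plates, per the
    rule in the abstract. The CC-only branch is not covered by the
    published rule; flagging it is our assumption."""
    if not growth_ac and not growth_cc:
        return "no bacterial detection"
    if growth_ac and not growth_cc:
        return "Gram-positive growth"
    if growth_ac and growth_cc:
        return "Gram-negative growth"
    return "inconclusive -- re-test (assumption)"

print(petrifilm_gram_call(growth_ac=True, growth_cc=False))  # Gram-positive
```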
2010-01-01
Background: An important challenge in conducting social research of specific relevance to harm reduction programs is locating hidden populations of consumers of substances like cannabis who typically report few adverse or unwanted consequences of their use. Much of the deviant, pathologized perception of drug users is historically derived from, and empirically supported by, a research emphasis on gaining ready access to users in drug treatment or in prison populations with a higher incidence of problems of dependence and misuse. Because they are less visible, responsible recreational users of illicit drugs have been more difficult to study. Methods: This article investigates respondent-driven sampling (RDS) as a method of recruiting experienced marijuana users representative of users in the general population. Based on sampling conducted in a multi-city study (Halifax, Montreal, Toronto, and Vancouver), and compared to samples gathered using other research methods, we assess the strengths and weaknesses of RDS recruitment as a means of gaining access to illicit substance users who experience few harmful consequences of their use. Demographic characteristics of the sample in Toronto are compared with those of users in a recent household survey and a pilot study of Toronto, where the latter utilized nonrandom self-selection of respondents. Results: A modified approach to RDS was necessary to attain the target sample size in all four cities (i.e., 40 'users' from each site). The final sample in Toronto was largely similar, however, to marijuana users in a random household survey that was carried out in the same city. Whereas well-educated, married, whites and females in the survey were all somewhat overrepresented, the two samples, overall, were more alike than different with respect to economic status and employment. Furthermore, comparison with a self-selected sample suggests that (even modified) RDS recruitment is a cost-effective way of gathering respondents who are more representative of users in the general population than nonrandom methods of recruitment ordinarily produce. Conclusions: Research on marijuana use, and other forms of drug use hidden in the general population of adults, is important for informing and extending harm reduction beyond its current emphasis on 'at-risk' populations. Expanding harm reduction in a normalizing context, through innovative research on users often overlooked, further challenges assumptions about reducing harm through prohibition of drug use and urges consideration of alternative policies such as decriminalization and legal regulation. PMID:20618944
Damage evolution analysis of coal samples under cyclic loading based on single-link cluster method
NASA Astrophysics Data System (ADS)
Zhang, Zhibo; Wang, Enyuan; Li, Nan; Li, Xuelong; Wang, Xiaoran; Li, Zhonghui
2018-05-01
In this paper, the acoustic emission (AE) response of coal samples under cyclic loading is measured. The results show that there is a good positive relation between AE parameters and stress. The AE signal of coal samples under cyclic loading exhibits an obvious Kaiser effect. The single-link cluster (SLC) method is applied to analyze the spatial evolution characteristics of AE events and the damage evolution process of coal samples. It is found that the subset scale of the SLC structure becomes progressively smaller as the number of loading cycles increases, and there is a negative linear relationship between the subset scale and the degree of damage. The spatial correlation length ξ of the SLC structure is calculated. The results show that ξ fluctuates around a certain value from the second to the fifth cyclic loading process, but increases clearly in the sixth loading process. Based on the criterion of microcrack density, the coal sample failure process is a transformation from small-scale damage to large-scale damage, which is the reason for the changes in the spatial correlation length. This systematic analysis shows that the SLC method is an effective tool for researching the damage evolution process of coal samples under cyclic loading and provides important reference values for studying coal bursts.
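Single-link clustering of AE event locations is available off the shelf; a minimal sketch on synthetic 3-D hypocenters, with an illustrative link cut-off (the paper's actual distance criterion may differ):

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Synthetic 3-D AE event locations (mm) around three damage zones;
# an illustrative stand-in for located hypocenters.
events = np.vstack([rng.normal(c, 2.0, size=(40, 3)) for c in (0.0, 15.0, 30.0)])

# Single-link (nearest-neighbour) agglomerative clustering, cut at an
# illustrative 5 mm link length.
Z = linkage(events, method="single")
labels = fcluster(Z, t=5.0, criterion="distance")

subset_sizes = np.bincount(labels)[1:]           # cluster labels start at 1
print("SLC subset scales:", sorted(subset_sizes, reverse=True))
```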
Microfluidic method for measuring viscosity using images from smartphone
NASA Astrophysics Data System (ADS)
Kim, Sooyeong; Kim, Kyung Chun; Yeom, Eunseop
2018-05-01
The viscosity of a fluid is the most important characteristic in fluid rheology. Many microfluidic devices have been proposed for easily measuring the viscosity of small fluid samples. A hybrid system consisting of a smartphone and a microfluidic device can offer a mobile laboratory for performing a wide range of detection and analysis functions related to healthcare. In this study, a new mobile sensing method based on a microfluidic device was proposed for fluid viscosity measurements. By separately delivering sample and reference fluids into the two inlets of a Y-shaped microfluidic device, an interfacial line is induced downstream of the device. Because the interfacial width (W) between the sample and reference fluid flows is determined by their pressure ratio, the viscosity (μ) of the sample can be estimated by measuring the interfacial width. To determine the interfacial width of a sample, optical images of the flows downstream of the Y-shaped microfluidic device were acquired using a smartphone. To check the measurement accuracy of the proposed method, the viscosities of glycerol mixtures were compared with those measured by a conventional viscometer. The proposed technique was applied to monitor variations in blood and oil samples due to storage or rancidity. We expect that this mobile sensing method based on a microfluidic device could be utilized as a viscometer with significant advantages in terms of mobility, ease of operation, and data management.
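To first order, the width occupied by each co-flowing laminar stream scales with its viscosity at matched flow rates, which motivates a simple estimator. The sketch below assumes that proportionality purely for illustration; the study itself calibrates the width-viscosity relation against a conventional viscometer:

```python
def viscosity_from_widths(w_sample: float, w_ref: float,
                          mu_ref: float, q_ratio: float = 1.0) -> float:
    """First-order estimate: the width occupied by each co-flowing stream
    is assumed proportional to (viscosity x flow rate). This proportionality
    is our simplifying assumption for illustration, not the paper's
    calibrated relation."""
    return mu_ref * (w_sample / w_ref) / q_ratio

# Sample stream occupies 1.4x the reference width at equal flow rates,
# reference viscosity 1.0 mPa*s -> roughly 1.4 mPa*s.
print(viscosity_from_widths(w_sample=140.0, w_ref=100.0, mu_ref=1.0))
```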
Standard methods for sampling freshwater fishes: Opportunities for international collaboration
Bonar, Scott A.; Mercado-Silva, Norman; Hubert, Wayne A.; Beard, Douglas; Dave, Göran; Kubečka, Jan; Graeb, Brian D. S.; Lester, Nigel P.; Porath, Mark T.; Winfield, Ian J.
2017-01-01
With publication of Standard Methods for Sampling North American Freshwater Fishes in 2009, the American Fisheries Society (AFS) recommended standard procedures for North America. To explore interest in standardizing at intercontinental scales, a symposium attended by international specialists in freshwater fish sampling was convened at the 145th Annual AFS Meeting in Portland, Oregon, in August 2015. Participants represented all continents except Australia and Antarctica and were employed by state and federal agencies, universities, nongovernmental organizations, and consulting businesses. Currently, standardization is practiced mostly in North America and Europe. Participants described how standardization has been important for management of long-term data sets, promoting fundamental scientific understanding, and assessing efficacy of large spatial scale management strategies. Academics indicated that standardization has been useful in fisheries education because time previously used to teach how sampling methods are developed is now more devoted to diagnosis and treatment of problem fish communities. Researchers reported that standardization allowed increased sample size for method validation and calibration. Group consensus was to retain continental standards where they currently exist but to further explore international and intercontinental standardization, specifically identifying where synergies and bridges exist, and identify means to collaborate with scientists where standardization is limited but interest and need occur.
Suominen, Tina; Uutela, Päivi; Ketola, Raimo A.; Bergquist, Jonas; Hillered, Lars; Finel, Moshe; Zhang, Hongbo; Laakso, Aki; Kostiainen, Risto
2013-01-01
An UPLC-MS/MS method was developed for the determination of serotonin (5-HT), dopamine (DA), their phase I metabolites 5-HIAA, DOPAC and HVA, and their sulfate and glucuronide conjugates in human brain microdialysis samples obtained from two patients with acute brain injuries, ventricular cerebrospinal fluid (CSF) samples obtained from four patients with obstructive hydrocephalus, and a lumbar CSF sample pooled mainly from patients undergoing spinal anesthesia in preparation for orthopedic surgery. The method was validated by determining the limits of detection and quantification, linearity, repeatability and specificity. The direct method enabled the analysis of the intact phase II metabolites of 5-HT and DA, without hydrolysis of the conjugates. The method also enabled the analysis of the regioisomers of the conjugates, and several intact glucuronide and sulfate conjugates were identified and quantified for the first time in the human brain microdialysis and CSF samples. We were able to show the presence of 5-HIAA sulfate, and that dopamine-3-O-sulfate predominates over dopamine-4-O-sulfate in the human brain. The quantitative results suggest that sulfonation is a more important phase II metabolism pathway than glucuronidation in the human brain. PMID:23826355
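Limits of detection and quantification for such a method are commonly derived from calibration statistics (ICH-style: 3.3 and 10 times the residual standard deviation over the slope). A generic sketch of that computation with invented calibration data, not the authors' exact validation procedure:

```python
import numpy as np

def lod_loq(conc: np.ndarray, response: np.ndarray) -> tuple:
    """ICH-style limits from a linear calibration:
    LOD = 3.3 * sd(residuals) / slope, LOQ = 10 * sd(residuals) / slope."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sd = residuals.std(ddof=2)          # two fitted parameters
    return 3.3 * sd / slope, 10.0 * sd / slope

# Invented calibration points (concentration in nM, peak area ratio).
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
resp = np.array([0.9, 2.1, 5.2, 9.8, 20.5, 49.0])
print("LOD = %.2f nM, LOQ = %.2f nM" % lod_loq(conc, resp))
```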
Analyzing Kernel Matrices for the Identification of Differentially Expressed Genes
Xia, Xiao-Lei; Xing, Huanlai; Liu, Xueqin
2013-01-01
One of the most important applications of microarray data is the class prediction of biological samples. For this purpose, statistical tests have often been applied to identify the differentially expressed genes (DEGs), followed by the employment of state-of-the-art learning machines, including the Support Vector Machine (SVM) in particular. The SVM is a typical sample-based classifier whose performance comes down to how discriminant the samples are. However, DEGs identified by statistical tests are not guaranteed to result in a training dataset composed of discriminant samples. To tackle this problem, a novel gene ranking method, the Kernel Matrix Gene Selection (KMGS), is proposed. The rationale of the method, which is rooted in the fundamental ideas of the SVM algorithm, is described. The notion of "the separability of a sample", which is estimated by performing t-like statistics on each column of the kernel matrix, is first introduced. The separability of a classification problem is then measured, from which the significance of a specific gene is deduced. Also described is a method of Kernel Matrix Sequential Forward Selection (KMSFS), which shares the KMGS method's essential ideas but proceeds in a greedy manner. On three public microarray datasets, our proposed algorithms achieved noticeably competitive performance in terms of the B.632+ error rate. PMID:24349110
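The column-wise statistic is straightforward to prototype: each column of the kernel matrix is one sample's similarity profile, and a two-class t-like statistic on that column measures how separable the sample is. A rough sketch of that idea, assuming an RBF kernel and a two-class problem (the published criterion may differ in detail):

```python
import numpy as np

def rbf_kernel(X: np.ndarray, gamma: float = 0.5) -> np.ndarray:
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def sample_separability(K: np.ndarray, y: np.ndarray) -> np.ndarray:
    """t-like statistic per kernel-matrix column: how well each sample's
    similarity profile separates the two classes. A sketch of the KMGS
    notion, not the paper's exact formula."""
    stats = np.empty(K.shape[1])
    for j in range(K.shape[1]):
        a, b = K[y == 0, j], K[y == 1, j]
        se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
        stats[j] = abs(a.mean() - b.mean()) / se
    return stats

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (20, 5)), rng.normal(1.5, 1.0, (20, 5))])
y = np.repeat([0, 1], 20)
# Averaging over columns scores the whole problem; gene ranking would
# compare this score across candidate feature subsets.
print(sample_separability(rbf_kernel(X), y).mean())
```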
NASA Astrophysics Data System (ADS)
Yulia, M.; Suhandy, D.
2017-05-01
Indonesian palm civet coffee, or kopi luwak (Indonesian for coffee and palm civet), is well known as the world's priciest and rarest coffee. To protect the authenticity of luwak coffee and protect consumers from luwak coffee adulteration, it is very important to develop a simple and inexpensive method to discriminate between civet and non-civet coffee. The discrimination between civet and non-civet coffee in ground roasted (powder) samples is very challenging, since it is very difficult to distinguish between the two using conventional methods. In this research, the use of UV-Visible spectra combined with two chemometric methods, SIMCA and PLS-DA, was evaluated to discriminate civet and non-civet ground coffee samples. The spectral data of civet and non-civet coffee were acquired using a UV-Vis spectrometer (Genesys™ 10S UV-Vis, Thermo Scientific, USA). The results show that with both supervised discrimination methods, SIMCA and PLS-DA, all samples were correctly classified into their corresponding classes with 100% accuracy, sensitivity, and specificity.
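PLS-DA is commonly implemented as PLS regression on a dummy-coded class variable followed by thresholding the prediction. A minimal sklearn sketch on synthetic "spectra", not the study's data or preprocessing:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# Synthetic "UV-Vis spectra": 40 samples x 200 wavelengths, the two
# classes differing by one broad absorption band. Illustration only.
wavelengths = np.linspace(250.0, 450.0, 200)
band = np.exp(-(((wavelengths - 320.0) / 25.0) ** 2))
X = rng.normal(0.0, 0.05, (40, 200))
y = np.repeat([0, 1], 20)                 # 0 = non-civet, 1 = civet
X[y == 1] += 0.5 * band

pls = PLSRegression(n_components=2).fit(X, y.astype(float))
pred = (pls.predict(X).ravel() > 0.5).astype(int)   # PLS-DA decision rule
print("training accuracy:", (pred == y).mean())
```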
NASA Astrophysics Data System (ADS)
Wang, Jia; Liu, Feng; Mo, Yuxiang; Wang, Zhaoying; Zhang, Sichun; Zhang, Xinrong
2017-11-01
Mass spectrometry imaging (MSI) has important applications in material research, biology, and medicine. The MSI method based on UV laser desorption/ionization (UVLDI) can obtain images of intact samples, but has a high level of molecular fragmentation. In this work, we report a new MSI instrument that uses a VUV laser (125.3 nm) as a desorption/ionization source to exploit its advantages of high single photon energy and small focus size. The new instrument was tested by the mass spectra of Nile red and FGB (Fibrinogen beta chain) samples and mass spectrometric images of a fly brain section. For the tested samples, the VUVDI method offers lower levels of molecular fragmentations and higher sensitivities than those of the UVLDI method and second ion mass spectrometry imaging method using a Bi3+ beam. The ablation crater produced by the focused VUV laser on a quartz plate has an area of 10 μm2. The VUV laser is prepared based on the four-wave mixing method using three collimated laser beams and a heated Hg cell.
A Hierarchical Poisson Log-Normal Model for Network Inference from RNA Sequencing Data
Gallopin, Mélina; Rau, Andrea; Jaffrézic, Florence
2013-01-01
Gene network inference from transcriptomic data is an important methodological challenge and a key aspect of systems biology. Although several methods have been proposed to infer networks from microarray data, there is a need for inference methods able to model RNA-seq data, which are count-based and highly variable. In this work we propose a hierarchical Poisson log-normal model with a Lasso penalty to infer gene networks from RNA-seq data; this model has the advantage of directly modelling discrete data and accounting for inter-sample variance larger than the sample mean. Using real microRNA-seq data from breast cancer tumors and simulations, we compare this method to a regularized Gaussian graphical model on log-transformed data, and a Poisson log-linear graphical model with a Lasso penalty on power-transformed data. For data simulated with large inter-sample dispersion, the proposed model performs better than the other methods in terms of sensitivity, specificity and area under the ROC curve. These results show the necessity of methods specifically designed for gene network inference from RNA-seq data. PMID:24147011
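The key property motivating the model, counts whose inter-sample variance exceeds their mean, can be checked by simulation: Poisson counts with a log-normally distributed rate are overdispersed. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 10_000, 2.0, 0.8

# Poisson log-normal: a Gaussian latent log-rate, then Poisson counts.
rates = np.exp(rng.normal(mu, sigma, size=n))
counts = rng.poisson(rates)

# Variance far exceeds the mean -- the RNA-seq overdispersion that a
# plain Poisson model (variance == mean) cannot capture.
print("mean:", counts.mean().round(1), "variance:", counts.var().round(1))
```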
Nakano, Yosuke; Konya, Yutaka; Taniguchi, Moyu; Fukusaki, Eiichiro
2017-01-01
d-Amino acids have recently attracted much attention in various research fields, including medicine, clinical chemistry and the food industry, due to their important biological functions, which differ from those of l-amino acids. Most chiral amino acid separation techniques require complicated derivatization procedures in order to achieve the desired chromatographic behavior and detectability. Thus, the aim of this research was to develop a highly sensitive analytical method for the enantioseparation of chiral amino acids without any derivatization, using liquid chromatography-tandem mass spectrometry (LC-MS/MS). By optimizing MS/MS parameters, we established a quantification method that allowed the simultaneous analysis of 18 d-amino acids with high sensitivity and reproducibility. Additionally, we applied the method to a food sample (vinegar) for validation and successfully quantified trace levels of d-amino acids in the samples. These results demonstrate the applicability and feasibility of the LC-MS/MS method as a novel, effective tool for d-amino acid measurement in various biological samples. Copyright © 2016 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.
Extraction and Determination of Cyproheptadine in Human Urine by DLLME-HPLC Method.
Maham, Mehdi; Kiarostami, Vahid; Waqif-Husain, Syed; Abroomand-Azar, Parviz; Tehrani, Mohammad Saber; Khoeini Sharifabadi, Malihe; Afrouzi, Hossein; Shapouri, Mahmoudreza; Karami-Osboo, Rouhollah
2013-01-01
A novel dispersive liquid-liquid microextraction (DLLME) method, coupled with high-performance liquid chromatography with photodiode array detection (HPLC-DAD), has been applied for the extraction and determination of cyproheptadine (CPH), an antihistamine, in human urine samples. In this method, 0.6 mL of acetonitrile (disperser solvent) containing 30 μL of carbon tetrachloride (extraction solvent) was rapidly injected by syringe into a 5 mL urine sample. After centrifugation, the sedimented phase containing the enriched analyte was dissolved in acetonitrile, and an aliquot of this solution was injected into the HPLC system for analysis. Development of the DLLME procedure included optimization of several important parameters, such as the kind and volume of the extraction and disperser solvents, pH, and salt addition. The proposed method has good linearity in the range of 0.02-4.5 μg mL(-1) and a low detection limit (13.1 ng mL(-1)). The repeatability of the method, expressed as relative standard deviation, was 4.9% (n = 3). The method has also been applied to the analysis of real urine samples, with satisfactory relative recoveries in the range of 91.6-101.0%.
Annealed Importance Sampling Reversible Jump MCMC algorithms
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karagiannis, Georgios; Andrieu, Christophe
2013-03-20
It will soon be 20 years since reversible jump Markov chain Monte Carlo (RJ-MCMC) algorithms were proposed. They have significantly extended the scope of Markov chain Monte Carlo simulation methods, offering the promise of routinely tackling transdimensional sampling problems, as encountered in Bayesian model selection problems for example, in a principled and flexible fashion. Their practical efficient implementation, however, still remains a challenge. A particular difficulty encountered in practice is the choice of the dimension matching variables (both their nature and their distribution) and the reversible transformations which allow one to define the one-to-one mappings underpinning the design of these algorithms. Indeed, even seemingly sensible choices can lead to algorithms with very poor performance. The focus of this paper is the development and performance evaluation of a method, annealed importance sampling RJ-MCMC (aisRJ), which addresses this problem by mitigating the sensitivity of RJ-MCMC algorithms to the aforementioned poor design. As we shall see, the algorithm can be understood as an "exact approximation" of an idealized MCMC algorithm that would sample from the model probabilities directly in a model selection set-up. Such an idealized algorithm may have good theoretical convergence properties, but typically cannot be implemented, and our algorithms can approximate the performance of such idealized algorithms to an arbitrary degree while not introducing any bias for any degree of approximation. Our approach combines the dimension matching ideas of RJ-MCMC with annealed importance sampling and its Markov chain Monte Carlo implementation. We illustrate the performance of the algorithm with numerical simulations, which indicate that, although the approach may at first appear computationally involved, it is in fact competitive.
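The annealed importance sampling ingredient can be illustrated on its own: run a sequence of tempered distributions from a tractable start to the target, accumulating importance weights along the way. This toy sketch estimates a ratio of normalizing constants between two Gaussians and is not the aisRJ algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(0)

def log_f0(x):                     # N(0, 1), the tractable start
    return -0.5 * x ** 2

def log_f1(x):                     # unnormalized N(3, 0.5^2); Z1/Z0 = 0.5
    return -0.5 * ((x - 3.0) / 0.5) ** 2

betas = np.linspace(0.0, 1.0, 51)  # annealing schedule
n = 2000
x = rng.normal(size=n)             # exact draws from f0
logw = np.zeros(n)

for b_prev, b in zip(betas[:-1], betas[1:]):
    # AIS weight increment at the new inverse temperature ...
    logw += (b - b_prev) * (log_f1(x) - log_f0(x))
    # ... then one Metropolis move leaving the tempered density invariant.
    prop = x + rng.normal(scale=0.5, size=n)
    log_acc = ((1.0 - b) * (log_f0(prop) - log_f0(x))
               + b * (log_f1(prop) - log_f1(x)))
    accept = np.log(rng.uniform(size=n)) < log_acc
    x = np.where(accept, prop, x)

print(np.exp(logw).mean())         # estimates Z1/Z0, exactly 0.5 here
```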
Salem, Shady; Chang, Sam S; Clark, Peter E; Davis, Rodney; Herrell, S Duke; Kordan, Yakup; Wills, Marcia L; Shappell, Scott B; Baumgartner, Roxelyn; Phillips, Sharon; Smith, Joseph A; Cookson, Michael S; Barocas, Daniel A
2010-10-01
Whole mount processing is more resource intensive than routine systematic sampling of radical retropubic prostatectomy specimens. We compared whole mount and systematic sampling for detecting pathological outcomes, and compared the prognostic value of pathological findings across pathological methods. We included men (608 whole mount and 525 systematic sampling samples) with no prior treatment who underwent radical retropubic prostatectomy at Vanderbilt University Medical Center between January 2000 and June 2008. We used univariate and multivariate analysis to compare the pathological outcome detection rate between pathological methods. Kaplan-Meier curves and the log rank test were used to compare the prognostic value of pathological findings across pathological methods. There were no significant differences between the whole mount and the systematic sampling groups in detecting extraprostatic extension (25% vs 30%), positive surgical margins (31% vs 31%), pathological Gleason score less than 7 (49% vs 43%), 7 (39% vs 43%) or greater than 7 (12% vs 13%), seminal vesicle invasion (8% vs 10%) or lymph node involvement (3% vs 5%). Tumor volume was higher in the systematic sampling group and whole mount detected more multiple surgical margins (each p <0.01). There were no significant differences in the likelihood of biochemical recurrence between the pathological methods when patients were stratified by pathological outcome. Except for estimated tumor volume and multiple margins whole mount and systematic sampling yield similar pathological information. Each method stratifies patients into comparable risk groups for biochemical recurrence. Thus, while whole mount is more resource intensive, it does not appear to result in improved detection of clinically important pathological outcomes or prognostication. Copyright © 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
An effective and low cost carbon based clean-up method for PCDD/Fs and PCBs analysis in food.
Kedikoglou, Kleopatra; Costopoulou, Danae; Vassiliadou, Irene; Bakeas, Evangelos; Leondiadis, Leondios
2018-09-01
Sample preparation is of critical importance in dioxin analysis of food and feed samples. It is a complex procedure that includes lipid extraction followed by the application of chromatographic separation techniques aiming to remove undesirable interferences from the matrix. The separation of polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs) from polychlorinated biphenyls (PCBs) is achieved by carbon-based materials, which should have high fat capacity in order to be suitable for lipid-containing matrices. Automated methods are available, but due to their high cost and high solvent consumption, manual methods are also applied. An active carbon material (Carbosphere) with high fat capacity that has been used in the past for manual methods is no longer commercially available. The present study assesses an alternative active carbon material, FU 4652, that can be used for the separation of PCDD/Fs and non-ortho PCBs. Mono-ortho and six non-dioxin-like PCBs were also analyzed. The method was validated according to the analytical criteria set in EU Regulations 589/2014 and 709/2014. Control samples analyzed for the evaluation of the above material were olive oil reference samples spiked with PCDD/Fs and dioxin-like PCBs at two concentration levels. The new method was tested successfully on food samples from interlaboratory trials organized in previous years. Farmed fish samples collected within national surveillance programs for the years 2016-2017 were analyzed with the method developed. The results obtained indicate that the FU 4652 carbon sorbent has high fat capacity and is capable of separating congeners with good recoveries. Copyright © 2018 Elsevier Ltd. All rights reserved.
Commutability of food microbiology proficiency testing samples.
Abdelmassih, M; Polet, M; Goffaux, M-J; Planchon, V; Dierick, K; Mahillon, J
2014-03-01
Food microbiology proficiency testing (PT) is a useful tool to assess the analytical performance among laboratories. PT items should be close to routine samples to accurately evaluate the acceptability of the methods. However, most PT providers distribute exclusively artificial samples, such as reference materials or irradiated foods. This raises the issue of the suitability of these samples, because the equivalence, or 'commutability', between results obtained on artificial vs. authentic food samples has not been demonstrated. In the clinical field, the use of noncommutable PT samples has led to erroneous evaluation of performance when different analytical methods were used. This study aimed to provide a first assessment of the commutability of samples distributed in food microbiology PT. REQUASUD and IPH organized 13 food microbiology PTs including 10-28 participants. Three types of PT items were used: genuine food samples, sterile food samples and reference materials. The commutability of the artificial samples (reference materials or sterile samples) was assessed by plotting the distribution of the results on natural and artificial PT samples. This comparison highlighted matrix-correlated issues when nonfood matrices, such as reference materials, were used. Artificially inoculated food samples, on the other hand, raised only isolated commutability issues. In the organization of a PT scheme, authentic or artificially inoculated food samples are necessary to accurately evaluate analytical performance. Reference materials, used as PT items because of their convenience, may present commutability issues leading to inaccurately penalizing conclusions for methods that would have provided accurate results on food samples. For the first time, the commutability of food microbiology PT samples was investigated. The nature of the samples provided by the organizer turned out to be an important factor, because matrix effects can impact the analytical results. © 2013 The Society for Applied Microbiology.
U'ren, Jana M; Dalling, James W; Gallery, Rachel E; Maddison, David R; Davis, E Christine; Gibson, Cara M; Arnold, A Elizabeth
2009-04-01
Fungi associated with seeds of tropical trees pervasively affect seed survival and germination, and thus are an important, but understudied, component of forest ecology. Here, we examine the diversity and evolutionary origins of fungi isolated from seeds of an important pioneer tree (Cecropia insignis, Cecropiaceae) following burial in soil for five months in a tropical moist forest in Panama. Our approach, which relied on molecular sequence data because most isolates did not sporulate in culture, provides an opportunity to evaluate several methods currently used to analyse environmental samples of fungi. First, intra- and interspecific divergences were estimated for the nuclear ribosomal ITS and 5.8S gene for four genera of Ascomycota that are commonly recovered from seeds. Using these values we estimated species boundaries for 527 isolates, showing that seed-associated fungi are highly diverse, horizontally transmitted, and genotypically congruent with some foliar endophytes from the same site. We then examined methods for inferring the taxonomic placement and phylogenetic relationships of these fungi, evaluating the effects of manual versus automated alignment, model selection, and inference methods, as well as the quality of BLAST-based identification using GenBank. We found that common methods such as neighbor-joining and Bayesian inference differ in their sensitivity to alignment methods; analyses of particular fungal genera differ in their sensitivity to alignments; and numerous and sometimes intricate disparities exist between BLAST-based and phylogeny-based identification methods. Lastly, we used our most robust methods to infer phylogenetic relationships of seed-associated fungi in four focal genera, and reconstructed ancestral states to generate preliminary hypotheses regarding the evolutionary origins of this guild. Our results illustrate the dynamic evolutionary relationships among endophytic fungi, pathogens, and seed-associated fungi, and the apparent evolutionary distinctiveness of saprotrophs. Our study also elucidates the diversity, taxonomy, and ecology of an important group of plant-associated fungi, and highlights some of the advantages and challenges inherent in the use of ITS data for environmental sampling of fungi.
Williams, M S; Ebel, E D; Cao, Y
2013-01-01
The fitting of statistical distributions to microbial sampling data is a common application in quantitative microbiology and risk assessment. An underlying assumption of most fitting techniques is that data are collected with simple random sampling, which is oftentimes not the case. This study develops a weighted maximum likelihood estimation framework that is appropriate for microbiological samples that are collected with unequal probabilities of selection. Two examples, based on the collection of food samples during processing, are provided to demonstrate the method and highlight the magnitude of the biases in the maximum likelihood estimator when data are inappropriately treated as a simple random sample. Failure to properly weight samples to account for how data are collected can introduce substantial biases into inferences drawn from the data. The proposed methodology will reduce or eliminate an important source of bias in inferences drawn from the analysis of microbial data. This will also make comparisons between studies and the combination of results from different studies more reliable, which is important for risk assessment applications. © 2012 No claim to US Government works.
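The weighted (pseudo-)likelihood idea is to weight each observation's log-likelihood by the inverse of its selection probability. A minimal sketch with an invented lognormal "concentration" example in which the selection probability increases with the measured value:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Invented lognormal "concentrations"; selection probability grows with
# the value (e.g. larger lots sampled more often), so the raw selected
# sample is biased toward high concentrations.
x = rng.lognormal(mean=1.0, sigma=0.6, size=500)
pi = np.clip(x / x.max(), 0.05, 1.0)
selected = rng.uniform(size=x.size) < pi
xs, w = x[selected], 1.0 / pi[selected]        # inverse-probability weights

def neg_weighted_loglik(theta, data, weights):
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    z = (np.log(data) - mu) / sigma
    ll = -np.log(data) - np.log(sigma) - 0.5 * z ** 2   # lognormal, up to const
    return -(weights * ll).sum()

res = minimize(neg_weighted_loglik, x0=[0.0, 0.0], args=(xs, w))
print("weighted MLE: mu = %.2f, sigma = %.2f" % (res.x[0], np.exp(res.x[1])))
```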
Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C
2006-12-20
Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single-source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a nearly completely automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the software methods.
Detection of antileishmanial antibodies in blood sampled from blood bank donors in Istanbul.
Ates, Sezen Canim; Bagirova, Malahat; Allahverdiyev, Adil M; Baydar, Serap Yesilkir; Koc, Rabia Cakir; Elcicek, Serhat; Abamor, Emrah Sefik; Oztel, Olga Nehir
2012-06-01
According to the WHO, only 5-20% of all cases of leishmaniasis are symptomatic; the other cases are identified as asymptomatic leishmaniasis. Recent studies have demonstrated that donor blood plays an important role in the epidemiology of asymptomatic leishmaniasis. However, the number of studies on this subject is still insufficient. Additionally, donor blood samples obtained from Istanbul, the biggest metropolitan area in Turkey, have not been investigated with regard to Leishmania. Moreover, there is no information about the sensitivity of the noninvasive serological methods used in the detection of leishmaniasis in donor blood samples. Accordingly, this study aimed to investigate the presence of antileishmanial antibodies in blood samples obtained from blood bank donors in Istanbul using different serologic methods, and to determine the most sensitive detection method. Blood samples were taken from 188 healthy blood bank donors at the Capa Turkish Red Crescent Blood Bank (Istanbul, Turkey), and the presence of antileishmanial antibodies was measured by indirect immunofluorescent antibody test (IFAT), ELISA, an immunochromatographic dipstick rapid test, and western blot (WB). Antileishmanial antibodies were determined in 12 out of 188 samples by IFAT (6.4%), and six of these 12 donors were found to be positive at the diagnostic titer of 1:128 (3.2%). All 188 samples were investigated by ELISA, and one (0.5%) gave a positive result. None of the 188 samples gave a positive result by the immunochromatographic test. WB applied to the 12 seroreactive donors showed that three of the 12 were positive. In this study, the presence of antileishmanial antibodies in blood samples of blood bank donors from Istanbul has been demonstrated using feasible and low-cost serological methods. Additionally, in comparison with other simple and low-cost detection methods, WB was used for confirmation. IFAT has a higher sensitivity and therefore may be preferred as a prescreening method in endemic or nonendemic areas.
Reply to “Ranking filter methods for concentrating pathogens in lake water”
Bushon, Rebecca N.; Francy, Donna S.; Gallardo, Vicente J.; Lindquist, H.D. Alan; Villegas, Eric N.; Ware, Michael W.
2013-01-01
Accurately comparing filtration methods is indeed difficult. Our method (1) and the method described by Borchardt et al. for determining recoveries are both acceptable approaches; however, each is designed to achieve a different research goal. Our study was designed to compare recoveries of multiple microorganisms in surface-water samples. Because, in practice, water-matrix effects come into play throughout filtration, concentration, and detection processes, we felt it important to incorporate those effects into the recovery results.
NASA Astrophysics Data System (ADS)
Hoch, Jeffrey C.
2017-10-01
Non-Fourier methods of spectrum analysis are gaining traction in NMR spectroscopy, driven by their utility for processing nonuniformly sampled data. These methods afford new opportunities for optimizing experiment time, resolution, and sensitivity of multidimensional NMR experiments, but they also pose significant challenges not encountered with the discrete Fourier transform. A brief history of non-Fourier methods in NMR serves to place different approaches in context. Non-Fourier methods reflect broader trends in the growing importance of computation in NMR, and offer insights for future software development.
Jansen, Wiebke; Woudstra, Svenja; Müller, Anja; Grabowski, Nils; Schoo, Gundela; Gerulat, Bettina; Klein, Günter; Kehrenberg, Corinna
2018-01-01
Though imports of products of animal origin into the European Union (EU) must comply with the community's legal requirements and quality standards, food consignment rejections at external EU borders have been increasing in recent years. This study examined microbiological metrics, against national target and critical values valid for samples at the consumer level, for 498 fresh poultry meat and 136 fresh pork filet samples from consignments subjected to physical checks during clearing at the border inspection post of Hamburg harbour between January 2014 and December 2015, using ISO standard methods. Quantitative results indicated that critical thresholds for aerobic counts, Enterobacteriaceae, and E. coli were never surpassed. Only for staphylococci did one poultry sample (0.2%) and 10 pork samples (9.3%) exceed the critical limit (3.7 log cfu/g). However, qualitative analyses revealed that Staphylococcus aureus was present in 16% and 10% of all poultry and pork samples, respectively, though no methicillin-resistant Staphylococcus aureus could be confirmed. Moreover, E. coli was present in 50% and 67% of all pork and poultry samples, respectively, and 33 of these isolates were confirmed as extended-spectrum β-lactamase-producing E. coli. Only 1.2% of the poultry samples were unacceptable due to the presence of Salmonella spp., which was not detected in any pork sample. Campylobacter spp. were not detected in any sample. Though imported pork and poultry meat mostly complies with national market requirements, it might pose a potential risk to public health, especially through direct or indirect foodborne transmission of imported, uncommon strains of zoonotic bacteria.
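A minimal sketch of the quantitative check described above, flagging samples whose staphylococci counts exceed the critical limit of 3.7 log cfu/g (the plate counts here are invented):

    import math

    CRITICAL_LOG_CFU_G = 3.7
    sample_counts_cfu_g = [1200, 6300, 480, 15000]   # hypothetical plate counts

    for cfu in sample_counts_cfu_g:
        log_count = math.log10(cfu)
        verdict = "exceeds limit" if log_count > CRITICAL_LOG_CFU_G else "ok"
        print(f"{cfu:>6} cfu/g -> {log_count:.2f} log cfu/g: {verdict}")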
Purvis, Linda B; Villegas, Pedro; Perozo, Francisco
2006-12-01
Infectious bursal disease virus (IBDV) is an important, worldwide-distributed poultry pathogen that can cause immunosuppression and lesions of the bursa of Fabricius. The main component of the virus, VP2, is not only responsible for eliciting the bird's immune response but is also important for the molecular identification of the virus. The viral nucleic acid must be adequately preserved to be analyzed by reverse-transcriptase PCR (RT-PCR) and sequenced for molecular characterization of field strains. Phenol inactivation has been the standard for IBDV tissue collection and international shipment; however, there have been reports of interference with molecular detection when using phenol, which is also a hazardous chemical that must be handled and shipped carefully. The ability of Flinders Technology Associates filter paper (FTA cards) to inactivate several avian pathogens has been demonstrated previously; however, no work has been published on its use for IBDV nucleic acid detection. Bursas from experimentally infected birds were imprinted on FTA cards and then placed in phenol, and samples were compared on the basis of the molecular detection capabilities of the two inactivation methods. Viral nucleic acid was detected in 85% of the FTA-card-inactivated samples compared with 71% of the phenol-inactivated samples. Sequence analysis was performed on samples inactivated by both methods, and no differences were found. To compare RNA stability at different temperatures, euthanized IBDV-infected birds were held at two different temperatures before sampling. No differences were detected for FTA sampling; however, for tissues in phenol, the nucleic acid was detectable only up to 2 h post-mortem in tissues held at 4 degrees C prior to sampling. These findings indicate that the FTA card is an efficient and reliable alternative collection method for the molecular detection and characterization of IBDV.
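Whether 85% versus 71% detection is a statistically meaningful difference depends on group sizes, which the abstract does not state; a hedged sketch with an assumed n per group:

    import math

    def two_prop_z(p1, p2, n1, n2):
        # Two-proportion z-statistic with a pooled standard error.
        p = (p1 * n1 + p2 * n2) / (n1 + n2)
        se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
        return (p1 - p2) / se

    z = two_prop_z(0.85, 0.71, 48, 48)   # n = 48 per group is an assumption
    print(f"z = {z:.2f}")                # |z| > 1.96 would suggest p < 0.05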
Howe, Alan; Musgrove, Darren; Breuer, Dietmar; Gusbeth, Krista; Moritz, Andreas; Demange, Martine; Oury, Véronique; Rousset, Davy; Dorotte, Michel
2011-08-01
Historically, workplace exposure to the volatile inorganic acids hydrochloric acid (HCl) and nitric acid (HNO3) has been determined mostly by collection on silica gel sorbent tubes and analysis of the corresponding anions by ion chromatography (IC). However, HCl and HNO3 can be present in workplace air as mist as well as vapor, so it is important to sample the inhalable fraction of airborne particles. As sorbent tubes exhibit a low sampling efficiency for inhalable particles, a more suitable method was required. This is the first of two articles on "Evaluation of Sampling Methods for Measuring Exposure to Volatile Inorganic Acids in Workplace Air" and describes collaborative sampling exercises carried out to evaluate an alternative method for sampling HCl and HNO3 using sodium carbonate-impregnated filters; the second article describes sampling capacity and breakthrough tests. The method was found to perform well, and a quartz fiber filter impregnated with 500 μL of 1 M Na2CO3 (10% (m/v) Na2CO3) was found to have sufficient sampling capacity for workplace air measurement. A pre-filter is required to remove particulate chlorides and nitrates, which would otherwise produce a positive interference. A GSP sampler fitted with a plastic cone, a closed-face cassette, or a plastic IOM sampler were all found suitable for mounting the pre-filter and sampling filter(s), although care must be taken to close the IOM sampler tightly to avoid leaks. HCl and HNO3 can react with co-sampled particulate matter on the pre-filter (e.g., zinc oxide), leading to low results, and stronger acids can react with the particulate chlorides and nitrates removed by the pre-filter to liberate HCl and HNO3, which are then collected on the sampling filter, leading to high results. Both positive and negative interferences are therefore possible, but they are unavoidable. The method studied has now been published in ISO 21438-2:2009.
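A back-of-envelope check on the impregnated filter's capacity, assuming complete stoichiometric neutralization (Na2CO3 + 2 HCl -> 2 NaCl + H2O + CO2); the stoichiometric ceiling is an assumption here, not a figure from the study:

    vol_l = 500e-6                         # 500 microlitres of impregnation solution
    conc_m = 1.0                           # 1 M Na2CO3
    mmol_carbonate = vol_l * conc_m * 1000          # = 0.5 mmol Na2CO3
    mmol_hcl = 2 * mmol_carbonate                   # 2 mol HCl per mol carbonate
    mg_hcl = mmol_hcl * 36.46                       # molar mass of HCl in g/mol
    print(f"{mmol_carbonate:.2f} mmol Na2CO3 -> up to {mg_hcl:.1f} mg HCl")  # 36.5 mg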
Jha, Virendra K.; Wydoski, Duane S.
2003-01-01
A method for the isolation of 20 parent organophosphate pesticides and 5 organophosphate pesticide degradates from natural-water samples is described. Compounds are extracted from water samples with methylene chloride using a continuous liquid-liquid extractor for 6 hours. The solvent is evaporated to a volume of 1 milliliter using heat and a flow of nitrogen and solvent-exchanged to ethyl acetate. Extracted compounds are determined by capillary-column gas chromatography with flame photometric detection. Single-operator method detection limits in three water-matrix samples ranged from 0.003 to 0.009 microgram per liter. Method performance was validated by spiking all compounds in three different matrices at three different concentrations, with eight replicates analyzed at each concentration in each matrix. Mean recoveries of most method compounds ranged from 54 to 137 percent in surface-water samples, from 40 to 109 percent in ground-water samples, and from 42 to 104 percent in reagent-water samples. The only exception was O-ethyl-O-methyl-S-propylphosphorothioate, which had variable recovery in all three matrices, ranging from 27 to 79 percent; its detected concentration is therefore reported in this method with an estimated remark code. Because of similar performance issues, two more compounds, disulfoton and ethion monoxon, also will be reported with an estimated remark code. Estimated-value compounds, which are "E-coded" in the database, do not meet the performance criteria for unqualified quantification but are retained in the method because they are important owing to high use or potential environmental effects and because their analytical performance has been consistent and reproducible.
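A sketch of the reporting rule described above: average replicate recoveries per compound and flag anything outside an acceptance window for an estimated ("E-coded") remark. The window and recovery values are illustrative, not the method's published criteria:

    recoveries = {   # mean percent recovery per matrix, hypothetical values
        "malathion": [92, 88, 95],
        "O-ethyl-O-methyl-S-propylphosphorothioate": [27, 55, 79],
    }
    LOW, HIGH = 40, 140          # assumed acceptance window, percent

    for compound, recs in recoveries.items():
        mean_rec = sum(recs) / len(recs)
        e_coded = any(r < LOW or r > HIGH for r in recs)
        flag = " -> E-coded" if e_coded else ""
        print(f"{compound}: mean {mean_rec:.0f}%{flag}")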
Field and laboratory arsenic speciation methods and their application to natural-water analysis
Bednar, A.J.; Garbarino, J.R.; Burkhardt, M.R.; Ranville, J.F.; Wildeman, T.R.
2004-01-01
The toxic and carcinogenic properties of inorganic and organic arsenic species make their determination in natural water vitally important. Determination of individual inorganic and organic arsenic species is critical because the toxicology, mobility, and adsorptivity vary substantially. Several methods for the speciation of arsenic in groundwater, surface-water, and acid mine drainage sample matrices using field and laboratory techniques are presented. The methods provide quantitative determination of arsenite [As(III)], arsenate [As(V)], monomethylarsonate (MMA), dimethylarsinate (DMA), and roxarsone in 2-8 min at detection limits of less than 1 µg arsenic per liter (µg As L-1). All the methods use anion exchange chromatography to separate the arsenic species and inductively coupled plasma-mass spectrometry as an arsenic-specific detector. Different methods were needed because some sample matrices did not have all arsenic species present or were incompatible with particular high-performance liquid chromatography (HPLC) mobile phases. The bias and variability of the methods were evaluated using total arsenic, As(III), As(V), DMA, and MMA results from more than 100 surface-water, groundwater, and acid mine drainage samples, and reference materials. Concentrations in test samples were as much as 13,000 µg As L-1 for As(III) and 3700 µg As L-1 for As(V). Methylated arsenic species were less than 100 µg As L-1 and were found only in certain surface-water samples, and roxarsone was not detected in any of the water samples tested. The distribution of inorganic arsenic species in the test samples ranged from 0% to 90% As(III). Laboratory-speciation method variability for As(III), As(V), MMA, and DMA in reagent water at 0.5 µg As L-1 was 8-13% (n=7). Field-speciation method variability for As(III) and As(V) at 1 µg As L-1 in reagent water was 3-4% (n=3). © 2003 Elsevier Ltd. All rights reserved.
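The method variabilities quoted above are percent relative standard deviations (RSD) of replicate analyses; a minimal sketch with invented replicate values at the 0.5 µg As L-1 spike level:

    import statistics

    replicates = [0.44, 0.55, 0.48, 0.58, 0.45, 0.52, 0.51]   # n = 7, hypothetical
    mean = statistics.mean(replicates)
    rsd = 100 * statistics.stdev(replicates) / mean
    print(f"mean = {mean:.3f} ug As/L, RSD = {rsd:.1f}%")     # ~10%, within 8-13%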
Khoza, B. S.; Chimuka, L.; Mukwevho, E.; Steenkamp, P. A.; Madala, N. E.
2014-01-01
Metabolite extraction methods have been shown to be a critical consideration for pharmacometabolomics studies and, as such, optimization and development of new extraction methods are crucial. In the current study, an organic solvent-free method, namely, pressurised hot water extraction (PHWE), was used to extract pharmacologically important metabolites from dried Moringa oleifera leaves. Here, the temperature of the extraction solvent (pure water) was altered while keeping other factors constant using a homemade PHWE system. Samples extracted at different temperatures (50, 100, and 150°C) were assayed for antioxidant activities and the effect of the temperature on the extraction process was evaluated. The samples were further analysed by mass spectrometry to elucidate their metabolite compositions. Principal component analysis (PCA) evaluation of the UPLC-MS data showed distinctive differential metabolite patterns. Here, temperature changes during PHWE were shown to affect the levels of metabolites with known pharmacological activities, such as chlorogenic acids and flavonoids. Our overall findings suggest that, if not well optimised, the extraction temperature could compromise the "pharmacological potency" of the extracts. The use of MS in combination with PCA was furthermore shown to be an excellent approach to evaluate the quality and content of pharmacologically important extracts.
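A minimal PCA sketch in the spirit of the analysis above, projecting per-sample metabolite feature intensities onto the first two principal components; the data matrix is synthetic, standing in for UPLC-MS features:

    import numpy as np

    rng = np.random.default_rng(1)
    # rows = samples (3 temperatures x 3 replicates), cols = 50 metabolite features
    X = rng.normal(size=(9, 50))
    X[3:6] += 1.5                  # pretend the 100 C extracts shift some features
    X[6:9] += 3.0                  # and the 150 C extracts shift them further

    Xc = X - X.mean(axis=0)        # mean-centre each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :2] * S[:2]      # PC1/PC2 scores per sample

    for label, row in zip(["50C"] * 3 + ["100C"] * 3 + ["150C"] * 3, scores):
        print(f"{label}: PC1 = {row[0]:+.2f}, PC2 = {row[1]:+.2f}")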
Development of a Terbium-Sensitized Fluorescence Method for Analysis of Silibinin.
Ershadi, Saba; Jouyban, Abolghasem; Molavi, Ommoleila; Shayanfar, Ali
2017-05-01
Silibinin is a natural flavonoid with potent anticancer properties, as shown in both in vitro and in vivo experiments. Various methods have been used for silibinin analysis, and terbium-sensitized fluorescence methods have been widely used in recent years for the determination of drugs in pharmaceutical preparations and biological samples. The present work aims to provide a simple analytical method for the quantitative determination of silibinin in aqueous solutions based on the formation of a fluorescent complex with the terbium ion. The parameters most affecting the determination of silibinin by the proposed method, namely terbium concentration, pH, and buffer volume, were optimized using response surface methodology. The fluorescence intensity of silibinin was measured at 545 nm using λex = 334 nm. The developed method was applied to the determination of silibinin in plasma samples after protein precipitation with acetone. Under optimum conditions, the method was linear between 0.10 and 0.50 mg/L, with a coefficient of determination (R2) of 0.997. The LOD and LOQ were 0.034 and 0.112 mg/L, respectively. These results indicate that the developed method is a simple, low-cost, and suitable analytical method for the quantification of silibinin in aqueous solution and plasma samples.
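The quoted LOD and LOQ are consistent with the common 3.3·sigma/slope and 10·sigma/slope definitions (0.112/0.034 is roughly 10/3.3). A minimal calibration sketch with invented intensity readings, not the study's raw data:

    import numpy as np

    conc = np.array([0.10, 0.20, 0.30, 0.40, 0.50])   # mg/L standards
    intensity = np.array([105, 198, 310, 402, 498])   # hypothetical signal at 545 nm

    slope, intercept = np.polyfit(conc, intensity, 1)
    resid = intensity - (slope * conc + intercept)
    sigma = resid.std(ddof=2)                         # SD of regression residuals

    print(f"LOD = {3.3 * sigma / slope:.3f} mg/L, LOQ = {10 * sigma / slope:.3f} mg/L")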
Social Influences on Fertility: A Comparative Mixed Methods Study in Eastern and Western Germany
ERIC Educational Resources Information Center
Bernardi, Laura; Keim, Sylvia; von der Lippe, Holger
2007-01-01
This article uses a mixed methods design to investigate the effects of social influence on family formation in a sample of eastern and western German young adults at an early stage of their family formation. Theoretical propositions on the importance of informal interaction for fertility and family behavior are still rarely supported by systematic…
Shortleaf Pine Seedling Inventory Methods On Broadcast-Seeded Areas in the Missouri Ozarks
Kenneth W. Seidel; Nelson F. Rogers
1966-01-01
The success of broadcast-seeding of shortleaf pine (Pinus echinata Mill.) after one or several years can be determined with specified precision by a systematic sampling procedure. Seedling results often are expressed as the total number of seedlings per acre, but good distribution is equally important. The total stocking and the stocked milacre methods described here...
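A sketch of the two summaries named above, per-acre density from systematic milacre plots plus the percent of plots stocked; the plot counts are invented (a milacre is 1/1000 acre):

    plot_counts = [0, 2, 1, 0, 3, 0, 1, 2, 0, 1]   # seedlings on each milacre plot

    n_plots = len(plot_counts)
    seedlings_per_acre = 1000 * sum(plot_counts) / n_plots        # milacre -> acre
    stocked_pct = 100 * sum(1 for c in plot_counts if c > 0) / n_plots

    print(f"{seedlings_per_acre:.0f} seedlings/acre; {stocked_pct:.0f}% of milacres stocked")
    # the same per-acre total can hide poor distribution if stocking percent is low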
Management of thyroid cytological material, pre-analytical procedures and bio-banking.
Bode-Lesniewska, Beata; Cochand-Priollet, Beatrix; Straccia, Patrizia; Fadda, Guido; Bongiovanni, Massimo
2018-06-09
Thyroid nodules are common and increasingly detected owing to recent advances in imaging techniques. However, clinically relevant thyroid cancer is rare, and mortality from aggressive thyroid cancer remains constant. FNAC (fine needle aspiration cytology) is a standard method for diagnosing thyroid malignancy and discriminating malignant nodules from goiter. As the nodules examined by thyroid FNAC are often small incidental findings, it is important to maintain a low rate of indeterminate diagnoses requiring further clinical work-up or surgery. The most important factors determining the accuracy of the cytological diagnosis and the suitability of thyroid FNACs for biobanking are the quality of the sample and the availability of adequate tissue for auxiliary studies. This article analyses the technical aspects (pre-analytics) of performing thyroid FNACs, including image guidance and rapid on-site evaluation (ROSE), sample collection methods (conventional slides, liquid-based cytology (LBC), cell blocks), and storage (biobanking). The spectrum of special studies (immunocytochemistry on direct slides or LBC, immunohistochemistry on cell blocks, and molecular methods) required to improve the precision of cytological diagnosis of thyroid nodules is discussed.
Wee, Eugene J.H.; Wang, Yuling; Tsao, Simon Chang-Hao; Trau, Matt
2016-01-01
Sensitive and accurate identification of specific DNA mutations can influence clinical decisions. However, accurate diagnosis from limiting samples such as circulating tumour DNA (ctDNA) is challenging. Current fluorescence-based approaches such as quantitative PCR (qPCR) and, more recently, droplet digital PCR (ddPCR) have limitations in multiplex detection and sensitivity and require expensive specialized equipment. Herein we describe an assay that combines the multiplexing and sensitivity benefits of surface-enhanced Raman spectroscopy (SERS) with the simplicity of standard PCR to address the limitations of current approaches. This proof-of-concept method could reproducibly detect as few as 0.1% (10 copies, CV < 9%) of target sequences, demonstrating its high sensitivity. The method was then applied to detect three important melanoma mutations in multiplex. Finally, the PCR/SERS assay was used to genotype cell lines and ctDNA from serum samples, with the results subsequently validated by ddPCR. With ddPCR-like sensitivity and accuracy at the convenience of standard PCR, we believe this multiplex PCR/SERS method could find wide applications in both diagnostics and research.
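The reproducibility figure above (CV < 9% at 10 copies) is a coefficient of variation across replicate runs; a minimal sketch with invented replicate read-outs for the 0.1% mutant-allele sample:

    import statistics

    replicate_copies = [10.4, 9.6, 10.9, 9.8, 10.2]   # hypothetical copy estimates
    cv = 100 * statistics.stdev(replicate_copies) / statistics.mean(replicate_copies)
    print(f"CV = {cv:.1f}%")                           # below the 9% criterion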