Sample records for sampling method based

  1. Molecular cancer classification using a meta-sample-based regularized robust coding method.

    PubMed

    Wang, Shu-Lin; Sun, Liuchao; Fang, Jianwen

    2014-01-01

Previous studies have demonstrated that machine learning based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size of typical GEP data. Recently, the sparse representation (SR) method has been successfully applied to cancer classification, but its efficiency needs to be improved when analyzing large-scale GEP data. In this paper we present meta-sample-based regularized robust coding classification (MRRCC), a novel and effective cancer classification technique that combines the meta-sample-based clustering idea with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are each independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples and then encodes a testing sample as a sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient, while its prediction accuracy is equivalent to that of existing MSRC-based methods and better than that of other state-of-the-art dimension reduction based methods.
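The coding step described above can be sketched in miniature. Below is an illustrative ridge-regularized least-squares coding of a test sample over two meta-samples, with fidelity measured by the l2-norm of the coding residual; this is a simplified stand-in for the idea, not the authors' full iteratively reweighted robust coding, and all names are invented for the example:

```python
import math

def ridge_code(meta_samples, test, lam=0.1):
    """Code `test` as a regularized linear combination of two meta-samples
    by solving the 2x2 normal equations (A^T A + lam*I) alpha = A^T y."""
    a1, a2 = meta_samples
    g11 = sum(x * x for x in a1) + lam
    g22 = sum(x * x for x in a2) + lam
    g12 = sum(x * y for x, y in zip(a1, a2))
    b1 = sum(x * y for x, y in zip(a1, test))
    b2 = sum(x * y for x, y in zip(a2, test))
    det = g11 * g22 - g12 * g12
    alpha = ((g22 * b1 - g12 * b2) / det, (g11 * b2 - g12 * b1) / det)
    residual = [t - alpha[0] * u - alpha[1] * v
                for t, u, v in zip(test, a1, a2)]
    return alpha, math.sqrt(sum(r * r for r in residual))  # l2 fidelity
```

In a classifier of this kind, the test sample would be coded over each class's meta-samples and assigned to the class yielding the smallest coding residual.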

  2. A study of active learning methods for named entity recognition in clinical text.

    PubMed

    Chen, Yukun; Lasko, Thomas A; Mei, Qiaozhu; Denny, Joshua C; Xu, Hua

    2015-12-01

Named entity recognition (NER), a sequential labeling task, is one of the fundamental tasks for building clinical natural language processing (NLP) systems. Machine learning (ML) based approaches can achieve good performance, but they often require large amounts of annotated samples, which are expensive to build because annotation requires domain experts. Active learning (AL), a sample selection approach integrated with supervised ML, aims to minimize the annotation cost while maximizing the performance of ML-based models. In this study, our goal was to develop and evaluate both existing and new AL methods for a clinical NER task to identify concepts of medical problems, treatments, and lab tests from clinical notes. Using the annotated NER corpus from the 2010 i2b2/VA NLP challenge, which contained 349 clinical documents with 20,423 unique sentences, we simulated AL experiments using a number of existing and novel algorithms in three categories: uncertainty-based, diversity-based, and baseline sampling strategies. These were compared with passive learning, which uses random sampling. Learning curves plotting the performance of the NER model against the estimated annotation cost (based on the number of sentences or words in the training set) were generated to evaluate the active and passive learning methods, and the area under the learning curve (ALC) score was computed. Based on the learning curves of F-measure vs. number of sentences, uncertainty sampling algorithms outperformed all other methods in ALC. Most diversity-based methods also performed better than random sampling in ALC. To achieve an F-measure of 0.80, the best uncertainty sampling method saved 66% of sentence annotations compared with random sampling. For the learning curves of F-measure vs. number of words, uncertainty sampling methods again outperformed all other methods in ALC. To achieve an F-measure of 0.80, the best uncertainty-based method saved 42% of word annotations compared with random sampling, whereas the best diversity-based method reduced annotation effort by only 7%. In the simulated setting, AL methods, particularly uncertainty-sampling based approaches, substantially reduced annotation cost for the clinical NER task. The actual benefit of active learning in clinical NER should be further evaluated in a real-time setting. Copyright © 2015 Elsevier Inc. All rights reserved.
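The uncertainty-sampling loop evaluated in the study can be sketched as a pool-based selection step. This is a minimal illustration of least-confidence selection from an unlabeled pool; the toy `predict_proba` stands in for a real NER model, and all names are invented for the example:

```python
def least_confidence(probs):
    """Uncertainty score: 1 minus the probability of the most likely label."""
    return 1.0 - max(probs)

def select_batch(pool, predict_proba, batch_size=2):
    """Return the unlabeled samples the current model is least confident
    about; in active learning these are sent to the annotator next."""
    ranked = sorted(pool, key=lambda x: least_confidence(predict_proba(x)),
                    reverse=True)
    return ranked[:batch_size]

# Toy stand-in for a trained model: confidence grows with the feature value.
def predict_proba(x):
    p = min(0.99, 0.5 + 0.1 * x)
    return [p, 1.0 - p]

batch = select_batch([1, 2, 3, 4], predict_proba)  # → [1, 2]
```

In a real AL loop, the model would be retrained after each batch is labeled and the selection step repeated until the annotation budget is spent.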

  3. Nonprobability and probability-based sampling strategies in sexual science.

    PubMed

    Catania, Joseph A; Dolcini, M Margaret; Orellana, Roberto; Narayanan, Vasudah

    2015-01-01

With few exceptions, much of sexual science builds upon data from opportunistic nonprobability samples of limited generalizability. Although probability-based studies are considered the gold standard in terms of generalizability, they are costly to apply to many of the hard-to-reach populations of interest to sexologists. The present article discusses recent conclusions by sampling experts that are relevant to a sexual science that often relies on nonprobability methods. In this regard, we provide an overview of Internet sampling as a useful, cost-efficient, nonprobability sampling method of value to sex researchers conducting modeling work or clinical trials. We also argue that probability-based sampling methods may be more readily applied in sex research with hard-to-reach populations than is typically thought. In this context, we provide three case studies that utilize qualitative and quantitative techniques directed at reducing the limitations of applying probability-based sampling to hard-to-reach populations: indigenous Peruvians, African American youth, and urban men who have sex with men (MSM). Recommendations are made with regard to presampling studies, adaptive and disproportionate sampling methods, and strategies that may be utilized in evaluating nonprobability and probability-based sampling methods.

  4. Methodology Series Module 5: Sampling Strategies.

    PubMed

    Setia, Maninder Singh

    2016-01-01

Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the 'sampling method'. There are essentially two types of sampling methods: 1) probability sampling, based on chance events (such as random numbers or flipping a coin); and 2) non-probability sampling, based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (for instance, by using the term 'random sample' when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the 'generalizability' of the results. In such a scenario, the researcher may want to use 'purposive sampling' for the study.
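The two probability sampling methods named in the abstract can be illustrated with a short sketch. This is a generic illustration using only the Python standard library, with proportional allocation assumed for the stratified case; it is not code from the module itself:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Every unit has an equal chance of selection (a chance-based method)."""
    rng = random.Random(seed)
    return rng.sample(population, n)

def stratified_sample(population, strata_of, n, seed=0):
    """Proportional-allocation stratified random sample: each stratum
    contributes a share of n proportional to its size (at least one unit)."""
    rng = random.Random(seed)
    strata = {}
    for unit in population:
        strata.setdefault(strata_of(unit), []).append(unit)
    sample = []
    for members in strata.values():
        k = max(1, round(n * len(members) / len(population)))
        sample.extend(rng.sample(members, min(k, len(members))))
    return sample
```

For example, stratifying 10 units by parity with `strata_of=lambda u: u % 2` and `n=4` draws two units from each of the two equal-sized strata.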

  5. Evaluation of a QuECHERS-like extraction approach for the determination of PBDEs in mussels by immuno-assay-based screening methods

    USDA-ARS?s Scientific Manuscript database

    A sample preparation method was evaluated for the determination of polybrominated diphenyl ethers (PBDEs) in mussel samples, by using colorimetric and electrochemical immunoassay-based screening methods. A simple sample preparation in conjunction with a rapid screening method possesses the desired c...

  6. Comparison of a New Cobinamide-Based Method to a Standard Laboratory Method for Measuring Cyanide in Human Blood

    PubMed Central

    Swezey, Robert; Shinn, Walter; Green, Carol; Drover, David R.; Hammer, Gregory B.; Schulman, Scott R.; Zajicek, Anne; Jett, David A.; Boss, Gerry R.

    2013-01-01

    Most hospital laboratories do not measure blood cyanide concentrations, and samples must be sent to reference laboratories. A simple method is needed for measuring cyanide in hospitals. The authors previously developed a method to quantify cyanide based on the high binding affinity of the vitamin B12 analog, cobinamide, for cyanide and a major spectral change observed for cyanide-bound cobinamide. This method is now validated in human blood, and the findings include a mean inter-assay accuracy of 99.1%, precision of 8.75% and a lower limit of quantification of 3.27 µM cyanide. The method was applied to blood samples from children treated with sodium nitroprusside and it yielded measurable results in 88 of 172 samples (51%), whereas the reference laboratory yielded results in only 19 samples (11%). In all 19 samples, the cobinamide-based method also yielded measurable results. The two methods showed reasonable agreement when analyzed by linear regression, but not when analyzed by a standard error of the estimate or paired t-test. Differences in results between the two methods may be because samples were assayed at different times on different sample types. The cobinamide-based method is applicable to human blood, and can be used in hospital laboratories and emergency rooms. PMID:23653045

  7. Methodology Series Module 5: Sampling Strategies

    PubMed Central

    Setia, Maninder Singh

    2016-01-01

Once the research question and the research design have been finalised, it is important to select the appropriate sample for the study. The method by which the researcher selects the sample is the ‘sampling method’. There are essentially two types of sampling methods: 1) probability sampling – based on chance events (such as random numbers or flipping a coin); and 2) non-probability sampling – based on the researcher's choice and on populations that are accessible and available. Some of the non-probability sampling methods are purposive sampling, convenience sampling, and quota sampling. Random sampling methods (such as a simple random sample or a stratified random sample) are forms of probability sampling. It is important to understand the different sampling methods used in clinical studies and to state the method clearly in the manuscript. The researcher should not misrepresent the sampling method in the manuscript (for instance, by using the term ‘random sample’ when a convenience sample was actually used). The sampling method will depend on the research question. For instance, the researcher may want to understand an issue in greater detail for one particular population rather than worry about the ‘generalizability’ of the results. In such a scenario, the researcher may want to use ‘purposive sampling’ for the study. PMID:27688438

  8. GalaxyGPCRloop: Template-Based and Ab Initio Structure Sampling of the Extracellular Loops of G-Protein-Coupled Receptors.

    PubMed

    Won, Jonghun; Lee, Gyu Rie; Park, Hahnbeom; Seok, Chaok

    2018-06-07

    The second extracellular loops (ECL2s) of G-protein-coupled receptors (GPCRs) are often involved in GPCR functions, and their structures have important implications in drug discovery. However, structure prediction of ECL2 is difficult because of its long length and the structural diversity among different GPCRs. In this study, a new ECL2 conformational sampling method involving both template-based and ab initio sampling was developed. Inspired by the observation of similar ECL2 structures of closely related GPCRs, a template-based sampling method employing loop structure templates selected from the structure database was developed. A new metric for evaluating similarity of the target loop to templates was introduced for template selection. An ab initio loop sampling method was also developed to treat cases without highly similar templates. The ab initio method is based on the previously developed fragment assembly and loop closure method. A new sampling component that takes advantage of secondary structure prediction was added. In addition, a conserved disulfide bridge restraining ECL2 conformation was predicted and analytically incorporated into sampling, reducing the effective dimension of the conformational search space. The sampling method was combined with an existing energy function for comparison with previously reported loop structure prediction methods, and the benchmark test demonstrated outstanding performance.

  9. Towards robust and repeatable sampling methods in eDNA based studies.

    PubMed

    Dickie, Ian A; Boyer, Stephane; Buckley, Hannah; Duncan, Richard P; Gardner, Paul; Hogg, Ian D; Holdaway, Robert J; Lear, Gavin; Makiola, Andreas; Morales, Sergio E; Powell, Jeff R; Weaver, Louise

    2018-05-26

DNA based techniques are increasingly used for measuring the biodiversity (species presence, identity, abundance and community composition) of terrestrial and aquatic ecosystems. While there are numerous reviews of molecular methods and bioinformatic steps, there has been little consideration of the methods used to collect samples upon which these later steps are based. This represents a critical knowledge gap, as methodologically sound field sampling is the foundation for subsequent analyses. We reviewed field sampling methods used for metabarcoding studies of both terrestrial and freshwater ecosystem biodiversity over a nearly three-year period (n = 75). We found that 95% (n = 71) of these studies used subjective sampling methods, inappropriate field methods, and/or failed to provide critical methodological information. It would be possible for researchers to replicate only 5% of the metabarcoding studies in our sample, a poorer level of reproducibility than for ecological studies in general. Our findings suggest greater attention to field sampling methods and reporting is necessary in eDNA-based studies of biodiversity to ensure robust outcomes and future reproducibility. Methods must be fully and accurately reported, and protocols developed that minimise subjectivity. Standardisation of sampling protocols would be one way to help to improve reproducibility, and have additional benefits in allowing compilation and comparison of data from across studies. This article is protected by copyright. All rights reserved.

  10. [Examination of analytical method for triphenyltin (TPT) and tributyltin (TBT) to revise the official methods based on "Act on the Control of Household Products Containing Harmful Substances"].

    PubMed

    Kawakami, Tsuyoshi; Isama, Kazuo; Nakashima, Harunobu; Yoshida, Jin; Ooshima, Tomoko; Ohno, Hiroyuki; Uemura, Hitoshi; Shioda, Hiroko; Kikuchi, Yoko; Matsuoka, Atsuko; Nishimura, Tetsuji

    2012-01-01

The use of triphenyltin (TPT) and tributyltin (TBT) in some household products is banned by the "Act on the Control of Household Products Containing Harmful Substances" in Japan. To revise the official analytical method, the method for detecting these organotin compounds was examined in six laboratories using a textile product, a water-based adhesive, and an oil-based paint, which contained known amounts of TPT and TBT (0.1, 1.0, 10 μg/g). TPT and TBT were measured by GC-MS after ethyl derivatization with sodium tetraethylborate. The TBT recoveries in the samples were 70-120%. The TPT recoveries in the water-based adhesive samples were 80-110%, while TPT concentrations in the textile product and oil-based paint samples decreased because of dephenylation during storage. However, the precision of the examined method was satisfactory, because most coefficients of variation for TPT and TBT in the samples were less than 10%. Furthermore, the revised method was able to detect concentrations lower than the officially regulated value. However, the sample matrix and the condition of the analytical instrument might affect the estimated TPT and TBT concentrations. Therefore, the revised method may not be suitable for quantitative tests; rather, it can be employed to judge the acceptable levels of these organotin compounds by comparing the values of a control sample containing regulated amounts of TPT and TBT with those of an unknown sample, with deuterated TPT and TBT as surrogate substances. It is desirable that TPT in textile and oil-based paint samples is analyzed immediately after the samples are obtained, because of the decomposition of TPT.

  11. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    NASA Astrophysics Data System (ADS)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
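The combination strategy described above (multiple importance sampling) can be sketched outside the rendering context. The following minimal example uses the balance heuristic, under which each sample's contribution reduces to f(x) divided by the mixture of sample-count-weighted densities; the integrand and densities here are invented for illustration and are not taken from the paper:

```python
import math
import random

def mis_estimate(f, samplers, n_per=4000, seed=1):
    """Estimate an integral with multiple importance sampling (balance
    heuristic). `samplers` is a list of (draw, pdf) pairs; with balance
    weights w_i(x) = n_i p_i(x) / sum_j n_j p_j(x), each sample's
    contribution simplifies to f(x) / sum_j n_j p_j(x)."""
    rng = random.Random(seed)
    total = 0.0
    for draw, _ in samplers:
        for _ in range(n_per):
            x = draw(rng)
            total += f(x) / sum(n_per * pdf(x) for _, pdf in samplers)
    return total

# Toy 1D problem: integrate f(x) = x over [0, 1] (true value 0.5) by
# combining a uniform sampler with one proportional to the integrand.
uniform = (lambda rng: rng.random(), lambda x: 1.0)
linear = (lambda rng: math.sqrt(rng.random()), lambda x: 2.0 * x)  # pdf(x) = 2x
estimate = mis_estimate(lambda x: x, [uniform, linear])
```

Combining a stratified or uniform strategy with an importance-matched one in this way keeps the variance low wherever either density fits the integrand, which is the property the paper exploits to reduce the SSAO sample count.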

  12. Study on Big Database Construction and its Application of Sample Data Collected in CHINA'S First National Geographic Conditions Census Based on Remote Sensing Images

    NASA Astrophysics Data System (ADS)

    Cheng, T.; Zhou, X.; Jia, Y.; Yang, G.; Bai, J.

    2018-04-01

In China's First National Geographic Conditions Census, millions of sample data records were collected across the country for interpreting land cover from remote sensing images; the number of data files exceeds 12,000,000 and has continued to grow in the follow-on project of National Geographic Conditions Monitoring. Databases such as Oracle are currently the most effective means of storing such big data, but a practical method is still needed for managing and applying the sample data. This paper studies a database construction method based on a relational database combined with a distributed file system, in which the vector data and the file data are stored in different physical locations. The key issues and their solutions are discussed. On this basis, the paper studies how the sample data can be applied and analyzes several use cases, laying a foundation for wider application of the sample data. In particular, sample data from Shaanxi province were selected to verify the method. Taking the 10 first-level classes defined in the land cover classification system as an example, the spatial distribution and density characteristics of each kind of sample data were analyzed. The results verify that the database construction method based on a relational database with a distributed file system is useful and applicable for searching, analyzing, and applying the sample data. Furthermore, the sample data collected in China's First National Geographic Conditions Census could be useful for Earth observation and for assessing land cover quality.

  13. Comparison of the Immunomagnetic Separation/Adenosine Triphosphate Rapid Method and the Modified mTEC Membrane-Filtration Method for Enumeration of Escherichia coli

    USGS Publications Warehouse

    Brady, Amie M.G.; Bushon, Rebecca N.; Bertke, Erin E.

    2009-01-01

Water quality at beaches is monitored for fecal indicator bacteria by traditional, culture-based methods that can take 18 to 24 hours to obtain results. A rapid detection method that provides estimated concentrations of fecal indicator bacteria within 1 hour from the start of sample processing would allow beach managers to post advisories or close the beach when the conditions are actually considered unsafe instead of a day later, when conditions may have changed. A rapid method that couples immunomagnetic separation with adenosine triphosphate detection (IMS/ATP rapid method) was evaluated through monitoring of Escherichia coli (E. coli) at three Lake Erie beaches in Ohio (Edgewater and Villa Angela in Cleveland and Huntington in Bay Village). Beach water samples were collected between 4 and 5 days per week during the recreational seasons (May through September) of 2006 and 2007. Composite samples were created in the lab from two point samples collected at each beach and were shown to be comparable substitutes for analysis of two individual samples. E. coli concentrations in composite samples, as determined by the culture-based method, ranged from 4 to 24,000 colony-forming units per 100 milliliters during this study across all beaches. Turbidity also was measured for each sample and ranged from 0.8 to 260 nephelometric turbidity ratio units. Environmental variables were noted at the time of sampling, including number of birds at the beach and wave height. Rainfall amounts were measured at National Weather Service stations at local airports. Turbidity, rainfall, and wave height were significantly related to the culture-based method results each year and for both years combined at each beach. The number of birds at the beach was significantly related to the culture-based method results only at Edgewater during 2006 and during both years combined.
Results of the IMS/ATP method were compared to results of the culture-based method for samples by year for each beach. The IMS/ATP method underwent several changes and refinements during the first year, including changes in reagents and antibodies and alterations to the method protocol. Because of the changes in the method, results from the two years of study could not be combined. Kendall's tau correlation coefficients for relations between the IMS/ATP and culture-based methods were significant except for samples collected during 2006 at Edgewater and for samples collected during 2007 at Villa Angela. Further, relations were stronger for samples collected in 2006 than for those collected in 2007, except at Edgewater where the reverse was observed. The 2007 dataset was examined to identify possible reasons for the observed difference in significance of relations by year. By dividing the 2007 data set into groups as a function of sampling date, relations (Kendall's tau) between methods were observed to be stronger for samples collected earlier in the season than for those collected later in the season. At Edgewater and Villa Angela, there were more birds at the beach at time of sampling later in the season compared to earlier in the season. (The number of birds was not examined at Huntington.) Also, more wet days (when rainfall during the 24 hours prior to sampling was greater than 0.05 inch) were sampled later in the season compared to earlier in the season. Differences in the dominant fecal source may explain the change in the relations between the culture-based and IMS/ATP methods.

  14. A two-dimensional matrix image based feature extraction method for classification of sEMG: A comparative analysis based on SVM, KNN and RBF-NN.

    PubMed

    Wen, Tingxi; Zhang, Zhongnan; Qiu, Ming; Zeng, Ming; Luo, Weizhen

    2017-01-01

The computer mouse is an important human-computer interaction device, but patients with physical finger disabilities are unable to operate it. Surface EMG (sEMG) can be monitored by electrodes on the skin surface and reflects neuromuscular activity; auxiliary limb equipment can therefore be controlled by sEMG classification to help physically disabled patients operate the mouse. The objective was to develop a new method to extract the sEMG generated by finger motion and to apply novel features to classify it. A window-based data acquisition method was presented to extract signal samples from sEMG electrodes. Afterwards, a two-dimensional matrix image based feature extraction method, which differs from the classical methods based on the time domain or frequency domain, was employed to transform signal samples into feature maps used for classification. In the experiments, sEMG data samples produced by the index and middle fingers at the click of a mouse button were separately acquired. Then, characteristics of the samples were analyzed to generate a feature map for each sample. Finally, machine learning classification algorithms (SVM, KNN, RBF-NN) were employed to classify these feature maps on a GPU. The study demonstrated that all classifiers can identify and classify sEMG samples effectively; in particular, the accuracy of the SVM classifier reached 100%. The signal separation method is a convenient, efficient and quick way to extract the sEMG samples produced by fingers. In addition, unlike the classical methods, the new method enables feature extraction by appropriately enlarging the energy of the sample signals. The classical machine learning classifiers all performed well using these features.

  15. Appearance-based representative samples refining method for palmprint recognition

    NASA Astrophysics Data System (ADS)

    Wen, Jiajun; Chen, Yan

    2012-07-01

Sparse representation can deal with the lack-of-sample problem because it utilizes all the training samples. However, its discrimination ability degrades when more training samples are used for representation. We propose a novel appearance-based palmprint recognition method that seeks a compromise between discrimination ability and the lack-of-sample problem so as to obtain a proper representation scheme. Under the assumption that the test sample can be well represented by a linear combination of a certain number of training samples, we first select the representative training samples according to their contributions. We then further refine the training samples by an iterative procedure that excludes, at each step, the training sample with the least contribution to the test sample. Experiments on the PolyU multispectral palmprint database and the two-dimensional and three-dimensional palmprint database show that the proposed method outperforms conventional appearance-based palmprint recognition methods. Moreover, we explore the principles governing the key parameters of the proposed algorithm, which facilitates obtaining high recognition accuracy.
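The iterative refinement described above can be sketched abstractly. The following illustration scores each training sample by a simple normalized inner-product "contribution" to the test sample and repeatedly drops the least-contributing one; the actual method derives contributions from the representation coefficients, so this is only a structural sketch with invented names:

```python
import math

def refine_representatives(train, test, keep=2):
    """Iteratively exclude the training sample with the least 'contribution'
    to the test sample until `keep` samples remain. Contribution here is the
    magnitude of the normalized inner product with the test sample."""
    def contribution(s):
        norm = math.sqrt(sum(x * x for x in s)) or 1.0
        return abs(sum(a * b for a, b in zip(s, test))) / norm

    pool = list(train)
    while len(pool) > keep:
        pool.remove(min(pool, key=contribution))
    return pool

# The sample orthogonal to the test vector contributes least and is dropped:
kept = refine_representatives([[1, 0], [0, 1], [1, 1]], [2, 0])
# → [[1, 0], [1, 1]]
```

Shrinking the representation pool this way trades the completeness of sparse representation for discrimination ability, which is the compromise the abstract describes.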

  16. A unified method to process biosolids samples for the recovery of bacterial, viral, and helminths pathogens.

    PubMed

    Alum, Absar; Rock, Channah; Abbaszadegan, Morteza

    2014-01-01

For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in residual biosolids. The current EPA methods for detecting these groups of pathogens in biosolids involve discrete steps; a separate sample is processed independently to quantify each group of pathogens. The aim of this study was to develop a unified method for simultaneous processing of a single biosolids sample to recover bacterial, viral, and helminth pathogens. In the first stage of developing the simultaneous method, nine eluents were compared for their efficiency in recovering viruses from a 100 g spiked biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments were performed to optimize the performance of the glycine-based eluent under various procedural factors, such as the solids-to-eluent ratio, stir time, and centrifugation conditions. Finally, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked into duplicate samples of biosolids collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method, compared to 34% and 68% by the EPA method, respectively. The unified sample processing method significantly reduces the time required to process biosolids samples for different groups of pathogens; it is less affected by the intrinsic variability of samples, while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.

  17. The special case of the 2 × 2 table: asymptotic unconditional McNemar test can be used to estimate sample size even for analysis based on GEE.

    PubMed

    Borkhoff, Cornelia M; Johnston, Patrick R; Stephens, Derek; Atenafu, Eshetu

    2015-07-01

Aligning the method used to estimate sample size with the planned analytic method ensures the sample size needed to achieve the planned power. When using generalized estimating equations (GEE) to analyze a paired binary primary outcome with no covariates, many investigators use an exact McNemar test to calculate sample size. We reviewed the approaches to sample size estimation for paired binary data and compared the sample size estimates on the same numerical examples. We used the hypothesized sample proportions for the 2 × 2 table to calculate the correlation between the marginal proportions and estimate sample size based on GEE. We solved the inside proportions based on the correlation and the marginal proportions to estimate sample size based on the exact McNemar, asymptotic unconditional McNemar, and asymptotic conditional McNemar tests. The asymptotic unconditional McNemar test is a good approximation of the GEE method of Pan. The exact McNemar test is too conservative and yields unnecessarily large sample size estimates compared with all other methods. In the special case of a 2 × 2 table, even when a GEE approach to binary logistic regression is the planned analytic method, the asymptotic unconditional McNemar test can be used to estimate sample size. We do not recommend using an exact McNemar test. Copyright © 2015 Elsevier Inc. All rights reserved.
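For reference, the asymptotic unconditional McNemar sample size can be computed from the two discordant cell probabilities of the 2 × 2 table. The closed form below follows standard presentations of this test; it is a sketch consistent with that literature, not code from the paper:

```python
import math
from statistics import NormalDist

def mcnemar_sample_size(p01, p10, alpha=0.05, power=0.80):
    """Number of pairs for the asymptotic unconditional McNemar test,
    given the two discordant cell probabilities p01 and p10."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided critical value
    z_b = NormalDist().inv_cdf(power)           # power quantile
    pd = p01 + p10                              # total discordant probability
    diff = p10 - p01
    n = (z_a * math.sqrt(pd) + z_b * math.sqrt(pd - diff ** 2)) ** 2 / diff ** 2
    return math.ceil(n)

# e.g. p01 = 0.05, p10 = 0.15 gives about 155 pairs at 80% power, alpha 0.05.
```

The exact McNemar calculation, by contrast, enumerates the binomial distribution of discordant pairs and, as the abstract notes, tends to return a larger n.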

  18. Multirate sampled-data yaw-damper and modal suppression system design

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1990-01-01

A multirate control law synthesis algorithm based on an infinite-time quadratic cost function was developed, along with a method for analyzing the robustness of multirate systems. A generalized multirate sampled-data control law structure (GMCLS) was introduced. A new infinite-time-based parameter optimization multirate sampled-data control law synthesis method and solution algorithm were developed. A singular-value-based method for determining gain and phase margins for multirate systems was also developed. The finite-time-based parameter optimization multirate sampled-data control law synthesis algorithm originally intended to be applied to the aircraft problem was instead demonstrated on a simpler problem involving the control of the tip position of a two-link robot arm. The GMCLS, the infinite-time-based parameter optimization multirate control law synthesis method and solution algorithm, and the singular-value-based method for determining gain and phase margins were all demonstrated by application to the aircraft control problem originally proposed for this project.

  19. Prevalence of Mixed-Methods Sampling Designs in Social Science Research

    ERIC Educational Resources Information Center

    Collins, Kathleen M. T.

    2006-01-01

    The purpose of this mixed-methods study was to document the prevalence of sampling designs utilised in mixed-methods research and to examine the interpretive consistency between interpretations made in mixed-methods studies and the sampling design used. Classification of studies was based on a two-dimensional mixed-methods sampling model. This…

  20. Molecular-based rapid inventories of sympatric diversity: a comparison of DNA barcode clustering methods applied to geography-based vs clade-based sampling of amphibians.

    PubMed

    Paz, Andrea; Crawford, Andrew J

    2012-11-01

    Molecular markers offer a universal source of data for quantifying biodiversity. DNA barcoding uses a standardized genetic marker and a curated reference database to identify known species and to reveal cryptic diversity within well-sampled clades. Rapid biological inventories, e.g. rapid assessment programs (RAPs), unlike most barcoding campaigns, are focused on particular geographic localities rather than on clades. Because of the potentially sparse phylogenetic sampling, the addition of DNA barcoding to RAPs may present a greater challenge for the identification of named species or for revealing cryptic diversity. In this article we evaluate the use of DNA barcoding for quantifying lineage diversity within a single sampling site as compared to clade-based sampling, and present examples from amphibians. We compared algorithms for identifying DNA barcode clusters (e.g. species, cryptic species or Evolutionary Significant Units) using previously published DNA barcode data obtained from geography-based sampling at a site in Central Panama, and from clade-based sampling in Madagascar. We found that clustering algorithms based on genetic distance performed similarly on sympatric as well as clade-based barcode data, while a promising coalescent-based method performed poorly on sympatric data. The various clustering algorithms were also compared in terms of speed and software implementation. Although each method has its shortcomings in certain contexts, we recommend the use of the ABGD method, which not only performs fairly well under either sampling method, but does so in a few seconds and with a user-friendly Web interface.
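    ABGD infers the barcode gap from the data itself; as a hedged sketch of the underlying idea only, the following clusters samples by single linkage over pairwise genetic distances with a fixed, hypothetical threshold, using union-find:

```python
def cluster_by_distance(dist, threshold):
    """Single-linkage clustering: join any two samples whose pairwise
    distance falls below the threshold (union-find with path halving)."""
    n = len(dist)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    for i in range(n):
        for j in range(i + 1, n):
            if dist[i][j] < threshold:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj

    clusters = {}
    for i in range(n):
        clusters.setdefault(find(i), []).append(i)
    return list(clusters.values())

# Toy p-distances: samples 0 and 1 conspecific, sample 2 divergent.
d = [[0.00, 0.01, 0.12],
     [0.01, 0.00, 0.11],
     [0.12, 0.11, 0.00]]
print(cluster_by_distance(d, 0.03))  # [[0, 1], [2]]
```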

  1. Progress in multirate digital control system design

    NASA Technical Reports Server (NTRS)

    Berg, Martin C.; Mason, Gregory S.

    1991-01-01

    A new methodology for multirate sampled-data control design based on a new generalized control law structure, two new parameter-optimization-based control law synthesis methods, and a new singular-value-based robustness analysis method are described. The control law structure can represent multirate sampled-data control laws of arbitrary structure and dynamic order, with arbitrarily prescribed sampling rates for all sensors and update rates for all processor states and actuators. The two control law synthesis methods employ numerical optimization to determine values for the control law parameters. The robustness analysis method is based on the multivariable Nyquist criterion applied to the loop transfer function for the sampling period equal to the period of repetition of the system's complete sampling/update schedule. The complete methodology is demonstrated by application to the design of a combination yaw damper and modal suppression system for a commercial aircraft.

  2. Approximation of the exponential integral (well function) using sampling methods

    NASA Astrophysics Data System (ADS)

    Baalousha, Husam Musa

    2015-04-01

    The exponential integral (also known as the well function) is often used in hydrogeology to solve the Theis and Hantush equations. Many methods have been developed to approximate the exponential integral. Most of these methods are based on numerical approximations and are valid for a certain range of the argument value. This paper presents a new approach to approximate the exponential integral. The new approach is based on sampling methods. Three different sampling methods, Latin Hypercube Sampling (LHS), Orthogonal Array (OA), and Orthogonal Array-based Latin Hypercube (OA-LH), have been used to approximate the function. Different argument values, covering a wide range, have been used. The results of the sampling methods were compared with results obtained by Mathematica software, which was used as a benchmark. All three sampling methods converge to the result obtained by Mathematica, at different rates. It was found that the orthogonal array (OA) method has the fastest convergence rate compared with LHS and OA-LH. The root mean square error (RMSE) of OA was on the order of 1E-08. This method can be used with any argument value, and can be used to solve other integrals in hydrogeology such as the leaky aquifer integral.
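    The paper's exact sampling setup is not reproduced here, but the idea can be sketched: the substitution t = u/x turns E1(u) = ∫_u^∞ e^(-t)/t dt into ∫_0^1 e^(-u/x)/x dx over the unit interval, which a one-dimensional Latin hypercube (i.e. stratified) sample estimates directly:

```python
import random
from math import exp

def expint_e1(u, n=20000, seed=1):
    """Estimate E1(u) = integral from u to infinity of e^(-t)/t dt.

    Substituting t = u/x gives the unit-interval integral of
    e^(-u/x)/x, sampled with a 1-D Latin hypercube: one uniform
    draw inside each of n equal strata of (0, 1).
    """
    rng = random.Random(seed)
    total = 0.0
    for i in range(n):
        x = (i + rng.random()) / n   # stratified draw in (0, 1)
        total += exp(-u / x) / x
    return total / n

print(round(expint_e1(1.0), 4))  # compare with E1(1) = 0.219384...
```

    Stratification is what drives the fast convergence relative to plain random sampling; the integrand is smooth and bounded on (0, 1) for u > 0, so each stratum contributes only a tiny error.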

  3. Evaluation of two outlier-detection-based methods for detecting tissue-selective genes from microarray data.

    PubMed

    Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro

    2007-05-01

    Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent's non-parametric method) can treat the various types of selective patterns equally, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent's method is not suitable for ROKU.
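    The published AIC-based procedure is not reproduced here; as a generic sketch of outlier detection by AIC only, the following flags the k most extreme tissues, choosing the k that minimizes AIC when each flagged value receives its own parameter:

```python
from math import log
from statistics import mean

def aic_outliers(values, max_k=None):
    """Choose how many extreme values to flag as outliers by
    minimizing AIC: flagged values each get their own parameter,
    and the remaining values are modeled as one normal distribution."""
    n = len(values)
    max_k = max_k if max_k is not None else n // 2
    center = sorted(values)[n // 2]              # median as a robust center
    ordered = sorted(values, key=lambda v: abs(v - center), reverse=True)
    best = (float("inf"), 0)
    for k in range(max_k + 1):
        rest = ordered[k:]
        m = mean(rest)
        var = sum((v - m) ** 2 for v in rest) / len(rest)
        var = max(var, 1e-12)                    # guard against zero variance
        aic = len(rest) * log(var) + 2 * (k + 2) # k outliers + mean + variance
        if aic < best[0]:
            best = (aic, k)
    return ordered[:best[1]]

# One tissue expressed far above the rest is flagged as selective.
print(aic_outliers([1.0, 1.2, 0.9, 1.1, 8.0]))  # [8.0]
```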

  4. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    PubMed

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.

  6. A two-stage cluster sampling method using gridded population data, a GIS, and Google Earth(TM) imagery in a population-based mortality survey in Iraq.

    PubMed

    Galway, LP; Bell, Nathaniel; Al Shatari, Sae; Hagopian, Amy; Burnham, Gilbert; Flaxman, Abraham; Weiss, William M; Rajaratnam, Julie; Takaro, Tim K

    2012-04-27

    Mortality estimates can measure and monitor the impacts of conflict on a population, guide humanitarian efforts, and help to better understand the public health impacts of conflict. Vital statistics registration and surveillance systems are rarely functional in conflict settings, posing the challenge of estimating mortality using retrospective population-based surveys. We present a two-stage cluster sampling method for application in population-based mortality surveys. The sampling method utilizes gridded population data and a geographic information system (GIS) to select clusters in the first sampling stage, and Google Earth TM imagery and sampling grids to select households in the second sampling stage. The sampling method was implemented in a household mortality study in Iraq in 2011. Factors affecting feasibility and methodological quality are described. Sampling is a challenge in retrospective population-based mortality studies, and alternatives that improve on the conventional approaches are needed. The sampling strategy presented here was designed to generate a representative sample of the Iraqi population while reducing the potential for bias and considering the context-specific challenges of the study setting. This sampling strategy, or variations on it, is adaptable and should be considered and tested in other conflict settings.
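    The protocol details are in the paper; as a hedged sketch of the common first-stage step only, the following selects grid cells with probability proportional to (gridded) population size via systematic PPS sampling. The cell counts are hypothetical:

```python
import random

def pps_systematic(sizes, n_clusters, seed=7):
    """Systematic PPS sampling: walk a random-start, fixed-interval
    pointer along the cumulative population counts, so cells holding
    more people are selected proportionally more often."""
    total = sum(sizes)
    interval = total / n_clusters
    pointer = random.Random(seed).uniform(0, interval)
    chosen, cum = [], 0.0
    for idx, size in enumerate(sizes):
        cum += size
        while pointer < cum and len(chosen) < n_clusters:
            chosen.append(idx)
            pointer += interval
    return chosen

# Hypothetical gridded population counts for 6 cells; cell 2 holds
# more people than one whole sampling interval, so it is always hit.
print(pps_systematic([1200, 300, 2500, 800, 150, 1050], 3))
```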

  8. Confidence intervals for the population mean tailored to small sample sizes, with applications to survey sampling.

    PubMed

    Rosenblum, Michael A; van der Laan, Mark J

    2009-01-07

    The validity of standard confidence intervals constructed in survey sampling is based on the central limit theorem. For small sample sizes, the central limit theorem may give a poor approximation, resulting in confidence intervals that are misleading. We discuss this issue and propose methods for constructing confidence intervals for the population mean tailored to small sample sizes. We present a simple approach for constructing confidence intervals for the population mean based on tail bounds for the sample mean that are correct for all sample sizes. Bernstein's inequality provides one such tail bound. The resulting confidence intervals have guaranteed coverage probability under much weaker assumptions than are required for standard methods. A drawback of this approach, as we show, is that these confidence intervals are often quite wide. In response to this, we present a method for constructing much narrower confidence intervals, which are better suited for practical applications, and that are still more robust than confidence intervals based on standard methods, when dealing with small sample sizes. We show how to extend our approaches to much more general estimation problems than estimating the sample mean. We describe how these methods can be used to obtain more reliable confidence intervals in survey sampling. As a concrete example, we construct confidence intervals using our methods for the number of violent deaths between March 2003 and July 2006 in Iraq, based on data from the study "Mortality after the 2003 invasion of Iraq: A cross sectional cluster sample survey," by Burnham et al. (2006).
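    A minimal sketch of a confidence interval built from Bernstein's tail bound for responses bounded in [0, 1] (not the authors' narrower construction); it also illustrates the abstract's point that the guaranteed intervals are often wide:

```python
from math import log, sqrt
from statistics import mean

def bernstein_halfwidth(n, delta=0.05, var_bound=0.25, b=1.0):
    """Half-width t solving 2*exp(-n t^2 / (2 var + (2/3) b t)) = delta,
    from Bernstein's inequality for variables bounded in [0, 1]
    (so the variance is at most 1/4 and |X - mu| <= 1)."""
    L = log(2 / delta)
    return (b * L / 3 + sqrt((b * L / 3) ** 2 + 2 * n * var_bound * L)) / n

def bernstein_ci(sample, delta=0.05):
    """Guaranteed-coverage CI for the mean of [0, 1]-bounded data."""
    t = bernstein_halfwidth(len(sample), delta)
    m = mean(sample)
    return (max(0.0, m - t), min(1.0, m + t))

# Small survey sample of 0/1 responses (hypothetical data).
data = [1, 0, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 1]
print(bernstein_ci(data))  # valid at any n, but wide at n = 20
```

    The guarantee holds for every sample size, with no appeal to the central limit theorem, at the cost of the width visible above.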

  9. Quantifying Uncertainties from Presence Data Sampling Methods for Species Distribution Modeling: Focused on Vegetation.

    NASA Astrophysics Data System (ADS)

    Sung, S.; Kim, H. G.; Lee, D. K.; Park, J. H.; Mo, Y.; Kil, S.; Park, C.

    2016-12-01

    The impact of climate change has been observed throughout the globe. Ecosystems are experiencing rapid changes such as vegetation shifts and species extinctions. In this context, the Species Distribution Model (SDM) is one of the popular methods for projecting the impact of climate change on ecosystems. An SDM is based on the niche of a certain species, which means that presence point data are essential for finding the biological niche of the species. To run an SDM for plants, there are certain considerations regarding the characteristics of vegetation. Normally, remote sensing techniques are used to produce vegetation data over large areas. As a consequence, the exact locations of presence data carry high uncertainty, because presence data sets are selected from polygon and raster datasets. Thus, sampling methods for modeling vegetation presence data should be carefully selected. In this study, we used three different sampling methods for selecting vegetation presence data: random sampling, stratified sampling, and site-index-based sampling. We used the R package BIOMOD2 to assess uncertainty from the modeling, and included BioCLIM variables and other environmental variables as input data. As a result, despite differences among the 10 SDMs, the sampling methods showed differences in ROC values: random sampling showed the lowest ROC value, while site-index-based sampling showed the highest. The study thus shows that the uncertainties arising from presence data sampling methods and SDMs can be quantified.

  10. Ultrasensitive Hybridization-Based ELISA Method for the Determination of Phosphorodiamidate Morpholino Oligonucleotides in Biological Samples.

    PubMed

    Burki, Umar; Straub, Volker

    2017-01-01

    Determining the concentration of oligonucleotides in biological samples such as tissue lysates and serum is essential for establishing their biodistribution and pharmacokinetic profiles, respectively. ELISA-based assays have shown far greater sensitivity than other methods such as HPLC and LC/MS. Here, we describe a novel ultrasensitive hybridization-based ELISA method for quantitating morpholino oligonucleotides in mouse tissue lysate and serum samples. The assay has a linear detection range of 5-250 pM (R2 > 0.99).

  11. Integrating conventional and inverse representation for face recognition.

    PubMed

    Xu, Yong; Li, Xuelong; Yang, Jian; Lai, Zhihui; Zhang, David

    2014-10-01

    Representation-based classification methods are all constructed on the basis of the conventional representation, which first expresses the test sample as a linear combination of the training samples and then exploits the deviation between the test sample and the representation result of every class to perform classification. However, this deviation does not always reflect the difference between the test sample and each class well. In this paper, we propose a novel representation-based classification method for face recognition. This method integrates conventional and inverse representation-based classification for better face recognition. It first produces the conventional representation of the test sample, i.e., uses a linear combination of the training samples to represent the test sample. Then it obtains the inverse representation, i.e., provides an approximate representation of each training sample of a subject by exploiting the test sample and the training samples of the other subjects. Finally, the proposed method exploits the conventional and inverse representations to generate two kinds of scores of the test sample with respect to each class and combines them to recognize the face. The paper presents the theoretical foundation and rationale of the proposed method. Moreover, this paper shows for the first time that a basic property of the human face, i.e., its symmetry, can be exploited to generate new training and test samples. As these new samples reflect possible appearances of the face, using them enables higher accuracy. The experiments show that the proposed conventional and inverse representation-based linear regression classification (CIRLRC), an improvement to linear regression classification (LRC), can obtain very high accuracy and greatly outperforms naive LRC and other state-of-the-art conventional representation-based face recognition methods. The accuracy of CIRLRC can be 10% greater than that of LRC.
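    As a sketch of the baseline the paper improves on, linear regression classification (LRC) represents the test sample in each class's training subspace by least squares and picks the class with the smallest residual. The toy vectors below stand in for face images:

```python
import numpy as np

def lrc_classify(train, labels, y):
    """Linear regression classification: project the test sample y
    onto each class's training-sample subspace; the class whose
    least-squares reconstruction leaves the smallest residual wins."""
    best_label, best_res = None, np.inf
    for label in set(labels):
        Xc = np.column_stack([x for x, l in zip(train, labels) if l == label])
        coef, *_ = np.linalg.lstsq(Xc, y, rcond=None)
        residual = np.linalg.norm(y - Xc @ coef)
        if residual < best_res:
            best_label, best_res = label, residual
    return best_label

# Toy "faces": class A vectors lie near [1,1,0,0], class B near [0,0,1,1].
train = [np.array([1.0, 1.1, 0.0, 0.1]), np.array([0.9, 1.0, 0.1, 0.0]),
         np.array([0.0, 0.1, 1.0, 0.9]), np.array([0.1, 0.0, 1.1, 1.0])]
labels = ["A", "A", "B", "B"]
print(lrc_classify(train, labels, np.array([1.0, 1.0, 0.05, 0.05])))  # A
```

    CIRLRC adds a second, inverse score per class and combines the two; the sketch above covers only the conventional direction.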

  12. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.

  13. Correcting for Sample Contamination in Genotype Calling of DNA Sequence Data

    PubMed Central

    Flickinger, Matthew; Jun, Goo; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2015-01-01

    DNA sample contamination is a frequent problem in DNA sequencing studies and can result in genotyping errors and reduced power for association testing. We recently described methods to identify within-species DNA sample contamination based on sequencing read data, showed that our methods can reliably detect and estimate contamination levels as low as 1%, and suggested strategies to identify and remove contaminated samples from sequencing studies. Here we propose methods to model contamination during genotype calling as an alternative to removal of contaminated samples from further analyses. We compare our contamination-adjusted calls to calls that ignore contamination and to calls based on uncontaminated data. We demonstrate that, for moderate contamination levels (5%–20%), contamination-adjusted calls eliminate 48%–77% of the genotyping errors. For lower levels of contamination, our contamination correction methods produce genotypes nearly as accurate as those based on uncontaminated data. Our contamination correction methods are useful generally, but are particularly helpful for sample contamination levels from 2% to 20%. PMID:26235984
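    A hedged sketch of the idea under a simple binomial read model (the error rate, contamination level, and allele frequency below are hypothetical, and this is not the authors' full model): contamination shifts the expected alt-allele fraction, so likelihoods evaluated at the shifted fractions, with the contaminant genotype marginalized under Hardy-Weinberg, can recover the correct call:

```python
from math import comb, exp, log

def alt_fraction(g, err=0.01):
    """Expected alt-read fraction for a genotype with g alt alleles."""
    return {0: err, 1: 0.5, 2: 1 - err}[g]

def loglik(k, n, p):
    """Binomial log-likelihood of observing k alt reads out of n."""
    return log(comb(n, k)) + k * log(p) + (n - k) * log(1 - p)

def call_genotype(k, n, alpha=0.0, alt_freq=0.5):
    """ML genotype for k alt reads of n when a fraction alpha of reads
    comes from a contaminant whose genotype is marginalized under
    Hardy-Weinberg at the population alt-allele frequency."""
    hw = [(1 - alt_freq) ** 2, 2 * alt_freq * (1 - alt_freq), alt_freq ** 2]
    mixed = lambda g1, g2: (1 - alpha) * alt_fraction(g1) + alpha * alt_fraction(g2)
    return max(range(3), key=lambda g1: sum(
        hw[g2] * exp(loglik(k, n, mixed(g1, g2))) for g2 in range(3)))

# 20 alt reads of 100 on a truly hom-ref sample with 20% contamination
# from a hom-alt individual (population alt-allele frequency 0.9):
print(call_genotype(20, 100))                            # 1: naive het call (wrong)
print(call_genotype(20, 100, alpha=0.20, alt_freq=0.9))  # 0: adjusted call (correct)
```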

  14. Monte Carlo approaches to sampling forested tracts with lines or points

    Treesearch

    Harry T. Valentine; Jeffrey H. Gove; Timothy G. Gregoire

    2001-01-01

    Several line- and point-based sampling methods can be employed to estimate the aggregate dimensions of trees standing on a forested tract or pieces of coarse woody debris lying on the forest floor. Line methods include line intersect sampling, horizontal line sampling, and transect relascope sampling; point methods include variable- and fixed-radius plot sampling, and...

  15. Evaluation of different approaches for identifying optimal sites to predict mean hillslope soil moisture content

    NASA Astrophysics Data System (ADS)

    Liao, Kaihua; Zhou, Zhiwen; Lai, Xiaoming; Zhu, Qing; Feng, Huihui

    2017-04-01

    The identification of representative soil moisture sampling sites is important for validating remotely sensed mean soil moisture over an area and for ground-based soil moisture measurements in catchment or hillslope hydrological studies. Numerous approaches have been developed to identify optimal sites for predicting mean soil moisture. Each method has certain advantages and disadvantages, but they have rarely been evaluated and compared. In our study, surface (0-20 cm) soil moisture data from January 2013 to March 2016 (a total of 43 sampling days) were collected at 77 sampling sites on a mixed land-use (tea and bamboo) hillslope in the hilly area of the Taihu Lake Basin, China. A total of 10 methods (temporal stability (TS) analyses based on 2 indices, K-means clustering based on 6 kinds of inputs, and 2 random sampling strategies) were evaluated for determining optimal sampling sites for mean soil moisture estimation. They were TS analyses based on the smallest index of temporal stability (ITS, a combination of the mean relative difference and the standard deviation of relative difference (SDRD)) and on the smallest SDRD; K-means clustering based on soil properties and terrain indices (EFs), on repeated soil moisture measurements (Theta), on EFs plus one-time soil moisture data (EFsTheta), and on the principal components derived from EFs (EFs-PCA), Theta (Theta-PCA), and EFsTheta (EFsTheta-PCA); and global and stratified random sampling strategies. Results showed that TS based on the smallest ITS was better (RMSE = 0.023 m3 m-3) than that based on the smallest SDRD (RMSE = 0.034 m3 m-3). K-means clustering based on EFsTheta (-PCA) was better (RMSE <0.020 m3 m-3) than that based on EFs (-PCA) or Theta (-PCA). The sampling design stratified by land use was more efficient than the global random method: 40 and 60 sampling sites were needed for stratified and global sampling, respectively, to make their performance comparable to the best K-means method (EFsTheta-PCA). Overall, TS required only one site, but its accuracy was limited. The best K-means method required fewer than 8 sites and yielded high accuracy, but extra soil and terrain information is necessary when using this method. The stratified sampling strategy can be used when no prior knowledge of soil moisture variation is available. This information will help in selecting the optimal method for estimating the areal mean soil moisture.
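    As a sketch of the temporal stability (TS) ranking described above (the data below are synthetic), each site's relative difference from the spatial mean is tracked over time and sites are ranked by ITS = sqrt(MRD² + SDRD²):

```python
from math import sqrt
from statistics import mean, stdev

def temporal_stability_rank(moisture):
    """moisture[i][t]: soil moisture at site i on sampling day t.

    Ranks sites by ITS = sqrt(MRD^2 + SDRD^2), where the relative
    difference of site i on day t is its deviation from that day's
    spatial mean, divided by that mean."""
    n_sites, n_days = len(moisture), len(moisture[0])
    day_means = [mean(moisture[i][t] for i in range(n_sites))
                 for t in range(n_days)]
    its = []
    for i in range(n_sites):
        rd = [(moisture[i][t] - day_means[t]) / day_means[t]
              for t in range(n_days)]
        its.append((sqrt(mean(rd) ** 2 + stdev(rd) ** 2), i))
    return sorted(its)

# Site 1 tracks the hillslope mean; sites 0 and 2 are wet/dry extremes.
data = [[0.30, 0.36, 0.25],   # consistently wet
        [0.25, 0.30, 0.20],   # near the spatial mean
        [0.19, 0.23, 0.14]]   # consistently dry
print(temporal_stability_rank(data)[0][1])  # most representative site: 1
```

    The single best-ranked site is the one the TS approach would monitor; as the abstract notes, one site is cheap but its accuracy is limited.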

  16. Acupuncture injection for field amplified sample stacking and glass microchip-based capillary gel electrophoresis.

    PubMed

    Ha, Ji Won; Hahn, Jong Hoon

    2017-02-01

    Acupuncture sample injection is a simple method to deliver well-defined nanoliter-scale sample plugs in PDMS microfluidic channels. This acupuncture injection method in microchip CE has several advantages, including minimized sample consumption, the capability of serial injections of different sample solutions into the same microchannel, and the capability of injecting sample plugs into any desired position of a microchannel. Herein, we demonstrate that the simple and cost-effective acupuncture sample injection method can be used for PDMS microchip-based field-amplified sample stacking in the most simplified straight channel by applying a single potential. We achieved increased electropherogram signals in the case of sample stacking. Furthermore, we show that microchip CGE of a ΦX174 DNA-HaeIII digest can be performed with the acupuncture injection method on a glass microchip while minimizing sample loss and voltage control hardware. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. MStern Blotting-High Throughput Polyvinylidene Fluoride (PVDF) Membrane-Based Proteomic Sample Preparation for 96-Well Plates.

    PubMed

    Berger, Sebastian T; Ahmed, Saima; Muntel, Jan; Cuevas Polo, Nerea; Bachur, Richard; Kentsis, Alex; Steen, Judith; Steen, Hanno

    2015-10-01

    We describe a 96-well plate compatible membrane-based proteomic sample processing method, which enables the complete processing of 96 samples (or multiples thereof) within a single workday. This method uses a large-pore hydrophobic PVDF membrane that efficiently adsorbs proteins, resulting in fast liquid transfer through the membrane and significantly reduced sample processing times. Low liquid transfer speeds have prevented the useful 96-well plate implementation of FASP as a widely used membrane-based proteomic sample processing method. We validated our approach on whole-cell lysate and urine and cerebrospinal fluid as clinically relevant body fluids. Without compromising peptide and protein identification, our method uses a vacuum manifold and circumvents the need for digest desalting, making our processing method compatible with standard liquid handling robots. In summary, our new method maintains the strengths of FASP and simultaneously overcomes one of the major limitations of FASP without compromising protein identification and quantification. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  19. A comparison of respondent-driven and venue-based sampling of female sex workers in Liuzhou, China

    PubMed Central

    Weir, Sharon S; Merli, M Giovanna; Li, Jing; Gandhi, Anisha D; Neely, William W; Edwards, Jessie K; Suchindran, Chirayath M; Henderson, Gail E; Chen, Xiang-Sheng

    2012-01-01

    Objectives To compare two methods for sampling female sex workers (FSWs) for bio-behavioural surveillance. We compared the populations of sex workers recruited by the venue-based Priorities for Local AIDS Control Efforts (PLACE) method and a concurrently implemented network-based sampling method, respondent-driven sampling (RDS), in Liuzhou, China. Methods For the PLACE protocol, all female workers at a stratified random sample of venues identified as places where people meet new sexual partners were interviewed and tested for syphilis. Female workers who reported sex work in the past 4 weeks were categorised as FSWs. RDS used peer recruitment and chain referral to obtain a sample of FSWs. Data were collected between October 2009 and January 2010. We compared the socio-demographic characteristics and the percentage with a positive syphilis test of FSWs recruited by PLACE and RDS. Results The prevalence of a positive syphilis test was 24% among FSWs recruited by PLACE and 8.5% among those recruited by RDS and tested (prevalence ratio 3.3; 95% CI 1.5 to 7.2). Socio-demographic characteristics (age, residence and monthly income) also varied by sampling method. PLACE recruited fewer FSWs than RDS (161 vs 583), was more labour-intensive and had difficulty gaining access to some venues. RDS was more likely to recruit from areas near the RDS office and from large low prevalence entertainment venues. Conclusions Surveillance protocols using different sampling methods can obtain different estimates of prevalence and population characteristics. Venue-based and network-based methods each have strengths and limitations reflecting differences in design and assumptions. We recommend that more research be conducted on measuring bias in bio-behavioural surveillance. PMID:23172350

  20. A Monte-Carlo method which is not based on Markov chain algorithm, used to study electrostatic screening of ion potential

    NASA Astrophysics Data System (ADS)

    Šantić, Branko; Gracin, Davor

    2017-12-01

    A new, simple Monte Carlo method is introduced for the study of electrostatic screening by surrounding ions. The proposed method is not based on the commonly used Markov chain method for sample generation. Each sample is pristine, with no correlation with other samples. As the main novelty, pairs of ions are gradually added to a sample provided that the energy of each ion is within the boundaries determined by the temperature and the size of the ions. The proposed method provides reliable results, as demonstrated for the screening of an ion in plasma and in water.

  1. Sampling Based Influence Maximization on Linear Threshold Model

    NASA Astrophysics Data System (ADS)

    Jia, Su; Chen, Ling

    2018-04-01

    A sampling-based method for influence maximization on the linear threshold (LT) model is presented. The method samples the routes in the possible worlds of the social network, and uses the Chernoff bound to estimate the number of samples so that the error can be constrained within a given bound. The activation probabilities of the routes in the possible worlds are then calculated and used to compute the influence spread of each node in the network. Our experimental results show that our method can effectively select an appropriate seed node set that spreads larger influence than other similar methods.
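
    The Chernoff-bound sample-count idea can be sketched generically. The abstract does not give the paper's exact bound, so the sketch below uses the standard Hoeffding/Chernoff form for a [0, 1]-valued Monte Carlo estimate; `eps` (additive error) and `delta` (failure probability) are illustrative parameter names:

```python
import math

def required_samples(eps, delta):
    """Hoeffding/Chernoff-style bound: number of Monte Carlo samples needed
    so that the empirical mean of a [0, 1]-valued estimate deviates from its
    true value by more than eps with probability at most delta."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))
```

    For eps = delta = 0.05 this yields 738 route samples; halving `eps` roughly quadruples the required number, which is why a tight error bound matters for the cost of the sampling stage.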

  2. Sampling enhancement for the quantum mechanical potential based molecular dynamics simulations: a general algorithm and its extension for free energy calculation on rugged energy surface.

    PubMed

    Li, Hongzhi; Yang, Wei

    2007-03-21

    An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed, which is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developments specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.

  3. A Ricin Forensic Profiling Approach Based on a Complex Set of Biomarkers

    DOE PAGES

    Fredriksson, Sten-Ake; Wunschel, David S.; Lindstrom, Susanne Wiklund; ...

    2018-03-28

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1 – PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation methods. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed based on the calibration set. By the use of a decision tree and two OPLS-DA models, the sample preparation methods of test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions of the test set was achieved.

  5. Methods of developing core collections based on the predicted genotypic value of rice ( Oryza sativa L.).

    PubMed

    Li, C T; Shi, C H; Wu, J G; Xu, H M; Zhang, H Z; Ren, Y L

    2004-04-01

    The selection of an appropriate sampling strategy and clustering method is important in the construction of core collections based on predicted genotypic values in order to retain the greatest degree of genetic diversity of the initial collection. In this study, methods of developing rice core collections were evaluated based on the predicted genotypic values for 992 rice varieties with 13 quantitative traits. The genotypic values of the traits were predicted by the adjusted unbiased prediction (AUP) method. Based on the predicted genotypic values, Mahalanobis distances were calculated and employed to measure the genetic similarities among the rice varieties. Six hierarchical clustering methods, including the single linkage, median linkage, centroid, unweighted pair-group average, weighted pair-group average and flexible-beta methods, were combined with random, preferred and deviation sampling to develop 18 core collections of rice germplasm. The results show that the deviation sampling strategy in combination with the unweighted pair-group average method of hierarchical clustering retains the greatest degree of genetic diversity of the initial collection. The core collections sampled using predicted genotypic values had more genetic diversity than those based on phenotypic values.

  6. COST-EFFECTIVE SAMPLING FOR SPATIALLY DISTRIBUTED PHENOMENA

    EPA Science Inventory

    Various measures of sampling plan cost and loss are developed and analyzed as they relate to a variety of multidisciplinary sampling techniques. The sampling choices examined include methods from design-based sampling, model-based sampling, and geostatistics. Graphs and tables ar...

  7. Evaluation of Two Outlier-Detection-Based Methods for Detecting Tissue-Selective Genes from Microarray Data

    PubMed Central

    Kadota, Koji; Konishi, Tomokazu; Shimizu, Kentaro

    2007-01-01

    Large-scale expression profiling using DNA microarrays enables identification of tissue-selective genes for which expression is considerably higher and/or lower in some tissues than in others. Among numerous possible methods, only two outlier-detection-based methods (an AIC-based method and Sprent’s non-parametric method) can treat equally various types of selective patterns, but they produce substantially different results. We investigated the performance of these two methods for different parameter settings and for a reduced number of samples. We focused on their ability to detect selective expression patterns robustly. We applied them to public microarray data collected from 36 normal human tissue samples and analyzed the effects of both changing the parameter settings and reducing the number of samples. The AIC-based method was more robust in both cases. The findings confirm that the use of the AIC-based method in the recently proposed ROKU method for detecting tissue-selective expression patterns is correct and that Sprent’s method is not suitable for ROKU. PMID:19936074

  8. Improved lossless intra coding for H.264/MPEG-4 AVC.

    PubMed

    Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J

    2006-09-01

    A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
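
    The samplewise DPCM idea, predicting each sample from its already-decoded neighbour and keeping only the residual, can be sketched in a few lines. This is an illustrative simplification: the standard's actual predictor follows the selected intra prediction direction, and the entropy coding of the residual block is omitted here:

```python
def dpcm_encode(samples):
    """Sample-wise DPCM: predict each sample from its left neighbour
    (0 before the first sample) and store only the residual (lossless)."""
    residuals, prev = [], 0
    for s in samples:
        residuals.append(s - prev)
        prev = s
    return residuals

def dpcm_decode(residuals):
    """Invert the DPCM: accumulate residuals to recover the samples exactly."""
    samples, prev = [], 0
    for r in residuals:
        prev += r
        samples.append(prev)
    return samples
```

    A slowly varying pixel row such as [120, 121, 119, 119, 125] becomes the small-magnitude residuals [120, 1, -2, 0, 6], which is what makes the subsequent entropy coding effective.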

  9. The Analysis of Organizational Diagnosis on Based Six Box Model in Universities

    ERIC Educational Resources Information Center

    Hamid, Rahimi; Siadat, Sayyed Ali; Reza, Hoveida; Arash, Shahin; Ali, Nasrabadi Hasan; Azizollah, Arbabisarjou

    2011-01-01

    Purpose: To analyze organizational diagnosis based on the six-box model at universities. Research method: The research method was a descriptive survey. The statistical population consisted of 1544 faculty members of universities, from which 218 persons were chosen as the sample through the stratified random sampling method. The research instruments were organizational…

  10. From picture to porosity of river bed material using Structure-from-Motion with Multi-View-Stereo

    NASA Astrophysics Data System (ADS)

    Seitz, Lydia; Haas, Christian; Noack, Markus; Wieprecht, Silke

    2018-04-01

    Common methods for in-situ determination of porosity of river bed material are time- and effort-consuming. Although mathematical predictors can be used for estimation, they do not adequately represent porosities. The objective of this study was to assess a new approach for the determination of porosity of frozen sediment samples. The method is based on volume determination by applying Structure-from-Motion with Multi View Stereo (SfM-MVS) to estimate a 3D volumetric model based on overlapping imagery. The method was applied on artificial sediment mixtures as well as field samples. In addition, the commonly used water replacement method was applied to determine porosities in comparison with the SfM-MVS method. We examined a range of porosities from 0.16 to 0.46 that are representative of the wide range of porosities found in rivers. SfM-MVS performed well in determining volumes of the sediment samples. A very good correlation (r = 0.998, p < 0.0001) was observed between the SfM-MVS and the water replacement method. Results further show that the water replacement method underestimated total sample volumes. A comparison with several mathematical predictors showed that for non-uniform samples the calculated porosity based on the standard deviation performed better than porosities based on the median grain size. None of the predictors were effective at estimating the porosity of the field samples.
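
    Once SfM-MVS has produced the total sample volume, the porosity follows from the standard definition porosity = 1 − V_solids/V_total. The helper below is a sketch of that arithmetic; the quartz-like particle density default of 2.65 g/cm³ is an assumption for illustration, not a value stated in this abstract:

```python
def porosity(total_volume_cm3, dry_mass_g, particle_density_g_cm3=2.65):
    """Porosity = 1 - V_solids / V_total, where the solids volume is
    dry mass / particle density (2.65 g/cm^3 is a common default for
    quartz-dominated river sediment)."""
    v_solids = dry_mass_g / particle_density_g_cm3
    return 1.0 - v_solids / total_volume_cm3
```

    For example, a 1000 cm³ frozen sample with 1855 g of dry sediment gives a porosity of 0.30, squarely inside the 0.16 to 0.46 range examined in the study.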

  11. A STANDARDIZED ASSESSMENT METHOD (SAM) FOR RIVERINE MACROINVERTEBRATES

    EPA Science Inventory

    A macroinvertebrate sampling method for large rivers based on desirable characteristics of existing nonwadeable methods was developed and tested. Six sites each were sampled on the Great Miami and Kentucky Rivers, reflecting a human disturbance gradient. Samples were collected ...

  12. Detecting and Estimating Contamination of Human DNA Samples in Sequencing and Array-Based Genotype Data

    PubMed Central

    Jun, Goo; Flickinger, Matthew; Hetrick, Kurt N.; Romm, Jane M.; Doheny, Kimberly F.; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2012-01-01

    DNA sample contamination is a serious problem in DNA sequencing studies and may result in systematic genotype misclassification and false positive associations. Although methods exist to detect and filter out cross-species contamination, few methods to detect within-species sample contamination are available. In this paper, we describe methods to identify within-species DNA sample contamination based on (1) a combination of sequencing reads and array-based genotype data, (2) sequence reads alone, and (3) array-based genotype data alone. Analysis of sequencing reads allows contamination detection after sequence data is generated but prior to variant calling; analysis of array-based genotype data allows contamination detection prior to generation of costly sequence data. Through a combination of analysis of in silico and experimentally contaminated samples, we show that our methods can reliably detect and estimate levels of contamination as low as 1%. We evaluate the impact of DNA contamination on genotype accuracy and propose effective strategies to screen for and prevent DNA contamination in sequencing studies. PMID:23103226

  13. Jaccard distance based weighted sparse representation for coarse-to-fine plant species recognition.

    PubMed

    Zhang, Shanwen; Wu, Xiaowei; You, Zhuhong

    2017-01-01

    Leaf-based plant species recognition plays an important role in ecological protection; however, its application to large and modern leaf databases has been a long-standing obstacle due to computational cost and feasibility. Recognizing such limitations, we propose a Jaccard distance based sparse representation (JDSR) method, which adopts a two-stage, coarse-to-fine strategy for plant species recognition. In the first stage, we use the Jaccard distance between the test sample and each training sample to coarsely determine the candidate classes of the test sample. The second stage includes a Jaccard distance based weighted sparse representation based classification (WSRC), which aims to approximately represent the test sample in the training space and classify it by the approximation residuals. Since the training model of our JDSR method involves much fewer but more informative representatives, this method is expected to overcome the limitation of high computational and memory costs in traditional sparse representation based classification. Comparative experimental results on a public leaf image database demonstrate that the proposed method outperforms other existing feature extraction and SRC based plant recognition methods in terms of both accuracy and computational speed.
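
    The coarse stage can be sketched as below. The function `candidate_classes` and the (label, feature-set) layout are illustrative names of our own, not the authors' API; the Jaccard distance itself is the standard set definition:

```python
def jaccard_distance(a, b):
    """Jaccard distance between two binary feature sets:
    1 - |intersection| / |union| (0 = identical, 1 = disjoint)."""
    a, b = set(a), set(b)
    if not a and not b:
        return 0.0
    return 1.0 - len(a & b) / len(a | b)

def candidate_classes(test_feats, training, k=3):
    """Coarse stage: return the classes of the k training samples closest
    to the test sample in Jaccard distance.  `training` is a list of
    (label, feature_set) pairs."""
    ranked = sorted(training, key=lambda t: jaccard_distance(test_feats, t[1]))
    return {label for label, _ in ranked[:k]}
```

    The fine stage would then run the weighted sparse representation only over training samples from these candidate classes, which is where the computational saving comes from.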

  14. Comparison of Standard Culture-Based Method to Culture-Independent Method for Evaluation of Hygiene Effects on the Hand Microbiome

    PubMed Central

    Leff, J.; Henley, J.; Tittl, J.; De Nardo, E.; Butler, M.; Griggs, R.; Fierer, N.

    2017-01-01

    ABSTRACT Hands play a critical role in the transmission of microbiota on one’s own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples (P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS (P < 0.05) and ethanol control (P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. PMID:28351915

  15. A Rapid Identification Method for Calamine Using Near-Infrared Spectroscopy Based on Multi-Reference Correlation Coefficient Method and Back Propagation Artificial Neural Network.

    PubMed

    Sun, Yangbo; Chen, Long; Huang, Bisheng; Chen, Keli

    2017-07-01

    As a mineral, the traditional Chinese medicine calamine has a similar shape to many other minerals. Investigations of commercially available calamine samples have shown that many fake and inferior calamine goods are sold on the market. The conventional identification method for calamine is complicated; therefore, given the large scale of calamine samples, a rapid identification method is needed. To establish a qualitative model using near-infrared (NIR) spectroscopy for rapid identification of various calamine samples, large quantities of calamine samples including crude products, counterfeits and processed products were collected and correctly identified using the physicochemical and powder X-ray diffraction methods. The NIR spectroscopy method was used to analyze these samples by combining the multi-reference correlation coefficient (MRCC) method and the error back propagation artificial neural network algorithm (BP-ANN), so as to realize the qualitative identification of calamine samples. The accuracy rate of the model based on NIR and MRCC methods was 85%; in addition, the model, which took multiple factors into comprehensive consideration, can be used to identify crude calamine products, counterfeits and processed products. Furthermore, by inputting the correlation coefficients of multiple references as the spectral feature data of samples into BP-ANN, a BP-ANN model of qualitative identification was established, whose accuracy rate increased to 95%. The MRCC method can thus be used as a NIR-based method in the process of BP-ANN modeling.
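
    The multi-reference idea, correlating each sample spectrum against several reference spectra and feeding the resulting coefficient vector to the classifier, can be sketched as follows. `mrcc_features` is an assumed name, and the Pearson coefficient is the conventional choice; the abstract does not state the exact correlation formula used:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mrcc_features(spectrum, references):
    """Correlation of one sample spectrum with each reference spectrum;
    the resulting vector is the spectral feature input for a BP-ANN."""
    return [pearson(spectrum, ref) for ref in references]
```

    Using several references rather than a single one is what lets the downstream network separate crude products, counterfeits and processed products rather than just accept/reject a single match.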

  16. Meta-Storms: efficient search for similar microbial communities based on a novel indexing scheme and similarity score for metagenomic data.

    PubMed

    Su, Xiaoquan; Xu, Jian; Ning, Kang

    2012-10-01

    Scientists have long sought to effectively compare different microbial communities (also referred to as 'metagenomic samples' here) on a large scale: given a set of unknown samples, find similar metagenomic samples in a large repository and examine how similar these samples are. With the metagenomic samples accumulated to date, it is possible to build a database of metagenomic samples of interest. Any metagenomic sample could then be searched against this database to find the most similar metagenomic sample(s). However, on one hand, current databases with large numbers of metagenomic samples mostly serve as data repositories that offer few functionalities for analysis; on the other hand, methods to measure the similarity of metagenomic data work well only for small sets of samples by pairwise comparison. It is not yet clear how to efficiently search for metagenomic samples against a large metagenomic database. In this study, we have proposed a novel method, Meta-Storms, that can systematically and efficiently organize and search metagenomic data. It includes the following components: (i) creating a database of metagenomic samples based on their taxonomical annotations, (ii) efficient indexing of samples in the database based on a hierarchical taxonomy indexing strategy, (iii) searching for a metagenomic sample against the database by a fast scoring function based on quantitative phylogeny and (iv) managing the database by index export, index import, data insertion, data deletion and database merging. We have collected more than 1300 metagenomic datasets from the public domain and in-house facilities, and tested the Meta-Storms method on them. Our experimental results show that Meta-Storms is capable of database creation and effective searching for a large number of metagenomic samples, and it achieves accuracies similar to the current popular significance testing-based methods.
    The Meta-Storms method would serve as a suitable database management and search system to quickly identify similar metagenomic samples from a large pool of samples. ningkang@qibebt.ac.cn. Supplementary data are available at Bioinformatics online.

  17. Efficient method of image edge detection based on FSVM

    NASA Astrophysics Data System (ADS)

    Cai, Aiping; Xiong, Xiaomei

    2013-07-01

    For efficient object cover edge detection in digital images, this paper studied traditional methods and an algorithm based on SVM. The analysis showed that the Canny edge detection algorithm produces some pseudo-edges and has poor anti-noise capability. To provide a reliable edge extraction method, a new detection algorithm based on FSVM is proposed, which contains several steps: first, the classification samples are trained and different membership functions are assigned to different samples. Then, a new training sample set is formed by increasing the penalty on misclassified sub-samples, and the new FSVM classification model is trained and tested on it. Finally, the edges of the object image are extracted using the model. Experimental results show that a good edge detection image is obtained, and experiments with added noise show that the method has good anti-noise capability.

  18. Principal component analysis-based unsupervised feature extraction applied to in silico drug discovery for posttraumatic stress disorder-mediated heart disease.

    PubMed

    Taguchi, Y-h; Iwadate, Mitsuo; Umeyama, Hideaki

    2015-04-30

    Feature extraction (FE) is difficult, particularly if there are more features than samples, as small sample numbers often result in biased outcomes or overfitting. Furthermore, multiple sample classes often complicate FE because evaluating performance, as is usual in supervised FE, is generally harder than in the two-class problem. Developing sample-classification-independent unsupervised methods would solve many of these problems. Two principal component analysis (PCA)-based FE methods were tested as sample-classification-independent unsupervised FE methods: variational Bayes PCA (VBPCA), which was extended to perform unsupervised FE, and conventional PCA (CPCA)-based unsupervised FE. VBPCA- and CPCA-based unsupervised FE both performed well when applied to simulated data, and to a posttraumatic stress disorder (PTSD)-mediated heart disease data set that had multiple categorical class observations in mRNA/microRNA expression of stressed mouse heart. A critical set of PTSD miRNAs/mRNAs was identified that shows aberrant expression between treatment and control samples, and significant, negative correlation with one another. Moreover, greater stability and biological feasibility than conventional supervised FE were also demonstrated. Based on the results obtained, in silico drug discovery was performed as translational validation of the methods. Our two proposed unsupervised FE methods (CPCA- and VBPCA-based) worked well on simulated data and outperformed two conventional supervised FE methods on a real data set. Thus, these two methods appear equivalent for FE on categorical multiclass data sets, with potential translational utility for in silico drug discovery.
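
    A minimal sketch of the CPCA side of the idea, extracting a leading principal axis from unlabelled data via power iteration on the covariance matrix, is shown below. No class labels enter the computation, which is the sense in which the FE is unsupervised; the authors' actual pipeline, including VBPCA, is considerably richer:

```python
import math
import random

def first_principal_axis(rows, iters=200, seed=0):
    """Leading principal axis of mean-centred data (rows = samples,
    columns = features), found by power iteration on the covariance
    matrix.  No sample labels are used (unsupervised)."""
    n, d = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(d)]
    x = [[r[j] - means[j] for j in range(d)] for r in rows]
    cov = [[sum(x[i][a] * x[i][b] for i in range(n)) / n for b in range(d)]
           for a in range(d)]
    rng = random.Random(seed)
    v = [rng.random() + 0.1 for _ in range(d)]      # random start vector
    for _ in range(iters):
        w = [sum(cov[a][b] * v[b] for b in range(d)) for a in range(d)]
        norm = math.sqrt(sum(c * c for c in w))
        v = [c / norm for c in w]                   # renormalise each step
    return v
```

    Features with large loadings on the leading axes are then the candidates for the "critical set", selected without ever consulting the treatment/control labels.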

  19. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    NASA Astrophysics Data System (ADS)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the improved VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of a high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the already required information on the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach, as well as three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.

  20. Quantitative Evaluation of Hard X-ray Damage to Biological Samples using EUV Ptychography

    NASA Astrophysics Data System (ADS)

    Baksh, Peter; Odstrcil, Michal; Parsons, Aaron; Bailey, Jo; Deinhardt, Katrin; Chad, John E.; Brocklesby, William S.; Frey, Jeremy G.

    2017-06-01

    Coherent diffractive imaging (CDI) has become a standard method on a variety of synchrotron beam lines. The high brilliance short wavelength radiation from these sources can be used to reconstruct attenuation and relative phase of a sample with nanometre resolution via CDI methods. However, the interaction between the sample and high energy ionising radiation can cause degradation to sample structure. We demonstrate, using a laboratory based high harmonic generation (HHG) based extreme ultraviolet (EUV) source, imaging a sample of hippocampal neurons using the ptychography method. The significant increase in contrast of the sample in the EUV light allows identification of damage induced from exposure to 7.3 keV photons, without causing any damage to the sample itself.

  1. Face recognition based on symmetrical virtual image and original training image

    NASA Astrophysics Data System (ADS)

    Ke, Jingcheng; Peng, Yali; Liu, Shigang; Li, Jun; Pei, Zhao

    2018-02-01

    In face representation-based classification methods, a high recognition rate can be obtained if a face has enough available training samples. However, in practical applications, we only have limited training samples to use. In order to obtain enough training samples, many methods simultaneously use the original training samples and corresponding virtual samples to strengthen the ability to represent the test sample. One approach directly uses the original training samples and the corresponding mirror samples to recognize the test sample. However, when the test sample is nearly symmetrical while the original training samples are not, the integration of the original training and mirror samples might not represent the test sample well. To tackle this problem, in this paper we propose a novel method that generates virtual samples by averaging the original training samples and the corresponding mirror samples. Then, the original training samples and the virtual samples are integrated to recognize the test sample. Experimental results on five face databases show that the proposed method is able to partly overcome the challenges of the various poses, facial expressions and illuminations of the original face images.
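
    The virtual-sample construction is straightforward to sketch: mirror each training image left-right and average it pixel-wise with the original, which yields an exactly symmetric virtual sample. Images are represented as plain lists of rows purely for illustration:

```python
def mirror_image(img):
    """Horizontal (left-right) mirror of an image stored as a list of rows."""
    return [row[::-1] for row in img]

def averaged_virtual_sample(img):
    """Virtual sample: pixel-wise average of the image and its mirror.
    The result is exactly left-right symmetric by construction."""
    mirrored = mirror_image(img)
    return [[(a + b) / 2 for a, b in zip(r1, r2)]
            for r1, r2 in zip(img, mirrored)]
```

    Because (img + mirror(img)) / 2 is invariant under mirroring, the virtual sample matches a nearly symmetric test face better than either the raw training image or its mirror alone.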

  2. Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.

    PubMed

    Wang, Zuozhen

    2018-01-01

    The bootstrapping technique is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial based on a relatively small sample. In this paper, sample size estimation for comparing two parallel-design arms with continuous data by a bootstrap procedure is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculation by mathematical formulas (under the normal distribution assumption) for the identical data is also carried out. The power difference between the two calculation methods is acceptably small for all the test types, showing that the bootstrap procedure is a credible technique for sample size estimation. After that, we compared the powers determined using the two methods based on data that violate the normal distribution assumption. To accommodate the feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during the bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation from the beginning, and that the same statistical method as used in the subsequent statistical analysis be employed for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
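
    The bootstrap power-estimation loop can be sketched as follows, assuming pilot (historical) data for each arm. A two-sample z-test at a fixed two-sided 5% level stands in here for whichever test the trial would actually use; as the paper stresses, one should substitute the analysis-stage test (e.g. Wilcoxon for non-normal data):

```python
import math
import random
import statistics

def bootstrap_power(pilot_a, pilot_b, n_per_arm, n_boot=2000, seed=1):
    """Estimated power at a target per-arm sample size: the fraction of
    bootstrap trials (each arm resampled with replacement from its pilot
    data at size n_per_arm) in which a two-sample z-test on the means
    rejects at the two-sided 5% level."""
    rng = random.Random(seed)
    z_crit = 1.96  # two-sided 5% critical value
    rejections = 0
    for _ in range(n_boot):
        a = [rng.choice(pilot_a) for _ in range(n_per_arm)]
        b = [rng.choice(pilot_b) for _ in range(n_per_arm)]
        se = math.sqrt(statistics.pvariance(a) / n_per_arm
                       + statistics.pvariance(b) / n_per_arm)
        if se > 0 and abs(statistics.mean(a) - statistics.mean(b)) / se > z_crit:
            rejections += 1
    return rejections / n_boot
```

    Increasing `n_per_arm` until the estimated power crosses the target (say 0.80) gives the bootstrap sample-size estimate, with no normality assumption on the pilot data.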

  3. Sample handling for mass spectrometric proteomic investigations of human sera.

    PubMed

    West-Nielsen, Mikkel; Høgdall, Estrid V; Marchiori, Elena; Høgdall, Claus K; Schou, Christian; Heegaard, Niels H H

    2005-08-15

    Proteomic investigations of sera are potentially of value for diagnosis, prognosis, choice of therapy, and disease activity assessment by virtue of discovering new biomarkers and biomarker patterns. Much debate focuses on the biological relevance of and the need for identification of such biomarkers, while less effort has been invested in devising standard procedures for sample preparation and storage in relation to model building based on complex sets of mass spectrometric (MS) data. Thus, development of standardized methods for collection and storage of patient samples, together with standards for transportation and handling of samples, is needed. This requires knowledge about how sample processing affects MS-based proteome analyses and thereby how nonbiologically biased classification errors can be avoided. In this study, we characterize the effects of sample handling, including clotting conditions, storage temperature, storage time, and freeze/thaw cycles, on MS-based proteomics of human serum by using principal components analysis, support vector machine learning, and clustering methods based on genetic algorithms as class modeling and prediction methods. Using spiking to artificially create differentiable sample groups, this integrated approach yields data that, even when working with sample groups that differ more than may be expected in biological studies, clearly demonstrate the need for comparable sampling conditions for the samples used for modeling and the samples going into the test set group. The study also emphasizes the difference between class prediction and class comparison studies, as well as the advantages and disadvantages of different modeling methods.

  4. A Compressed Sensing Based Method for Reducing the Sampling Time of A High Resolution Pressure Sensor Array System

    PubMed Central

    Sun, Chenglu; Li, Wei; Chen, Wei

    2017-01-01

    To extract the pressure distribution image and respiratory waveform unobtrusively and comfortably, we proposed a smart mat which utilizes a flexible pressure sensor array, printed electrodes and a novel soft seven-layer structure to monitor this physiological information. However, obtaining a high-resolution pressure distribution and a more accurate respiratory waveform requires more time to acquire the signal from all the pressure sensors embedded in the smart mat. In order to reduce the sampling time while keeping the same resolution and accuracy, a novel method based on compressed sensing (CS) theory was proposed. By utilizing the CS-based method, 40% of the sampling time can be saved while acquiring nearly one-third of the original sampling points. Several experiments were then carried out to validate the performance of the CS-based method. While less than one-third of the original sampling points were measured, the correlation coefficient between the reconstructed respiratory waveform and the original waveform reached 0.9078, and the accuracy of the respiratory rate (RR) extracted from the reconstructed waveform reached 95.54%. The experimental results demonstrate that the novel method fits the high-resolution smart mat system and is a viable option for reducing the sampling time of the pressure sensor array. PMID:28796188

  5. Some connections between importance sampling and enhanced sampling methods in molecular dynamics.

    PubMed

    Lie, H C; Quer, J

    2017-11-21

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
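    A textbook illustration of the importance-sampling idea underlying these enhanced sampling methods (not the Hartmann-Schütte or Valsson-Parrinello scheme itself) is estimating a Gaussian tail probability by drawing from a proposal shifted toward the rare event and reweighting by the likelihood ratio:

```python
import random, math

random.seed(1)

def rare_event_is(n=50_000, a=4.0):
    """Estimate P(X > a) for X ~ N(0,1) by sampling from the shifted proposal N(a,1).

    Likelihood ratio: phi(x) / phi(x - a) = exp(a**2/2 - a*x)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(a, 1.0)               # draw from the biased (proposal) distribution
        if x > a:                              # indicator of the rare event
            total += math.exp(a * a / 2 - a * x)  # reweight back to the target
    return total / n

est = rare_event_is()
exact = 0.5 * math.erfc(4.0 / math.sqrt(2))    # true tail probability, ~3.17e-5
print(est, exact)
```

    With direct sampling, roughly one draw in thirty thousand would hit the event; the shifted proposal makes about half the draws informative, which is the variance reduction the connection to enhanced sampling exploits.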

  6. Some connections between importance sampling and enhanced sampling methods in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Lie, H. C.; Quer, J.

    2017-11-01

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.

  7. Model-based frequency response characterization of a digital-image analysis system for epifluorescence microscopy

    NASA Technical Reports Server (NTRS)

    Hazra, Rajeeb; Viles, Charles L.; Park, Stephen K.; Reichenbach, Stephen E.; Sieracki, Michael E.

    1992-01-01

    Consideration is given to a model-based method for estimating the spatial frequency response of a digital-imaging system (e.g., a CCD camera) that is modeled as a linear, shift-invariant image acquisition subsystem that is cascaded with a linear, shift-variant sampling subsystem. The method characterizes the 2D frequency response of the image acquisition subsystem to beyond the Nyquist frequency by accounting explicitly for insufficient sampling and the sample-scene phase. Results for simulated systems and a real CCD-based epifluorescence microscopy system are presented to demonstrate the accuracy of the method.

  8. A PCR method based on 18S rRNA gene for detection of malaria parasite in Balochistan.

    PubMed

    Shahwani, Zubeda; Aleem, Abdul; Ahmed, Nazeer; Mushtaq, Muhammad; Afridi, Sarwat

    2016-12-01

    To establish a polymerase chain reaction method based on the 18S ribosomal ribonucleic acid gene for the detection of Plasmodium deoxyribonucleic acid in patients with malaria symptoms. This cross-sectional study was conducted from September 2013 to October 2014 in district Quetta of Pakistan's Balochistan province. Blood samples were collected from patients suffering from general symptoms of malaria. A polymerase chain reaction-based technique was applied for the diagnosis of malaria and detection of the responsible species in patients suspected to carry the parasite. Performance of this polymerase chain reaction method was compared against the microscopy results. The parasite number was also calculated for microscopy-positive samples. All samples, after genomic deoxyribonucleic acid isolation, were subjected to polymerase chain reaction amplification and agarose gel electrophoresis. Of the 200 samples, 114(57%) were confirmed as positive and 86(43%) as negative for malaria by microscopy. Polymerase chain reaction identified 124(62%) samples as positive and 76(38%) as negative for malaria. The comparative analysis of both diagnostic methods confirmed 109(54.5%) samples as positive by both techniques. Besides, 5(6.58%) samples were identified as false positive and 15(12.1%) samples as false negative by polymerase chain reaction. Sensitivity, specificity and positive predictive values for polymerase chain reaction in comparison to microscopy were 87.98%, 93.42% and 96%, respectively. Polymerase chain reaction-based methods for malaria diagnosis and species identification were found to be more effective than other techniques.
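    The reported agreement statistics can be reproduced from the abstract's counts if microscopy is scored against polymerase chain reaction as the reference standard (an assumption, since the direction of comparison is ambiguous in the text); the recomputed sensitivity comes out at 87.9%, close to the reported 87.98%:

```python
# Cross-tabulation reconstructed from the abstract: 200 samples total.
pcr_pos, pcr_neg = 124, 76
mic_pos = 114
both_pos = 109                               # positive by both techniques
mic_pos_pcr_neg = mic_pos - both_pos         # 5  (microscopy+, PCR-)
mic_neg_pcr_neg = pcr_neg - mic_pos_pcr_neg  # 71 (negative by both)

sensitivity = both_pos / pcr_pos             # 109/124
specificity = mic_neg_pcr_neg / pcr_neg      # 71/76
ppv = both_pos / mic_pos                     # 109/114
print(f"{sensitivity:.1%} {specificity:.2%} {ppv:.1%}")  # → 87.9% 93.42% 95.6%
```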

  9. Surveillance Among Men Who have Sex with Men in the United States: A Comparison of Web-Based and Venue-Based Samples.

    PubMed

    Chen, Yen-Tyng; Bowles, Kristina; An, Qian; DiNenno, Elizabeth; Finlayson, Teresa; Hoots, Brooke; Paz-Bailey, Gabriela; Wejnert, Cyprian

    2018-07-01

    Although men who have sex with men (MSM) recruited through web-based and venue-based sampling methods have been compared, no large web-based and venue-based samples using similar survey instruments have been examined in the U.S. This study describes the differences in sociodemographic characteristics and risk behaviors between the 2012 Web-based HIV Behavioral Survey (n = 3221) and 2011 National HIV Behavioral Surveillance (n = 9256). Compared with participants in the venue-based sample, participants in the web-based sample were older, less likely to be black or Hispanic, more likely to have higher socioeconomic status, and more likely to have anal sex without a condom with their last male sex partner. Web-based participants were less likely to have multiple male sex partners, ever injected drugs, been tested for HIV in the past 12 months, and received free condoms than venue-based participants. The method for sampling MSM into a behavioral survey should consider the sub-population of MSM to be reached.

  10. A comparative proteomics method for multiple samples based on an 18O-reference strategy and a quantitation and identification-decoupled strategy.

    PubMed

    Wang, Hongbin; Zhang, Yongqian; Gui, Shuqi; Zhang, Yong; Lu, Fuping; Deng, Yulin

    2017-08-15

    Comparisons across large numbers of samples are frequently necessary in quantitative proteomics. Many quantitative methods used in proteomics are based on stable isotope labeling, but most of these are only useful for comparing two samples. For up to eight samples, the iTRAQ labeling technique can be used. For greater numbers of samples, the label-free method has been used, but this method has been criticized for low reproducibility and accuracy. An ingenious strategy has been introduced, comparing each sample against an 18O-labeled reference sample created by pooling equal amounts of all samples. However, it is necessary to use proportion-known protein mixtures to investigate and evaluate this new strategy. Another problem for comparative proteomics of multiple samples is the poor coincidence and reproducibility of protein identification results across samples. In the present study, a method combining the 18O-reference strategy and a quantitation and identification-decoupled strategy was investigated with proportion-known protein mixtures. The results clearly demonstrated that the 18O-reference strategy had greater accuracy and reliability than previously used comparison methods based on transferring comparison or label-free strategies. With the decoupling strategy, the quantification data acquired by LC-MS and the identification data acquired by LC-MS/MS are matched and correlated, according to retention time and accurate mass, to identify differentially expressed proteins. This strategy made protein identification possible for all samples using a single pooled sample, and therefore gave good reproducibility in protein identification across multiple samples, and allowed peptide identification to be optimized separately so as to identify more proteins. Copyright © 2017 Elsevier B.V. All rights reserved.
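    The decoupled matching step, pairing LC-MS quantification features with LC-MS/MS identifications that agree in retention time and accurate mass, can be sketched as follows; the tolerance values, peptide names and feature records below are hypothetical:

```python
def match_features(quant_features, identifications, rt_tol=0.5, ppm_tol=10.0):
    """Pair each quantified LC-MS feature with any LC-MS/MS identification that
    agrees in retention time (min) and accurate mass (within ppm_tol ppm)."""
    pairs = []
    for f in quant_features:
        for ident in identifications:
            ppm = abs(f["mz"] - ident["mz"]) / ident["mz"] * 1e6
            if abs(f["rt"] - ident["rt"]) <= rt_tol and ppm <= ppm_tol:
                pairs.append((f["id"], ident["peptide"]))
    return pairs

# Hypothetical data: two quantified features, two identified peptides.
quant = [{"id": 1, "rt": 22.4, "mz": 652.3310},
         {"id": 2, "rt": 35.1, "mz": 801.4102}]
ids = [{"peptide": "LVNELTEFAK", "rt": 22.6, "mz": 652.3324},
       {"peptide": "HLVDEPQNLIK", "rt": 48.0, "mz": 653.3617}]
print(match_features(quant, ids))  # → [(1, 'LVNELTEFAK')]
```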

  11. Optimization of the solvent-based dissolution method to sample volatile organic compound vapors for compound-specific isotope analysis.

    PubMed

    Bouchard, Daniel; Wanner, Philipp; Luo, Hong; McLoughlin, Patrick W; Henderson, James K; Pirkle, Robert J; Hunkeler, Daniel

    2017-10-20

    The methodology of the solvent-based dissolution method used to sample gas phase volatile organic compounds (VOC) for compound-specific isotope analysis (CSIA) was optimized to lower the method detection limits for TCE and benzene. The sampling methodology previously evaluated by [1] consists of pulling air through a solvent to dissolve and accumulate the gaseous VOC. After the sampling process, the solvent can be treated in the same way as groundwater samples for routine CSIA, by diluting an aliquot of the solvent into water to reach the required concentration of the targeted contaminant. Among the solvents tested, tetraethylene glycol dimethyl ether (TGDE) showed the best aptitude for the method. TGDE has a high affinity for TCE and benzene, efficiently dissolving the compounds during their transit through the solvent. The method detection limit for TCE (5±1 μg/m³) and benzene (1.7±0.5 μg/m³) is lower when using TGDE compared to methanol, which was used previously (385 μg/m³ for TCE and 130 μg/m³ for benzene) [2]. The method detection limit refers to the minimal gas phase concentration in ambient air required to load sufficient VOC mass into TGDE to perform δ13C analysis. Due to a different analytical procedure, the method detection limit associated with δ37Cl analysis was found to be 156±6 μg/m³ for TCE. Furthermore, the experimental results validated the relationship between the gas phase TCE and the progressive accumulation of dissolved TCE in the solvent during the sampling process. Accordingly, based on the air-solvent partitioning coefficient, the sampling methodology (e.g. sampling rate, sampling duration, amount of solvent) and the final TCE concentration in the solvent, the concentration of TCE in the gas phase prevailing during the sampling event can be determined. Moreover, the possibility of analysing the TCE concentration in the solvent after sampling (or that of other targeted VOCs) allows field deployment of the sampling method without the need to determine the initial gas phase TCE concentration. The simplified field deployment approach of the solvent-based dissolution method, combined with the conventional analytical procedure used for groundwater samples, substantially facilitates the application of CSIA to gas phase studies. Copyright © 2017 Elsevier B.V. All rights reserved.
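    Under the simplifying assumption of complete trapping of the VOC in the solvent, the back-calculation of the gas phase concentration reduces to a mass balance (the paper's full treatment also involves the air-solvent partitioning coefficient; all numbers below are hypothetical):

```python
def gas_phase_conc(c_solvent_ug_per_ml, v_solvent_ml, flow_l_per_min, minutes):
    """Back-calculate the mean gas-phase VOC concentration (ug/m3) from the
    VOC mass accumulated in the solvent, assuming complete trapping."""
    mass_ug = c_solvent_ug_per_ml * v_solvent_ml   # total trapped VOC mass
    air_m3 = flow_l_per_min * minutes / 1000.0     # sampled air volume, L -> m3
    return mass_ug / air_m3

# Hypothetical sampling event: 0.02 ug/mL TCE found in 10 mL TGDE after
# pulling air at 0.2 L/min for 120 min:
print(gas_phase_conc(0.02, 10.0, 0.2, 120))  # ≈ 8.33 ug/m3
```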

  12. [Local Regression Algorithm Based on Net Analyte Signal and Its Application in Near Infrared Spectral Analysis].

    PubMed

    Zhang, Hong-guang; Lu, Jian-gang

    2016-02-01

    To overcome the problems of significant differences among samples and nonlinearity between the property and spectra of samples in spectral quantitative analysis, a local regression algorithm is proposed in this paper. In this algorithm, the net analyte signal (NAS) method is first used to obtain the net analyte signal of the calibration samples and of unknown samples; the Euclidean distance between the net analyte signal of an unknown sample and those of the calibration samples is then calculated and used as a similarity index. According to this similarity index, a local calibration set is selected individually for each unknown sample. Finally, a local PLS regression model is built on each local calibration set for each unknown sample. The proposed method was applied to a set of near infrared spectra of meat samples. The results demonstrate that the prediction precision and model complexity of the proposed method are superior to those of the global PLS regression method and a conventional local regression algorithm based on spectral Euclidean distance.
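    A minimal sketch of the sample-wise local calibration idea, with an inverse-distance weighted average standing in for the local PLS model (the NAS computation itself is omitted) and hypothetical net-analyte-signal vectors:

```python
from math import dist  # Euclidean distance, Python 3.8+

def local_predict(cal_signals, cal_values, query, k=3):
    """Select the k calibration samples whose (net analyte) signals are closest
    to the query in Euclidean distance, then predict with an inverse-distance
    weighted average (a simple stand-in for the local PLS model)."""
    ranked = sorted(zip(cal_signals, cal_values), key=lambda sv: dist(sv[0], query))
    nearest = ranked[:k]
    weights = [1.0 / (dist(s, query) + 1e-12) for s, _ in nearest]
    return sum(w * v for w, (_, v) in zip(weights, nearest)) / sum(weights)

# Hypothetical 3-channel net-analyte signals and reference property values:
signals = [(0.10, 0.20, 0.30), (0.12, 0.22, 0.28), (0.90, 0.80, 0.70), (0.11, 0.19, 0.31)]
values = [5.0, 5.2, 12.0, 5.1]
print(round(local_predict(signals, values, (0.11, 0.21, 0.30)), 2))  # → 5.08
```

    The dissimilar third sample (value 12.0) is excluded from the local calibration set, which is the point of selecting calibration samples per unknown.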

  13. Nonuniform fast Fourier transform method for numerical diffraction simulation on tilted planes.

    PubMed

    Xiao, Yu; Tang, Xiahui; Qin, Yingxiong; Peng, Hao; Wang, Wei; Zhong, Lijing

    2016-10-01

    The method, based on the rotation of the angular spectrum in the frequency domain, is generally used for the diffraction simulation between the tilted planes. Due to the rotation of the angular spectrum, the interval between the sampling points in the Fourier domain is not even. For the conventional fast Fourier transform (FFT)-based methods, a spectrum interpolation is needed to get the approximate sampling value on the equidistant sampling points. However, due to the numerical error caused by the spectrum interpolation, the calculation accuracy degrades very quickly as the rotation angle increases. Here, the diffraction propagation between the tilted planes is transformed into a problem about the discrete Fourier transform on the uneven sampling points, which can be evaluated effectively and precisely through the nonuniform fast Fourier transform method (NUFFT). The most important advantage of this method is that the conventional spectrum interpolation is avoided and the high calculation accuracy can be guaranteed for different rotation angles, even when the rotation angle is close to π/2. Also, its calculation efficiency is comparable with that of the conventional FFT-based methods. Numerical examples as well as a discussion about the calculation accuracy and the sampling method are presented.
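    What a type-2 NUFFT computes quickly can be written down directly as a nonuniform discrete Fourier transform; the sketch below evaluates it exactly (but slowly) and confirms that it coincides with the ordinary DFT at integer frequencies, while uneven frequencies are evaluated with no interpolation step:

```python
import cmath

def nudft(x, freqs):
    """Evaluate X(f) = sum_n x[n] * exp(-2*pi*i*f*n/N) at arbitrary (possibly
    non-integer) frequencies f -- the quantity a type-2 NUFFT approximates fast."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * f * n / N) for n in range(N))
            for f in freqs]

x = [1.0, 2.0, 0.0, -1.0]
# At integer frequencies the nonuniform DFT coincides with the ordinary DFT:
X_int = nudft(x, [0, 1, 2, 3])
# At the uneven frequencies produced by rotating an angular spectrum, the sum
# is evaluated directly, avoiding the spectrum interpolation of FFT-based methods:
X_uneven = nudft(x, [0.0, 0.93, 2.11, 3.4])
print([round(abs(v), 4) for v in X_int])  # → [2.0, 3.1623, 0.0, 3.1623]
```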

  14. A ricin forensic profiling approach based on a complex set of biomarkers.

    PubMed

    Fredriksson, Sten-Åke; Wunschel, David S; Lindström, Susanne Wiklund; Nilsson, Calle; Wahl, Karen; Åstot, Crister

    2018-08-15

    A forensic method for the retrospective determination of preparation methods used for illicit ricin toxin production was developed. The method was based on a complex set of biomarkers, including carbohydrates, fatty acids and seed storage proteins, in combination with data on ricin and Ricinus communis agglutinin. The analyses were performed on samples prepared from four castor bean plant (R. communis) cultivars by four different sample preparation methods (PM1-PM4), ranging from simple disintegration of the castor beans to multi-step preparation methods including different protein precipitation steps. Comprehensive analytical data were collected using a range of analytical methods, and robust orthogonal partial least squares discriminant analysis (OPLS-DA) models were constructed from the calibration set. Using a decision tree and two OPLS-DA models, the sample preparation methods of the test set samples were determined. The model statistics of the two models were good and a 100% rate of correct predictions on the test set was achieved. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. Review of Statistical Methods for Analysing Healthcare Resources and Costs

    PubMed Central

    Mihaylova, Borislava; Briggs, Andrew; O'Hagan, Anthony; Thompson, Simon G

    2011-01-01

    We review statistical methods for analysing healthcare resource use and costs, their ability to address skewness, excess zeros, multimodality and heavy right tails, and their ease for general use. We aim to provide guidance on analysing resource use and costs focusing on randomised trials, although methods often have wider applicability. Twelve broad categories of methods were identified: (I) methods based on the normal distribution, (II) methods following transformation of data, (III) single-distribution generalized linear models (GLMs), (IV) parametric models based on skewed distributions outside the GLM family, (V) models based on mixtures of parametric distributions, (VI) two (or multi)-part and Tobit models, (VII) survival methods, (VIII) non-parametric methods, (IX) methods based on truncation or trimming of data, (X) data components models, (XI) methods based on averaging across models, and (XII) Markov chain methods. Based on this review, our recommendations are that, first, simple methods are preferred in large samples where the near-normality of sample means is assured. Second, in somewhat smaller samples, relatively simple methods, able to deal with one or two of above data characteristics, may be preferable but checking sensitivity to assumptions is necessary. Finally, some more complex methods hold promise, but are relatively untried; their implementation requires substantial expertise and they are not currently recommended for wider applied work. Copyright © 2010 John Wiley & Sons, Ltd. PMID:20799344

  16. Model-based inference for small area estimation with sampling weights

    PubMed Central

    Vandendijck, Y.; Faes, C.; Kirby, R.S.; Lawson, A.; Hens, N.

    2017-01-01

    Obtaining reliable estimates about health outcomes for areas or domains where only few to no samples are available is the goal of small area estimation (SAE). Often, we rely on health surveys to obtain information about health outcomes. Such surveys are often characterised by a complex design, stratification, and unequal sampling weights as common features. Hierarchical Bayesian models are well recognised in SAE as a spatial smoothing method, but often ignore the sampling weights that reflect the complex sampling design. In this paper, we focus on data obtained from a health survey where the sampling weights of the sampled individuals are the only information available about the design. We develop a predictive model-based approach to estimate the prevalence of a binary outcome for both the sampled and non-sampled individuals, using hierarchical Bayesian models that take into account the sampling weights. A simulation study is carried out to compare the performance of our proposed method with other established methods. The results indicate that our proposed method achieves great reductions in mean squared error when compared with standard approaches. It performs equally well or better when compared with more elaborate methods when there is a relationship between the responses and the sampling weights. The proposed method is applied to estimate asthma prevalence across districts. PMID:28989860
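    The design-weighted estimate that such models build on can be illustrated with a Horvitz-Thompson-type weighted prevalence; the hierarchical Bayesian machinery is beyond a snippet, and the survey data below are hypothetical:

```python
def weighted_prevalence(outcomes, weights):
    """Design-weighted prevalence of a binary outcome: each respondent counts
    in proportion to the number of population members they represent."""
    return sum(y * w for y, w in zip(outcomes, weights)) / sum(weights)

# Hypothetical survey: 6 respondents with unequal sampling weights.
y = [1, 0, 1, 0, 0, 1]
w = [150, 300, 75, 225, 150, 100]
print(weighted_prevalence(y, w))  # → 0.325, versus the unweighted mean 0.5
```

    Ignoring the weights would estimate a prevalence of 0.5; accounting for them shifts the estimate because the positive respondents happen to represent fewer population members, which is exactly the bias the proposed method guards against.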

  17. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1 to 5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.
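    A distance-based readout is typically converted to concentration through a linear calibration of colour-development distance against standard concentrations; a sketch with closed-form least squares and hypothetical calibration data:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares intercept a and slope b for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical calibration: K+ standard concentration (mM) vs. colour distance (mm).
conc = [1.0, 2.5, 5.0]
dist_mm = [4.0, 10.0, 20.0]
a, b = linear_fit(conc, dist_mm)
unknown = (13.0 - a) / b      # invert the calibration for a 13 mm reading
print(round(unknown, 2))      # → 3.25 mM
```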

  18. Monitoring the chemical production of citrus-derived bioactive 5-demethylnobiletin using surface enhanced Raman spectroscopy

    PubMed Central

    Zheng, Jinkai; Fang, Xiang; Cao, Yong; Xiao, Hang; He, Lili

    2013-01-01

    To develop an accurate and convenient method for monitoring the production of the citrus-derived bioactive 5-demethylnobiletin from the demethylation reaction of nobiletin, we compared surface enhanced Raman spectroscopy (SERS) methods with a conventional HPLC method. Our results show that both the substrate-based and solution-based SERS methods correlated very well with the HPLC method. The solution method produced a lower root mean square error of calibration and a higher correlation coefficient than the substrate method. The solution method utilized an ‘affinity chromatography’-like procedure to separate the reactant nobiletin from the product 5-demethylnobiletin based on their different binding affinities to the silver dendrites. The substrate method was found simpler and faster for collecting the SERS ‘fingerprint’ spectra of the samples, as no incubation between samples and silver was needed and only trace amounts of sample were required. Our results demonstrated that the SERS methods were superior to the HPLC method in conveniently and rapidly characterizing and quantifying 5-demethylnobiletin production. PMID:23885986

  19. Study on Measuring the Viscosity of Lubricating Oil by Viscometer Based on Hele - Shaw Principle

    NASA Astrophysics Data System (ADS)

    Li, Longfei

    2017-12-01

    In order to explore how to accurately measure the viscosity of oil samples using a viscometer based on the Hele-Shaw principle, three different measurement methods were designed in the laboratory and the statistical characteristics of the measured values were compared, so as to identify the best measurement method. The results show that when the oil sample to be measured is placed in the magnetic field formed by the magnet and drawn from the same distance from the magnet, the viscosity value of the sample can be measured accurately.

  20. [Study of spatial stratified sampling strategy of Oncomelania hupensis snail survey based on plant abundance].

    PubMed

    Xun-Ping, W; An, Z

    2017-07-27

    Objective To optimize and simplify the survey method for Oncomelania hupensis snails in marshland endemic regions of schistosomiasis, so as to improve the precision, efficiency and economy of the snail survey. Methods A snail sampling strategy (Spatial Sampling Scenario of Oncomelania based on Plant Abundance, SOPA), which takes plant abundance as an auxiliary variable, was explored in an experimental study in a 50 m×50 m plot in a marshland in the Poyang Lake region. Firstly, the push-broom survey data were stratified into 5 layers by the plant abundance data; then, the required number of optimal sampling points for each layer was calculated through the Hammond-McCullagh equation; thirdly, every sample point was pinpointed in line with the Multiple Directional Interpolation (MDI) placement scheme; and finally, a comparison study among the outcomes of the spatial random sampling strategy, the traditional systematic sampling method, the spatial stratified sampling method, Sandwich spatial sampling and inference, and SOPA was performed. Results The method (SOPA) proposed in this study had the minimal absolute error of 0.2138; the traditional systematic sampling method had the largest estimate, with an absolute error of 0.9244. Conclusion The snail sampling strategy (SOPA) proposed in this study obtains higher estimation accuracy than the other four methods.
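    The abstract does not reproduce the Hammond-McCullagh equation, so the sketch below uses classical Neyman allocation as a plainly labeled stand-in to show how stratum sample sizes can follow from plant-abundance strata; all layer sizes and standard deviations are hypothetical:

```python
def neyman_allocation(stratum_sizes, stratum_sds, n_total):
    """Allocate n_total sampling points across strata in proportion to N_h * S_h
    (Neyman allocation; a generic stand-in for the Hammond-McCullagh formula)."""
    products = [N * s for N, s in zip(stratum_sizes, stratum_sds)]
    total = sum(products)
    return [round(n_total * p / total) for p in products]

# Hypothetical 5 plant-abundance layers of a 50 m x 50 m plot:
N_h = [400, 600, 500, 700, 300]   # candidate sampling points per layer
S_h = [0.8, 1.2, 2.0, 1.5, 0.5]   # assumed snail-density SD per layer
print(neyman_allocation(N_h, S_h, 100))  # → [10, 22, 31, 32, 5]
```

    More variable layers receive proportionally more sampling points, which is how a stratified design gains precision over systematic sampling at the same total effort.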

  1. Method and apparatus for detecting gem-polyhalogenated hydrocarbons

    DOEpatents

    Anderson, deceased, William G.; Anderson, legal representative, Johanna S.

    1990-01-01

    A method and optrode for detecting gem-polyhalogenated hydrocarbons in a sample fluid based on a single-phase Fujiwara reaction are provided. The method comprises contacting a reaction mixture with a sample fluid which contains the gem-polyhalogenated hydrocarbons. The reaction mixture comprises an aqueous solution of pyridine or a derivative thereof and a hindered nitrogen base. Upon contact, a fluorescent and/or chromogenic reaction product forms whose fluorescence and/or absorbance is related to the concentration of gem-polyhalogenated hydrocarbons in the sample fluid.

  2. Comparison of methods for the detection of coliphages in recreational water at two California, United States beaches.

    PubMed

    Rodríguez, Roberto A; Love, David C; Stewart, Jill R; Tajuba, Julianne; Knee, Jacqueline; Dickerson, Jerold W; Webster, Laura F; Sobsey, Mark D

    2012-04-01

    Methods for detection of two fecal indicator viruses, F+ and somatic coliphages, were evaluated for application to recreational marine water. Marine water samples were collected during the summer of 2007 in Southern California, United States from transects along Avalon Beach (n=186 samples) and Doheny Beach (n=101 samples). Coliphage detection methods included EPA method 1601 - two-step enrichment (ENR), EPA method 1602 - single agar layer (SAL), and variations of ENR. Variations included comparison of two incubation times (overnight and 5-h incubation) and two final detection steps (lysis zone assay and a rapid latex agglutination assay). A greater number of samples were positive for somatic and F+ coliphages by ENR than by SAL (p<0.01). The standard ENR with overnight incubation and detection by lysis zone assay was the most sensitive method for the detection of F+ and somatic coliphages from marine water, although the method takes up to three days to obtain results. A rapid 5-h enrichment version of ENR also performed well, with more positive samples than SAL, and could be performed in roughly 24 h. Latex agglutination-based detection methods require the least amount of time to perform, although their sensitivity was less than that of lysis zone-based detection methods. Rapid culture-based enrichment of coliphages in marine water may be possible by further optimizing culture-based methods for saline water conditions to generate higher viral titers than currently achieved, as well as by increasing the sensitivity of latex agglutination detection methods. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Comparability of HbA1c and lipids measured with dried blood spot versus venous samples: a systematic review and meta-analysis

    PubMed Central

    2014-01-01

    Background Levels of haemoglobin A1c (HbA1c) and blood lipids are important determinants of risk in patients with diabetes. Standard analysis methods based upon venous blood samples can be logistically challenging in resource-poor settings where much of the diabetes epidemic is occurring. Dried blood spots (DBS) provide a simple alternative method for sample collection, but the comparability of data from analyses based on DBS is not well established. Methods We conducted a systematic review and meta-analysis to define the association of findings for HbA1c and blood lipids for analyses based upon standard methods compared to DBS. The Cochrane, Embase and Medline databases were searched for relevant reports and summary regression lines were estimated. Results 705 abstracts were found by the initial electronic search, with 6 further reports identified by manual review of the full papers. 16 studies provided data for one or more outcomes of interest. There was close agreement between the results for HbA1c assays based on venous and DBS samples (DBS = 0.9858 × venous + 0.3809), except for assays based upon affinity chromatography. Significant adjustment was required for assays of total cholesterol (DBS = 0.6807 × venous + 1.151), but results for triglycerides (DBS = 0.9557 × venous + 0.1427) were directly comparable. Conclusions For HbA1c and selected blood lipids, assays based on DBS samples are clearly associated with assays based on standard venous samples. There are, however, significant uncertainties about the nature of these associations and there is a need for standardisation of the sample collection, transportation, storage and analysis methods before the technique can be considered mainstream. This should be a research priority because better elucidation of metabolic risks in resource-poor settings, where venous sampling is infeasible, will be key to addressing the global epidemic of cardiovascular diseases. PMID:25045323
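    Applied as simple conversions, the summary regression lines reported in the abstract give the following (illustrative only; the authors caution that substantial uncertainty remains around these associations, and the input values here are hypothetical):

```python
# Summary regression lines from the meta-analysis: DBS value predicted from venous.
def dbs_hba1c(venous):
    """HbA1c (%): DBS = 0.9858 * venous + 0.3809."""
    return 0.9858 * venous + 0.3809

def dbs_total_chol(venous):
    """Total cholesterol (mmol/L): DBS = 0.6807 * venous + 1.151."""
    return 0.6807 * venous + 1.151

def dbs_triglycerides(venous):
    """Triglycerides (mmol/L): DBS = 0.9557 * venous + 0.1427."""
    return 0.9557 * venous + 0.1427

# Hypothetical venous readings of 7.0%, 5.0 mmol/L and 1.5 mmol/L:
print(dbs_hba1c(7.0), dbs_total_chol(5.0), dbs_triglycerides(1.5))  # ≈ 7.28 4.55 1.58
```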

  4. Evaluation of sampling methods for Bacillus spore-contaminated HVAC filters

    PubMed Central

    Calfee, M. Worth; Rose, Laura J.; Tufts, Jenia; Morse, Stephen; Clayton, Matt; Touati, Abderrahmane; Griffin-Gatchalian, Nicole; Slone, Christina; McSweeney, Neal

    2016-01-01

    The objective of this study was to compare an extraction-based sampling method to two vacuum-based sampling methods (vacuum sock and 37 mm cassette filter) with regards to their ability to recover Bacillus atrophaeus spores (surrogate for Bacillus anthracis) from pleated heating, ventilation, and air conditioning (HVAC) filters that are typically found in commercial and residential buildings. Electrostatic and mechanical HVAC filters were tested, both without and after loading with dust to 50% of their total holding capacity. The results were analyzed by one-way ANOVA across material types, presence or absence of dust, and sampling device. The extraction method gave higher relative recoveries than the two vacuum methods evaluated (p ≤ 0.001). On average, recoveries obtained by the vacuum methods were about 30% of those achieved by the extraction method. Relative recoveries between the two vacuum methods were not significantly different (p > 0.05). Although extraction methods yielded higher recoveries than vacuum methods, either HVAC filter sampling approach may provide a rapid and inexpensive mechanism for understanding the extent of contamination following a wide-area biological release incident. PMID:24184312

  5. Evaluation of sampling methods for Bacillus spore-contaminated HVAC filters.

    PubMed

    Calfee, M Worth; Rose, Laura J; Tufts, Jenia; Morse, Stephen; Clayton, Matt; Touati, Abderrahmane; Griffin-Gatchalian, Nicole; Slone, Christina; McSweeney, Neal

    2014-01-01

    The objective of this study was to compare an extraction-based sampling method to two vacuum-based sampling methods (vacuum sock and 37 mm cassette filter) with regards to their ability to recover Bacillus atrophaeus spores (surrogate for Bacillus anthracis) from pleated heating, ventilation, and air conditioning (HVAC) filters that are typically found in commercial and residential buildings. Electrostatic and mechanical HVAC filters were tested, both without and after loading with dust to 50% of their total holding capacity. The results were analyzed by one-way ANOVA across material types, presence or absence of dust, and sampling device. The extraction method gave higher relative recoveries than the two vacuum methods evaluated (p ≤ 0.001). On average, recoveries obtained by the vacuum methods were about 30% of those achieved by the extraction method. Relative recoveries between the two vacuum methods were not significantly different (p > 0.05). Although extraction methods yielded higher recoveries than vacuum methods, either HVAC filter sampling approach may provide a rapid and inexpensive mechanism for understanding the extent of contamination following a wide-area biological release incident. Published by Elsevier B.V.

  6. Integral-equation based methods for parameter estimation in output pulses of radiation detectors: Application in nuclear medicine and spectroscopy

    NASA Astrophysics Data System (ADS)

    Mohammadian-Behbahani, Mohammad-Reza; Saramad, Shahyar

    2018-04-01

    Model-based analysis methods are relatively new approaches for processing the output data of radiation detectors in nuclear medicine imaging and spectroscopy. A class of such methods requires fast algorithms for fitting pulse models to experimental data. In order to apply integral-equation based methods to processing the preamplifier output pulses, this article proposes a fast and simple method for estimating the parameters of the well-known bi-exponential pulse model by solving an integral equation. The proposed method needs samples from only three points of the recorded pulse, as well as its first- and second-order integrals. After optimizing the sampling points, the estimation results were calculated and compared with two traditional integration-based methods. Different noise levels (signal-to-noise ratios from 10 to 3000) were simulated to test the functionality of the proposed method, which was then applied to a set of experimental pulses. Finally, the effect of quantization noise was assessed by studying different sampling rates. The promising results endorse the proposed method for future real-time applications.
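    An illustrative check of the bi-exponential pulse model that such estimators assume: the pulse's total integral has the closed form A·(tau1 − tau2), the kind of integral relation an integral-equation method can exploit; the parameter values below are arbitrary, and this is not the paper's estimator itself:

```python
import math

A, tau1, tau2 = 1.0, 50.0, 5.0   # amplitude, decay and rise time constants (in samples)

def pulse(t):
    """Bi-exponential preamplifier pulse: slow decay minus fast rise."""
    return A * (math.exp(-t / tau1) - math.exp(-t / tau2))

# First-order integral by the trapezoidal rule on a fine grid over a long window:
h, T = 0.01, 1000.0
n = int(T / h)
integral = h * (0.5 * pulse(0.0) + sum(pulse(k * h) for k in range(1, n)) + 0.5 * pulse(T))

analytic = A * (tau1 - tau2)     # closed-form integral of the bi-exponential model
print(integral, analytic)        # both ≈ 45.0
```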

  7. Method of evaluation of process of red blood cell sedimentation based on photometry of droplet samples.

    PubMed

    Aristov, Alexander; Nosova, Ekaterina

    2017-04-01

    The paper focuses on research aimed at creating and testing a new approach to evaluating the aggregation and sedimentation of red blood cells for use in clinical laboratory diagnostics. The proposed method is based on photometric analysis of a blood sample formed as a sessile drop. The results of clinical testing of this method are given in the paper. Analysis of the processes occurring in the sessile-drop sample during blood cell sedimentation is described. The results of experimental studies evaluating the effect of the droplet sample's focusing properties on light transmittance are presented. It is shown that this method significantly reduces the required sample volume while providing high sensitivity to the studied processes.

  8. Comparison of gel column, card, and cartridge techniques for dog erythrocyte antigen 1.1 blood typing

    PubMed Central

    Seth, Mayank; Jackson, Karen V.; Winzelberg, Sarah; Giger, Urs

    2012-01-01

    Objective: To compare accuracy and ease of use of a card agglutination assay, an immunochromatographic cartridge method, and a gel-based method for canine blood typing. Sample: Blood samples from 52 healthy blood donor dogs, 10 dogs with immune-mediated hemolytic anemia (IMHA), and 29 dogs with other diseases. Procedures: Blood samples were tested in accordance with manufacturer guidelines. Samples with low PCVs were created by the addition of autologous plasma to separately assess the effects of anemia on test results. Results: Compared with a composite reference standard of agreement between 2 methods, the gel-based method was found to be 100% accurate. The card agglutination assay was 89% to 91% accurate, depending on test interpretation, and the immunochromatographic cartridge method was 93% accurate but 100% specific. Errors were observed more frequently in samples from diseased dogs, particularly those with IMHA. In the presence of persistent autoagglutination, dog erythrocyte antigen (DEA) 1.1 typing was not possible, except with the immunochromatographic cartridge method. Conclusions and Clinical Relevance: The card agglutination assay and immunochromatographic cartridge method, performed by trained personnel, were suitable for in-clinic emergency DEA 1.1 blood typing. There may be errors, particularly for samples from dogs with IMHA, and the immunochromatographic cartridge method may have an advantage of allowing typing of samples with persistent autoagglutination. The laboratory gel-based method would be preferred for routine DEA 1.1 typing of donors and patients if it is available and time permits. Current DEA 1.1 typing techniques appear to be appropriately standardized and easy to use. PMID:22280380
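    The accuracy/specificity figures of this kind come straight from a 2x2 comparison against the composite reference standard. The sketch below uses hypothetical counts chosen to mirror a method that is about 93% accurate but 100% specific; they are not the study's actual data:

```python
def diagnostic_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, and specificity from a 2x2 comparison table
    against a reference standard (counts here are illustrative only)."""
    total = tp + fp + tn + fn
    return {"accuracy": (tp + tn) / total,
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp)}

# Hypothetical counts: 6 false negatives, no false positives.
m = diagnostic_metrics(tp=40, fp=0, tn=45, fn=6)
```

    With zero false positives the specificity is exactly 1.0 even though several true positives are missed, which is how a method can be 100% specific while only ~93% accurate.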

  9. Potential artifacts associated with historical preparation of joint compound samples and reported airborne asbestos concentrations.

    PubMed

    Brorby, G P; Sheehan, P J; Berman, D W; Bogen, K T; Holm, S E

    2011-05-01

    Airborne samples collected in the 1970s for drywall workers using asbestos-containing joint compounds were likely prepared and analyzed according to National Institute of Occupational Safety and Health Method P&CAM 239, the historical precursor to current Method 7400. Experimentation with a re-created, chrysotile-containing, carbonate-based joint compound suggested that analysis following sample preparation by the historical vs. current method produces different fiber counts, likely because of an interaction between the different clearing and mounting chemicals used and the carbonate-based joint compound matrix. Differences were also observed during analysis using Method 7402, depending on whether acetic acid/dimethylformamide or acetone was used during preparation to collapse the filter. Specifically, air samples of sanded chrysotile-containing joint compound prepared by the historical method yielded fiber counts significantly greater (average of 1.7-fold, 95% confidence interval: 1.5- to 2.0-fold) than those obtained by the current method. In addition, air samples prepared by Method 7402 using acetic acid/dimethylformamide yielded fiber counts that were greater (2.8-fold, 95% confidence interval: 2.5- to 3.2-fold) than those prepared by this method using acetone. These results indicated (1) there is an interaction between Method P&CAM 239 preparation chemicals and the carbonate-based joint compound matrix that reveals fibers that were previously bound in the matrix, and (2) the same appeared to be true for Method 7402 preparation chemicals acetic acid/dimethylformamide. This difference in fiber counts is the opposite of what has been reported historically for samples of relatively pure chrysotile dusts prepared using the same chemicals. This preparation artifact should be considered when interpreting historical air samples for drywall workers prepared by Method P&CAM 239. Copyright © 2011 JOEH, LLC

  10. A Multilevel, Hierarchical Sampling Technique for Spatially Correlated Random Fields

    DOE PAGES

    Osborn, Sarah; Vassilevski, Panayot S.; Villa, Umberto

    2017-10-26

    In this paper, we propose an alternative method to generate samples of a spatially correlated random field with applications to large-scale problems for forward propagation of uncertainty. A classical approach for generating these samples is the Karhunen-Loève (KL) decomposition. However, the KL expansion requires solving a dense eigenvalue problem and is therefore computationally infeasible for large-scale problems. Sampling methods based on stochastic partial differential equations provide a highly scalable way to sample Gaussian fields, but the resulting parametrization is mesh dependent. We propose a multilevel decomposition of the stochastic field to allow for scalable, hierarchical sampling based on solving a mixed finite element formulation of a stochastic reaction-diffusion equation with a random, white noise source function. Lastly, numerical experiments are presented to demonstrate the scalability of the sampling method as well as numerical results of multilevel Monte Carlo simulations for a subsurface porous media flow application using the proposed sampling method.
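    To see why the dense KL eigenproblem does not scale, here is a minimal numpy sketch of classical KL-based sampling on a small 1-D mesh with an exponential covariance (my illustrative choice, not the paper's subsurface model); the O(n³) `eigh` call is the step the proposed SPDE-based sampler avoids:

```python
import numpy as np

def kl_sample(n_points, corr_len, n_samples, rng):
    """Draw Gaussian random-field samples on a 1-D mesh via a dense
    Karhunen-Loeve expansion. The O(n^3) eigendecomposition is the cost
    that makes this classical route infeasible on large meshes."""
    x = np.linspace(0.0, 1.0, n_points)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)  # exponential covariance
    w, V = np.linalg.eigh(C)                                  # dense eigenproblem
    w = np.clip(w, 0.0, None)                                 # guard tiny negative eigenvalues
    xi = rng.standard_normal((n_samples, n_points))           # i.i.d. KL coefficients
    return (xi * np.sqrt(w)) @ V.T                            # field = V diag(sqrt(w)) xi

rng = np.random.default_rng(1)
fields = kl_sample(n_points=200, corr_len=0.1, n_samples=5000, rng=rng)
```

    Each returned row has covariance C by construction; doubling the mesh size roughly multiplies the eigendecomposition cost by eight, which is the scaling bottleneck the multilevel SPDE approach is designed to remove.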

  12. Alpha Matting with KL-Divergence Based Sparse Sampling.

    PubMed

    Karacan, Levent; Erdem, Aykut; Erdem, Erkut

    2017-06-22

    In this paper, we present a new sampling-based alpha matting approach for the accurate estimation of the foreground and background layers of an image. Previous sampling-based methods typically rely on certain heuristics in collecting representative samples from known regions, and thus their performance deteriorates if the underlying assumptions are not satisfied. To alleviate this, we take an entirely new approach and formulate sampling as a sparse subset selection problem where we propose to pick a small set of candidate samples that best explains the unknown pixels. Moreover, we describe a new dissimilarity measure for comparing two samples which is based on the KL-divergence between the distributions of features extracted in the vicinity of the samples. The proposed framework is general and could be easily extended to video matting by additionally taking temporal information into account in the sampling process. Evaluation on standard benchmark datasets for image and video matting demonstrates that our approach provides more accurate results compared to the state-of-the-art methods.
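    A dissimilarity of this kind can be sketched with diagonal Gaussians fitted to the features around each sample; this is one hedged choice of distribution model for illustration, not the paper's exact feature pipeline:

```python
import numpy as np

def gaussian_kl(mu1, var1, mu2, var2):
    """KL(N(mu1, var1) || N(mu2, var2)) for diagonal Gaussians."""
    var1, var2 = np.asarray(var1, float), np.asarray(var2, float)
    return 0.5 * np.sum(np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

def sample_dissimilarity(patch_a, patch_b):
    """Symmetrized KL between feature distributions of two pixel neighborhoods
    (rows = pixels, columns = feature dimensions)."""
    m1, v1 = patch_a.mean(axis=0), patch_a.var(axis=0) + 1e-8
    m2, v2 = patch_b.mean(axis=0), patch_b.var(axis=0) + 1e-8
    return gaussian_kl(m1, v1, m2, v2) + gaussian_kl(m2, v2, m1, v1)

# Toy RGB-like features around three samples: a and c are similar, b differs.
rng = np.random.default_rng(2)
a = rng.normal(0.20, 0.05, (50, 3))
b = rng.normal(0.80, 0.05, (50, 3))
c = rng.normal(0.21, 0.05, (50, 3))
```

    Comparing whole local distributions rather than single pixel values is what makes the measure robust to noise in either sample.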

  13. SLEPR: A Sample-Level Enrichment-Based Pathway Ranking Method — Seeking Biological Themes through Pathway-Level Consistency

    PubMed Central

    Yi, Ming; Stephens, Robert M.

    2008-01-01

    Analysis of microarray and other high-throughput data often involves identification of genes consistently up- or down-regulated across samples as the first step in extraction of biological meaning. This gene-level paradigm can be limited as a result of valid sample fluctuations and biological complexities. In this report, we describe a novel method, SLEPR, which eliminates this limitation by relying on pathway-level consistencies. Our method first selects the sample-level differentiated genes from each individual sample, capturing genes missed by other analysis methods, ascertains the enrichment levels of associated pathways from each of those lists, and then ranks annotated pathways based on the consistency of enrichment levels of individual samples from both sample classes. As a proof of concept, we have used this method to analyze three public microarray datasets with a direct comparison with the GSEA method, one of the most popular pathway-level analysis methods in the field. We found that our method was able to reproduce the earlier observations with significant improvements in depth of coverage for validated or expected biological themes, but also produced additional insights that make biological sense. This new method extends existing analyses approaches and facilitates integration of different types of HTP data. PMID:18818771

  14. General constraints on sampling wildlife on FIA plots

    USGS Publications Warehouse

    Bailey, L.L.; Sauer, J.R.; Nichols, J.D.; Geissler, P.H.; McRoberts, Ronald E.; Reams, Gregory A.; Van Deusen, Paul C.; McWilliams, William H.; Cieszewski, Chris J.

    2005-01-01

    This paper reviews the constraints on sampling wildlife populations at FIA points. Wildlife sampling programs must have well-defined goals and provide information adequate to meet those goals. Investigators should choose a state variable based on information needs and the spatial sampling scale. We discuss estimation-based methods for three state variables: species richness, abundance, and patch occupancy. All methods incorporate two essential sources of variation: detectability and spatial variation. FIA sampling imposes specific space and time criteria that may need to be adjusted to meet local wildlife objectives.

  15. An assessment of the liquid-gas partitioning behavior of major wastewater odorants using two comparative experimental approaches: liquid sample-based vaporization vs. impinger-based dynamic headspace extraction into sorbent tubes.

    PubMed

    Iqbal, Mohammad Asif; Kim, Ki-Hyun; Szulejko, Jan E; Cho, Jinwoo

    2014-01-01

    The gas-liquid partitioning behavior of major odorants (acetic acid, propionic acid, isobutyric acid, n-butyric acid, i-valeric acid, n-valeric acid, hexanoic acid, phenol, p-cresol, indole, skatole, and toluene (as a reference)) commonly found in microbially digested wastewaters was investigated by two experimental approaches. First, a simple vaporization method was applied to measure the target odorants dissolved in liquid samples with the aid of sorbent tube/thermal desorption/gas chromatography/mass spectrometry. As an alternative, an impinger-based dynamic headspace sampling method was also explored to measure the partitioning of target odorants between the gas and liquid phases with the same detection system. The relative extraction efficiency (in percent) of the odorants by dynamic headspace sampling was estimated against the calibration results derived by the vaporization method. Finally, the concentrations of the major odorants in real digested wastewater samples were analyzed using both analytical approaches. Through parallel application of the two experimental methods, we intended to develop an experimental approach to assess the liquid-to-gas phase partitioning behavior of major odorants in a complex wastewater system. The relative sensitivity of the two methods, expressed in terms of response factor ratios (RFvap/RFimp) of liquid standard calibration between vaporization and impinger-based calibrations, varied widely from 981 (skatole) to 6,022 (acetic acid). This comparison of relative sensitivity highlights the rather low extraction efficiency of the highly soluble and more acidic odorants from wastewater samples in dynamic headspace sampling.

  16. Generalized sample entropy analysis for traffic signals based on similarity measure

    NASA Astrophysics Data System (ADS)

    Shang, Du; Xu, Mengjia; Shang, Pengjian

    2017-05-01

    Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on similarity distance, matches signal patterns in a different way and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study showing the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation on the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can quantitatively distinguish time series from different complex systems by the similarity measure.
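    The baseline definition the paper generalizes is straightforward to implement; here is a standard Chebyshev-distance sample entropy in numpy (the paper's contribution is to replace this matching rule with a similarity measure, which is not reproduced here):

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Standard sample entropy SampEn(m, r) of a 1-D series with
    Chebyshev-distance template matching."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                     # common tolerance choice
    def match_count(mm):
        # Use the same number of templates for lengths m and m+1 so the
        # two counts are directly comparable.
        templ = np.array([x[i:i + mm] for i in range(len(x) - m)])
        d = np.max(np.abs(templ[:, None] - templ[None]), axis=2)  # Chebyshev
        return (np.sum(d <= r) - len(templ)) / 2                  # drop self-matches
    B, A = match_count(m), match_count(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(3)
noise = rng.standard_normal(400)                         # irregular series
regular = np.sin(np.linspace(0.0, 40.0 * np.pi, 400))    # periodic series
```

    A periodic signal has many template matches that persist from length m to m+1, so its sample entropy is low; white noise loses matches as templates lengthen and scores high.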

  17. Measurement of Crystalline Silica Aerosol Using Quantum Cascade Laser-Based Infrared Spectroscopy.

    PubMed

    Wei, Shijun; Kulkarni, Pramod; Ashley, Kevin; Zheng, Lina

    2017-10-24

    Inhalation exposure to airborne respirable crystalline silica (RCS) poses major health risks in many industrial environments. There is a need for new sensitive instruments and methods for in-field or near real-time measurement of crystalline silica aerosol. The objective of this study was to develop an approach, using quantum cascade laser (QCL)-based infrared spectroscopy (IR), to quantify airborne concentrations of RCS. Three sampling methods were investigated for their potential for effective coupling with QCL-based transmittance measurements: (i) conventional aerosol filter collection, (ii) focused spot sample collection directly from the aerosol phase, and (iii) dried spot obtained from deposition of liquid suspensions. Spectral analysis methods were developed to obtain IR spectra from the collected particulate samples in the range 750-1030 cm⁻¹. The new instrument was calibrated and the results were compared with standardized methods based on Fourier transform infrared (FTIR) spectrometry. Results show that significantly lower detection limits for RCS (≈330 ng), compared to conventional infrared methods, could be achieved with effective microconcentration and careful coupling of the particulate sample with the QCL beam. These results offer promise for further development of sensitive filter-based laboratory methods and portable sensors for near real-time measurement of crystalline silica aerosol.

  18. A new framework of statistical inferences based on the valid joint sampling distribution of the observed counts in an incomplete contingency table.

    PubMed

    Tian, Guo-Liang; Li, Hui-Qiong

    2017-08-01

    Some existing confidence interval methods and hypothesis testing methods for the analysis of a contingency table with incomplete observations in both margins depend entirely on an underlying assumption that the sampling distribution of the observed counts is a product of independent multinomial/binomial distributions for complete and incomplete counts. However, it can be shown that this independence assumption is incorrect and can lead to unreliable conclusions because of the resulting underestimation of uncertainty. Therefore, the first objective of this paper is to derive the valid joint sampling distribution of the observed counts in a contingency table with incomplete observations in both margins. The second objective is to provide a new framework for analyzing incomplete contingency tables based on the derived joint sampling distribution of the observed counts by developing a Fisher scoring algorithm to calculate maximum likelihood estimates of parameters of interest, together with bootstrap confidence interval and bootstrap hypothesis testing methods. We compare the differences between the valid sampling distribution and the sampling distribution under the independence assumption. Simulation studies showed that average/expected confidence-interval widths of parameters based on the sampling distribution under the independence assumption are shorter than those based on the new sampling distribution, yielding unrealistic results. A real data set is analyzed to illustrate the application of the new sampling distribution for incomplete contingency tables, and the analysis results again confirm the conclusions obtained from the simulation studies.
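    The percentile bootstrap interval of the kind described can be sketched in a few lines; the data below are illustrative counts, not the paper's contingency-table estimator:

```python
import numpy as np

def percentile_bootstrap_ci(data, estimator, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for an arbitrary statistic."""
    rng = np.random.default_rng(seed)
    n = len(data)
    stats = np.array([estimator(data[rng.integers(0, n, n)])  # resample w/ replacement
                      for _ in range(n_boot)])
    lo, hi = np.quantile(stats, [alpha / 2.0, 1.0 - alpha / 2.0])
    return lo, hi

rng = np.random.default_rng(4)
counts = rng.poisson(5.0, size=120)        # toy observed counts
lo, hi = percentile_bootstrap_ci(counts, np.mean)
```

    The paper's point is that the resampling model matters: bootstrapping under the (incorrect) independence assumption produces systematically narrower intervals than resampling from the valid joint distribution.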

  19. A computer program to obtain time-correlated gust loads for nonlinear aircraft using the matched-filter-based method

    NASA Technical Reports Server (NTRS)

    Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III

    1994-01-01

    NASA Langley Research Center has, for several years, conducted research in the area of time-correlated gust loads for linear and nonlinear aircraft. The results of this work led NASA to recommend that the Matched-Filter-Based One-Dimensional Search Method be used for gust load analyses of nonlinear aircraft. This manual describes this method, describes a FORTRAN code which performs this method, and presents example calculations for a sample nonlinear aircraft model. The name of the code is MFD1DS (Matched-Filter-Based One-Dimensional Search). The program source code, the example aircraft equations of motion, a sample input file, and a sample program output are all listed in the appendices.

  20. Comparison of methods for the extraction of DNA from formalin-fixed, paraffin-embedded archival tissues.

    PubMed

    Sengüven, Burcu; Baris, Emre; Oygur, Tulin; Berktas, Mehmet

    2014-01-01

    We describe a protocol involving xylene-ethanol deparaffinization on slides followed by a kit-based extraction that allows high-quality DNA to be extracted from FFPE tissues. DNA was extracted from the FFPE tissues of 16 randomly selected blocks. Methods involving deparaffinization on slides or in tubes, enzyme digestion overnight or for 72 hours, and isolation using the phenol-chloroform method or a silica-based commercial kit were compared in terms of yield, concentration, and amplifiability. The highest yield of DNA was produced from the samples that were deparaffinized on slides, digested for 72 hours, and isolated with a commercial kit. Samples isolated with the phenol-chloroform method produced DNA of lower purity than the samples purified with the kit. The samples isolated with the commercial kit resulted in better PCR amplification. Silica-based commercial kits and deparaffinization on slides should be considered for DNA extraction from FFPE tissues.

  1. Assessment of Sample Preparation Bias in Mass Spectrometry-Based Proteomics.

    PubMed

    Klont, Frank; Bras, Linda; Wolters, Justina C; Ongay, Sara; Bischoff, Rainer; Halmos, Gyorgy B; Horvatovich, Péter

    2018-04-17

    For mass spectrometry-based proteomics, the selected sample preparation strategy is a key determinant for information that will be obtained. However, the corresponding selection is often not based on a fit-for-purpose evaluation. Here we report a comparison of in-gel (IGD), in-solution (ISD), on-filter (OFD), and on-pellet digestion (OPD) workflows on the basis of targeted (QconCAT-multiple reaction monitoring (MRM) method for mitochondrial proteins) and discovery proteomics (data-dependent acquisition, DDA) analyses using three different human head and neck tissues (i.e., nasal polyps, parotid gland, and palatine tonsils). Our study reveals differences between the sample preparation methods, for example, with respect to protein and peptide losses, quantification variability, protocol-induced methionine oxidation, and asparagine/glutamine deamidation as well as identification of cysteine-containing peptides. However, none of the methods performed best for all types of tissues, which argues against the existence of a universal sample preparation method for proteome analysis.

  2. Development of gas chromatographic methods for the analyses of organic carbonate-based electrolytes

    NASA Astrophysics Data System (ADS)

    Terborg, Lydia; Weber, Sascha; Passerini, Stefano; Winter, Martin; Karst, Uwe; Nowak, Sascha

    2014-01-01

    In this work, novel methods based on gas chromatography (GC) for the investigation of the common organic carbonate-based electrolyte systems used in lithium ion batteries are presented. The methods were developed for flame ionization detection (FID) and mass spectrometric detection (MS). Further, headspace (HS) sampling for the investigation of solid samples such as electrodes is reported. Limits of detection are reported for FID. Finally, the developed methods were applied to the electrolyte systems of commercially available lithium ion batteries as well as to in-house assembled cells.

  3. Comparison of Methods for the Extraction of DNA from Formalin-Fixed, Paraffin-Embedded Archival Tissues

    PubMed Central

    Sengüven, Burcu; Baris, Emre; Oygur, Tulin; Berktas, Mehmet

    2014-01-01

    Aim: To describe a protocol involving xylene-ethanol deparaffinization on slides followed by a kit-based extraction that allows high-quality DNA to be extracted from FFPE tissues. Methods: DNA was extracted from the FFPE tissues of 16 randomly selected blocks. Methods involving deparaffinization on slides or in tubes, enzyme digestion overnight or for 72 hours, and isolation using the phenol-chloroform method or a silica-based commercial kit were compared in terms of yield, concentration, and amplifiability. Results: The highest yield of DNA was produced from the samples that were deparaffinized on slides, digested for 72 hours, and isolated with a commercial kit. Samples isolated with the phenol-chloroform method produced DNA of lower purity than the samples purified with the kit. The samples isolated with the commercial kit resulted in better PCR amplification. Conclusion: Silica-based commercial kits and deparaffinization on slides should be considered for DNA extraction from FFPE tissues. PMID:24688314

  4. Target discrimination method for SAR images based on semisupervised co-training

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Du, Lan; Dai, Hui

    2018-01-01

    Synthetic aperture radar (SAR) target discrimination is usually performed in a supervised manner. However, supervised methods for SAR target discrimination may need lots of labeled training samples, whose acquirement is costly, time consuming, and sometimes impossible. This paper proposes an SAR target discrimination method based on semisupervised co-training, which utilizes a limited number of labeled samples and an abundant number of unlabeled samples. First, Lincoln features, widely used in SAR target discrimination, are extracted from the training samples and partitioned into two sets according to their physical meanings. Second, two support vector machine classifiers are iteratively co-trained with the extracted two feature sets based on the co-training algorithm. Finally, the trained classifiers are exploited to classify the test data. The experimental results on real SAR images data not only validate the effectiveness of the proposed method compared with the traditional supervised methods, but also demonstrate the superiority of co-training over self-training, which only uses one feature set.
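    The co-training loop described above can be sketched as follows; nearest-centroid classifiers stand in for the paper's SVMs, and the two feature views are synthetic stand-ins for the two Lincoln feature sets:

```python
import numpy as np

class NearestCentroid:
    """Minimal stand-in for the SVM classifiers used in the paper."""
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.centroids = np.array([X[y == c].mean(axis=0) for c in self.classes])
        return self
    def _dists(self, X):
        return np.linalg.norm(X[:, None] - self.centroids[None], axis=2)
    def predict(self, X):
        return self.classes[self._dists(X).argmin(axis=1)]
    def confidence(self, X):
        d = np.sort(self._dists(X), axis=1)
        return d[:, 1] - d[:, 0]          # margin between two nearest centroids

def co_train(Xa, Xb, y, labeled, n_rounds=5, k=4):
    """Each view's classifier pseudo-labels the k unlabeled samples it is most
    confident about; those labels then also train the other view."""
    y, labeled = y.copy(), set(labeled)
    for _ in range(n_rounds):
        for X_view in (Xa, Xb):
            unlabeled = np.array(sorted(set(range(len(y))) - labeled))
            if unlabeled.size == 0:
                break
            idx = np.array(sorted(labeled))
            clf = NearestCentroid().fit(X_view[idx], y[idx])
            pick = unlabeled[np.argsort(clf.confidence(X_view[unlabeled]))[-k:]]
            y[pick] = clf.predict(X_view[pick])
            labeled.update(pick.tolist())
    idx = np.array(sorted(labeled))
    return (NearestCentroid().fit(Xa[idx], y[idx]),
            NearestCentroid().fit(Xb[idx], y[idx]))

# Toy two-view, two-class data; only 4 of 100 samples start out labeled.
rng = np.random.default_rng(5)
truth = np.repeat([0, 1], 50)
Xa = rng.normal(truth[:, None] * 3.0, 1.0, (100, 2))   # view 1
Xb = rng.normal(truth[:, None] * 3.0, 1.0, (100, 2))   # view 2
y = np.where(np.isin(np.arange(100), [0, 1, 50, 51]), truth, -1)
clf_a, clf_b = co_train(Xa, Xb, y, labeled=[0, 1, 50, 51])
acc = np.mean(clf_a.predict(Xa) == truth)
```

    The essential co-training property shown here is that confident pseudo-labels from one view propagate through the shared label vector to the other view, which is what distinguishes it from self-training on a single feature set.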

  5. Toward cost-efficient sampling methods

    NASA Astrophysics Data System (ADS)

    Luo, Peng; Li, Yongli; Wu, Chong; Zhang, Guijie

    2015-09-01

    The sampling method has received much attention in the field of complex networks in general and statistical physics in particular. This paper proposes two new sampling methods based on the idea that a small number of high-degree vertices can carry most of the structural information of a complex network. The two proposed sampling methods are efficient at sampling high-degree nodes, so they remain useful even when the sampling rate is low, which makes them cost-efficient. The first new sampling method builds on the widely used stratified random sampling (SRS) method, and the second improves the well-known snowball sampling (SBS) method. In order to demonstrate the validity and accuracy of the two new sampling methods, we compare them with existing sampling methods on three commonly used simulated networks, namely a scale-free network, a random network, and a small-world network, as well as on two real networks. The experimental results illustrate that the two proposed sampling methods perform much better than the existing sampling methods in recovering the true network structure characteristics reflected by the clustering coefficient, Bonacich centrality, and average path length, especially when the sampling rate is low.
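    The "favor high-degree nodes" idea behind both proposed methods can be illustrated with a small sketch: a snowball expansion that always visits the highest-degree unseen neighbors first (a hypothetical illustration on a toy graph, not the paper's exact algorithms):

```python
from collections import defaultdict

def degree_biased_snowball(adj, seed_node, n_target):
    """Snowball sampling that expands through highest-degree neighbors first,
    so hubs enter the sample early even at low sampling rates."""
    sampled, frontier = {seed_node}, [seed_node]
    while frontier and len(sampled) < n_target:
        node = frontier.pop(0)
        for v in sorted(adj[node] - sampled, key=lambda u: len(adj[u]), reverse=True):
            if len(sampled) >= n_target:
                break
            sampled.add(v)
            frontier.append(v)
    return sampled

# Toy hub-and-spoke graph: node 0 is the main hub, node 1 a secondary hub.
adj = defaultdict(set)
for v in range(1, 30):
    adj[0].add(v); adj[v].add(0)
for v in (2, 3, 4):
    adj[1].add(v); adj[v].add(1)

sample = degree_biased_snowball(adj, seed_node=5, n_target=6)
```

    Starting from a peripheral spoke, the sample reaches the main hub in one hop and then absorbs the secondary hub and its neighborhood before any degree-1 spoke, which is the behavior that preserves structural characteristics at low sampling rates.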

  6. Maintaining a permanent plot data base for growth and yield research: Solutions to some recurring problems

    Treesearch

    John C. Byrne

    1993-01-01

    Methods for solving some recurring problems of maintaining a permanent plot data base for growth and yield research are described. These methods include documenting data from diverse sampling designs, changing sampling designs, changing field procedures, and coordinating activities in the plots with the land management agency. Managing a permanent plot data base (...

  7. Sorbent-based sampling methods for volatile and semi-volatile organic compounds in air. Part 2. Sorbent selection and other aspects of optimizing air monitoring methods.

    PubMed

    Woolfenden, Elizabeth

    2010-04-16

    Sorbent tubes/traps are widely used in combination with gas chromatographic (GC) analytical methods to monitor the vapour-phase fraction of organic compounds in air. Applications range from atmospheric research and ambient air monitoring (indoor and outdoor) to occupational hygiene (personal exposure assessment) and measuring chemical emission levels. Part 1 of this paper reviewed the main sorbent-based air sampling strategies including active (pumped) tube monitoring, diffusive (passive) sampling onto sorbent tubes/cartridges plus sorbent trapping/focusing of whole air samples that are either collected in containers (such as canisters or bags) or monitored online. Options for subsequent extraction and transfer to GC(MS) analysis were also summarised and the trend to thermal desorption (TD)-based methods and away from solvent extraction was explained. As a result of this trend, demand for TD-compatible sorbents (alternatives to traditional charcoal) is growing. Part 2 of this paper therefore continues with a summary of TD-compatible sorbents, their respective advantages and limitations and considerations for sorbent selection. Other analytical considerations for optimizing sorbent-based air monitoring methods are also discussed together with recent technical developments and sampling accessories which have extended the application range of sorbent trapping technology generally. Copyright 2010 Elsevier B.V. All rights reserved.

  8. Forensic discrimination of copper wire using trace element concentrations.

    PubMed

    Dettman, Joshua R; Cassabaum, Alyssa A; Saunders, Christopher P; Snyder, Deanna L; Buscaglia, JoAnn

    2014-08-19

    Copper may be recovered as evidence in high-profile cases such as thefts and improvised explosive device incidents; comparison of copper samples from the crime scene and those associated with the subject of an investigation can provide probative associative evidence and investigative support. A solution-based inductively coupled plasma mass spectrometry method for measuring trace element concentrations in high-purity copper was developed using standard reference materials. The method was evaluated for its ability to use trace element profiles to statistically discriminate between copper samples considering the precision of the measurement and manufacturing processes. The discriminating power was estimated by comparing samples chosen on the basis of the copper refining and production process to represent the within-source (samples expected to be similar) and between-source (samples expected to be different) variability using multivariate parametric- and empirical-based data simulation models with bootstrap resampling. If the false exclusion rate is set to 5%, >90% of the copper samples can be correctly determined to originate from different sources using a parametric-based model and >87% with an empirical-based approach. These results demonstrate the potential utility of the developed method for the comparison of copper samples encountered as forensic evidence.
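    The within-source vs. between-source logic can be illustrated by Monte Carlo simulation; every distribution and parameter below is a hypothetical stand-in for the measured trace-element profiles, chosen only to show how a fixed false-exclusion rate yields a discrimination rate:

```python
import numpy as np

def discrimination_rates(n_sim=4000, n_elem=8, within_sd=0.03, between_sd=0.3,
                         alpha=0.05, seed=6):
    """Monte Carlo sketch of within-source vs. between-source comparison scores
    for multivariate trace-element profiles (all parameters hypothetical)."""
    rng = np.random.default_rng(seed)
    # Same-source pairs: one true profile, two noisy measurements of it.
    base = rng.normal(0.0, between_sd, (n_sim, n_elem))
    within = np.linalg.norm((base + rng.normal(0.0, within_sd, base.shape))
                            - (base + rng.normal(0.0, within_sd, base.shape)), axis=1)
    # Different-source pairs: two independent true profiles.
    between = np.linalg.norm(rng.normal(0.0, between_sd, base.shape)
                             - rng.normal(0.0, between_sd, base.shape), axis=1)
    # Fix the false-exclusion rate at alpha on same-source pairs...
    threshold = np.quantile(within, 1.0 - alpha)
    # ...and report how often different-source pairs are correctly excluded.
    return float(np.mean(between > threshold))

rate = discrimination_rates()
```

    Because measurement variability (within-source) is far smaller than refining variability (between-source) in this toy setup, almost all different-source pairs exceed the 5% false-exclusion threshold, mirroring the >87-90% discrimination structure the study reports.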

  9. Nonlinear inversion of electrical resistivity imaging using pruning Bayesian neural networks

    NASA Astrophysics Data System (ADS)

    Jiang, Fei-Bo; Dai, Qian-Wei; Dong, Li

    2016-06-01

    Conventional artificial neural networks used to solve the electrical resistivity imaging (ERI) inversion problem suffer from overfitting and local minima. To solve these problems, we propose a pruning Bayesian neural network (PBNN) nonlinear inversion method and a sample design method based on the K-medoids clustering algorithm. In the sample design method, the training samples of the neural network are designed according to the prior information provided by the K-medoids clustering results; thus, the training process of the neural network is well guided. The proposed PBNN, based on Bayesian regularization, is used to select the hidden layer structure by assessing the effect of each hidden neuron on the inversion results. Then, the hyperparameter α_k, which is based on the generalized mean, is chosen to guide the pruning process according to the prior distribution of the training samples under the small-sample condition. The proposed algorithm is more efficient than other common adaptive regularization methods in geophysics. The inversion of synthetic and field data suggests that the proposed method suppresses noise during the neural network training stage and enhances generalization. The inversion results with the proposed method are better than those of the BPNN, RBFNN, and RRBFNN inversion methods as well as the conventional least squares inversion.

  10. A Comparison of Two Sampling Strategies to Assess Discomycete Diversity in Wet Tropical Forests

    Treesearch

    SHARON A. CANTRELL

    2004-01-01

Most of the fungal diversity studies that have used a systematic collecting scheme have not included the discomycetes, so optimal sampling methods are not available for this group. In this study, I tested two sampling methods at sites in the Caribbean National Forest, Puerto Rico, and the Ebano Verde Reserve, Dominican Republic. For a plot-based sampling method, 10 ×...

  11. A Sequential Optimization Sampling Method for Metamodels with Radial Basis Functions

    PubMed Central

    Pan, Guang; Ye, Pengcheng; Yang, Zhidong

    2014-01-01

Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is strongly affected by the sampling methods used. In this paper, a new sequential optimization sampling method is proposed. Based on the new sampling method, metamodels can be constructed repeatedly through the addition of sampling points, namely, the extrema points of the metamodels and the minimum points of a density function. Repeating this procedure yields increasingly accurate metamodels. The validity and effectiveness of the proposed sampling method are examined by studying typical numerical examples. PMID:25133206
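A minimal sketch of such a sequential sampling loop, using SciPy's `RBFInterpolator` and only a "metamodel minimum" infill criterion (the density-function criterion from the abstract is omitted, and the Forrester test function stands in for an expensive simulation):

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def f(x):                                          # Forrester test function,
    return (6 * x - 2) ** 2 * np.sin(12 * x - 4)   # standing in for a costly simulation

x = np.linspace(0.0, 1.0, 5)[:, None]       # initial space-filling design
y = f(x[:, 0])
grid = np.linspace(0.0, 1.0, 401)[:, None]

for _ in range(10):                         # sequential infill loop
    model = RBFInterpolator(x, y)           # rebuild metamodel each round
    x_new = grid[np.argmin(model(grid))]    # infill at the metamodel minimum
    if np.min(np.abs(x[:, 0] - x_new[0])) < 1e-3:
        break                               # point already sampled: stop
    x = np.vstack([x, x_new[None, :]])
    y = np.append(y, f(x_new[0]))

print(x[np.argmin(y), 0], y.min())          # best sampled point so far
```

Each round spends one "expensive" evaluation where the current metamodel predicts the optimum, so accuracy concentrates where it matters for optimization.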

  12. 40 CFR 761.289 - Compositing samples.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Remediation Waste and Porous Surfaces in Accordance With § 761.61(a)(6) § 761.289 Compositing samples. Compositing is a method of combining several samples of a specific type of bulk PCB remediation waste or... compositing bulk PCB remediation waste samples. These procedures are based on the method for selecting...

  13. 40 CFR 761.289 - Compositing samples.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Remediation Waste and Porous Surfaces in Accordance With § 761.61(a)(6) § 761.289 Compositing samples. Compositing is a method of combining several samples of a specific type of bulk PCB remediation waste or... compositing bulk PCB remediation waste samples. These procedures are based on the method for selecting...

  14. A fast learning method for large scale and multi-class samples of SVM

    NASA Astrophysics Data System (ADS)

    Fan, Yu; Guo, Huiming

    2017-06-01

A fast learning method for multi-class support vector machine (SVM) classification based on a binary tree is presented to address the low learning efficiency of SVMs when processing large-scale multi-class sample sets. A bottom-up method is adopted to set up the binary tree hierarchy; according to the achieved hierarchy, a sub-classifier learns from the corresponding samples of each node. During learning, several class clusters are generated after a first clustering of the training samples. Central points are first extracted from those class clusters that contain only one type of sample. For clusters that contain two types of samples, cluster numbers for their positive and negative samples are set according to their degree of mixture, a secondary clustering is undertaken, and central points are then extracted from the resulting sub-class clusters. Sub-classifiers are obtained by learning from the reduced sample sets formed by integrating the extracted central points. Simulation experiments show that this fast learning method, which is based on multi-level clustering, guarantees high classification accuracy while greatly reducing the number of training samples and effectively improving learning efficiency.
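The core idea of replacing a large training set with cluster central points can be sketched with scikit-learn. This is a generic illustration of the sample-reduction step, not the paper's binary-tree algorithm; the data and cluster counts are synthetic assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Synthetic 3-class problem standing in for a large multi-class training set.
X, y = make_blobs(n_samples=3000, centers=[[-5, -5], [0, 5], [5, -5]],
                  cluster_std=1.5, random_state=0)

# Reduce each class to 20 cluster centers ("central points"), echoing the
# idea of learning from representatives instead of all samples.
Xr, yr = [], []
for c in np.unique(y):
    km = KMeans(n_clusters=20, n_init=5, random_state=0).fit(X[y == c])
    Xr.append(km.cluster_centers_)
    yr.append(np.full(20, c))
Xr, yr = np.vstack(Xr), np.concatenate(yr)

full = SVC(kernel="rbf").fit(X, y)        # baseline: all 3000 samples
reduced = SVC(kernel="rbf").fit(Xr, yr)   # 60 representative points
print(full.score(X, y), reduced.score(X, y))
```

Training on 60 representatives instead of 3000 samples cuts the quadratic-in-samples SVM training cost drastically while keeping accuracy close to the baseline on separable data.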

  15. Time Series Analysis Based on Running Mann Whitney Z Statistics

    USDA-ARS?s Scientific Manuscript database

    A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
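A rough sketch of the running Mann-Whitney Z idea with SciPy; the window layout and the normal approximation used for the U-to-Z conversion are assumptions for illustration, not the manuscript's exact procedure.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(series, window):
    """Slide two adjacent windows along the series and convert each
    Mann-Whitney U to a Z score (normal approximation, no tie correction)."""
    n = window
    mu = n * n / 2.0
    sigma = np.sqrt(n * n * (2 * n + 1) / 12.0)
    z = []
    for t in range(window, series.size - window + 1):
        u = mannwhitneyu(series[t - window:t], series[t:t + window],
                         alternative="two-sided").statistic
        z.append((u - mu) / sigma)
    return np.array(z)

# Step change at index 50 in an otherwise stationary noisy series.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 1, 50), rng.normal(3, 1, 50)])
z = running_mw_z(x, window=15)
print(np.argmin(z) + 15, z.min())   # most extreme Z lands near the step
```

Because the statistic is rank-based, the Z series reacts to shifts in level without assuming normality of the underlying data.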

  16. Plane-Based Sampling for Ray Casting Algorithm in Sequential Medical Images

    PubMed Central

    Lin, Lili; Chen, Shengyong; Shao, Yan; Gu, Zichun

    2013-01-01

This paper proposes a plane-based sampling method to improve the traditional Ray Casting Algorithm (RCA) for the fast reconstruction of a three-dimensional biomedical model from sequential images. In the novel method, the optical properties of all sampling points depend on the intersection points where a ray travels through an equidistant parallel plane cluster of the volume dataset. The results show that the method improves the rendering speed by over three times compared with the conventional algorithm, while image quality is well preserved. PMID:23424608

  17. Comparison of Standard Culture-Based Method to Culture-Independent Method for Evaluation of Hygiene Effects on the Hand Microbiome.

    PubMed

    Zapka, C; Leff, J; Henley, J; Tittl, J; De Nardo, E; Butler, M; Griggs, R; Fierer, N; Edmonds-Wilson, S

    2017-03-28

Hands play a critical role in the transmission of microbiota on one's own body, between individuals, and on environmental surfaces. Effectively measuring the composition of the hand microbiome is important to hand hygiene science, which has implications for human health. Hand hygiene products are evaluated using standard culture-based methods, but standard test methods for culture-independent microbiome characterization are lacking. We sampled the hands of 50 participants using swab-based and glove-based methods prior to and following four hand hygiene treatments (using a nonantimicrobial hand wash, alcohol-based hand sanitizer [ABHS], a 70% ethanol solution, or tap water). We compared results among culture plate counts, 16S rRNA gene sequencing of DNA extracted directly from hands, and sequencing of DNA extracted from culture plates. Glove-based sampling yielded higher numbers of unique operational taxonomic units (OTUs) but had less diversity in bacterial community composition than swab-based sampling. We detected treatment-induced changes in diversity only by using swab-based samples (P < 0.001); we were unable to detect changes with glove-based samples. Bacterial cell counts significantly decreased with use of the ABHS (P < 0.05) and ethanol control (P < 0.05). Skin hydration at baseline correlated with bacterial abundances, bacterial community composition, pH, and redness across subjects. The importance of the method choice was substantial. These findings are important to ensure improvement of hand hygiene industry methods and for future hand microbiome studies. On the basis of our results and previously published studies, we propose recommendations for best practices in hand microbiome research. IMPORTANCE The hand microbiome is a critical area of research for diverse fields, such as public health and forensics. The suitability of culture-independent methods for assessing effects of hygiene products on microbiota has not been demonstrated.
This is the first controlled laboratory clinical hand study to have compared traditional hand hygiene test methods with newer culture-independent characterization methods typically used by skin microbiologists. This study resulted in recommendations for hand hygiene product testing, development of methods, and future hand skin microbiome research. It also demonstrated the importance of inclusion of skin physiological metadata in skin microbiome research, which is atypical for skin microbiome studies. Copyright © 2017 Zapka et al.

  18. Comparing real-time and conventional PCR to culture-based methods for detecting and quantifying Escherichia coli O157 in cattle feces.

    PubMed

    Jacob, M E; Bai, J; Renter, D G; Rogers, A T; Shi, X; Nagaraja, T G

    2014-02-01

Detection of Escherichia coli O157 in cattle feces has traditionally used culture-based methods; PCR-based methods have been suggested as an alternative. We aimed to determine if multiplex real-time (mq) or conventional PCR methods could reliably detect cattle naturally shedding high (≥10(4) CFU/g of feces) and low (∼10(2) CFU/g of feces) concentrations of E. coli O157. Feces were collected from pens of feedlot cattle and evaluated for E. coli O157 by culture methods. Samples were categorized as (i) high shedders, (ii) immunomagnetic separation (IMS) positive after enrichment, or (iii) culture negative. DNA was extracted pre- and postenrichment from 100 fecal samples from each category (high shedder, IMS positive, culture negative) and subjected to mqPCR and conventional PCR assays based on detecting three genes, rfbE, stx1, and stx2. In feces from cattle determined to be E. coli O157 high shedders by culture, 37% were positive by mqPCR prior to enrichment; 85% of samples were positive after enrichment. In IMS-positive samples, 4% were positive by mqPCR prior to enrichment, while 43% were positive after enrichment. In culture-negative feces, 7% were positive by mqPCR prior to enrichment, and 40% were positive after enrichment. The proportions of high-shedder-positive and culture-positive (high shedder and IMS) samples were significantly different from the proportions of mqPCR-positive samples before and after enrichment (P < 0.01). Similar results were observed for conventional PCR. Our data suggest that mqPCR and conventional PCR are most useful in identifying high-shedder animals and may not be an appropriate substitute for culture-based methods for detection of E. coli O157 in cattle feces.

  19. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa.

    PubMed

    Siegel, Chloe S; Stevenson, Florence O; Zimmer, Elizabeth A

    2017-02-01

    An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)-based extraction methods from silica-dried samples. DNA was extracted using FTA cards according to the manufacturer's protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation.

  20. An empirical comparison of isolate-based and sample-based definitions of antimicrobial resistance and their effect on estimates of prevalence.

    PubMed

    Humphry, R W; Evans, J; Webster, C; Tongue, S C; Innocent, G T; Gunn, G J

    2018-02-01

Antimicrobial resistance is primarily a problem in human medicine, but there are unquantified links of transmission in both directions between animal and human populations. Quantitative assessment of the costs and benefits of reduced antimicrobial usage in livestock requires robust quantification of transmission of resistance between animals, the environment and the human population. This in turn requires appropriate measurement of resistance. To tackle this, we selected two different methods for determining whether a sample is resistant: one based on screening a sample, the other on testing individual isolates. Our overall objective was to explore the differences arising from the choice of measurement. A literature search demonstrated the widespread use of testing of individual isolates. The first aim of this study was to compare, quantitatively, sample-level and isolate-level screening. Cattle or sheep faecal samples (n=41) submitted for routine parasitology were tested for antimicrobial resistance in two ways: (1) "streak" direct culture onto plates containing the antimicrobial of interest; (2) determination of the minimum inhibitory concentration (MIC) of 8-10 isolates per sample compared to published MIC thresholds. Two antibiotics (ampicillin and nalidixic acid) were tested. With ampicillin, direct culture resulted in more than double the number of resistant samples than the MIC method based on eight individual isolates. The second aim of this study was to demonstrate the utility of the observed relationship between these two measures of antimicrobial resistance to re-estimate the prevalence of antimicrobial resistance from a previous study, in which we had used "streak" cultures. Bootstrap methods were used to estimate the proportion of samples that would have tested resistant in the historic study, had we used the isolate-based MIC method instead.
Our bootstrap results indicate that our estimates of the prevalence of antimicrobial resistance would have been considerably lower in the historic study had the MIC method been used. Finally, we conclude that there is no single way of defining a sample as resistant to an antimicrobial agent. The method used greatly affects the estimated prevalence of antimicrobial resistance in a sampled population of animals, thus potentially resulting in misleading results. Comparing methods on the same samples allows us to re-estimate the prevalence from other studies, had other methods for determining resistance been used. The results of this study highlight the importance of establishing the most appropriate measure of antimicrobial resistance for the proposed purpose of the results. Copyright © 2017 Elsevier B.V. All rights reserved.
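The bootstrap translation between the two resistance definitions can be sketched as follows. All paired counts here are invented for illustration, not the study's data: the paired sample estimates how often a streak-positive (or streak-negative) sample would be called resistant by the MIC method, and bootstrapping propagates that uncertainty into a re-estimated historic prevalence.

```python
import numpy as np

rng = np.random.default_rng(7)

# Invented paired results for n = 41 samples (1 = resistant, 0 = not);
# these counts are illustrative, not the study's data.
streak = np.array([1] * 20 + [0] * 21)
mic    = np.array([1] * 9 + [0] * 11 + [0] * 21)

def reestimate(streak, mic, historic_streak_pos, n_hist, n_boot=5000):
    """Bootstrap the MIC-based prevalence implied by a historic streak-based
    count, using the paired sample to translate between definitions."""
    out = []
    for _ in range(n_boot):
        idx = rng.integers(0, streak.size, streak.size)  # resample pairs
        s, m = streak[idx], mic[idx]
        if s.sum() == 0 or s.sum() == s.size:
            continue                                     # degenerate resample
        p_pos = m[s == 1].mean()        # P(MIC resistant | streak positive)
        p_neg = m[s == 0].mean()        # P(MIC resistant | streak negative)
        out.append((historic_streak_pos * p_pos
                    + (n_hist - historic_streak_pos) * p_neg) / n_hist)
    return np.percentile(out, [2.5, 50, 97.5])

ci = reestimate(streak, mic, historic_streak_pos=60, n_hist=100)
print(ci)   # MIC-based prevalence sits well below the streak-based 60%
```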

  1. The EIPeptiDi tool: enhancing peptide discovery in ICAT-based LC MS/MS experiments.

    PubMed

    Cannataro, Mario; Cuda, Giovanni; Gaspari, Marco; Greco, Sergio; Tradigo, Giuseppe; Veltri, Pierangelo

    2007-07-15

Isotope-coded affinity tags (ICAT) is a method for quantitative proteomics based on differential isotopic labeling, sample digestion and mass spectrometry (MS). The method allows the identification and relative quantification of proteins present in two samples and consists of the following phases. First, cysteine residues are labeled using either the ICAT Light or ICAT Heavy reagent (having identical chemical properties but different masses). Then, after whole sample digestion, the labeled peptides are captured selectively using the biotin tag contained in both ICAT reagents. Finally, the simplified peptide mixture is analyzed by nanoscale liquid chromatography-tandem mass spectrometry (LC-MS/MS). Nevertheless, the ICAT LC-MS/MS method still suffers from insufficient sample-to-sample reproducibility in peptide identification. In particular, the number and the type of peptides identified in different experiments can vary considerably and, thus, the statistical (comparative) analysis of sample sets is very challenging. Low information overlap at the peptide and, consequently, at the protein level, is very detrimental in situations where the number of samples to be analyzed is high. We designed a method for improving the data processing and peptide identification in sample sets subjected to ICAT labeling and LC-MS/MS analysis, based on cross-validating MS/MS results. Such a method has been implemented in a tool, called EIPeptiDi, which boosts ICAT data analysis by improving peptide identification throughout the input data set. Heavy/Light (H/L) pairs quantified but not identified by the MS/MS routine are assigned to peptide sequences identified in other samples, by using similarity criteria based on chromatographic retention time and Heavy/Light mass attributes.
EIPeptiDi significantly improves the number of identified peptides per sample, proving that the proposed method has a considerable impact on the protein identification process and, consequently, on the amount of potentially critical information in clinical studies. The EIPeptiDi tool is available at http://bioingegneria.unicz.it/~veltri/projects/eipeptidi/ with a demo data set. EIPeptiDi significantly increases the number of peptides identified and quantified in analyzed samples, thus reducing the number of unassigned H/L pairs and allowing a better comparative analysis of sample data sets.

  2. Comparison of Methods for Estimating Low Flow Characteristics of Streams

    USGS Publications Warehouse

    Tasker, Gary D.

    1987-01-01

Four methods for estimating the 7-day, 10-year and 7-day, 20-year low flows for streams are compared by the bootstrap method. The bootstrap method is a Monte Carlo technique in which random samples are drawn from an unspecified sampling distribution defined from observed data. The nonparametric nature of the bootstrap makes it suitable for comparing methods based on a flow series for which the true distribution is unknown. Results show that the two methods based on hypothetical distributions (log-Pearson III and Weibull) had lower mean square errors than did the Box-Cox transformation method or the log-Boughton method, which is based on a fit of plotting positions.
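The bootstrap comparison can be sketched for one distributional method against one plotting-position-style method. The synthetic lognormal flow record and both estimators are generic stand-ins, not Tasker's four methods; because the generating distribution is known here, each bootstrap estimate can be scored against the true quantile.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in "observed" annual 7-day low flows: lognormal, so the true
# 10-year (p = 0.1) quantile is known exactly for scoring the methods.
mu, sigma = 2.0, 0.6
true_q10 = np.exp(mu - 1.2816 * sigma)      # lognormal 10th percentile
flows = rng.lognormal(mu, sigma, 40)

def lognormal_fit(sample):                  # distributional method
    lm, ls = np.log(sample).mean(), np.log(sample).std(ddof=1)
    return np.exp(lm - 1.2816 * ls)

def empirical_quantile(sample):             # plotting-position-style method
    return np.quantile(sample, 0.1)

# Bootstrap: resample the record, apply each method, accumulate squared error.
methods = {"lognormal fit": lognormal_fit,
           "empirical quantile": empirical_quantile}
mse = {name: np.mean([(fn(rng.choice(flows, flows.size, replace=True))
                       - true_q10) ** 2 for _ in range(2000)])
       for name, fn in methods.items()}
print(mse)
```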

  3. The dynamic micro computed tomography at SSRF

    NASA Astrophysics Data System (ADS)

    Chen, R.; Xu, L.; Du, G.; Deng, B.; Xie, H.; Xiao, T.

    2018-05-01

Synchrotron radiation micro-computed tomography (SR-μCT) is a critical technique for quantitatively characterizing the 3D internal structure of samples. Recently, dynamic SR-μCT has attracted considerable attention because it can evaluate the three-dimensional structural evolution of a sample. A dynamic μCT method based on a monochromatic beam was developed at the X-ray Imaging and Biomedical Application Beamline at the Shanghai Synchrotron Radiation Facility by combining a compressed-sensing-based CT reconstruction algorithm with a hardware upgrade. The monochromatic-beam-based method can achieve quantitative information at a lower dose than the white-beam-based method, in which the lower-energy beam is absorbed by the sample rather than contributing to the final imaging signal. The developed method was successfully used to investigate the compression of the air sac during respiration in a bell cricket, providing new knowledge for further research on the insect respiratory system.

  4. Palladium-based Mass-Tag Cell Barcoding with a Doublet-Filtering Scheme and Single Cell Deconvolution Algorithm

    PubMed Central

    Zunder, Eli R.; Finck, Rachel; Behbehani, Gregory K.; Amir, El-ad D.; Krishnaswamy, Smita; Gonzalez, Veronica D.; Lorang, Cynthia G.; Bjornson, Zach; Spitzer, Matthew H.; Bodenmiller, Bernd; Fantl, Wendy J.; Pe’er, Dana; Nolan, Garry P.

    2015-01-01

    SUMMARY Mass-tag cell barcoding (MCB) labels individual cell samples with unique combinatorial barcodes, after which they are pooled for processing and measurement as a single multiplexed sample. The MCB method eliminates variability between samples in antibody staining and instrument sensitivity, reduces antibody consumption, and shortens instrument measurement time. Here, we present an optimized MCB protocol with several improvements over previously described methods. The use of palladium-based labeling reagents expands the number of measurement channels available for mass cytometry and reduces interference with lanthanide-based antibody measurement. An error-detecting combinatorial barcoding scheme allows cell doublets to be identified and removed from the analysis. A debarcoding algorithm that is single cell-based rather than population-based improves the accuracy and efficiency of sample deconvolution. This debarcoding algorithm has been packaged into software that allows rapid and unbiased sample deconvolution. The MCB procedure takes 3–4 h, not including sample acquisition time of ~1 h per million cells. PMID:25612231
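The error-detecting combinatorial scheme can be sketched as a k-of-n channel code. The 6-choose-3 layout below is hypothetical (the real protocol's channel counts differ): because every valid barcode lights up exactly k channels, the union of two barcodes in a doublet lights up more than k and is rejected.

```python
from itertools import combinations

# Hypothetical 6-choose-3 scheme: 20 barcodes, each using exactly 3 of 6
# mass channels (the real kit's channel counts may differ). A doublet is
# the OR of two barcodes, so it lights up 4-6 channels.
channels, k = 6, 3
barcodes = [set(c) for c in combinations(range(channels), k)]

def deconvolve(positive):
    """Return the sample index for a single cell, or None for doublets
    and mis-stained events (wrong number of positive channels)."""
    if len(positive) != k:
        return None
    try:
        return barcodes.index(set(positive))
    except ValueError:
        return None

print(len(barcodes))                          # 20 multiplexed samples
print(deconvolve({0, 1, 2}))                  # valid single cell
print(deconvolve(barcodes[0] | barcodes[5]))  # doublet: rejected (None)
```

Since any two distinct 3-subsets differ in at least one channel, every cross-sample doublet has 4-6 positive channels and can never masquerade as a valid barcode.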

  5. Item Selection for the Development of Parallel Forms from an IRT-Based Seed Test Using a Sampling and Classification Approach

    ERIC Educational Resources Information Center

    Chen, Pei-Hua; Chang, Hua-Hua; Wu, Haiyan

    2012-01-01

    Two sampling-and-classification-based procedures were developed for automated test assembly: the Cell Only and the Cell and Cube methods. A simulation study based on a 540-item bank was conducted to compare the performance of the procedures with the performance of a mixed-integer programming (MIP) method for assembling multiple parallel test…

  6. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa1

    PubMed Central

    Siegel, Chloe S.; Stevenson, Florence O.; Zimmer, Elizabeth A.

    2017-01-01

    Premise of the study: An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)–based extraction methods from silica-dried samples. Methods: DNA was extracted using FTA cards according to the manufacturer’s protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. Results: The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. Discussion: The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation. PMID:28224056

  7. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
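Method 1 above, the classical Monte Carlo p-value, can be sketched with a permutation null. The statistic here is a simple absolute mean difference standing in for the density-based likelihood ratio (which would require density estimates); the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

def mc_pvalue(stat, x, y, n_mc=4999):
    """Monte Carlo p-value by permutation: refer the observed statistic to
    statistics recomputed on randomly relabeled pooled data."""
    obs = stat(x, y)
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_mc):
        rng.shuffle(pooled)
        if stat(pooled[:x.size], pooled[x.size:]) >= obs:
            count += 1
    return (count + 1) / (n_mc + 1)     # add-one rule keeps p > 0

# Toy statistic: absolute mean difference (a stand-in for the density-based
# likelihood ratio, which would require density estimates).
stat = lambda a, b: abs(a.mean() - b.mean())
x, y = rng.normal(0, 1, 30), rng.normal(1, 1, 30)
p = mc_pvalue(stat, x, y)
print(p)
```

Tabulated critical values (method 2) would replace the Monte Carlo loop with a lookup; the hybrid method uses both, treating the table as prior information.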

  8. THE SCREENING AND RANKING ALGORITHM FOR CHANGE-POINTS DETECTION IN MULTIPLE SAMPLES

    PubMed Central

    Song, Chi; Min, Xiaoyi; Zhang, Heping

    2016-01-01

    The chromosome copy number variation (CNV) is the deviation of genomic regions from their normal copy number states, which may associate with many human diseases. Current genetic studies usually collect hundreds to thousands of samples to study the association between CNV and diseases. CNVs can be called by detecting the change-points in mean for sequences of array-based intensity measurements. Although multiple samples are of interest, the majority of the available CNV calling methods are single sample based. Only a few multiple sample methods have been proposed using scan statistics that are computationally intensive and designed toward either common or rare change-points detection. In this paper, we propose a novel multiple sample method by adaptively combining the scan statistic of the screening and ranking algorithm (SaRa), which is computationally efficient and is able to detect both common and rare change-points. We prove that asymptotically this method can find the true change-points with almost certainty and show in theory that multiple sample methods are superior to single sample methods when shared change-points are of interest. Additionally, we report extensive simulation studies to examine the performance of our proposed method. Finally, using our proposed method as well as two competing approaches, we attempt to detect CNVs in the data from the Primary Open-Angle Glaucoma Genes and Environment study, and conclude that our method is faster and requires less information while our ability to detect the CNVs is comparable or better. PMID:28090239
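The screening step of SaRa can be sketched as a local difference-of-means diagnostic computed in O(n) with cumulative sums. The ranking/thresholding below (top scores with a minimum separation) is a simplification of the actual algorithm, and the single-sequence data are synthetic.

```python
import numpy as np

def sara_scores(x, h):
    """Local diagnostic: |mean of the h points after t minus mean of the
    h points before t|, computed for all t via cumulative sums."""
    c = np.cumsum(np.concatenate([[0.0], x]))
    t = np.arange(h, x.size - h + 1)
    left = (c[t] - c[t - h]) / h
    right = (c[t + h] - c[t]) / h
    return t, np.abs(right - left)

# Synthetic intensity sequence: mean shifts at positions 100 and 160.
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 100), rng.normal(2, 1, 60),
                    rng.normal(0, 1, 100)])
t, score = sara_scores(x, h=12)

# Rank candidate positions by score, keeping them at least h apart.
found = []
for i in np.argsort(score)[::-1]:
    if all(abs(int(t[i]) - f) > 12 for f in found):
        found.append(int(t[i]))
    if len(found) == 2:
        break
print(sorted(found))    # near the true change-points 100 and 160
```

The multiple-sample extension in the paper combines such per-sample scores across sequences, which is what lets shared change-points stand out above single-sample noise.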

  9. A random spatial sampling method in a rural developing nation

    Treesearch

    Michelle C. Kondo; Kent D.W. Bream; Frances K. Barg; Charles C. Branas

    2014-01-01

    Nonrandom sampling of populations in developing nations has limitations and can inaccurately estimate health phenomena, especially among hard-to-reach populations such as rural residents. However, random sampling of rural populations in developing nations can be challenged by incomplete enumeration of the base population. We describe a stratified random sampling method...

  10. Domain Regeneration for Cross-Database Micro-Expression Recognition

    NASA Astrophysics Data System (ADS)

    Zong, Yuan; Zheng, Wenming; Huang, Xiaohua; Shi, Jingang; Cui, Zhen; Zhao, Guoying

    2018-05-01

In this paper, we investigate the cross-database micro-expression recognition problem, where the training and testing samples come from two different micro-expression databases. Under this setting, the training and testing samples have different feature distributions, and hence the performance of most existing micro-expression recognition methods may decrease greatly. To solve this problem, we propose a simple yet effective method called the Target Sample Re-Generator (TSRG). Using TSRG, we are able to re-generate the samples from the target micro-expression database such that the re-generated target samples share the same or similar feature distributions as the original source samples. For this reason, we can then use the classifier learned on the labeled source samples to accurately predict the micro-expression categories of the unlabeled target samples. To evaluate the performance of the proposed TSRG method, extensive cross-database micro-expression recognition experiments based on the SMIC and CASME II databases are conducted. Compared with recent state-of-the-art cross-database emotion recognition methods, the proposed TSRG achieves more promising results.

  11. Ionic liquid-based reagents improve the stability of midterm fecal sample storage.

    PubMed

    Hao, Lilan; Xia, Zhongkui; Yang, Huanming; Wang, Jian; Han, Mo

    2017-08-01

Fecal samples are widely used in metagenomic research, which aims to elucidate the relationship between human health and the intestinal microbiota. However, the best conditions for stable and reliable storage and transport of these samples at room temperature are still unknown, including whether samples stored at room temperature for several days maintain their microbiota composition. Here, we established and tested a preservation method using reagents containing imidazolium- or pyridinium-based ionic liquids. We stored human fecal samples in these reagents for up to 7 days at different temperatures. Subsequently, all samples were sequenced and compared with fresh samples and/or samples treated under other conditions. The 16S rRNA sequencing results suggested that ionic liquid-based reagents can stabilize the composition of the microbiota in fecal samples during a 7-day storage period, particularly when stored at room temperature. Thus, this method may have implications for the storage of fecal samples for metagenomic research. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Evaluating the effect of disturbed ensemble distributions on SCFG based statistical sampling of RNA secondary structures.

    PubMed

    Scheid, Anika; Nebel, Markus E

    2012-07-09

    Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. In this work, we will consider the SCFG based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. 
Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst case, without sacrificing much of the accuracy of the results. Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms.
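The distinction between absolute and relative disturbances can be illustrated on a skewed categorical distribution, an invented stand-in for a Boltzmann-weighted ensemble, with distortion measured by total variation distance between the exact and perturbed distributions.

```python
import numpy as np

rng = np.random.default_rng(4)

# A stand-in "ensemble" over 1000 structures with the heavy skew typical of
# Boltzmann-weighted ensembles (invented, not a real RNA ensemble).
p = rng.exponential(1.0, 1000) ** 3
p /= p.sum()

def perturb(p, eps, relative):
    """Disturb each probability by up to eps (relative to the entry itself,
    or absolute on the scale of the largest entry), then renormalize."""
    noise = rng.uniform(-eps, eps, p.size)
    q = p * (1 + noise) if relative else np.clip(p + noise * p.max(), 0, None)
    return q / q.sum()

def tv(p, q):                    # total variation distance between ensembles
    return 0.5 * np.abs(p - q).sum()

t_rel = tv(p, perturb(p, 0.5, relative=True))
t_abs = tv(p, perturb(p, 0.5, relative=False))
print(t_rel, t_abs)   # relative errors distort the ensemble far less
```

The intuition matches the paper's finding: absolute noise swamps the many tiny probabilities that carry the ensemble's shape, while relative noise preserves their ordering and rough magnitudes.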

  13. Evaluating the effect of disturbed ensemble distributions on SCFG based statistical sampling of RNA secondary structures

    PubMed Central

    2012-01-01

    Background: Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. Results: In this work, we consider the SCFG based approach in order to analyze how the quality of generated sample sets and the corresponding prediction accuracy change when different degrees of disturbance are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly; it may be sufficient, and more efficient, to approximate them. Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst case without sacrificing much of the accuracy of the results. Conclusions: Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only a small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms. PMID:22776037
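The disturbance scheme studied above can be sketched in a few lines. This is an illustrative stdlib-only reconstruction, not the paper's code; the rule probabilities and the error magnitude are hypothetical:

```python
import random

def perturb(probs, eps, mode, rng):
    """Disturb a discrete sampling distribution and renormalize.

    mode='relative': each probability p becomes p * (1 + u), u ~ U(-eps, eps).
    mode='absolute': each p is shifted to max(p + u, 0), which can swamp
    the tiny probabilities that dominate SCFG-based structure sampling.
    """
    out = []
    for p in probs:
        u = rng.uniform(-eps, eps)
        out.append(p * (1.0 + u) if mode == "relative" else max(p + u, 0.0))
    total = sum(out)
    return [q / total for q in out]

rng = random.Random(0)
exact = [0.7, 0.2, 0.09, 0.01]   # hypothetical rule probabilities
rel = perturb(exact, 0.5, "relative", rng)
ab = perturb(exact, 0.5, "absolute", rng)
```

With relative errors every probability keeps its order of magnitude; with absolute errors of the same size, the rarest alternatives can be driven to zero or inflated by orders of magnitude, mirroring the reported finding that absolute errors yield useless sample sets.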

  14. Evaluation of a polymerase chain reaction-based system for detection of Salmonella enteritidis, Escherichia coli O157:H7, Listeria spp., and Listeria monocytogenes on fresh fruits and vegetables.

    PubMed

    Shearer, A E; Strapp, C M; Joerger, R D

    2001-06-01

    A polymerase chain reaction (PCR)-based detection system, BAX, was evaluated for its sensitivity in detecting Salmonella Enteritidis, Escherichia coli O157:H7, Listeria sp., and Listeria monocytogenes on fresh produce. Fifteen different types of produce (alfalfa sprouts, green peppers, parsley, white cabbage, radishes, onions, carrots, mushrooms, leaf lettuce, tomatoes, strawberries, cantaloupe, mango, apples, and oranges) were inoculated, in separate studies, with Salmonella Enteritidis, E. coli O157:H7, and L. monocytogenes down to the predicted level of 1 CFU per 25-g sample. Detection by BAX was compared to recovery of the inoculated bacteria by culture methods according to the Food and Drug Administration's (FDA) Bacteriological Analytical Manual (BAM). BAX was essentially as sensitive as the culture-based method in detecting Salmonella Enteritidis and L. monocytogenes and more sensitive than the culture-based method for the detection of E. coli O157:H7 on green pepper, carrot, radish, and sprout samples. Detection of the pathogenic bacteria in samples spiked with a predicted number of less than 10 CFU was possible for most produce samples, but both methods failed to detect L. monocytogenes on carrot samples and one of two mushroom and onion samples spiked with less than 100 CFU. Both BAX and the culture method were also unable to consistently recover low numbers of E. coli O157:H7 from alfalfa sprouts. The PCR method allowed detection of Salmonella Enteritidis, E. coli O157:H7, and L. monocytogenes at least 2 days earlier than the conventional culture methods.

  15. A comparison of moment-based methods of estimation for the log Pearson type 3 distribution

    NASA Astrophysics Data System (ADS)

    Koutrouvelis, I. A.; Canavos, G. C.

    2000-06-01

    The log Pearson type 3 distribution is a very important model in statistical hydrology, especially for modeling annual flood series. In this paper we compare the various methods based on moments for estimating quantiles of this distribution. Besides the methods of direct and mixed moments which were found most successful in previous studies and the well-known indirect method of moments, we develop generalized direct moments and generalized mixed moments methods and a new method of adaptive mixed moments. The last method chooses the orders of two moments for the original observations by utilizing information contained in the sample itself. The results of Monte Carlo experiments demonstrated the superiority of this method in estimating flood events of high return periods when a large sample is available and in estimating flood events of low return periods regardless of the sample size. In addition, a comparison of simulation and asymptotic results shows that the adaptive method may be used for the construction of meaningful confidence intervals for design events based on the asymptotic theory even with small samples. The simulation results also point to the specific members of the class of generalized moments estimates which maintain small values for bias and/or mean square error.
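As a concrete illustration of the indirect (log-space) method of moments discussed above, the sketch below fits mean, standard deviation, and skew to the base-10 logs of the data, then converts a standard normal quantile with the Wilson-Hilferty frequency factor. It is the textbook version, not the authors' generalized or adaptive estimators, and the data are hypothetical:

```python
import math

def sample_moments(xs):
    """Mean, standard deviation, and skew coefficient of a sample."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / (n - 1))
    g = (n / ((n - 1) * (n - 2))) * sum(((x - m) / s) ** 3 for x in xs)
    return m, s, g

def lp3_quantile(log10_data, z):
    """Quantile of the log Pearson type 3 distribution by the indirect
    method of moments: fit moments to the base-10 logs, then convert a
    standard normal quantile z with the Wilson-Hilferty frequency factor."""
    m, s, g = sample_moments(log10_data)
    if abs(g) < 1e-9:
        k = z                                  # zero skew: normal case
    else:
        k = (2.0 / g) * ((1.0 + g * z / 6.0 - g * g / 36.0) ** 3 - 1.0)
    return 10.0 ** (m + k * s)

logs = [1.0, 2.0, 3.0]                         # hypothetical log10 annual floods
median_est = lp3_quantile(logs, 0.0)           # z = 0: the median estimate
```

For a T-year flood one would pass the standard normal quantile of 1 - 1/T (e.g. z ≈ 2.326 for T = 100).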

  16. [Identification of spill oil species based on low concentration synchronous fluorescence spectra and RBF neural network].

    PubMed

    Liu, Qian-qian; Wang, Chun-yan; Shi, Xiao-feng; Li, Wen-dong; Luan, Xiao-ning; Hou, Shi-lin; Zhang, Jin-liang; Zheng, Rong-er

    2012-04-01

    In this paper, a new method was developed to differentiate spill oil samples. Synchronous fluorescence spectra in the lower, nonlinear concentration range of 10^-2 to 10^-1 g·L^-1 were collected to build the training database. A radial basis function artificial neural network (RBF-ANN) was used to identify the sample sets, with principal component analysis (PCA) as the feature extraction method. The recognition rate for closely related oil source samples is 92%. All the results demonstrated that the proposed method could identify crude oil samples effectively from just one synchronous spectrum of the spill oil sample. The method is considered well suited to real-time spill oil identification and can also be readily applied to oil logging and the analysis of other multi-PAH or multi-fluorescent mixtures.
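A minimal sketch of the classification step, assuming spectra have already been reduced to low-dimensional PCA scores. The prototype centers, labels, and gamma are hypothetical, and a trained RBF-ANN learns weighted combinations of basis functions rather than this nearest-prototype shortcut:

```python
import math

def rbf_activations(x, centers, gamma):
    """Gaussian radial basis activations exp(-gamma * ||x - c||^2)."""
    return [math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, c)))
            for c in centers]

def classify(x, centers, labels, gamma=1.0):
    """Assign the label of the prototype with the strongest activation."""
    acts = rbf_activations(x, centers, gamma)
    return labels[max(range(len(acts)), key=acts.__getitem__)]

# Hypothetical 2-D PCA scores of synchronous fluorescence spectra,
# one prototype center per candidate oil source
centers = [(0.0, 0.0), (3.0, 3.0)]
labels = ["crude_A", "crude_B"]
pred_a = classify((0.4, -0.2), centers, labels)
pred_b = classify((2.8, 3.1), centers, labels)
```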

  17. A New Method for Estimating Bacterial Abundances in Natural Samples using Sublimation

    NASA Technical Reports Server (NTRS)

    Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.

    2004-01-01

    We have developed a new method based on the sublimation of adenine from Escherichia coli to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples, including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert, were heated to a temperature of 500 °C for several seconds under reduced pressure. The sublimate was collected on a cold finger, and the amount of adenine released from the samples was then determined by high performance liquid chromatography (HPLC) with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10^5 to 10^9 E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI staining. The simplicity and robustness of the sublimation technique compared to the DAPI staining method make this approach particularly attractive for use by spacecraft instrumentation. NASA is currently planning to send a lander to Mars in 2009 in order to assess whether or not organic compounds, especially those that might be associated with life, are present in Martian surface samples. Based on our analyses of the Atacama Desert soil samples, several million bacterial cells per gram of Martian soil should be detectable using this sublimation technique.
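The conversion from recovered adenine to cell equivalents is simple arithmetic. In this sketch the per-cell adenine content is a hypothetical calibration constant standing in for the factor the assay would measure from pure E. coli, not a value from the paper:

```python
def cell_equivalents(adenine_ng_per_g, adenine_fg_per_cell=60.0):
    """Convert recovered adenine (ng per gram of sample) to E. coli cell
    equivalents per gram. adenine_fg_per_cell is a hypothetical
    calibration constant (fg of adenine per cell); 1 ng = 1e6 fg."""
    return adenine_ng_per_g * 1.0e6 / adenine_fg_per_cell

count = cell_equivalents(6.0)    # 6 ng/g recovered -> 1e5 cell equivalents/g
```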

  18. A two-step electrodialysis method for DNA purification from polluted metallic environmental samples.

    PubMed

    Rodríguez-Mejía, José Luis; Martínez-Anaya, Claudia; Folch-Mallol, Jorge Luis; Dantán-González, Edgar

    2008-08-01

    Extracting DNA from samples of polluted environments using standard methods often results in low yields of poor-quality material unsuited to subsequent manipulation and analysis by molecular biological techniques. Here, we report a novel two-step electrodialysis-based method for the extraction of DNA from environmental samples. This technique permits the rapid and efficient isolation of high-quality DNA based on its acidic nature, and without the requirement for phenol-chloroform-isoamyl alcohol cleanup and ethanol precipitation steps. Subsequent PCR, endonuclease restriction, and cloning reactions were successfully performed utilizing DNA obtained by electrodialysis, whereas some or all of these techniques failed using DNA extracted with two alternative methods. We also show that this technique is applicable to purifying DNA from a range of polluted and nonpolluted samples.

  19. Sampling methods, dispersion patterns, and fixed precision sequential sampling plans for western flower thrips (Thysanoptera: Thripidae) and cotton fleahoppers (Hemiptera: Miridae) in cotton.

    PubMed

    Parajulee, M N; Shrestha, R B; Leser, J F

    2006-04-01

    A 2-yr field study was conducted to examine the effectiveness of two sampling methods (visual and plant washing techniques) for western flower thrips, Frankliniella occidentalis (Pergande), and five sampling methods (visual, beat bucket, drop cloth, sweep net, and vacuum) for cotton fleahopper, Pseudatomoscelis seriatus (Reuter), in Texas cotton, Gossypium hirsutum (L.), and to develop sequential sampling plans for each pest. The plant washing technique gave similar results to the visual method in detecting adult thrips, but the washing technique detected a significantly higher number of thrips larvae compared with visual sampling. Visual sampling detected the highest number of fleahoppers, followed by beat bucket, drop cloth, vacuum, and sweep net sampling, with no significant difference in catch efficiency between the vacuum and sweep net methods. However, based on fixed precision cost reliability, sweep net sampling was the most cost-effective method, followed by vacuum, beat bucket, drop cloth, and visual sampling. Taylor's Power Law analysis revealed that the field dispersion patterns of both thrips and fleahoppers were aggregated throughout the crop growing season. For thrips management decisions based on visual sampling (0.25 precision), 15 plants were estimated to be the minimum sample size when the estimated population density was one thrips per plant, whereas the minimum sample size was nine plants when thrips density approached 10 thrips per plant. The minimum visual sample size for cotton fleahoppers was 16 plants when the density was one fleahopper per plant, but the sample size decreased rapidly with an increase in fleahopper density, requiring only four plants to be sampled when the density was 10 fleahoppers per plant. Sequential sampling plans were developed and validated with independent data for both thrips and cotton fleahoppers.
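For the fixed-precision plans mentioned above, a common formulation (Green's plan) derives the minimum sample size directly from the Taylor's power law parameters. The coefficients below are hypothetical, not the fitted cotton values:

```python
import math

def min_sample_size(mean_density, a, b, precision):
    """Minimum sample size n for a fixed-precision (D) sampling plan
    under Taylor's power law, variance = a * mean**b:
        n = a * mean**(b - 2) / D**2, rounded up (Green's plan)."""
    return math.ceil(a * mean_density ** (b - 2.0) / precision ** 2)

# Hypothetical aggregation parameters (b > 1 indicates aggregated dispersion)
n_low = min_sample_size(1.0, 3.2, 1.5, 0.25)    # density: 1 insect per plant
n_high = min_sample_size(10.0, 3.2, 1.5, 0.25)  # density: 10 insects per plant
```

Because b < 2, the required sample size falls as density rises, which matches the pattern reported for both pests.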

  20. Using pseudoalignment and base quality to accurately quantify microbial community composition

    PubMed Central

    Novembre, John

    2018-01-01

    Pooled DNA from multiple unknown organisms arises in a variety of contexts, for example microbial samples from ecological or human health research. Determining the composition of pooled samples can be difficult, especially at the scale of modern sequencing data and reference databases. Here we propose a novel method for taxonomic profiling in pooled DNA that combines the speed and low-memory requirements of k-mer based pseudoalignment with a likelihood framework that uses base quality information to better resolve multiply mapped reads. We apply the method to the problem of classifying 16S rRNA reads using a reference database of known organisms, a common challenge in microbiome research. Using simulations, we show the method is accurate across a variety of read lengths, with different length reference sequences, at different sample depths, and when samples contain reads originating from organisms absent from the reference. We also assess performance in real 16S data, where we reanalyze previous genetic association data to show our method discovers a larger number of quantitative trait associations than other widely used methods. We implement our method in the software Karp, for k-mer based analysis of read pools, to provide a novel combination of speed and accuracy that is uniquely suited for enhancing discoveries in microbial studies. PMID:29659582
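The base-quality likelihood idea can be sketched as a per-base error model over candidate references. The sequences and Phred scores below are hypothetical toy data; the published method combines this kind of likelihood with k-mer pseudoalignment rather than scoring every reference directly:

```python
def read_likelihood(read, quals, ref):
    """P(read | ref) under a per-base error model: a base matches its
    reference with probability 1 - e and miscalls to a specific other
    base with probability e / 3, where e = 10**(-Q/10) from the Phred
    quality Q."""
    p = 1.0
    for base, q, r in zip(read, quals, ref):
        e = 10.0 ** (-q / 10.0)
        p *= (1.0 - e) if base == r else e / 3.0
    return p

def classify_read(read, quals, refs):
    """Posterior over candidate references for a multiply-mapped read
    (uniform prior): low-quality mismatches are forgiven, high-quality
    mismatches are decisive."""
    scores = {name: read_likelihood(read, quals, seq) for name, seq in refs.items()}
    total = sum(scores.values())
    return {name: s / total for name, s in scores.items()}

refs = {"taxonA": "ACGT", "taxonB": "ACGA"}   # hypothetical 16S fragments
post = classify_read("ACGT", [30, 30, 30, 10], refs)
```

Here the only disagreement sits on a low-quality base, so the read is assigned mostly, but not entirely, to taxonA instead of being split evenly.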

  1. Comparative Evaluation of Vacuum-based Surface Sampling ...

    EPA Pesticide Factsheets

    Following a biological contamination incident, collection of surface samples is necessary to determine the extent and level of contamination, and to deem an area safe for reentry upon decontamination. Current sampling strategies targeting Bacillus anthracis spores prescribe vacuum-based methods for rough and/or porous surfaces. In this study, four commonly used B. anthracis spore sampling devices (vacuum socks, 37 mm 0.8 µm MCE filter cassettes, 37 mm 0.3 µm PTFE filter cassettes, and 3M™ forensic filters) were comparatively evaluated for their ability to recover surface-associated spores. The vacuum sock device was evaluated at two sampling speeds (slow and fast), resulting in five total methods evaluated. Aerosolized spores (~10^5 cm^-2) of a surrogate Bacillus species (Bacillus atrophaeus) were allowed to settle onto three material types (concrete, carpet, and upholstery). Ten replicate samples were collected using each vacuum method, from each of the three material types. In addition, stainless steel (i.e., nonporous) surfaces inoculated simultaneously were sampled with pre-moistened wipes. Recoveries from wipes of steel surfaces were utilized to verify the inoculum, and to normalize vacuum-based recoveries across trials. Recovery (CFU cm^-2) and relative recovery (vacuum recovery/wipe recovery) were determined for each method and material type. Relative recoveries were compared by one-way and three-way ANOVA. Data analysis by one-

  2. Recommended Mass Spectrometry-Based Strategies to Identify Ricin-Containing Samples.

    PubMed

    Kalb, Suzanne R; Schieltz, David M; Becher, François; Astot, Crister; Fredriksson, Sten-Åke; Barr, John R

    2015-11-25

    Ricin is a protein toxin produced by the castor bean plant (Ricinus communis) together with a related protein known as R. communis agglutinin (RCA120). Mass spectrometric (MS) assays have the capacity to unambiguously identify ricin and to detect ricin's activity in samples with complex matrices. These qualitative and quantitative assays enable detection and differentiation of ricin from the less toxic RCA120 through determination of the amino acid sequence of the protein in question, and active ricin can be monitored by MS as the release of adenine from the depurination of a nucleic acid substrate. In this work, we describe the application of MS-based methods to detect, differentiate and quantify ricin and RCA120 in nine blinded samples supplied as part of the EQuATox proficiency test. Overall, MS-based assays successfully identified all samples containing ricin or RCA120 with the exception of the sample spiked with the lowest concentration (0.414 ng/mL). In fact, mass spectrometry was the most successful method for differentiation of ricin and RCA120 based on amino acid determination. Mass spectrometric methods were also successful at ranking the functional activities of the samples, successfully yielding semi-quantitative results. These results indicate that MS-based assays are excellent techniques to detect, differentiate, and quantify ricin and RCA120 in complex matrices.

  3. A Two-Step Approach to Uncertainty Quantification of Core Simulators

    DOE PAGES

    Yankov, Artem; Collins, Benjamin; Klein, Markus; ...

    2012-01-01

    For the multiple sources of error introduced into the standard computational regime for simulating reactor cores, rigorous uncertainty analysis methods are available primarily to quantify the effects of cross section uncertainties. Two methods for propagating cross section uncertainties through core simulators are the XSUSA statistical approach and the “two-step” method. The XSUSA approach, which is based on the SUSA code package, is fundamentally a stochastic sampling method. Alternatively, the two-step method utilizes generalized perturbation theory in the first step and stochastic sampling in the second step. The consistency of these two methods in quantifying uncertainties in the multiplication factor and in the core power distribution was examined in the framework of phase I-3 of the OECD Uncertainty Analysis in Modeling benchmark. With the Three Mile Island Unit 1 core as a base model for analysis, the XSUSA and two-step methods were applied with certain limitations, and the results were compared to those produced by other stochastic sampling-based codes. Based on the uncertainty analysis results, conclusions were drawn as to the method that is currently more viable for computing uncertainties in burnup and transient calculations.
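The stochastic sampling side of this comparison can be sketched generically: draw perturbed cross sections from their uncertainty distributions, run the model for each draw, and read off the spread of the output. The one-group k-eff formula and the uncertainty magnitudes below are hypothetical stand-ins for a real core simulator and covariance data:

```python
import random
import statistics

def k_eff(sigma_f, sigma_a, nu=2.43):
    """Toy one-group multiplication factor: neutron production over
    absorption. A stand-in for a core simulator, not a reactor model."""
    return nu * sigma_f / sigma_a

def propagate(n_samples=500, seed=1):
    """Stochastic-sampling uncertainty propagation: sample the cross
    sections from (hypothetical) normal uncertainty distributions,
    evaluate the model for each draw, and report the output spread."""
    rng = random.Random(seed)
    ks = []
    for _ in range(n_samples):
        sigma_f = rng.gauss(0.05, 0.05 * 0.02)   # 2% relative uncertainty
        sigma_a = rng.gauss(0.12, 0.12 * 0.01)   # 1% relative uncertainty
        ks.append(k_eff(sigma_f, sigma_a))
    return statistics.mean(ks), statistics.stdev(ks)

mean_k, std_k = propagate()
```

The two-step alternative would replace the repeated model runs with sensitivity coefficients from generalized perturbation theory, sampling only in the second step.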

  4. Kinetic quantitation of cerebral PET-FDG studies without concurrent blood sampling: statistical recovery of the arterial input function.

    PubMed

    O'Sullivan, F; Kirrane, J; Muzi, M; O'Sullivan, J N; Spence, A M; Mankoff, D A; Krohn, K A

    2010-03-01

    Kinetic quantitation of dynamic positron emission tomography (PET) studies via compartmental modeling usually requires the time-course of the radio-tracer concentration in the arterial blood as an arterial input function (AIF). For human and animal imaging applications, significant practical difficulties are associated with direct arterial sampling, and as a result there is substantial interest in alternative methods that require no blood sampling at the time of the study. A fixed population template input function derived from prior experience with directly sampled arterial curves is one possibility. Image-based extraction, including requisite adjustment for spillover and recovery, is another approach. The present work considers a hybrid statistical approach based on a penalty formulation in which the information derived from a priori studies is combined in a Bayesian manner with information contained in the sampled image data in order to obtain an input function estimate. The absolute scaling of the input is achieved by an empirical calibration equation involving the injected dose together with the subject's weight, height and gender. The technique is illustrated in the context of 18F-fluorodeoxyglucose (FDG) PET studies in humans. A collection of 79 arterially sampled FDG blood curves is used as a basis for a priori characterization of input function variability, including scaling characteristics. Data from a series of 12 dynamic cerebral FDG PET studies in normal subjects are used to evaluate the performance of the penalty-based AIF estimation technique. The focus of evaluations is on quantitation of FDG kinetics over a set of 10 regional brain structures. As well as the new method, a fixed population template AIF and a direct AIF estimate based on segmentation are also considered. Kinetics analyses resulting from these three AIFs are compared with those resulting from arterially sampled AIFs.
The proposed penalty-based AIF extraction method is found to achieve significant improvements over the fixed template and the segmentation methods. As well as achieving acceptable kinetic parameter accuracy, the quality of fit of the region of interest (ROI) time-course data based on the extracted AIF, matches results based on arterially sampled AIFs. In comparison, significant deviation in the estimation of FDG flux and degradation in ROI data fit are found with the template and segmentation methods. The proposed AIF extraction method is recommended for practical use.
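The penalty formulation can be illustrated with its simplest special case: a quadratic penalty pulling the estimate toward a population template, which has a closed-form pointwise solution. This is a sketch of the idea only, not the paper's full Bayesian machinery (which also calibrates absolute scale from dose, weight, height, and gender):

```python
def penalized_aif(measured, template, lam):
    """Quadratic-penalty blend of an image-derived curve with a population
    template: minimize sum((x_t - m_t)**2) + lam * sum((x_t - p_t)**2),
    whose closed-form solution is a pointwise weighted average.
    lam -> 0 trusts the image data; lam -> infinity returns the template."""
    return [(m + lam * p) / (1.0 + lam) for m, p in zip(measured, template)]

blended = penalized_aif([2.0, 4.0], [4.0, 8.0], 1.0)   # midpoint at lam = 1
```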

  5. Rapid prediction of ochratoxin A-producing strains of Penicillium on dry-cured meat by MOS-based electronic nose.

    PubMed

    Lippolis, Vincenzo; Ferrara, Massimo; Cervellieri, Salvatore; Damascelli, Anna; Epifani, Filomena; Pascale, Michelangelo; Perrone, Giancarlo

    2016-02-02

    The availability of rapid diagnostic methods for monitoring ochratoxigenic species during the seasoning processes for dry-cured meats is crucial and constitutes a key stage in preventing the risk of ochratoxin A (OTA) contamination. A rapid, easy-to-perform and non-invasive method using an electronic nose (e-nose) based on metal oxide semiconductors (MOS) was developed to discriminate dry-cured meat samples into two classes based on fungal contamination: class P (samples contaminated by OTA-producing Penicillium strains) and class NP (samples contaminated by OTA non-producing Penicillium strains). Two OTA-producing strains of Penicillium nordicum and two OTA non-producing strains, of Penicillium nalgiovense and Penicillium salamii, were tested. The feasibility of this approach was initially evaluated by e-nose analysis of 480 samples of both yeast extract sucrose (YES) and meat-based agar media inoculated with the tested Penicillium strains and incubated for up to 14 days. The high recognition percentages (higher than 82%) obtained by Discriminant Function Analysis (DFA), both in calibration and in cross-validation (leave-more-out approach), for both YES and meat-based samples demonstrated the validity of the approach. The e-nose method was subsequently developed and validated for the analysis of dry-cured meat samples. A total of 240 e-nose analyses were carried out using inoculated sausages, seasoned by a laboratory-scale process and sampled at 5, 7, 10 and 14 days. DFA provided calibration models that permitted discrimination of dry-cured meat samples after only 5 days of seasoning, with mean recognition percentages in calibration and cross-validation of 98 and 88%, respectively. A further validation of the developed e-nose method was performed using 60 dry-cured meat samples produced by an industrial-scale seasoning process, showing a total recognition percentage of 73%.
The pattern of volatile compounds of dry-cured meat samples was identified and characterized by a developed HS-SPME/GC-MS method. Seven volatile compounds (2-methyl-1-butanol, octane, 1R-α-pinene, d-limonene, undecane, tetradecanal, 9-(Z)-octadecenoic acid methyl ester) allowed discrimination between dry-cured meat samples of classes P and NP. These results demonstrate that MOS-based electronic nose can be a useful tool for a rapid screening in preventing OTA contamination in the cured meat supply chain. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. New method for estimating bacterial cell abundances in natural samples by use of sublimation

    NASA Technical Reports Server (NTRS)

    Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.

    2004-01-01

    We have developed a new method based on the sublimation of adenine from Escherichia coli to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples, including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert, were heated to a temperature of 500 degrees C for several seconds under reduced pressure. The sublimate was collected on a cold finger, and the amount of adenine released from the samples was then determined by high-performance liquid chromatography with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10^5 to 10^9 E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI (4,6-diamidino-2-phenylindole) staining.

  7. Active learning based segmentation of Crohn's disease from abdominal MRI.

    PubMed

    Mahapatra, Dwarikanath; Vos, Franciscus M; Buhmann, Joachim M

    2016-05-01

    This paper proposes a novel active learning (AL) framework, and combines it with semi-supervised learning (SSL), for segmenting Crohn's disease (CD) tissues from abdominal magnetic resonance (MR) images. Robust fully supervised learning (FSL) based classifiers require large amounts of labeled data covering different disease severities. Obtaining such data is time consuming and requires considerable expertise. SSL methods use a few labeled samples and leverage the information from many unlabeled samples to train an accurate classifier. AL queries the labels of the most informative samples and maximizes the gain from the labeling effort. Our primary contribution is in designing a query strategy that combines novel context information with classification uncertainty and feature similarity. Combining SSL and AL gives a robust segmentation method that: (1) optimally uses few labeled samples and many unlabeled samples; and (2) requires lower training time. Experimental results show our method achieves higher segmentation accuracy than FSL methods with fewer samples and reduced training effort. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
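A hypothetical query-scoring function in the spirit of the strategy described above, combining entropy-based uncertainty with a redundancy discount for samples similar to already-labeled ones. The scalar features and the weighting are illustrative, not the published context term:

```python
import math

def entropy(probs):
    """Predictive entropy: the classifier's uncertainty about a sample."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def query_scores(unlabeled, labeled_feats, alpha=0.5):
    """Hypothetical AL query score: classifier uncertainty combined with
    a discount for similarity to already-labeled samples, so queries are
    both informative and non-redundant."""
    scores = []
    for probs, feat in unlabeled:
        u = entropy(probs)
        sim = max((1.0 / (1.0 + abs(feat - f)) for f in labeled_feats),
                  default=0.0)
        scores.append(alpha * u + (1.0 - alpha) * (1.0 - sim))
    return scores

# (class probabilities, 1-D feature) pairs; the second sample is both
# confidently classified and close to an existing label, so it scores low
unlabeled = [([0.5, 0.5], 10.0), ([0.95, 0.05], 0.1)]
scores = query_scores(unlabeled, labeled_feats=[0.0])
```

The top-scoring samples would be sent for annotation, the classifier retrained, and the scores recomputed, which is the AL loop the paper combines with SSL.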

  8. Highly Effective DNA Extraction Method for Nuclear Short Tandem Repeat Testing of Skeletal Remains from Mass Graves

    PubMed Central

    Davoren, Jon; Vanek, Daniel; Konjhodzić, Rijad; Crews, John; Huffine, Edwin; Parsons, Thomas J.

    2007-01-01

    Aim: To quantitatively compare a silica extraction method with a commonly used phenol/chloroform extraction method for DNA analysis of specimens exhumed from mass graves. Methods: DNA was extracted from twenty randomly chosen femur samples using the International Commission on Missing Persons (ICMP) silica method, based on the Qiagen Blood Maxi Kit, and compared with the DNA extracted by the standard phenol/chloroform-based method. The efficacy of the extraction methods was compared by real-time polymerase chain reaction (PCR) to measure DNA quantity and the presence of inhibitors, and by amplification with the PowerPlex 16 (PP16) multiplex nuclear short tandem repeat (STR) kit. Results: DNA quantification showed that the silica-based method extracted on average 1.94 ng of DNA per gram of bone (range 0.25-9.58 ng/g), compared with only 0.68 ng/g (range 0.0016-4.4880 ng/g) extracted by the organic method. Inhibition tests showed that there were on average significantly lower levels of PCR inhibitors in DNA isolated by the organic method. When amplified with PP16, all samples extracted by the silica-based method produced full 16-locus profiles, while only 75% of the DNA extracts obtained by the organic technique yielded full 16-locus profiles. Conclusions: The silica-based extraction method showed better results in nuclear STR typing from degraded bone samples than the commonly used phenol/chloroform method. PMID:17696302

  9. “Magnitude-based Inference”: A Statistical Review

    PubMed Central

    Welsh, Alan H.; Knight, Emma J.

    2015-01-01

    Purpose: We consider “magnitude-based inference” and its interpretation by examining in detail its use in the problem of comparing two means. Methods: We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how “magnitude-based inference” is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. Results and Conclusions: We show that “magnitude-based inference” is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with “magnitude-based inference” and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable, so the sample size calculations should not be used. Rather than using “magnitude-based inference,” a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis. PMID:25051387
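The review's central observation can be reproduced directly: the "chances" reported by magnitude-based inference are tail probabilities of a sampling distribution centred on the observed effect, i.e. one-sided P values for null hypotheses shifted to the smallest-worthwhile-effect thresholds. A minimal sketch, using a normal approximation rather than the spreadsheet's t-based version:

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def mbi_chances(effect, se, threshold):
    """The 'chances' of benefit / triviality / harm reported by
    magnitude-based inference, as tail probabilities of a normal
    sampling distribution centred on the observed effect. Each tail is
    exactly a one-sided P value for a null hypothesis shifted to
    +threshold or -threshold."""
    p_benefit = 1.0 - norm_cdf((threshold - effect) / se)
    p_harm = norm_cdf((-threshold - effect) / se)
    return p_benefit, 1.0 - p_benefit - p_harm, p_harm

p_pos, p_triv, p_neg = mbi_chances(0.0, 1.0, 0.2)   # null effect, unit SE
```

With a null observed effect the benefit and harm "chances" are symmetric, making explicit that they are functions of ordinary test statistics rather than new inferential quantities.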

  10. Detection of periodontopathogenic bacteria in pregnant women by traditional anaerobic culture method and by a commercial molecular genetic method.

    PubMed

    Urbán, Edit; Terhes, Gabriella; Radnai, Márta; Gorzó, István; Nagy, Elisabeth

    2010-06-01

    Culture of facultative and strict anaerobic bacteria is a well-established method for analyzing subgingival plaque samples. Micro-IDent and micro-IDent Plus (HAIN Lifescience GmbH, Nehren, Germany) are two commercially available rapid PCR-based tests for the identification and quantification of putative periodontopathogenic bacteria. In this study, we compared these commercial PCR-based hybridization methods with the conventional anaerobic culture technique. A total of 36 subgingival plaque samples were collected from periodontal pockets of pregnant women with chronic localized periodontitis. Aliquots of these samples were evaluated with the species-specific probes provided by the micro-IDent and micro-IDent Plus tests, and from the same samples anaerobic and capnophilic bacteria were cultured on selective media. The overall agreement between the two methods was excellent for Eubacterium nodatum, Tannerella forsythia and Porphyromonas gingivalis (92-97%), fair for Capnocytophaga sp., Eikenella corrodens, Actinobacillus actinomycetemcomitans, and Prevotella intermedia (89-91%), and poor for Fusobacterium nucleatum, Parvimonas micra (Micromonas micros), and Campylobacter rectus (78-86%). Discrepancies in the results may be explained by the inability of the culture method to distinguish between closely related taxa (e.g., P. intermedia/P. nigrescens) and by the difficulty of keeping periodontopathogenic bacteria viable, which is required for successful detection by the standard culture method. Nucleic acid-based methods may replace cultivation in the microbiological diagnosis of progressive periodontitis; thus, the micro-IDent and micro-IDent Plus tests can be recommended where culture of periodontopathogenic bacteria from subgingival plaque samples is not performed in routine microbiology laboratories. 2010 Elsevier Ltd. All rights reserved.

  11. Accurate low-cost methods for performance evaluation of cache memory systems

    NASA Technical Reports Server (NTRS)

    Laha, Subhasis; Patel, Janak H.; Iyer, Ravishankar K.

    1988-01-01

    Methods of simulation based on statistical techniques are proposed to decrease the need for large trace measurements and for predicting true program behavior. Sampling techniques are applied while the address trace is collected from a workload. This drastically reduces the space and time needed to collect the trace. Simulation techniques are developed to use the sampled data not only to predict the mean miss rate of the cache, but also to provide an empirical estimate of its actual distribution. Finally, a concept of primed cache is introduced to simulate large caches by the sampling-based method.
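
    The approach can be illustrated with a minimal sketch (not the authors' simulator): intervals are drawn at random from a synthetic address trace, each interval is run through a small direct-mapped cache model, and the per-interval miss rates yield both a mean estimate and an empirical distribution. The cache model, trace generator, and interval length below are all hypothetical.

```python
import random

def simulate_cache(trace, num_sets, block_size=16):
    """Run an address trace through a small direct-mapped cache; return misses."""
    cache = {}
    misses = 0
    for addr in trace:
        block = addr // block_size
        s = block % num_sets
        if cache.get(s) != block:   # tag mismatch or empty set -> miss
            cache[s] = block
            misses += 1
    return misses

def sampled_miss_rates(trace, interval_len, num_samples, num_sets, seed=0):
    """Estimate the miss-rate distribution from randomly sampled trace intervals."""
    rng = random.Random(seed)
    rates = []
    for _ in range(num_samples):
        start = rng.randrange(0, len(trace) - interval_len)
        interval = trace[start:start + interval_len]
        rates.append(simulate_cache(interval, num_sets) / interval_len)
    return rates

# synthetic address trace with bursts of spatial locality
gen = random.Random(1)
trace, base = [], 0
for _ in range(20000):
    if gen.random() < 0.1:
        base = gen.randrange(0, 1 << 20)   # jump to a new memory region
    trace.append(base + gen.randrange(0, 256))

rates = sampled_miss_rates(trace, interval_len=1000, num_samples=30, num_sets=64)
mean_rate = sum(rates) / len(rates)
```

    Note that each sampled interval here starts from an empty cache, so the estimates carry exactly the cold-start bias that the primed-cache concept in the abstract is meant to remove for large caches.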

  12. Validation and Comparison of Two Sampling Methods to Assess Dermal Exposure to Drilling Fluids and Crude Oil

    PubMed Central

    Galea, Karen S.; McGonagle, Carolyn; Sleeuwenhoek, Anne; Todd, David; Jiménez, Araceli Sánchez

    2014-01-01

    Dermal exposure to drilling fluids and crude oil is an exposure route of concern. However, there have been no published studies describing sampling methods or reporting dermal exposure measurements. We describe a study that aimed to evaluate a wipe sampling method to assess dermal exposure to an oil-based drilling fluid and crude oil, as well as to investigate the feasibility of using an interception cotton glove sampler for exposure on the hands/wrists. A direct comparison of the wipe and interception methods was also completed using pigs’ trotters as a surrogate for human skin and a direct surface contact exposure scenario. Overall, acceptable recovery and sampling efficiencies were reported for both methods, and both methods had satisfactory storage stability at 1 and 7 days, although there appeared to be some loss over 14 days. The methods’ comparison study revealed significantly higher removal of both fluids from the metal surface with the glove samples compared with the wipe samples (on average 2.5 times higher). Both evaluated sampling methods were found to be suitable for assessing dermal exposure to oil-based drilling fluids and crude oil; however, the comparison study clearly illustrates that glove samplers may overestimate the amount of fluid transferred to the skin. Further comparison of the two dermal sampling methods using additional exposure situations such as immersion or deposition, as well as a field evaluation, is warranted to confirm their appropriateness and suitability in the working environment. PMID:24598941

  13. Evaluation and recommendation of sensitivity analysis methods for application to Stochastic Human Exposure and Dose Simulation models.

    PubMed

    Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu

    2006-11-01

    Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
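
    The sampling-based screening step can be sketched as follows: run a Monte Carlo sample through the model and rank inputs by the absolute Spearman correlation of each input with the output. The toy "exposure model" and input names are hypothetical, not the SHEDS testbed.

```python
import random, math

def rank(xs):
    """Return 0-based ranks of xs (no tie handling; inputs here are continuous)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for pos, i in enumerate(order):
        r[i] = float(pos)
    return r

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = math.sqrt(sum((a - mx) ** 2 for a in x))
    vy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (vx * vy)

def spearman(x, y):
    """Rank correlation: robust to monotone non-linearity in the model."""
    return pearson(rank(x), rank(y))

# Monte Carlo sample of a toy model: y = x1**2 + 0.1*x3 (x2 is a dummy input)
rng = random.Random(0)
x1 = [rng.uniform(0, 1) for _ in range(500)]
x2 = [rng.uniform(0, 1) for _ in range(500)]
x3 = [rng.uniform(0, 1) for _ in range(500)]
y = [a * a + 0.1 * c for a, c in zip(x1, x3)]

scores = {name: abs(spearman(x, y)) for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
```

    Inputs with near-zero scores (here the dummy x2) would be dropped before running the more expensive variance-based methods.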

  14. Multidimensional scaling analysis of financial time series based on modified cross-sample entropy methods

    NASA Astrophysics Data System (ADS)

    He, Jiayi; Shang, Pengjian; Xiong, Hui

    2018-06-01

    Stocks, as a concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we use stock data to recognize patterns via a dissimilarity matrix based on modified cross-sample entropy, and then provide three-dimensional perceptual maps of the results through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper: multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparison. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to show similar irregularity than others, and that differences among stock indices, caused by country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but series generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups corresponding to five regions: Europe, North America, South America, the Asia-Pacific (with the exception of mainland China), and mainland China together with Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in these experiments than MDSC.
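
    The MDS step the authors modify accepts any dissimilarity matrix, whether built from Chebyshev distance or from a cross-sample entropy. A minimal NumPy sketch of classical (Torgerson) MDS, with a hypothetical toy matrix containing two tight clusters:

```python
import numpy as np

def classical_mds(D, k=3):
    """Classical MDS: embed n points in k dimensions from an n x n dissimilarity matrix."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                   # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]              # keep the k largest
    L = np.sqrt(np.clip(w[idx], 0, None))      # guard against small negatives
    return V[:, idx] * L

# toy dissimilarity matrix: {0,1} and {2,3} form two tight clusters
D = np.array([[0.0, 0.1, 1.0, 1.0],
              [0.1, 0.0, 1.0, 1.0],
              [1.0, 1.0, 0.0, 0.1],
              [1.0, 1.0, 0.1, 0.0]])
X = classical_mds(D, k=2)
```

    Swapping the Euclidean-style matrix for one built from cross-sample entropies is exactly the substitution MDS-KCSE and MDS-PCSE perform.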

  15. Restricted random search method based on taboo search in the multiple minima problem

    NASA Astrophysics Data System (ADS)

    Hong, Seung Do; Jhon, Mu Shik

    1997-03-01

    The restricted random search method is proposed as a simple Monte Carlo sampling method for rapidly locating minima in the multiple minima problem. The method is based on taboo search, recently applied to continuous test functions. The concept of a taboo region is used instead of a taboo list, so that sampling of a region near an old configuration is restricted. The method is applied to two-dimensional test functions and to argon clusters, and is found to be a practical and efficient way to locate near-global configurations of both.
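
    A minimal sketch of the taboo-region idea (the taboo radius, iteration count, and test function below are assumptions for illustration, not the paper's settings): candidates landing inside a ball around any previously accepted configuration are rejected, which forces the search to spread out.

```python
import random, math

def restricted_random_search(f, bounds, n_iter=2000, taboo_radius=0.2, seed=0):
    """Random search that rejects candidates inside taboo balls around visited points."""
    rng = random.Random(seed)
    taboo = []                       # centers of taboo regions (old configurations)
    best_x, best_f = None, float("inf")
    for _ in range(n_iter):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        if any(math.dist(x, t) < taboo_radius for t in taboo):
            continue                 # restricted: skip samples near old configurations
        taboo.append(x)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# 2-dimensional test function with its minimum at (1, 2)
f = lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2
x_best, f_best = restricted_random_search(f, [(-5, 5), (-5, 5)])
```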

  16. Multistage point relascope and randomized branch sampling for downed coarse woody debris estimation

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine

    2002-01-01

    New sampling methods have recently been introduced that allow estimation of downed coarse woody debris using an angle gauge, or relascope. The theory behind these methods is based on sampling straight pieces of downed coarse woody debris. When pieces deviate from this ideal situation, auxiliary methods must be employed. We describe a two-stage procedure where the...

  17. Piecewise SALT sampling for estimating suspended sediment yields

    Treesearch

    Robert B. Thomas

    1989-01-01

    A probability sampling method called SALT (Selection At List Time) has been developed for collecting and summarizing data on delivery of suspended sediment in rivers. It is based on sampling and estimating yield using a suspended-sediment rating curve for high discharges and simple random sampling for low flows. The method gives unbiased estimates of total yield and...
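
    The variable-probability component of such a scheme can be illustrated with a generic Hansen-Hurwitz PPS (probability-proportional-to-size) estimator, which is unbiased for the total when selection probabilities track the predicted yields. This is a hedged illustration of the sampling principle only, not the SALT algorithm; the "rating curve" proxy and data are hypothetical.

```python
import random

def pps_total_estimate(values, size_measure, n, seed=0):
    """Hansen-Hurwitz estimator: draw n units with replacement, with probability
    proportional to a predicted size, and estimate the population total."""
    preds = [size_measure(v) for v in values]
    s = sum(preds)
    probs = [p / s for p in preds]
    rng = random.Random(seed)
    draws = rng.choices(range(len(values)), weights=probs, k=n)
    # each draw contributes its value inflated by its selection probability
    return sum(values[i] / probs[i] for i in draws) / n

# hypothetical daily sediment yields and a deterministic rating-curve proxy
gen = random.Random(1)
yields = [gen.uniform(10.0, 100.0) for _ in range(200)]
rating = lambda y: 0.5 * y ** 1.1    # nearly proportional to the true yield
true_total = sum(yields)
est = pps_total_estimate(yields, rating, n=100)
```

    The closer the size measure tracks the true yields, the smaller the variance of the estimate, which is why a good rating curve matters for the high-discharge stratum.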

  18. Sorbent-based sampling methods for volatile and semi-volatile organic compounds in air Part 1: Sorbent-based air monitoring options.

    PubMed

    Woolfenden, Elizabeth

    2010-04-16

    Sorbent tubes/traps are widely used in combination with gas chromatographic (GC) analytical methods to monitor the vapour-phase fraction of organic compounds in air. Target compounds range in volatility from acetylene and freons to phthalates and PCBs and include apolar, polar and reactive species. Airborne vapour concentrations will vary depending on the nature of the location, nearby pollution sources, weather conditions, etc. Levels can range from low percent concentrations in stack and vent emissions to low part per trillion (ppt) levels in ultra-clean outdoor locations. Hundreds, even thousands of different compounds may be present in any given atmosphere. GC is commonly used in combination with mass spectrometry (MS) detection, especially for environmental monitoring or for screening uncharacterised workplace atmospheres. Given the complexity and variability of organic vapours in air, no one sampling approach suits every monitoring scenario. A variety of different sampling strategies and sorbent media have been developed to address specific applications. Key sorbent-based examples include: active (pumped) sampling onto tubes packed with one or more sorbents held at ambient temperature; diffusive (passive) sampling onto sorbent tubes/cartridges; on-line sampling of air/gas streams into cooled sorbent traps; and transfer of air samples from containers (canisters, Tedlar bags, etc.) into cooled sorbent focusing traps. Whichever sampling approach is selected, subsequent analysis almost always involves either solvent extraction or thermal desorption (TD) prior to GC(/MS) analysis. The overall performance of the air monitoring method will depend heavily on appropriate selection of key sampling and analytical parameters. This comprehensive review of air monitoring using sorbent tubes/traps is divided into 2 parts: (1) sorbent-based air sampling options; (2) sorbent selection and other aspects of optimizing sorbent-based air monitoring methods. 
The paper presents current state-of-the-art and recent developments in relevant areas such as sorbent research, sampler design, enhanced approaches to analytical quality assurance and on-tube derivatisation. Copyright 2009 Elsevier B.V. All rights reserved.

  19. [DNA quantification of blood samples pre-treated with pyramidon].

    PubMed

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. Blood samples from ten unrelated individuals were anticoagulated in EDTA, and bloodstains were made on filter paper. Samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method, and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given extraction method, the amount of sample DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, DNA yields differed significantly among the extraction methods. Full 16-locus STR profiles were obtained for 90.56% of samples. Pyramidon pre-treatment causes DNA degradation, but effective STR typing can still be achieved within 24 h. Magnetic bead-based extraction was the best of the three methods for DNA extraction and STR profiling.

  20. General Constraints on Sampling Wildlife on FIA Plots

    Treesearch

    Larissa L. Bailey; John R. Sauer; James D. Nichols; Paul H. Geissler

    2005-01-01

    This paper reviews the constraints on sampling wildlife populations at FIA points. Wildlife sampling programs must have well-defined goals and provide information adequate to meet those goals. Investigators should choose a state variable based on information needs and the spatial sampling scale. We discuss estimation-based methods for three state variables: species...

  1. COMPARISON OF MACROINVERTEBRATE SAMPLING METHODS FOR NONWADEABLE STREAMS

    EPA Science Inventory

    The bioassessment of nonwadeable streams in the United States is increasing, but methods for these systems are not as well developed as for wadeable streams. In this study, we compared six benthic macroinvertebrate field sampling methods for nonwadeable streams based on those us...

  2. Reduced electron exposure for energy-dispersive spectroscopy using dynamic sampling

    DOE PAGES

    Zhang, Yan; Godaliyadda, G. M. Dilshan; Ferrier, Nicola; ...

    2017-10-23

    Analytical electron microscopy and spectroscopy of biological specimens, polymers, and other beam-sensitive materials has been a challenging area due to irradiation damage. There is a pressing need to develop novel imaging and spectroscopic imaging methods that will minimize such sample damage as well as reduce the data acquisition time. The latter is useful for high-throughput analysis of materials structure and chemistry. In this work, we present a novel machine-learning-based method for dynamic sparse sampling of EDS data using a scanning electron microscope. Our method, based on the supervised learning approach for dynamic sampling and neural-network-based classification of EDS data, allows a dramatic reduction in the total sampling of up to 90%, while maintaining the fidelity of the reconstructed elemental maps and spectroscopic data. We believe this approach will enable imaging and elemental mapping of materials that would otherwise be inaccessible to these analysis techniques.

  3. Measures of precision for dissimilarity-based multivariate analysis of ecological communities

    PubMed Central

    Anderson, Marti J; Santana-Garcon, Julia

    2015-01-01

    Ecological studies require key decisions regarding the appropriate size and number of sampling units. No methods currently exist to measure precision for multivariate assemblage data when dissimilarity-based analyses are intended to follow. Here, we propose a pseudo multivariate dissimilarity-based standard error (MultSE) as a useful quantity for assessing sample-size adequacy in studies of ecological communities. Based on sums of squared dissimilarities, MultSE measures variability in the position of the centroid in the space of a chosen dissimilarity measure under repeated sampling for a given sample size. We describe a novel double resampling method to quantify uncertainty in MultSE values with increasing sample size. For more complex designs, values of MultSE can be calculated from the pseudo residual mean square of a PERMANOVA model, with the double resampling done within appropriate cells in the design. R code functions for implementing these techniques, along with ecological examples, are provided. PMID:25438826
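
    Reading "based on sums of squared dissimilarities" in the usual PERMANOVA sense, a one-group MultSE can be sketched as below; this is an assumption-laden sketch (not the authors' R code), and the identity is checked against the classical standard error of the mean for Euclidean distances on 1-D data, where the two should coincide.

```python
import numpy as np

def mult_se(D):
    """Pseudo multivariate standard error from an n x n dissimilarity matrix D."""
    n = D.shape[0]
    ss = np.sum(np.triu(D, 1) ** 2) / n   # sum of squares about the centroid
    v = ss / (n - 1)                      # pseudo variance
    return np.sqrt(v / n)                 # variability of the centroid position

# sanity check: with Euclidean distances on 1-D data, MultSE equals the
# classical standard error of the mean
x = np.array([1.0, 2.0, 4.0, 7.0, 11.0])
D = np.abs(x[:, None] - x[None, :])
se_mult = mult_se(D)
se_classic = x.std(ddof=1) / np.sqrt(len(x))
```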

  4. A Novel Method to Handle the Effect of Uneven Sampling Effort in Biodiversity Databases

    PubMed Central

    Pardo, Iker; Pata, María P.; Gómez, Daniel; García, María B.

    2013-01-01

    How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have demonstrated the uncertainty related to this kind of analysis due to sampling effort bias, and the need for its quantification. Although a number of methods are available for this purpose, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent discrimination capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship, due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore it can be efficiently applied to most databases in order to enhance the reliability of biodiversity analyses. PMID:23326357

  5. A novel method to handle the effect of uneven sampling effort in biodiversity databases.

    PubMed

    Pardo, Iker; Pata, María P; Gómez, Daniel; García, María B

    2013-01-01

    How reliable are results on the spatial distribution of biodiversity based on databases? Many studies have demonstrated the uncertainty related to this kind of analysis due to sampling effort bias, and the need for its quantification. Although a number of methods are available for this purpose, little is known about their statistical limitations and discrimination capability, which could seriously constrain their use. We assess for the first time the discrimination capacity of two widely used methods and a proposed new one (FIDEGAM), all based on species accumulation curves, under different scenarios of sampling exhaustiveness using Receiver Operating Characteristic (ROC) analyses. Additionally, we examine to what extent the output of each method represents the sampling completeness in a simulated scenario where the true species richness is known. Finally, we apply FIDEGAM to a real situation and explore the spatial patterns of plant diversity in a National Park. FIDEGAM showed an excellent discrimination capability to distinguish between well and poorly sampled areas regardless of sampling exhaustiveness, whereas the other methods failed. Accordingly, FIDEGAM values were strongly correlated with the true percentage of species detected in a simulated scenario, whereas sampling completeness estimated with the other methods showed no relationship, due to null discrimination capability. Quantifying sampling effort is necessary to account for the uncertainty in biodiversity analyses; however, not all proposed methods are equally reliable. Our comparative analysis demonstrated that FIDEGAM was the most accurate discriminator method in all scenarios of sampling exhaustiveness, and therefore it can be efficiently applied to most databases in order to enhance the reliability of biodiversity analyses.

  6. Deterministic multidimensional nonuniform gap sampling.

    PubMed

    Worley, Bradley; Powers, Robert

    2015-12-01

    Born from empirical observations in nonuniformly sampled multidimensional NMR data relating to gaps between sampled points, the Poisson-gap sampling method has enjoyed widespread use in biomolecular NMR. While the majority of nonuniform sampling schemes are fully randomly drawn from probability densities that vary over a Nyquist grid, the Poisson-gap scheme employs constrained random deviates to minimize the gaps between sampled grid points. We describe a deterministic gap sampling method, based on the average behavior of Poisson-gap sampling, which performs comparably to its random counterpart with the additional benefit of completely deterministic behavior. We also introduce a general algorithm for multidimensional nonuniform sampling based on a gap equation, and apply it to yield a deterministic sampling scheme that combines burst-mode sampling features with those of Poisson-gap schemes. Finally, we derive a relationship between stochastic gap equations and the expectation value of their sampling probability densities. Copyright © 2015 Elsevier Inc. All rights reserved.
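
    The random scheme being averaged can be sketched as follows; this follows the commonly described Poisson-gap construction (gaps drawn from a Poisson distribution with a sinusoidally modulated mean, plus an adjustment loop to hit the target count), with the weighting constant and adjustment factors being assumptions.

```python
import math, random

def poisson_gap(n_total, n_sampled, seed=0, w=2.0):
    """Poisson-gap schedule: pick n_sampled of n_total grid points, with gaps
    drawn from a Poisson law whose mean grows sinusoidally along the grid."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's method; adequate for the small means used here
        limit, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    adj = n_total / n_sampled - 1.0          # initial mean gap size, tuned below
    for _ in range(100000):
        points, i = [], 0
        while i < n_total:
            points.append(i)
            i += 1 + poisson(adj * math.sin((i + 0.5) / n_total * math.pi / w))
        if len(points) == n_sampled:
            return points
        # too many points -> widen gaps; too few -> narrow them
        adj *= 1.02 if len(points) > n_sampled else 0.98
    raise RuntimeError("schedule did not converge")

# 25% sampling of a 128-point Nyquist grid
sched = poisson_gap(128, 32)
```

    The deterministic variant in the abstract replaces the Poisson draw with its expected gap, so every run yields the same schedule.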

  7. Using ancestry matching to combine family-based and unrelated samples for genome-wide association studies‡

    PubMed Central

    Crossett, Andrew; Kent, Brian P.; Klei, Lambertus; Ringquist, Steven; Trucco, Massimo; Roeder, Kathryn; Devlin, Bernie

    2015-01-01

    We propose a method to analyze family-based samples together with unrelated cases and controls. The method builds on the idea of matched case–control analysis using conditional logistic regression (CLR). For each trio within the family, a case (the proband) and matched pseudo-controls are constructed, based upon the transmitted and untransmitted alleles. Unrelated controls, matched by genetic ancestry, supplement the sample of pseudo-controls; likewise, unrelated cases are paired with genetically matched controls. Within each matched stratum, the case genotype is contrasted with control and pseudo-control genotypes via CLR, using a method we call matched-CLR (mCLR). Eigenanalysis of numerous SNP genotypes provides a tool for mapping genetic ancestry. The result of such an analysis can be thought of as a multidimensional map, or eigenmap, in which the relative genetic similarities and differences amongst individuals are encoded. Once constructed, new individuals can be projected onto the ancestry map based on their genotypes. Successful differentiation of individuals of distinct ancestry depends on having a diverse, yet representative, sample from which to construct the ancestry map. Once samples are well matched, mCLR yields power comparable to competing methods while ensuring excellent control over Type I error. PMID:20862653
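
    The eigenmap construction and the projection of new individuals can be sketched with a plain SVD of the centered genotype matrix; the allele frequencies, cohort sizes, and function names below are hypothetical illustrations, not the authors' pipeline.

```python
import numpy as np

def ancestry_map(G, k=2):
    """Eigenmap from a genotype matrix G (individuals x SNPs, coded 0/1/2):
    returns the SNP means and the top-k right singular vectors as a basis."""
    mu = G.mean(axis=0)
    _, _, Vt = np.linalg.svd(G - mu, full_matrices=False)
    return mu, Vt[:k].T

def project(g, mu, V):
    """Place genotypes g (one row per individual) onto the existing map."""
    return (g - mu) @ V

rng = np.random.default_rng(0)
# two hypothetical ancestral groups with different allele frequencies
freqs_a = rng.uniform(0.1, 0.4, 200)
freqs_b = rng.uniform(0.6, 0.9, 200)
G = np.vstack([rng.binomial(2, freqs_a, (30, 200)),
               rng.binomial(2, freqs_b, (30, 200))])
mu, V = ancestry_map(G)
coords = project(G, mu, V)   # first axis separates the two groups
```

    Matching then amounts to pairing cases with controls that are nearby in these coordinates.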

  8. Comparison of DNA preservation methods for environmental bacterial community samples

    USGS Publications Warehouse

    Gray, Michael A.; Pratte, Zoe A.; Kellogg, Christina A.

    2013-01-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard™, RNAlater®, DMSO–EDTA–salt (DESS), FTA® cards, and FTA Elute® cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA® cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard™, RNAlater®, and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost.

  9. Sample size for post-marketing safety studies based on historical controls.

    PubMed

    Wu, Yu-te; Makuch, Robert W

    2010-08-01

    As part of a drug's entire life cycle, post-marketing studies are an important part of the identification of rare, serious adverse events. Recently, the US Food and Drug Administration (FDA) has begun to implement new post-marketing safety mandates as a consequence of increased emphasis on safety. The purpose of this research is to provide an exact sample size formula for the proposed hybrid design, based on a two-group cohort study with incorporation of historical external data. An exact sample size formula based on the Poisson distribution is developed, because the detection of rare events is the outcome of interest. The performance of the exact method is compared with its approximate large-sample counterpart. The proposed hybrid design requires a smaller sample size compared to the standard two-group prospective study design. In addition, the exact method reduces the number of subjects required in the treatment group by up to 30% compared to the approximate method for the study scenarios examined. The proposed hybrid design retains the advantages and rationale of the two-group design while generally requiring smaller sample sizes. 2010 John Wiley & Sons, Ltd.
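
    The flavor of an exact Poisson sample-size calculation can be sketched with a simplified one-group, one-sided design (not the paper's hybrid-design formula; the rates, alpha, and power target below are hypothetical): find the smallest cohort size at which an exact Poisson test of the background rate achieves the desired power against an elevated rate.

```python
import math

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), via the complement of the CDF."""
    term, cdf = math.exp(-mu), 0.0
    for i in range(k):
        cdf += term
        term *= mu / (i + 1)
    return 1.0 - cdf

def exact_sample_size(rate0, rate1, alpha=0.05, power=0.8, n_max=100000):
    """Smallest cohort size n such that a one-sided exact Poisson test of
    H0: rate0, against the alternative rate1 > rate0, reaches the target power."""
    for n in range(1, n_max):
        mu0, mu1 = n * rate0, n * rate1
        c = 0                                  # smallest critical event count
        while poisson_sf(c, mu0) > alpha:      # with size <= alpha under H0
            c += 1
        if poisson_sf(c, mu1) >= power:
            return n
    raise ValueError("no n found below n_max")

# rare adverse event: background 1 per 1000 person-years vs a 3-fold excess
n = exact_sample_size(0.001, 0.003)
```

    Because the Poisson critical value moves in discrete jumps, the required n is not smooth in the rates, which is exactly why exact formulas can differ noticeably from large-sample approximations for rare events.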

  10. A new sampling scheme for developing metamodels with the zeros of Chebyshev polynomials

    NASA Astrophysics Data System (ADS)

    Wu, Jinglai; Luo, Zhen; Zhang, Nong; Zhang, Yunqing

    2015-09-01

    The accuracy of metamodelling is determined by both the sampling and approximation. This article proposes a new sampling method based on the zeros of Chebyshev polynomials to capture the sampling information effectively. First, the zeros of one-dimensional Chebyshev polynomials are applied to construct Chebyshev tensor product (CTP) sampling, and the CTP is then used to construct high-order multi-dimensional metamodels using the 'hypercube' polynomials. Secondly, the CTP sampling is further enhanced to develop Chebyshev collocation method (CCM) sampling, to construct the 'simplex' polynomials. The samples of CCM are randomly and directly chosen from the CTP samples. Two widely studied sampling methods, namely the Smolyak sparse grid and Hammersley, are used to demonstrate the effectiveness of the proposed sampling method. Several numerical examples are utilized to validate the approximation accuracy of the proposed metamodel under different dimensions.
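
    The CTP construction can be sketched directly from the Chebyshev zeros: take the roots of the degree-m Chebyshev polynomial on each axis and form their full tensor product (the degree and dimension below are arbitrary illustrative choices).

```python
import math
from itertools import product

def chebyshev_nodes(m, lo=-1.0, hi=1.0):
    """Zeros of the degree-m Chebyshev polynomial, mapped from [-1, 1] to [lo, hi]."""
    nodes = [math.cos((2 * k - 1) * math.pi / (2 * m)) for k in range(1, m + 1)]
    return [lo + (hi - lo) * (x + 1) / 2 for x in nodes]

def ctp_samples(m, dims, lo=-1.0, hi=1.0):
    """Chebyshev tensor product (CTP) sampling: the full grid of 1-D node sets."""
    axis = chebyshev_nodes(m, lo, hi)
    return list(product(axis, repeat=dims))

pts = ctp_samples(5, 2)   # 5**2 = 25 samples in two dimensions
```

    The CCM scheme in the abstract then subsamples such a grid at random, which is what keeps the 'simplex' polynomial construction affordable in higher dimensions.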

  11. Validation and comparison of two sampling methods to assess dermal exposure to drilling fluids and crude oil.

    PubMed

    Galea, Karen S; McGonagle, Carolyn; Sleeuwenhoek, Anne; Todd, David; Jiménez, Araceli Sánchez

    2014-06-01

    Dermal exposure to drilling fluids and crude oil is an exposure route of concern. However, there have been no published studies describing sampling methods or reporting dermal exposure measurements. We describe a study that aimed to evaluate a wipe sampling method to assess dermal exposure to an oil-based drilling fluid and crude oil, as well as to investigate the feasibility of using an interception cotton glove sampler for exposure on the hands/wrists. A direct comparison of the wipe and interception methods was also completed using pigs' trotters as a surrogate for human skin and a direct surface contact exposure scenario. Overall, acceptable recovery and sampling efficiencies were reported for both methods, and both methods had satisfactory storage stability at 1 and 7 days, although there appeared to be some loss over 14 days. The methods' comparison study revealed significantly higher removal of both fluids from the metal surface with the glove samples compared with the wipe samples (on average 2.5 times higher). Both evaluated sampling methods were found to be suitable for assessing dermal exposure to oil-based drilling fluids and crude oil; however, the comparison study clearly illustrates that glove samplers may overestimate the amount of fluid transferred to the skin. Further comparison of the two dermal sampling methods using additional exposure situations such as immersion or deposition, as well as a field evaluation, is warranted to confirm their appropriateness and suitability in the working environment. © The Author 2014. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  12. Multi-laboratory survey of qPCR enterococci analysis method performance

    EPA Pesticide Factsheets

    Quantitative polymerase chain reaction (qPCR) has become a frequently used technique for quantifying enterococci in recreational surface waters, but there are several methodological options. Here we evaluated how three method permutations (type of mastermix, sample extract dilution, and use of controls in results calculation) affect method reliability among multiple laboratories with respect to sample interference. Multiple samples from each of 22 sites representing an array of habitat types were analyzed using EPA Method 1611 and Method 1609 reagents with full-strength and five-fold-diluted extracts. The presence of interference was assessed in three ways: using sample processing and PCR amplification controls; by the consistency of results across extract dilutions; and by the relative recovery of target genes from enterococci spiked into water samples compared to control matrices, with acceptable recovery defined as 50 to 200%. Method 1609, which is based on an environmental mastermix, was found to be superior to Method 1611, which is based on a universal mastermix. Method 1611 had over a 40% control assay failure rate with undiluted extracts and a 6% failure rate with diluted extracts. Method 1609 failed in only 11% and 3% of undiluted and diluted extract analyses, respectively. Use of sample processing control assay results in the delta-delta Ct method for calculating relative target gene recoveries increased the number of acceptable recovery results. Delta-delta tended to bias recoveries fr
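
    The delta-delta Ct recovery calculation mentioned above can be sketched as follows (the Ct values are hypothetical): the target-gene signal in the water sample is normalized to the sample processing control (SPC), then compared with the same quantity in the control matrix.

```python
def delta_delta_recovery(ct_target_sample, ct_spc_sample,
                         ct_target_control, ct_spc_control):
    """Relative target-gene recovery (%) by the delta-delta Ct method,
    normalizing the target signal to the sample processing control (SPC)."""
    dd_ct = ((ct_target_sample - ct_spc_sample)
             - (ct_target_control - ct_spc_control))
    return 2 ** (-dd_ct) * 100.0   # each extra cycle halves apparent recovery

# one extra cycle needed in the water sample, SPC unchanged -> 50% recovery,
# the lower edge of the 50-200% acceptability window
r = delta_delta_recovery(26.0, 30.0, 25.0, 30.0)
```

    If the SPC shifts by the same amount as the target, dd_ct stays at zero, which is how SPC normalization separates matrix interference from genuine target loss.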

  13. Differentially co-expressed interacting protein pairs discriminate samples under distinct stages of HIV type 1 infection.

    PubMed

    Yoon, Dukyong; Kim, Hyosil; Suh-Kim, Haeyoung; Park, Rae Woong; Lee, KiYoung

    2011-01-01

Microarray analyses based on differentially expressed genes (DEGs) have been widely used to distinguish samples across different cellular conditions. However, studies based on DEGs have not been able to clearly determine significant differences between samples of pathophysiologically similar HIV-1 stages, e.g., between acute and chronic progressive (or AIDS) or between uninfected and clinically latent stages. We here suggest a novel approach to allow such discrimination based on stage-specific genetic features of HIV-1 infection. Our approach is based on co-expression changes of genes known to interact. Unlike existing protein-protein-interaction analyses with correlational designs, the method can identify a genetic signature for a single sample. Our approach distinguishes each sample using differentially co-expressed interacting protein pairs (DEPs) based on the co-expression scores of individual interacting pairs within the sample. The co-expression score is positive if the two genes in a sample are simultaneously up-regulated or down-regulated, and its absolute value is higher when the expression-change ratios of the two genes are similar. We compared the characteristics of DEPs with those of DEGs by evaluating their usefulness in separating HIV-1 stages, and we identified DEP-based network modules and their gene-ontology enrichment to uncover the HIV-1 stage-specific gene signature. Based on the DEP approach, we observed clear separation among samples from distinct HIV-1 stages using clustering and principal component analyses. Moreover, the discrimination power of DEPs on the samples (70-100% accuracy) was much higher than that of DEGs (35-45%) using several well-known classifiers. DEP-based network analysis also revealed the HIV-1 stage-specific network modules; the main biological processes were related to "translation," "RNA splicing," "mRNA, RNA, and nucleic acid transport," and "DNA metabolism." Through the HIV-1 stage-related modules, stage-specific changes in patterns of protein interactions could be observed. The DEP-based method discriminated the HIV-1 infection stages clearly and revealed an HIV-1 stage-specific gene signature. The proposed DEP-based method might complement existing DEG-based approaches in various microarray expression analyses.
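The two stated properties of the per-pair co-expression score can be captured by a toy function; the paper's exact formula is not reproduced here, so this definition is an illustrative assumption that merely satisfies both properties:

```python
def coexpression_score(r1, r2):
    """Toy per-pair co-expression score for one sample.

    r1, r2: log-scale expression-change ratios of two interacting genes
    relative to a reference. Positive when both genes move in the same
    direction; the absolute value grows as the two ratios become more
    similar in magnitude. (Illustrative formula, not the paper's.)
    """
    if r1 == 0 or r2 == 0:
        return 0.0
    sign = 1.0 if r1 * r2 > 0 else -1.0
    similarity = min(abs(r1), abs(r2)) / max(abs(r1), abs(r2))  # in (0, 1]
    return sign * similarity * (abs(r1) + abs(r2)) / 2.0

print(coexpression_score(2.0, 1.8))   # both up, similar ratios: strongly positive
print(coexpression_score(2.0, -1.8))  # opposite directions: negative
```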

  14. Evaluation of selected static methods used to estimate element mobility, acid-generating and acid-neutralizing potentials associated with geologically diverse mining wastes

    USGS Publications Warehouse

    Hageman, Philip L.; Seal, Robert R.; Diehl, Sharon F.; Piatak, Nadine M.; Lowers, Heather

    2015-01-01

A comparison study of selected static leaching and acid–base accounting (ABA) methods using a mineralogically diverse set of 12 modern-style, metal mine waste samples was undertaken to understand the relative performance of the various tests. To complement this study, in-depth mineralogical studies were conducted in order to elucidate the relationships between sample mineralogy, weathering features, and leachate and ABA characteristics. In part one of the study, splits of the samples were leached using six commonly used leaching tests including paste pH, the U.S. Geological Survey (USGS) Field Leach Test (FLT) (both 5-min and 18-h agitation), the U.S. Environmental Protection Agency (USEPA) Method 1312 SPLP (both leachate pH 4.2 and leachate pH 5.0), and the USEPA Method 1311 TCLP (leachate pH 4.9). Leachate geochemical trends were compared in order to assess differences, if any, produced by the various leaching procedures. Results showed that the FLT (5-min agitation) was just as effective as the 18-h leaching tests in revealing the leachate geochemical characteristics of the samples. Leaching results also showed that the TCLP leaching test produces inconsistent results compared with those from the other leaching tests. In part two of the study, the ABA was determined on splits of the samples using both well-established traditional static testing methods and a relatively quick, simplified net acid–base accounting (NABA) procedure. Results showed that the traditional methods, while time consuming, provide the most in-depth data on both the acid-generating and acid-neutralizing tendencies of the samples. However, the simplified NABA method provided a relatively fast, effective estimation of the net acid–base account of the samples.
Overall, this study showed that while most of the well-established methods are useful and effective, the simplified leaching test and the NABA acid–base accounting method give investigators fast, quantitative tools for obtaining reliable information about the leachability of metals and other constituents of concern, and about the acid-generating potential of metal mining waste.

  15. Molecular Weights of Bovine and Porcine Heparin Samples: Comparison of Chromatographic Methods and Results of a Collaborative Survey.

    PubMed

    Bertini, Sabrina; Risi, Giulia; Guerrini, Marco; Carrick, Kevin; Szajek, Anita Y; Mulloy, Barbara

    2017-07-19

In a collaborative study involving six laboratories in the USA, Europe, and India, the molecular weight distributions of a panel of heparin sodium samples were determined in order to compare heparin sodium of bovine intestinal origin with that of bovine lung and porcine intestinal origin. Porcine samples met the current criteria as laid out in the USP Heparin Sodium monograph. Bovine lung heparin samples had consistently lower average molecular weights. Bovine intestinal heparin was variable in molecular weight; some samples fell below the USP limits, some fell within these limits, and others fell above the upper limits. These data will inform the establishment of pharmacopeial acceptance criteria for heparin sodium derived from bovine intestinal mucosa. The method for MW determination as described in the USP monograph uses a single, broad standard calibrant to characterize the chromatographic profile of heparin sodium on high-resolution silica-based GPC columns. These columns may be short-lived in some laboratories. Using the panel of samples described above, methods based on the use of robust polymer-based columns have been developed. In addition to the use of the USP's broad standard calibrant for heparin sodium with these columns, a set of conditions has been devised that allows light-scattering-detected molecular weight characterization of heparin sodium, giving results that agree well with the monograph method. These findings may facilitate the validation of variant chromatographic methods with some practical advantages over the USP monograph method.

  16. Online selective kernel-based temporal difference learning.

    PubMed

    Chen, Xingguo; Gao, Yang; Wang, Ruili

    2013-12-01

In this paper, an online selective kernel-based temporal difference (OSKTD) learning algorithm is proposed to deal with large-scale and/or continuous reinforcement learning problems. OSKTD includes two online procedures: online sparsification and parameter updating for the selective kernel-based value function. A new sparsification method (i.e., a kernel distance-based online sparsification method) is proposed based on selective ensemble learning, which is computationally less complex than other sparsification methods. With the proposed sparsification method, the sparsified dictionary of samples is constructed online by checking whether a sample needs to be added to the sparsified dictionary. In addition, based on local validity, a selective kernel-based value function is proposed to select the best samples from the sample dictionary for the selective kernel-based value function approximator. The parameters of the selective kernel-based value function are iteratively updated using the temporal difference (TD) learning algorithm combined with the gradient descent technique. The complexity of the online sparsification procedure in the OSKTD algorithm is O(n). In addition, two typical experiments (Maze and Mountain Car) are used to compare with both traditional and up-to-date O(n) algorithms (GTD, GTD2, and TDC using the kernel-based value function), and the results demonstrate the effectiveness of our proposed algorithm. In the Maze problem, OSKTD converges to an optimal policy and converges faster than both traditional and up-to-date algorithms. In the Mountain Car problem, OSKTD converges, requires less computation time than other sparsification methods, reaches a better local optimum than the traditional algorithms, and converges much faster than the up-to-date algorithms. In addition, OSKTD can reach a competitive ultimate optimum compared with the up-to-date algorithms.
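The kernel-distance test at the heart of the sparsification can be sketched as follows; the one-dimensional samples, Gaussian kernel, and threshold value are illustrative assumptions (the paper's states are feature vectors):

```python
import math

def kernel(x, y, sigma=1.0):
    """Gaussian kernel; k(x, x) = 1, so the feature-space distance between
    two samples is sqrt(2 - 2*k(x, y))."""
    return math.exp(-((x - y) ** 2) / (2.0 * sigma ** 2))

def sparsify(stream, mu=0.5):
    """Kernel distance-based online sparsification (sketch): a sample is
    added to the dictionary only if its squared kernel distance to every
    stored sample exceeds the threshold mu."""
    dictionary = []
    for s in stream:
        if all(2.0 - 2.0 * kernel(s, d) > mu for d in dictionary):
            dictionary.append(s)
    return dictionary

print(sparsify([0.0, 0.05, 1.0, 1.02, 2.5]))  # → [0.0, 1.0, 2.5]
```

Near-duplicate samples are skipped, so the dictionary stays small while still covering the visited region of the state space.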

  17. Rapid and sensitive analysis of polychlorinated biphenyls and acrylamide in food samples using ionic liquid-based in situ dispersive liquid-liquid microextraction coupled to headspace gas chromatography.

    PubMed

    Zhang, Cheng; Cagliero, Cecilia; Pierson, Stephen A; Anderson, Jared L

    2017-01-20

    A simple and rapid ionic liquid (IL)-based in situ dispersive liquid-liquid microextraction (DLLME) method was developed and coupled to headspace gas chromatography (HS-GC) employing electron capture (ECD) and mass spectrometry (MS) detection for the analysis of polychlorinated biphenyls (PCBs) and acrylamide at trace levels from milk and coffee samples. The chemical structures of the halide-based ILs were tailored by introducing various functional groups to the cations to evaluate the effect of different structural features on the extraction efficiency of the target analytes. Extraction parameters including the molar ratio of IL to metathesis reagent and IL mass were optimized. The effects of HS oven temperature and the HS sample vial volume on the analyte response were also evaluated. The optimized in situ DLLME method exhibited good analytical precision, good linearity, and provided detection limits down to the low ppt level for PCBs and the low ppb level for acrylamide in aqueous samples. The matrix-compatibility of the developed method was also established by quantifying acrylamide in brewed coffee samples. This method is much simpler and faster compared to previously reported GC-MS methods using solid-phase microextraction (SPME) for the extraction/preconcentration of PCBs and acrylamide from complex food samples. Copyright © 2016 Elsevier B.V. All rights reserved.

  18. Absolute method of measuring magnetic susceptibility

    USGS Publications Warehouse

    Thorpe, A.; Senftle, F.E.

    1959-01-01

An absolute method of standardization and measurement of the magnetic susceptibility of small samples is presented which can be applied to most techniques based on the Faraday method. The fact that the susceptibility is a function of the area under the curve of sample displacement versus distance of the magnet from the sample, offers a simple method of measuring the susceptibility without recourse to a standard sample. Typical results on a few substances are compared with reported values, and an error of less than 2% can be achieved. © 1959 The American Institute of Physics.

  19. Ultratrace level determination and quantitative analysis of kidney injury biomarkers in patient samples attained by zinc oxide nanorods

    NASA Astrophysics Data System (ADS)

    Singh, Manpreet; Alabanza, Anginelle; Gonzalez, Lorelis E.; Wang, Weiwei; Reeves, W. Brian; Hahm, Jong-In

    2016-02-01

    Determining ultratrace amounts of protein biomarkers in patient samples in a straightforward and quantitative manner is extremely important for early disease diagnosis and treatment. Here, we successfully demonstrate the novel use of zinc oxide nanorods (ZnO NRs) in the ultrasensitive and quantitative detection of two acute kidney injury (AKI)-related protein biomarkers, tumor necrosis factor (TNF)-α and interleukin (IL)-8, directly from patient samples. We first validate the ZnO NRs-based IL-8 results via comparison with those obtained from using a conventional enzyme-linked immunosorbent method in samples from 38 individuals. We further assess the full detection capability of the ZnO NRs-based technique by quantifying TNF-α, whose levels in human urine are often below the detection limits of conventional methods. Using the ZnO NR platforms, we determine the TNF-α concentrations of all 46 patient samples tested, down to the fg per mL level. Subsequently, we screen for TNF-α levels in approximately 50 additional samples collected from different patient groups in order to demonstrate a potential use of the ZnO NRs-based assay in assessing cytokine levels useful for further clinical monitoring. Our research efforts demonstrate that ZnO NRs can be straightforwardly employed in the rapid, ultrasensitive, quantitative, and simultaneous detection of multiple AKI-related biomarkers directly in patient urine samples, providing an unparalleled detection capability beyond those of conventional analysis methods. Additional key advantages of the ZnO NRs-based approach include a fast detection speed, low-volume assay condition, multiplexing ability, and easy automation/integration capability to existing fluorescence instrumentation. 
Therefore, we anticipate that our ZnO NRs-based detection method will be highly beneficial for overcoming the frequent challenges in early biomarker development and treatment assessment, pertaining to the facile and ultrasensitive quantification of hard-to-trace biomolecules. Electronic supplementary information (ESI) available: Typical SEM images of the ZnO NRs used in the biomarker assays are provided in Fig. S1. See DOI: 10.1039/c5nr08706f

  20. Designing efficient nitrous oxide sampling strategies in agroecosystems using simulation models

    NASA Astrophysics Data System (ADS)

    Saha, Debasish; Kemanian, Armen R.; Rau, Benjamin M.; Adler, Paul R.; Montes, Felipe

    2017-04-01

Annual cumulative soil nitrous oxide (N2O) emissions calculated from discrete chamber-based flux measurements have unknown uncertainty. We used outputs from simulations obtained with an agroecosystem model to design sampling strategies that yield accurate cumulative N2O flux estimates with a known uncertainty level. Daily soil N2O fluxes were simulated for Ames, IA (corn-soybean rotation), College Station, TX (corn-vetch rotation), Fort Collins, CO (irrigated corn), and Pullman, WA (winter wheat), representing diverse agro-ecoregions of the United States. Fertilization source, rate, and timing were site-specific. These simulated fluxes served as surrogates for daily measurements in the analysis. We "sampled" the fluxes using a fixed-interval (1-32 days) or a rule-based (decision tree-based) sampling method. Two types of decision trees were built: a high-input tree (HI) that included soil inorganic nitrogen (SIN) as a predictor variable, and a low-input tree (LI) that excluded SIN. Other predictor variables were identified with Random Forest. The decision trees were inverted to be used as rules for sampling a representative number of members from each terminal node. The uncertainty of the annual N2O flux estimation increased with the fixed interval length. A 4- and 8-day fixed sampling interval was required at College Station and Ames, respectively, to yield ±20% accuracy in the flux estimate; a 12-day interval rendered the same accuracy at Fort Collins and Pullman. Both the HI and the LI rule-based methods provided the same accuracy as the fixed-interval method with up to a 60% reduction in sampling events, particularly at locations with greater temporal flux variability. For instance, at Ames, the HI rule-based and the fixed-interval methods required 16 and 91 sampling events, respectively, to achieve the same absolute bias of 0.2 kg N ha-1 yr-1 in estimating cumulative N2O flux.
These results suggest that using simulation models along with decision trees can reduce the cost and improve the accuracy of the estimations of cumulative N2O fluxes using the discrete chamber-based method.
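A minimal sketch of how fixed-interval "sampling" of daily fluxes affects the cumulative estimate; the synthetic pulse-shaped fluxes, interval set, and linear-interpolation rule here are illustrative assumptions standing in for the model output, not the study's setup:

```python
import numpy as np

days = np.arange(365)
# Synthetic daily N2O fluxes: a low baseline plus three short
# fertilization-driven emission pulses (illustrative stand-ins).
pulses = np.exp(-0.5 * ((days[:, None] - np.array([60, 150, 240])) / 6.0) ** 2).sum(axis=1)
daily_flux = 0.002 + 0.05 * pulses
true_cum = daily_flux.sum()

def cumulative_estimate(sample_days):
    # Interpolate linearly between sampling dates, as is commonly done
    # with discrete chamber measurements, then sum the daily values.
    return np.interp(days, sample_days, daily_flux[sample_days]).sum()

for interval in (4, 8, 16, 32):
    est = cumulative_estimate(np.arange(0, 365, interval))
    print(interval, round(100 * (est - true_cum) / true_cum, 1))  # % bias
```

Sparser fixed intervals increasingly miss or distort the emission pulses, which is what motivates the rule-based sampling above.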

  1. Whole-surface round object imaging method using line-scan hyperspectral imaging system

    USDA-ARS?s Scientific Manuscript database

    To achieve comprehensive online quality and safety inspection of fruits, whole-surface sample presentation and imaging regimes must be considered. Specifically, a round object sample presentation method is under development to achieve effective whole-surface sample evaluation based on the use of a s...

  2. Do sampling methods differ in their utility for ecological monitoring? Comparison of line-point intercept, grid-point intercept, and ocular estimate methods

    USDA-ARS?s Scientific Manuscript database

    This study compared the utility of three sampling methods for ecological monitoring based on: interchangeability of data (rank correlations), precision (coefficient of variation), cost (minutes/transect), and potential of each method to generate multiple indicators. Species richness and foliar cover...

  3. Comparison of PCR-Based Diagnosis with Centrifuged-Based Enrichment Method for Detection of Borrelia persica in Animal Blood Samples.

    PubMed

    Naddaf, S R; Kishdehi, M; Siavashi, Mr

    2011-01-01

    The mainstay of diagnosis of relapsing fever (RF) is demonstration of the spirochetes in Giemsa-stained thick blood smears, but during non fever periods the bacteria are very scanty and rarely detected in blood smears by microscopy. This study is aimed to evaluate the sensitivity of different methods developed for detection of low-grade spirochetemia. Animal blood samples with low degrees of spirochetemia were tested with two PCRs and a nested PCR targeting flaB, GlpQ, and rrs genes. Also, a centrifuged-based enrichment method and Giemsa staining were performed on blood samples with various degrees of spirochetemia. The flaB-PCR and nested rrs-PCR turned positive with various degrees of spirochetemia including the blood samples that turned negative with dark-field microscopy. The GlpQ-PCR was positive as far as at least one spirochete was seen in 5-10 microscopic fields. The sensitivity of GlpQ-PCR increased when DNA from Buffy Coat Layer (BCL) was used as template. The centrifuged-based enrichment method turned positive with as low concentration as 50 bacteria/ml blood, while Giemsa thick staining detected bacteria with concentrations ≥ 25000 bacteria/ml. Centrifuged-based enrichment method appeared as much as 500-fold more sensitive than thick smears, which makes it even superior to some PCR assays. Due to simplicity and minimal laboratory requirements, this method can be considered a valuable tool for diagnosis of RF in rural health centers.

  4. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
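The observation can be illustrated with a short sketch (the sampling grid and bin count are arbitrary choices, not part of the proposed method):

```python
import numpy as np

# Sample one half-cycle of a sinusoid and sort the samples by frequency
# of occurrence into a histogram -- an empirical PDF of the waveform.
t = np.linspace(0.0, np.pi, 4096)            # at least one half cycle
samples = np.sin(t)
hist, edges = np.histogram(samples, bins=20, range=(0.0, 1.0), density=True)

# A sinusoid spends most of its time near its extremes, so the empirical
# PDF is largest in the outermost bin (the arcsine-law shape).
print(hist[-1] == hist.max())  # → True
```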

  5. Facebook Ads Recruit Parents of Children with Cancer for an Online Survey of Web-Based Research Preferences

    PubMed Central

    Akard, Terrah Foster; Wray, Sarah; Gilmer, Mary

    2014-01-01

    Background Studies involving samples of children with life-threatening illnesses and their families face significant challenges, including inadequate sample sizes and limited diversity. Social media recruitment and web-based research methods may help address such challenges yet have not been explored in pediatric cancer populations. Objective This study examined the feasibility of using Facebook ads to recruit parent caregivers of children and teens with cancer. We also explored the feasibility of web-based video recording in pediatric palliative care populations by surveying parents of children with cancer regarding (a) their preferences for research methods and (b) technological capabilities of their computers and phones. Methods Facebook's paid advertising program was used to recruit parent caregivers of children currently living with cancer to complete an electronic survey about research preferences and technological capabilities. Results The advertising campaign generated 3,897,981 impressions which resulted in 1050 clicks at a total cost of $1129.88. Of 284 screened individuals, 106 were eligible. Forty-five caregivers of children with cancer completed the entire electronic survey. Parents preferred and had technological capabilities for web-based and electronic research methods. Participant survey responses are reported. Conclusion Facebook was a useful, cost-effective method to recruit a diverse sample of parent caregivers of children with cancer. Web-based video recording and data collection may be feasible and desirable in samples of children with cancer and their families. Implications for Practice Web-based methods (e.g., Facebook, Skype) may enhance communication and access between nurses and pediatric oncology patients and their families. PMID:24945264

  6. Estimating uncertainty in respondent-driven sampling using a tree bootstrap method.

    PubMed

    Baraff, Aaron J; McCormick, Tyler H; Raftery, Adrian E

    2016-12-20

    Respondent-driven sampling (RDS) is a network-based form of chain-referral sampling used to estimate attributes of populations that are difficult to access using standard survey tools. Although it has grown quickly in popularity since its introduction, the statistical properties of RDS estimates remain elusive. In particular, the sampling variability of these estimates has been shown to be much higher than previously acknowledged, and even methods designed to account for RDS result in misleadingly narrow confidence intervals. In this paper, we introduce a tree bootstrap method for estimating uncertainty in RDS estimates based on resampling recruitment trees. We use simulations from known social networks to show that the tree bootstrap method not only outperforms existing methods but also captures the high variability of RDS, even in extreme cases with high design effects. We also apply the method to data from injecting drug users in Ukraine. Unlike other methods, the tree bootstrap depends only on the structure of the sampled recruitment trees, not on the attributes being measured on the respondents, so correlations between attributes can be estimated as well as variability. Our results suggest that it is possible to accurately assess the high level of uncertainty inherent in RDS.
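The resampling step can be sketched as follows; the tree encoding, trait values, and replicate count are illustrative assumptions, not the authors' implementation:

```python
import random

def tree_bootstrap(trees, rng):
    """One tree-bootstrap replicate of an RDS sample (sketch of the idea:
    resample seeds with replacement, then resample each node's recruits
    with replacement, recursively)."""
    def resample(node):
        attr, children = node
        kids = [resample(rng.choice(children)) for _ in children] if children else []
        return (attr, kids)
    return [resample(rng.choice(trees)) for _ in trees]

def attributes(trees):
    """Flatten a forest of (attribute, children) nodes into a value list."""
    out = []
    for attr, children in trees:
        out.append(attr)
        out.extend(attributes(children))
    return out

# Two recruitment trees; node attributes are a 0/1 trait of each respondent.
forest = [(1, [(0, []), (1, [(0, [])])]), (0, [(1, [])])]
rng = random.Random(42)
# Bootstrap distribution of the sample proportion of trait 1
props = []
for _ in range(200):
    vals = attributes(tree_bootstrap(forest, rng))
    props.append(sum(vals) / len(vals))
print(min(props), max(props))  # spread reflects the RDS sampling variability
```

Because only the recruitment structure is resampled, the same replicates can be reused to estimate uncertainty for any attribute measured on the respondents.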

  7. A similarity based learning framework for interim analysis of outcome prediction of acupuncture for neck pain.

    PubMed

    Zhang, Gang; Liang, Zhaohui; Yin, Jian; Fu, Wenbin; Li, Guo-Zheng

    2013-01-01

Chronic neck pain is a common disorder in modern society. Acupuncture has long been administered as an alternative therapy for chronic pain, with its effectiveness supported by recent clinical evidence. However, potential differences in effectiveness across syndrome types remain in question due to the limits of sample size and statistical methods. We applied machine learning methods in an attempt to solve this problem. Through multi-objective sorting of subjective measurements, outstanding samples are selected to form the base of our kernel-oriented model. By calculating similarities between the sample of interest and the base samples, we make full use of the information contained in the known samples, which is especially effective for small sample sets. To tackle the parameter-selection problem in similarity learning, we propose an ensemble of learners with slightly different parameter settings to obtain a stronger learner. The experimental results on a real data set show that, compared with some previous well-known methods, the proposed algorithm is capable of discovering the underlying differences among syndrome types and is feasible for predicting the effective tendency in large-sample clinical trials.

  8. Application of Liquid Chromatography/Ion Trap Mass Spectrometry Technique to Determine Ergot Alkaloids in Grain Products

    PubMed Central

    Szymczyk, Krystyna; Jędrzejczak, Renata; Roszko, Marek

    2015-01-01

Summary A liquid chromatography/ion trap mass spectrometry-based method to determine six ergot alkaloids and their isomers is presented. The samples were cleaned on neutral alumina-based solid-phase extraction cartridges. The following method parameters were obtained (depending on the analyte and spiking level): method recovery from 63.0 to 104.6%, relative standard deviation below 18%, linear range from 1 to 325 µg/kg, linear correlation coefficient not less than 0.98. The developed analytical procedure was applied to determine the levels of ergot alkaloids in 65 samples of selected rye-based food products (flour – 34 samples, bran – 12 samples, rye – 18 samples, flakes – 1 sample). Measurable levels of alkaloids were found in the majority of the analysed samples, particularly in rye flour. Additionally, alkaloids were determined in ergot sclerotia isolated from rye grains. The total content was nearly 0.01% (97.9 mg/kg). However, the alkaloid profile was dominated by ergocristine at 45.6% (44.7 mg/kg), an alkaloid not commonly found in the tested food products. Ergocorninine at 0.2% (0.2 mg/kg) was the least abundant alkaloid. PMID:27904328

  9. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain, to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting the design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
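A minimal, non-adaptive sketch of the importance-sampling step with an assumed one-variable failure event; the adaptive refinement of the sampling domain described above is omitted, and the threshold and proposal density are illustrative choices:

```python
import math, random

random.seed(1)

def phi(x, mu=0.0):
    """Unit-variance normal density centered at mu."""
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2.0 * math.pi)

# Failure event: X > 3 for X ~ N(0, 1); true probability ≈ 1.35e-3.
# Draw from a density centered on the failure region and reweight by the
# likelihood ratio so that the estimator remains unbiased.
c, n = 3.0, 20000
draws = [random.gauss(c, 1.0) for _ in range(n)]
pf = sum(phi(x) / phi(x, c) for x in draws if x > c) / n
print(pf)  # close to the true tail probability 1.35e-3
```

Crude Monte Carlo with the same budget would see only ~27 failures on average; concentrating samples in the failure region is what makes the estimate usable.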

  10. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
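The contrast with simple random sampling can be illustrated in miniature: under Bernoulli sampling each round is independently assigned to the test set, so the observed error count is binomial and a simple Hoeffding-type deviation term applies. All numbers below are illustrative assumptions, not protocol parameters:

```python
import math, random

random.seed(5)

# Bernoulli sampling: each of N rounds is tested with probability q, so the
# observed error count is binomially distributed. A Hoeffding bound then
# gives a one-sided deviation term at the chosen failure level.
N, q, err_rate, eps_level = 100_000, 0.1, 0.03, 1e-6
observed = sum(1 for _ in range(N)
               if random.random() < q and random.random() < err_rate)
est = observed / (N * q)
dev = math.sqrt(math.log(1.0 / eps_level) / (2.0 * N * q))
print(round(est, 4), round(est + dev, 4))  # point estimate and upper bound
```

The binomial model avoids the hypergeometric bookkeeping of simple random sampling, which is the simplification the paper exploits.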

  11. Detecting the sampling rate through observations

    NASA Astrophysics Data System (ADS)

    Shoji, Isao

    2018-09-01

This paper proposes a method to detect the sampling rate of discrete time series of diffusion processes. Using the maximum likelihood estimates of the parameters of a diffusion process, we establish a criterion based on the Kullback-Leibler divergence and thereby estimate the sampling rate. Simulation studies are conducted to check whether the method can detect the sampling rate from data, and their results show good detection performance. In addition, the method is applied to a financial time series sampled on a daily basis, and the detected sampling rate differs from the conventional rates.
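A stripped-down sketch of the idea using an Ornstein-Uhlenbeck process: simulate at a known step size, then pick the candidate step that maximizes the exact transition likelihood. Here the diffusion parameters are assumed known, whereas the paper estimates them by maximum likelihood and uses a Kullback-Leibler criterion; the parameter values and candidate grid are illustrative:

```python
import math, random

random.seed(7)
theta, sigma, dt_true, n = 1.0, 1.0, 0.1, 2000

# Simulate an OU process dX = -theta*X dt + sigma dW with the exact
# discretization at the (to-be-detected) true step size.
a = math.exp(-theta * dt_true)
s = sigma * math.sqrt((1.0 - a * a) / (2.0 * theta))
x = [0.0]
for _ in range(n):
    x.append(a * x[-1] + s * random.gauss(0.0, 1.0))

def loglik(dt):
    """Exact Gaussian transition log-likelihood of the series at step dt."""
    ad = math.exp(-theta * dt)
    var = sigma ** 2 * (1.0 - ad * ad) / (2.0 * theta)
    return sum(-0.5 * math.log(2.0 * math.pi * var)
               - (x[k + 1] - ad * x[k]) ** 2 / (2.0 * var)
               for k in range(n))

best = max([0.05, 0.1, 0.2, 0.4], key=loglik)
print(best)  # the likelihood criterion recovers the true step size, 0.1
```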

  12. Facebook advertisements recruit parents of children with cancer for an online survey of web-based research preferences.

    PubMed

    Akard, Terrah Foster; Wray, Sarah; Gilmer, Mary Jo

    2015-01-01

    Studies involving samples of children with life-threatening illnesses and their families face significant challenges, including inadequate sample sizes and limited diversity. Social media recruitment and Web-based research methods may help address such challenges yet have not been explored in pediatric cancer populations. This study examined the feasibility of using Facebook advertisements to recruit parent caregivers of children and teenagers with cancer. We also explored the feasibility of Web-based video recording in pediatric palliative care populations by surveying parents of children with cancer regarding (a) their preferences for research methods and (b) technological capabilities of their computers and phones. Facebook's paid advertising program was used to recruit parent caregivers of children currently living with cancer to complete an electronic survey about research preferences and technological capabilities. The advertising campaign generated 3 897 981 impressions, which resulted in 1050 clicks at a total cost of $1129.88. Of 284 screened individuals, 106 were eligible. Forty-five caregivers of children with cancer completed the entire electronic survey. Parents preferred and had technological capabilities for Web-based and electronic research methods. Participant survey responses are reported. Facebook was a useful, cost-effective method to recruit a diverse sample of parent caregivers of children with cancer. Web-based video recording and data collection may be feasible and desirable in samples of children with cancer and their families. Web-based methods (eg, Facebook, Skype) may enhance communication and access between nurses and pediatric oncology patients and their families.

  13. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  14. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  15. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  16. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  17. 9 CFR 149.6 - Slaughter facilities.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... result based on an ELISA method and is confirmed positive by further testing using the digestion method... decertified. (C) If a test sample yields a positive test result based on an ELISA method, but is not confirmed...

  18. Force-momentum-based self-guided Langevin dynamics: A rapid sampling method that approaches the canonical ensemble

    NASA Astrophysics Data System (ADS)

    Wu, Xiongwu; Brooks, Bernard R.

    2011-11-01

    The self-guided Langevin dynamics (SGLD) method accelerates conformational searching. It is unique in that it selectively enhances or suppresses molecular motions based on their frequency, accelerating conformational searching without modifying energy surfaces or raising temperatures. It has been applied to studies of many long-time-scale events, such as protein folding. Recent progress in understanding the conformational distribution in SGLD simulations also makes SGLD an accurate method for quantitative studies. The SGLD partition function provides a way to convert the SGLD conformational distribution to the canonical ensemble distribution and to calculate ensemble-average properties through reweighting. Based on the SGLD partition function, this work presents a force-momentum-based self-guided Langevin dynamics (SGLDfp) simulation method to directly sample the canonical ensemble. This method includes interaction forces in its guiding force to compensate for the perturbation caused by the momentum-based guiding force, so that it can approximately sample the canonical ensemble. Using several example systems, we demonstrate that SGLDfp simulations can approximately maintain the canonical ensemble distribution and significantly accelerate conformational searching. With optimal parameters, SGLDfp and SGLD simulations can cross energy barriers of more than 15 kT and 20 kT, respectively, at rates similar to those at which LD simulations cross energy barriers of 10 kT. The SGLDfp method is size-extensive and works well for large systems. For studies where preserving accessible conformational space is critical, such as free energy calculations and protein folding studies, SGLDfp is an efficient approach to search and sample the conformational space.
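
    The guiding-force idea can be illustrated with a toy one-dimensional Langevin integrator: a local (exponential) average of the momentum is fed back as an extra force, enhancing slow motions. This is only a sketch under assumed parameter names (lam for the guiding factor, tau for the averaging time), not the authors' SGLDfp implementation.

```python
import math
import random

def sgld_toy(steps=5000, dt=0.01, gamma=1.0, kT=1.0, lam=0.2, tau=0.1, seed=1):
    """Toy 1D self-guided Langevin sketch in a harmonic well U(x) = x^2/2.

    The guiding force lam*g is proportional to a local exponential average g
    of the momentum, which selectively enhances low-frequency motion.
    Illustrative only; not the SGLDfp algorithm of Wu & Brooks."""
    random.seed(seed)
    x, p, g = 1.0, 0.0, 0.0
    sigma = math.sqrt(2.0 * gamma * kT * dt)  # fluctuation-dissipation noise
    traj = []
    for _ in range(steps):
        f = -x                           # force from U(x) = x^2/2
        g += (dt / tau) * (p - g)        # local average of momentum
        p += dt * (f - gamma * p + lam * g) + sigma * random.gauss(0.0, 1.0)
        x += dt * p
        traj.append(x)
    return traj

traj = sgld_toy()
```

    With lam = 0 this reduces to ordinary Langevin dynamics; increasing lam (kept below gamma for stability) lowers the effective friction on slow modes, which is the qualitative mechanism behind the accelerated barrier crossing described above.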

  19. "Magnitude-based inference": a statistical review.

    PubMed

    Welsh, Alan H; Knight, Emma J

    2015-04-01

    We consider "magnitude-based inference" and its interpretation by examining in detail its use in the problem of comparing two means. We extract from the spreadsheets, which are provided to users of the analysis (http://www.sportsci.org/), a precise description of how "magnitude-based inference" is implemented. We compare the implemented version of the method with general descriptions of it and interpret the method in familiar statistical terms. We show that "magnitude-based inference" is not a progressive improvement on modern statistics. The additional probabilities introduced are not directly related to the confidence interval but, rather, are interpretable either as P values for two different nonstandard tests (for different null hypotheses) or as approximate Bayesian calculations, which also lead to a type of test. We also discuss sample size calculations associated with "magnitude-based inference" and show that the substantial reduction in sample sizes claimed for the method (30% of the sample size obtained from standard frequentist calculations) is not justifiable so the sample size calculations should not be used. Rather than using "magnitude-based inference," a better solution is to be realistic about the limitations of the data and use either confidence intervals or a fully Bayesian analysis.
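
    The standard frequentist sample-size calculation for comparing two means, against which the review benchmarks the claimed 30% figure, can be reproduced in a few lines (normal approximation; function and argument names are ours):

```python
import math
from statistics import NormalDist

def n_per_group(delta, sigma, alpha=0.05, power=0.8):
    """Per-group sample size for a two-sided test comparing two means
    (normal approximation): n = 2 * ((z_{1-a/2} + z_{power}) * sigma / delta)^2."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_b = NormalDist().inv_cdf(power)          # ~0.84 for 80% power
    return math.ceil(2 * ((z_a + z_b) * sigma / delta) ** 2)

n_std = n_per_group(delta=0.5, sigma=1.0)  # standard calculation
```

    For a medium effect (delta/sigma = 0.5) this gives 63 subjects per group; taking 30% of that, as claimed for "magnitude-based inference", would drop the study to roughly 19 per group, a reduction the review shows is not justifiable.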

  20. Live Authority in the Classroom in Video Conference-Based Synchronous Distance Education: The Teaching Assistant

    ERIC Educational Resources Information Center

    Karal, Hasan; Çebi, Ayça; Turgut, Yigit Emrah

    2010-01-01

    The aim of this study was to define the role of the assistant in a classroom environment where students are taught using video conference-based synchronous distance education. Qualitative research approach was adopted and, among purposeful sampling methods, criterion sampling method was preferred in the scope of the study. The study was carried…

  1. Detecting spatial structures in throughfall data: the effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-04-01

    In the last three decades, an increasing number of studies analyzed spatial patterns in throughfall to investigate the consequences of rainfall redistribution for biogeochemical and hydrological processes in forests. In the majority of cases, variograms were used to characterize the spatial properties of the throughfall data. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and an appropriate layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation methods on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with heavy outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling), and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. 
Third, studies relying on method-of-moments-based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the numbers recommended by studies dealing with Gaussian data by up to 100 %. Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes ≪ 200, our current knowledge about throughfall spatial variability stands on shaky ground.
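
    The method-of-moments (Matheron) estimator discussed above can be sketched as follows; function and variable names are ours, and no robust correction is applied — which is precisely why heavy outliers make it unreliable at small sample sizes:

```python
import numpy as np

def empirical_variogram(coords, values, bin_edges):
    """Method-of-moments (Matheron) variogram estimator:
    gamma(h) = 1/(2 N(h)) * sum of (z(si) - z(sj))^2
    over pairs whose separation |si - sj| falls in distance bin h."""
    coords = np.asarray(coords, float)
    values = np.asarray(values, float)
    n = len(values)
    i, j = np.triu_indices(n, k=1)                    # all unordered pairs
    d = np.linalg.norm(coords[i] - coords[j], axis=1)  # pair distances
    sq = (values[i] - values[j]) ** 2                  # squared differences
    gamma, counts = [], []
    for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
        m = (d >= lo) & (d < hi)
        counts.append(int(m.sum()))
        gamma.append(0.5 * sq[m].mean() if m.any() else np.nan)
    return np.array(gamma), np.array(counts)
```

    Because each bin averages squared differences, a single extreme throughfall value enters many pairs and can dominate several bins at once, motivating the larger sample sizes (or the residual-maximum-likelihood alternative) recommended above.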

  2. Detecting spatial structures in throughfall data: The effect of extent, sample size, sampling design, and variogram estimation method

    NASA Astrophysics Data System (ADS)

    Voss, Sebastian; Zimmermann, Beate; Zimmermann, Alexander

    2016-09-01

    In the last decades, an increasing number of studies analyzed spatial patterns in throughfall by means of variograms. The estimation of the variogram from sample data requires an appropriate sampling scheme: most importantly, a large sample and a layout of sampling locations that often has to serve both variogram estimation and geostatistical prediction. While some recommendations on these aspects exist, they focus on Gaussian data and high ratios of the variogram range to the extent of the study area. However, many hydrological data, and throughfall data in particular, do not follow a Gaussian distribution. In this study, we examined the effect of extent, sample size, sampling design, and calculation method on variogram estimation of throughfall data. For our investigation, we first generated non-Gaussian random fields based on throughfall data with large outliers. Subsequently, we sampled the fields with three extents (plots with edge lengths of 25 m, 50 m, and 100 m), four common sampling designs (two grid-based layouts, transect and random sampling) and five sample sizes (50, 100, 150, 200, 400). We then estimated the variogram parameters by method-of-moments (non-robust and robust estimators) and residual maximum likelihood. Our key findings are threefold. First, the choice of the extent has a substantial influence on the estimation of the variogram. A comparatively small ratio of the extent to the correlation length is beneficial for variogram estimation. Second, a combination of a minimum sample size of 150, a design that ensures the sampling of small distances and variogram estimation by residual maximum likelihood offers a good compromise between accuracy and efficiency. Third, studies relying on method-of-moments based variogram estimation may have to employ at least 200 sampling points for reliable variogram estimates. These suggested sample sizes exceed the number recommended by studies dealing with Gaussian data by up to 100 %. 
Given that most previous throughfall studies relied on method-of-moments variogram estimation and sample sizes ≪200, currently available data are prone to large uncertainties.

  3. Recommended Mass Spectrometry-Based Strategies to Identify Ricin-Containing Samples

    PubMed Central

    Kalb, Suzanne R.; Schieltz, David M.; Becher, François; Astot, Crister; Fredriksson, Sten-Åke; Barr, John R.

    2015-01-01

    Ricin is a protein toxin produced by the castor bean plant (Ricinus communis) together with a related protein known as R. communis agglutinin (RCA120). Mass spectrometric (MS) assays have the capacity to unambiguously identify ricin and to detect ricin’s activity in samples with complex matrices. These qualitative and quantitative assays enable detection and differentiation of ricin from the less toxic RCA120 through determination of the amino acid sequence of the protein in question, and active ricin can be monitored by MS as the release of adenine from the depurination of a nucleic acid substrate. In this work, we describe the application of MS-based methods to detect, differentiate and quantify ricin and RCA120 in nine blinded samples supplied as part of the EQuATox proficiency test. Overall, MS-based assays successfully identified all samples containing ricin or RCA120 with the exception of the sample spiked with the lowest concentration (0.414 ng/mL). In fact, mass spectrometry was the most successful method for differentiation of ricin and RCA120 based on amino acid determination. Mass spectrometric methods were also successful at ranking the functional activities of the samples, successfully yielding semi-quantitative results. These results indicate that MS-based assays are excellent techniques to detect, differentiate, and quantify ricin and RCA120 in complex matrices. PMID:26610568

  4. Sequential ensemble-based optimal design for parameter estimation: SEQUENTIAL ENSEMBLE-BASED OPTIMAL DESIGN

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Man, Jun; Zhang, Jiangjiang; Li, Weixuan

    2016-10-01

    The ensemble Kalman filter (EnKF) has been widely used in parameter estimation for hydrological models. The focus of most previous studies was to develop more efficient analysis (estimation) algorithms. On the other hand, it is intuitively understandable that a well-designed sampling (data-collection) strategy should provide more informative measurements and subsequently improve the parameter estimation. In this work, a Sequential Ensemble-based Optimal Design (SEOD) method, coupled with EnKF, information theory and sequential optimal design, is proposed to improve the performance of parameter estimation. Based on the first-order and second-order statistics, different information metrics including the Shannon entropy difference (SD), degrees of freedom for signal (DFS) and relative entropy (RE) are used to design the optimal sampling strategy. The effectiveness of the proposed method is illustrated by synthetic one-dimensional and two-dimensional unsaturated flow case studies. It is shown that the designed sampling strategies can provide more accurate parameter estimation and state prediction compared with conventional sampling strategies. Optimal sampling designs based on various information metrics perform similarly in our cases. The effect of ensemble size on the optimal design is also investigated. Overall, a larger ensemble size improves the parameter estimation and the convergence of the optimal sampling strategy. Although the proposed method is applied to unsaturated flow problems in this study, it can be equally applied to any other hydrological problems.
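
    For ensembles summarized by their first- and second-order statistics, the Shannon entropy difference metric reduces to a log-determinant comparison of ensemble covariances. The sketch below (our own function names, not the authors' code) shows that reduction under a Gaussian approximation:

```python
import numpy as np

def entropy_gauss(ensemble):
    """Differential entropy of a Gaussian fit to an ensemble (members in rows):
    H = 0.5 * (k * ln(2*pi*e) + ln det(Cov))."""
    k = ensemble.shape[1]
    cov = np.atleast_2d(np.cov(ensemble, rowvar=False))
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * (k * np.log(2.0 * np.pi * np.e) + logdet)

def shannon_entropy_difference(prior, posterior):
    """SD metric: entropy reduction from assimilating a candidate measurement;
    a larger SD marks a more informative sampling design."""
    return entropy_gauss(prior) - entropy_gauss(posterior)
```

    In an SEOD-style loop, one would evaluate this difference for each candidate measurement location (using EnKF-updated ensembles as the "posterior") and collect data where the entropy reduction is largest.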

  5. Comparison of three sampling and analytical methods for the determination of airborne hexavalent chromium.

    PubMed

    Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K

    2000-08-01

    A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). These three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. The basis of the analytical methods for all three methods involves extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable specific differences within the sample preparation procedures used in three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The equivalence of the means of the log-transformed Cr(VI) concentrations obtained from the different analytical methods was compared. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using each of the three methods. 
Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.
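
    The comparison of mean log-transformed concentrations across the three methods rests on a one-way ANOVA F statistic, which can be sketched with the standard sums of squares (a minimal stdlib version; the study itself presumably used standard statistical software):

```python
def one_way_anova_F(groups):
    """One-way ANOVA F statistic: between-group mean square over
    within-group mean square, for a list of numeric groups."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    # between-group sum of squares
    ssb = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # within-group sum of squares
    ssw = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ssb / (k - 1)) / (ssw / (N - k))
```

    Here each group would hold one method's log-transformed Cr(VI) results for the side-by-side samples; the F statistic is then compared with the critical value on (k-1, N-k) degrees of freedom to test equality of the method means.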

  6. Geology-based method of assessing sensitivity of streams to acidic deposition in Charles and Anne Arundel Counties, Maryland

    USGS Publications Warehouse

    Rice, Karen C.; Bricker, Owen P.

    1991-01-01

    The report describes the results of a study to assess the sensitivity of streams to acidic deposition in Charles and Anne Arundel Counties, Maryland, using a geology-based method. Water samples were collected from streams in July and August 1988, when the streams were at base-flow conditions. Eighteen water samples from streams in Charles County and 17 from streams in Anne Arundel County were analyzed in the field for pH, specific conductance, and acid-neutralizing capacity (ANC); 8 water samples from streams in Charles County were analyzed in the laboratory for chloride and sulfate concentrations. The assessment revealed that streams in these counties are sensitive to acidification by acidic deposition.

  7. Molecular method for detection of total coliforms in drinking water samples.

    PubMed

    Maheux, Andrée F; Boudreau, Dominique K; Bisson, Marc-Antoine; Dion-Dupont, Vanessa; Bouchard, Sébastien; Nkuranga, Martine; Bergeron, Michel G; Rodriguez, Manuel J

    2014-07-01

    This work demonstrates the ability of a bacterial concentration and recovery procedure combined with three different PCR assays targeting the lacZ, wecG, and 16S rRNA genes, respectively, to detect the presence of total coliforms in 100-ml samples of potable water (presence/absence test). PCR assays were first compared to the culture-based Colilert and MI agar methods to determine their ability to detect 147 coliform strains representing 76 species of Enterobacteriaceae encountered in fecal and environmental settings. Results showed that 86 (58.5%) and 109 (74.1%) strains yielded a positive signal with Colilert and MI agar methods, respectively, whereas the lacZ, wecG, and 16S rRNA PCR assays detected 133 (90.5%), 111 (75.5%), and 146 (99.3%) of the 147 total coliform strains tested. These assays were then assessed by testing 122 well water samples collected in the Québec City region of Canada. Results showed that 97 (79.5%) of the samples tested by culture-based methods and 95 (77.9%), 82 (67.2%), and 98 (80.3%) of samples tested using PCR-based methods contained total coliforms, respectively. Consequently, despite the high genetic variability of the total coliform group, this study demonstrated that it is possible to use molecular assays to detect total coliforms in potable water: the 16S rRNA molecular assay was shown to be as efficient as recommended culture-based methods. This assay might be used in combination with an Escherichia coli molecular assay to assess drinking water quality. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  8. Molecular Method for Detection of Total Coliforms in Drinking Water Samples

    PubMed Central

    Boudreau, Dominique K.; Bisson, Marc-Antoine; Dion-Dupont, Vanessa; Bouchard, Sébastien; Nkuranga, Martine; Bergeron, Michel G.; Rodriguez, Manuel J.

    2014-01-01

    This work demonstrates the ability of a bacterial concentration and recovery procedure combined with three different PCR assays targeting the lacZ, wecG, and 16S rRNA genes, respectively, to detect the presence of total coliforms in 100-ml samples of potable water (presence/absence test). PCR assays were first compared to the culture-based Colilert and MI agar methods to determine their ability to detect 147 coliform strains representing 76 species of Enterobacteriaceae encountered in fecal and environmental settings. Results showed that 86 (58.5%) and 109 (74.1%) strains yielded a positive signal with Colilert and MI agar methods, respectively, whereas the lacZ, wecG, and 16S rRNA PCR assays detected 133 (90.5%), 111 (75.5%), and 146 (99.3%) of the 147 total coliform strains tested. These assays were then assessed by testing 122 well water samples collected in the Québec City region of Canada. Results showed that 97 (79.5%) of the samples tested by culture-based methods and 95 (77.9%), 82 (67.2%), and 98 (80.3%) of samples tested using PCR-based methods contained total coliforms, respectively. Consequently, despite the high genetic variability of the total coliform group, this study demonstrated that it is possible to use molecular assays to detect total coliforms in potable water: the 16S rRNA molecular assay was shown to be as efficient as recommended culture-based methods. This assay might be used in combination with an Escherichia coli molecular assay to assess drinking water quality. PMID:24771030

  9. Recent developments in detection and enumeration of waterborne bacteria: a retrospective minireview.

    PubMed

    Deshmukh, Rehan A; Joshi, Kopal; Bhand, Sunil; Roy, Utpal

    2016-12-01

    Waterborne diseases have emerged as global health problems, and the rapid, sensitive detection of pathogens in environmental water samples is of great importance. Bacterial identification and enumeration in water samples matter because they help to maintain safe drinking water for public consumption. Culture-based methods are laborious and time-consuming, can yield false-positive results, and cannot recover viable but nonculturable (VBNC) microorganisms. Hence, numerous methods have been developed for rapid detection and quantification of waterborne pathogenic bacteria in water. These rapid methods can be classified into nucleic acid-based, immunology-based, and biosensor-based detection methods. This review summarizes the principle and current state of rapid methods for the monitoring and detection of waterborne bacterial pathogens. Rapid methods outlined are polymerase chain reaction (PCR), digital droplet PCR, real-time PCR, multiplex PCR, DNA microarray, next-generation sequencing (pyrosequencing, Illumina technology and genomics), and fluorescence in situ hybridization, which are categorized as nucleic acid-based methods. Enzyme-linked immunosorbent assay (ELISA) and immunofluorescence are classified as immunology-based methods. Optical, electrochemical, and mass-based biosensors are grouped into biosensor-based methods. Overall, these methods are sensitive, specific, time-effective, and important in the prevention and diagnosis of waterborne bacterial diseases. © 2016 The Authors. MicrobiologyOpen published by John Wiley & Sons Ltd.

  10. A general method to determine sampling windows for nonlinear mixed effects models with an application to population pharmacokinetic studies.

    PubMed

    Foo, Lee Kien; McGree, James; Duffull, Stephen

    2012-01-01

    Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.
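
    The notion of a window around an optimal sampling time can be illustrated with a deliberately simple one-parameter example: a single-exponential concentration curve sampled once, where the window collects all times whose information content stays near the optimum. All names and numbers below are ours, and the paper itself works with MCMC over nonlinear mixed effects models rather than this toy:

```python
import math

def info(t, k=0.1, c0=100.0):
    """Fisher-information contribution of one sample at time t for the
    elimination rate k in C(t) = c0 * exp(-k*t) with additive noise:
    proportional to (dC/dk)^2 = (c0 * t * exp(-k*t))^2."""
    return (c0 * t * math.exp(-k * t)) ** 2

def sampling_window(k=0.1, level=0.9, t_grid=None):
    """Interval of times whose information is within `level` of the
    optimum at t* = 1/k."""
    if t_grid is None:
        t_grid = [i * 0.1 for i in range(1, 500)]
    peak = info(1.0 / k, k)
    ts = [t for t in t_grid if info(t, k) >= level * peak]
    return min(ts), max(ts)

lo, hi = sampling_window()  # window bracketing the optimal time t* = 10
```

    Any blood draw inside [lo, hi] then loses at most 10% of the information of the single optimal time point, which is the flexibility-versus-efficiency trade-off the sampling-window approach formalizes.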

  11. Comparing conventional Descriptive Analysis and Napping®-UFP against physiochemical measurements: a case study using apples.

    PubMed

    Pickup, William; Bremer, Phil; Peng, Mei

    2018-03-01

    The extensive time and cost associated with conventional sensory profiling methods have spurred sensory researchers to develop rapid alternatives, such as Napping® with Ultra-Flash Profiling (UFP). Napping®-UFP generates sensory maps by requiring untrained panellists to separate samples based on perceived sensory similarities. Evaluations of this method have been restricted to manufactured/formulated food models and predominantly structured as comparisons against the conventional descriptive method. The present study aims to extend the validation of Napping®-UFP (N = 72) to natural biological products, and to evaluate this method against Descriptive Analysis (DA; N = 8) with physiochemical measurements as an additional evaluative criterion. The results revealed that sample configurations generated by DA and Napping®-UFP were not significantly correlated (RV = 0.425, P = 0.077); however, both were correlated with the product map generated from the instrumental measures (P < 0.05). The findings also showed that sample characterisations from DA and Napping®-UFP were driven by different sensory attributes, indicating potential structural differences between the two methods in configuring samples. Overall, these findings lend support to the extended use of Napping®-UFP for evaluations of natural biological products. Although DA was shown to be a better method for establishing sensory-instrumental relationships, Napping®-UFP exhibited strengths in generating informative sample configurations based on holistic perception of products. © 2017 Society of Chemical Industry.
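
    The RV coefficient reported above (RV = 0.425) measures the similarity of two product configurations, such as a DA map and a Napping®-UFP map, over the same samples. A minimal sketch of the standard definition (our own implementation):

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two data matrices sharing rows (products)
    but possibly differing in columns (attributes or map dimensions):
    RV = tr(Sxy Sxy') / sqrt(tr(Sxx^2) * tr(Syy^2)), after column-centering."""
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Sxy = X.T @ Y
    Sxx = X.T @ X
    Syy = Y.T @ Y
    num = np.trace(Sxy @ Sxy.T)
    den = np.sqrt(np.trace(Sxx @ Sxx) * np.trace(Syy @ Syy))
    return num / den
```

    RV ranges from 0 (orthogonal configurations) to 1 (identical up to rotation and scaling), so the non-significant RV = 0.425 above indicates only moderate agreement between the two sensory maps.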

  12. A simple linear model for estimating ozone AOT40 at forest sites from raw passive sampling data.

    PubMed

    Ferretti, Marco; Cristofolini, Fabiana; Cristofori, Antonella; Gerosa, Giacomo; Gottardini, Elena

    2012-08-01

    A rapid, empirical method is described for estimating weekly AOT40 from ozone concentrations measured with passive samplers at forest sites. The method is based on linear regression and was developed after three years of measurements in Trentino (northern Italy). It was tested against an independent set of data from passive sampler sites across Italy. It provides good weekly estimates compared with those measured by conventional monitors (0.85 ≤ R(2) ≤ 0.97; 97 ≤ RMSE ≤ 302). Estimates obtained using passive sampling at forest sites are comparable to those obtained by another estimation method based on modelling hourly concentrations (R(2) = 0.94; 131 ≤ RMSE ≤ 351). Regression coefficients of passive sampling are similar to those obtained with conventional monitors at forest sites. Testing against an independent dataset generated by passive sampling provided similar results (0.86 ≤ R(2) ≤ 0.99; 65 ≤ RMSE ≤ 478). Errors tend to accumulate when weekly AOT40 estimates are summed to obtain the total AOT40 over the May-July period, and the median deviation between the two estimation methods based on passive sampling is 11%. The proposed method does not require any assumptions, complex calculations or modelling techniques, and can be useful when other estimation methods are not feasible, either in principle or in practice. However, the method is not useful when estimates of hourly concentrations are of interest.
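
    A linear estimator of weekly AOT40 from passive-sampler weekly means can be sketched as below. The calibration numbers are invented for illustration and are not the Trentino regression coefficients:

```python
def fit_line(x, y):
    """Ordinary least squares for y = a + b*x (one predictor)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

# hypothetical calibration data: weekly mean ozone from passive samplers
# (ppb) vs weekly AOT40 (ppb h) from a co-located conventional monitor
conc = [30, 40, 50, 60, 70]
aot40 = [0, 450, 950, 1400, 1900]
a, b = fit_line(conc, aot40)
predicted = a + b * 55  # AOT40 estimate for a week averaging 55 ppb
```

    Once a and b are fixed from a calibration period, each new weekly passive-sampler mean yields an AOT40 estimate directly, with no hourly modelling step; the abstract's caveat is that weekly errors accumulate when such estimates are summed over May-July.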

  13. Determination of five tetracyclines and their epimers by LC-MS/MS based on a liquid-liquid extraction with low temperature partitioning.

    PubMed

    Desmarchelier, Aurélien; Anizan, Sébastien; Minh Tien, Mai; Savoy, Marie-Claude; Bion, Cindy

    2018-04-01

    An LC-MS/MS method is presented for screening five tetracyclines and their epimers in a broad range of food products. The scope of matrices includes meat-, fish- and seafood-based products, various dairy ingredients, infant formulae and fats. The method is based on a liquid-liquid extraction with aqueous ethylenediaminetetraacetic acid (EDTA) and acetonitrile, followed by a freezing step to promote phase separation at low temperature. After defatting with hexane, sample extracts were evaporated and reconstituted before injection onto the LC-MS/MS system. The addition of oxalic acid to the aqueous mobile phase was mandatory to maintain good peak shape and sensitivity over the run. The screening is based upon a double preparation of each sample, one 'as such' and a second with the analytes spiked into the sample, in order to mitigate the risk of false-negative responses. The method was validated according to the European Community Reference Laboratories Residues Guidelines. A total of 93 samples were included in the validation by two independent laboratories, giving false-negative and false-positive rates of 0% for all compounds. Over the last two years, 2600 samples were analysed routinely and only one chicken sample was found above the regulatory limit.

  14. Performance of human fecal anaerobe-associated PCR-based assays in a multi-laboratory method evaluation study

    USGS Publications Warehouse

    Layton, Blythe A.; Cao, Yiping; Ebentier, Darcy L.; Hanley, Kaitlyn; Ballesté, Elisenda; Brandão, João; Byappanahalli, Muruleedhara N.; Converse, Reagan; Farnleitner, Andreas H.; Gentry-Shields, Jennifer; Gourmelon, Michèle; Lee, Chang Soo; Lee, Jiyoung; Lozach, Solen; Madi, Tania; Meijer, Wim G.; Noble, Rachel; Peed, Lindsay; Reischer, Georg H.; Rodrigues, Raquel; Rose, Joan B.; Schriewer, Alexander; Sinigalliano, Chris; Srinivasan, Sangeetha; Stewart, Jill; ,; Laurie, C.; Wang, Dan; Whitman, Richard; Wuertz, Stefan; Jay, Jenny; Holden, Patricia A.; Boehm, Alexandria B.; Shanks, Orin; Griffith, John F.

    2013-01-01

    A number of PCR-based methods for detecting human fecal material in environmental waters have been developed over the past decade, but these methods have rarely received independent comparative testing in large multi-laboratory studies. Here, we evaluated ten of these methods (BacH, BacHum-UCD, Bacteroides thetaiotaomicron (BtH), BsteriF1, gyrB, HF183 endpoint, HF183 SYBR, HF183 Taqman®, HumM2, and Methanobrevibacter smithii nifH (Mnif)) using 64 blind samples prepared in one laboratory. The blind samples contained either one or two fecal sources from human, wastewater or non-human sources. The assay results were assessed for presence/absence of the human markers and also quantitatively while varying the following: 1) classification of samples that were detected but not quantifiable (DNQ) as positive or negative; 2) reference fecal sample concentration unit of measure (such as culturable indicator bacteria, wet mass, total DNA, etc); and 3) human fecal source type (stool, sewage or septage). Assay performance using presence/absence metrics was found to depend on the classification of DNQ samples. The assays that performed best quantitatively varied based on the fecal concentration unit of measure and laboratory protocol. All methods were consistently more sensitive to human stools compared to sewage or septage in both the presence/absence and quantitative analysis. Overall, HF183 Taqman® was found to be the most effective marker of human fecal contamination in this California-based study.

  15. Semi-automated in vivo solid-phase microextraction sampling and the diffusion-based interface calibration model to determine the pharmacokinetics of methoxyfenoterol and fenoterol in rats.

    PubMed

    Yeung, Joanne Chung Yan; de Lannoy, Inés; Gien, Brad; Vuckovic, Dajana; Yang, Yingbo; Bojko, Barbara; Pawliszyn, Janusz

    2012-09-12

    In vivo solid-phase microextraction (SPME) can be used to sample the circulating blood of animals without the need to withdraw a representative blood sample. In this study, in vivo SPME in combination with liquid-chromatography tandem mass spectrometry (LC-MS/MS) was used to determine the pharmacokinetics of two drug analytes, R,R-fenoterol and R,R-methoxyfenoterol, administered as 5 mg kg(-1) i.v. bolus doses to groups of 5 rats. This research illustrates, for the first time, the feasibility of the diffusion-based calibration interface model for in vivo SPME studies. To provide a constant sampling rate as required for the diffusion-based interface model, partial automation of the SPME sampling of the analytes from the circulating blood was accomplished using an automated blood sampling system. The use of the blood sampling system allowed automation of all SPME sampling steps in vivo, except for the insertion and removal of the SPME probe from the sampling interface. The results from in vivo SPME were compared to the conventional method based on blood withdrawal and sample clean up by plasma protein precipitation. Both whole blood and plasma concentrations were determined by the conventional method. The concentrations of methoxyfenoterol and fenoterol obtained by SPME generally concur with the whole blood concentrations determined by the conventional method indicating the utility of the proposed method. The proposed diffusion-based interface model has several advantages over other kinetic calibration models for in vivo SPME sampling including (i) it does not require the addition of a standard into the sample matrix during in vivo studies, (ii) it is simple and rapid and eliminates the need to pre-load appropriate standard onto the SPME extraction phase and (iii) the calibration constant for SPME can be calculated based on the diffusion coefficient, extraction time, fiber length and radius, and size of the boundary layer. 
In the current study, the experimental calibration constants of 338.9±30 mm(-3) and 298.5±25 mm(-3) are in excellent agreement with the theoretical calibration constants of 307.9 mm(-3) and 316.0 mm(-3) for fenoterol and methoxyfenoterol respectively. Copyright © 2012 Elsevier B.V. All rights reserved.
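
    The abstract lists the quantities that enter the diffusion-based calibration constant (diffusion coefficient, extraction time, fiber length and radius, boundary-layer size) but not the formula itself. A minimal sketch, assuming the standard steady-state diffusion model for a cylindrical fiber from the SPME kinetic-calibration literature, B = ln((b + δ)/b) / (2πDLt) — the functional form is our assumption, though it does reproduce the mm(-3) units reported above:

```python
import math

def spme_calibration_constant(D, L, b, delta, t):
    """Diffusion-based calibration constant B (mm^-3) for a cylindrical
    SPME fiber, B = ln((b + delta) / b) / (2 * pi * D * L * t), with
    D: analyte diffusion coefficient in the matrix (mm^2/s),
    L: fiber coating length (mm), b: fiber radius (mm),
    delta: boundary-layer thickness (mm), t: extraction time (s)."""
    return math.log((b + delta) / b) / (2 * math.pi * D * L * t)

def sample_concentration(n_extracted, B):
    # C = n * B: amount extracted (e.g. ng) times the calibration
    # constant gives concentration per mm^3
    return n_extracted * B
```

    Note that B falls with longer extraction times and rises with a thicker boundary layer, consistent with slower analyte uptake.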

  16. An improved initialization center k-means clustering algorithm based on distance and density

    NASA Astrophysics Data System (ADS)

    Duan, Yanling; Liu, Qun; Xia, Shuyin

    2018-04-01

    To address the problem that the random choice of initial cluster centers in the k-means algorithm makes the clustering results sensitive to outlier samples and unstable across repeated runs, an initialization method is proposed that selects centers with both large mutual distance and high density. The reciprocal of the weighted average distance is used to represent sample density, and the samples with larger distance and higher density are selected as the initial cluster centers to optimize the clustering results. A clustering evaluation criterion based on distance and density is then designed to verify the feasibility and practicality of the algorithm; experimental results on UCI data sets show that the algorithm is stable and practical.
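
    The selection rule described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: density here is the reciprocal of the plain mean distance to all other points (the paper uses a weighted average), and candidates are scored by the product of density and distance to the nearest already-chosen center:

```python
import numpy as np

def init_centers(X, k):
    """Pick k initial k-means centers favoring high density and large
    mutual distance (simplified sketch of the paper's idea)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)  # pairwise distances
    density = 1.0 / (D.sum(axis=1) / (len(X) - 1) + 1e-12)     # reciprocal mean distance
    centers = [int(np.argmax(density))]                        # densest point first
    for _ in range(k - 1):
        d_near = D[:, centers].min(axis=1)   # distance to nearest chosen center
        score = d_near * density             # large distance AND high density
        centers.append(int(np.argmax(score)))
    return X[centers]
```

    On two well-separated blobs, the two chosen centers land in different blobs, which is exactly the behavior random initialization cannot guarantee.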

  17. Simple Sodium Dodecyl Sulfate-Assisted Sample Preparation Method for LC-MS-based Proteomic Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Jianying; Dann, Geoffrey P.; Shi, Tujin

    2012-03-10

    Sodium dodecyl sulfate (SDS) is one of the most popular laboratory reagents used for highly efficient biological sample extraction; however, SDS presents a significant challenge to LC-MS-based proteomic analyses due to its severe interference with reversed-phase LC separations and electrospray ionization interfaces. This study reports a simple SDS-assisted proteomic sample preparation method facilitated by a novel peptide-level SDS removal protocol. After SDS-assisted protein extraction and digestion, SDS was effectively (>99.9%) removed from peptides through ion substitution-mediated DS- precipitation with potassium chloride (KCl) followed by ~10 min centrifugation. Excellent peptide recovery (>95%) was observed for less than 20 μg of peptides. Further experiments demonstrated the compatibility of this protocol with LC-MS/MS analyses. The resulting proteome coverage from this SDS-assisted protocol was comparable to or better than those obtained from other standard proteomic preparation methods in both mammalian tissues and bacterial samples. These results suggest that this SDS-assisted protocol is a practical, simple, and broadly applicable proteomic sample processing method, which can be particularly useful when dealing with samples difficult to solubilize by other methods.

  18. Method of plasma etching Ga-based compound semiconductors

    DOEpatents

    Qiu, Weibin; Goddard, Lynford L.

    2012-12-25

    A method of plasma etching Ga-based compound semiconductors includes providing a process chamber and a source electrode adjacent to the process chamber. The process chamber contains a sample comprising a Ga-based compound semiconductor. The sample is in contact with a platen which is electrically connected to a first power supply, and the source electrode is electrically connected to a second power supply. The method includes flowing SiCl4 gas into the chamber, flowing Ar gas into the chamber, and flowing H2 gas into the chamber. RF power is supplied independently to the source electrode and the platen. A plasma is generated based on the gases in the process chamber, and regions of a surface of the sample adjacent to one or more masked portions of the surface are etched to create a substantially smooth etched surface including features having substantially vertical walls beneath the masked portions.

  19. A passive guard for low thermal conductivity measurement of small samples by the hot plate method

    NASA Astrophysics Data System (ADS)

    Jannot, Yves; Degiovanni, Alain; Grigorova-Moutiers, Veneta; Godefroy, Justine

    2017-01-01

    Hot plate methods under steady-state conditions are based on a 1D model to estimate the thermal conductivity, using measurements of the temperatures T0 and T1 on the two sides of the sample and of the heat flux crossing it. To be consistent with the hypothesis of 1D heat flux, either a guarded hot plate apparatus is used or the temperature is measured at the centre of the sample. The latter method can be used only if the thickness/width ratio of the sample is sufficiently low, while the guarded hot plate method requires large samples (typical cross-section of 0.6 × 0.6 m2); neither method is therefore suitable for small-width samples. The method presented in this paper is based on an optimal choice of the temperatures T0 and T1 relative to the ambient temperature Ta, enabling the estimation of the thermal conductivity with a centered hot plate method by applying the 1D heat flux model. It is shown that these optimal values do not depend on the size or the thermal conductivity of the samples (in the range 0.015-0.2 W m-1 K-1), but only on Ta. The experimental results validate the method on several reference samples for thickness/width ratios up to 0.3, thus enabling the measurement of the thermal conductivity of samples with a small cross-section, down to 0.045 × 0.045 m2.
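
    Once the 1D heat flux hypothesis holds, the estimate itself is one line of Fourier's law; a sketch with symbols matching the abstract (q is the measured heat flux density, e the sample thickness):

```python
def thermal_conductivity(q, e, t1, t0):
    """1-D steady-state hot plate estimate: lambda = q * e / (T1 - T0).
    q: heat flux density crossing the sample (W/m^2),
    e: sample thickness (m), t1/t0: hot/cold face temperatures."""
    return q * e / (t1 - t0)
```

    For example, a 20 mm sample carrying 50 W/m^2 across a 20 K difference gives 0.05 W m-1 K-1, within the low-conductivity range the paper targets.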

  20. [Development of selective determination methods for quinones with fluorescence and chemiluminescence detection and their application to environmental and biological samples].

    PubMed

    Kishikawa, Naoya

    2010-10-01

    Quinones are compounds that have various characteristics such as a biological electron transporter, an industrial product and a harmful environmental pollutant. Therefore, an effective determination method for quinones is required in many fields. This review describes the development of sensitive and selective determination methods for quinones based on some detection principles and their application to analyses in environmental, pharmaceutical and biological samples. Firstly, a fluorescence method was developed based on fluorogenic derivatization of quinones and applied to environmental analysis. Secondly, a luminol chemiluminescence method was developed based on generation of reactive oxygen species through the redox cycle of quinone and applied to pharmaceutical analysis. Thirdly, a photo-induced chemiluminescence method was developed based on formation of reactive oxygen species and fluorophore or chemiluminescence enhancer by the photoreaction of quinones and applied to biological and environmental analyses.

  1. Hard-to-reach populations of men who have sex with men and sex workers: a systematic review on sampling methods.

    PubMed

    Barros, Ana B; Dias, Sonia F; Martins, Maria Rosario O

    2015-10-30

    In public health, hard-to-reach populations are often recruited by non-probabilistic sampling methods that produce biased results. In order to overcome this, several sampling methods have been improved and developed in recent years. The aim of this systematic review was to identify all current methods used to survey most-at-risk populations of men who have sex with men and sex workers. The review also aimed to assess if there were any relations between the study populations and the sampling methods used to recruit them. Lastly, we wanted to assess if the number of publications originating in middle and low human development (MLHD) countries had been increasing in recent years. A systematic review was conducted using electronic databases and a total of 268 published studies were included in the analysis. In this review, 11 recruitment methods were identified. Semi-probabilistic methods were used most commonly to survey men who have sex with men, and the use of the Internet was the method that gathered the most respondents. We found that female sex workers were more frequently recruited through non-probabilistic methods than men who have sex with men (odds = 2.2; p < 0.05; confidence interval (CI) [1.1-4.2]). In the last 6 years, the number of studies based in middle and low human development countries increased more than the number of studies based in very high and high human development countries (odds = 2.5; p < 0.05; CI [1.3-4.9]). This systematic literature review identified 11 methods used to sample men who have sex with men and female sex workers. There is an association between the type of sampling method and the population being studied. The number of studies based in middle and low human development countries has increased in the last 6 years of this study.
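
    The odds ratios with confidence intervals quoted above are standard 2×2-table statistics. A sketch of how such an estimate and its Wald interval are computed — the counts in the example are made up for illustration, not taken from the review:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Wald 95%
    confidence interval computed on the log-odds scale."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of ln(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi
```

    An interval whose lower bound exceeds 1 (as in both results above) indicates a statistically significant association at the 5% level.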

  2. Determination of Meteorite Porosity Using Liquid Nitrogen

    NASA Technical Reports Server (NTRS)

    Kohout, T.; Kletetschka, G.; Pesonen, L. J.; Wasilewski, P. J.

    2005-01-01

    We introduce a new non-destructive method for porosity measurement suitable for meteorite samples. The method is a modification of the traditional Archimedean method based on immersion of the sample in a liquid medium such as water or an organic liquid. In our case, we used liquid nitrogen for its chemically inert character.
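
    The abstract does not give the working formula; assuming the usual Archimedean relation between bulk density (mass over envelope volume, here obtained by nitrogen immersion) and grain density, the porosity would be computed as:

```python
def porosity(bulk_density, grain_density):
    """Fractional porosity phi = 1 - rho_bulk / rho_grain, the standard
    relation once both densities are measured (densities in g/cm^3)."""
    return 1.0 - bulk_density / grain_density
```

    E.g. a chondrite with bulk density 2.7 g/cm^3 and grain density 3.0 g/cm^3 has 10% porosity.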

  3. Development of a Simultaneous Extraction and Cleanup Method for Pyrethroid Pesticides from Indoor House Dust Samples

    EPA Science Inventory

    An efficient and reliable analytical method was developed for the sensitive and selective quantification of pyrethroid pesticides (PYRs) in house dust samples. The method is based on selective pressurized liquid extraction (SPLE) of the dust-bound PYRs into dichloromethane (DCM) wi...

  4. Crystallization of hard spheres revisited. I. Extracting kinetics and free energy landscape from forward flux sampling.

    PubMed

    Richard, David; Speck, Thomas

    2018-03-28

    We investigate the kinetics and the free energy landscape of the crystallization of hard spheres from a supersaturated metastable liquid through direct simulations and forward flux sampling. In this first paper, we describe and test two different ways to reconstruct the free energy barriers from the sampled steady-state probability distribution of cluster sizes without sampling the equilibrium distribution. The first method is based on mean first passage times, and the second method is based on splitting probabilities. We verify both methods for a single particle moving in a double-well potential. For the nucleation of hard spheres, these methods allow us to probe a wide range of supersaturations and to reconstruct the kinetics and the free energy landscape from the same simulation. Results are consistent with the scaling predicted by classical nucleation theory, although a quantitative fit requires a rather large effective interfacial tension.
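
    The splitting-probability ingredient can be illustrated on the simplest possible system: a 1-D birth-death chain with absorbing ends, for which the probability of reaching the right boundary before the left one has a textbook closed form. This is a minimal stand-in for the committor used in barrier reconstruction, not the paper's hard-sphere machinery:

```python
import numpy as np

def splitting_probability(w_up, w_down):
    """Splitting probability p[i] of reaching state N before state 0 for
    a 1-D birth-death chain with hop rates w_up[i] (i -> i+1) and
    w_down[i] (i -> i-1) at interior states i = 1..N-1.
    Closed form: p_i proportional to cumulative sums of products of
    rate ratios w_down/w_up."""
    ratios = np.concatenate(([1.0],
                             np.cumprod(np.asarray(w_down, float) /
                                        np.asarray(w_up, float))))
    p = np.concatenate(([0.0], np.cumsum(ratios) / ratios.sum()))
    return p          # p[0] = 0 (absorbed left), p[N] = 1 (absorbed right)
```

    For an unbiased walk the splitting probability is linear in the state index; a bias toward the right boundary pushes it above the diagonal, which is how the barrier shape is read off in practice.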

  5. Crystallization of hard spheres revisited. I. Extracting kinetics and free energy landscape from forward flux sampling

    NASA Astrophysics Data System (ADS)

    Richard, David; Speck, Thomas

    2018-03-01

    We investigate the kinetics and the free energy landscape of the crystallization of hard spheres from a supersaturated metastable liquid through direct simulations and forward flux sampling. In this first paper, we describe and test two different ways to reconstruct the free energy barriers from the sampled steady-state probability distribution of cluster sizes without sampling the equilibrium distribution. The first method is based on mean first passage times, and the second method is based on splitting probabilities. We verify both methods for a single particle moving in a double-well potential. For the nucleation of hard spheres, these methods allow us to probe a wide range of supersaturations and to reconstruct the kinetics and the free energy landscape from the same simulation. Results are consistent with the scaling predicted by classical nucleation theory, although a quantitative fit requires a rather large effective interfacial tension.

  6. Geodesic denoising for optical coherence tomography images

    NASA Astrophysics Data System (ADS)

    Shahrian Varnousfaderani, Ehsan; Vogl, Wolf-Dieter; Wu, Jing; Gerendas, Bianca S.; Simader, Christian; Langs, Georg; Waldstein, Sebastian M.; Schmidt-Erfurth, Ursula

    2016-03-01

    Optical coherence tomography (OCT) is an optical signal acquisition method capturing micrometer-resolution, cross-sectional three-dimensional images. OCT images are used widely in ophthalmology to diagnose and monitor retinal diseases such as age-related macular degeneration (AMD) and glaucoma. While OCT allows the visualization of retinal structures such as vessels and retinal layers, image quality and contrast are reduced by speckle noise, obfuscating small, low-intensity structures and structural boundaries. Existing denoising methods for OCT images may remove clinically significant image features such as texture and boundaries of anomalies. In this paper, we propose a novel patch-based denoising method, Geodesic Denoising. The method reduces noise in OCT images while preserving clinically significant, although small, pathological structures, such as fluid-filled cysts in diseased retinas. Our method selects optimal image patch distribution representations based on geodesic patch similarity to noisy samples. Patch distributions are then randomly sampled to build a set of best matching candidates for every noisy sample, and the denoised value is computed based on a geodesic weighted average of the best candidate samples. Our method is evaluated qualitatively on real pathological OCT scans and quantitatively on a proposed set of ground-truth, noise-free synthetic OCT scans with artificially added noise and pathologies. Experimental results show that the performance of our method is comparable with state-of-the-art denoising methods while outperforming them in preserving critical, clinically relevant structures.
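
    A heavily simplified sketch of the patch-based weighted-averaging idea: each pixel is replaced by an average of nearby pixels weighted by how similar their surrounding patches look. A plain Euclidean patch distance stands in for the paper's geodesic similarity, and the random candidate sampling is omitted, so this is essentially non-local means, shown only to make the mechanism concrete:

```python
import numpy as np

def patch_weighted_denoise(img, patch=3, search=7, h=0.1):
    """Replace each pixel by a patch-similarity-weighted average over a
    local search window (Euclidean patch distance, illustrative only)."""
    pad = patch // 2
    P = np.pad(img, pad, mode='reflect')
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    r = search // 2
    for i in range(H):
        for j in range(W):
            ref = P[i:i + patch, j:j + patch]          # patch around (i, j)
            wsum = vsum = 0.0
            for di in range(max(0, i - r), min(H, i + r + 1)):
                for dj in range(max(0, j - r), min(W, j + r + 1)):
                    cand = P[di:di + patch, dj:dj + patch]
                    w = np.exp(-np.sum((ref - cand) ** 2) / (h * h))
                    wsum += w
                    vsum += w * img[di, dj]
            out[i, j] = vsum / wsum                    # weighted average
    return out
```

    Similar patches get weight near 1 and dissimilar ones are suppressed, which is what lets such methods smooth speckle while leaving genuine boundaries (dissimilar patches) largely intact.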

  7. A high-throughput AO/PI-based cell concentration and viability detection method using the Celigo image cytometry.

    PubMed

    Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean

    2016-10-01

    To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require the measurement of multiple samples and conditions, so current automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required by single-sample-based automated cell counters. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
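
    The per-well arithmetic behind AO/PI counting is simple: AO stains all nucleated cells while PI enters only dead or dying cells, so the live count is the difference. A sketch — the volume and dilution handling here is our illustrative assumption, not the instrument's documented pipeline:

```python
def ao_pi_stats(ao_count, pi_count, volume_ul, dilution=1.0):
    """Cell concentration (cells/mL) and viability (%) from AO/PI
    dual-fluorescence counts taken over the same imaged volume."""
    total_per_ml = ao_count / volume_ul * 1000.0 * dilution
    viability = (ao_count - pi_count) / ao_count * 100.0
    return total_per_ml, viability
```

    For instance, 500 AO-positive and 50 PI-positive cells in 1 µL give 5 × 10^5 cells/mL at 90% viability, the kind of pair of numbers used to normalize downstream assays.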

  8. Component-based subspace linear discriminant analysis method for face recognition with one training sample

    NASA Astrophysics Data System (ADS)

    Huang, Jian; Yuen, Pong C.; Chen, Wen-Sheng; Lai, J. H.

    2005-05-01

    Many face recognition algorithms/systems have been developed in the last decade, and excellent performance has been reported when there is a sufficient number of representative training samples. In many real-life applications such as passport identification, however, only one well-controlled frontal sample image is available for training. Under this situation, the performance of existing algorithms degrades dramatically, or they may not be applicable at all. We propose a component-based linear discriminant analysis (LDA) method to solve the one-training-sample problem. The basic idea of the proposed method is to construct local facial feature component bunches by moving each local feature region in four directions. In this way, we not only generate more samples of lower dimension than the original image, but also account for face detection localization error during training. We then propose a subspace LDA method, tailor-made for a small number of training samples, for the local feature projection to maximize the discrimination power. Theoretical analysis and experimental results show that our proposed subspace LDA is efficient and overcomes the limitations of existing LDA methods. Finally, we combine the contributions of each local component bunch with a weighted combination scheme to reach the recognition decision. The FERET database is used for evaluating the proposed method and results are encouraging.
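
    The "component bunch" construction — one local region plus copies shifted in four directions — can be sketched directly. Region coordinates and the one-pixel shift are illustrative assumptions; the paper does not specify them in the abstract:

```python
import numpy as np

def component_bunch(img, top, left, h, w, shift=1):
    """Build a component bunch for one facial region: the h-by-w region
    itself plus copies shifted up/down/left/right by `shift` pixels,
    simulating localization error and enlarging the training set."""
    offsets = [(0, 0), (-shift, 0), (shift, 0), (0, -shift), (0, shift)]
    bunch = []
    for dy, dx in offsets:
        y, x = top + dy, left + dx
        bunch.append(img[y:y + h, x:x + w].ravel())  # flatten to feature vector
    return np.stack(bunch)      # 5 samples, each h*w features
```

    Each region thus contributes five low-dimensional training vectors instead of one, which is what makes the subsequent per-component LDA feasible with a single original image.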

  9. Validation of High Resolution Melting Analysis (HRM) of the Amplified ITS2 Region for the Detection and Identification of Yeasts from Clinical Samples: Comparison with Culture and MALDI-TOF Based Identification.

    PubMed

    Duyvejonck, Hans; Cools, Piet; Decruyenaere, Johan; Roelens, Kristien; Noens, Lucien; Vermeulen, Stefan; Claeys, Geert; Decat, Ellen; Van Mechelen, Els; Vaneechoutte, Mario

    2015-01-01

    Candida species are known as opportunistic pathogens and a possible cause of invasive infections. Because of their species-specific antimycotic resistance patterns, reliable techniques for their detection, quantification and identification are needed. We validated a DNA amplification method for direct detection of Candida spp. from clinical samples, namely ITS2-High Resolution Melting Analysis (direct method), by comparing it with a culture- and MALDI-TOF mass spectrometry-based method (indirect method) to establish the presence of Candida species in three different types of clinical samples. A total of 347 clinical samples, i.e. throat swabs, rectal swabs and vaginal swabs, were collected from the gynaecology/obstetrics, intensive care and haematology wards at the Ghent University Hospital, Belgium. For the direct method, ITS2-HRM was preceded by NucliSENS easyMAG DNA extraction directly on the clinical samples. For the indirect method, clinical samples were cultured on Candida ID and individual colonies were identified by MALDI-TOF. For 83.9% of the samples there was complete concordance between both techniques, i.e. the same Candida species were detected in 31.1% of the samples or no Candida species were detected in 52.8% of the samples. In 16.1% of the clinical samples, discrepant results were obtained, of which only 6.01% were considered major discrepancies. Discrepancies occurred mostly when the overall number of Candida cells in the sample was low and/or when multiple species were present in the sample. Most of the discrepancies could be resolved in favor of the direct method, owing to samples in which no yeast could be cultured although low amounts were detected by the direct method, and samples in which high quantities of Candida robusta according to ITS2-HRM were missed by culture on Candida ID agar. It remains to be decided whether the diagnostic advantages of the direct method compensate for its disadvantages.

  10. The unbiasedness of a generalized mirage boundary correction method for Monte Carlo integration estimators of volume

    Treesearch

    Thomas B. Lynch; Jeffrey H. Gove

    2014-01-01

    The typical "double counting" application of the mirage method of boundary correction cannot be applied to sampling systems such as critical height sampling (CHS) that are based on a Monte Carlo sample of a tree (or debris) attribute because the critical height (or other random attribute) sampled from a mirage point is generally not equal to the critical...

  11. Viability qPCR, a new tool for Legionella risk management.

    PubMed

    Lizana, X; López, A; Benito, S; Agustí, G; Ríos, M; Piqué, N; Marqués, A M; Codony, F

    2017-11-01

    Viability quantitative Polymerase Chain Reaction (v-qPCR) is a recent analytical approach for detecting only live microorganisms by DNA amplification-based methods. This approach is based on the use of a reagent that irreversibly fixes the DNA of dead cells. In this study, we evaluate the utility of v-qPCR versus the culture method for Legionellosis risk management. The present study was performed using 116 real samples. Water samples were simultaneously analysed by culture, v-qPCR and qPCR methods. Results were compared by means of a non-parametric test. Both methods (culture and v-qPCR) gave positive results in 11.6% of samples and negative results in 50.0% of samples. As expected, equivalence between methods was not observed in all cases: 32.1% of samples were positive by v-qPCR but negative by culture. Only in 6.3% of samples, with very low Legionella levels, was culture positive and v-qPCR negative. In 3.5% of samples, overgrowth of other bacteria did not allow performing the culture. When comparing both methods, significant differences between culture and v-qPCR were found in the samples belonging to the cooling towers-evaporative condensers group, where the v-qPCR method detected greater presence and higher concentrations of Legionella spp. (p<0.001). Otherwise, no significant differences between methods were found in the rest of the groups. The v-qPCR method can be used as a quick tool to evaluate Legionellosis risk, especially in cooling towers-evaporative condensers, where this technique can detect higher levels than culture. The combined interpretation of PCR results along with the ratio of live cells is proposed as a tool for understanding the sample context and estimating the Legionellosis risk potential according to 4 levels of hierarchy. Copyright © 2017 Elsevier GmbH. All rights reserved.

  12. Fast Bayesian experimental design: Laplace-based importance sampling for the expected information gain

    NASA Astrophysics Data System (ADS)

    Beck, Joakim; Dia, Ben Mansour; Espath, Luis F. R.; Long, Quan; Tempone, Raúl

    2018-06-01

    In calculating expected information gain in optimal Bayesian experimental design, the computation of the inner loop in the classical double-loop Monte Carlo requires a large number of samples and suffers from underflow if the number of samples is small. These drawbacks can be avoided by using an importance sampling approach. We present a computationally efficient method for optimal Bayesian experimental design that introduces importance sampling based on the Laplace method to the inner loop. We derive the optimal values for the method parameters in which the average computational cost is minimized according to the desired error tolerance. We use three numerical examples to demonstrate the computational efficiency of our method compared with the classical double-loop Monte Carlo, and a more recent single-loop Monte Carlo method that uses the Laplace method as an approximation of the return value of the inner loop. The first example is a scalar problem that is linear in the uncertain parameter. The second example is a nonlinear scalar problem. The third example deals with the optimal sensor placement for an electrical impedance tomography experiment to recover the fiber orientation in laminate composites.
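
    The classical double-loop estimator that the paper improves on can be sketched for a generic model; the Laplace-based importance sampling itself is not reproduced here. The model functions (`simulate`, `loglik`, `prior_sample`) are user-supplied placeholders, and the inner log-average uses log-sum-exp to avoid the underflow mentioned above:

```python
import numpy as np

def eig_double_loop(simulate, loglik, prior_sample, N=500, M=500, rng=None):
    """Classical double-loop Monte Carlo estimate of expected information
    gain: EIG ~ (1/N) sum_n [ log p(y_n|theta_n)
                              - log (1/M) sum_m p(y_n|theta_m) ]."""
    rng = rng or np.random.default_rng(0)
    total = 0.0
    for _ in range(N):
        theta = prior_sample(rng)                  # outer prior draw
        y = simulate(theta, rng)                   # simulated experiment outcome
        inner = np.array([loglik(y, prior_sample(rng)) for _ in range(M)])
        log_evidence = np.logaddexp.reduce(inner) - np.log(M)  # stable log-mean
        total += loglik(y, theta) - log_evidence
    return total / N
```

    The N*M inner evaluations are exactly the cost the Laplace-based importance sampling of the paper reduces: a well-placed importance density lets M be orders of magnitude smaller for the same error.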

  13. Evaluation of commercial kit based on loop-mediated isothermal amplification for rapid detection of low levels of uninjured and injured Salmonella on duck meat, bean sprouts, and fishballs in Singapore.

    PubMed

    Lim, Hazel Sin Yue; Zheng, Qianwang; Miks-Krajnik, Marta; Turner, Matthew; Yuk, Hyun-Gyun

    2015-06-01

    The objective of this study was to evaluate performance of the commercial kit based on loop-mediated isothermal amplification (LAMP) in comparison with the International Organization for Standardization method for detecting uninjured and sublethally injured Salmonella cells artificially inoculated at levels of 10(0) and 10(1) CFU/25 g on raw duck wing, raw mung bean sprouts, and processed fishballs. Injured cells were prepared by a heat treatment for duck wings and fishball samples and a chlorine treatment for bean sprout samples. Additionally, a validation study was performed on naturally contaminated food samples sold in Singapore. A total of 110 samples of each commodity were analyzed in this study. Regardless of inoculum levels, the detection by the commercial LAMP kit showed 100% sensitivity and specificity for both inoculated and uninoculated samples compared with the International Organization for Standardization method, with the exception of bean sprout samples. Only 20% of bean sprout samples inoculated with 10(0) CFU/25 g injured Salmonella cells were positive by using the commercial LAMP-based kit. However, all negative samples became positive following a secondary enrichment in Rappaport-Vassiliadis medium with soy broth or after concentration by centrifugation. These results suggest that secondary enrichment or centrifugation should be considered as an additional step to increase the sensitivity of the commercial LAMP-based kit with low numbers of injured target cells in samples with high background microflora (such as mung bean sprouts). The validation study also showed that the commercial LAMP-based kit provided 91% sensitivity and 95% specificity for naturally contaminated samples. 
Thus, this study demonstrates that the commercial LAMP-based kit might be a cost-effective method, as this system could provide rapid, accurate detection of both uninjured and injured Salmonella cells on raw duck wings, raw mung bean sprouts, and processed fishballs in less than 26 h.
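
    The validation figures above (91% sensitivity, 95% specificity against the reference method) reduce to standard confusion-matrix arithmetic; a sketch with illustrative counts:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of reference-positive
    samples the kit detects. Specificity = TN / (TN + FP): fraction of
    reference-negative samples it correctly calls negative."""
    return tp / (tp + fn), tn / (tn + fp)
```

    A secondary enrichment step, as recommended above, raises sensitivity by converting false negatives into true positives without touching specificity.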

  14. Analysis of nitrosamines in water by automated SPE and isotope dilution GC/HRMS Occurrence in the different steps of a drinking water treatment plant, and in chlorinated samples from a reservoir and a sewage treatment plant effluent.

    PubMed

    Planas, Carles; Palacios, Oscar; Ventura, Francesc; Rivera, Josep; Caixach, Josep

    2008-08-15

    A method based on automated solid-phase extraction (SPE) and isotope dilution gas chromatography/high resolution mass spectrometry (GC/HRMS) has been developed for the analysis of nine nitrosamines in water samples. The combination of automated SPE and GC/HRMS for the analysis of nitrosamines has not been reported previously. The method combines the selectivity and sensitivity of GC/HRMS analysis with the high efficiency of automated SPE on coconut charcoal EPA 521 cartridges. Low method detection limits (MDLs) were achieved, along with a simpler procedure and less dependence on the operator compared with methods based on manual SPE. Quality requirements for isotope dilution-based methods were met for most analysed nitrosamines with regard to trueness (80-120%), method precision (<15%) and MDLs (0.08-1.7 ng/L). Nineteen water samples (16 samples from a drinking water treatment plant (DWTP), 2 chlorinated samples from a sewage treatment plant (STP) effluent, and 1 chlorinated sample from a reservoir) were analysed. Concentrations of nitrosamines in the STP effluent were 309.4 and 730.2 ng/L, the higher value corresponding to the higher dose of applied chlorine. N-Nitrosodimethylamine (NDMA) and N-nitrosodiethylamine (NDEA) were the main compounds identified in the STP effluent, and NDEA was detected above 200 ng/L, the regulatory level stated for NDMA in effluents in Ontario (Canada). Lower concentrations of nitrosamines were found in the reservoir (20.3 ng/L) and in the DWTP samples (n.d.-28.6 ng/L). NDMA and NDEA were respectively found in the reservoir and in treated and highly chlorinated DWTP samples at concentrations above 10 ng/L (the guide value established in different countries). The highest concentrations of nitrosamines were found after chlorination and ozonation processes (ozonated, treated and highly chlorinated water) in DWTP samples.
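
    A minimal sketch of the isotope-dilution quantification step, assuming the common peak-area-ratio form with a relative response factor (RRF) from calibration; the paper's exact calibration model is not given in the abstract, so the function and its parameters are illustrative:

```python
def isotope_dilution_conc(area_native, area_labeled, spike_ng, rrf, volume_l):
    """Concentration (ng/L) of a native analyte from its peak-area ratio
    to the co-extracted isotope-labeled standard: because standard and
    analyte suffer identical losses, the ratio is recovery-independent."""
    amount_ng = (area_native / area_labeled) * spike_ng / rrf
    return amount_ng / volume_l
```

    This recovery independence is precisely why isotope-dilution methods can meet the 80-120% trueness requirement cited above even with imperfect SPE recoveries.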

  15. Simultaneous determination of sample thickness, tilt, and electron mean free path using tomographic tilt images based on Beer-Lambert law

    PubMed Central

    Yan, Rui; Edwards, Thomas J.; Pankratz, Logan M.; Kuhn, Richard J.; Lanman, Jason K.; Liu, Jun; Jiang, Wen

    2015-01-01

    Cryo-electron tomography (cryo-ET) is an emerging technique that can elucidate the architecture of macromolecular complexes and cellular ultrastructure in a near-native state. Some important sample parameters, such as thickness and tilt, are needed for 3-D reconstruction; however, these parameters can currently only be determined using trial 3-D reconstructions. An accurate electron mean free path plays a significant role in modeling the image formation process, which is essential for simulating electron microscopy images and for model-based iterative 3-D reconstruction methods; however, its value is voltage- and sample-dependent and has only been experimentally measured for a limited number of sample conditions. Here, we report a computational method, tomoThickness, based on the Beer-Lambert law, to simultaneously determine the sample thickness, tilt, and electron inelastic mean free path by solving an overdetermined nonlinear least-squares optimization problem that exploits the strong constraints of tilt relationships. The method has been extensively tested with both stained and cryo datasets. The fitted electron mean free paths are consistent with reported experimental measurements. The accurate thickness estimation eliminates the need for a generous assignment of the Z-dimension size of the tomogram. Interestingly, we have also found that nearly all samples are tilted by a few degrees relative to the electron beam. Compensating for this intrinsic sample tilt results in horizontal structures and a reduced Z-dimension of tomograms. Our fast, pre-reconstruction method can thus provide important sample parameters that can help improve the performance of tomographic reconstruction for a wide range of samples. PMID:26433027

  16. Simultaneous determination of sample thickness, tilt, and electron mean free path using tomographic tilt images based on Beer-Lambert law.

    PubMed

    Yan, Rui; Edwards, Thomas J; Pankratz, Logan M; Kuhn, Richard J; Lanman, Jason K; Liu, Jun; Jiang, Wen

    2015-11-01

    Cryo-electron tomography (cryo-ET) is an emerging technique that can elucidate the architecture of macromolecular complexes and cellular ultrastructure in a near-native state. Some important sample parameters, such as thickness and tilt, are needed for 3-D reconstruction; however, these parameters can currently only be determined using trial 3-D reconstructions. An accurate electron mean free path plays a significant role in modeling the image formation process, which is essential for simulating electron microscopy images and for model-based iterative 3-D reconstruction methods; however, its value is voltage- and sample-dependent and has only been experimentally measured for a limited number of sample conditions. Here, we report a computational method, tomoThickness, based on the Beer-Lambert law, to simultaneously determine the sample thickness, tilt, and electron inelastic mean free path by solving an overdetermined nonlinear least-squares optimization problem that exploits the strong constraints of tilt relationships. The method has been extensively tested with both stained and cryo datasets. The fitted electron mean free paths are consistent with reported experimental measurements. The accurate thickness estimation eliminates the need for a generous assignment of the Z-dimension size of the tomogram. Interestingly, we have also found that nearly all samples are tilted by a few degrees relative to the electron beam. Compensating for this intrinsic sample tilt results in horizontal structures and a reduced Z-dimension of tomograms. Our fast, pre-reconstruction method can thus provide important sample parameters that can help improve the performance of tomographic reconstruction for a wide range of samples. Copyright © 2015 Elsevier Inc. All rights reserved.
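
    The Beer-Lambert tilt relationship behind the method can be sketched in a simplified single-axis form: the log attenuation grows as 1/cos of the effective tilt, so a tilt series overdetermines both the thickness-to-mean-free-path ratio and the intrinsic sample tilt. The grid-search fit below is our simplification of the paper's nonlinear least-squares solve, not its implementation:

```python
import numpy as np

def fit_tilt_series(angles_deg, log_attenuation):
    """Fit log(I0/I) = k / cos(angle - tilt0) over a tilt series, where
    k = thickness / mean free path and tilt0 is the intrinsic sample
    tilt. Grid search over tilt0; k then follows by linear least squares."""
    a = np.radians(np.asarray(angles_deg, float))
    y = np.asarray(log_attenuation, float)
    best = (np.inf, None, None)
    for t0 in np.radians(np.linspace(-10, 10, 201)):   # tilt0 candidates, 0.1 deg steps
        x = 1.0 / np.cos(a - t0)
        k = (x @ y) / (x @ x)                          # least-squares slope through origin
        resid = np.sum((y - k * x) ** 2)
        if resid < best[0]:
            best = (resid, k, np.degrees(t0))
    return best[1], best[2]        # (k = t/MFP, intrinsic tilt in degrees)
```

    On synthetic data the fit recovers both parameters, mirroring the paper's observation that a few degrees of intrinsic tilt are detectable directly from the tilt images.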

  17. Direct determination of neonicotinoid insecticides in an analytically challenging crop such as Chinese chives using selective ELISAs.

    PubMed

    Watanabe, Eiki; Miyake, Shiro

    2018-06-05

    Easy-to-use commercial kit-based enzyme-linked immunosorbent assays (ELISAs) were used to detect the neonicotinoids dinotefuran, clothianidin and imidacloprid in Chinese chives, which are considered a troublesome matrix for chromatographic techniques. Because of the high water solubility of these insecticides, water was used as the extractant. Matrix interference could be substantially avoided simply by diluting the sample extracts. Average recoveries of the insecticides from spiked samples were 85-113%, with relative standard deviations of <15%. The concentrations detected in the spiked samples by the proposed ELISA methods correlated well with those obtained by the reference high-performance liquid chromatography (HPLC) method. The residues found by the ELISA methods were consistently 1.24 times those found by the HPLC method, attributable to loss of analyte during sample clean-up for the HPLC analyses. These results show that the ELISA methods can easily be applied to pesticide residue analysis in troublesome matrices such as Chinese chives.

  18. Development and validation of a 48-target analytical method for high-throughput monitoring of genetically modified organisms.

    PubMed

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-05

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic-array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the amount of template before performing the dynamic chip assays. This dynamic-chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection.

  19. Development and Validation of A 48-Target Analytical Method for High-throughput Monitoring of Genetically Modified Organisms

    PubMed Central

    Li, Xiaofei; Wu, Yuhua; Li, Jun; Li, Yunjing; Long, Likun; Li, Feiwu; Wu, Gang

    2015-01-01

    The rapid increase in the number of genetically modified (GM) varieties has led to a demand for high-throughput methods to detect genetically modified organisms (GMOs). We describe a new dynamic-array-based high-throughput method to simultaneously detect 48 targets in 48 samples on a Fluidigm system. The test targets included species-specific genes, common screening elements, most of the Chinese-approved GM events, and several unapproved events. The 48 TaqMan assays successfully amplified products from both single-event samples and complex samples with a GMO DNA amount of 0.05 ng, and displayed high specificity. To improve the sensitivity of detection, a preamplification step for the 48 pooled targets was added to enrich the amount of template before performing the dynamic chip assays. This dynamic-chip-based method allowed the synchronous high-throughput detection of multiple targets in multiple samples. Thus, it represents an efficient, qualitative method for GMO multi-detection. PMID:25556930

  20. Semiautomated Device for Batch Extraction of Metabolites from Tissue Samples

    PubMed Central

    2012-01-01

    Metabolomics has become a mainstream analytical strategy for investigating metabolism. The quality of data derived from these studies is proportional to the consistency of the sample preparation. Although considerable research has been devoted to finding optimal extraction protocols, most of the established methods require extensive sample handling. Manual sample preparation can be highly effective in the hands of skilled technicians, but an automated tool for purifying metabolites from complex biological tissues would be of obvious utility to the field. Here, we introduce the semiautomated metabolite batch extraction device (SAMBED), a new tool designed to simplify metabolomics sample preparation. We discuss SAMBED’s design and show that SAMBED-based extractions are of comparable quality to extracts produced through traditional methods (13% mean coefficient of variation from SAMBED versus 16% from manual extractions). Moreover, we show that aqueous SAMBED-based methods can be completed in less than a quarter of the time required for manual extractions. PMID:22292466

  1. Development and validation of a SYBR Green I-based real-time polymerase chain reaction method for detection of haptoglobin gene deletion in clinical materials.

    PubMed

    Soejima, Mikiko; Tsuchiya, Yuji; Egashira, Kouichi; Kawano, Hiroyuki; Sagawa, Kimitaka; Koda, Yoshiro

    2010-06-01

    Anhaptoglobinemic patients run the risk of a severe anaphylactic transfusion reaction because they can produce antibodies against serum haptoglobin (Hp). Being homozygous for the Hp gene deletion (HP(del)) is the only known cause of congenital anhaptoglobinemia, and clinical diagnosis of HP(del) before transfusion is important to prevent anaphylactic shock. We recently developed a 5'-nuclease (TaqMan) real-time polymerase chain reaction (PCR) method. Here, a SYBR Green I-based duplex real-time PCR assay using two forward primers and a common reverse primer, followed by melting curve analysis, was developed to determine HP(del) zygosity in a single tube. In addition, to obviate initial DNA extraction, we examined serially diluted blood samples as PCR templates. Allelic discrimination of HP(del) yielded optimal results at blood sample dilutions of 1:64 to 1:1024. The results from 2231 blood samples were fully concordant with those obtained by the TaqMan-based real-time PCR method, and the detection rate of the HP(del) allele by the SYBR Green I-based method is comparable with that of the TaqMan-based method. This method is readily applicable owing to its low initial cost, can be run on economical real-time PCR machines, and is suitable for high-throughput analysis as an alternative method for allelic discrimination of HP(del).

  2. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    The nuclear reaction analysis technique is mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for cross sections evaluated from systematic studies, we used the neutron activation analysis (NAA) technique to determine the concentrations of the various constituents of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental equation of activation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. The absolute method allows measurements as accurate as the relative method. Our results showed that the values obtained by the absolute method are as precise as those from the relative method, which requires a standard sample for each element to be quantified.
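    The "fundamental equation of activation" referred to above takes, in its common textbook form, the following shape (symbols are the standard ones, not taken from this abstract: sample mass m, molar mass M, isotopic abundance θ, activation cross section σ, neutron flux φ, detector efficiency ε, gamma emission probability I_γ, decay constant λ, and irradiation, decay and counting times t_i, t_d, t_c):

```latex
% Net photopeak counts for an irradiated sample (standard NAA form):
C \;=\; \frac{N_A\, m\, \theta}{M}\;\sigma\,\phi\;\varepsilon\, I_\gamma\;
        \left(1 - e^{-\lambda t_i}\right)\, e^{-\lambda t_d}\;
        \frac{1 - e^{-\lambda t_c}}{\lambda}
```

    Solving this equation for m yields the element mass directly from the measured counts, which is why the absolute method needs no standard sample.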

  3. Comparison of DNA preservation methods for environmental bacterial community samples.

    PubMed

    Gray, Michael A; Pratte, Zoe A; Kellogg, Christina A

    2013-02-01

    Field collections of environmental samples, for example corals, for molecular microbial analyses present distinct challenges. The lack of laboratory facilities in remote locations is common, and preservation of microbial community DNA for later study is critical. A particular challenge is keeping samples frozen in transit. Five nucleic acid preservation methods that do not require cold storage were compared for effectiveness over time and ease of use. Mixed microbial communities of known composition were created and preserved by DNAgard(™), RNAlater(®), DMSO-EDTA-salt (DESS), FTA(®) cards, and FTA Elute(®) cards. Automated ribosomal intergenic spacer analysis and clone libraries were used to detect specific changes in the faux communities over weeks and months of storage. A previously known bias in FTA(®) cards that results in lower recovery of pure cultures of Gram-positive bacteria was also detected in mixed community samples. There appears to be a uniform bias across all five preservation methods against microorganisms with high G + C DNA. Overall, the liquid-based preservatives (DNAgard(™), RNAlater(®), and DESS) outperformed the card-based methods. No single liquid method clearly outperformed the others, leaving method choice to be based on experimental design, field facilities, shipping constraints, and allowable cost. © 2012 Federation of European Microbiological Societies. Published by Blackwell Publishing Ltd. All rights reserved.

  4. Sodium and potassium content of 24 h urinary collections: a comparison between field- and laboratory-based analysers.

    PubMed

    Yin, Xuejun; Neal, Bruce; Tian, Maoyi; Li, Zhifang; Petersen, Kristina; Komatsu, Yuichiro; Feng, Xiangxian; Wu, Yangfeng

    2018-04-01

    Measurement of mean population Na and K intakes typically uses laboratory-based assays, which can add significant logistical burden and costs. A valid field-based measurement method would be a significant advance. In the current study, we used 24 h urine samples to compare estimates of Na, K and Na:K ratio based upon assays done using the field-based Horiba twin meter v. laboratory-based methods. The performance of the Horiba twin meter was determined by comparing field-based estimates of mean Na and K against those obtained using laboratory-based methods. The reported 95 % limits of agreement of Bland-Altman plots were calculated based on a regression approach for non-uniform differences. The 24 h urine samples were collected as part of an ongoing study being done in rural China. One hundred and sixty-six complete 24 h urine samples were qualified for estimating 24 h urinary Na and K excretion. Mean Na and K excretion were estimated as 170·4 and 37·4 mmol/d, respectively, using the meter-based assays; and 193·4 and 43·8 mmol/d, respectively, using the laboratory-based assays. There was excellent relative reliability (intraclass correlation coefficient) for both Na (0·986) and K (0·986). Bland-Altman plots showed moderate-to-good agreement between the two methods. Na and K intake estimations were moderately underestimated using assays based upon the Horiba twin meter. Compared with standard laboratory-based methods, the portable device was more practical and convenient.
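    The agreement statistics described here reduce to a few lines. A minimal sketch of the classical Bland-Altman bias and 95 % limits of agreement (the uniform-difference version; note the paper itself used a regression approach for non-uniform differences):

```python
import numpy as np

def bland_altman(field, lab):
    """Bias and classical 95% limits of agreement between two measurement methods."""
    field, lab = np.asarray(field, float), np.asarray(lab, float)
    diff = field - lab                       # per-sample method difference
    bias = diff.mean()                       # mean difference (bias)
    sd = diff.std(ddof=1)                    # SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

For the non-uniform case, the differences would instead be regressed on the pairwise means before computing the limits.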

  5. Efficiently sampling conformations and pathways using the concurrent adaptive sampling (CAS) algorithm

    NASA Astrophysics Data System (ADS)

    Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.

    2017-08-01

    Molecular dynamics simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules, but they are limited by the time scale barrier. That is, we may not obtain properties efficiently because we need to run microsecond or longer simulations using femtosecond time steps. To overcome this time scale barrier, we can use the weighted ensemble (WE) method, a powerful enhanced sampling method that efficiently samples thermodynamic and kinetic properties. However, the WE method requires an appropriate partitioning of phase space into discrete macrostates, which can be problematic when we have a high-dimensional collective space or when little is known a priori about the molecular system. Hence, we developed a new WE-based method, called the "Concurrent Adaptive Sampling (CAS) algorithm," to tackle these issues. The CAS algorithm is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and adaptive macrostates to enhance the sampling in the high-dimensional space. This is especially useful for systems in which we do not know what the right reaction coordinates are, in which case we can use many collective variables to sample conformations and pathways. In addition, a clustering technique based on the committor function is used to accelerate sampling the slowest process in the molecular system. In this paper, we introduce the new method and show results from two-dimensional models and bio-molecules, specifically penta-alanine and a triazine trimer.
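    The split/merge bookkeeping at the heart of any WE-based scheme (including CAS) conserves total probability weight while equalizing the number of walkers per macrostate. A hedged sketch of that step, not the CAS implementation itself:

```python
import random

def resample_bin(walkers, target):
    """Split/merge walkers (state, weight) within one macrostate to reach
    `target` copies, conserving total probability weight (WE resampling)."""
    walkers = sorted(walkers, key=lambda w: w[1], reverse=True)
    while len(walkers) < target:              # split the heaviest walker in two
        state, wt = walkers.pop(0)
        walkers += [(state, wt / 2), (state, wt / 2)]
        walkers.sort(key=lambda w: w[1], reverse=True)
    while len(walkers) > target:              # merge the two lightest walkers,
        (s1, w1), (s2, w2) = walkers.pop(), walkers.pop()
        keep = s1 if random.random() < w1 / (w1 + w2) else s2
        walkers.append((keep, w1 + w2))       # keeping one, weight-proportionally
        walkers.sort(key=lambda w: w[1], reverse=True)
    return walkers
```

Because weight is only ever halved or summed, estimates of rate constants and populations remain unbiased under this resampling.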

  6. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    PubMed

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  7. Correcting for the influence of sampling conditions on biomarkers of exposure to phenols and phthalates: a 2-step standardization method based on regression residuals.

    PubMed

    Mortamais, Marion; Chevrier, Cécile; Philippat, Claire; Petit, Claire; Calafat, Antonia M; Ye, Xiaoyun; Silva, Manori J; Brambilla, Christian; Eijkemans, Marinus J C; Charles, Marie-Aline; Cordier, Sylvaine; Slama, Rémy

    2012-04-26

    Environmental epidemiology and biomonitoring studies typically rely on biological samples to assay the concentration of non-persistent exposure biomarkers. Between-participant variations in the sampling conditions of these biological samples constitute a potential source of exposure misclassification, yet few studies have attempted to correct biomarker levels for this error. We aimed to assess the influence of sampling conditions on concentrations of urinary biomarkers of select phenols and phthalates, two widely produced families of chemicals, and to standardize biomarker concentrations on sampling conditions. Urine samples were collected between 2002 and 2006 among 287 pregnant women from the Eden and Pélagie cohorts, from which phthalate and phenol metabolite levels were assayed. We applied a 2-step standardization method based on regression residuals. First, the influence of sampling conditions (including sampling hour and duration of storage before freezing) and of creatinine levels on biomarker concentrations was characterized using adjusted linear regression models. In the second step, the model estimates were used to remove the variability in biomarker concentrations due to sampling conditions and to standardize concentrations as if all samples had been collected under the same conditions (e.g., the same hour of urine collection). Sampling hour was associated with concentrations of several exposure biomarkers. After standardization for sampling conditions, median concentrations differed by −38% for 2,5-dichlorophenol to +80% for a metabolite of diisodecyl phthalate. However, at the individual level, standardized biomarker levels were strongly correlated (correlation coefficients above 0.80) with unstandardized measures. Sampling conditions, such as sampling hour, should be systematically collected in biomarker-based studies, in particular when the biomarker half-life is short.
The 2-step standardization method based on regression residuals that we proposed in order to limit the impact of heterogeneity in sampling conditions could be further tested in studies describing levels of biomarkers or their influence on health.
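    The two steps described above amount to fitting a linear model of (log) concentration on the sampling covariates and re-adding each sample's residual to the prediction at a fixed reference condition. A minimal single-covariate sketch (variable names and the reference hour are illustrative, not from the paper):

```python
import numpy as np

def standardize(conc, conditions, reference):
    """Two-step standardization on regression residuals (sketch).
    conc: biomarker concentrations (e.g. log scale); conditions: sampling
    covariate(s) per sample; reference: common condition to standardize to."""
    X = np.column_stack([np.ones(len(conditions)), conditions])
    beta, *_ = np.linalg.lstsq(X, conc, rcond=None)   # step 1: fit the model
    resid = conc - X @ beta                            # subject-specific residuals
    x_ref = np.concatenate([[1.0], np.atleast_1d(reference)])
    return resid + x_ref @ beta                        # step 2: re-anchor at reference
```

The standardized values thus keep between-subject contrasts (the residuals) while removing variability attributable to the sampling conditions.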

  8. Robust kernel collaborative representation for face recognition

    NASA Astrophysics Data System (ADS)

    Huang, Wei; Wang, Xiaohui; Ma, Yanbo; Jiang, Yuzheng; Zhu, Yinghui; Jin, Zhong

    2015-05-01

    One of the greatest challenges of representation-based face recognition is that the training samples are usually insufficient. In other words, the training set usually does not include enough samples to capture the varieties of high-dimensional face images caused by illuminations, facial expressions, and postures. When the test sample is significantly different from the training samples of the same subject, recognition performance is sharply reduced. We propose a robust kernel collaborative representation based on virtual samples for face recognition. The virtual training set conveys some reasonable and possible variations of the original training samples; hence, we design a new objective function to more closely match the representation coefficients generated from the original and virtual training sets. To further improve robustness, we implement the corresponding representation-based face recognition in kernel space. Notably, any kind of virtual training sample can be used in our method. We use noised face images to obtain virtual face samples, since the noise can be approximately viewed as a reflection of the varieties of illuminations, facial expressions, and postures. Our work thus offers a simple and feasible way to obtain virtual face samples: imposing Gaussian noise (or other types of noise) on the original training samples to generate possible variations of them. Experimental results on the FERET, Georgia Tech, and ORL face databases show that the proposed method is more robust than two state-of-the-art face recognition methods, CRC and kernel CRC.

  9. Magnetic Stirrer Method for the Detection of Trichinella Larvae in Muscle Samples.

    PubMed

    Mayer-Scholl, Anne; Pozio, Edoardo; Gayda, Jennifer; Thaben, Nora; Bahn, Peter; Nöckler, Karsten

    2017-03-03

    Trichinellosis is a debilitating disease in humans and is caused by the consumption of raw or undercooked meat of animals infected with the nematode larvae of the genus Trichinella. The most important sources of human infections worldwide are game meat and pork or pork products. In many countries, the prevention of human trichinellosis is based on the identification of infected animals by means of the artificial digestion of muscle samples from susceptible animal carcasses. There are several methods based on the digestion of meat but the magnetic stirrer method is considered the gold standard. This method allows the detection of Trichinella larvae by microscopy after the enzymatic digestion of muscle samples and subsequent filtration and sedimentation steps. Although this method does not require special and expensive equipment, internal controls cannot be used. Therefore, stringent quality management should be applied throughout the test. The aim of the present work is to provide detailed handling instructions and critical control points of the method to analysts, based on the experience of the European Union Reference Laboratory for Parasites and the National Reference Laboratory of Germany for Trichinella.

  10. An Improved Quantitative Real-Time PCR Assay for the Enumeration of Heterosigma akashiwo (Raphidophyceae) Cysts Using a DNA Debris Removal Method and a Cyst-Based Standard Curve.

    PubMed

    Kim, Joo-Hwan; Kim, Jin Ho; Wang, Pengbin; Park, Bum Soo; Han, Myung-Soo

    2016-01-01

    The identification and quantification of Heterosigma akashiwo cysts in sediments by light microscopy can be difficult due to the small size and morphology of the cysts, which are often indistinguishable from those of other types of algae. Quantitative real-time PCR (qPCR) based assays represent a potentially efficient method for quantifying the abundance of H. akashiwo cysts, although standard curves must be based on cyst DNA rather than on vegetative cell DNA due to differences in gene copy number and DNA extraction yield between these two cell types. Furthermore, qPCR on sediment samples can be complicated by the presence of extracellular DNA debris. To solve these problems, we constructed a cyst-based standard curve and developed a simple method for removing DNA debris from sediment samples. This cyst-based standard curve was compared with a standard curve based on vegetative cells, as vegetative cells may have twice the gene copy number of cysts. To remove DNA debris from the sediment, we developed a simple method involving dilution with distilled water and heating at 75°C. A total of 18 sediment samples were used to evaluate this method. Cyst abundance determined using the qPCR assay without DNA debris removal yielded results up to 51-fold greater than with direct counting. By contrast, a highly significant correlation was observed between cyst abundance determined by direct counting and the qPCR assay in conjunction with DNA debris removal (r2 = 0.72, slope = 1.07, p < 0.001). Therefore, this improved qPCR method should be a powerful tool for the accurate quantification of H. akashiwo cysts in sediment samples.
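    A cyst-based standard curve of this kind is a linear fit of Ct against log10 cyst number, inverted to enumerate unknowns. A hedged sketch with made-up example values (the slope and intercept below are illustrative, not from the paper):

```python
import numpy as np

def fit_standard_curve(cyst_counts, ct_values):
    """Linear qPCR standard curve: Ct = slope * log10(cysts) + intercept."""
    slope, intercept = np.polyfit(np.log10(cyst_counts), ct_values, 1)
    return slope, intercept

def cysts_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate cyst abundance from a measured Ct."""
    return 10 ** ((ct - intercept) / slope)
```

A slope near −3.32 corresponds to 100% amplification efficiency, a common sanity check on the fitted curve.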

  11. Sample-space-based feature extraction and class preserving projection for gene expression data.

    PubMed

    Wang, Wenjun

    2013-01-01

    To overcome the high computational complexity and serious matrix singularity of feature extraction with Principal Component Analysis (PCA) and Fisher's Linear Discriminant Analysis (LDA) in high-dimensional data, sample-space-based feature extraction is presented, which transfers the computation of feature extraction from gene space to sample space by representing each optimal transformation vector as a weighted sum of the samples. The technique is applied to the implementation of PCA, LDA, and Class Preserving Projection (CPP), a newly proposed method for discriminant feature extraction, and experimental results on gene expression data demonstrate the effectiveness of the method.
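    The key idea, representing each transformation vector as a weighted sum of the samples, is what makes PCA tractable when genes vastly outnumber samples: one eigen-decomposes the n×n Gram matrix instead of the p×p gene covariance. A minimal sketch for the PCA case (not the paper's code):

```python
import numpy as np

def sample_space_pca(X, k):
    """PCA computed in sample space: eigen-decompose the n x n Gram matrix
    of centered samples rather than the p x p gene covariance (n << p)."""
    Xc = X - X.mean(axis=0)
    G = Xc @ Xc.T                              # n x n Gram matrix
    vals, vecs = np.linalg.eigh(G)             # ascending eigenvalues
    order = np.argsort(vals)[::-1][:k]         # keep the top k
    vals, vecs = vals[order], vecs[:, order]
    # each gene-space principal axis is a weighted sum of the samples
    return (Xc.T @ vecs) / np.sqrt(vals)       # p x k, orthonormal columns
```

For GEP data with tens of samples and thousands of genes, this replaces a p×p eigenproblem with an n×n one at no loss of information.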

  12. Research of mine water source identification based on LIF technology

    NASA Astrophysics Data System (ADS)

    Zhou, Mengran; Yan, Pengcheng

    2016-09-01

    Because traditional chemical methods for identifying mine water sources take a long time, we propose a rapid source-identification system for mine water inrush based on laser-induced fluorescence (LIF) technology. The basic principle of LIF is analyzed in detail, the hardware composition of the LIF system is described, and the related modules are selected. Fluorescence spectra were obtained in the LIF system from water samples collected at a coal mine. Traditional source identification relies mainly on the concentrations of ions representative of the water, but these concentrations are difficult to derive from fluorescence spectra. This paper therefore proposes a simple and practical method for rapid identification of water from its fluorescence spectrum: the spatial distance between an unknown water sample and the standard samples is measured, and cluster analysis then assigns the unknown sample to a category. Source identification of unknown samples verified the reliability of the LIF system and addresses the current lack of real-time, online monitoring of water inrush in coal mines, which is of great significance for safe coal mine production.
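    The "spatial distance" rule described here is essentially nearest-standard classification of a spectrum. A minimal sketch (the sample names and spectra are illustrative, not from the paper):

```python
import numpy as np

def classify_spectrum(unknown, standards):
    """Assign an unknown water sample to the nearest standard spectrum
    by Euclidean distance between the spectra."""
    dists = {name: np.linalg.norm(np.asarray(unknown, float) - np.asarray(spec, float))
             for name, spec in standards.items()}
    return min(dists, key=dists.get)
```

In practice the spectra would first be baseline-corrected and normalized so that intensity scale does not dominate the distance.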

  13. Evaluation of methods for measuring particulate matter emissions from gas turbines.

    PubMed

    Petzold, Andreas; Marsh, Richard; Johnson, Mark; Miller, Michael; Sevcenco, Yura; Delhaye, David; Ibrahim, Amir; Williams, Paul; Bauer, Heidi; Crayford, Andrew; Bachalo, William D; Raper, David

    2011-04-15

    The project SAMPLE evaluated methods for measuring particle properties in the exhaust of aircraft engines, with a view to developing standardized operating procedures for particulate matter measurement in the aviation industry. Filter-based off-line mass methods included gravimetry and chemical analysis of carbonaceous species by combustion methods. Online mass methods were based on light absorption measurements or used size distributions obtained from an electrical mobility analyzer. Number concentrations were determined using different condensation particle counters (CPC). Total mass from filter-based methods balanced gravimetric mass within 8% error. Carbonaceous matter accounted for 70% of gravimetric mass, while the remaining 30% was attributed to hydrated sulfate and noncarbonaceous organic matter fractions. Online methods were closely correlated over the entire range of emission levels studied in the tests. Elemental carbon from combustion methods and black carbon from optical methods deviated by at most 5% with respect to mass for low to medium emission levels, whereas for high emission levels a systematic deviation between online and filter-based methods was found, which is attributed to sampling effects. CPC-based instruments proved highly reproducible for number concentration measurements, with a maximum inter-instrument standard deviation of 7.5%.

  14. Measures of precision for dissimilarity-based multivariate analysis of ecological communities.

    PubMed

    Anderson, Marti J; Santana-Garcon, Julia

    2015-01-01

    Ecological studies require key decisions regarding the appropriate size and number of sampling units. No methods currently exist to measure precision for multivariate assemblage data when dissimilarity-based analyses are intended to follow. Here, we propose a pseudo multivariate dissimilarity-based standard error (MultSE) as a useful quantity for assessing sample-size adequacy in studies of ecological communities. Based on sums of squared dissimilarities, MultSE measures variability in the position of the centroid in the space of a chosen dissimilarity measure under repeated sampling for a given sample size. We describe a novel double resampling method to quantify uncertainty in MultSE values with increasing sample size. For more complex designs, values of MultSE can be calculated from the pseudo residual mean square of a permanova model, with the double resampling done within appropriate cells in the design. R code functions for implementing these techniques, along with ecological examples, are provided. © 2014 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.
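    For a single group of n samples, one common formulation of MultSE is V = SS/(n−1), with SS equal to the sum of squared pairwise dissimilarities divided by n, and MultSE = √(V/n); with Euclidean distances this reduces to the ordinary univariate standard error. A hedged sketch under that formulation (not the authors' R code):

```python
import numpy as np
from scipy.spatial.distance import pdist

def mult_se(data, metric="braycurtis"):
    """Pseudo multivariate dissimilarity-based standard error (sketch):
    SS = sum of squared pairwise dissimilarities / n, V = SS/(n-1),
    MultSE = sqrt(V/n)."""
    data = np.asarray(data, float)
    n = len(data)
    ss = np.sum(pdist(data, metric=metric) ** 2) / n
    v = ss / (n - 1)
    return np.sqrt(v / n)
```

Plotting MultSE against increasing n (with the double-resampling scheme the paper describes) shows where additional sampling stops buying precision.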

  15. Microfluidic-Based Robotic Sampling System for Radioactive Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack D. Law; Julia L. Tripp; Tara E. Smith

    A novel microfluidic-based robotic sampling system has been developed for sampling and analysis of liquid solutions in nuclear processes. This system couples a microfluidic sample chip with a robotic system designed to allow remote, automated sampling of process solutions in-cell, and facilitates direct coupling of the microfluidic sample chip with analytical instrumentation. The system provides the capability for near-real-time analysis, reduces analytical waste, and minimizes the potential for personnel exposure associated with traditional sampling methods. A prototype sampling system was designed, built and tested. System testing demonstrated operability of the microfluidic-based sample system and identified system modifications to optimize performance.

  16. Comparison of ion chromatographic methods based on conductivity detection, post-column-reaction and on-line-coupling IC-ICP-MS for the determination of bromate.

    PubMed

    Schminke, G; Seubert, A

    2000-02-01

    An established method for determining the disinfection by-product bromate is ion chromatography (IC). This paper presents a comparison of three IC methods based on conductivity detection (IC-CD), a post-column reaction (IC-PCR-VIS), or on-line coupling with inductively coupled plasma mass spectrometry (IC-ICP-MS). The main characteristics of the methods, such as method detection limits (MDL), time of analysis and sample pretreatment, are compared, and their applicability for routine analysis is critically discussed. The most sensitive and rugged method is IC-ICP-MS, followed by IC-PCR-VIS. The photometric detection is subject to a minor interference in real-world samples, presumably caused by carbonate. The IC-CD method showed the lowest sensitivity and was the slowest of the methods compared; in addition, it requires sample pretreatment. The highest amount of information is delivered by IC-PCR-VIS, which allows the simultaneous determination of the seven standard anions and bromate.

  17. The Recent Developments in Sample Preparation for Mass Spectrometry-Based Metabolomics.

    PubMed

    Gong, Zhi-Gang; Hu, Jing; Wu, Xi; Xu, Yong-Jiang

    2017-07-04

    Metabolomics is a critical member in systems biology. Although great progress has been achieved in metabolomics, there are still some problems in sample preparation, data processing and data interpretation. In this review, we intend to explore the roles, challenges and trends in sample preparation for mass spectrometry- (MS-) based metabolomics. The newly emerged sample preparation methods were also critically examined, including laser microdissection, in vivo sampling, dried blood spot, microwave, ultrasound and enzyme-assisted extraction, as well as microextraction techniques. Finally, we provide some conclusions and perspectives for sample preparation in MS-based metabolomics.

  18. [Discrimination of types of polyacrylamide based on near infrared spectroscopy coupled with least square support vector machine].

    PubMed

    Zhang, Hong-Guang; Yang, Qin-Min; Lu, Jian-Gang

    2014-04-01

    In this paper, a novel discriminant methodology based on near infrared spectroscopic analysis and the least squares support vector machine was proposed for rapid and nondestructive discrimination of different types of polyacrylamide. The diffuse reflectance spectra of samples of non-ionic, anionic and cationic polyacrylamide were measured. Principal component analysis was then applied to reduce the dimension of the spectral data and extract the principal components. The first three principal components were used for cluster analysis of the three types of polyacrylamide, and the principal components were also used as inputs to the least squares support vector machine model. The model parameters and the number of principal components used as inputs were optimized through cross validation based on a grid search. Sixty samples of each type of polyacrylamide were collected, for a total of 180 samples. Of these, 135 samples (45 of each type) were randomly assigned to a training set to build the calibration model, and the remaining 45 samples were used as a test set to evaluate the performance of the developed model. In addition, 5 cationic and 5 anionic polyacrylamide samples adulterated with different proportions of non-ionic polyacrylamide were prepared to demonstrate the feasibility of the proposed method for discriminating adulterated samples. The prediction error threshold for each type of polyacrylamide was determined by an F-test based on the cross-validation prediction errors of the training samples of the corresponding type. The discrimination accuracy of the built model was 100% on the test set. The predictions for the 10 adulterated samples are also presented; all were accurately discriminated as adulterated. The overall results demonstrate that the proposed method can rapidly and nondestructively discriminate the different types of polyacrylamide as well as adulterated polyacrylamide samples, offering a new approach to discriminating polyacrylamide types.
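    The modeling pipeline described above (PCA for dimension reduction followed by a kernel support vector machine tuned by grid-search cross validation) can be sketched as follows. This is a minimal illustration using scikit-learn's standard SVC rather than a true least squares SVM, and the "spectra" below are synthetic stand-ins, not NIR data:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Synthetic stand-in for NIR diffuse-reflectance spectra: 3 classes
# (non-ionic, anionic, cationic), 60 samples each, 200 spectral channels.
X = np.concatenate([rng.normal(loc=c, scale=1.0, size=(60, 200)) for c in range(3)])
y = np.repeat([0, 1, 2], 60)

# 135 training samples (45 per class) and 45 test samples, as in the study
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, train_size=135, stratify=y, random_state=0)

# PCA feeding a kernel SVM; the number of components and the SVM
# hyperparameters are tuned jointly by grid-search cross validation.
pipe = Pipeline([("pca", PCA()), ("svm", SVC(kernel="rbf"))])
grid = GridSearchCV(pipe,
                    {"pca__n_components": [3, 5, 10],
                     "svm__C": [1.0, 10.0, 100.0]},
                    cv=5)
grid.fit(X_tr, y_tr)
acc = grid.score(X_te, y_te)
```

    As in the abstract, the grid search selects the number of retained components and the classifier settings from the training set alone; the held-out 45 samples only measure final accuracy.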

  19. Dielectrophoresis-Based Sample Handling in General-Purpose Programmable Diagnostic Instruments

    PubMed Central

    Gascoyne, Peter R. C.; Vykoukal, Jody V.

    2009-01-01

    As the molecular origins of disease are better understood, the need for affordable, rapid, and automated technologies that enable microscale molecular diagnostics has become apparent. Widespread use of microsystems that perform sample preparation and molecular analysis could ensure that the benefits of new biomedical discoveries are realized by a maximum number of people, even those in environments lacking any infrastructure. While progress has been made in developing miniaturized diagnostic systems, samples are generally processed off-device using labor-intensive and time-consuming traditional sample preparation methods. We present the concept of an integrated programmable general-purpose sample analysis processor (GSAP) architecture where raw samples are routed to separation and analysis functional blocks contained within a single device. Several dielectrophoresis-based methods that could serve as the foundation for building GSAP functional blocks are reviewed including methods for cell and particle sorting, cell focusing, cell ac impedance analysis, cell lysis, and the manipulation of molecules and reagent droplets. PMID:19684877

  20. A modified method for diffusive monitoring of 3-ethenylpyridine as a specific marker of environmental tobacco smoke

    NASA Astrophysics Data System (ADS)

    Kuusimäki, Leea; Peltonen, Kimmo; Vainiotalo, Sinikka

    A previously introduced method for monitoring environmental tobacco smoke (ETS) was further validated. The method is based on diffusive sampling of a vapour-phase marker, 3-ethenylpyridine (3-EP), with 3M passive monitors (type 3500). Experiments were done in a dynamic chamber to assess diffusive sampling in comparison with active sampling in charcoal tubes or XAD-4 tubes. The sampling rate for 3-EP collected on the diffusive sampler was 23.1±0.6 mL min⁻¹. The relative standard deviation for parallel samples (n=6) ranged from 4% to 14% among experiments (n=9). No marked reverse diffusion of 3-EP was detected, nor any significant effect of relative humidity at 20%, 50% or 80%. The diffusive sampling of 3-EP was validated in field measurements in 15 restaurants in comparison with 3-EP and nicotine measurements using active sampling. The 3-EP concentration in restaurants ranged from 0.01 to 9.8 μg m⁻³, and the uptake rate for 3-EP based on 92 parallel samples was 24.0±0.4 mL min⁻¹. A linear correlation (r=0.98) was observed between 3-EP and nicotine concentrations, the average ratio of 3-EP to nicotine being 1:8. Active sampling of 3-EP and nicotine in charcoal tubes provided more reliable results than sampling in XAD-4 tubes. All samples were analysed using gas chromatography-mass spectrometry after elution with a 15% solution of pyridine in toluene. For nicotine, the limit of quantification of the charcoal tube method was 4 ng per sample, corresponding to 0.04 μg m⁻³ for an air sample of 96 L. For 3-EP, the limit of quantification of the diffusive method was 0.5-1.0 ng per sample, corresponding to 0.04-0.09 μg m⁻³ for 8 h sampling. The diffusive method proved suitable for ETS monitoring, even at low levels of ETS.
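    The conversion from a diffusive sampler's collected mass to an air concentration follows directly from the uptake rate, which acts as an equivalent volumetric sampling rate. A minimal sketch reproducing the limit-of-quantification arithmetic quoted above:

```python
def diffusive_conc_ug_m3(mass_ng, uptake_mL_min, minutes):
    """Air concentration from a passive sampler: collected mass divided
    by the equivalent sampled air volume (uptake rate x exposure time)."""
    volume_m3 = uptake_mL_min * minutes * 1e-6  # mL -> m^3
    return (mass_ng * 1e-3) / volume_m3         # ng -> ug

# The quoted LOQ of 0.5-1.0 ng over 8 h at the chamber uptake rate
lo = diffusive_conc_ug_m3(0.5, 23.1, 8 * 60)
hi = diffusive_conc_ug_m3(1.0, 23.1, 8 * 60)
```

    This lands at roughly 0.045 and 0.090 μg m⁻³, matching the reported 0.04-0.09 μg m⁻³ range up to rounding of the uptake rate.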

  1. Concentration comparison of selected constituents between groundwater samples collected within the Missouri River alluvial aquifer using purge and pump and grab-sampling methods, near the city of Independence, Missouri, 2013

    USGS Publications Warehouse

    Krempa, Heather M.

    2015-10-29

    Relative percent differences between methods were greater than 10 percent for most analyzed trace elements. Barium, cobalt, manganese, and boron had concentrations that were significantly different between sampling methods. Barium, molybdenum, boron, and uranium concentrations indicate a close association between pump and grab samples based on bivariate plots and simple linear regressions. Grab-sample concentrations were generally larger than pump concentrations for these elements, possibly because a larger pore-size filter was used for the grab samples. Analysis of zinc blank samples suggests zinc contamination in filtered grab samples. Variation in analyzed trace elements between pump and grab samples could reduce the ability to monitor temporal changes and potential groundwater contamination threats. The degree of precision necessary for monitoring potential groundwater threats, together with the objectives of the application, needs to be considered when determining acceptable amounts of variation.
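    For reference, the relative percent difference used to compare paired pump and grab results is the absolute difference normalized by the pair mean. A short sketch with hypothetical concentrations:

```python
def relative_percent_difference(a, b):
    """RPD between a paired pump and grab result: absolute difference
    divided by the mean of the pair, expressed as a percentage."""
    return abs(a - b) / ((a + b) / 2.0) * 100.0

# Hypothetical barium pair (ug/L), grab slightly higher than pump
rpd = relative_percent_difference(100.0, 112.0)  # about 11.3%
```

    A pair differing by about 11 percent of its mean would thus already exceed the 10 percent threshold noted above.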

  2. Curcuminoid content of Curcuma longa L. and Curcuma xanthorrhiza rhizome based on drying method with NMR and HPLC-UVD

    NASA Astrophysics Data System (ADS)

    Hadi, S.; Artanti, A. N.; Rinanto, Y.; Wahyuni, D. S. C.

    2018-04-01

    Curcuminoid, consisting of curcumin, demethoxycurcumin and bisdemethoxycurcumin, is the major compound in Curcuma longa L. and Curcuma xanthorrhiza rhizome. It is known to have potent antioxidant, anticancer and antibacterial activities. These rhizomes need to be dried beforehand, which influences the concentration of the active compounds. The present work was conducted to assess the curcuminoid content of C. longa L. and C. xanthorrhiza as a function of drying method, using Nuclear Magnetic Resonance (NMR) and High Performance Liquid Chromatography (HPLC)-UVD. Samples were collected and dried by freeze-drying and by the oven method, the latter being the common drying method in herbal medicine preparation procedures. All samples were extracted using 96% ethanol and analyzed using NMR and HPLC-UVD. The curcuminoid content exhibited no significant difference in C. xanthorrhiza and only a weak significant difference in C. longa L. HPLC-UVD, as a reliable analytical method for quantification, was subsequently used to confirm the data obtained by NMR; it showed no significant difference in curcuminoid content in either species. This implies that the curcuminoids in both samples were stable under the heating process. These results are useful information for simplicia standardization in pharmaceutical products with regard to the preparation procedure.

  3. Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights

    PubMed Central

    2018-01-01

    The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook’s advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. PMID:29720366

  4. Efficient sample preparation method based on solvent-assisted dispersive solid-phase extraction for the trace detection of butachlor in urine and waste water samples.

    PubMed

    Aladaghlo, Zolfaghar; Fakhari, Alireza; Behbahani, Mohammad

    2016-10-01

    In this work, an efficient sample preparation method termed solvent-assisted dispersive solid-phase extraction was applied. This sample preparation method is based on the dispersion of the sorbent (benzophenone) into the aqueous sample to maximize the interaction surface. In this approach, dispersion of the sorbent at a very low milligram level was achieved by inserting a solution of the sorbent and disperser solvent into the aqueous sample; the resulting cloudy solution arose from the dispersion of the sorbent in the bulk aqueous sample. After preconcentration of the butachlor, the cloudy solution was centrifuged, and the butachlor in the sediment phase was dissolved in ethanol and determined by gas chromatography with flame ionization detection. Under the optimized conditions (solution pH = 7.0; sorbent: benzophenone, 2%; disperser solvent: ethanol, 500 μL; centrifugation at 4000 rpm for 3 min), the method detection limits for butachlor were 2, 3 and 3 μg/L for distilled water, waste water, and urine samples, respectively. Furthermore, the preconcentration factors were 198.8, 175.0, and 174.2 in distilled water, waste water, and urine samples, respectively. Solvent-assisted dispersive solid-phase extraction was successfully used for the trace monitoring of butachlor in urine and waste water samples. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Use of Language Sample Analysis by School-Based SLPs: Results of a Nationwide Survey

    ERIC Educational Resources Information Center

    Pavelko, Stacey L.; Owens, Robert E., Jr.; Ireland, Marie; Hahs-Vaughn, Debbie L.

    2016-01-01

    Purpose: This article examines use of language sample analysis (LSA) by school-based speech-language pathologists (SLPs), including characteristics of language samples, methods of transcription and analysis, barriers to LSA use, and factors affecting LSA use, such as American Speech-Language-Hearing Association certification, number of years'…

  6. A distance limited method for sampling downed coarse woody debris

    Treesearch

    Jeffrey H. Gove; Mark J. Ducey; Harry T. Valentine; Michael S. Williams

    2012-01-01

    A new sampling method for down coarse woody debris is proposed based on limiting the perpendicular distance from individual pieces to a randomly chosen sample point. Two approaches are presented that allow different protocols to be used to determine field measurements; estimators for each protocol are also developed. Both protocols are compared via simulation against...

  7. Comparison of green sample preparation techniques in the analysis of pyrethrins and pyrethroids in baby food by liquid chromatography-tandem mass spectrometry.

    PubMed

    Petrarca, Mateus Henrique; Ccanccapa-Cartagena, Alexander; Masiá, Ana; Godoy, Helena Teixeira; Picó, Yolanda

    2017-05-12

    A new selective and sensitive liquid chromatography-triple quadrupole mass spectrometry method was developed for the simultaneous analysis of natural pyrethrin and synthetic pyrethroid residues in baby food. In this study, two sample preparation methods based on ultrasound-assisted dispersive liquid-liquid microextraction (UA-DLLME) and salting-out assisted liquid-liquid extraction (SALLE) were optimized and then compared on performance criteria. Appropriate linearity in solvent and matrix-based calibrations, suitable recoveries (75-120%) and precision (RSD values ≤16%) were achieved for the selected analytes with either sample preparation procedure. Both methods provided the analytical selectivity required for monitoring the insecticides in fruit-, cereal- and milk-based baby foods. SALLE, notable for its cost-effectiveness and simple, fast execution, provided a lower enrichment factor; consequently, higher limits of quantification (LOQs) were obtained, some of them too high to meet the strict legislation on baby food. Nonetheless, the combination of ultrasound and DLLME resulted in a high-throughput and environmentally friendly method whose LOQs were lower than the default maximum residue limit (MRL) of 10 μg kg⁻¹ set by the European Community for baby foods. In the commercial baby foods analyzed, cyhalothrin and etofenprox were detected in different samples, demonstrating the suitability of the proposed method for baby food control. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. The Precision Efficacy Analysis for Regression Sample Size Method.

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Barcikowski, Robert S.

    The general purpose of this study was to examine the efficiency of the Precision Efficacy Analysis for Regression (PEAR) method for choosing appropriate sample sizes in regression studies used for precision. The PEAR method, which is based on the algebraic manipulation of an accepted cross-validity formula, essentially uses an effect size to…

  9. Development and validation of a matrix solid-phase dispersion method to determine acrylamide in coffee and coffee substitutes.

    PubMed

    Soares, Cristina M Dias; Alves, Rita C; Casal, Susana; Oliveira, M Beatriz P P; Fernandes, José Oliveira

    2010-04-01

    The present study describes the development and validation of a new method based on a matrix solid-phase dispersion (MSPD) sample preparation procedure followed by GC-MS for the determination of acrylamide levels in coffee (ground coffee and brewed coffee) and coffee substitute samples. Samples were dispersed in C(18) sorbent and the mixture was further packed into a preconditioned custom-made ISOLUTE bilayered SPE column (C(18)/Multimode; 1 g + 1 g). Acrylamide was subsequently eluted with water, then derivatized with bromine and quantified by GC-MS in SIM mode. The MSPD/GC-MS method presented a LOD of 5 μg/kg and a LOQ of 10 μg/kg. Intra- and interday precisions ranged from 2% to 4% and 4% to 10%, respectively. To evaluate the performance of the method, 11 samples of ground and brewed coffee and coffee substitutes were simultaneously analyzed by the developed method and by a previously validated method based on a liquid-extraction (LE) procedure; the results showed a high correlation between the two.

  10. Detection of Adulterated Vegetable Oils Containing Waste Cooking Oils Based on the Contents and Ratios of Cholesterol, β-Sitosterol, and Campesterol by Gas Chromatography/Mass Spectrometry.

    PubMed

    Zhao, Haixiang; Wang, Yongli; Xu, Xiuli; Ren, Heling; Li, Li; Xiang, Li; Zhong, Weike

    2015-01-01

    A simple and accurate authentication method for the detection of adulterated vegetable oils that contain waste cooking oil (WCO) was developed. This method is based on the determination of cholesterol, β-sitosterol, and campesterol in vegetable oils and WCO by GC/MS without any derivatization. A total of 148 samples involving 12 types of vegetable oil and WCO were analyzed. According to the results, the contents and ratios of cholesterol, β-sitosterol, and campesterol were found to be criteria for detecting vegetable oils adulterated with WCO. This method could accurately detect adulterated vegetable oils containing 5% refined WCO. The developed method has been successfully applied to multilaboratory analysis of 81 oil samples. Seventy-five samples were analyzed correctly, and only six adulterated samples could not be detected. This method could not yet be used for detection of vegetable oils adulterated with WCO that are used for frying non-animal foods. It provides a quick method for detecting adulterated edible vegetable oils containing WCO.
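    The decision logic behind such an authentication criterion can be sketched as a simple threshold rule. The thresholds below are purely illustrative placeholders, not the calibrated criteria of the study:

```python
def flags_wco_adulteration(cholesterol, beta_sitosterol,
                           chol_limit=0.05, ratio_limit=0.1):
    """Hypothetical rule of thumb: genuine vegetable oils are nearly
    cholesterol-free, so an elevated cholesterol content or an elevated
    cholesterol/beta-sitosterol ratio suggests waste cooking oil
    (carrying animal-derived sterols) in the sample."""
    ratio = cholesterol / beta_sitosterol if beta_sitosterol else float("inf")
    return cholesterol > chol_limit or ratio > ratio_limit

suspect = flags_wco_adulteration(0.30, 1.2)  # high cholesterol -> flagged
clean = flags_wco_adulteration(0.01, 1.2)    # trace level -> not flagged
```

    The actual method combines both the absolute sterol contents and their ratios, as the abstract notes; this sketch only shows the shape of such a rule.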

  11. Pyrosequencing®-Based Identification of Low-Frequency Mutations Enriched Through Enhanced-ice-COLD-PCR.

    PubMed

    How-Kit, Alexandre; Tost, Jörg

    2015-01-01

    A number of molecular diagnostic assays have been developed in recent years for mutation detection. Although these methods have become increasingly sensitive, most of them are incompatible with a sequencing-based readout and require prior knowledge of the mutation present in the sample. Consequently, co-amplification at lower denaturation temperature (COLD) PCR-based methods have been developed, combining high analytical sensitivity due to mutation enrichment in the sample with the identification of known or unknown mutations by downstream sequencing experiments. Among these methods, the recently developed Enhanced-ice-COLD-PCR appears the most powerful, as it outperforms the other COLD-PCR-based methods in mutation enrichment and in the simplicity of the experimental setup. Indeed, E-ice-COLD-PCR is very versatile: it can be used on all types of PCR platforms and is applicable to different types of samples, including fresh-frozen, FFPE, and plasma samples. The technique relies on the incorporation of an LNA-containing blocker probe in the PCR reaction followed by selective heteroduplex denaturation, enabling amplification of the mutant allele while amplification of the wild-type allele is prevented. Combined with Pyrosequencing®, a very quantitative high-resolution sequencing technology, E-ice-COLD-PCR can detect and identify mutations with a limit of detection down to 0.01%.

  12. GEE-based SNP set association test for continuous and discrete traits in family-based association studies.

    PubMed

    Wang, Xuefeng; Lee, Seunggeun; Zhu, Xiaofeng; Redline, Susan; Lin, Xihong

    2013-12-01

    Family-based genetic association studies of related individuals provide opportunities to detect genetic variants that complement studies of unrelated individuals. Most statistical methods for family association studies of common variants are single-marker based, testing one SNP at a time. In this paper, we consider testing the effect of an SNP set, e.g., the SNPs in a gene, in family studies, for both continuous and discrete traits. Specifically, we propose a generalized estimating equation (GEE) based kernel association test, a variance-component testing method, to test for association between a phenotype and multiple variants in an SNP set jointly using family samples. The proposed approach allows for both continuous and discrete traits, and the correlation among family members is taken into account through an empirical covariance estimator. We derive the theoretical distribution of the proposed statistic under the null and develop analytical methods to calculate the P-values. We also propose an efficient resampling method for correcting for small-sample bias in family studies. The proposed method easily incorporates covariates and SNP-SNP interactions. Simulation studies show that the proposed method properly controls type I error rates under both random and ascertained sampling schemes in family studies. We demonstrate through simulation studies that our approach has superior performance for association mapping compared to the single-marker minimum-P-value GEE test of an SNP-set effect over a range of scenarios. We illustrate the application of the proposed method using data from the Cleveland Family GWAS Study. © 2013 WILEY PERIODICALS, INC.
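    The core of such a variance-component kernel test is a quadratic-form score statistic in the phenotype residuals. A much-simplified sketch with a linear kernel, omitting covariates and the family correlation structure that the paper's GEE formulation and empirical covariance estimator handle:

```python
import numpy as np

def kernel_score_statistic(y, G):
    """Quadratic-form score statistic for a joint SNP-set test with a
    linear kernel K = G G': Q = r' K r, where r is the centered
    phenotype. Large Q indicates that the SNP set jointly explains
    phenotype variation."""
    r = y - y.mean()
    return float(r @ (G @ G.T) @ r)

rng = np.random.default_rng(0)
G = rng.binomial(2, 0.3, size=(200, 10)).astype(float)  # 200 subjects, 10 SNPs
y = G[:, 0] + rng.normal(size=200)                      # trait driven by SNP 1
q = kernel_score_statistic(y, G)
```

    In the full method the null distribution of Q is a mixture of chi-squares evaluated analytically; this sketch only shows the statistic itself.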

  13. Interpolating seismic data via the POCS method based on shearlet transform

    NASA Astrophysics Data System (ADS)

    Jicheng, Liu; Yongxin, Chou; Jianjiang, Zhu

    2018-06-01

    A method based on shearlet transform and the projection onto convex sets with L0-norm constraint is proposed to interpolate irregularly sampled 2D and 3D seismic data. The 2D directional filter of shearlet transform is constructed by modulating a low-pass diamond filter pair to minimize the effect of additional edges introduced by the missing traces. In order to abate the spatial aliasing and control the maximal gap between missing traces for a 3D data cube, a 2D separable jittered sampling strategy is discussed. Finally, numerical experiments on 2D and 3D synthetic and real data with different under-sampling rates prove the validity of the proposed method.
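    The iteration itself, a sparsity projection (hard-threshold small transform coefficients under a decreasing schedule) alternated with a data-consistency projection (re-insert the observed traces), can be sketched as follows. A plain 2-D FFT stands in for the shearlet transform, and the data are synthetic:

```python
import numpy as np

def pocs_interpolate(data, mask, n_iter=50):
    """POCS trace-interpolation sketch: at each iteration, keep only the
    large transform coefficients (linearly decreasing threshold), invert,
    then restore the observed traces. A 2-D FFT is used here in place of
    the shearlet transform of the paper."""
    x = data * mask
    for k in range(n_iter):
        coeff = np.fft.fft2(x)
        thresh = np.abs(coeff).max() * (1.0 - (k + 1) / n_iter)
        coeff[np.abs(coeff) < thresh] = 0.0
        x = np.real(np.fft.ifft2(coeff))
        x = data * mask + x * (1.0 - mask)  # keep the observed samples
    return x

# Synthetic 64x64 section with roughly 30% of traces (columns) missing
t = np.arange(64) / 64.0
field = np.sin(2 * np.pi * 3 * t)[:, None] * np.cos(2 * np.pi * 2 * t)[None, :]
rng = np.random.default_rng(1)
mask = np.broadcast_to((rng.random(64) > 0.3).astype(float), (64, 64)).copy()
recon = pocs_interpolate(field, mask)
err = np.linalg.norm(recon - field) / np.linalg.norm(field)
```

    Because the synthetic field is very sparse in the Fourier domain, the missing traces are recovered with a small relative error; the paper's shearlet dictionary plays the same role for real seismic textures.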

  14. Microwave absorption properties of gold nanoparticle doped polymers

    NASA Astrophysics Data System (ADS)

    Jiang, C.; Ouattara, L.; Ingrosso, C.; Curri, M. L.; Krozer, V.; Boisen, A.; Jakobsen, M. H.; Johansen, T. K.

    2011-03-01

    This paper presents a method for characterizing the microwave absorption properties of gold nanoparticle doped polymers. The method is based on on-wafer measurements at frequencies from 0.5 GHz to 20 GHz. The on-wafer measurement method makes it possible to characterize the electromagnetic (EM) properties of small-volume samples. The epoxy-based SU8 polymer and SU8 doped with gold nanoparticles were chosen as the samples under test. Two types of microwave test devices were designed to excite the samples through electrical coupling and magnetic coupling, respectively. Measurement results demonstrate that the nanocomposites absorb a certain amount of microwave energy due to the gold nanoparticles; a higher nanoparticle concentration results in a more significant absorption effect.

  15. Equilibrium Molecular Thermodynamics from Kirkwood Sampling

    PubMed Central

    2015-01-01

    We present two methods for barrierless equilibrium sampling of molecular systems based on the recently proposed Kirkwood method (J. Chem. Phys. 2009, 130, 134102). Kirkwood sampling employs low-order correlations among internal coordinates of a molecule for random (or non-Markovian) sampling of the high dimensional conformational space. This is a geometrical sampling method independent of the potential energy surface. The first method is a variant of biased Monte Carlo, where Kirkwood sampling is used for generating trial Monte Carlo moves. Using this method, equilibrium distributions corresponding to different temperatures and potential energy functions can be generated from a given set of low-order correlations. Since Kirkwood samples are generated independently, this method is ideally suited for massively parallel distributed computing. The second approach is a variant of reservoir replica exchange, where Kirkwood sampling is used to construct a reservoir of conformations, which exchanges conformations with the replicas performing equilibrium sampling corresponding to different thermodynamic states. Coupling with the Kirkwood reservoir enhances sampling by facilitating global jumps in the conformational space. The efficiency of both methods depends on the overlap of the Kirkwood distribution with the target equilibrium distribution. We present proof-of-concept results for a model nine-atom linear molecule and alanine dipeptide. PMID:25915525
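    The first method amounts to an independence (biased) Metropolis sampler: trial states drawn from a fixed geometric proposal, the role played by Kirkwood sampling, are accepted with a ratio that corrects the chain toward the target equilibrium distribution. A one-dimensional toy sketch, where a Gaussian target and proposal stand in for the molecular distributions:

```python
import math
import random

def independence_metropolis(log_target, log_proposal, draw, n_steps, x0):
    """Biased Monte Carlo in the spirit of the first method: trials are
    drawn independently from a fixed proposal distribution and accepted
    with the independence-sampler Metropolis ratio, which corrects the
    chain toward the target equilibrium distribution."""
    x = x0
    chain = []
    for _ in range(n_steps):
        y = draw()
        log_a = (log_target(y) - log_target(x)) + (log_proposal(x) - log_proposal(y))
        if math.log(random.random()) < log_a:
            x = y
        chain.append(x)
    return chain

# 1-D toy: target N(0, 0.5^2) sampled through a broader N(0, 1) proposal
random.seed(0)
log_t = lambda v: -v * v / (2 * 0.25)
log_p = lambda v: -v * v / 2.0
chain = independence_metropolis(log_t, log_p, lambda: random.gauss(0.0, 1.0),
                                20000, 0.0)
var = sum(v * v for v in chain) / len(chain)  # should approach 0.25
```

    As the abstract notes, efficiency hinges on the overlap of the proposal with the target; a narrower proposal than the target would make acceptance rare.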

  16. Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.

    PubMed

    Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G

    2018-06-01

    Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin interaction to laboratory glassware surfaces after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing a LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% of ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators with good correlation of the results. Finally, we recommend a LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract Structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).

  17. Determination of Total Selenium in Infant Formulas: Comparison of the Performance of FIA and MCFA Flow Systems

    PubMed Central

    Pistón, Mariela; Knochen, Moisés

    2012-01-01

    Two flow methods, based, respectively, on flow-injection analysis (FIA) and on multicommutated flow analysis (MCFA), were compared with regard to their use for the determination of total selenium in infant formulas by hydride-generation atomic absorption spectrometry. The method based on multicommutation provided lower detection and quantification limits (0.08 and 0.27 μg L⁻¹, compared to 0.59 and 1.95 μg L⁻¹, respectively), a higher sampling frequency (160 versus 70 samples per hour), and reduced reagent consumption. Linearity, precision, and accuracy were similar for the two methods. It was concluded that, while both methods proved appropriate for the purpose, the MCFA-based method exhibited the better performance. PMID:22505923
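    Detection and quantification limits of this kind typically follow the 3σ/10σ convention on blank noise. A minimal sketch; the blank standard deviation and calibration slope below are illustrative values chosen to land near the reported MCFA limits, not figures from the paper:

```python
def lod_loq(sd_blank, slope):
    """IUPAC-style limits from blank noise and calibration slope:
    LOD = 3*sigma/slope, LOQ = 10*sigma/slope."""
    return 3.0 * sd_blank / slope, 10.0 * sd_blank / slope

# Illustrative numbers: a blank noise of 0.027 ug/L and unit slope
lod, loq = lod_loq(sd_blank=0.027, slope=1.0)  # about 0.081 and 0.27 ug/L
```

    The fixed 3:10 ratio between LOD and LOQ is why the paired limits quoted above (0.08/0.27 and 0.59/1.95) each differ by a factor close to 3.3.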

  18. A chemodynamic approach for estimating losses of target organic chemicals from water during sample holding time

    USGS Publications Warehouse

    Capel, P.D.; Larson, S.J.

    1995-01-01

    Minimizing the loss of target organic chemicals from environmental water samples between the time of sample collection and isolation is important to the integrity of an investigation. During this sample holding time, there is a potential for analyte loss through volatilization from the water to the headspace, sorption to the walls and cap of the sample bottle, and transformation through biotic and/or abiotic reactions. This paper presents a chemodynamic-based, generalized approach to estimating the most probable loss processes for individual target organic chemicals. The basic premise is that the investigator must know which loss process(es) are important for a particular analyte, based on its chemodynamic properties, when choosing the appropriate method(s) to prevent loss.
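    For the volatilization pathway, the worst-case loss can be bounded by an equilibrium headspace-partitioning estimate based on the analyte's dimensionless Henry's law constant. A sketch with hypothetical numbers (the constant and vial volumes are illustrative, not values from the paper):

```python
def headspace_loss_fraction(h_dimensionless, v_water_mL, v_headspace_mL):
    """Equilibrium fraction of analyte partitioned from the water into
    the bottle headspace, given a dimensionless Henry's law constant
    H' = C_air / C_water and the two phase volumes."""
    air = h_dimensionless * v_headspace_mL
    return air / (air + v_water_mL)

# Hypothetical volatile analyte (H' = 0.2) in a 40 mL vial with 4 mL headspace
loss = headspace_loss_fraction(0.2, 36.0, 4.0)  # about 2% at equilibrium
```

    The same reasoning explains the standard practice of filling sample bottles with zero headspace for volatile analytes: with the headspace volume at zero, this loss term vanishes.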

  19. Convenient mounting method for electrical measurements of thin samples

    NASA Technical Reports Server (NTRS)

    Matus, L. G.; Summers, R. L.

    1986-01-01

    A method for mounting thin samples for electrical measurements is described. The technique is based on a vacuum chuck concept in which the vacuum chuck simultaneously holds the sample and establishes electrical contact. The mounting plate is composed of a glass-ceramic insulating material, and the surfaces of the plate and vacuum chuck are polished. The operation of the vacuum chuck is examined. The contacts on the sample and mounting plate, which are sputter-deposited through metal masks, are analyzed. The mounting method was utilized for van der Pauw measurements.
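    Sheet resistance from van der Pauw measurements is obtained by solving the implicit van der Pauw equation for the two four-point resistances. A small numerical sketch:

```python
import math

def vdp_sheet_resistance(r_a, r_b, tol=1e-12):
    """Solve the van der Pauw equation
        exp(-pi * R_A / R_s) + exp(-pi * R_B / R_s) = 1
    for the sheet resistance R_s. The left side increases with R_s, so a
    bracketing search (geometric bisection, to cover many decades of
    resistance) converges reliably."""
    f = lambda rs: (math.exp(-math.pi * r_a / rs)
                    + math.exp(-math.pi * r_b / rs) - 1.0)
    lo, hi = 1e-9, 1e12
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
        if hi - lo <= tol * hi:
            break
    return 0.5 * (lo + hi)

# Symmetric contacts: R_A = R_B = R gives R_s = pi * R / ln 2 exactly
rs = vdp_sheet_resistance(10.0, 10.0)
```

    The symmetric case provides a built-in check: for R_A = R_B = 10 Ω the closed-form answer is 10π/ln 2 ≈ 45.32 Ω per square.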

  20. Uncertainty in the profitability of fertilizer management based on various sampling designs.

    NASA Astrophysics Data System (ADS)

    Muhammed, Shibu; Ben, Marchant; Webster, Richard; Milne, Alice; Dailey, Gordon; Whitmore, Andrew

    2016-04-01

    Many farmers sample their soil to measure the concentrations of plant nutrients, including phosphorus (P), so as to decide how much fertilizer to apply. Now that fertilizer can be applied at variable rates, farmers want to know whether maps of nutrient concentration made from grid samples or from field subdivisions (zones within their fields) are merited: do such maps lead to greater profit than would a single measurement on a bulked sample for each field when all costs are taken into account? We have examined the merits of grid-based and zone-based sampling strategies over single field-based averages using continuous spatial data on wheat yields at harvest in six fields in southern England and simulated concentrations of P in the soil. Features of the spatial variation in the yields provide predictions about which sampling scheme is likely to be most cost-effective, but there is uncertainty associated with these predictions that must be communicated to farmers. Where variograms of the yield have large variances and long effective ranges, grid sampling and mapping nutrients are likely to be cost-effective. Where effective ranges are short, sampling must be dense to reveal the spatial variation and may be expensive. In these circumstances variable-rate application of fertilizer is likely to be impracticable and almost certainly not cost-effective. We have explored several methods for communicating these results and found that the most effective was the use of probability maps that show the likelihood of grid-based and zone-based sampling being more profitable than a field-based estimate.
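    The variogram features that drive the recommendation (sill variance and effective range) come from an empirical semivariogram of the yield data. A brute-force one-dimensional sketch of the classical estimator, checked against a case with a known answer:

```python
import numpy as np

def empirical_semivariogram(x, z, lags, tol=0.5):
    """Matheron's classical estimator on a 1-D transect:
    gamma(h) = average of 0.5 * (z_i - z_j)^2 over point pairs whose
    separation |x_i - x_j| falls within tol of the lag h."""
    out = []
    for h in lags:
        acc, cnt = 0.0, 0
        for i in range(len(x)):
            for j in range(i + 1, len(x)):
                if abs(abs(x[i] - x[j]) - h) <= tol:
                    acc += 0.5 * (z[i] - z[j]) ** 2
                    cnt += 1
        out.append(acc / cnt if cnt else np.nan)
    return np.array(out)

# Sanity check on a linear trend z = x, where gamma(h) = 0.5 * h^2 exactly
x = np.arange(50.0)
gam = empirical_semivariogram(x, x, lags=[1, 2, 4], tol=0.1)
```

    In practice the estimator would be applied to the yield maps and a model fitted to read off the variance and effective range used in the cost-effectiveness predictions above.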

  1. An efficient and reliable DNA-based sex identification method for archaeological Pacific salmonid (Oncorhynchus spp.) remains.

    PubMed

    Royle, Thomas C A; Sakhrani, Dionne; Speller, Camilla F; Butler, Virginia L; Devlin, Robert H; Cannon, Aubrey; Yang, Dongya Y

    2018-01-01

    Pacific salmonid (Oncorhynchus spp.) remains are routinely recovered from archaeological sites in northwestern North America but typically lack sexually dimorphic features, precluding the sex identification of these remains through morphological approaches. Consequently, little is known about the deep history of the sex-selective salmonid fishing strategies practiced by some of the region's Indigenous peoples. Here, we present a DNA-based method for the sex identification of archaeological Pacific salmonid remains that integrates two PCR assays, each of which co-amplifies a fragment of the sexually dimorphic on the Y-chromosome (sdY) gene and an internal positive control (Clock1a or D-loop). The first assay co-amplifies a 95 bp fragment of sdY and a 108 bp fragment of the autosomal Clock1a gene, whereas the second assay co-amplifies the same sdY fragment and a 249 bp fragment of the mitochondrial D-loop region. The method's reliability, sensitivity, and efficiency were evaluated by applying it to 72 modern Pacific salmonids from five species and 75 archaeological remains from six Pacific salmonids. The sex identities assigned to each of the modern samples were concordant with their known phenotypic sex, highlighting the method's reliability. Applications of the method to dilutions of modern DNA samples indicate it can correctly identify the sex of samples with as little as ~39 pg of total genomic DNA. The successful sex identification of 70 of the 75 (93%) archaeological samples further demonstrates the method's sensitivity. The method's reliance on two co-amplifications that preferentially amplify sdY helps validate the sex identities assigned to samples and reduces erroneous identifications caused by allelic dropout and contamination. Furthermore, by sequencing the D-loop fragment used as a positive control, species-level and sex identifications can be assigned to samples simultaneously. Overall, our results indicate that the DNA-based method reported in this study is a sensitive and reliable sex identification method for ancient salmonid remains.

  2. Preliminary Study on Appearance-Based Detection of Anatomical Point Landmarks in Body Trunk CT Images

    NASA Astrophysics Data System (ADS)

    Nemoto, Mitsutaka; Nomura, Yukihiro; Hanaoka, Shohei; Masutani, Yoshitaka; Yoshikawa, Takeharu; Hayashi, Naoto; Yoshioka, Naoki; Ohtomo, Kuni

    Anatomical point landmarks, as the most primitive form of anatomical knowledge, are useful for medical image understanding. In this study, we propose a detection method for anatomical point landmarks based on appearance models, which capture the gray-level statistical variations at point landmarks and in their surrounding areas. The models are built from the results of principal component analysis (PCA) of sample data sets. In addition, we employed a generative learning method that transforms the ROIs of the sample data. We evaluated our method on 24 data sets of body trunk CT images and obtained an average sensitivity of 95.8 ± 7.3% across 28 landmarks.
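
    A minimal version of such an appearance model can be sketched with NumPy: PCA is fit on vectorized gray-level patches around a landmark, and a candidate location is scored by its reconstruction error in the learned subspace. Patch extraction and the detection scan are omitted, and the patches here are synthetic; this is an illustration of the modeling idea, not the paper's pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic training patches: vectorized gray-level ROIs around one landmark,
# generated to lie in a low-dimensional appearance subspace.
n_train, dim, n_components = 50, 64, 5
basis_true = rng.normal(size=(n_components, dim))
train = rng.normal(size=(n_train, n_components)) @ basis_true

# Build the appearance model: mean patch plus top principal components.
mean = train.mean(axis=0)
_, _, vt = np.linalg.svd(train - mean, full_matrices=False)
components = vt[:n_components]               # orthonormal rows (k, dim)

def appearance_error(patch):
    """Reconstruction error of a patch under the PCA appearance model.

    Low error means the patch resembles the training appearance, so it is
    a plausible landmark location; high error means it does not.
    """
    centered = patch - mean
    coeffs = components @ centered           # project onto the subspace
    recon = coeffs @ components              # back-project
    return float(np.linalg.norm(centered - recon))

in_model = rng.normal(size=n_components) @ basis_true  # landmark-like patch
off_model = rng.normal(size=dim) * 10.0                # random background
print(appearance_error(in_model) < appearance_error(off_model))
```

    A detector would slide this scoring function over candidate positions and keep the minimum-error location.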

  3. Low Field Squid MRI Devices, Components and Methods

    NASA Technical Reports Server (NTRS)

    Hahn, Inseob (Inventor); Penanen, Konstantin I. (Inventor); Eom, Byeong H. (Inventor)

    2013-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field SQUID-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed are a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  4. Low Field Squid MRI Devices, Components and Methods

    NASA Technical Reports Server (NTRS)

    Penanen, Konstantin I. (Inventor); Eom, Byeong H. (Inventor); Hahn, Inseob (Inventor)

    2014-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field SQUID-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed are a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  5. Low field SQUID MRI devices, components and methods

    NASA Technical Reports Server (NTRS)

    Penanen, Konstantin I. (Inventor); Eom, Byeong H. (Inventor); Hahn, Inseob (Inventor)

    2011-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field SQUID-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed are a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  6. Low field SQUID MRI devices, components and methods

    NASA Technical Reports Server (NTRS)

    Penanen, Konstantin I. (Inventor); Eom, Byeong H (Inventor); Hahn, Inseob (Inventor)

    2010-01-01

    Low field SQUID MRI devices, components and methods are disclosed. They include a portable low field SQUID-based MRI instrument and a portable low field SQUID-based MRI system to be operated under a bed where a subject is adapted to be located. Also disclosed are a method of distributing wires on an image encoding coil system adapted to be used with an NMR or MRI device for analyzing a sample or subject and a second order superconducting gradiometer adapted to be used with a low field SQUID-based MRI device as a sensing component for an MRI signal related to a subject or sample.

  7. Near-Native Protein Loop Sampling Using Nonparametric Density Estimation Accommodating Sparsity

    PubMed Central

    Day, Ryan; Lennox, Kristin P.; Sukhanov, Paul; Dahl, David B.; Vannucci, Marina; Tsai, Jerry

    2011-01-01

    Unlike the core structural elements of a protein, such as regular secondary structure, loop regions are difficult for template-based modeling (TBM) because of their variability in sequence and structure and the sparse sampling available from a limited number of homologous templates. We present a novel, knowledge-based method for loop sampling that leverages homologous torsion-angle information to estimate a continuous joint backbone dihedral angle density at each loop position. The φ,ψ distributions are estimated via a Dirichlet process mixture of hidden Markov models (DPM-HMM). Models are quickly generated from samples drawn from these distributions and enriched using an end-to-end distance filter. The performance of the DPM-HMM method was evaluated against a diverse test set in a leave-one-out approach. Candidates as close as 0.45 Å RMSD, with a worst case of 3.66 Å, were produced. For canonical loops such as the immunoglobulin complementarity-determining regions (mean RMSD <2.0 Å), the DPM-HMM method performs as well as or better than the best templates, demonstrating that our automated method recaptures these canonical loops without any IgG-specific terms or manual intervention. In cases with poor or few good templates (mean RMSD >7.0 Å), the sampling method produces populations of loop structures reaching about 3.66 Å for loops of up to 17 residues. In a direct sampling comparison with the Loopy algorithm, our method demonstrates the ability to sample structures nearer the native conformation for both the canonical CDRH1 and non-canonical CDRH3 loops. Lastly, under the realistic test conditions of the CASP9 experiment, the successful application of DPM-HMM to 90 loops from 45 TBM targets shows the general applicability of our sampling method to the loop modeling problem. These results demonstrate that our DPM-HMM method provides an advantage by consistently sampling near-native loop structures.
The software used in this analysis is available for download at http://www.stat.tamu.edu/~dahl/software/cortorgles/. PMID:22028638

  8. Operational Evaluation of the Rapid Viability PCR Method for ...

    EPA Pesticide Factsheets

    Journal Article This research work has a significant impact on the use of the RV-PCR method to analyze post-decontamination environmental samples during an anthrax event. The method has shown 98% agreement with the traditional culture-based method. With such success, this method, upon validation, will significantly increase laboratory throughput and capacity for analyzing a large number of anthrax event samples in a relatively short time.

  9. A novel one-class SVM based negative data sampling method for reconstructing proteome-wide HTLV-human protein interaction networks.

    PubMed

    Mei, Suyu; Zhu, Hao

    2015-01-26

    Protein-protein interaction (PPI) prediction is generally treated as a binary classification problem in which negative data sampling remains an open problem. The commonly used random sampling is prone to yield less representative negative data with considerable false negatives. Meanwhile, rational constraints are seldom exerted on model selection to reduce the risk of false positive predictions in most existing computational methods. In this work, we propose a novel negative data sampling method based on a one-class SVM (support vector machine) to predict proteome-wide protein interactions between the HTLV retrovirus and Homo sapiens: the one-class SVM is used to choose reliable and representative negative data, and a two-class SVM is used to yield proteome-wide outcomes as predictive feedback for rational model selection. Computational results suggest that a one-class SVM is better suited to negative data sampling than a two-class PPI predictor, and that predictive-feedback-constrained model selection helps to yield a rational predictive model that reduces the risk of false positive predictions. Some predictions have been validated by the recent literature. Lastly, gene ontology-based clustering of the predicted PPI networks is conducted to provide valuable cues for the pathogenesis of the HTLV retrovirus.
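
    The sampling strategy can be sketched in a few lines: score unlabeled pairs against a model of the positive class and keep the least positive-like ones as reliable negatives. For a self-contained sketch we substitute a simple spherical-Gaussian distance score for the one-class SVM decision function; the data and feature dimensions are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Feature vectors for known interacting (positive) protein pairs, and a
# pool of unlabeled pairs from which negatives must be sampled.
positives = rng.normal(loc=0.0, scale=1.0, size=(200, 8))
unlabeled = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(100, 8)),   # likely hidden positives
    rng.normal(loc=4.0, scale=1.0, size=(100, 8)),   # far from the positives
])

def one_class_scores(train, candidates):
    """Score candidates under a spherical-Gaussian model of the training
    class (a stand-in for a one-class SVM decision function): higher
    means more positive-like."""
    mu = train.mean(axis=0)
    return -np.linalg.norm(candidates - mu, axis=1)

def sample_negatives(positives, unlabeled, n):
    """Pick the n unlabeled pairs least similar to the positive class,
    rather than sampling at random, to limit false negatives."""
    scores = one_class_scores(positives, unlabeled)
    return np.argsort(scores)[:n]        # indices of least positive-like

idx = sample_negatives(positives, unlabeled, 50)
# Selected negatives should come mostly from the far-away block (idx >= 100).
print(float(np.mean(idx >= 100)))
```

    The same structure applies with a real one-class SVM: replace `one_class_scores` with the fitted model's decision function.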

  10. Interval sampling methods and measurement error: a computer simulation.

    PubMed

    Wirth, Oliver; Slaven, James; Taylor, Matthew A

    2014-01-01

    A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
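
    The three recording methods can be simulated in miniature. This toy version (our construction, not the study's program) places one event stream on a fine time grid and compares each method's estimate with the true cumulative duration; it reproduces the well-known directional biases of partial- and whole-interval recording.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Analog" ground truth: a behavior that is either occurring (1) or not (0),
# on a fine 10 ms grid over a 10-minute observation period.
dt, period = 0.01, 600.0
t = np.arange(0.0, period, dt)
state = np.zeros_like(t)
for start in rng.uniform(0, period - 5, size=40):     # 40 events, 2-5 s long
    dur = rng.uniform(2, 5)
    state[(t >= start) & (t < start + dur)] = 1.0
true_fraction = state.mean()              # true proportion of time occupied

def interval_estimates(state, t, interval=10.0):
    """Estimate occupancy with the three interval sampling methods."""
    step = t[1] - t[0]
    edges = np.arange(0.0, t[-1] + step + 1e-6, interval)
    mts, pir, wir = [], [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        chunk = state[(t >= lo) & (t < hi)]
        mts.append(chunk[-1])             # momentary: state at interval end
        pir.append(float(chunk.any()))    # partial: any occurrence scores 1
        wir.append(float(chunk.all()))    # whole: full occupancy scores 1
    return np.mean(mts), np.mean(pir), np.mean(wir)

mts, pir, wir = interval_estimates(state, t)
# Partial-interval recording overestimates and whole-interval recording
# underestimates cumulative duration; momentary time sampling is unbiased
# on average.
print(round(true_fraction, 3), round(mts, 3), round(pir, 3), round(wir, 3))
```

    Sweeping `interval` and the event-duration distribution, as the study does, maps out how each method's error grows or shrinks with those parameters.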

  11. Estimation of pyrethroid pesticide intake using regression ...

    EPA Pesticide Factsheets

    Population-based estimates of pesticide intake are needed to characterize exposure for particular demographic groups based on their dietary behaviors. Regression modeling performed on measurements of selected pesticides in composited duplicate diet samples allowed (1) estimation of pesticide intakes for a defined demographic community and (2) comparison of dietary pesticide intakes between the composite and individual samples. Extant databases were useful for assigning individual samples to composites, but they could not provide the breadth of information needed to ensure measurable levels in every composite. Composite sample measurements were found to be good predictors of pyrethroid pesticide levels in their individual sample constituents when sufficient measurements are available above the method detection limit. Statistical inference shows little evidence of differences between individual and composite measurements and suggests that regression modeling of food groups based on composite dietary samples may provide an effective tool for estimating dietary pesticide intake for a defined population. The research presented in the journal article will improve the community's ability to determine exposures through the dietary route with a less burdensome and costly method.

  12. What a drop can do: dried blood spots as a minimally invasive method for integrating biomarkers into population-based research.

    PubMed

    McDade, Thomas W; Williams, Sharon; Snodgrass, J Josh

    2007-11-01

    Logistical constraints associated with the collection and analysis of biological samples in community-based settings have been a significant impediment to integrative, multilevel bio-demographic and biobehavioral research. However, recent methodological developments have overcome many of these constraints and have expanded the options for incorporating biomarkers into population-based health research in international as well as domestic contexts. In particular, dried blood spot (DBS) sampling, in which drops of whole blood are collected on filter paper from a simple finger prick, provides a minimally invasive method for collecting blood samples in nonclinical settings. After a brief discussion of biomarkers more generally, we review procedures for collecting, handling, and analyzing DBS samples. Advantages of DBS samples compared with venipuncture include the relative ease and low cost of sample collection, transport, and storage. Disadvantages include the requirements for assay development and validation, as well as the relatively small sample volumes. We present the results of a comprehensive literature review of published protocols for the analysis of DBS samples, and we provide a more detailed analysis of protocols for 45 analytes likely to be of particular relevance to population-level health research. Our objective is to provide investigators with the information they need to make informed decisions regarding the appropriateness of blood spot methods for their research interests.

  13. An empirical study using permutation-based resampling in meta-regression

    PubMed Central

    2012-01-01

    Background In meta-regression, as the number of trials in the analyses decreases, the risk of false positives or false negatives increases. This is partly due to the assumption of normality, which may not hold in small samples. Creating a distribution from the observed trials using permutation methods to calculate P values may allow for fewer spurious findings. Permutation has not been empirically tested in meta-regression. The objective of this study was to perform an empirical investigation of the differences in results for meta-analyses on a small number of trials using standard large-sample approaches versus permutation-based methods for meta-regression. Methods We isolated a sample of randomized controlled clinical trials (RCTs) of interventions that have a small number of trials (herbal medicine trials). Trials were grouped by herbal species and condition and assessed for methodological quality using the Jadad scale, and data were extracted for each outcome. Finally, we performed meta-analyses on the primary outcome of each group of trials and meta-regression for methodological quality subgroups within each meta-analysis. We used large-sample methods and permutation methods in our meta-regression modeling. We then compared final models and final P values between methods. Results We collected 110 trials across 5 intervention/outcome pairings and 5 to 10 trials per covariate. When applying large-sample methods and permutation-based methods in our backwards stepwise regression, the covariates in the final models were identical in all cases. The P values for the covariates in the final model were larger in 78% (7/9) of the cases for permutation and identical in 22% (2/9) of the cases. Conclusions We present empirical evidence that permutation-based resampling may not change final models when using backwards stepwise regression, but it may increase P values in meta-regression of multiple covariates for a relatively small number of trials. PMID:22587815
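
    The permutation approach to a meta-regression P value can be sketched as follows. This is a generic, unweighted illustration with invented data: real meta-regression weights trials by inverse variance, which the sketch omits.

```python
import numpy as np

rng = np.random.default_rng(7)

def slope(x, y):
    """OLS slope of y on x."""
    xc = x - x.mean()
    return float(xc @ (y - y.mean()) / (xc @ xc))

def permutation_pvalue(x, y, n_perm=2000, rng=rng):
    """Two-sided P value for the slope, built from the permutation
    distribution rather than a normal approximation, which is the point
    of using permutation when only a handful of trials are available."""
    observed = abs(slope(x, y))
    perm = np.array([abs(slope(x, rng.permutation(y)))
                     for _ in range(n_perm)])
    # Add-one correction keeps the P value strictly positive.
    return (1 + np.sum(perm >= observed)) / (1 + n_perm)

# Toy data: 8 trials, effect size loosely related to a quality covariate.
quality = np.array([1., 2., 2., 3., 3., 4., 5., 5.])
effect = 0.5 * quality + rng.normal(scale=1.0, size=8)
p = permutation_pvalue(quality, effect)
print(0.0 < p <= 1.0)
```

    With so few trials the permutation distribution is coarse, which is exactly why permutation P values tend to be larger (more conservative) than their large-sample counterparts, matching the study's observation.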

  14. TL and ESR based identification of gamma-irradiated frozen fish using different hydrolysis techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho

    2014-12-01

    Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0-10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content of the samples. The same limitations were found in thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in recovering enough minerals to carry out TL analysis, which was reconfirmed through the normalization step of calculating the TL ratios (TL1/TL2). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radicals than the alkaline method. Different hydrolysis methods could extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets.
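
    The normalization step reduces to a ratio test: the natural TL glow (TL1) is divided by the glow measured after a calibrated re-irradiation (TL2), which cancels differences in mineral yield between samples. The decision thresholds below follow the commonly cited EN 1788 convention and are our assumption, not values from this study.

```python
def classify_tl_ratio(tl1, tl2, upper=0.5, lower=0.1):
    """Classify a sample from its TL1/TL2 glow ratio.

    TL1 is the natural thermoluminescence signal; TL2 is measured after a
    known re-irradiation dose, normalizing away mineral-yield differences.
    Thresholds (assumed here, per the usual EN 1788 convention):
    ratio > upper -> irradiated; ratio < lower -> not irradiated;
    otherwise inconclusive.
    """
    ratio = tl1 / tl2
    if ratio > upper:
        return "irradiated"
    if ratio < lower:
        return "not irradiated"
    return "inconclusive"

print(classify_tl_ratio(8.0, 10.0))   # strong natural signal
print(classify_tl_ratio(0.3, 10.0))   # weak natural signal
```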

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jenkins, T.F.; Thorne, P.G.; Myers, K.F.

    Salting-out solvent extraction (SOE) was compared with cartridge and membrane solid-phase extraction (SPE) for preconcentration of nitroaromatics, nitramines, and aminonitroaromatics prior to determination by reversed-phase high-performance liquid chromatography. The solid phases used were manufacturer-cleaned materials: Porapak RDX for the cartridge method and Empore SDB-RPS for the membrane method. Thirty-three groundwater samples from the Naval Surface Warfare Center, Crane, Indiana, were analyzed using the direct analysis protocol specified in SW-846 Method 8330, and the results were compared with analyses conducted after preconcentration using SOE with acetonitrile, cartridge-based SPE, and membrane-based SPE. For high-concentration samples, analytical results from the three preconcentration techniques were compared with results from the direct analysis protocol; good recovery of all target analytes was achieved by all three preconcentration methods. For low-concentration samples, results from the two SPE methods were correlated with results from the SOE method; very similar data were obtained by the SOE and SPE methods, even at concentrations well below 1 microgram/L.

  16. Differential expression analysis for RNAseq using Poisson mixed models

    PubMed Central

    Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny

    2017-01-01

    Abstract Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random-effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n <15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three datasets compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. PMID:28369632
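
    The model's key structural idea, a Poisson likelihood whose log rate carries Gaussian random-effect terms, can be illustrated by simulation: adding the latent Gaussian term makes the marginal counts over-dispersed (variance exceeding the mean), which a plain Poisson cannot capture. This sketch shows only the generative side, not MACAU's inference algorithm, and the parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

n = 20000
log_mu = np.log(5.0)                       # shared baseline log rate

# Plain Poisson: variance equals the mean by construction.
plain = rng.poisson(np.exp(log_mu), size=n)

# Poisson with a Gaussian random effect on the log rate: marginally a
# log-normal mixture of Poissons, hence over-dispersed.
effect = rng.normal(0.0, 0.5, size=n)      # per-sample latent term
mixed = rng.poisson(np.exp(log_mu + effect))

print(round(plain.var() / plain.mean(), 2))   # close to 1
print(round(mixed.var() / mixed.mean(), 2))   # clearly above 1
```

    In the full model the latent terms are not independent draws but are correlated across samples through a kinship or confounding structure, which is what lets the model absorb sample non-independence.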

  17. Generalized analog thresholding for spike acquisition at ultralow sampling rates

    PubMed Central

    He, Bryan D.; Wein, Alex; Varshney, Lav R.; Kusuma, Julius; Richardson, Andrew G.

    2015-01-01

    Efficient spike acquisition techniques are needed to bridge the divide from creating large multielectrode arrays (MEAs) to achieving whole-cortex electrophysiology. In this paper, we introduce generalized analog thresholding (gAT), which achieves millisecond temporal resolution with sampling rates as low as 10 Hz. Consider the torrent of data from a single 1,000-channel MEA, which would generate more than 3 GB/min using standard 30-kHz Nyquist sampling. Recent neural signal processing methods based on compressive sensing still require Nyquist sampling as a first step and use iterative methods to reconstruct spikes. Analog thresholding (AT) remains the best existing alternative, where spike waveforms are passed through an analog comparator and sampled at 1 kHz, with instant spike reconstruction. By generalizing AT, the new method reduces sampling rates by another order of magnitude, detects more than one spike per interval, and reconstructs spike width. Unlike compressive sensing, the new method reveals a simple closed-form solution to achieve instant (noniterative) spike reconstruction. The base method is already robust to hardware nonidealities, including realistic quantization error and integration noise. Because it achieves these considerable specifications using hardware-friendly components like integrators and comparators, generalized AT could translate large-scale MEAs into implantable devices for scientific investigation and medical technology. PMID:25904712
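
    One way to see how a comparator plus integrators permits closed-form reconstruction at a low sampling rate: integrating the binary comparator output over a sampling window yields the supra-threshold duration (spike width), and a second, time-weighted integrator yields its centroid (spike time). This is a simplified single-spike illustration of the idea, not the paper's gAT circuit.

```python
import numpy as np

# Fine grid standing in for the analog domain: 0.1 ms resolution over one
# 100 ms sampling window (i.e., a 10 Hz output sample rate).
dt, window = 1e-4, 0.1
t = np.arange(0.0, window, dt)

# Binary comparator output: one threshold crossing at 43.0-44.2 ms.
spike_on, spike_off = 0.0430, 0.0442
c = ((t >= spike_on) & (t < spike_off)).astype(float)

# Two analog integrators, each read out once per window (one sample each).
s0 = np.sum(c) * dt            # integral of c(t): supra-threshold duration
s1 = np.sum(t * c) * dt        # integral of t*c(t): time-weighted mass

width = s0                     # recovered spike width
center = s1 / s0               # recovered spike centroid, in closed form
print(round(width * 1e3, 2), round(center * 1e3, 2))  # milliseconds
```

    Two low-rate samples thus pin down a spike's timing to well under a millisecond; handling multiple spikes per window requires the additional structure the paper develops.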

  18. Factors affecting the relationship between quantitative polymerase chain reaction (qPCR) and culture-based enumeration of Enterococcus in environmental waters.

    PubMed

    Raith, M R; Ebentier, D L; Cao, Y; Griffith, J F; Weisberg, S B

    2014-03-01

    To determine the extent to which discrepancies between qPCR and culture-based results in beach water quality monitoring can be attributed to: (i) within-method variability, (ii) between-method difference within each method class (qPCR or culture) and (iii) between-class difference. We analysed 306 samples using two culture-based (EPA1600 and Enterolert) and two qPCR (Taqman and Scorpion) methods, each in duplicate. Both qPCR methods correlated with EPA1600, but regression analyses indicated approximately 0.8 log10 unit overestimation by qPCR compared to culture methods. Differences between methods within a class were less than half of this and were minimal between replicates within a method. Using the 104 Enterococcus per 100 ml management decision threshold, Taqman qPCR indicated the same decisions as EPA1600 for 87% of the samples, but indicated beach posting for unhealthful water when EPA1600 did not for 12% of the samples. After accounting for within-method and within-class variability, 8% of the samples exhibited true between-class discrepancy, where both qPCR methods indicated beach posting while both culture methods did not. Measurement target difference (DNA vs. growth) accounted for the majority of the qPCR-versus-culture discrepancy, but its influence on monitoring application is outweighed by frequent incorrect posting with culture methods due to incubation time delay. This is the first study to quantify the frequency with which culture-versus-qPCR discrepancies can be attributed to target difference versus method variability. © 2013 The Society for Applied Microbiology.
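
    The management-decision comparison can be reproduced in miniature: simulate culture counts, offset qPCR by the reported ~0.8 log10 units plus method noise, and tabulate agreement at the 104 Enterococcus per 100 ml threshold. The distributions and noise levels here are invented for illustration, not fit to the study's data.

```python
import numpy as np

rng = np.random.default_rng(11)
THRESHOLD = 104.0                          # Enterococcus per 100 ml

# Simulated culture-based counts (log10 scale) for 306 water samples.
log_culture = rng.normal(loc=1.6, scale=0.7, size=306)

# qPCR overestimates culture by ~0.8 log10 units, plus method noise.
log_qpcr = log_culture + 0.8 + rng.normal(scale=0.25, size=306)

post_culture = 10 ** log_culture > THRESHOLD
post_qpcr = 10 ** log_qpcr > THRESHOLD

agree = float(np.mean(post_culture == post_qpcr))
qpcr_only = float(np.mean(post_qpcr & ~post_culture))  # qPCR posts, culture not
print(round(agree, 2), round(qpcr_only, 2))
```

    Because the offset is one-directional, essentially all disagreements are qPCR-only postings, mirroring the asymmetry the study reports.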

  19. Towards establishing a human fecal contamination index in microbial source tracking

    EPA Science Inventory

    There have been significant advances in development of PCR-based methods to detect source associated DNA sequences (markers), but method evaluation has focused on performance with individual challenge samples. Little attention has been given to integration of multiple samples fro...

  20. The internet as psychological laboratory.

    PubMed

    Skitka, Linda J; Sargis, Edward G

    2006-01-01

    This chapter reviews studies published in American Psychological Association (APA) journals from 2003-2004, plus additional studies (received in response to listserv requests) that used the Internet to collect data (N=121 total studies). Specific examples of three kinds of Web-based research are reviewed: (a) translational (established methods and research questions are adapted to the Web), (b) phenomenological (behavior on the Web is the focus of study), and (c) novel (methodological innovations unique to Web-based research). Among other findings, our review indicated that 21% of APA journals published at least one article reporting on Web-based research, most Web-based psychological research uses experimental methods, a surprising number of studies use college student samples, and deception in Web-based research is not uncommon. Strengths and weaknesses of Web-based psychological research in general, and of our sample of studies in particular, are reviewed with special attention to possible concerns about sampling and the use of deception.

  1. Semi-Supervised Marginal Fisher Analysis for Hyperspectral Image Classification

    NASA Astrophysics Data System (ADS)

    Huang, H.; Liu, J.; Pan, Y.

    2012-07-01

    The problem of learning with both labeled and unlabeled examples arises frequently in hyperspectral image (HSI) classification. Marginal Fisher analysis, however, is a supervised method and cannot be directly applied to semi-supervised classification. In this paper, we propose a novel method, called semi-supervised marginal Fisher analysis (SSMFA), to process HSIs of natural scenes using a combination of semi-supervised learning and manifold learning. In SSMFA, a new difference-based optimization objective function incorporating unlabeled samples has been designed. SSMFA preserves the manifold structure of labeled and unlabeled samples in addition to separating labeled samples of different classes from each other. The semi-supervised method has an analytic form of the globally optimal solution, which can be computed by eigendecomposition. Classification experiments on a challenging HSI task demonstrate that this method outperforms current state-of-the-art HSI classification methods.
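
    The closed-form character of such solutions can be illustrated with a simplified variant: build a between-class scatter from labeled samples, a compactness term from class-centered labeled samples plus centered unlabeled samples (standing in for the manifold-preserving graph term), and obtain the projection from a generalized eigendecomposition solved via Cholesky whitening. This is our schematic reduction, not the exact SSMFA objective.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two labeled classes separated along feature 0, plus unlabeled samples.
Xa = rng.normal(size=(100, 10)); Xa[:, 0] += 3.0
Xb = rng.normal(size=(100, 10)); Xb[:, 0] -= 3.0
Xu = rng.normal(size=(100, 10))              # unlabeled, no class structure

# Separability term: between-class scatter from the labeled class means.
ma, mb = Xa.mean(0), Xb.mean(0)
m = (ma + mb) / 2.0
Sb = np.outer(ma - m, ma - m) + np.outer(mb - m, mb - m)

# Compactness/manifold term: scatter of class-centered labeled samples and
# centered unlabeled samples, lightly regularized to stay invertible.
Zc = np.vstack([Xa - ma, Xb - mb, Xu - Xu.mean(0)])
Sw = Zc.T @ Zc / len(Zc) + 1e-3 * np.eye(10)

# Globally optimal projection from the generalized eigenproblem
# Sb v = lam * Sw v, solved in closed form via Cholesky whitening.
L = np.linalg.cholesky(Sw)
Linv = np.linalg.inv(L)
lam, U = np.linalg.eigh(Linv @ Sb @ Linv.T)
v = Linv.T @ U[:, -1]                        # top eigenvector, input space
v = v / np.linalg.norm(v)

# The learned direction should recover the class-separating feature.
print(abs(v[0]))
```

    The unlabeled samples enter only through `Sw` here; SSMFA instead couples them in through graph affinities, but the solve remains a single eigendecomposition either way.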

  2. Dynamic spiking studies using the DNPH sampling train

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steger, J.L.; Knoll, J.E.

    1996-12-31

    The proposed aldehyde and ketone sampling method using aqueous 2,4-dinitrophenylhydrazine (DNPH) was evaluated in the laboratory and in the field. The sampling trains studied were based on the train described in SW-846 Method 0011. Nine compounds were evaluated: formaldehyde, acetaldehyde, quinone, acrolein, propionaldehyde, methyl isobutyl ketone, methyl ethyl ketone, acetophenone, and isophorone. In the laboratory, the trains were spiked both statically and dynamically. Laboratory studies also investigated potential interferences to the method. Based on their potential to hydrolyze in acid solution to form formaldehyde, dimethylolurea, saligenin, s-trioxane, hexamethylenetetramine, and paraformaldehyde were investigated. Ten runs were performed using quadruplicate sampling trains. Two of the four trains were dynamically spiked with the nine aldehydes and ketones. The test results were evaluated using the EPA Method 301 criteria for method precision (±50% relative standard deviation) and bias (correction factor of 1.00 ± 0.30).

  3. Measurement of the Microwave Refractive Index of Materials Based on Parallel Plate Waveguides

    NASA Astrophysics Data System (ADS)

    Zhao, F.; Pei, J.; Kan, J. S.; Zhao, Q.

    2017-12-01

    An electrical field scanning apparatus based on the parallel plate waveguide method is constructed, which collects amplitude and phase matrices as a function of relative position. On the basis of these data, a method for calculating the refractive index of measured wedge samples is proposed. The measurement and calculation results for different PTFE samples reveal that the refractive index measured by the apparatus is substantially consistent with the refractive index inferred from the permittivity of the sample. The refractive index calculation method proposed in this paper is competitive for characterizing the refractive index of materials with a positive refractive index. Since the apparatus and method can measure and calculate for arbitrary directions of microwave propagation, it is believed that both can be applied to negative-refractive-index materials, such as metamaterials or “left-handed” materials.

  4. A novel method for sex determination by detecting the number of X chromosomes.

    PubMed

    Nakanishi, Hiroaki; Shojo, Hideki; Ohmori, Takeshi; Hara, Masaaki; Takada, Aya; Adachi, Noboru; Saito, Kazuyuki

    2015-01-01

    A novel method for sex determination, based on detecting the number of X chromosomes, was established. Current methods, based on detection of the Y chromosome, can directly identify an unknown sample as male, but female gender is determined indirectly, by failure to detect the Y chromosome. A direct determination of female gender is important because quality issues with the Y-chromosome DNA (e.g., fragmentation or an amelogenin-Y null allele) may lead to a false result. We therefore developed a novel sex determination method that analyzes the number of X chromosomes using a copy number variation (CNV) detection technique (the comparative Ct method). In this study, we designed a primer set targeting the amelogenin-X gene outside the CNV region to determine the X-chromosome copy number, excluding the influence of the CNV region on the comparative Ct value. The number of X chromosomes was determined statistically using the CopyCaller software with real-time PCR. All DNA samples from the participants (20 males, 20 females) were evaluated correctly using this method with 1 ng of template DNA. A minimum of 0.2 ng of template DNA was found to be necessary for accurate sex determination with this method. When ultraviolet-irradiated template DNA was used as a mock forensic sample, the sex of the samples could not be determined by short tandem repeat (STR) analysis but was correctly determined using our method. Thus, we successfully developed a method of sex determination based on the number of X chromosomes. Our novel method will be useful in forensic practice for sex determination.
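
    The comparative Ct arithmetic behind the copy number call can be sketched as follows. The efficiency-of-2 assumption and the rounding of the estimate to an integer copy number are generic qPCR conventions assumed here, not parameters taken from the study.

```python
def x_copy_number(ct_target, ct_reference, calib_dct, reference_copies=2):
    """Estimate X-chromosome copy number by the comparative Ct method.

    ct_target / ct_reference: Ct values of the amelogenin-X and autosomal
    reference assays in the test sample. calib_dct: the same Ct difference
    measured in a calibrator known to carry `reference_copies` copies
    (e.g., a female sample with two X chromosomes). Assumes amplification
    efficiency of 2 (one doubling per cycle).
    """
    ddct = (ct_target - ct_reference) - calib_dct
    return reference_copies * 2.0 ** (-ddct)

def sex_from_copies(copies):
    """One X copy -> male (XY), two -> female (XX)."""
    n = round(copies)
    return {1: "male", 2: "female"}.get(n, "inconclusive")

# Female calibrator: target and reference Ct nearly equal (delta Ct = 0).
# A male sample has half the X template, so its target Ct lags ~1 cycle.
male = x_copy_number(ct_target=26.0, ct_reference=25.0, calib_dct=0.0)
female = x_copy_number(ct_target=25.0, ct_reference=25.0, calib_dct=0.0)
print(round(male, 1), sex_from_copies(male))
print(round(female, 1), sex_from_copies(female))
```

    Tools like CopyCaller perform essentially this calculation with replicate Ct values and a confidence estimate on the integer call.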

  5. Highly simplified lateral flow-based nucleic acid sample preparation and passive fluid flow control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cary, Robert E.

    2015-12-08

    Highly simplified lateral flow chromatographic nucleic acid sample preparation methods, devices, and integrated systems are provided for the efficient concentration of trace samples and the removal of nucleic acid amplification inhibitors. Methods for capturing and reducing inhibitors of nucleic acid amplification reactions, such as humic acid, using polyvinylpyrrolidone treated elements of the lateral flow device are also provided. Further provided are passive fluid control methods and systems for use in lateral flow assays.

  6. Highly simplified lateral flow-based nucleic acid sample preparation and passive fluid flow control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cary, Robert B.

    Highly simplified lateral flow chromatographic nucleic acid sample preparation methods, devices, and integrated systems are provided for the efficient concentration of trace samples and the removal of nucleic acid amplification inhibitors. Methods for capturing and reducing inhibitors of nucleic acid amplification reactions, such as humic acid, using polyvinylpyrrolidone treated elements of the lateral flow device are also provided. Further provided are passive fluid control methods and systems for use in lateral flow assays.

  7. Possible overestimation of surface disinfection efficiency by assessment methods based on liquid sampling procedures as demonstrated by in situ quantification of spore viability.

    PubMed

    Grand, I; Bellon-Fontaine, M-N; Herry, J-M; Hilaire, D; Moriconi, F-X; Naïtali, M

    2011-09-01

    The standard test methods used to assess the efficiency of a disinfectant applied to surfaces are often based on counting the microbial survivors sampled in a liquid, but total cell removal from surfaces is seldom achieved. One might therefore wonder whether evaluations of microbial survivors in liquid-sampled cells are representative of the levels of survivors in whole populations. The present study was thus designed to determine the "damaged/undamaged" status induced by a peracetic acid disinfection for Bacillus atrophaeus spores deposited on glass coupons directly on this substrate and to compare it to the status of spores collected in liquid by a sampling procedure. The method utilized to assess the viability of both surface-associated and liquid-sampled spores included fluorescence labeling with a combination of Syto 61 and Chemchrome V6 dyes and quantifications by analyzing the images acquired by confocal laser scanning microscopy. The principal result of the study was that the viability of spores sampled in the liquid was found to be poorer than that of surface-associated spores. For example, after 2 min of peracetic acid disinfection, less than 17% ± 5% of viable cells were detected among liquid-sampled cells compared to 79% ± 5% or 47% ± 4%, respectively, when the viability was evaluated on the surface after or without the sampling procedure. Moreover, assessments of the survivors collected in the liquid phase, evaluated using the microscopic method and standard plate counts, were well correlated. Evaluations based on the determination of survivors among the liquid-sampled cells can thus overestimate the efficiency of surface disinfection procedures.

  8. Possible Overestimation of Surface Disinfection Efficiency by Assessment Methods Based on Liquid Sampling Procedures as Demonstrated by In Situ Quantification of Spore Viability

    PubMed Central

    Grand, I.; Bellon-Fontaine, M.-N.; Herry, J.-M.; Hilaire, D.; Moriconi, F.-X.; Naïtali, M.

    2011-01-01

    The standard test methods used to assess the efficiency of a disinfectant applied to surfaces are often based on counting the microbial survivors sampled in a liquid, but total cell removal from surfaces is seldom achieved. One might therefore wonder whether evaluations of microbial survivors in liquid-sampled cells are representative of the levels of survivors in whole populations. The present study was thus designed to determine the “damaged/undamaged” status induced by a peracetic acid disinfection for Bacillus atrophaeus spores deposited on glass coupons directly on this substrate and to compare it to the status of spores collected in liquid by a sampling procedure. The method utilized to assess the viability of both surface-associated and liquid-sampled spores included fluorescence labeling with a combination of Syto 61 and Chemchrome V6 dyes and quantifications by analyzing the images acquired by confocal laser scanning microscopy. The principal result of the study was that the viability of spores sampled in the liquid was found to be poorer than that of surface-associated spores. For example, after 2 min of peracetic acid disinfection, less than 17% ± 5% of viable cells were detected among liquid-sampled cells compared to 79% ± 5% or 47% ± 4%, respectively, when the viability was evaluated on the surface after or without the sampling procedure. Moreover, assessments of the survivors collected in the liquid phase, evaluated using the microscopic method and standard plate counts, were well correlated. Evaluations based on the determination of survivors among the liquid-sampled cells can thus overestimate the efficiency of surface disinfection procedures. PMID:21742922

  9. Method for sampling and analysis of volatile biomarkers in process gas from aerobic digestion of poultry carcasses using time-weighted average SPME and GC-MS.

    PubMed

    Koziel, Jacek A; Nguyen, Lam T; Glanville, Thomas D; Ahn, Heekwon; Frana, Timothy S; Hans van Leeuwen, J

    2017-10-01

    A passive sampling method, using retracted solid-phase microextraction (SPME) coupled with gas chromatography-mass spectrometry and time-weighted averaging, was developed and validated for tracking marker volatile organic compounds (VOCs) emitted during aerobic digestion of biohazardous animal tissue. The retracted SPME configuration protects the fragile fiber from buffeting by the process gas stream, and it requires less equipment and is potentially more biosecure than conventional active sampling methods. VOC concentrations predicted via a model based on Fick's first law of diffusion were within 6.6-12.3% of experimentally controlled values after accounting for VOC adsorption to the SPME fiber housing. Method detection limits for five marker VOCs ranged from 0.70 to 8.44 ppbv and were statistically equivalent (p > 0.05) to those for active sorbent-tube-based sampling. A sampling time of 30 min and a fiber retraction of 5 mm were found to be optimal for the tissue digestion process. Copyright © 2017 Elsevier Ltd. All rights reserved.
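    The Fick's-law model used to back-calculate a time-weighted average concentration from a retracted-fiber diffusive sampler can be sketched in a few lines: C = n·Z/(D·A·t), where n is the amount adsorbed on the fiber, Z the retraction (diffusion path) length, A the needle opening area, D the analyte diffusion coefficient, and t the sampling time. All numeric values below are invented for illustration, not taken from the paper.

```python
def twa_concentration(n_mol, Z_cm, D_cm2_s, A_cm2, t_s):
    """Time-weighted average gas concentration (mol/cm^3) for a retracted
    SPME fiber acting as a diffusive sampler: C = n * Z / (D * A * t)."""
    return n_mol * Z_cm / (D_cm2_s * A_cm2 * t_s)

# Illustrative values only: 30 min sampling, 0.5 cm retraction depth
c = twa_concentration(n_mol=2.0e-12, Z_cm=0.5, D_cm2_s=0.08,
                      A_cm2=4.3e-4, t_s=30 * 60)
```

    Because t appears in the denominator, a longer retraction (larger Z) or shorter sampling time raises the concentration inferred from the same adsorbed amount.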

  10. Two-dimensional imaging of two types of radicals by the CW-EPR method

    NASA Astrophysics Data System (ADS)

    Czechowski, Tomasz; Krzyminiewski, Ryszard; Jurga, Jan; Chlewicki, Wojciech

    2008-01-01

    The CW-EPR method of image reconstruction is based on sample rotation in a magnetic field with a constant gradient (50 G/cm). In order to obtain a projection (radical density distribution) along a given direction, the EPR spectra are recorded with and without the gradient. Deconvolution then gives the distribution of the spin density. Projections at 36 different angles give the information necessary for reconstruction of the radical distribution. The problem becomes more complex when there are at least two types of radicals in the sample, because the deconvolution procedure does not give satisfactory results. We propose a method, based on iterative procedures, to calculate the projections for each radical. The images of density distribution for each radical obtained by our procedure have proved that the method of deconvolution, in combination with iterative fitting, provides correct results. The test was performed on a sample of polymer PPS Br 111 (p-phenylene sulphide) with glass fibres and minerals. The results indicated a heterogeneous distribution of radicals in the sample volume. The images obtained were in agreement with the known shape of the sample.
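    The deconvolution step described above (dividing out the zero-gradient lineshape from the gradient-on spectrum to recover the spin-density projection) can be sketched as a regularized Fourier-domain division. The profile, lineshape, and regularization constant below are illustrative, not values from the paper.

```python
import numpy as np

# Single-radical case: the gradient-on spectrum is the zero-gradient
# lineshape convolved with the spin-density projection, so a Wiener-style
# Fourier division recovers the projection.
x = np.linspace(-5, 5, 512)
lineshape = np.exp(-x**2 / 0.1)                # zero-gradient EPR line
density = np.where(np.abs(x) < 2, 1.0, 0.0)    # true spin-density projection

gradient_spectrum = np.convolve(density, lineshape, mode="same")

L = np.fft.fft(np.fft.ifftshift(lineshape))    # center kernel at index 0
S = np.fft.fft(gradient_spectrum)
eps = 1e-3 * np.abs(L).max()                   # regularization constant
recovered = np.real(np.fft.ifft(S * np.conj(L) / (np.abs(L) ** 2 + eps ** 2)))
recovered /= recovered.max()
```

    With two overlapping radical lineshapes this direct division fails, which is the point where the paper introduces the iterative per-radical fitting.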

  11. Chapter A5. Section 6.1.F. Wastewater, Pharmaceutical, and Antibiotic Compounds

    USGS Publications Warehouse

    Lewis, Michael Edward; Zaugg, Steven D.

    2003-01-01

    The USGS differentiates between samples collected for analysis of wastewater compounds and those collected for analysis of pharmaceutical and antibiotic compounds, based on the analytical schedule for the laboratory method. Currently, only the wastewater laboratory method for field-filtered samples (SH1433) is an approved, routine (production) method. (The unfiltered wastewater method LC 8033 also is available but requires a proposal for custom analysis.) At this time, analysis of samples for pharmaceutical and antibiotic compounds is confined to research studies and is available only on a custom basis.

  12. Identification and Quantification of Microplastics in Wastewater Using Focal Plane Array-Based Reflectance Micro-FT-IR Imaging.

    PubMed

    Tagg, Alexander S; Sapp, Melanie; Harrison, Jesse P; Ojeda, Jesús J

    2015-06-16

    Microplastics (<5 mm) have been documented in environmental samples on a global scale. While these pollutants may enter aquatic environments via wastewater treatment facilities, the abundance of microplastics in these matrices has not been investigated. Although efficient methods for the analysis of microplastics in sediment samples and marine organisms have been published, no methods have been developed for detecting these pollutants within organic-rich wastewater samples. In addition, there is no standardized method for analyzing microplastics isolated from environmental samples. In many cases, part of the identification protocol relies on visual selection before analysis, which is open to bias. In order to address this, a new method for the analysis of microplastics in wastewater was developed. A pretreatment step using 30% hydrogen peroxide (H2O2) was employed to remove biogenic material, and focal plane array (FPA)-based reflectance micro-Fourier-transform (FT-IR) imaging was shown to successfully image and identify different microplastic types (polyethylene, polypropylene, nylon-6, polyvinyl chloride, polystyrene). Microplastic-spiked wastewater samples were used to validate the methodology, resulting in a robust protocol which was nonselective and reproducible (the overall identification success rate was 98.33%). The use of FPA-based micro-FT-IR spectroscopy also provides a considerable reduction in analysis time compared with previous methods, since samples that could take several days to be mapped using a single-element detector can now be imaged in less than 9 h (circular filter with a diameter of 47 mm). This method for identifying and quantifying microplastics in wastewater is likely to provide an essential tool for further research into the pathways by which microplastics enter the environment.

  13. Improving regression-model-based streamwater constituent load estimates derived from serially correlated data

    USGS Publications Warehouse

    Aulenbach, Brent T.

    2013-01-01

    A regression-model based approach is a commonly used, efficient method for estimating streamwater constituent load when there is a relationship between streamwater constituent concentration and continuous variables such as streamwater discharge, season and time. A subsetting experiment using a 30-year dataset of daily suspended sediment observations from the Mississippi River at Thebes, Illinois, was performed to determine optimal sampling frequency, model calibration period length, and regression model methodology, as well as to determine the effect of serial correlation of model residuals on load estimate precision. Two regression-based methods were used to estimate streamwater loads: the Adjusted Maximum Likelihood Estimator (AMLE) and the composite method, a hybrid load estimation approach. While both methods accurately and precisely estimated loads at the model’s calibration period time scale, precisions were progressively worse at shorter reporting periods, from annually to monthly. Serial correlation in model residuals resulted in observed AMLE precision being significantly worse than the model calculated standard errors of prediction. The composite method effectively improved upon AMLE loads for shorter reporting periods, but required a sampling interval of 15 days or shorter when the serial correlations in the observed load residuals were greater than 0.15. AMLE precision was better at shorter sampling intervals and when using the shortest model calibration periods, such that the regression models better fit the temporal changes in the concentration–discharge relationship. The models with the largest errors typically had poor high flow sampling coverage, resulting in unrepresentative models. Increasing sampling frequency and/or targeted high flow sampling are more efficient approaches than increasing calibration period length for ensuring sufficient sampling and avoiding poorly performing models.
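    As context for the regression-model approach, a minimal log-linear concentration-discharge rating curve with seasonal terms and Duan's smearing retransformation correction (a simpler relative of AMLE, not the paper's implementation) might look like the sketch below, fit to synthetic data.

```python
import numpy as np

# Illustrative rating-curve sketch: ln(C) regressed on ln(Q), seasonal
# harmonics, and a time trend, with Duan's smearing factor correcting the
# bias introduced when back-transforming from log space. Synthetic data only.
rng = np.random.default_rng(0)
t = np.linspace(0, 3, 300)                       # decimal years
Q = np.exp(rng.normal(4.0, 0.8, t.size))         # discharge
true_lnC = -1.0 + 0.6 * np.log(Q) + 0.3 * np.sin(2 * np.pi * t)
lnC = true_lnC + rng.normal(0, 0.3, t.size)      # observed log concentration

X = np.column_stack([np.ones_like(t), np.log(Q),
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t])
beta, *_ = np.linalg.lstsq(X, lnC, rcond=None)
resid = lnC - X @ beta
smearing = np.mean(np.exp(resid))                # Duan (1983) correction

C_hat = smearing * np.exp(X @ beta)              # predicted concentration
load_hat = C_hat * Q                             # instantaneous load
```

    Serial correlation in `resid` is exactly what the composite method exploits: residuals near sampled days carry information that a purely independent-error model like this one ignores.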

  14. Development of a polymerase chain reaction applicable to rapid and sensitive detection of Clonorchis sinensis eggs in human stool samples

    PubMed Central

    Cho, Pyo Yun; Na, Byoung-Kuk; Mi Choi, Kyung; Kim, Jin Su; Cho, Shin-Hyeong; Lee, Won-Ja; Lim, Sung-Bin; Cha, Seok Ho; Park, Yun-Kyu; Pak, Jhang Ho; Lee, Hyeong-Woo; Hong, Sung-Jong; Kim, Tong-Soo

    2013-01-01

    Microscopic examination of eggs of parasitic helminths in stool samples has been the most widely used classical diagnostic method for infections, but the small size and low numbers of eggs in stool samples often hamper diagnosis of helminthic infections with classical microscopic examination. Moreover, it is also difficult to differentiate parasite eggs by the classical method if they have similar morphological characteristics. In this study, we developed a rapid and sensitive polymerase chain reaction (PCR)-based molecular diagnostic method for detection of Clonorchis sinensis eggs in stool samples. Nine primers were designed based on the long terminal repeat (LTR) of the C. sinensis retrotransposon 1 (CsRn1) gene, and seven PCR primer sets were paired. Polymerase chain reaction with each primer pair produced specific amplicons for C. sinensis, but not for other trematodes including Metagonimus yokogawai and Paragonimus westermani. In particular, three primer sets were able to detect as few as 10 C. sinensis eggs and were applicable to amplify specific amplicons from DNA samples purified from stool of C. sinensis-infected patients. This PCR method could be useful for diagnosis of C. sinensis infections in human stool samples with a high level of specificity and sensitivity. PMID:23916334

  15. A simple distillation method to extract bromine from natural water and salt samples for isotope analysis by multi-collector inductively coupled plasma mass spectrometry.

    PubMed

    Eggenkamp, H G M; Louvat, P

    2018-04-30

    In natural samples bromine is present in trace amounts, and measurement of stable Br isotopes necessitates its separation from the matrix. Most methods described previously need large samples or samples with high Br/Cl ratios. The use of metals as reagents, proposed in previous Br distillation methods, must be avoided for multi-collector inductively coupled plasma mass spectrometry (MC-ICP-MS) analyses, because of risk of cross-contamination, since the instrument is also used to measure stable isotopes of metals. Dedicated to water and evaporite samples with low Br/Cl ratios, the proposed method is a simple distillation that separates bromide from chloride for isotopic analyses by MC-ICP-MS. It is based on the difference in oxidation potential between chloride and bromide in the presence of nitric acid. The sample is mixed with dilute (1:5) nitric acid in a distillation flask and heated over a candle flame for 10 min. The distillate (bromine) is trapped in an ammonia solution and reduced to bromide. Chloride is only distilled to a very small extent. The obtained solution can be measured directly by MC-ICP-MS for stable Br isotopes. The method was tested for a variety of volumes, ammonia concentrations, pH values and distillation times and compared with the classic ion-exchange chromatography method. The method more efficiently separates Br from Cl, so that samples with lower Br/Cl ratios can be analysed, with Br isotope data in agreement with those obtained by previous methods. Unlike other Br extraction methods based on oxidation, the distillation method presented here does not use any metallic ion for redox reactions that could contaminate the mass spectrometer. It is efficient in separating Br from samples with low Br/Cl ratios. The method ensures reproducible recovery yields and a long-term reproducibility of ±0.11‰ (1 standard deviation). The distillation method was successfully applied to samples with low Br/Cl ratios and low Br amounts (down to 20 μg). 
Copyright © 2018 John Wiley & Sons, Ltd.

  16. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    PubMed Central

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Respondent-Driven Sampling is a widely used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328
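    For context, the classic design-based RDS estimator that model-assisted approaches build on is the Volz-Heckathorn (RDS-II) estimator, which weights each respondent inversely to their reported network degree. A minimal sketch on synthetic data follows; this is not the authors' working-model estimator.

```python
import numpy as np

# Volz-Heckathorn (RDS-II) prevalence estimate: respondents with large
# personal networks are over-sampled by link-tracing, so each is weighted
# by 1/degree. Synthetic data for illustration only.
rng = np.random.default_rng(1)
degree = rng.integers(1, 20, size=500)           # reported network sizes
infected = rng.random(500) < 0.2                 # trait of interest

w = 1.0 / degree
vh_estimate = np.sum(w * infected) / np.sum(w)   # weighted prevalence
naive = infected.mean()                          # unweighted sample mean
```

    When the trait is correlated with degree, `vh_estimate` and `naive` diverge; the model-assisted estimator in the paper additionally leverages a fitted network model to reduce bias from the seed selection.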

  17. Network Model-Assisted Inference from Respondent-Driven Sampling Data.

    PubMed

    Gile, Krista J; Handcock, Mark S

    2015-06-01

    Respondent-Driven Sampling is a widely used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.

  18. A robust and efficient statistical method for genetic association studies using case and control samples from multiple cohorts

    PubMed Central

    2013-01-01

    Background The theoretical basis of genome-wide association studies (GWAS) is statistical inference of linkage disequilibrium (LD) between any polymorphic marker and a putative disease locus. Most methods widely implemented for such analyses are vulnerable to several key demographic factors, deliver poor statistical power for detecting genuine associations, and suffer a high false positive rate. Here, we present a likelihood-based statistical approach that properly accounts for the non-random nature of case–control samples with regard to the genotypic distribution at the loci under study, and that confers flexibility to test for genetic association in the presence of confounding factors such as population structure and non-random sampling. Results We implemented this novel method, together with several popular methods in the GWAS literature, to re-analyze recently published Parkinson’s disease (PD) case–control samples. The real data analysis and computer simulation show that the new method confers not only significantly improved statistical power for detecting associations but also robustness to the difficulties stemming from non-random sampling and genetic structure, when compared to its rivals. In particular, the new method detected 44 significant SNPs within 25 chromosomal regions of size < 1 Mb, whereas only 6 SNPs in two of these regions were previously detected by trend-test-based methods. It discovered two SNPs located 1.18 Mb and 0.18 Mb from the PD candidates, FGF20 and PARK8, without incurring false positive risk. Conclusions We developed a novel likelihood-based method which provides adequate estimation of LD and other population model parameters from case and control samples, allows easy integration of samples from multiple genetically divergent populations, and thus confers statistically robust and powerful GWAS analyses.
On the basis of simulation studies and analysis of real datasets, we demonstrated significant improvement of the new method over the non-parametric trend test, the method most commonly implemented in the GWAS literature. PMID:23394771

  19. Base-Calling Algorithm with Vocabulary (BCV) Method for Analyzing Population Sequencing Chromatograms

    PubMed Central

    Fantin, Yuri S.; Neverov, Alexey D.; Favorov, Alexander V.; Alvarez-Figueroa, Maria V.; Braslavskaya, Svetlana I.; Gordukova, Maria A.; Karandashova, Inga V.; Kuleshov, Konstantin V.; Myznikova, Anna I.; Polishchuk, Maya S.; Reshetov, Denis A.; Voiciehovskaya, Yana A.; Mironov, Andrei A.; Chulanov, Vladimir P.

    2013-01-01

    Sanger sequencing is a common method of reading DNA sequences. It is less expensive than high-throughput methods, and it is appropriate for numerous applications including molecular diagnostics. However, sequencing mixtures of similar DNA of pathogens with this method is challenging. This is important because most clinical samples contain such mixtures, rather than pure single strains. The traditional solution is to sequence selected clones of PCR products, a complicated, time-consuming, and expensive procedure. Here, we propose the base-calling with vocabulary (BCV) method that computationally deciphers Sanger chromatograms obtained from mixed DNA samples. The inputs to the BCV algorithm are a chromatogram and a dictionary of sequences that are similar to those we expect to obtain. We apply the base-calling function on a test dataset of chromatograms without ambiguous positions, as well as one with 3–14% sequence degeneracy. Furthermore, we use BCV to assemble a consensus sequence for an HIV genome fragment in a sample containing a mixture of viral DNA variants and to determine the positions of the indels. Finally, we detect drug-resistant Mycobacterium tuberculosis strains carrying frameshift mutations mixed with wild-type bacteria in the pncA gene, and roughly characterize bacterial communities in clinical samples by direct 16S rRNA sequencing. PMID:23382983

  20. Method for charging a hydrogen getter

    DOEpatents

    Tracy, C. Edwin; Keyser, Matthew A.; Benson, David K.

    1998-01-01

    A method for charging a sample of either a permanent or reversible getter material with a high concentration of hydrogen while maintaining a base pressure below 10^-4 torr at room temperature involves placing the sample of hydrogen getter material in a chamber, activating the sample of hydrogen getter material, overcharging the sample of getter material through conventional charging techniques to a high concentration of hydrogen, and then subjecting the sample of getter material to a low temperature vacuum bake-out process. Application of the method results in a reversible hydrogen getter which is highly charged to maximum capacities of hydrogen and which concurrently exhibits minimum hydrogen vapor pressures at room temperatures.

  1. Observer-based output feedback control of networked control systems with non-uniform sampling and time-varying delay

    NASA Astrophysics Data System (ADS)

    Meng, Su; Chen, Jie; Sun, Jian

    2017-10-01

    This paper investigates the problem of observer-based output feedback control for networked control systems with non-uniform sampling and time-varying transmission delay. The sampling intervals are assumed to vary within a given interval. The transmission delay belongs to a known interval. A discrete-time model is first established, which contains time-varying delay and norm-bounded uncertainties coming from non-uniform sampling intervals. It is then converted to an interconnection of two subsystems in which the forward channel is delay-free. The scaled small gain theorem is used to derive the stability condition for the closed-loop system. Moreover, the observer-based output feedback controller design method is proposed by utilising a modified cone complementary linearisation algorithm. Finally, numerical examples illustrate the validity and superiority of the proposed method.

  2. Method and system for laser-based formation of micro-shapes in surfaces of optical elements

    DOEpatents

    Bass, Isaac Louis; Guss, Gabriel Mark

    2013-03-05

    A method of forming a surface feature extending into a sample includes providing a laser operable to emit an output beam and modulating the output beam to form a pulse train having a plurality of pulses. The method also includes a) directing the pulse train along an optical path intersecting an exposed portion of the sample at a position i and b) focusing a first portion of the plurality of pulses to impinge on the sample at the position i. Each of the plurality of pulses is characterized by a spot size at the sample. The method further includes c) ablating at least a portion of the sample at the position i to form a portion of the surface feature and d) incrementing counter i. The method includes e) repeating steps a) through d) to form the surface feature. The sample is free of a rim surrounding the surface feature.

  3. Challenges to be overcome using population-based sampling methods to recruit veterans for a study of post-traumatic stress disorder and traumatic brain injury.

    PubMed

    Bayley, Peter J; Kong, Jennifer Y; Helmer, Drew A; Schneiderman, Aaron; Roselli, Lauren A; Rosse, Stephanie M; Jackson, Jordan A; Baldwin, Janet; Isaac, Linda; Nolasco, Michael; Blackman, Marc R; Reinhard, Matthew J; Ashford, John Wesson; Chapman, Julie C

    2014-04-08

    Many investigators are interested in recruiting veterans from recent conflicts in Afghanistan and Iraq with Traumatic Brain Injury (TBI) and/or Post Traumatic Stress Disorder (PTSD). Researchers pursuing such studies may experience problems in recruiting sufficient numbers unless effective strategies are used. Currently, there is very little information on recruitment strategies for individuals with TBI and/or PTSD. It is known that groups of patients with medical conditions may be less likely to volunteer for clinical research. This study investigated the feasibility of recruiting veterans returning from recent military conflicts, Operation Enduring Freedom (OEF) and Operation Iraqi Freedom (OIF), using a population-based sampling method. Individuals were sampled from a previous epidemiological study. Three study sites focused on recruiting survey respondents (n = 445) who lived within a 60 mile radius of one of the sites. Overall, the successful recruitment of veterans using a population-based sampling method was dependent on the ability to contact potential participants following mass mailing. Study enrollment of participants with probable TBI and/or PTSD had a recruitment yield (enrolled/total identified) of 5.4%. We were able to contact 146 individuals, representing a contact rate of 33%. Sixty-six of the individuals contacted were screened. The major reasons for not screening included a stated lack of interest in the study (n = 37), a failure to answer screening calls after initial contact (n = 30), and an unwillingness or inability to travel to a study site (n = 10). Based on the phone screening, 36 veterans were eligible for the study. Twenty-four veterans were enrolled (recruitment yield = 5.4%), and twelve were not enrolled for a variety of reasons. Our experience with a population-based sampling method for recruitment of recent combat veterans illustrates the challenges encountered, particularly contacting and screening potential participants. 
The screening and enrollment data will help guide recruitment for future studies using population-based methods.
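    The recruitment funnel reported above can be checked directly from the stated counts:

```python
# All counts taken from the abstract above.
identified = 445
contacted = 146
screened = 66
eligible = 36
enrolled = 24

contact_rate = contacted / identified   # ~0.33, matching the stated 33%
yield_rate = enrolled / identified      # ~0.054, the stated 5.4% yield
```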

  4. Systematic sampling for suspended sediment

    Treesearch

    Robert B. Thomas

    1991-01-01

    Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling (obtaining samples with known probabilities) is the only method that...

  5. 14C sample preparation for AMS microdosing studies at Lund University using online combustion and septa-sealed vials

    NASA Astrophysics Data System (ADS)

    Sydoff, Marie; Stenström, Kristina

    2010-04-01

    The Department of Physics at Lund University is participating in a European Union project called EUMAPP (European Union Microdose AMS Partnership Programme), in which sample preparation and accelerator mass spectrometry (AMS) measurements of biological samples from microdosing studies have been made. This paper describes a simplified method of converting biological samples to solid graphite for 14C analysis with AMS. The method is based on online combustion of the samples, and reduction of CO2 in septa-sealed vials. The septa-sealed vials and disposable materials are used to eliminate sample cross-contamination. Measurements of ANU and Ox I standards show deviations of 2% and 3%, respectively, relative to reference values. This level of accuracy is sufficient for biological samples from microdosing studies. Since the method has very few handling steps from sample to graphite, the risk of failure during the sample preparation process is minimized, making the method easy to use in routine preparation of samples.

  6. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected were related to the relative length of larval and breeding periods and tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  7. Sample size determination for mediation analysis of longitudinal data.

    PubMed

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal design. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulations under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., within-subject correlation). A larger value of ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most often encountered in practice are also provided for convenient use. Extensive simulation studies showed that the distribution of the product method and the bootstrapping method have performance superior to Sobel's method, but the distribution of the product method is recommended in practice because it requires less computation time than the bootstrapping method. An R package has been developed for sample size determination via the distribution of the product method in longitudinal mediation study designs.
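    The simulation-based power calculation can be illustrated for the simplest case: a single-level, cross-sectional mediation model X -> M -> Y tested with Sobel's z. The paper's multilevel longitudinal setting is more involved, and the effect sizes, sample sizes, and replication counts below are illustrative only.

```python
import numpy as np

# Monte Carlo power estimate for Sobel's test of the indirect effect a*b
# in a simple mediation model (direct effect set to zero for brevity).
rng = np.random.default_rng(42)

def sobel_power(n, a=0.3, b=0.3, n_rep=500):
    hits = 0
    for _ in range(n_rep):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)
        y = b * m + rng.normal(size=n)
        # no-intercept OLS slopes and their standard errors
        a_hat = np.sum(x * m) / np.sum(x * x)
        sa = np.sqrt(np.sum((m - a_hat * x) ** 2) / (n - 2) / np.sum(x * x))
        b_hat = np.sum(m * y) / np.sum(m * m)
        sb = np.sqrt(np.sum((y - b_hat * m) ** 2) / (n - 2) / np.sum(m * m))
        # Sobel z for the indirect effect a_hat * b_hat
        z = (a_hat * b_hat) / np.sqrt(a_hat**2 * sb**2 + b_hat**2 * sa**2)
        hits += abs(z) > 1.96               # two-sided test, alpha = 0.05
    return hits / n_rep

power = sobel_power(n=100)
```

    To plan a study, this function would be evaluated over a grid of n until the estimated power first exceeds 0.80, which is how the paper's tables are constructed (with a multilevel model in place of this toy one).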

  8. Ultrasonic-based membrane aided sample preparation of urine proteomes.

    PubMed

    Jesus, Jemmyson Romário; Santos, Hugo M; López-Fernández, H; Lodeiro, Carlos; Arruda, Marco Aurélio Zezzi; Capelo, J L

    2018-02-01

    A new ultrafast ultrasonic-based method for shotgun proteomics as well as label-free protein quantification in urine samples is developed. The method first separates the urine proteins using nitrocellulose-based membranes and then proteins are in-membrane digested using trypsin. The enzymatic digestion process is accelerated from overnight to four minutes using a sonoreactor ultrasonic device. Overall, the sample treatment pipeline comprising protein separation, digestion and identification is done in just 3 h. The process is assessed using urine of healthy volunteers. The method shows that males can be differentiated from females using the protein content of urine in a fast, easy and straightforward way. A total of 232 and 226 proteins are identified in the urine of males and females, respectively. Of these, 162 are common to both genders, whilst 70 are unique to males and 64 to females. Of the 162 common proteins, 13 are present at statistically different levels (p < 0.05). The method matches the analytical minimalism concept as outlined by Halls, as each stage of this analysis is evaluated to minimize the time, cost, sample requirement, reagent consumption, energy requirements and production of waste products. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Unconstrained Enhanced Sampling for Free Energy Calculations of Biomolecules: A Review

    PubMed Central

    Miao, Yinglong; McCammon, J. Andrew

    2016-01-01

    Free energy calculations are central to understanding the structure, dynamics and function of biomolecules. Yet insufficient sampling of biomolecular configurations is often regarded as one of the main sources of error. Many enhanced sampling techniques have been developed to address this issue. Notably, enhanced sampling methods based on biasing collective variables (CVs), including the widely used umbrella sampling, adaptive biasing force and metadynamics, have been discussed in a recent excellent review (Abrams and Bussi, Entropy, 2014). Here, we aim to review enhanced sampling methods that do not require predefined system-dependent CVs for biomolecular simulations and as such do not suffer from the hidden energy barrier problem encountered in the CV-biasing methods. These methods include, but are not limited to, replica exchange/parallel tempering, self-guided molecular/Langevin dynamics, essential energy space random walk and accelerated molecular dynamics. While it would be overwhelming to describe each method in full detail, we summarize the methods along with their applications and offer our perspectives. We conclude with challenges and prospects of the unconstrained enhanced sampling methods for accurate biomolecular free energy calculations. PMID:27453631

  10. Unconstrained Enhanced Sampling for Free Energy Calculations of Biomolecules: A Review.

    PubMed

    Miao, Yinglong; McCammon, J Andrew

    Free energy calculations are central to understanding the structure, dynamics and function of biomolecules. Yet insufficient sampling of biomolecular configurations is often regarded as one of the main sources of error. Many enhanced sampling techniques have been developed to address this issue. Notably, enhanced sampling methods based on biasing collective variables (CVs), including the widely used umbrella sampling, adaptive biasing force and metadynamics, have been discussed in a recent excellent review (Abrams and Bussi, Entropy, 2014). Here, we aim to review enhanced sampling methods that do not require predefined system-dependent CVs for biomolecular simulations and as such do not suffer from the hidden energy barrier problem encountered in the CV-biasing methods. These methods include, but are not limited to, replica exchange/parallel tempering, self-guided molecular/Langevin dynamics, essential energy space random walk and accelerated molecular dynamics. While it would be overwhelming to describe each method in full detail, we summarize the methods along with their applications and offer our perspectives. We conclude with challenges and prospects of the unconstrained enhanced sampling methods for accurate biomolecular free energy calculations.

  11. Comparing methods of determining Legionella spp. in complex water matrices.

    PubMed

    Díaz-Flores, Álvaro; Montero, Juan Carlos; Castro, Francisco Javier; Alejandres, Eva María; Bayón, Carmen; Solís, Inmaculada; Fernández-Lafuente, Roberto; Rodríguez, Guillermo

    2015-04-29

    Legionella testing conducted at environmental laboratories plays an essential role in assessing the risk of disease transmission associated with water systems. However, drawbacks of the culture-based methodology used for Legionella enumeration can greatly affect the results and their interpretation, leading to underestimation of the actual risk. Up to 20% of the samples analysed by these laboratories produce inconclusive results, making effective risk management impossible. Overgrowth of competing microbiota has been reported as an important factor in culture failure. For quantitative polymerase chain reaction (qPCR), interpretation of results from environmental samples remains a challenge, and inhibitors may cause up to 10% of inconclusive results. This study compared a quantitative method based on immunomagnetic separation (the IMS method) with culture and qPCR as a new approach to routine monitoring of Legionella. First, pilot studies evaluated the recovery and detectability of Legionella spp. using the IMS method in the presence of microbiota and biocides. IMS results were not affected by microbiota, while culture counts in the same samples were significantly reduced (1.4 log) or negative; biocide damage to viable Legionella was also detected by the IMS method. Secondly, a total of 65 water samples were assayed by all three techniques (culture, qPCR and the IMS method). Of these, 27 (41.5%) were recorded as positive by at least one test. Legionella spp. was detected by culture in 7 (25.9%) of the 27 samples. Eighteen (66.7%) of the 27 samples were positive by the IMS method, thirteen of them with counts below 10³ colony-forming units per liter (CFU L⁻¹); six presented interfering microbiota and three presented PCR inhibition. Of the 65 water samples, 24 presented interfering microbiota by culture and 8 presented partial or complete inhibition of the PCR reaction. Thus, the rate of inconclusive results was 36.9% for culture and 12.3% for qPCR, with no inconclusive results for the IMS method. The IMS method generally improved the recovery and detectability of Legionella in environmental matrices, suggesting that it could serve as a valuable indicator of risk. This method may therefore significantly improve our knowledge of the exposure risk to these bacteria, allowing evidence-based monitoring and disinfection strategies.

  12. Development and validation of a FISH-based method for the detection and quantification of E. coli and coliform bacteria in water samples.

    PubMed

    Hügler, Michael; Böckle, Karin; Eberhagen, Ingrid; Thelen, Karin; Beimfohr, Claudia; Hambsch, Beate

    2011-01-01

    Monitoring of microbiological contaminants in water supplies requires fast and sensitive methods for the specific detection of indicator organisms or pathogens. We developed a protocol for the simultaneous detection of E. coli and coliform bacteria based on fluorescence in situ hybridization (FISH). The protocol consists of two approaches. The first allows direct detection of single E. coli and coliform bacterial cells on the filter membranes. The second includes incubation of the filter membranes on a nutrient agar plate and subsequent detection of the grown micro-colonies. Both approaches were validated using drinking water samples spiked with pure cultures as well as naturally contaminated water samples, and the effects of heat, chlorine and UV disinfection were also investigated. The micro-colony approach yielded very good results for all samples and conditions tested, and can therefore be recommended as an alternative method for detecting E. coli and coliform bacteria in water samples. During this study, however, some limitations became apparent for the single-cell approach: the method cannot be applied to water samples that have been disinfected by UV irradiation, and our results indicated that green fluorescent dyes are not suitable for chlorine-disinfected samples.

  13. [Research on fast detecting tomato seedlings nitrogen content based on NIR characteristic spectrum selection].

    PubMed

    Wu, Jing-zhu; Wang, Feng-zhu; Wang, Li-li; Zhang, Xiao-chao; Mao, Wen-hua

    2015-01-01

    In order to improve the accuracy and robustness of detecting tomato seedling nitrogen content by near-infrared spectroscopy (NIR), four characteristic-spectrum selection methods were studied: competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE), backward interval partial least squares (BiPLS) and synergy interval partial least squares (SiPLS). A total of 60 tomato seedlings were cultivated at 10 nitrogen-treatment levels (urea concentration from 0 to 120 mg·L⁻¹), with 6 samples at each level, covering excess-nitrogen, moderate-nitrogen, nitrogen-deficient and no-nitrogen statuses. Leaves from each sample were scanned by near-infrared spectroscopy from 12,500 to 3,600 cm⁻¹, and quantitative models based on the four selection methods were established. According to the experimental results, the calibration performance of the models based on CARS and MCUVE was better than that of the models based on BiPLS and SiPLS, but their prediction ability was much lower. Among them, the model built with BiPLS had the best prediction performance: the correlation coefficient (r), root mean square error of prediction (RMSEP) and ratio of performance to standard deviation (RPD) were 0.9527, 0.1183 and 3.291, respectively. Therefore, NIR technology combined with characteristic-spectrum selection can improve model performance, but no selection method is universal. Models built on single-wavelength variable selection are more sensitive and better suited to uniform objects, whereas models built on wavelength-interval selection have stronger anti-interference ability and are better suited to uneven objects with poor reproducibility. Characteristic-spectrum selection therefore improves model building only when the sample state and the model indexes are considered together.

  14. Mapping of bird distributions from point count surveys

    USGS Publications Warehouse

    Sauer, J.R.; Pendleton, G.W.; Orsillo, Sandra; Ralph, C.J.; Sauer, J.R.; Droege, S.

    1995-01-01

    Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes in proportion counted as a function of observer or habitat differences. Large-scale surveys also generally suffer from regional and temporal variation in sampling intensity. A simulated surface is used to demonstrate sampling principles for maps.

  15. [Development of a SPA-ELISA method for detecting anti-coronavirus IgG antibodies in serum samples from fulvous fruit bats].

    PubMed

    Zhou, Jie; Liao, Yu-xue; Chen, Zhong; Li, Yu-chun; Gao, Lu-Lu; Chen, Yi-xiong; Cai, Lian-gong; Chen, Qing; Yu, Shou-yi

    2008-05-01

    To develop a simple and sensitive method, based on enzyme-linked immunosorbent assay (ELISA), for detecting anti-coronavirus IgG antibodies in bat sera. A commercial ELISA kit for detecting SARS-CoV antibody was modified for detecting coronavirus antibodies in bat serum samples: the second antibody in the kit was replaced with horseradish peroxidase-conjugated protein A (HRP-SPA), exploiting the binding between Staphylococcus aureus protein A (SPA) and the Fc fragment of mammalian IgG. The sera of 55 fulvous fruit bats (Rousettus dasymallus) were tested using the SPA-ELISA. The results for the positive and negative controls in the kit, and for serum samples from convalescent patients, were consistent with expectations. Coronavirus antibody was detected in 2 of the 55 bat serum samples, and a serum neutralization test confirmed the validity of the SPA-ELISA method. The SPA-ELISA is applicable for detecting coronavirus antibodies in bat sera.

  16. Wavelength Selection Method Based on Differential Evolution for Precise Quantitative Analysis Using Terahertz Time-Domain Spectroscopy.

    PubMed

    Li, Zhi; Chen, Weidong; Lian, Feiyu; Ge, Hongyi; Guan, Aihong

    2017-12-01

    Quantitative analysis of component mixtures is an important application of terahertz time-domain spectroscopy (THz-TDS) and has attracted broad interest in recent research. Although the accuracy of quantitative analysis using THz-TDS is affected by a host of factors, wavelength selection from the sample's THz absorption spectrum is the most crucial one. The raw spectrum consists of the signal from the sample together with scattering and other random disturbances that can critically influence quantitative accuracy. For precise quantitative analysis using THz-TDS, the sample signal needs to be retained while the scattering and other noise sources are eliminated. In this paper, a novel wavelength selection method based on differential evolution (DE) is investigated. By performing quantitative experiments on a series of binary amino acid mixtures using THz-TDS, we demonstrate the efficacy of the DE-based wavelength selection method, which yields an error rate below 5%.
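    DE-based wavelength selection of this kind can be sketched with SciPy's `differential_evolution` on synthetic spectra (the data, the informative band position, and the plain least-squares regressor are illustrative assumptions, not the authors' setup):

```python
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(1)
n_samples, n_wl = 80, 30
conc = rng.uniform(0.0, 1.0, n_samples)     # target concentrations
pure = np.zeros(n_wl)
pure[8:16] = 1.0                            # informative absorption band
# synthetic absorption spectra: Beer-Lambert-like signal plus noise
X = np.outer(conc, pure) + 0.05 * rng.standard_normal((n_samples, n_wl))
train, test = np.arange(0, 60), np.arange(60, 80)

def fitness(v):
    """Held-out RMSE using wavelengths where the DE score exceeds 0.5,
    plus a small penalty per selected wavelength."""
    idx = np.flatnonzero(v > 0.5)
    if idx.size == 0:
        return 1e6
    A = np.column_stack([np.ones(train.size), X[np.ix_(train, idx)]])
    coef, *_ = np.linalg.lstsq(A, conc[train], rcond=None)
    B = np.column_stack([np.ones(test.size), X[np.ix_(test, idx)]])
    rmse = np.sqrt(np.mean((B @ coef - conc[test]) ** 2))
    return rmse + 0.002 * idx.size

result = differential_evolution(fitness, [(0, 1)] * n_wl,
                                seed=1, maxiter=40, popsize=12, tol=1e-6)
selected = np.flatnonzero(result.x > 0.5)
```

    Giving DE a continuous score per wavelength and thresholding it at 0.5 is one common way to cast the discrete selection problem in a form DE can optimize; the selected set should concentrate on the informative band.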

  17. Improvements in soft gelatin capsule sample preparation for USP-based simethicone FTIR analysis.

    PubMed

    Hargis, Amy D; Whittall, Linda B

    2013-02-23

    Due to the absence of a significant chromophore, simethicone raw-material and finished-product analysis is performed using an FTIR-based method that quantifies the polydimethylsiloxane (PDMS) component of the active ingredient. The method can be found in the USP monographs for several dosage forms of simethicone-containing pharmaceutical products. For soft gelatin capsules, the PDMS assay values determined using the procedure described in the USP method were variable (%RSDs from 2 to 9%) and often lower than expected based on raw-material values. Investigation determined that the extraction procedure used for sample preparation caused loss of material to the container walls due to the hydrophobic nature of PDMS. Evaluation revealed that simple dissolution of the gelatin capsule fill in toluene gave improved assay results (%RSDs ≤ 0.5%) as well as a simplified and rapid sample preparation. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. Determination of isocyanate groups in the organic intermediates by reaction-based headspace gas chromatography.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2016-10-14

    This work reports a novel method for the determination of isocyanate groups in related organic intermediates by reaction-based headspace gas chromatography (HS-GC). The method is based on measuring the CO2 formed by the reaction between the isocyanate groups in the organic intermediates and water in a closed headspace sample vial at 45 °C for 20 min. The results showed that the method has good precision and accuracy: the relative standard deviation in the repeatability measurement was 5.26%, and the relative differences between the data obtained by the HS-GC method and the reference back-titration method were within 9.42%. The present method is simple and efficient, and is particularly suitable for determining isocyanate groups in batch sample analysis. Copyright © 2016 Elsevier B.V. All rights reserved.
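    The underlying stoichiometry is one CO2 molecule per isocyanate group (R-NCO + H2O → R-NH2 + CO2), so the NCO content follows from the measured CO2 by simple arithmetic; a sketch with hypothetical numbers (the CO2 amount and sample mass below are invented for illustration):

```python
# Mole-for-mole conversion of headspace CO2 back to NCO content.
M_NCO = 42.02    # g/mol, molar mass of the NCO group (N + C + O)
n_co2 = 2.5e-4   # mol CO2 measured in the vial headspace (hypothetical)
m_sample = 0.50  # g of organic intermediate weighed in (hypothetical)

n_nco = n_co2    # 1:1 stoichiometry of the hydrolysis reaction
nco_wt_percent = 100.0 * n_nco * M_NCO / m_sample
print(f"NCO content: {nco_wt_percent:.2f} wt%")  # → NCO content: 2.10 wt%
```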

  19. Machine vision application in animal trajectory tracking.

    PubMed

    Koniar, Dušan; Hargaš, Libor; Loncová, Zuzana; Duchoň, František; Beňo, Peter

    2016-04-01

    This article was motivated by clinicians' demand for technical support in research on pathologies of the gastrointestinal tract [10], based on machine vision tools. The proposed solution is intended as a less expensive alternative to existing radio-frequency (RF) methods. The objective of the experiment was to evaluate the amount of animal motion as a function of the degree of pathology (gastric ulcer). In the theoretical part of the article, several methods of animal trajectory tracking are presented: two differential methods based on background subtraction, thresholding methods based on global and local thresholds, and color matching against a chosen template containing the searched spectrum of colors. The methods were tested offline on five video samples, each containing a moving guinea pig locked in a cage under various lighting conditions. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
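    The background-subtraction variant can be sketched in a few lines of NumPy (a synthetic frame stands in for the video, and modeling the animal as a bright blob on a static background is an illustrative assumption):

```python
import numpy as np

def track_centroid(frame, background, thresh=0.3):
    """Background subtraction + global thresholding:
    return the centroid (row, col) of the moving object, or None."""
    diff = np.abs(frame.astype(float) - background.astype(float))
    mask = diff > thresh
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())

# Synthetic scene: static background, a bright 5x5 "animal" centered at (20, 30)
background = np.zeros((64, 64))
frame = background.copy()
frame[18:23, 28:33] = 1.0

print(track_centroid(frame, background))  # → (20.0, 30.0)
```

    Running the detector frame by frame and stringing the centroids together yields the trajectory whose total length serves as the motion measure; local thresholding would replace the single global `thresh` with per-neighborhood values under uneven lighting.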

  20. Simultaneous determination of selected biogenic amines in alcoholic beverage samples by isotachophoretic and chromatographic methods.

    PubMed

    Jastrzębska, Aneta; Piasta, Anna; Szłyk, Edward

    2014-01-01

    A simple and useful method for the determination of biogenic amines in beverage samples, based on isotachophoretic separation, is described. The proposed procedure permitted simultaneous analysis of histamine, tyramine, cadaverine, putrescine, tryptamine, 2-phenylethylamine, spermine and spermidine. The data presented demonstrate the utility, simplicity, flexibility, sensitivity and environmentally friendly character of the proposed method. The precision of the method, expressed as the coefficient of variation, ranged from 0.1% to 5.9% for beverage samples, whereas recoveries ranged from 91% to 101%. The results were compared with an HPLC procedure based on pre-column derivatisation of the biogenic amines with dansyl chloride. Furthermore, the derivatisation procedure was optimised by varying the concentration and pH of the buffer, the addition of organic solvents, and the reaction time and temperature.

  1. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study.

    PubMed

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D'Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira

    2017-06-22

    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis ® μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis ® μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis ® μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis ® μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations.

  2. Evaluation of Legionella Air Contamination in Healthcare Facilities by Different Sampling Methods: An Italian Multicenter Study

    PubMed Central

    Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D’Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira

    2017-01-01

    Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis®μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis®μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis®μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis®μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations. PMID:28640202

  3. Antipsychotic drug poisoning monitoring of clozapine in urine by using coffee ring effect based surface-enhanced Raman spectroscopy.

    PubMed

    Zhu, Qingxia; Yu, Xiaoyan; Wu, Zebing; Lu, Feng; Yuan, Yongfang

    2018-07-19

    Antipsychotics are the drugs most often involved in drug poisoning cases, and therapeutic drug monitoring (TDM) is therefore necessary for their safe and effective administration. In this study, a coffee-ring-effect-based surface-enhanced Raman spectroscopy (CRE-SERS) method was developed and used for the first time to monitor antipsychotic poisoning in urine samples. The established method exhibited excellent SERS performance because more hot spots were obtained in the "coffee ring". With the optimized CRE-SERS method, sensitivity was improved by an order of magnitude over the conventional method, with reasonable reproducibility. The antipsychotic clozapine (CLO) spiked into urine samples at 0.5-50 μg mL⁻¹, concentrations above the toxicity threshold, was quantitatively detected, and CLO and its metabolites could be distinguished in real poisoning urine samples. The coffee-ring effect should open more opportunities for practical applications of SERS-based methods, and it is anticipated that the developed method will also have great potential for monitoring other drug poisonings. Copyright © 2018 Elsevier B.V. All rights reserved.

  4. Comparison of Relative Bias, Precision, and Efficiency of Sampling Methods for Natural Enemies of Soybean Aphid (Hemiptera: Aphididae).

    PubMed

    Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W

    2015-06-01

    Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods: quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards to determine the most practical methods for sampling the three most prominent species, which included Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time-by-sampling-method interaction, indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggest that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods: sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean include walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications, when efficiency is not paramount. © The Authors 2015. 
Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  5. Method for Pre-Conditioning a Measured Surface Height Map for Model Validation

    NASA Technical Reports Server (NTRS)

    Sidick, Erkin

    2012-01-01

    This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing re-sampling errors but also eliminating existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished from analytical expressions, namely Zernike-polynomial fits and a power spectral density (PSD) model, it introduces none of the aliasing and interpolation errors produced by conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial filtering, and it automatically eliminates measurement noise and other measurement errors such as artificial discontinuities. The development cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated, through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. Model validation involves a number of computational issues, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match the gridded data format of a model validation tool, and (2) eliminating surface measurement noise or measurement errors so that the resulting surface height map is continuous and smoothly varying. The method conventionally preferred for re-sampling a surface map is two-dimensional interpolation, whose main problem is that the same pixel can take different values as the interpolation scheme is changed among "nearest," "linear," "cubic," and "spline" fitting in MATLAB. The conventional FFT-based spatial filtering used to eliminate surface measurement noise or errors can also suffer from aliasing effects. During re-sampling of a surface map, this software preserves the low-spatial-frequency characteristics of the map through the Zernike-polynomial fit coefficients, and maintains its mid- and high-spatial-frequency characteristics with a PSD model derived from the two-dimensional PSD data of the mid- and high-spatial-frequency components of the original map. Because the new surface map is created in the desired sampling format from analytical expressions only, the method encounters no aliasing effects and causes no discontinuities in the resulting surface map.
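    The low-spatial-frequency half of this scheme can be illustrated with an ordinary 2-D polynomial least-squares fit standing in for the Zernike basis (the basis choice, grid sizes, and noise level are illustrative assumptions; the PSD-based mid/high-frequency reconstruction is omitted):

```python
import numpy as np

def poly_basis(x, y, order=3):
    """2-D polynomial basis up to total degree `order` (Zernike stand-in)."""
    cols = [x**i * y**j for i in range(order + 1)
                        for j in range(order + 1 - i)]
    return np.column_stack(cols)

def resample(surface, n_out, order=3):
    """Fit the measured map analytically, then evaluate on a new grid.
    Evaluating the fit (rather than interpolating pixels) avoids aliasing
    and smooths out per-pixel measurement noise."""
    n = surface.shape[0]
    u = np.linspace(-1, 1, n)
    Xg, Yg = np.meshgrid(u, u)
    A = poly_basis(Xg.ravel(), Yg.ravel(), order)
    coef, *_ = np.linalg.lstsq(A, surface.ravel(), rcond=None)
    v = np.linspace(-1, 1, n_out)
    Xo, Yo = np.meshgrid(v, v)
    return (poly_basis(Xo.ravel(), Yo.ravel(), order) @ coef).reshape(n_out, n_out)

# Noisy measurement of a smooth (cubic) surface on a 32x32 grid,
# up-sampled to 128x128 from the analytic fit
rng = np.random.default_rng(0)
u = np.linspace(-1, 1, 32)
X, Y = np.meshgrid(u, u)
true = 0.5 * X**2 - 0.3 * X * Y + 0.1 * Y**3
measured = true + 0.01 * rng.standard_normal(X.shape)
upsampled = resample(measured, 128)
```

    Because every output pixel comes from the same analytic expression, changing the output grid density never changes the underlying surface, unlike pixel interpolation where the scheme ("nearest," "linear," etc.) alters the values.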

  6. Comparability of river suspended-sediment sampling and laboratory analysis methods

    USGS Publications Warehouse

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended-sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low; the difference attributable to laboratory analysis methods was slightly greater than that attributable to field sampling. Sand-sized particles had a strong effect on the comparability of the methods, indicating that grab field sampling and TSS laboratory analysis fail to capture most of the sand being transported by the stream; differences were smaller between grab samples analyzed for TSS and the fine fraction of SSC. Even though differences are present, the strong correlations between SSC and TSS concentrations provide the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.
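    A site-specific relation of the kind mentioned above is typically an ordinary least-squares fit between paired concentrations; a minimal sketch on synthetic paired data (the 0.6 bias factor, noise level, and sample counts are invented for illustration, not values from the study):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical paired samples at one site: TSS reads low relative to SSC
ssc = rng.uniform(20, 400, 40)                 # mg/L, EWDI sampling + SSC analysis
tss = 0.6 * ssc + rng.normal(0, 10, ssc.size)  # mg/L, grab sampling + TSS analysis

# Site-specific linear relation SSC ≈ slope * TSS + intercept
slope, intercept = np.polyfit(tss, ssc, 1)
r = np.corrcoef(tss, ssc)[0, 1]

# Use the relation to correct a routine TSS reading of 120 mg/L
predicted_ssc = slope * 120.0 + intercept
```

    The strong correlation (r near 1 here) is what makes such a correction defensible; in practice the relation must be refit per site because the sand fraction, and hence the bias, varies between streams.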

  7. High frequency resolution terahertz time-domain spectroscopy

    NASA Astrophysics Data System (ADS)

    Sangala, Bagvanth Reddy

    2013-12-01

    A new method for high-frequency-resolution terahertz time-domain spectroscopy is developed based on the characteristic matrix method. The method is useful for studying planar samples or stacks of planar samples. The terahertz radiation was generated by optical rectification in a ZnTe crystal and detected with another ZnTe crystal via the electro-optic sampling method. In this characteristic-matrix-based method, the spectra of the sample and reference waveforms are modeled using characteristic matrices. We applied the method to measure the optical constants of air: terahertz transmission through the layered systems air-Teflon-air-Quartz-air and nitrogen-Teflon-nitrogen-Quartz-nitrogen was modeled by the characteristic matrix method, and a transmission coefficient derived from these models was optimized to fit the experimental transmission coefficient. The optimization of an error function involving the experimental and theoretical complex transmission coefficients was performed using MATLAB's patternsearch algorithm. Because the method accounts for the echo waveforms due to reflections within the layered samples, it allows analysis of longer time-domain waveforms, giving very high frequency resolution in the frequency domain. We present the high-frequency-resolution terahertz time-domain spectroscopy of air and compare the results with literature values. We also fitted the complex susceptibility of air with Lorentzian and Gaussian functions to extract the linewidths.
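    The characteristic-matrix formalism the abstract relies on is standard stratified-media optics; a minimal normal-incidence sketch (lossless layers, and the slab index and thickness are hypothetical, not the paper's Teflon/quartz parameters):

```python
import numpy as np

def layer_matrix(n, d, freq, c=2.998e8):
    """Characteristic matrix of one lossless layer at normal incidence."""
    delta = 2 * np.pi * n * d * freq / c  # phase thickness of the layer
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def stack_t_r(layers, freq, n_in=1.0, n_out=1.0):
    """Complex transmission and reflection coefficients of a layer stack.
    `layers` is a list of (refractive index, thickness in m) tuples."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, freq)
    (m11, m12), (m21, m22) = M
    denom = n_in * m11 + n_in * n_out * m12 + m21 + n_out * m22
    t = 2 * n_in / denom
    r = (n_in * m11 + n_in * n_out * m12 - m21 - n_out * m22) / denom
    return t, r

# Hypothetical 1 mm slab of index 1.9 in air, probed at 0.5 THz
t, r = stack_t_r([(1.9, 1e-3)], 0.5e12)
```

    Because the matrix product already contains every internal reflection, the modeled transmission includes all echo pulses, which is what lets the time-domain analysis window be extended for higher frequency resolution; for lossless layers with identical entrance and exit media, |t|² + |r|² = 1 is a convenient sanity check, and absorption enters through complex refractive indices.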

  8. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method.

    PubMed

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-07-22

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods differ in theoretical bases, means of measurement, and causes of measurement errors. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlativeness among different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when accompanying merits such as experimental convenience are taken into account.

  9. Correlative multiple porosimetries for reservoir sandstones with adoption of a new reference-sample-guided computed-tomographic method

    PubMed Central

    Jin, Jae Hwa; Kim, Junho; Lee, Jeong-Yil; Oh, Young Min

    2016-01-01

    One of the main interests in petroleum geology and reservoir engineering is to quantify the porosity of reservoir beds as accurately as possible. A variety of direct measurements, including methods of mercury intrusion, helium injection and petrographic image analysis, have been developed; however, their application frequently yields equivocal results because these methods differ in theoretical bases, means of measurement, and causes of measurement errors. Here, we present a set of porosities measured in Berea Sandstone samples by the multiple methods, in particular with adoption of a new method using computed tomography and reference samples. The multiple porosimetric data show a marked correlativeness among different methods, suggesting that these methods are compatible with each other. The new method of reference-sample-guided computed tomography is more effective than the previous methods when accompanying merits such as experimental convenience are taken into account. PMID:27445105

  10. Long-term frozen storage of urine samples: a trouble to get PCR results in Schistosoma spp. DNA detection?

    PubMed

    Fernández-Soto, Pedro; Velasco Tirado, Virginia; Carranza Rodríguez, Cristina; Pérez-Arellano, José Luis; Muro, Antonio

    2013-01-01

    Human schistosomiasis remains a serious worldwide public health problem. At present, a sensitive and specific assay for routine diagnosis of schistosome infection is not yet available. The potential for detecting schistosome-derived DNA by PCR-based methods in human clinical samples is currently being investigated as a diagnostic tool with potential application in routine schistosomiasis diagnosis. Collection of diagnostic samples such as stool or blood is usually difficult in some populations. Urine, however, is a biological sample that can be collected non-invasively, is easy to obtain from people of all ages, and is easy to handle, yet it is still not widely used as a sample for PCR diagnosis. This could be due to the high variability in the reported efficiency of detection, a result of wide variation in the storage and handling conditions of urine samples and in DNA preservation and extraction methods. We evaluated different commercial DNA extraction methods on a series of long-term frozen human urine samples from patients with parasitologically confirmed schistosomiasis in order to assess the effectiveness of PCR for Schistosoma spp. detection. Patients' urine samples had been frozen for 18 months up to 7 years before use. Results were compared with those obtained in PCR assays using fresh healthy human urine artificially contaminated with Schistosoma mansoni DNA and urine samples from mice experimentally infected with S. mansoni cercariae stored frozen for at least 12 months before use. PCR results in fresh artificially contaminated human urine samples were much more effective across the different DNA extraction methods than those obtained when long-term frozen human urine samples were used as the source of DNA template. Long-term frozen human urine samples are probably not a good source of DNA template for PCR detection of Schistosoma spp., regardless of the DNA extraction method used.

  11. Automated Fast Screening Method for Cocaine Identification in Seized Drug Samples Using a Portable Fourier Transform Infrared (FT-IR) Instrument.

    PubMed

    Mainali, Dipak; Seelenbinder, John

    2016-05-01

    Quick and presumptive identification of seized drug samples without destroying evidence is necessary for law enforcement officials to control the trafficking and abuse of drugs. This work reports an automated screening method to detect the presence of cocaine in seized samples using portable Fourier transform infrared (FT-IR) spectrometers. The method is based on the identification of well-defined characteristic vibrational frequencies related to the functional group of the cocaine molecule and is fully automated through the use of an expert system. Traditionally, analysts look for key functional group bands in the infrared spectra and characterization of the molecules present is dependent on user interpretation. This implies the need for user expertise, especially in samples that likely are mixtures. As such, this approach is biased and also not suitable for non-experts. The method proposed in this work uses the well-established "center of gravity" peak picking mathematical algorithm and combines it with the conditional reporting feature in MicroLab software to provide an automated method that can be successfully employed by users with varied experience levels. The method reports the confidence level of cocaine present only when a certain number of cocaine related peaks are identified by the automated method. Unlike library search and chemometric methods that are dependent on the library database or the training set samples used to build the calibration model, the proposed method is relatively independent of adulterants and diluents present in the seized mixture. This automated method in combination with a portable FT-IR spectrometer provides law enforcement officials, criminal investigators, or forensic experts a quick field-based prescreening capability for the presence of cocaine in seized drug samples. © The Author(s) 2016.
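The "center of gravity" peak-picking rule mentioned above amounts to an intensity-weighted centroid over a peak window. A minimal sketch with made-up spectral values (not real cocaine bands):

```python
import numpy as np

# Hypothetical absorbance segment around one band; wavenumbers in cm^-1.
# Values are illustrative, not real cocaine spectra.
wavenumber = np.array([1700.0, 1705.0, 1710.0, 1715.0, 1720.0])
absorbance = np.array([0.05, 0.40, 0.90, 0.40, 0.05])

def center_of_gravity(x, y):
    """Intensity-weighted centroid of a peak: a simple 'center of
    gravity' peak-picking rule."""
    return np.sum(x * y) / np.sum(y)

peak = center_of_gravity(wavenumber, absorbance)
print(peak)  # 1710.0 for this symmetric peak
```

An automated screen would then count how many such centroids fall within tolerance of the target molecule's characteristic band positions before reporting a confidence level, in the spirit of the method described above.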

  12. Quadrature demodulation based circuit implementation of pulse stream for ultrasonic signal FRI sparse sampling

    NASA Astrophysics Data System (ADS)

    Shoupeng, Song; Zhou, Jiang

    2017-03-01

    Converting an ultrasonic signal to an ultrasonic pulse stream is the key step of finite rate of innovation (FRI) sparse sampling. At present, ultrasonic pulse-stream-forming techniques are mainly based on digital algorithms; no hardware circuit that can achieve this has been reported. This paper proposes a new quadrature demodulation (QD) based circuit implementation method for forming an ultrasonic pulse stream. After elaborating on FRI sparse sampling theory, the processing of the ultrasonic signal is explained, followed by a discussion and analysis of ultrasonic pulse-stream-forming methods. In contrast to ultrasonic signal envelope extracting techniques, a quadrature demodulation method (QDM) is proposed. Simulation experiments were performed to determine its performance at various signal-to-noise ratios (SNRs). The circuit was then designed, with a mixing module, oscillator, low-pass filter (LPF), and root-of-square-sum module. Finally, application experiments were carried out on ultrasonic flaw testing of a pipeline sample. The experimental results indicate that the QDM can accurately convert an ultrasonic signal to an ultrasonic pulse stream and recover the original signal information, such as pulse width, amplitude, and time of arrival. This technique lays the foundation for ultrasonic signal FRI sparse sampling directly with hardware circuitry.
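The QDM signal chain described above (mixing, low-pass filtering, root of square sum) can be simulated in a few lines. This Python sketch uses a synthetic Gaussian echo and a crude moving-average filter as a stand-in for the analog LPF; all parameters are illustrative assumptions, not the paper's circuit values:

```python
import numpy as np

fs = 50e6           # sampling rate, Hz (illustrative)
fc = 5e6            # ultrasonic carrier frequency, Hz (illustrative)
t = np.arange(0, 20e-6, 1 / fs)

# Synthetic ultrasonic echo: Gaussian envelope on a carrier, arriving
# at 10 microseconds.
env = np.exp(-((t - 10e-6) ** 2) / (2 * (1e-6) ** 2))
sig = env * np.cos(2 * np.pi * fc * t)

# Quadrature demodulation: mix with cos/sin at the carrier, low-pass
# with a moving average, then take the root of the squared sum.
i_mix = sig * np.cos(2 * np.pi * fc * t)
q_mix = sig * np.sin(2 * np.pi * fc * t)
kernel = np.ones(50) / 50          # crude LPF stand-in
i_lp = np.convolve(i_mix, kernel, mode="same")
q_lp = np.convolve(q_mix, kernel, mode="same")
envelope = 2 * np.sqrt(i_lp ** 2 + q_lp ** 2)

# The peak of the recovered envelope marks the time of arrival.
print(t[np.argmax(envelope)])
```

Mixing shifts the signal to baseband (I ≈ env/2, Q ≈ 0 after filtering), so doubling the root of the squared sum recovers the envelope, i.e., the pulse stream's amplitude and timing information.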

  13. A High-Throughput Method for Direct Detection of Therapeutic Oligonucleotide-Induced Gene Silencing In Vivo

    PubMed Central

    Coles, Andrew H.; Osborn, Maire F.; Alterman, Julia F.; Turanov, Anton A.; Godinho, Bruno M.D.C.; Kennington, Lori; Chase, Kathryn; Aronin, Neil

    2016-01-01

    Preclinical development of RNA interference (RNAi)-based therapeutics requires a rapid, accurate, and robust method of simultaneously quantifying mRNA knockdown in hundreds of samples. The most well-established method to achieve this is quantitative real-time polymerase chain reaction (qRT-PCR), a labor-intensive methodology that requires sample purification, which increases the potential to introduce additional bias. Here, we describe that the QuantiGene® branched DNA (bDNA) assay linked to a 96-well Qiagen TissueLyser II is a quick and reproducible alternative to qRT-PCR for quantitative analysis of mRNA expression in vivo directly from tissue biopsies. The bDNA assay is a high-throughput, plate-based, luminescence technique, capable of directly measuring mRNA levels from tissue lysates derived from various biological samples. We have performed a systematic evaluation of this technique for in vivo detection of RNAi-based silencing. We show that similar quality data is obtained from purified RNA and tissue lysates. In general, we observe low intra- and inter-animal variability (around 10% for control samples), and high intermediate precision. This allows minimization of sample size for evaluation of oligonucleotide efficacy in vivo. PMID:26595721

  14. A broad range assay for rapid detection and etiologic characterization of bacterial meningitis: performance testing in samples from sub-Sahara.

    PubMed

    Won, Helen; Yang, Samuel; Gaydos, Charlotte; Hardick, Justin; Ramachandran, Padmini; Hsieh, Yu-Hsiang; Kecojevic, Alexander; Njanpop-Lafourcade, Berthe-Marie; Mueller, Judith E; Tameklo, Tsidi Agbeko; Badziklou, Kossi; Gessner, Bradford D; Rothman, Richard E

    2012-09-01

    This study aimed to conduct a pilot evaluation of broad-based multiprobe polymerase chain reaction (PCR) in clinical cerebrospinal fluid (CSF) samples compared to local conventional PCR/culture methods used for bacterial meningitis surveillance. A previously described PCR assay, consisting of initial broad-based detection of Eubacteriales by a universal probe, followed by Gram typing and pathogen-specific probes, was designed targeting variable regions of the 16S rRNA gene. The diagnostic performance of the 16S rRNA assay was evaluated in 127 CSF samples from patients in Togo, Africa, by comparison to conventional PCR/culture methods. Our probes detected Neisseria meningitidis, Streptococcus pneumoniae, and Haemophilus influenzae. Uniprobe sensitivity and specificity versus conventional PCR were 100% and 54.6%, respectively. Sensitivity and specificity of the uniprobe versus culture methods were 96.5% and 52.5%, respectively. Gram-typing probes correctly typed 98.8% (82/83) and pathogen-specific probes identified 96.4% (80/83) of the positives. This broad-based PCR algorithm successfully detected and provided species-level information for multiple bacterial meningitis agents in clinical samples. Copyright © 2012 Elsevier Inc. All rights reserved.
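The sensitivity and specificity figures reported above come from standard 2x2 comparisons against a reference method. A minimal sketch, with counts chosen to reproduce the reported uniprobe-versus-culture figures rather than taken from the study's raw data:

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 comparison against a
    reference method (true/false positives and negatives)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts that reproduce 96.5% sensitivity and 52.5%
# specificity; not the study's actual tabulation.
sens, spec = sens_spec(tp=83, fn=3, tn=21, fp=19)
print(round(100 * sens, 1), round(100 * spec, 1))
```

The low specificity against culture is consistent with the broad-range PCR detecting organisms that culture misses, a common pattern when a molecular assay is benchmarked against a less sensitive reference.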

  15. A broad range assay for rapid detection and etiologic characterization of bacterial meningitis: performance testing in samples from sub-Sahara☆, ☆☆,★

    PubMed Central

    Won, Helen; Yang, Samuel; Gaydos, Charlotte; Hardick, Justin; Ramachandran, Padmini; Hsieh, Yu-Hsiang; Kecojevic, Alexander; Njanpop-Lafourcade, Berthe-Marie; Mueller, Judith E.; Tameklo, Tsidi Agbeko; Badziklou, Kossi; Gessner, Bradford D.; Rothman, Richard E.

    2012-01-01

    This study aimed to conduct a pilot evaluation of broad-based multiprobe polymerase chain reaction (PCR) in clinical cerebrospinal fluid (CSF) samples compared to local conventional PCR/culture methods used for bacterial meningitis surveillance. A previously described PCR consisting of initial broad-based detection of Eubacteriales by a universal probe, followed by Gram typing, and pathogen-specific probes was designed targeting variable regions of the 16S rRNA gene. The diagnostic performance of the 16S rRNA assay in “”127 CSF samples was evaluated in samples from patients from Togo, Africa, by comparison to conventional PCR/culture methods. Our probes detected Neisseria meningitidis, Streptococcus pneumoniae, and Haemophilus influenzae. Uniprobe sensitivity and specificity versus conventional PCR were 100% and 54.6%, respectively. Sensitivity and specificity of uniprobe versus culture methods were 96.5% and 52.5%, respectively. Gram-typing probes correctly typed 98.8% (82/83) and pathogen-specific probes identified 96.4% (80/83) of the positives. This broad-based PCR algorithm successfully detected and provided species level information for multiple bacterial meningitis agents in clinical samples. PMID:22809694

  16. Serial isoelectric focusing as an effective and economic way to obtain maximal resolution and high-throughput in 2D-based comparative proteomics of scarce samples: proof-of-principle.

    PubMed

    Farhoud, Murtada H; Wessels, Hans J C T; Wevers, Ron A; van Engelen, Baziel G; van den Heuvel, Lambert P; Smeitink, Jan A

    2005-01-01

    In 2D-based comparative proteomics of scarce samples, such as limited patient material, established methods for prefractionation and subsequent use of different narrow range IPG strips to increase overall resolution are difficult to apply. Also, a high number of samples, a prerequisite for drawing meaningful conclusions when pathological and control samples are considered, will increase the associated amount of work almost exponentially. Here, we introduce a novel, effective, and economic method designed to obtain maximum 2D resolution while maintaining the high throughput necessary to perform large-scale comparative proteomics studies. The method is based on connecting different IPG strips serially head-to-tail so that a complete line of different IPG strips with sequential pH regions can be focused in the same experiment. We show that when 3 IPG strips (covering together the pH range of 3-11) are connected head-to-tail an optimal resolution is achieved along the whole pH range. Sample consumption, time required, and associated costs are reduced by almost 70%, and the workload is reduced significantly.

  17. SEM technique for imaging and measuring electronic transport in nanocomposites based on electric field induced contrast

    DOEpatents

    Jesse, Stephen [Knoxville, TN; Geohegan, David B [Knoxville, TN; Guillorn, Michael [Brooktondale, NY

    2009-02-17

    Methods and apparatus are described for SEM imaging and measuring electronic transport in nanocomposites based on electric field induced contrast. A method includes mounting a sample onto a sample holder, the sample including a sample material; wire bonding leads from the sample holder onto the sample; placing the sample holder in a vacuum chamber of a scanning electron microscope; connecting leads from the sample holder to a power source located outside the vacuum chamber; controlling secondary electron emission from the sample by applying a predetermined voltage to the sample through the leads; and generating an image of the secondary electron emission from the sample. An apparatus includes a sample holder for a scanning electron microscope having an electrical interconnect and leads on top of the sample holder electrically connected to the electrical interconnect; a power source and a controller connected to the electrical interconnect for applying voltage to the sample holder to control the secondary electron emission from a sample mounted on the sample holder; and a computer coupled to a secondary electron detector to generate images of the secondary electron emission from the sample.

  18. Evaluation of the Biological Sampling Kit (BiSKit) for Large-Area Surface Sampling

    PubMed Central

    Buttner, Mark P.; Cruz, Patricia; Stetzenbach, Linda D.; Klima-Comba, Amy K.; Stevens, Vanessa L.; Emanuel, Peter A.

    2004-01-01

    Current surface sampling methods for microbial contaminants are designed to sample small areas and utilize culture analysis. The total number of microbes recovered is low because a small area is sampled, making detection of a potential pathogen more difficult. Furthermore, sampling of small areas requires a greater number of samples to be collected, which delays the reporting of results, taxes laboratory resources and staffing, and increases analysis costs. A new biological surface sampling method, the Biological Sampling Kit (BiSKit), designed to sample large areas and to be compatible with testing with a variety of technologies, including PCR and immunoassay, was evaluated and compared to other surface sampling strategies. In experimental room trials, wood laminate and metal surfaces were contaminated by aerosolization of Bacillus atrophaeus spores, a simulant for Bacillus anthracis, into the room, followed by settling of the spores onto the test surfaces. The surfaces were sampled with the BiSKit, a cotton-based swab, and a foam-based swab. Samples were analyzed by culturing, quantitative PCR, and immunological assays. The results showed that the large surface area (1 m2) sampled with the BiSKit resulted in concentrations of B. atrophaeus in samples that were up to 10-fold higher than the concentrations obtained with the other methods tested. A comparison of wet and dry sampling with the BiSKit indicated that dry sampling was more efficient (efficiency, 18.4%) than wet sampling (efficiency, 11.3%). The sensitivities of detection of B. atrophaeus on metal surfaces were 42 ± 5.8 CFU/m2 for wet sampling and 100.5 ± 10.2 CFU/m2 for dry sampling. These results demonstrate that the use of a sampling device capable of sampling larger areas results in higher sensitivity than that obtained with currently available methods and has the advantage of sampling larger areas, thus requiring collection of fewer samples per site. PMID:15574898

  19. Studying Cannabis Use Behaviors With Facebook and Web Surveys: Methods and Insights.

    PubMed

    Borodovsky, Jacob T; Marsch, Lisa A; Budney, Alan J

    2018-05-02

    The rapid and wide-reaching expansion of internet access and digital technologies offers epidemiologists numerous opportunities to study health behaviors. One particularly promising new data collection strategy is the use of Facebook's advertising platform in conjunction with Web-based surveys. Our research team at the Center for Technology and Behavioral Health has used this quick and cost-efficient method to recruit large samples and address unique scientific questions related to cannabis use. In conducting this research, we have gleaned several insights for using this sampling method effectively and have begun to document the characteristics of the resulting data. We believe this information could be useful to other researchers attempting to study cannabis use or, potentially, other health behaviors. The first aim of this paper is to describe case examples of procedures for using Facebook as a survey sampling method for studying cannabis use. We then present several distinctive features of the data produced using this method. Finally, we discuss the utility of this sampling method for addressing specific types of epidemiological research questions. Overall, we believe that sampling with Facebook advertisements and Web surveys is best conceptualized as a targeted, nonprobability-based method for oversampling cannabis users across the United States. ©Jacob T Borodovsky, Lisa A Marsch, Alan J Budney. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 02.05.2018.

  20. Classification of weld defect based on information fusion technology for radiographic testing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Hongquan; Liang, Zeming, E-mail: heavenlzm@126.com; Gao, Jianmin

    Improving the efficiency and accuracy of weld defect classification is an important technical problem in developing the radiographic testing system. This paper proposes a novel weld defect classification method based on information fusion technology, Dempster–Shafer evidence theory. First, to characterize weld defects and improve the accuracy of their classification, 11 weld defect features were defined based on the sub-pixel level edges of radiographic images, four of which are presented for the first time in this paper. Second, we applied information fusion technology to combine different features for weld defect classification, including a mass function defined based on the weld defect feature information and the quartile-method-based calculation of standard weld defect class, which addresses the problem of a limited number of training samples. A steam turbine weld defect classification case study is also presented herein to illustrate our technique. The results show that the proposed method can increase the correct classification rate with limited training samples and address the uncertainties associated with weld defect classification.

  1. Classification of weld defect based on information fusion technology for radiographic testing system.

    PubMed

    Jiang, Hongquan; Liang, Zeming; Gao, Jianmin; Dang, Changying

    2016-03-01

    Improving the efficiency and accuracy of weld defect classification is an important technical problem in developing the radiographic testing system. This paper proposes a novel weld defect classification method based on information fusion technology, Dempster-Shafer evidence theory. First, to characterize weld defects and improve the accuracy of their classification, 11 weld defect features were defined based on the sub-pixel level edges of radiographic images, four of which are presented for the first time in this paper. Second, we applied information fusion technology to combine different features for weld defect classification, including a mass function defined based on the weld defect feature information and the quartile-method-based calculation of standard weld defect class, which addresses the problem of a limited number of training samples. A steam turbine weld defect classification case study is also presented herein to illustrate our technique. The results show that the proposed method can increase the correct classification rate with limited training samples and address the uncertainties associated with weld defect classification.
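The information fusion step rests on Dempster's rule of combination, which merges mass functions from different features and renormalizes away conflicting mass. A generic Python sketch (the hypothesis sets and masses are invented for illustration, not the paper's 11 features):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozenset hypotheses
    to masses) with Dempster's rule, normalizing out conflict."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb           # mass on disjoint hypotheses
    k = 1.0 - conflict                    # normalization constant
    return {h: w / k for h, w in combined.items()}

# Two features each give evidence over hypothetical defect classes.
m_feat1 = {frozenset({"crack"}): 0.7, frozenset({"crack", "pore"}): 0.3}
m_feat2 = {frozenset({"crack"}): 0.6, frozenset({"pore"}): 0.2,
           frozenset({"crack", "pore"}): 0.2}
fused = dempster_combine(m_feat1, m_feat2)
print(fused)
```

Fusing more features sharpens the mass onto a single class; the defect is then assigned to the hypothesis carrying the most combined mass.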

  2. Anthrax Sampling and Decontamination: Technology Trade-Offs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Price, Phillip N.; Hamachi, Kristina; McWilliams, Jennifer

    2008-09-12

    The goal of this project was to answer the following questions concerning response to a future anthrax release (or suspected release) in a building: 1. Based on past experience, what rules of thumb can be determined concerning: (a) the amount of sampling that may be needed to determine the extent of contamination within a given building; (b) what portions of a building should be sampled; (c) the cost per square foot to decontaminate a given type of building using a given method; (d) the time required to prepare for, and perform, decontamination; (e) the effectiveness of a given decontamination method in a given type of building? 2. Based on past experience, what resources will be spent on evaluating the extent of contamination, performing decontamination, and assessing the effectiveness of the decontamination in a building of a given type and size? 3. What are the trade-offs between cost, time, and effectiveness for the various sampling plans, sampling methods, and decontamination methods that have been used in the past?

  3. Sampling and pyrosequencing methods for characterizing bacterial communities in the human gut using 16S sequence tags.

    PubMed

    Wu, Gary D; Lewis, James D; Hoffmann, Christian; Chen, Ying-Yu; Knight, Rob; Bittinger, Kyle; Hwang, Jennifer; Chen, Jun; Berkowsky, Ronald; Nessel, Lisa; Li, Hongzhe; Bushman, Frederic D

    2010-07-30

    Intense interest centers on the role of the human gut microbiome in health and disease, but optimal methods for analysis are still under development. Here we present a study of methods for surveying bacterial communities in human feces using 454/Roche pyrosequencing of 16S rRNA gene tags. We analyzed fecal samples from 10 individuals and compared methods for storage, DNA purification, and sequence acquisition. To assess reproducibility, we compared samples one cm apart on a single stool specimen for each individual. To analyze storage methods, we compared 1) immediate freezing at -80 degrees C, 2) storage on ice for 24 hours, and 3) storage on ice for 48 hours. For DNA purification methods, we tested three commercial kits and bead beating in hot phenol. Variations due to the different methodologies were compared to variation among individuals using two approaches--one based on presence-absence information for bacterial taxa (unweighted UniFrac) and the other taking into account their relative abundance (weighted UniFrac). In the unweighted analysis relatively little variation was associated with the different analytical procedures, and variation between individuals predominated. In the weighted analysis considerable variation was associated with the purification methods. Particularly notable was improved recovery of Firmicutes sequences using the hot phenol method. We also carried out surveys of the effects of different 454 sequencing methods (FLX versus Titanium) and amplification of different 16S rRNA variable gene segments. Based on our findings we present recommendations for protocols to collect, process, and sequence bacterial 16S rDNA from fecal samples--some major points are 1) if feasible, bead beating in hot phenol or use of the PSP kit improves recovery; 2) storage methods can be adjusted based on experimental convenience; 3) unweighted (presence-absence) comparisons are less affected by lysis method.
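Unweighted UniFrac itself requires a phylogenetic tree, but the presence-absence versus abundance-weighted contrast drawn above can be illustrated with simpler non-phylogenetic analogues (Jaccard and Bray-Curtis distances) on hypothetical taxon abundances:

```python
import numpy as np

# Relative abundances of four taxa in two hypothetical fecal samples.
a = np.array([0.50, 0.30, 0.20, 0.00])
b = np.array([0.05, 0.30, 0.20, 0.45])

def jaccard_distance(x, y):
    """Presence/absence distance (non-phylogenetic analogue of
    unweighted UniFrac): 1 - shared taxa / total observed taxa."""
    px, py = x > 0, y > 0
    return 1 - np.sum(px & py) / np.sum(px | py)

def bray_curtis(x, y):
    """Abundance-weighted distance (non-phylogenetic analogue of
    weighted UniFrac)."""
    return np.sum(np.abs(x - y)) / np.sum(x + y)

print(jaccard_distance(a, b), bray_curtis(a, b))
```

A lysis method that recovers a taxon poorly (shrinking its abundance without removing it) shifts the weighted distance but not the presence-absence one, which is why the abstract finds purification methods matter more in the weighted analysis.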

  4. A novel strategy to obtain quantitative data for modelling: combined enrichment and real-time PCR for enumeration of salmonellae from pig carcasses.

    PubMed

    Krämer, Nadine; Löfström, Charlotta; Vigre, Håkan; Hoorfar, Jeffrey; Bunge, Cornelia; Malorny, Burkhard

    2011-03-01

    Salmonella is a major zoonotic pathogen which causes outbreaks and sporadic cases of gastroenteritis in humans worldwide. The primary sources for Salmonella are food-producing animals such as pigs and poultry. For risk assessment and hazard analysis and critical control point (HACCP) concepts, it is essential to produce large amounts of quantitative data, which is currently not achievable with the standard culture-based methods for enumeration of Salmonella. This study presents the development of a novel strategy to enumerate low numbers of Salmonella in cork borer samples taken from pig carcasses as a first concept and proof of principle for a new sensitive and rapid quantification method based on combined enrichment and real-time PCR. The novelty of the approach is the short pre-enrichment step, during which most bacteria are in the log phase. The method consists of an 8 h pre-enrichment of the cork borer sample diluted 1:10 in non-selective buffered peptone water, followed by DNA extraction, and Salmonella detection and quantification by real-time PCR. The limit of quantification was 1.4 colony forming units (CFU)/20 cm(2) (approximately 10 g) of artificially contaminated sample with a 95% confidence interval of ± 0.7 log CFU/sample. The precision was similar to the standard reference most probable number (MPN) method. A screening of 200 potentially naturally contaminated cork borer samples obtained over seven weeks in a slaughterhouse resulted in 25 Salmonella-positive samples. The analysis of salmonellae within these samples showed that the PCR method had a higher sensitivity for samples with a low contamination level (<6.7 CFU/sample): 15 of the samples negative with the MPN method were detected with the PCR method, and 5 were found to be negative by both methods. For the samples with a higher contamination level (6.7-310 CFU/sample), good agreement between the results obtained with the PCR and MPN methods was obtained.
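Quantification after a fixed-length enrichment typically works through a standard curve relating the real-time PCR quantification cycle (Cq) to the initial inoculum. A minimal sketch with invented curve parameters (not the study's values):

```python
# Hypothetical standard-curve parameters, Cq = SLOPE*log10(N0) + INTERCEPT,
# established with spiked samples run through the same fixed 8 h
# pre-enrichment; both values are illustrative assumptions.
SLOPE = -3.32      # ~ -3.32 corresponds to 100% PCR efficiency
INTERCEPT = 38.0

def cfu_from_cq(cq):
    """Back-calculate initial CFU per sample from the Cq measured
    after the fixed pre-enrichment step."""
    return 10 ** ((cq - INTERCEPT) / SLOPE)

print(round(cfu_from_cq(31.36), 1))
```

The fixed enrichment time is what makes this workable: as long as growth during the 8 h is reproducible, the Cq shift maps monotonically back to the pre-enrichment count.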
The quantitative real-time PCR method can easily be applied to other food and environmental matrices by adaptation of the pre-enrichment time and media. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Practicable group testing method to evaluate weight/weight GMO content in maize grains.

    PubMed

    Mano, Junichi; Yanaka, Yuka; Ikezu, Yoko; Onishi, Mari; Futo, Satoshi; Minegishi, Yasutaka; Ninomiya, Kenji; Yotsuyanagi, Yuichi; Spiegelhalter, Frank; Akiyama, Hiroshi; Teshima, Reiko; Hino, Akihiro; Naito, Shigehiro; Koiwa, Tomohiro; Takabatake, Reona; Furui, Satoshi; Kitta, Kazumi

    2011-07-13

    Because of the increasing use of maize hybrids with genetically modified (GM) stacked events, the established and commonly used bulk sample methods for PCR quantification of GM maize in non-GM maize are prone to overestimate the GM organism (GMO) content, compared to the actual weight/weight percentage of GM maize in the grain sample. As an alternative method, we designed and assessed a group testing strategy in which the GMO content is statistically evaluated based on qualitative analyses of multiple small pools, consisting of 20 maize kernels each. This approach enables the GMO content evaluation on a weight/weight basis, irrespective of the presence of stacked-event kernels. To enhance the method's user-friendliness in routine application, we devised an easy-to-use PCR-based qualitative analytical method comprising a sample preparation step in which 20 maize kernels are ground in a lysis buffer and a subsequent PCR assay in which the lysate is directly used as a DNA template. This method was validated in a multilaboratory collaborative trial.
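With 20-kernel pools scored only positive/negative, the per-kernel GM proportion can be estimated by maximum likelihood from the fraction of negative pools. A Python sketch of that calculation (the pool counts are illustrative, not from the trial):

```python
def gmo_proportion(positive_pools, total_pools, kernels_per_pool=20):
    """Maximum-likelihood estimate of the per-kernel GM proportion
    from qualitative pool results: a pool is negative only if all of
    its kernels are non-GM, so P(neg pool) = (1 - p)^k."""
    neg_fraction = 1 - positive_pools / total_pools
    return 1 - neg_fraction ** (1 / kernels_per_pool)

# e.g. 10 of 20 pools test positive
print(round(100 * gmo_proportion(10, 20), 2))  # percent GM kernels
```

Because each kernel is counted once regardless of how many GM events it stacks, this kernel-based estimate avoids the overestimation that event-wise bulk PCR suffers with stacked-event hybrids.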

  6. A Critical Assessment of Bias in Survey Studies Using Location-Based Sampling to Recruit Patrons in Bars

    PubMed Central

    Morrison, Christopher; Lee, Juliet P.; Gruenewald, Paul J.; Marzell, Miesha

    2015-01-01

    Location-based sampling is a method to obtain samples of people within ecological contexts relevant to specific public health outcomes. Random selection increases generalizability, however in some circumstances (such as surveying bar patrons) recruitment conditions increase risks of sample bias. We attempted to recruit representative samples of bars and patrons in six California cities, but low response rates precluded meaningful analysis. A systematic review of 24 similar studies revealed that none addressed the key shortcomings of our study. We recommend steps to improve studies that use location-based sampling: (i) purposively sample places of interest, (ii) utilize recruitment strategies appropriate to the environment, and (iii) provide full information on response rates at all levels of sampling. PMID:26574657

  7. Generalized Bootstrap Method for Assessment of Uncertainty in Semivariogram Inference

    USGS Publications Warehouse

    Olea, R.A.; Pardo-Iguzquiza, E.

    2011-01-01

    The semivariogram and its related function, the covariance, play a central role in classical geostatistics for modeling the average continuity of spatially correlated attributes. Whereas all methods are formulated in terms of the true semivariogram, in practice what can be used are estimated semivariograms and models based on samples. A generalized form of the bootstrap method to properly model spatially correlated data is used to advance knowledge about the reliability of empirical semivariograms and semivariogram models based on a single sample. Among several methods available to generate spatially correlated resamples, we selected a method based on the LU decomposition and used several examples to illustrate the approach. The first one is a synthetic, isotropic, exhaustive sample following a normal distribution, the second example is also synthetic but following a non-Gaussian random field, and a third empirical sample consists of actual raingauge measurements. Results show wider confidence intervals than those found previously by others with inadequate application of the bootstrap. Also, even for the Gaussian example, distributions for estimated semivariogram values and model parameters are positively skewed. In this sense, bootstrap percentile confidence intervals, which are not centered around the empirical semivariogram and do not require distributional assumptions for their construction, provide an achieved coverage similar to the nominal coverage. The latter cannot be achieved by symmetrical confidence intervals based on the standard error, regardless of whether the standard error is estimated from a parametric equation or from bootstrap. © 2010 International Association for Mathematical Geosciences.
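The LU-decomposition resampling step can be sketched directly: factoring the model covariance matrix turns independent standard normals into spatially correlated resamples. This Python sketch uses Cholesky factorization (the symmetric special case of LU) with an illustrative exponential covariance model, not the paper's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D sample locations and an exponential covariance model
# (sill and range are illustrative assumptions).
x = np.linspace(0.0, 10.0, 50)
sill, corr_range = 1.0, 3.0
cov = sill * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_range)

# Factor the covariance; multiplying the factor by iid N(0,1) vectors
# yields resamples with exactly the modeled spatial correlation.
L = np.linalg.cholesky(cov + 1e-10 * np.eye(len(x)))
resamples = L @ rng.standard_normal((len(x), 1000))

# Each column is one correlated resample; re-estimating the
# semivariogram on each column gives the bootstrap distribution.
print(resamples.shape, round(resamples.var(), 3))
```

Percentile confidence intervals then come from the empirical quantiles of the semivariogram estimates across the 1000 columns, with no symmetry or normality assumption.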

  8. Problems with sampling desert tortoises: A simulation analysis based on field data

    USGS Publications Warehouse

    Freilich, J.E.; Camp, R.J.; Duda, J.J.; Karl, A.E.

    2005-01-01

    The desert tortoise (Gopherus agassizii) was listed as a U.S. threatened species in 1990, based largely on population declines inferred from mark-recapture surveys of 2.59-km2 (1-mi2) plots. Since then, several census methods have been proposed and tested, but all methods still pose logistical or statistical difficulties. We conducted computer simulations using actual tortoise location data from two 1-mi2 plot surveys in southern California, USA, to identify strengths and weaknesses of current sampling strategies. We considered tortoise population estimates based on these plots as "truth" and then tested various sampling methods based on sampling smaller plots or transect lines passing through the mile squares. Data were analyzed using Schnabel's mark-recapture estimate and program CAPTURE. Experimental subsampling with replacement of the 1-mi2 data using 1-km2 and 0.25-km2 plot boundaries produced data sets of smaller plot sizes, which we compared to estimates from the 1-mi2 plots. We also tested distance sampling by saturating a 1-mi2 site with computer-simulated transect lines, once again evaluating bias in density estimates. Subsampling estimates from 1-km2 plots did not differ significantly from the estimates derived at 1-mi2. The 0.25-km2 subsamples significantly overestimated population sizes, chiefly because too few recaptures were made. Distance sampling simulations were biased 80% of the time and had high coefficient-of-variation-to-density ratios. Furthermore, a prospective power analysis suggested limited ability to detect population declines as high as 50%. We concluded that the poor performance and bias of both sampling procedures were driven by insufficient sample size, suggesting that all efforts must be directed toward increasing the numbers found in order to produce reliable results. Our results suggest that present methods may not be capable of accurately estimating desert tortoise populations.
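
    The Schnabel estimator used to analyze the simulated surveys is simple to state: accumulate, over capture occasions, the product of catch size and the number already marked, then divide by the total recaptures. A minimal sketch (the occasion counts below are invented):

```python
def schnabel(catches, recaptures):
    """Schnabel multi-occasion mark-recapture population estimate.
    catches[t]    - animals captured on occasion t
    recaptures[t] - how many of those captures were already marked"""
    marked = 0        # marked animals at large before occasion t
    num = den = 0
    for c, r in zip(catches, recaptures):
        num += c * marked
        den += r
        marked += c - r   # newly marked animals join the marked pool
    return num / den

# three hypothetical occasions: 30 animals caught each time, rising recaptures
estimate = schnabel([30, 30, 30], [0, 9, 15])
```

    The estimator's dependence on the recapture total in the denominator is exactly why the 0.25-km2 subsamples, with too few recaptures, overestimated population size.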

  9. Nanoparticle-assisted laser desorption/ionization mass spectrometry: Novel sample preparation methods and nanoparticle screening for plant metabolite imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yagnik, Gargey B.

    The main goal of the presented research is the development of nanoparticle-based matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS). This dissertation includes the application of previously developed data acquisition methods, the development of novel sample preparation methods, the application and comparison of novel nanoparticle matrices, and a comparison of two nanoparticle matrix application methods for MALDI-MS and MALDI-MS imaging.

  10. A new approach based on off-line coupling of high-performance liquid chromatography with gas chromatography-mass spectrometry to determine acrylamide in coffee brew.

    PubMed

    Blanch, Gracia Patricia; Morales, Francisco José; Moreno, Fernando de la Peña; del Castillo, María Luisa Ruiz

    2013-01-01

    A new method based on off-line coupling of LC with GC, in replacement of conventional sample preparation techniques, is proposed to analyze acrylamide in coffee brews. The method involves the preseparation of the sample by LC, the collection of the selected fraction, its concentration under nitrogen, and subsequent analysis by GC coupled with MS. The composition of the LC mobile phase and the flow rate were studied to select conditions that allowed separation of acrylamide without coeluting compounds. Under the selected conditions, recoveries close to 100% were achieved, and LODs and LOQs of 5 and 10 μg/L, respectively, were obtained for acrylamide in brewed coffee. The method developed enabled the reliable detection of acrylamide in spiked coffee beverage samples without further clean-up steps or sample manipulation. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. A Noise-Filtered Under-Sampling Scheme for Imbalanced Classification.

    PubMed

    Kang, Qi; Chen, XiaoShuang; Li, SiSi; Zhou, MengChu

    2017-12-01

    Under-sampling is a popular data preprocessing method for dealing with class imbalance problems, with the purposes of balancing datasets to achieve a high classification rate and avoiding the bias toward majority class examples. Under-sampling always uses the full minority data in a training dataset; however, some noisy minority examples may reduce the performance of classifiers. In this paper, a new under-sampling scheme is proposed by incorporating a noise filter before executing resampling. To verify its efficiency, this scheme is implemented based on four popular under-sampling methods, i.e., Undersampling + Adaboost, RUSBoost, UnderBagging, and EasyEnsemble, through benchmarks and significance analysis. Furthermore, this paper also summarizes the relationship between algorithm performance and imbalance ratio. Experimental results indicate that the proposed scheme can significantly improve the original under-sampling-based methods in terms of three popular metrics for imbalanced classification, i.e., the area under the curve (AUC), F-measure, and G-mean.
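
    The scheme's two stages, filter then resample, can be sketched with a simple Euclidean k-NN filter. The paper plugs the filter into boosting/bagging ensembles, which are omitted here, and the toy data are invented:

```python
import random

def neighbors(X, i, k):
    """Indices of the k nearest points to X[i] (squared Euclidean distance)."""
    d = sorted((sum((a - b) ** 2 for a, b in zip(X[i], X[j])), j)
               for j in range(len(X)) if j != i)
    return [j for _, j in d[:k]]

def filtered_undersample(X, y, minority, k=3, seed=0):
    """1) Noise filter: drop minority points whose k neighbours are all majority.
    2) Randomly under-sample the majority class down to the minority size."""
    keep = [i for i in range(len(X))
            if y[i] != minority or any(y[j] == minority for j in neighbors(X, i, k))]
    Xf, yf = [X[i] for i in keep], [y[i] for i in keep]
    minor = [i for i, lab in enumerate(yf) if lab == minority]
    major = random.Random(seed).sample(
        [i for i, lab in enumerate(yf) if lab != minority], len(minor))
    idx = minor + major
    return [Xf[i] for i in idx], [yf[i] for i in idx]

# toy imbalanced data: ten majority points near the origin, three clean
# minority points near (5, 5), and one noisy minority point inside the majority
X = [(0, 0), (0.1, 0), (0, 0.1), (0.2, 0.1), (1, 0), (0, 1), (1, 1),
     (0.5, 0.5), (0.3, 0.2), (0.2, 0.3),
     (5, 5), (5.1, 5), (5, 5.1), (0.05, 0.05)]
y = [0] * 10 + [1, 1, 1, 1]
Xb, yb = filtered_undersample(X, y, minority=1)
```

    The noisy minority point at (0.05, 0.05) is removed by the filter, so it can neither survive into the balanced set nor mislead a downstream classifier.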

  12. Study on a pattern classification method of soil quality based on simplified learning sample dataset

    USGS Publications Warehouse

    Zhang, Jiahua; Liu, S.; Hu, Y.; Tian, Y.

    2011-01-01

    Given the massive soil information involved in current soil quality grade evaluation, this paper constructs an intelligent classification approach for soil quality grade based on classical sampling techniques and a disordered (nominal) multiclass Logistic regression model. In a case study of Longchuan County in Guangdong Province, the learning sample capacity was determined under a given confidence level and estimation accuracy, the c-means algorithm was used to automatically extract a simplified learning sample dataset from the cultivated soil quality grade evaluation database, a disordered Logistic classifier model was then built, and the calculation and analysis steps of intelligent soil quality grade classification were given. The results indicated that soil quality grade can be effectively learned and predicted from the extracted simplified dataset through this method, which changes the traditional method of soil quality grade evaluation. © 2011 IEEE.
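
    Scoring in a disordered (nominal) multiclass Logistic model reduces to a softmax over per-grade linear predictors. The weights below are illustrative placeholders, not coefficients fitted to the soil database:

```python
import math

def softmax(zs):
    m = max(zs)   # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def predict_grade(x, weights):
    """weights: one row [intercept, w1, w2, ...] per soil-quality grade.
    Returns (index of the most probable grade, per-grade probabilities)."""
    zs = [row[0] + sum(w * xi for w, xi in zip(row[1:], x)) for row in weights]
    probs = softmax(zs)
    return max(range(len(probs)), key=probs.__getitem__), probs

# three hypothetical grades scored from two standardized soil indicators
weights = [[0.0, 1.0, 0.0],
           [0.0, 0.0, 1.0],
           [0.0, -1.0, -1.0]]
grade, probs = predict_grade([2.0, 0.0], weights)
```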

  13. Comparing sexual minority cancer survivors recruited through a cancer registry to convenience methods of recruitment.

    PubMed

    Boehmer, Ulrike; Clark, Melissa A; Timm, Alison; Glickman, Mark; Sullivan, Mairead

    2011-01-01

    Sexual minority women, defined as having a lesbian or bisexual identity or reporting a preference for a female partner, are not considered by cancer surveillance. This study assesses the representativeness of sexual minority breast cancer survivors recruited into a convenience sample compared with a population-based registry sample of sexual minority breast cancer survivors. Long-term survivors of non-metastatic breast cancer who self-reported as sexual minority were recruited from a cancer registry and subsequently from the community using convenience recruitment methods. Sexual minority breast cancer survivors who screened eligible participated in a telephone survey about their quality of life and factors associated therewith. Participants in the convenience sample were similar to the registry-based sample with respect to adjustment to cancer, physical health, trust in physician, coping, social support, and sexual minority experiences. Compared with the convenience sample, breast cancer survivors in the registry sample were more likely married, more educated, diagnosed more recently, at an earlier stage of cancer, and more likely treated with breast-conserving surgery; they differed on adjuvant therapies. Because sexual minority breast cancer survivors who volunteered for the community-based sample shared most characteristics of the sample recruited from the cancer registry, we concluded that the community sample had comparable representational quality. In the absence of cancer surveillance of sexual minorities, thoughtful convenience recruitment methods provide convenience samples of good representational quality. Copyright © 2011 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  14. Smartphone-based colorimetric analysis for detection of saliva alcohol concentration.

    PubMed

    Jung, Youngkee; Kim, Jinhee; Awofeso, Olumide; Kim, Huisung; Regnier, Fred; Bae, Euiwon

    2015-11-01

    A simple device and associated analytical methods are reported. We provide objective and accurate determination of saliva alcohol concentrations using smartphone-based colorimetric imaging. The device utilizes any smartphone with a miniature attachment that positions the sample and provides constant illumination for sample imaging. Analyses of histograms based on channel imaging of red-green-blue (RGB) and hue-saturation-value (HSV) color space provide unambiguous determination of blood alcohol concentration from color changes on sample pads. A smartphone-based sample analysis by colorimetry was developed and tested with blind samples that matched the training sets. This technology can be adapted to any smartphone and used to conduct color-change assays.
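
    The color-space step is straightforward to sketch: convert pixel RGB values to HSV, average the hue over the sample pad, and interpolate a concentration from a calibration table. The calibration points and pixel values below are invented for illustration.

```python
import colorsys

def mean_hue(pixels):
    """Average hue (0-1) of a list of 8-bit (r, g, b) pixels via HSV space."""
    hues = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)[0] for r, g, b in pixels]
    return sum(hues) / len(hues)

def concentration_from_hue(h, calib):
    """Piecewise-linear interpolation in a sorted (hue, concentration) table."""
    for (h0, c0), (h1, c1) in zip(calib, calib[1:]):
        if h0 <= h <= h1:
            return c0 + (c1 - c0) * (h - h0) / (h1 - h0)
    raise ValueError("hue outside calibration range")

# hypothetical calibration: pad hue drifts from red toward yellow as alcohol rises
calib = [(0.00, 0.00), (0.15, 0.02), (0.30, 0.08)]
bac = concentration_from_hue(mean_hue([(255, 40, 0), (255, 36, 0)]), calib)
```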

  15. Pilot Test of a Novel Method for Assessing Community Response to Low-Amplitude Sonic Booms

    NASA Technical Reports Server (NTRS)

    Fidell, Sanford; Horonjeff, Richard D.; Harris, Michael

    2012-01-01

    A pilot test of a novel method for assessing residents' annoyance to sonic booms was performed. During a two-week period, residents of the base housing area at Edwards Air Force Base provided data on their reactions to sonic booms using smartphone-based interviews. Noise measurements were conducted at the same time. The report presents information about data collection methods and about test participants' reactions to low-amplitude sonic booms. The latter information should not be viewed as definitive for several reasons: it may not be reliably generalized to the wider U.S. residential population (because it was not derived from a representative random sample), and the sample itself was not large.

  16. RIVERINE ASSESSMENT USING MACROINVERTEBRATES: ALL METHODS ARE NOT CREATED EQUAL

    EPA Science Inventory

    In 1999, we compared six benthic macroinvertebrate field sampling methods for nonwadeable streams based on those developed for three major programs (EMAP-SW, NAWQA, and Ohio EPA), at each of sixty sites across four tributaries to the Ohio River. Water chemistry samples and physi...

  17. RnaSeqSampleSize: real data based sample size estimation for RNA sequencing.

    PubMed

    Zhao, Shilin; Li, Chung-I; Guo, Yan; Sheng, Quanhu; Shyr, Yu

    2018-05-30

    One of the most important and often neglected components of a successful RNA sequencing (RNA-Seq) experiment is sample size estimation. A few negative binomial model-based methods have been developed to estimate sample size based on the parameters of a single gene. However, thousands of genes are quantified and tested for differential expression simultaneously in RNA-Seq experiments. Thus, additional issues should be carefully addressed, including the false discovery rate for multiple statistical tests and the widely distributed read counts and dispersions of different genes. To solve these issues, we developed a sample size and power estimation method named RnaSeqSampleSize, based on the distributions of gene average read counts and dispersions estimated from real RNA-Seq data. Datasets from previous, similar experiments, such as The Cancer Genome Atlas (TCGA), can be used as a point of reference. Read counts and their dispersions were estimated from the reference's distribution; using that information, we estimated and summarized the power and sample size. RnaSeqSampleSize is implemented in the R language and can be installed from the Bioconductor website. A user-friendly web graphic interface is provided at http://cqs.mc.vanderbilt.edu/shiny/RnaSeqSampleSize/ . RnaSeqSampleSize provides a convenient and powerful way to estimate power and sample size for an RNA-Seq experiment. It is also equipped with several unique features, including estimation for genes or pathways of interest, power curve visualization, and parameter optimization.
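
    A crude per-gene sketch of the kind of calculation involved (a Wald test on the log fold change with a delta-method variance; all numbers illustrative). This is not the package's algorithm, which additionally handles the false discovery rate across thousands of genes and empirical count/dispersion distributions:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def nb_power(n, mu, dispersion, fold_change, z_alpha=1.96):
    """Approximate per-gene power for a two-group negative binomial comparison:
    Wald test on the log fold change, with delta-method variance
    Var(log mean) ~ (1/mu + dispersion) / n per group of n samples."""
    se = math.sqrt((1 / mu + dispersion) / n
                   + (1 / (mu * fold_change) + dispersion) / n)
    return norm_cdf(abs(math.log(fold_change)) / se - z_alpha)

def min_samples_per_group(mu, dispersion, fold_change, target=0.8):
    """Smallest per-group n whose approximate power reaches the target."""
    n = 2
    while nb_power(n, mu, dispersion, fold_change) < target:
        n += 1
    return n
```

    For a gene with mean count 20, dispersion 0.1, and a twofold change, this approximation asks for 5 samples per group to reach 80% power.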

  18. An evaluation of authentication methods for smartphone based on users’ preferences

    NASA Astrophysics Data System (ADS)

    Sari, P. K.; Ratnasari, G. S.; Prasetio, A.

    2016-04-01

    This study discusses about smartphone screen lock preferences using some types of authentication methods. The purpose is to determine the user behaviours based on the perceived security and convenience, as well as the preferences for different types of authentication methods. Variables used are the considerations for locking the screens and the types of authentication methods. The population consists of the smartphone users with the total samples of 400 respondents within a nonprobability sampling method. Data analysis method used is the descriptive analysis. The results showed that the convenience factor is still the major consideration for locking the smartphone screens. Majority of the users chose the pattern unlock as the most convenient method to use. Meanwhile, fingerprint unlock becomes the most secure method in the users’ perceptions and as the method chosen to be used in the future.

  19. Simulated Tempering Distributed Replica Sampling, Virtual Replica Exchange, and Other Generalized-Ensemble Methods for Conformational Sampling.

    PubMed

    Rauscher, Sarah; Neale, Chris; Pomès, Régis

    2009-10-13

    Generalized-ensemble algorithms in temperature space have become popular tools to enhance conformational sampling in biomolecular simulations. A random walk in temperature leads to a corresponding random walk in potential energy, which can be used to cross over energetic barriers and overcome the problem of quasi-nonergodicity. In this paper, we introduce two novel methods: simulated tempering distributed replica sampling (STDR) and virtual replica exchange (VREX). These methods are designed to address the practical issues inherent in the replica exchange (RE), simulated tempering (ST), and serial replica exchange (SREM) algorithms. RE requires a large, dedicated, and homogeneous cluster of CPUs to function efficiently when applied to complex systems. ST and SREM both have the drawback of requiring extensive initial simulations, possibly adaptive, for the calculation of weight factors or potential energy distribution functions. STDR and VREX alleviate the need for lengthy initial simulations, and for synchronization and extensive communication between replicas. Both methods are therefore suitable for distributed or heterogeneous computing platforms. We perform an objective comparison of all five algorithms in terms of both implementation issues and sampling efficiency. We use disordered peptides in explicit water as test systems, for a total simulation time of over 42 μs. Efficiency is defined in terms of both structural convergence and temperature diffusion, and we show that these definitions of efficiency are in fact correlated. Importantly, we find that ST-based methods exhibit faster temperature diffusion and correspondingly faster convergence of structural properties compared to RE-based methods. Within the RE-based methods, VREX is superior to both SREM and RE. On the basis of our observations, we conclude that ST is ideal for simple systems, while STDR is well-suited for complex systems.
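
    The exchange step shared by the RE-family methods discussed above is a single Metropolis criterion on the energies and inverse temperatures of the two replicas; a minimal sketch:

```python
import math, random

def swap_accepted(E_i, E_j, beta_i, beta_j, rng):
    """Metropolis criterion for exchanging configurations between replicas i and j:
    accept with probability min(1, exp[(beta_i - beta_j) * (E_i - E_j)])."""
    p = min(1.0, math.exp((beta_i - beta_j) * (E_i - E_j)))
    return rng.random() < p, p

rng = random.Random(42)
# a cold replica (beta=1.0) holding a high-energy configuration and a hot
# replica (beta=0.5) holding a low-energy one: the swap is always accepted,
# which is how the random walk in temperature carries systems over barriers
accepted, p = swap_accepted(5.0, 1.0, 1.0, 0.5, rng)
```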

  20. Systems and methods for self-synchronized digital sampling

    NASA Technical Reports Server (NTRS)

    Samson, Jr., John R. (Inventor)

    2008-01-01

    Systems and methods for self-synchronized data sampling are provided. In one embodiment, a system for capturing synchronous data samples is provided. The system includes an analog to digital converter adapted to capture signals from one or more sensors and convert the signals into a stream of digital data samples at a sampling frequency determined by a sampling control signal; and a synchronizer coupled to the analog to digital converter and adapted to receive a rotational frequency signal from a rotating machine, wherein the synchronizer is further adapted to generate the sampling control signal, and wherein the sampling control signal is based on the rotational frequency signal.
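
    The synchronizer's job reduces to deriving the sampling clock from the measured rotational frequency so that every revolution yields a fixed number of samples; a toy sketch (the 64-samples-per-revolution figure is invented):

```python
def sampling_control_frequency(rpm, samples_per_rev=64):
    """Sampling rate (Hz) locked to the machine's rotational speed, so the ADC
    captures the same number of samples on every revolution."""
    return rpm / 60.0 * samples_per_rev

# the sample clock tracks speed changes automatically
rates = [sampling_control_frequency(rpm) for rpm in (1500, 3000, 3600)]
```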

  1. Subrandom methods for multidimensional nonuniform sampling.

    PubMed

    Worley, Bradley

    2016-08-01

    Methods of nonuniform sampling that utilize pseudorandom number sequences to select points from a weighted Nyquist grid are commonplace in biomolecular NMR studies, due to the beneficial incoherence introduced by pseudorandom sampling. However, these methods require the specification of a non-arbitrary seed number in order to initialize a pseudorandom number generator. Because the performance of pseudorandom sampling schedules can substantially vary based on seed number, this can complicate the task of routine data collection. Approaches such as jittered sampling and stochastic gap sampling are effective at reducing random seed dependence of nonuniform sampling schedules, but still require the specification of a seed number. This work formalizes the use of subrandom number sequences in nonuniform sampling as a means of seed-independent sampling, and compares the performance of three subrandom methods to their pseudorandom counterparts using commonly applied schedule performance metrics. Reconstruction results using experimental datasets are also provided to validate claims made using these performance metrics. Copyright © 2016 Elsevier Inc. All rights reserved.
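
    A seed-free subrandom schedule can be sketched with the base-2 van der Corput sequence, the simplest one-dimensional low-discrepancy sequence; the grid size and point count are arbitrary, and the density weighting used in practice is omitted:

```python
def van_der_corput(n, base=2):
    """n-th term of the base-b van der Corput low-discrepancy sequence."""
    q, denom = 0.0, 1.0
    while n:
        n, rem = divmod(n, base)
        denom *= base
        q += rem / denom
    return q

def subrandom_schedule(grid_size, n_points):
    """Deterministically pick n_points distinct indices on a Nyquist grid from
    the van der Corput sequence: seed-independent by construction."""
    picked, i = [], 1
    while len(picked) < n_points:
        idx = int(van_der_corput(i) * grid_size)
        if idx not in picked:
            picked.append(idx)
        i += 1
    return sorted(picked)

schedule = subrandom_schedule(64, 16)
```

    Because no pseudorandom generator is involved, rerunning the schedule always yields the same points, which is exactly the seed-dependence problem the paper targets.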

  2. Regional geochemical maps of the Tonopah 1 degree by 2 degrees Quadrangle, Nevada, based on samples of stream sediment and nonmagnetic heavy-mineral concentrate

    USGS Publications Warehouse

    Nash, J.T.; Siems, D.F.

    1988-01-01

    The geochemical maps in this report are based on analytical results reported by Fairfield and others (1985), Hill and others (1986), and Siems and others (1986). These reports also describe the sample preparation and analytical methods and provide information on the location of the sample sites.

  3. Comparison of the DSM-IV Combined and Inattentive Types of ADHD in a School-Based Sample of Latino/Hispanic Children

    ERIC Educational Resources Information Center

    Bauermeister, Jose J.; Matos, Maribel; Reina, Graciela; Salas, Carmen C.; Martinez, Jose V.; Cumba, Eduardo; Barkley, Russell A.

    2005-01-01

    Background: The aim of this investigation was to examine the construct validity and distinctiveness of the inattentive type (IT) and combined type (CT) of Attention-Deficit/Hyperactivity Disorder (ADHD) in a Latino/Hispanic sample. Method: A comprehensive assessment was conducted with a clinically diagnosed school-based sample of 98 children aged…

  4. First experiences with methods to measure ammonia emissions from naturally ventilated cattle buildings in the U.K.

    NASA Astrophysics Data System (ADS)

    Demmers, T. G. M.; Burgess, L. R.; Short, J. L.; Phillips, V. R.; Clark, J. A.; Wathes, C. M.

    A method has been developed to measure the emission rate of ammonia from naturally ventilated U.K. livestock buildings. The method is based on measurements of ammonia concentration and estimates of the ventilation rate of the building by continuous release of carbon monoxide tracer within the building. The tracer concentration is measured at nine positions in openings around the perimeter of the building, as well as around a ring sampling line. Two criteria were evaluated to decide whether, at any given time, a given opening in the building acted as an air inlet or as an air outlet. The carbon monoxide concentration difference across an opening was found to be a better criterion than the temperature difference across the opening. Ammonia concentrations were measured continuously at the sampling points using a chemiluminescence analyser. The method was applied to a straw-bedded beef unit and to a slurry-based dairy unit. Both buildings were of space-boarded construction. Ventilation rates estimated by the ring line sample were consistently higher than by the perimeter samples. During calm weather, the ventilation estimates by both samples were similar (10-20 air changes h-1). However, during windy conditions (>5 m s-1) the ventilation rate was overestimated by the ring line sample (average 100 air changes h-1) compared to the perimeter samples (average 50 air changes h-1). The difference was caused by incomplete mixing of the tracer within the building. The ventilation rate estimated from the perimeter samples was used for the calculation of the emission rate. Preliminary estimates of the ammonia emission factor were 6.0 kg NH3 (500 kg live-weight)-1 (190 d)-1 for the slurry-based dairy unit and 3.7 for the straw-bedded beef unit.
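
    The underlying mass-balance arithmetic is simple: with constant tracer injection, the ventilation rate is the release rate divided by the tracer concentration rise, and the ammonia emission rate is ventilation times the NH3 concentration difference. The numbers below are illustrative, not measurements from the study:

```python
def ventilation_rate(tracer_release_m3_h, c_out, c_in=0.0):
    """Constant-injection tracer method: building ventilation rate (m3/h) from
    the CO release rate (m3/h) and volumetric tracer concentrations (m3/m3)."""
    return tracer_release_m3_h / (c_out - c_in)

def ammonia_emission(vent_m3_h, nh3_out, nh3_in=0.0):
    """Emission rate (m3 NH3/h) from ventilation and NH3 concentrations."""
    return vent_m3_h * (nh3_out - nh3_in)

vent = ventilation_rate(0.01, 5e-6)      # 0.01 m3/h of CO raises conc. by 5 ppm
emission = ammonia_emission(vent, 2e-6)  # 2 ppm NH3 in the exhaust air
```

    Note how any overestimate of ventilation (the ring-line problem above) propagates linearly into the emission estimate.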

  5. An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.

    PubMed

    Nicklas, Janice A; Buel, Eric

    2005-09-01

    The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR based method offers the possibility of developing a faster and more quantitative assay. Alu sequences are primate-specific and are found in many copies in the human genome, making these sequences an excellent target or marker for human DNA. This paper describes the development of a real-time Alu sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on time, and automated quantitation, as well as a large dynamic range (128 ng/μL to 0.5 pg/μL).
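
    Automated quantitation in such assays rests on a standard curve: the cycle threshold (Ct) is linear in log10 of the input DNA, so a least-squares fit to a dilution series inverts an unknown sample's Ct into a concentration. The dilution values below are synthetic, with a textbook slope of about -3.32 (100% PCR efficiency):

```python
import math

def fit_standard_curve(concs, cts):
    """Least-squares line Ct = slope * log10(conc) + intercept."""
    xs = [math.log10(c) for c in concs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def quantify(ct, slope, intercept):
    """Invert an unknown sample's Ct into a DNA concentration."""
    return 10 ** ((ct - intercept) / slope)

# synthetic 10-fold dilution series (ng/uL) and its ideal Ct values
slope, intercept = fit_standard_curve([100, 10, 1, 0.1],
                                      [23.36, 26.68, 30.0, 33.32])
```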

  6. Efficacy of the FilmArray blood culture identification panel for direct molecular diagnosis of infectious diseases from samples other than blood.

    PubMed

    Micó, Miquel; Navarro, Ferran; de Miniac, Daniela; González, Yésica; Brell, Albert; López, Cristina; Sánchez-Reus, Ferran; Mirelis, Beatriz; Coll, Pere

    2015-12-01

    Molecular-based techniques reduce the delay in diagnosing infectious diseases and therefore contribute to better patient outcomes. We assessed the FilmArray blood culture identification (BCID) panel (Biofire Diagnostics/bioMérieux) directly on clinical specimens other than blood: cerebrospinal, joint, pleural and ascitic fluids, bronchoscopy samples and abscesses. We compared the panel's results for 88 samples with those obtained by culture-based techniques. The percentage of agreement between the two methods was 75% with a Cohen κ value of 0.51. Global sensitivity and specificity using the FilmArray BCID panel were 71 and 97%, respectively. Sensitivity was poorer in samples with a low bacterial load, such as ascitic and pleural fluids (25%), whereas the sensitivity for abscess samples was high (89%). These findings suggest that the FilmArray BCID panel could be useful for microbiological diagnosis directly from samples other than positive blood cultures, as it offers acceptable sensitivity and moderate agreement with conventional microbiological methods. Nevertheless, cost-benefit studies should be performed before introducing this method into algorithms for microbiological diagnostics.
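
    The reported agreement statistics follow from a 2x2 tabulation of the panel against culture; the counts below are invented for illustration, not the study's data (which involved per-organism comparisons):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, percent agreement, and Cohen's kappa for a
    2x2 comparison of a new assay against the reference method."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    po = (tp + tn) / n                                            # observed agreement
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2  # agreement by chance
    kappa = (po - pe) / (1 - pe)
    return sensitivity, specificity, po, kappa

# invented counts for illustration
sens, spec, agreement, kappa = diagnostic_metrics(tp=40, fp=5, fn=10, tn=45)
```

    Kappa discounts the agreement expected by chance, which is why it runs lower than the raw percent agreement.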

  7. Asteroid orbital inversion using uniform phase-space sampling

    NASA Astrophysics Data System (ADS)

    Muinonen, K.; Pentikäinen, H.; Granvik, M.; Oszkiewicz, D.; Virtanen, J.

    2014-07-01

    We review statistical inverse methods for asteroid orbit computation from a small number of astrometric observations and short time intervals of observations. With the help of Markov-chain Monte Carlo methods (MCMC), we present a novel inverse method that utilizes uniform sampling of the phase space for the orbital elements. The statistical orbital ranging method (Virtanen et al. 2001, Muinonen et al. 2001) was set out to resolve the long-lasting challenges in the initial computation of orbits for asteroids. The ranging method starts from the selection of a pair of astrometric observations. Thereafter, the topocentric ranges and angular deviations in R.A. and Decl. are randomly sampled. The two Cartesian positions allow for the computation of orbital elements and, subsequently, the computation of ephemerides for the observation dates. Candidate orbital elements are included in the sample of accepted elements if the χ^2-value between the observed and computed observations is within a pre-defined threshold. The sample orbital elements obtain weights based on a certain debiasing procedure. When the weights are available, the full sample of orbital elements allows the probabilistic assessments for, e.g., object classification and ephemeris computation as well as the computation of collision probabilities. The MCMC ranging method (Oszkiewicz et al. 2009; see also Granvik et al. 2009) replaces the original sampling algorithm described above with a proposal probability density function (p.d.f.), and a chain of sample orbital elements results in the phase space. MCMC ranging is based on a bivariate Gaussian p.d.f. for the topocentric ranges, and allows for the sampling to focus on the phase-space domain with most of the probability mass. In the virtual-observation MCMC method (Muinonen et al. 2012), the proposal p.d.f. for the orbital elements is chosen to mimic the a posteriori p.d.f. 
for the elements: first, random errors are simulated for each observation, resulting in a set of virtual observations; second, corresponding virtual least-squares orbital elements are derived using the Nelder-Mead downhill simplex method; third, repeating the procedure two times allows for a computation of a difference for two sets of virtual orbital elements; and, fourth, this orbital-element difference constitutes a symmetric proposal in a random-walk Metropolis-Hastings algorithm, avoiding the explicit computation of the proposal p.d.f. In a discrete approximation, the allowed proposals coincide with the differences that are based on a large number of pre-computed sets of virtual least-squares orbital elements. The virtual-observation MCMC method is thus based on the characterization of the relevant volume in the orbital-element phase space. Here we utilize MCMC to map the phase-space domain of acceptable solutions. We can make use of the proposal p.d.f.s from the MCMC ranging and virtual-observation methods. The present phase-space mapping produces, upon convergence, a uniform sampling of the solution space within a pre-defined χ^2-value. The weights of the sampled orbital elements are then computed on the basis of the corresponding χ^2-values. The present method resembles the original ranging method. On one hand, MCMC mapping is insensitive to local extrema in the phase space and efficiently maps the solution space. This is somewhat contrary to the MCMC methods described above. On the other hand, MCMC mapping can suffer from producing a small number of sample elements with small χ^2-values, in resemblance to the original ranging method. We apply the methods to example near-Earth, main-belt, and transneptunian objects, and highlight the utilization of the methods in the data processing and analysis pipeline of the ESA Gaia space mission.
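
    The mapping idea, accept any proposal whose χ^2 stays under the threshold so that the chain covers the acceptable region roughly uniformly, can be sketched in one dimension with a toy linear model (all numbers invented):

```python
import random

def mcmc_map(chi2, start, step, threshold, n_steps, rng):
    """Random-walk chain that accepts any candidate with chi2 <= threshold,
    mapping the acceptable region of parameter space (1-D illustration)."""
    x, chain = start, []
    for _ in range(n_steps):
        cand = x + rng.uniform(-step, step)
        if chi2(cand) <= threshold:
            x = cand   # uniform acceptance anywhere inside the region
        chain.append(x)
    return chain

# toy linear model y = a * t with three synthetic "observations"
obs = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
c2 = lambda a: sum((y - a * t) ** 2 for t, y in obs)
chain = mcmc_map(c2, start=2.0, step=0.2, threshold=1.0, n_steps=2000,
                 rng=random.Random(1))
```

    Every state in the chain satisfies the χ^2 bound, and the chain wanders across the whole acceptable interval rather than concentrating at the best-fit value, which is the point of the mapping.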

  8. Evaluation of respondent-driven sampling.

    PubMed

    McCreesh, Nicky; Frost, Simon D W; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda N; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total population data. Total population data on age, tribe, religion, socioeconomic status, sexual activity, and HIV status were available on a population of 2402 male household heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, using current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). We recruited 927 household heads. Full and small RDS samples were largely representative of the total population, but both samples underrepresented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven sampling statistical inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven sampling bootstrap 95% confidence intervals included the population proportion. Respondent-driven sampling produced a generally representative sample of this well-connected nonhidden population. However, current respondent-driven sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. 
    Respondent-driven sampling should be regarded as a (potentially superior) form of convenience sampling, and caution is required when interpreting findings based on this sampling method.
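
    The standard inference step the authors evaluate weights each respondent by the inverse of their reported network degree (the Volz-Heckathorn, or RDS-II, estimator); a minimal sketch with invented data:

```python
def rds_ii(traits, degrees):
    """Volz-Heckathorn (RDS-II) proportion estimate: inverse-degree weights
    correct for the higher inclusion probability of well-connected people."""
    weights = [1.0 / d for d in degrees]
    have = sum(w for w, t in zip(weights, traits) if t)
    return have / sum(weights)

# three hypothetical respondents; the trait carrier with few contacts counts more
estimate = rds_ii([True, False, True], [2, 4, 4])
```

    Here the naive sample proportion would be 2/3, while down-weighting the well-connected respondents raises the estimate, illustrating the adjustment whose effectiveness the study questions.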

  9. Cellulose I crystallinity determination using FT-Raman spectroscopy : univariate and multivariate methods

    Treesearch

    Umesh P. Agarwal; Richard S. Reiner; Sally A. Ralph

    2010-01-01

    Two new methods based on FT-Raman spectroscopy, one simple, based on a band intensity ratio, and the other using a partial least squares (PLS) regression model, are proposed to determine cellulose I crystallinity. In the simple method, crystallinity in cellulose I samples was determined based on univariate regression that was first developed using the Raman band...

  10. Time-dependent importance sampling in semiclassical initial value representation calculations for time correlation functions.

    PubMed

    Tao, Guohua; Miller, William H

    2011-07-14

    An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can be applied generally to sample rare events efficiently while avoiding being trapped in a local region of the phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
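
    Stripped of the semiclassical machinery, importance sampling replaces draws from a target density p with draws from a convenient density q, reweighting each sample by p/q. A generic sketch, estimating E_p[x^2] for a standard normal p using a wider normal q (not the SC-IVR sampling function itself):

```python
import math, random

def importance_estimate(f, draw_q, weight_pq, n, rng):
    """Monte Carlo estimate of E_p[f] from samples of q:
    (1/n) * sum over x ~ q of f(x) * p(x)/q(x)."""
    return sum(f(x) * weight_pq(x) for x in (draw_q(rng) for _ in range(n))) / n

SQRT2PI = math.sqrt(2.0 * math.pi)
p = lambda x: math.exp(-x * x / 2.0) / SQRT2PI          # N(0, 1) target density
q = lambda x: math.exp(-x * x / 8.0) / (2.0 * SQRT2PI)  # N(0, 2) proposal density

est = importance_estimate(f=lambda x: x * x,
                          draw_q=lambda rng: rng.gauss(0.0, 2.0),
                          weight_pq=lambda x: p(x) / q(x),
                          n=20000, rng=random.Random(7))
```

    The exact value is E_p[x^2] = 1; a broader proposal keeps the weights bounded, the same reason a well-chosen sampling function tames rare-event variance in the paper's setting.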

  11. K-Nearest Neighbor Algorithm Optimization in Text Categorization

    NASA Astrophysics Data System (ADS)

    Chen, Shufeng

    2018-01-01

    K-Nearest Neighbor (KNN) classification is one of the simplest methods of data mining. It has been widely used in classification, regression and pattern recognition. The traditional KNN method has some shortcomings, such as a large amount of sample computation and strong dependence on the sample library capacity. In this paper, a method of representative sample optimization based on the CURE algorithm is proposed. On this basis, a quick algorithm, QKNN (quick k-nearest neighbor), is presented to find the k nearest neighbor samples, which greatly reduces the similarity computation. The experimental results show that this algorithm can effectively reduce the number of samples and speed up the search for the k nearest neighbor samples, improving the performance of the algorithm.
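
    The two ingredients, a distance-sorted vote and a representative-subset reduction, can be sketched as follows. The reduction here is Hart's classic condensed-NN rule, used as a simple stand-in for the paper's CURE-based selection, and the points are invented:

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Majority vote among the k nearest training points; train holds (point, label)."""
    ranked = sorted(train,
                    key=lambda pl: sum((a - b) ** 2 for a, b in zip(pl[0], query)))
    return Counter(label for _, label in ranked[:k]).most_common(1)[0][0]

def condense(train, k=3):
    """Hart's condensed nearest-neighbour reduction (illustrative stand-in for a
    CURE-based representative-sample step): keep a point only if the subset
    kept so far misclassifies it."""
    kept = [train[0]]
    for point, label in train[1:]:
        if knn_predict(kept, point, k=min(k, len(kept))) != label:
            kept.append((point, label))
    return kept

# two well-separated toy classes
train = [((0, 0), 'a'), ((0.2, 0.1), 'a'), ((0.1, 0.3), 'a'),
         ((5, 5), 'b'), ((5.2, 5.1), 'b'), ((4.9, 5.2), 'b')]
kept = condense(train)
```

    On this toy data the six training points shrink to one representative per class, and the reduced set still classifies correctly, which is the speed-for-size trade the paper exploits.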

  12. Protocol for Detection of Yersinia pestis in Environmental ...

    EPA Pesticide Factsheets

    Methods Report This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, for detecting Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. Each analytical method includes a step-by-step sample processing procedure for each sample type. The protocol includes real-time PCR, traditional microbiological culture, and the Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because of this non-restrictive availability to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze a large number of samples during a wide-area plague incident.

  13. Direct sampling of cystic fibrosis lungs indicates that DNA-based analyses of upper-airway specimens can misrepresent lung microbiota.

    PubMed

    Goddard, Amanda F; Staudinger, Benjamin J; Dowd, Scot E; Joshi-Datar, Amruta; Wolcott, Randall D; Aitken, Moira L; Fligner, Corinne L; Singh, Pradeep K

    2012-08-21

    Recent work using culture-independent methods suggests that the lungs of cystic fibrosis (CF) patients harbor a vast array of bacteria not conventionally implicated in CF lung disease. However, sampling lung secretions in living subjects requires that expectorated specimens or collection devices pass through the oropharynx. Thus, contamination could confound results. Here, we compared culture-independent analyses of throat and sputum specimens to samples directly obtained from the lungs at the time of transplantation. We found that CF lungs with advanced disease contained relatively homogenous populations of typical CF pathogens. In contrast, upper-airway specimens from the same subjects contained higher levels of microbial diversity and organisms not typically considered CF pathogens. Furthermore, sputum exhibited day-to-day variation in the abundance of nontypical organisms, even in the absence of clinical changes. These findings suggest that oropharyngeal contamination could limit the accuracy of DNA-based measurements on upper-airway specimens. This work highlights the importance of sampling procedures for microbiome studies and suggests that methods that account for contamination are needed when DNA-based methods are used on clinical specimens.

  14. Ensemble based adaptive over-sampling method for imbalanced data learning in computer aided detection of microaneurysm.

    PubMed

    Ren, Fulong; Cao, Peng; Li, Wei; Zhao, Dazhe; Zaiane, Osmar

    2017-01-01

    Diabetic retinopathy (DR) is a progressive disease, and its detection at an early stage is crucial for saving a patient's vision. An automated screening system for DR can help reduce the chances of complete blindness due to DR while lowering the workload on ophthalmologists. Among the earliest signs of DR are microaneurysms (MAs). However, current schemes for MA detection tend to report many false positives because the detection algorithms have high sensitivity. Inevitably, some non-MA structures are labeled as MAs in the initial MA identification step. This is a typical "class imbalance problem". Class-imbalanced data has detrimental effects on the performance of conventional classifiers. In this work, we propose an ensemble based adaptive over-sampling algorithm for overcoming the class imbalance problem in the false positive reduction step, and we use Boosting, Bagging, and Random Subspace as the ensemble frameworks to improve microaneurysm detection. The proposed ensemble based over-sampling methods combine the strengths of adaptive over-sampling and ensembles. The objective of this amalgamation of ensembles and adaptive over-sampling is to reduce the induction biases introduced by imbalanced data and to enhance the generalization classification performance of extreme learning machines (ELM). Experimental results show that our ASOBoost method has higher area under the ROC curve (AUC) and G-mean values than many existing class imbalance learning methods. Copyright © 2016 Elsevier Ltd. All rights reserved.
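
    The amalgamation described, over-sampling the minority class and then letting an ensemble vote, can be sketched in miniature. Plain random over-sampling and a bagged nearest-centroid vote stand in here for the paper's adaptive over-sampling and ELM ensemble:

```python
import random
from collections import Counter

def oversample(points, labels, seed=0):
    """Replicate minority-class samples until the classes balance
    (a plain stand-in for adaptive over-sampling)."""
    rng = random.Random(seed)
    counts = Counter(labels)
    target = max(counts.values())
    out_p, out_y = list(points), list(labels)
    for cls, n in counts.items():
        pool = [p for p, y in zip(points, labels) if y == cls]
        for _ in range(target - n):
            out_p.append(rng.choice(pool))
            out_y.append(cls)
    return out_p, out_y

def nearest_centroid(points, labels):
    """Train a tiny classifier: one centroid per class, predict the closest."""
    sums = {}
    for (x1, x2), y in zip(points, labels):
        s = sums.setdefault(y, [0.0, 0.0, 0])
        s[0] += x1; s[1] += x2; s[2] += 1
    cents = {y: (s[0] / s[2], s[1] / s[2]) for y, s in sums.items()}
    return lambda q: min(
        cents,
        key=lambda y: (q[0] - cents[y][0]) ** 2 + (q[1] - cents[y][1]) ** 2)

def bagged_predict(query, points, labels, rounds=7, seed=0):
    """Bagging: train on bootstrap resamples, majority-vote the predictions."""
    rng = random.Random(seed)
    n, votes = len(points), Counter()
    for _ in range(rounds):
        idx = [rng.randrange(n) for _ in range(n)]
        clf = nearest_centroid([points[i] for i in idx],
                               [labels[i] for i in idx])
        votes[clf(query)] += 1
    return votes.most_common(1)[0][0]
```

    Balancing the training set before each bootstrap round keeps the base learners from defaulting to the majority (non-MA) class, which is the bias the paper's combination is designed to remove.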

  15. An Improved Sparse Representation over Learned Dictionary Method for Seizure Detection.

    PubMed

    Li, Junhui; Zhou, Weidong; Yuan, Shasha; Zhang, Yanli; Li, Chengcheng; Wu, Qi

    2016-02-01

    Automatic seizure detection plays an important role in the monitoring, diagnosis and treatment of epilepsy. In this paper, a patient-specific method is proposed for seizure detection in long-term intracranial electroencephalogram (EEG) recordings. The seizure detection method is based on sparse representation with online dictionary learning and an elastic net constraint. The online learned dictionary can sparsely represent the testing samples more accurately, and the elastic net constraint, which combines the l1-norm and l2-norm, not only makes the coefficients sparse but also avoids the over-fitting problem. First, the EEG signals are preprocessed using wavelet filtering and differential filtering, and a kernel function is applied to make the samples closer to linearly separable. Then the dictionaries for seizure and nonseizure are respectively learned from the original ictal and interictal training samples with an online dictionary optimization algorithm to compose the training dictionary. After that, the test samples are sparsely coded over the learned dictionary and the residuals associated with the ictal and interictal sub-dictionaries are calculated, respectively. Eventually, the test samples are classified into two distinct categories, seizure or nonseizure, by comparing the reconstruction residuals. An average segment-based sensitivity of 95.45%, specificity of 99.08%, and event-based sensitivity of 94.44%, with a false detection rate of 0.23/h and an average latency of -5.14 s, have been achieved with our proposed method.
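
    The final decision step, coding the test sample over each sub-dictionary and comparing reconstruction residuals, can be illustrated in stripped-down form. One-atom least-squares projections replace the learned elastic-net coding, and the atoms below are invented toy vectors, not learned EEG features:

```python
import math

def projection_residual(x, atom):
    """Residual of x after least-squares reconstruction on a single atom."""
    coef = sum(a * b for a, b in zip(x, atom)) / sum(a * a for a in atom)
    return math.sqrt(sum((xi - coef * ai) ** 2 for xi, ai in zip(x, atom)))

def classify_by_residual(x, dictionaries):
    """dictionaries: {label: [atom, ...]}.  Assign the label whose best
    atom reconstructs x with the smallest residual, mirroring the
    seizure/nonseizure decision described above."""
    return min(
        dictionaries,
        key=lambda y: min(projection_residual(x, a) for a in dictionaries[y]),
    )
```

    The real method codes each sample over many atoms jointly under the elastic net penalty, but the decision rule, smallest reconstruction residual wins, is the same.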

  16. Comparability among four invertebrate sampling methods, Fountain Creek Basin, Colorado, 2010-2012

    USGS Publications Warehouse

    Zuellig, Robert E.; Bruce, James F.; Stogner, Sr., Robert W.; Brown, Krystal D.

    2014-01-01

    The U.S. Geological Survey, in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, designed a study to determine if sampling method and sample timing resulted in comparable samples and assessments of biological condition. To accomplish this task, annual invertebrate samples were collected concurrently using four sampling methods at 15 U.S. Geological Survey streamflow gages in the Fountain Creek basin from 2010 to 2012. Collectively, the four methods are used by local (U.S. Geological Survey cooperative monitoring program) and State monitoring programs (Colorado Department of Public Health and Environment) in the Fountain Creek basin to produce two distinct sample types for each program that target single and multiple habitats. This study found distinguishable differences between single- and multi-habitat sample types using both community similarities and multi-metric index values, while methods from each program within a sample type were comparable. This indicates that the Colorado Department of Public Health and Environment methods were compatible with the cooperative monitoring program methods within the multi- and single-habitat sample types. Comparisons between September and October samples found distinguishable differences based on community similarities for both sample types, whereas differences were found only for single-habitat samples when multi-metric index values were considered. At one site, differences between September and October index values from single-habitat samples resulted in opposing assessments of biological condition. Direct application of the results to inform the revision of the existing Fountain Creek basin U.S. Geological Survey cooperative monitoring program is discussed.

  17. Detection of an amphiphilic biosample in a paper microchannel based on length.

    PubMed

    Chen, Yu-Tzu; Yang, Jing-Tang

    2015-01-01

    We developed a simple method to achieve semiquantitative detection of an amphiphilic biosample by measuring the length of flow on a paper-based microfluidic analytical device (μPAD). When an amphiphilic sample was dripped into a straight microchannel defined by a printed wax barrier (hydrophobic) on filter paper (hydrophilic), the length of flow was affected by the interaction among the sample, the filter-paper channel and the wax barrier. The flow length decreased with increasing concentration of the amphiphilic sample because of adsorption of the sample onto the hydrophobic barrier. Measuring the flow length thus enabled a determination of the concentration of the amphiphilic sample. The tested samples included surfactants (Tween 20 and Triton X-100), oligonucleotides (DNA), bovine serum albumin (BSA), human albumin, nitrite, glucose and low-density lipoprotein (LDL). The results show that measuring the flow length directly determined the concentration of an amphiphilic sample, whereas a non-amphiphilic sample was not amenable to this method. The proposed method features the advantages of low cost, simplicity, convenience, directness, rapidity (<5 min) and a requirement of only a small sample volume (5 μL), with prospective applications in developing areas and in point-of-care testing (POCT) near patients.

  18. Comparison of base composition analysis and Sanger sequencing of mitochondrial DNA for four U.S. population groups.

    PubMed

    Kiesler, Kevin M; Coble, Michael D; Hall, Thomas A; Vallone, Peter M

    2014-01-01

    A set of 711 samples from four U.S. population groups was analyzed using a novel mass spectrometry based method for mitochondrial DNA (mtDNA) base composition profiling. Comparison of the mass spectrometry results with Sanger sequencing derived data yielded a concordance rate of 99.97%. Length heteroplasmy was identified in 46% of samples and point heteroplasmy was observed in 6.6% of samples in the combined mass spectral and Sanger data set. Using discrimination capacity as a metric, Sanger sequencing of the full control region had the highest discriminatory power, followed by the mass spectrometry base composition method, which was more discriminating than Sanger sequencing of just the hypervariable regions. This trend is in agreement with the number of nucleotides covered by each of the three assays. Published by Elsevier Ireland Ltd.

  19. Long-Term Frozen Storage of Urine Samples: A Trouble to Get PCR Results in Schistosoma spp. DNA Detection?

    PubMed Central

    Fernández-Soto, Pedro; Velasco Tirado, Virginia; Carranza Rodríguez, Cristina; Pérez-Arellano, José Luis; Muro, Antonio

    2013-01-01

    Background Human schistosomiasis remains a serious worldwide public health problem. At present, a sensitive and specific assay for routine diagnosis of schistosome infection is not yet available. The potential for detecting schistosome-derived DNA by PCR-based methods in human clinical samples is currently being investigated as a diagnostic tool with potential application in routine schistosomiasis diagnosis. Collection of diagnostic samples such as stool or blood is often difficult in some populations. Urine, by contrast, is a biological sample that can be collected non-invasively, is easy to obtain from people of all ages and is easy to handle, but it is still not widely used as a sample for PCR diagnosis. This could be due to the high variability in the reported efficiency of detection, which results from high variation in the storage and handling conditions of urine samples and in DNA preservation and extraction methods. Methodology/Principal Findings We evaluated different commercial DNA extraction methods on a series of long-term frozen human urine samples from patients with parasitologically confirmed schistosomiasis in order to assess the PCR effectiveness for Schistosoma spp. detection. Patients' urine samples were frozen for 18 months up to 7 years before use. Results were compared with those obtained in PCR assays using fresh healthy human urine artificially contaminated with Schistosoma mansoni DNA and urine samples from mice experimentally infected with S. mansoni cercariae stored frozen for at least 12 months before use. PCR results in fresh artificially contaminated human urine samples using the different DNA extraction methods were much more effective than those obtained when long-term frozen human urine samples were used as the source of DNA template.
Conclusions/Significance Long-term frozen human urine samples are probably not a good source for DNA extraction for use as a template in PCR detection of Schistosoma spp., regardless of the DNA method of extraction used. PMID:23613907

  20. Comparative effectiveness and acceptability of home-based and clinic-based sampling methods for sexually transmissible infections screening in females aged 14-50 years: a systematic review and meta-analysis.

    PubMed

    Odesanmi, Tolulope Y; Wasti, Sharada P; Odesanmi, Omolola S; Adegbola, Omololu; Oguntuase, Olubukola O; Mahmood, Sajid

    2013-12-01

    Home-based sampling is a strategy to enhance uptake of sexually transmissible infection (STI) screening. This review aimed to compare the screening uptake levels of home-based self-sampling and clinic-based specimen collection for STIs (chlamydia (Chlamydia trachomatis), gonorrhoea (Neisseria gonorrhoeae) and trichomoniasis) in females aged 14-50 years. Acceptability and the effect on specimen quality were also determined. Sixteen electronic databases were searched from inception to September 2012. Randomised controlled trials (RCTs) comparing the uptake levels of home-based self-sampling and clinic-based sampling for chlamydia, gonorrhoea and trichomoniasis in females aged 14-50 years were eligible for inclusion. The risk of bias in the trials was assessed. Risk ratios (RRs) for dichotomous outcomes were meta-analysed. Of 3065 papers, six studies comprising seven RCTs contributed to the final review. Compared with clinic-based methods, home-based screening increased uptake significantly (P=0.001-0.05) in five trials, and this was substantiated in a meta-analysis (RR: 1.55; 95% confidence interval: 1.30-1.85; P=0.00001) of two trials. In three trials, a significant preference for home-based testing (P=0.001-0.05) was expressed. No significant difference was observed in specimen quality. Sampling was rated as easy by a significantly higher number of women (P=0.01) in the clinic group in one trial. The review provides evidence that home-based testing results in greater uptake of STI screening in females (14-50 years) than clinic-based testing in the developed world, without compromising quality. Home collection strategies should be added to clinic-based screening programs to enhance uptake.
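
    A pooled risk ratio of the kind quoted (RR: 1.55; 95% CI: 1.30-1.85) comes from inverse-variance meta-analysis on the log scale. A minimal fixed-effect sketch follows; any per-trial event counts fed to it would be supplied by the individual studies, not taken from this review:

```python
import math

def pooled_risk_ratio(trials):
    """Fixed-effect (inverse-variance) pooling of risk ratios.

    trials: list of (events_home, n_home, events_clinic, n_clinic).
    Returns (pooled RR, 95% CI lower bound, 95% CI upper bound).
    """
    num = den = 0.0
    for a, n1, c, n2 in trials:
        log_rr = math.log((a / n1) / (c / n2))
        var = 1/a - 1/n1 + 1/c - 1/n2   # variance of log RR
        w = 1.0 / var                   # inverse-variance weight
        num += w * log_rr
        den += w
    pooled = num / den
    se = math.sqrt(1.0 / den)
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se),
            math.exp(pooled + 1.96 * se))
```

    Pooling on the log scale keeps the ratio symmetric around 1; the exponentiated weighted mean and its normal-approximation interval are what reviews report as "RR (95% CI)".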

  1. Simultaneous LC/MS/MS determination of thiols and disulfides in urine samples based on differential labeling with ferrocene-based maleimides.

    PubMed

    Seiwert, Bettina; Karst, Uwe

    2007-09-15

    A method for the simultaneous determination of a series of thiols and disulfides in urine samples has been developed, based on the sequential labeling of free and bound thiol functionalities with two ferrocene-based maleimide reagents. The sample is first exposed to N-(2-ferroceneethyl)maleimide, leading to the derivatization of free thiol groups in the sample. After quantitative reaction and subsequent reduction of the disulfide-bound thiols by tris(2-carboxyethyl)phosphine, the newly formed thiol functionalities are reacted with ferrocenecarboxylic acid-(2-maleimidoyl)ethylamide. The reaction products are determined by LC/MS/MS in multiple reaction monitoring mode, and precursor ion scans as well as neutral loss scans are applied to detect further unknown thiols. The method was successfully applied to the analysis of free and disulfide-bound thiols in urine samples. Limits of detection are 30 to 110 nM, and the linear range spans two decades of concentration, thus covering the relevant concentration range of thiols in urine samples. The thiol and disulfide concentrations were referred to the creatinine content to compensate for different sample volumes. As some calibration standards for the disulfides are not commercially available, they were synthesized in an electrochemical flow-through cell. This allowed the synthesis of hetero- and homodimeric disulfides.

  2. 45 CFR 286.260 - May Tribes use sampling and electronic filing?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... method” means a probability sampling method in which every sampling unit has a known, non-zero chance to... quarterly reports electronically, based on format specifications that we will provide. Tribes who do not...

  3. Enhanced sampling simulations to construct free-energy landscape of protein-partner substrate interaction.

    PubMed

    Ikebe, Jinzen; Umezawa, Koji; Higo, Junichi

    2016-03-01

    Molecular dynamics (MD) simulations using all-atom and explicit solvent models provide valuable information on the detailed behavior of protein-partner substrate binding at the atomic level. As the power of computational resources increases, MD simulations are being used more widely and easily. However, it is still difficult to investigate the thermodynamic properties of protein-partner substrate binding and protein folding with conventional MD simulations. Enhanced sampling methods have been developed to sample conformations that reflect equilibrium conditions more efficiently than conventional MD simulations, thereby allowing the construction of accurate free-energy landscapes. In this review, we discuss these enhanced sampling methods using a series of case-by-case examples. In particular, we review enhanced sampling methods employing trivial trajectory parallelization, virtual-system coupled multicanonical MD, and adaptive lambda square dynamics, which have recently been developed on the basis of the existing multicanonical MD simulation method. Their applications are reviewed with an emphasis on practical implementation. In our concluding remarks we explore extensions of the enhanced sampling methods that may allow for even more efficient sampling.

  4. An opportunity cost approach to sample size calculation in cost-effectiveness analysis.

    PubMed

    Gafni, A; Walter, S D; Birch, S; Sendi, P

    2008-01-01

    The inclusion of economic evaluations as part of clinical trials has led to concerns about the adequacy of trial sample sizes to support such analyses. The analytical tool of cost-effectiveness analysis is the incremental cost-effectiveness ratio (ICER), which is compared with a threshold value (lambda) to determine the efficiency of a health-care intervention. Accordingly, many of the methods suggested for calculating the sample size requirements of the economic component of clinical trials are based on the properties of the ICER. However, use of the ICER and a threshold value as a basis for determining efficiency has been shown to be inconsistent with the economic concept of opportunity cost. As a result, the validity of ICER-based approaches to sample size calculations can be challenged. Alternative methods for determining improvements in efficiency that do not depend upon ICER values have been presented in the literature. In this paper, we develop an opportunity cost approach to calculating sample size for economic evaluations alongside clinical trials, and illustrate the approach using a numerical example. We compare the sample size requirement of the opportunity cost method with that of the ICER threshold method. In general, either method may yield the larger required sample size. However, the opportunity cost approach, although simple to use, has additional data requirements. We believe that the additional data requirements represent a small price to pay for being able to perform an analysis consistent with both the concept of opportunity cost and the problem faced by decision makers. Copyright (c) 2007 John Wiley & Sons, Ltd.

  5. European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.

    PubMed

    Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario

    2014-08-01

    The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, so an alternative methodology is needed for detection of this pathogen, particularly in food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation study of a non-proprietary Real-Time PCR-based method that can deliver final results the day after sample analysis. It is based on an ISO-compatible enrichment coupled to an easy and inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, and pork meat was selected as the food model. The limit of detection observed was down to 10 CFU per 25 g of sample, with excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results of the Real-Time PCR-based method were compared to those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method represents an excellent alternative to the ISO standard. It shows equally solid performance while dramatically shortening the analytical process, and it can be easily implemented routinely by Competent Authority and Food Industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. LETTER TO THE EDITOR: Free-response operator characteristic models for visual search

    NASA Astrophysics Data System (ADS)

    Hutchinson, T. P.

    2007-05-01

    Computed tomography of diffraction enhanced imaging (DEI-CT) is a novel x-ray phase-contrast computed tomography technique applied to inspect weakly absorbing low-Z samples. Refraction-angle images, extracted from a series of raw DEI images measured at different positions on the rocking curve of the analyser, can be regarded as projections for DEI-CT. From them, the distribution of the refractive index decrement in the sample can be reconstructed according to the principles of CT. How to combine extraction methods and reconstruction algorithms to obtain the most accurate reconstructed results is investigated in detail in this paper. Two comparisons, one among different extraction methods and one between 'two-step' algorithms and the Hilbert filtered backprojection (HFBP) algorithm, lead to the conclusion that the HFBP algorithm based on the maximum refraction-angle (MRA) method may be the best combination at present. Though all current extraction methods, including the MRA method, are approximate and cannot recover very large refraction-angle values, the HFBP algorithm based on the MRA method is able to provide quite acceptable estimates of the distribution of the refractive index decrement of the sample. The conclusion is supported by experimental results from the Beijing Synchrotron Radiation Facility.

  7. Note: A simple image processing based fiducial auto-alignment method for sample registration.

    PubMed

    Robertson, Wesley D; Porto, Lucas R; Ip, Candice J X; Nantel, Megan K T; Tellkamp, Friedjof; Lu, Yinfei; Miller, R J Dwayne

    2015-08-01

    A simple method for the location and auto-alignment of sample fiducials for sample registration using widely available MATLAB/LabVIEW software is demonstrated. The method is robust, easily implemented, and applicable to a wide variety of experiment types for improved reproducibility and increased setup speed. The software uses image processing to locate and measure the diameter and center point of circular fiducials for distance self-calibration and iterative alignment and can be used with most imaging systems. The method is demonstrated to be fast and reliable in locating and aligning sample fiducials, provided here by a nanofabricated array, with accuracy within the optical resolution of the imaging system. The software was further demonstrated to register, load, and sample the dynamically wetted array.

  8. Sample Training Based Wildfire Segmentation by 2D Histogram θ-Division with Minimum Error

    PubMed Central

    Dong, Erqian; Sun, Mingui; Jia, Wenyan; Zhang, Dengyi; Yuan, Zhiyong

    2013-01-01

    A novel wildfire segmentation algorithm is proposed with the help of sample-training-based 2D histogram θ-division and minimum error. Based on the minimum error principle and the 2D color histogram, θ-division methods were presented recently, but the application of prior knowledge to them has not been explored. For the specific problem of wildfire segmentation, we collect sample images with manually labeled fire pixels. We then define the probability function of erroneous division to evaluate θ-division segmentations, and the optimal angle θ is determined by sample training. Performance in different color channels is compared, and the most suitable channel is selected. To further improve the accuracy, a combination approach is presented that couples θ-division with other segmentation methods such as GMM. Our approach is tested on real images, and the experiments demonstrate its effectiveness for wildfire segmentation. PMID:23878526

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Fangyan; Zhang, Song; Chung Wong, Pak

    Effectively visualizing large graphs and capturing the statistical properties are two challenging tasks. To aid in these two tasks, many sampling approaches for graph simplification have been proposed, falling into three categories: node sampling, edge sampling, and traversal-based sampling. It is still unknown which approach is the best. We evaluate commonly used graph sampling methods through a combined visual and statistical comparison of graphs sampled at various rates. We conduct our evaluation on three graph models: random graphs, small-world graphs, and scale-free graphs. Initial results indicate that the effectiveness of a sampling method is dependent on the graph model, the size of the graph, and the desired statistical property. This benchmark study can be used as a guideline in choosing the appropriate method for a particular graph sampling task, and the results presented can be incorporated into graph visualization and analysis tools.
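
    Of the three sampling categories named, the first two can be sketched directly (traversal-based sampling, e.g. random-walk sampling, is omitted for brevity):

```python
import random

def node_sample(edges, nodes, rate, seed=0):
    """Node sampling: keep a random subset of nodes plus the edges
    induced between them."""
    rng = random.Random(seed)
    kept = set(rng.sample(sorted(nodes), int(rate * len(nodes))))
    return kept, [(u, v) for u, v in edges if u in kept and v in kept]

def edge_sample(edges, rate, seed=0):
    """Edge sampling: keep a random subset of edges plus their endpoints."""
    rng = random.Random(seed)
    kept_edges = rng.sample(edges, int(rate * len(edges)))
    kept_nodes = {n for e in kept_edges for n in e}
    return kept_nodes, kept_edges
```

    The two strategies preserve different statistics at the same rate: node sampling controls the number of nodes but may drop many edges, while edge sampling controls the edge count but tends to keep high-degree nodes, which is one reason the best choice depends on the graph model and the property of interest.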

  10. Evaluating Ethanol-based Sample Preservation to Facilitate Use of DNA Barcoding in Routine Freshwater Biomonitoring Programs Using Benthic Macroinvertebrates

    EPA Science Inventory

    Molecular methods, such as DNA barcoding, have the potential to enhance biomonitoring programs worldwide. Altering routinely used sample preservation methods to protect DNA from degradation may pose a potential impediment to application of DNA barcoding and metagenomics for biom...

  11. Migration monitoring with automated technology

    Treesearch

    Rhonda L. Millikin

    2005-01-01

    Automated technology can supplement ground-based methods of migration monitoring by providing: (1) unbiased and automated sampling; (2) independent validation of current methods; (3) a larger sample area for landscape-level analysis of habitat selection for stopover, and (4) an opportunity to study flight behavior. In particular, radar-acoustic sensor fusion can...

  12. Differential expression analysis for RNAseq using Poisson mixed models.

    PubMed

    Sun, Shiquan; Hood, Michelle; Scott, Laura; Peng, Qinke; Mukherjee, Sayan; Tung, Jenny; Zhou, Xiang

    2017-06-20

    Identifying differentially expressed (DE) genes from RNA sequencing (RNAseq) studies is among the most common analyses in genomics. However, RNAseq DE analysis presents several statistical and computational challenges, including over-dispersed read counts and, in some settings, sample non-independence. Previous count-based methods rely on simple hierarchical Poisson models (e.g. negative binomial) to model independent over-dispersion, but do not account for sample non-independence due to relatedness, population structure and/or hidden confounders. Here, we present a Poisson mixed model with two random-effects terms that account for both independent over-dispersion and sample non-independence. We also develop a scalable sampling-based inference algorithm using a latent variable representation of the Poisson distribution. With simulations, we show that our method properly controls for type I error and is generally more powerful than other widely used approaches, except in small samples (n < 15) with other unfavorable properties (e.g. small effect sizes). We also apply our method to three real datasets that contain related individuals, population stratification or hidden confounders. Our results show that our method increases power in all three datasets compared to other approaches, though the power gain is smallest in the smallest sample (n = 6). Our method is implemented in MACAU, freely available at www.xzlab.org/software.html. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. Method for charging a hydrogen getter

    DOEpatents

    Tracy, C.E.; Keyser, M.A.; Benson, D.K.

    1998-09-15

    A method for charging a sample of either a permanent or reversible getter material with a high concentration of hydrogen while maintaining a base pressure below 10⁻⁴ torr at room temperature involves placing the sample of hydrogen getter material in a chamber, activating the sample of hydrogen getter material, overcharging the sample of getter material through conventional charging techniques to a high concentration of hydrogen, and then subjecting the sample of getter material to a low temperature vacuum bake-out process. Application of the method results in a reversible hydrogen getter which is highly charged to maximum capacities of hydrogen and which concurrently exhibits minimum hydrogen vapor pressures at room temperatures. 9 figs.

  14. Material characterization in partially filled waveguides using inverse scattering and multiple sample orientations

    NASA Astrophysics Data System (ADS)

    Sjöberg, Daniel; Larsson, Christer

    2015-06-01

    We present a method aimed at reducing uncertainties and instabilities when characterizing materials in waveguide setups. The method is based on measuring the S parameters for three different orientations of a rectangular sample block in a rectangular waveguide. The corresponding geometries are modeled in a commercial full-wave simulation program, taking any material parameters as input. The material parameters of the sample are found by minimizing the squared distance between measured and calculated S parameters. The information added by the different sample orientations is quantified using the Cramér-Rao lower bound. The flexibility of the method allows the determination of material parameters of an arbitrarily shaped sample that fits in the waveguide.
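
    The fitting step, minimizing the squared distance between measured and simulated responses over candidate material parameters, can be sketched with a toy scalar forward model. The model below is a made-up smooth function standing in for the commercial full-wave solver, and the "measurement" is generated from it:

```python
def toy_forward_model(eps_r, freqs):
    """Stand-in for the full-wave solver: a smooth scalar response as a
    function of relative permittivity (NOT a real waveguide model)."""
    return [1.0 / (1.0 + eps_r * f) for f in freqs]

def fit_permittivity(measured, freqs, grid):
    """Pick the grid value whose modeled response is closest, in the
    least-squares sense, to the measured response."""
    def cost(eps_r):
        model = toy_forward_model(eps_r, freqs)
        return sum((m - s) ** 2 for m, s in zip(measured, model))
    return min(grid, key=cost)

# Hypothetical "measurement" generated at eps_r = 2.2, then recovered:
freqs = [0.1 * i for i in range(1, 11)]
measured = toy_forward_model(2.2, freqs)
best_eps = fit_permittivity(measured, freqs,
                            grid=[1.0 + 0.05 * i for i in range(61)])
```

    In the paper the minimization runs over S parameters from three sample orientations at once, which is what constrains the parameters more tightly; the sketch keeps a single response for clarity and uses a grid search in place of a gradient-based minimizer.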

  15. Determination of maximal amount of minor gases adsorbed in a shale sample by headspace gas chromatography.

    PubMed

    Zhang, Chun-Yun; Hu, Hui-Chao; Chai, Xin-Sheng; Pan, Lei; Xiao, Xian-Ming

    2014-02-07

    In this paper, we present a novel method for determining the maximal amount of ethane, a minor gas species, adsorbed in a shale sample. The method is based on the time-dependent release of ethane from shale samples measured by headspace gas chromatography (HS-GC). The study includes a mathematical model for fitting the experimental data, calculating the maximal amount of gas adsorbed, and predicting results at other temperatures. The method is a more efficient alternative to the isothermal adsorption method that is in widespread use today. Copyright © 2013 Elsevier B.V. All rights reserved.
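
    The time-dependent release measurements described above can be fitted with a simple kinetic model. The first-order form below is an assumption for illustration, since the abstract does not give the paper's exact model:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical first-order release model: the amount released by time t
# approaches the maximal adsorbed amount A at rate constant k.
def release(t, A, k):
    return A * (1.0 - np.exp(-k * t))

# Synthetic headspace measurements (time in h, amount in arbitrary units)
t = np.linspace(0, 24, 13)
rng = np.random.default_rng(1)
y = release(t, 5.0, 0.3) + rng.normal(0, 0.05, t.size)

(A_hat, k_hat), _ = curve_fit(release, t, y, p0=(1.0, 0.1))
print(A_hat, k_hat)   # recovers A near 5 (maximal amount) and k near 0.3
```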

  16. A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.

    PubMed

    Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa

    2016-05-17

    Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are true matches), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling based method to estimate both precision and recall following record linkage. In the sampling based method, record-pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record-pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using Fleiss' kappa statistic was 0.601). This method offers a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large project linkages, where the number of record-pairs produced may be very large, often running into millions.
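
    The estimator described above can be sketched in a few lines: sample record-pairs from each score stratum (including those below the cut-off), clerically review them, and scale the observed match fractions up to the full stratum counts. All numbers below are made up for illustration:

```python
# Each stratum: total record-pairs in that score range, whether its pairs
# are accepted as links, and the clerical-review results for a sample.
strata = [
    {"total": 50_000, "accepted": True,  "sampled": 200, "matches": 196},
    {"total": 20_000, "accepted": True,  "sampled": 200, "matches": 150},
    {"total": 80_000, "accepted": False, "sampled": 200, "matches": 10},
]

tp = fp = fn = 0.0
for s in strata:
    est_matches = s["matches"] / s["sampled"] * s["total"]
    if s["accepted"]:
        tp += est_matches                  # estimated true positives
        fp += s["total"] - est_matches     # estimated false positives
    else:
        fn += est_matches                  # estimated missed matches

precision = tp / (tp + fp)
recall = tp / (tp + fn)
print(round(precision, 3), round(recall, 3))   # 0.914 0.941
```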

  17. Sequential injection titration method using second-order signals: determination of acidity in plant oils and biodiesel samples.

    PubMed

    del Río, Vanessa; Larrechi, M Soledad; Callao, M Pilar

    2010-06-15

    A new concept of flow titration is proposed and demonstrated for the determination of total acidity in plant oils and biodiesel. We use sequential injection analysis (SIA) with a diode array spectrophotometric detector linked to chemometric tools such as multivariate curve resolution-alternating least squares (MCR-ALS). This system is based on the evolution of the basic species of an acid-base indicator, alizarine, when it comes into contact with a sample that contains free fatty acids. The gradual pH change in the reactor coil due to diffusion and reaction phenomena allows the sequential appearance of both species of the indicator in the detector coil, recording a data matrix for each sample. The SIA-MCR-ALS method helps to reduce the amounts of sample, reagents and time consumed. Each determination consumes 0.413 ml of sample, 0.250 ml of indicator and 3 ml of carrier (ethanol), and generates 3.333 ml of waste. The analysis frequency is high (12 samples h⁻¹ including all steps, i.e., cleaning, preparation and analysis). The reagents are in common laboratory use, and it is not necessary to know their concentrations exactly. The method was applied to determine acidity in plant oil and biodiesel samples. Results obtained by the proposed method compare well with those obtained by the official European Community method, which is time consuming and uses large amounts of organic solvents.

  18. Protocol Improvements for Low Concentration DNA-Based Bioaerosol Sampling and Analysis

    PubMed Central

    Ng, Chun Kiat; Miller, Dana; Cao, Bin

    2015-01-01

    Introduction As bioaerosol research attracts increasing attention, there is a need for additional efforts that focus on method development to deal with different environmental samples. Bioaerosol environmental samples typically have very low biomass concentrations in the air, which often leaves researchers with limited options in choosing the downstream analysis steps, especially when culture-independent methods are intended. Objectives This study investigates the impacts of three important factors that can influence the performance of culture-independent DNA-based analysis of the bioaerosol environmental samples examined in this study. The factors are: 1) enhanced high temperature sonication during DNA extraction; 2) effect of sampling duration on DNA recoverability; and 3) an alternative method for concentrating composite samples. In this study, DNA extracted from samples was analysed using the Qubit fluorometer (for direct total DNA measurement) and quantitative polymerase chain reaction (qPCR). Results and Findings The findings suggest that additional lysis from high temperature sonication is crucial: DNA yields from both high and low biomass samples increased up to 600% when the protocol included 30-min sonication at 65°C. Long air sampling durations on filter media were shown to have a negative impact on DNA recoverability, with up to 98% of DNA lost over a 20-h sampling period. Pooling DNA from separate samples during extraction was shown to be feasible, with margins of error below 30%. PMID:26619279

  19. Generating virtual training samples for sparse representation of face images and face recognition

    NASA Astrophysics Data System (ADS)

    Du, Yong; Wang, Yu

    2016-03-01

    There are many challenges in face recognition. In real-world scenes, images of the same face vary with changing illumination, different expressions and poses, multiform ornaments, or even altered mental status. Limited available training samples cannot sufficiently convey these possible changes in the training phase, and this has become one of the restrictions on improving face recognition accuracy. In this article, we view the multiplication of two images of the face as a virtual face image to expand the training set and devise a representation-based method to perform face recognition. The generated virtual samples reflect possible appearance and pose variations of the face. By multiplying a training sample with another sample from the same subject, we can strengthen the facial contour feature and greatly suppress the noise. Thus, more essential facial information is retained. Uncertainty in the training data is also reduced as the number of training samples increases, which benefits the training phase. The devised representation-based classifier uses both the original and newly generated samples to perform the classification. In the classification phase, we first determine the K nearest training samples for the current test sample by calculating the Euclidean distances between the test sample and the training samples. Then, a linear combination of these selected training samples is used to represent the test sample, and the representation result is used to classify the test sample. The experimental results show that the proposed method outperforms some state-of-the-art face recognition methods.
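
    A minimal numpy sketch of the two ideas above (virtual samples from the element-wise product of two same-subject images, and classification by a least-squares combination of the K nearest training samples) might look as follows; the toy "face" vectors and all parameter choices are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy "face" vectors: 2 subjects, 3 training images each (dimension 50)
d, per_class = 50, 3
prototypes = rng.normal(0, 1, (2, d))
X, y = [], []
for c in range(2):
    for _ in range(per_class):
        X.append(prototypes[c] + rng.normal(0, 0.1, d))
        y.append(c)

# Virtual samples: element-wise product of two images of the same subject
# (normalized), mimicking the training-set expansion described above.
for c in range(2):
    i, j = [k for k, lbl in enumerate(y) if lbl == c][:2]
    v = X[i] * X[j]
    X.append(v / np.linalg.norm(v))
    y.append(c)

X, y = np.array(X), np.array(y)

def classify(test, K=4):
    # 1) K nearest training samples by Euclidean distance
    near = np.argsort(np.linalg.norm(X - test, axis=1))[:K]
    # 2) represent the test sample as a linear combination of them
    coef, *_ = np.linalg.lstsq(X[near].T, test, rcond=None)
    # 3) assign to the class whose portion of the combination
    #    reconstructs the test sample with the smallest residual
    errs = {}
    for c in set(y[near].tolist()):
        recon = X[near].T @ (coef * (y[near] == c))
        errs[c] = np.linalg.norm(test - recon)
    return min(errs, key=errs.get)

test_img = prototypes[1] + rng.normal(0, 0.1, d)
print(classify(test_img))   # a new image of subject 1
```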

  20. Challenges in projecting clustering results across gene expression-profiling datasets.

    PubMed

    Lusa, Lara; McShane, Lisa M; Reid, James F; De Cecco, Loris; Ambrogi, Federico; Biganzoli, Elia; Gariboldi, Manuela; Pierotti, Marco A

    2007-11-21

    Gene expression microarray studies for several types of cancer have been reported to identify previously unknown subtypes of tumors. For breast cancer, a molecular classification consisting of five subtypes based on gene expression microarray data has been proposed. These subtypes have been reported to exist across several breast cancer microarray studies, and they have demonstrated some association with clinical outcome. A classification rule based on the method of centroids has been proposed for identifying the subtypes in new collections of breast cancer samples; the method is based on the similarity of the new profiles to the mean expression profile of the previously identified subtypes. Previously identified centroids of five breast cancer subtypes were used to assign 99 breast cancer samples, including a subset of 65 estrogen receptor-positive (ER+) samples, to five breast cancer subtypes based on microarray data for the samples. The effect of mean centering the genes (i.e., transforming the expression of each gene so that its mean expression is equal to 0) on subtype assignment by method of centroids was assessed. Further studies of the effect of mean centering and of class prevalence in the test set on the accuracy of method of centroids classifications of ER status were carried out using training and test sets for which ER status had been independently determined by ligand-binding assay and for which the proportion of ER+ and ER- samples were systematically varied. When all 99 samples were considered, mean centering before application of the method of centroids appeared to be helpful for correctly assigning samples to subtypes, as evidenced by the expression of genes that had previously been used as markers to identify the subtypes. However, when only the 65 ER+ samples were considered for classification, many samples appeared to be misclassified, as evidenced by an unexpected distribution of ER+ samples among the resultant subtypes. 
When genes were mean centered before classification of samples for ER status, the accuracy of the ER subgroup assignments was highly dependent on the proportion of ER+ samples in the test set; this effect of subtype prevalence was not seen when gene expression data were not mean centered. Simple corrections such as mean centering of genes aimed at microarray platform or batch effect correction can have undesirable consequences because patient population effects can easily be confused with these assay-related effects. Careful thought should be given to the comparability of the patient populations before attempting to force data comparability for purposes of assigning subtypes to independent subjects.
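
    The prevalence effect described above can be reproduced with a one-gene toy example; the centroid values and sample distributions below are illustrative, not taken from the study:

```python
import numpy as np

# One marker gene and previously derived subtype centroids (illustrative)
centroid = {"ER+": 1.0, "ER-": -1.0}

def assign(samples):
    # Nearest-centroid assignment ("method of centroids") on one gene
    return ["ER+" if abs(s - centroid["ER+"]) < abs(s - centroid["ER-"])
            else "ER-" for s in samples]

rng = np.random.default_rng(3)
test = np.concatenate([rng.normal(1.0, 0.2, 90),    # ER+-dominated test set
                       rng.normal(-1.0, 0.2, 10)])  # few ER- samples

raw = assign(test)
centered = assign(test - test.mean())   # gene mean-centered within test set

print(raw.count("ER-"))       # 10: matches the true ER- count
print(centered.count("ER-"))  # inflated: centering shifted ER+ samples down
```

    Because the test set is dominated by ER+ samples, its gene mean sits near the ER+ centroid; subtracting it pushes many ER+ samples across the decision boundary, which is the prevalence dependence the abstract reports.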

  1. Optimal model-based sensorless adaptive optics for epifluorescence microscopy.

    PubMed

    Pozzi, Paolo; Soloviev, Oleg; Wilding, Dean; Vdovin, Gleb; Verhaegen, Michel

    2018-01-01

    We report on a universal sample-independent sensorless adaptive optics method, based on modal optimization of the second moment of the fluorescence emission from a point-like excitation. Our method employs a sample-independent precalibration, performed only once for the particular system, to establish the direct relation between the image quality and the aberration. The method is potentially applicable to any form of microscopy with epifluorescence detection, including the practically important case of incoherent fluorescence emission from a three dimensional object, through minor hardware modifications. We have applied the technique successfully to a widefield epifluorescence microscope and to a multiaperture confocal microscope.

  2. Validation of a standardized extraction method for formalin-fixed paraffin-embedded tissue samples.

    PubMed

    Lagheden, Camilla; Eklund, Carina; Kleppe, Sara Nordqvist; Unger, Elizabeth R; Dillner, Joakim; Sundström, Karin

    2016-07-01

    Formalin-fixed paraffin-embedded (FFPE) samples can be DNA-extracted and used for human papillomavirus (HPV) genotyping. The xylene-based gold standard for extracting FFPE samples is laborious, suboptimal and involves health hazards for the personnel involved. To compare the standard xylene extraction method with a xylene-free method used in an HPV LabNet Global Reference Laboratory at the Centers for Disease Control (CDC), based on a commercial method with an extra heating step. Fifty FFPE samples were randomly selected from a national audit of all cervical cancer cases diagnosed in Sweden during 10 years. For each case-block, a blank-block was sectioned as a control for contamination. For xylene extraction, the standard WHO Laboratory Manual protocol was used. For the CDC method, the manufacturer's protocol was followed except for an extra heating step, 120°C for 20 min. Samples were extracted and tested in parallel with β-globin real-time PCR, HPV16 real-time PCR and HPV typing using modified general primers (MGP)-PCR and Luminex assays. For a valid result the blank-block had to be β-globin-negative in all tests and the case-block positive for β-globin. Overall, detection was improved with the heating method and the proportion of HPV-positive samples increased from 70% to 86% (p = 0.039). For all samples where HPV type concordance could be evaluated, there was 100% type concordance. A xylene-free and robust extraction method for HPV-DNA typing in FFPE material is currently in great demand. Our proposed standardized protocol appears to be generally useful. Copyright © 2016. Published by Elsevier B.V.

  3. Information fusion methods based on physical laws.

    PubMed

    Rao, Nageswara S V; Reister, David B; Barhen, Jacob

    2005-01-01

    We consider systems whose parameters satisfy certain easily computable physical laws. Each parameter is directly measured by a number of sensors, or estimated using measurements, or both. The measurement process may introduce both systematic and random errors which may then propagate into the estimates. Furthermore, the actual parameter values are not known since every parameter is measured or estimated, which makes the existing sample-based fusion methods inapplicable. We propose a fusion method for combining the measurements and estimators based on the least violation of physical laws that relate the parameters. Under fairly general smoothness and nonsmoothness conditions on the physical laws, we show the asymptotic convergence of our method and also derive distribution-free performance bounds based on finite samples. For suitable choices of the fuser classes, we show that for each parameter the fused estimate is probabilistically at least as good as its best measurement as well as best estimate. We illustrate the effectiveness of this method for a practical problem of fusing well-log data in methane hydrate exploration.
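
    As a toy instance of fusing by least violation of a physical law, suppose three sensors measure x, y and z, which must satisfy z = x + y; the fused values are the inverse-variance-weighted projection of the measurements onto the constraint. The numbers are hypothetical, and this is a sketch of the general idea, not the paper's fuser:

```python
import numpy as np

m = np.array([2.1, 3.2, 5.0])   # measured x, y, z (z should equal x + y)
w = np.array([1.0, 1.0, 4.0])   # inverse variances: z is the best sensor

# Law as a linear constraint a @ v = 0 with a = [1, 1, -1]; minimizing
# sum_i w_i * (v_i - m_i)^2 subject to the law has the closed form
#   v = m - (a / w) * (a @ m) / (a @ (a / w))
a = np.array([1.0, 1.0, -1.0])
v = m - (a / w) * (a @ m) / (a @ (a / w))

print(v)        # fused estimates; the precise z sensor moves the least
print(a @ v)    # ~0: the physical law now holds
```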

  4. A practical method for extending the biuret assay to protein determination of corn-based products.

    PubMed

    Liu, Zelong; Pan, Junhui

    2017-06-01

    A modified biuret method suitable for protein determination of corn-based products was developed by introducing a combination of an alkaline reagent with sodium dodecyl sulfate (reagent A) and heat treatments. The method was tested on seven corn-based samples. The results showed mostly good agreement (P > 0.05) with the Kjeldahl values. The proposed method was found to improve the accuracy of zein content prediction using bovine serum albumin as the standard. Reagent A and the sample treatment were shown to effectively improve protein solubilization for thermally-dried corn-based products, e.g. corn gluten meal. The absorbance was stable for at least 1 h. Moreover, the whole measurement of protein content needs only 15-20 min more than the traditional biuret assay, and can be performed in batches. The findings suggest that the proposed method could be a timesaving alternative for routine protein analyses in corn processing factories. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Obtaining Self-Samples to Diagnose Curable Sexually Transmitted Infections: A Systematic Review of Patients’ Experiences

    PubMed Central

    Paudyal, Priyamvada; Llewellyn, Carrie; Lau, Jason; Mahmud, Mohammad; Smith, Helen

    2015-01-01

    Background Routine screening is key to sexually transmitted infection (STI) prevention and control. Previous studies suggest that clinic-based screening programmes capture only a small proportion of people with STIs. Self-sampling using non- or minimally invasive techniques may be beneficial for those reluctant to actively engage with conventional sampling methods. We systematically reviewed studies of patients’ experiences of obtaining self-samples to diagnose curable STIs. Methods We conducted an electronic search of MEDLINE, EMBASE, CINAHL, PsychINFO, BNI, and the Cochrane Database of Systematic Reviews to identify relevant articles published in English between January 1980 and March 2014. Studies were included if participants self-sampled for the diagnosis of a curable STI and participants’ opinions of their experience, acceptability, preferences, or willingness to self-sample had been specifically sought. Results The initial search yielded 558 references. Of these, 45 studies met the inclusion criteria. Thirty-six studies assessed patients’ acceptability and experiences of self-sampling. Pooled results from these studies show that self-sampling is a highly acceptable method, with 85% of patients reporting the method to be well received and acceptable. Twenty-eight studies reported on ease of self-sampling; the majority of patients (88%) in these studies found self-sampling an “easy” procedure. Self-sampling was favoured over clinician sampling, and home sampling was preferred to clinic-based sampling. Female and older participants were more accepting of self-sampling. Only a small minority of participants (13%) reported pain during self-sampling. Participants were willing to undergo self-sampling and to recommend it to others. Privacy and safety were the most common concerns. Conclusion Self-sampling for diagnostic testing is well accepted, with the majority reporting a positive experience and a willingness to use it again. Standardization of self-sampling procedures and rigorous validation of outcome measurement will allow better comparability across studies. Future studies should conduct rigorous economic evaluations of self-sampling to inform policy development for the management of STIs. PMID:25909508

  6. Micro-sampling method based on high-resolution continuum source graphite furnace atomic absorption spectrometry for calcium determination in blood and mitochondrial suspensions.

    PubMed

    Gómez-Nieto, Beatriz; Gismera, Mª Jesús; Sevilla, Mª Teresa; Satrústegui, Jorgina; Procopio, Jesús R

    2017-08-01

    A straightforward micro-sampling method based on high-resolution continuum source atomic absorption spectrometry (HR-CS AAS) was developed to determine extracellular and intracellular Ca in samples of interest in clinical and biomedical analysis. Solid sampling platforms were used to introduce the micro-samples into the graphite furnace atomizer. The secondary absorption line for Ca, located at 239.856 nm, was selected to carry out the measurements. Experimental parameters such as the pyrolysis and atomization temperatures and the amount of sample introduced for the measurements were optimized. Calibration was performed using aqueous standards, and measurement at the wings of the absorption lines was employed to expand the linear response range. The limit of detection was 0.02 mg L⁻¹ Ca (0.39 ng Ca) and the upper limit of the linear range was increased to 8.0 mg L⁻¹ Ca (160 ng Ca). The proposed method was used to determine Ca in mitochondrial suspensions and whole blood samples with successful results. Adequate recoveries (within 91-107%) were obtained in the tests performed for validation purposes. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Delivery of femtolitre droplets using surface acoustic wave based atomisation for cryo-EM grid preparation.

    PubMed

    Ashtiani, Dariush; Venugopal, Hari; Belousoff, Matthew; Spicer, Bradley; Mak, Johnson; Neild, Adrian; de Marco, Alex

    2018-04-06

    Cryo-Electron Microscopy (cryo-EM) has become an invaluable tool for structural biology. Over the past decade, the advent of direct electron detectors and automated data acquisition has established cryo-EM as a central method in structural biology. However, challenges remain in the reliable and efficient preparation of samples in a manner which is compatible with high time resolution. The delivery of sample onto the grid is recognized as a critical step in the workflow as it is a source of variability and loss of material due to the blotting which is usually required. Here, we present a method for sample delivery and plunge freezing based on the use of Surface Acoustic Waves to deploy 6-8 µm droplets to the EM grid. This method minimises the sample dead volume and ensures vitrification within 52.6 ms from the moment the sample leaves the microfluidics chip. We demonstrate a working protocol to minimize the atomised volume and apply it to plunge freeze three different samples and provide proof that no damage occurs due to the interaction between the sample and the acoustic waves. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Brief communication: the relation between standard error of the estimate and sample size of histomorphometric aging methods.

    PubMed

    Hennig, Cheryl; Cooper, David

    2011-08-01

    Histomorphometric aging methods report varying degrees of precision, measured through the Standard Error of the Estimate (SEE). These techniques have been developed from variable sample sizes (n), and the impact of n on reported aging precision has not been rigorously examined in the anthropological literature. This brief communication explores the relation between n and SEE through a review of the literature (abstracts, articles, book chapters, theses, and dissertations), predictions based upon sampling theory, and a simulation. Published SEE values for age prediction, derived from 40 studies, range from 1.51 to 16.48 years (mean 8.63; sd 3.81 years). In general, these values are widely distributed for smaller samples and the distribution narrows as n increases, a pattern expected from sampling theory. For the two studies that have samples in excess of 200 individuals, the SEE values are very similar (10.08 and 11.10 years) with a mean of 10.59 years. Assuming this mean value is a 'true' characterization of the error at the population level, the 95% confidence intervals for SEE values from samples of 10, 50, and 150 individuals are on the order of ± 4.2, 1.7, and 1.0 years, respectively. While numerous sources of variation potentially affect the precision of different methods, the impact of sample size cannot be overlooked. The uncertainty associated with SEE values derived from smaller samples complicates the comparison of approaches based upon different methodology and/or skeletal elements. Meaningful comparisons require larger samples than have frequently been used and should ideally be based upon standardized samples. Copyright © 2011 Wiley-Liss, Inc.
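
    The widening of reported SEE values at small n can be reproduced by simulation. Assuming a 'true' population SEE of 10.59 years as above, and treating a study's reported SEE as the sample standard deviation of its n residuals, the spread of estimates narrows roughly as 1/sqrt(n):

```python
import numpy as np

rng = np.random.default_rng(4)
true_see = 10.59   # assumed 'true' population-level SEE, in years

def estimated_see(n):
    # A study's reported SEE behaves like the sample standard deviation
    # of n residuals drawn from the population residual distribution.
    return rng.normal(0, true_see, n).std(ddof=1)

results = {}
for n in (10, 50, 150):
    sees = np.array([estimated_see(n) for _ in range(10_000)])
    results[n] = 1.96 * sees.std()   # ~95% half-width of reported SEEs
    print(n, round(results[n], 2))
```

    The half-widths shrink monotonically with n, the pattern the review observes; the exact figures need not match the abstract's ±4.2/1.7/1.0 years, which rest on the study's own assumptions.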

  9. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis.

    PubMed

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-12-13

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performances across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast method consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study can serve as useful guidance for the selection of suitable normalization methods in analyzing LC/MS based metabolomics data.
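
    As an example of one of the best performers named above, a minimal PQN (probabilistic quotient normalization) sketch in numpy: each sample is divided by the median of its feature-wise quotients against a reference spectrum (here the median spectrum). The toy data are illustrative:

```python
import numpy as np

def pqn(X):
    """Probabilistic quotient normalization of a samples-by-features matrix."""
    X = np.asarray(X, dtype=float)
    reference = np.median(X, axis=0)           # reference spectrum
    quotients = X / reference                  # feature-wise quotients
    dilution = np.median(quotients, axis=1)    # most probable dilution factor
    return X / dilution[:, None]

# Row 2 is row 0 measured at a 2x dilution; PQN removes the factor.
X = np.array([[1.0, 2.0, 3.0, 4.0],
              [1.1, 2.1, 2.9, 4.2],
              [2.0, 4.0, 6.0, 8.0]])
Xn = pqn(X)
print(Xn[0])   # rows 0 and 2 now coincide
print(Xn[2])
```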

  10. Performance Evaluation and Online Realization of Data-driven Normalization Methods Used in LC/MS based Untargeted Metabolomics Analysis

    PubMed Central

    Li, Bo; Tang, Jing; Yang, Qingxia; Cui, Xuejiao; Li, Shuang; Chen, Sijie; Cao, Quanxing; Xue, Weiwei; Chen, Na; Zhu, Feng

    2016-01-01

    In untargeted metabolomics analysis, several factors (e.g., unwanted experimental & biological variations and technical errors) may hamper the identification of differential metabolic features, which requires data-driven normalization approaches before feature selection. So far, ≥16 normalization methods have been widely applied for processing LC/MS based metabolomics data. However, the performance and the sample size dependence of those methods have not yet been exhaustively compared, and no online tool for comparatively and comprehensively evaluating the performance of all 16 normalization methods has been provided. In this study, a comprehensive comparison of these methods was conducted. As a result, the 16 methods were categorized into three groups based on their normalization performances across various sample sizes. The VSN, the Log Transformation and the PQN were identified as the methods with the best normalization performance, while the Contrast method consistently underperformed across all sub-datasets of different benchmark data. Moreover, an interactive web tool comprehensively evaluating the performance of the 16 methods specifically for normalizing LC/MS based metabolomics data was constructed and hosted at http://server.idrb.cqu.edu.cn/MetaPre/. In summary, this study can serve as useful guidance for the selection of suitable normalization methods in analyzing LC/MS based metabolomics data. PMID:27958387

  11. The clustering-based case-based reasoning for imbalanced business failure prediction: a hybrid approach through integrating unsupervised process with supervised process

    NASA Astrophysics Data System (ADS)

    Li, Hui; Yu, Jun-Ling; Yu, Le-An; Sun, Jie

    2014-05-01

    Case-based reasoning (CBR) is one of the main forecasting methods in business forecasting; it performs well in prediction and can give explanations for its results. In business failure prediction (BFP), the number of failed enterprises is relatively small compared with the number of non-failed ones, yet the loss is huge when an enterprise fails. It is therefore necessary to develop methods, trained on imbalanced samples, that forecast well for this small proportion of failed enterprises while maintaining high total accuracy. Commonly used methods, constructed on the assumption of balanced samples, do not predict minority samples well on imbalanced samples consisting of the minority/failed enterprises and the majority/non-failed ones. This article develops a new method called clustering-based CBR (CBCBR), which integrates clustering analysis, an unsupervised process, with CBR, a supervised process, to enhance the efficiency of retrieving information from both the minority and the majority in CBR. In CBCBR, case classes are first generated through hierarchical clustering of the stored experienced cases, and class centres are calculated by integrating the information of the cases in the same clustered class. When predicting the label of a target case, its nearest clustered case class is first retrieved by ranking the similarities between the target case and each clustered case class centre. Then, the nearest neighbours of the target case within the selected clustered case class are retrieved. Finally, the labels of the nearest experienced cases are used in prediction. In an empirical experiment with two imbalanced samples from China, the performance of CBCBR was compared with the classical CBR, a support vector machine, a logistic regression and a multivariate discriminant analysis. The results show that, compared with the other four methods, CBCBR performed significantly better in terms of sensitivity for identifying the minority samples while generating high total accuracy. The proposed approach makes CBR useful in imbalanced forecasting.
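
    The retrieval pipeline described above (cluster the stored cases, pick the nearest clustered class by centre similarity, then use nearest neighbours within it) can be sketched with scipy's hierarchical clustering; the two-dimensional toy case base and all parameter choices are assumptions for illustration:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(5)

# Imbalanced toy case base: 40 non-failed (label 0), 8 failed (label 1)
X = np.vstack([rng.normal(0, 1, (40, 2)), rng.normal(4, 1, (8, 2))])
y = np.array([0] * 40 + [1] * 8)

# 1) Hierarchical clustering of stored cases; a centre per clustered class
labels = fcluster(linkage(X, method="ward"), t=2, criterion="maxclust")
centres = {c: X[labels == c].mean(axis=0) for c in np.unique(labels)}

def predict(target, k=3):
    # 2) Retrieve the nearest clustered case class by centre distance
    c = min(centres, key=lambda cc: np.linalg.norm(target - centres[cc]))
    Xc, yc = X[labels == c], y[labels == c]
    # 3) k nearest neighbours inside that class; majority label
    near = np.argsort(np.linalg.norm(Xc - target, axis=1))[:k]
    return int(round(yc[near].mean()))

print(predict(np.array([4.2, 3.8])))   # near the failed cluster
print(predict(np.array([0.1, -0.2])))  # near the non-failed cluster
```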

  12. A Simple Joint Estimation Method of Residual Frequency Offset and Sampling Frequency Offset for DVB Systems

    NASA Astrophysics Data System (ADS)

    Kwon, Ki-Won; Cho, Yongsoo

    This letter presents a simple joint estimation method for the residual frequency offset (RFO) and the sampling frequency offset (SFO) in OFDM-based digital video broadcasting (DVB) systems. The proposed method selects a continual pilot (CP) subset from an unsymmetrically and non-uniformly distributed CP set to obtain an unbiased estimator. Simulation results show that the proposed method using a properly selected CP subset is unbiased and performs robustly.
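
    One common way to estimate the two offsets jointly (a simplified textbook scheme, not necessarily the letter's estimator) uses the inter-symbol phase rotation of the continual pilots: the RFO rotates every subcarrier equally, while the SFO adds a rotation proportional to the subcarrier index, so a least-squares line through pilot phase versus index separates the two. The pilot positions and offset values below are hypothetical:

```python
import numpy as np

# Simplified model: between consecutive OFDM symbols, pilot subcarrier k
# rotates by phi(k) = 2*pi*(Ns/N) * (rfo + k*sfo), so a linear fit over
# the continual pilots yields SFO from the slope and RFO from the intercept.
N, Ns = 2048, 2048 + 256                   # FFT size; symbol incl. guard
rfo_true, sfo_true = 0.01, 20e-6           # normalized RFO; 20 ppm SFO
pilots = np.array([-600, -450, -100, 48, 255, 531, 618])  # non-uniform CPs

rng = np.random.default_rng(6)
phase = 2 * np.pi * (Ns / N) * (rfo_true + pilots * sfo_true)
phase += rng.normal(0, 0.01, pilots.size)  # phase noise

slope, intercept = np.polyfit(pilots, phase, 1)
scale = 2 * np.pi * Ns / N
print(intercept / scale, slope / scale)    # close to rfo_true, sfo_true
```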

  13. Synchronization sampling method based on delta-sigma analog-digital converter for underwater towed array system.

    PubMed

    Jiang, Jia-Jia; Duan, Fa-Jie; Li, Yan-Chao; Hua, Xiang-Ning

    2014-03-01

    Synchronization sampling is very important in an underwater towed array system, where every acquisition node (AN) samples analog signals with its own analog-to-digital converter (ADC). In this paper, a simple and effective synchronization sampling method is proposed to ensure synchronized operation among the different ANs of the underwater towed array system. We first present a master-slave synchronization sampling model, and then design a high accuracy phase-locked loop to synchronize all delta-sigma ADCs to a reference clock. However, when the master-slave synchronization sampling model is used, both the time-delay (TD) of messages traveling along the wired transmission medium and the jitter of the clocks introduce synchronization sampling error (SSE). Therefore, a simple method is proposed to estimate and compensate for the TD of message transmission, and another effective method is presented to overcome the SSE caused by clock jitter. An experimental system with three ANs was set up, and the related experimental results verify the validity of the proposed synchronization sampling method.

  14. Synchronization sampling method based on delta-sigma analog-digital converter for underwater towed array system

    NASA Astrophysics Data System (ADS)

    Jiang, Jia-Jia; Duan, Fa-Jie; Li, Yan-Chao; Hua, Xiang-Ning

    2014-03-01

    Synchronization sampling is very important in an underwater towed array system, where every acquisition node (AN) samples analog signals with its own analog-to-digital converter (ADC). In this paper, a simple and effective synchronization sampling method is proposed to ensure synchronized operation among the different ANs of the underwater towed array system. We first present a master-slave synchronization sampling model, and then design a high accuracy phase-locked loop to synchronize all delta-sigma ADCs to a reference clock. However, when the master-slave synchronization sampling model is used, both the time-delay (TD) of messages traveling along the wired transmission medium and the jitter of the clocks introduce synchronization sampling error (SSE). Therefore, a simple method is proposed to estimate and compensate for the TD of message transmission, and another effective method is presented to overcome the SSE caused by clock jitter. An experimental system with three ANs was set up, and the related experimental results verify the validity of the proposed synchronization sampling method.

  15. Estimating population size with correlated sampling unit estimates

    Treesearch

    David C. Bowden; Gary C. White; Alan B. Franklin; Joseph L. Ganey

    2003-01-01

    Finite population sampling theory is useful in estimating total population size (abundance) from abundance estimates of each sampled unit (quadrat). We develop estimators that allow correlated quadrat abundance estimates, even for quadrats in different sampling strata. Correlated quadrat abundance estimates based on mark–recapture or distance sampling methods occur...

  16. Improving the spectral measurement accuracy based on temperature distribution and spectra-temperature relationship

    NASA Astrophysics Data System (ADS)

    Li, Zhe; Feng, Jinchao; Liu, Pengyu; Sun, Zhonghua; Li, Gang; Jia, Kebin

    2018-05-01

    Temperature is usually considered a nuisance fluctuation in near-infrared spectral measurement, and chemometric methods have been extensively studied to correct for temperature variations. However, temperature can also be treated as a constructive parameter that provides detailed chemical information when systematically varied during the measurement. Our group has previously studied the relationship between temperature-induced spectral variation (TSVC) and normalized squared temperature. In this study, we focused on the influence of the temperature distribution in the calibration set. A multi-temperature calibration set selection (MTCS) method was proposed to improve prediction accuracy by considering the temperature distribution of the calibration samples. Furthermore, a double-temperature calibration set selection (DTCS) method was proposed based on the MTCS method and the relationship between TSVC and normalized squared temperature. We compared the prediction performance of PLS models based on the random sampling method and the proposed methods. The results from experimental studies showed that prediction performance was improved by the proposed methods. MTCS and DTCS are therefore promising alternatives for improving prediction accuracy in near-infrared spectral measurement.
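    A hedged sketch of what a calibration-set selection in the spirit of MTCS might look like, assuming only that the core idea is to draw calibration samples evenly across temperature bins; the binning rule, counts, and deterministic pick below are invented for illustration and are not the paper's algorithm.

```python
import numpy as np

def select_calibration_set(temps, per_bin, n_bins=4):
    """Hypothetical MTCS-style selection: bin samples by measurement
    temperature and take an equal number from each bin so the
    calibration set covers the temperature range evenly."""
    temps = np.asarray(temps, dtype=float)
    edges = np.linspace(temps.min(), temps.max(), n_bins + 1)
    chosen = set()
    for lo, hi in zip(edges[:-1], edges[1:]):
        idx = np.where((temps >= lo) & (temps <= hi))[0]
        chosen.update(idx[:per_bin].tolist())  # deterministic pick for the sketch
    return sorted(chosen)

# Twelve samples measured at four temperature levels:
temps = [20, 21, 22, 30, 31, 32, 40, 41, 42, 50, 51, 52]
cal = select_calibration_set(temps, per_bin=2)  # two samples per bin
```

    A random-sampling baseline, by contrast, can leave whole temperature regions unrepresented in the calibration set.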

  17. The Effect of Storage and Extraction Methods on Amplification of Plasmodium falciparum DNA from Dried Blood Spots.

    PubMed

    Schwartz, Alanna; Baidjoe, Amrish; Rosenthal, Philip J; Dorsey, Grant; Bousema, Teun; Greenhouse, Bryan

    2015-05-01

    Extraction and amplification of DNA from dried blood spots (DBS) collected in field studies is commonly used for detection of Plasmodium falciparum. However, there have been few systematic efforts to determine the effects of storage and extraction methods on the sensitivity of DNA amplification. We investigated the effects of storage conditions, length of storage, and DNA extraction methods on amplification via three PCR-based assays using field samples and laboratory controls. Samples stored as DBS for 2 or more years at ambient temperature showed a significant loss of sensitivity that increased with time; after 10 years, only 10% of samples with parasite densities > 1,000 parasites/μL were detectable by nested polymerase chain reaction (PCR). Conversely, DBS and extracted DNA stored at -20°C showed no loss of sensitivity with time. Samples with low parasite densities amplified more successfully with saponin/Chelex compared with spin-column-based extraction, though the latter method performed better on samples with higher parasite densities stored for 2 years at ambient temperature. DNA extracted via both methods was stable after 20 freeze-thaw cycles. Our results suggest that DBS should be stored at -20°C or extracted immediately, especially if anticipating 2 or more years of storage. © The American Society of Tropical Medicine and Hygiene.

  18. The Effect of Storage and Extraction Methods on Amplification of Plasmodium falciparum DNA from Dried Blood Spots

    PubMed Central

    Schwartz, Alanna; Baidjoe, Amrish; Rosenthal, Philip J.; Dorsey, Grant; Bousema, Teun; Greenhouse, Bryan

    2015-01-01

    Extraction and amplification of DNA from dried blood spots (DBS) collected in field studies is commonly used for detection of Plasmodium falciparum. However, there have been few systematic efforts to determine the effects of storage and extraction methods on the sensitivity of DNA amplification. We investigated the effects of storage conditions, length of storage, and DNA extraction methods on amplification via three PCR-based assays using field samples and laboratory controls. Samples stored as DBS for 2 or more years at ambient temperature showed a significant loss of sensitivity that increased with time; after 10 years, only 10% of samples with parasite densities > 1,000 parasites/μL were detectable by nested polymerase chain reaction (PCR). Conversely, DBS and extracted DNA stored at −20°C showed no loss of sensitivity with time. Samples with low parasite densities amplified more successfully with saponin/Chelex compared with spin-column-based extraction, though the latter method performed better on samples with higher parasite densities stored for 2 years at ambient temperature. DNA extracted via both methods was stable after 20 freeze-thaw cycles. Our results suggest that DBS should be stored at −20°C or extracted immediately, especially if anticipating 2 or more years of storage. PMID:25758652

  19. Differentiation of organic and non-organic winter wheat cultivars from a controlled field trial by crystallization patterns.

    PubMed

    Kahl, Johannes; Busscher, Nicolaas; Mergardt, Gaby; Mäder, Paul; Torp, Torfinn; Ploeger, Angelika

    2015-01-01

    There is a need for authentication tools to verify the existing certification system. Recently, markers for the analytical authentication of organic products were evaluated. Crystallization with additives was described as an interesting fingerprint approach, but one that needs further evidence based on a standardized method and well-documented sample origin. The fingerprint of wheat cultivars from a controlled field trial is generated from structure-analysis variables of crystal patterns. Method performance was tested with respect to factors such as crystallization chamber, day of experiment, and region of interest of the patterns. Two different organic treatments and two different treatments of the non-organic regime could be grouped together in each of three consecutive seasons. When the k-nearest-neighbor classification method was applied, approximately 84% of Runal samples and 95% of Titlis samples were classified correctly into organic and non-organic origin using cross-validation. Crystallization with additives offers an interesting complementary fingerprint method for organic wheat samples. When the method is applied to winter wheat from the DOK trial, organically and non-organically treated samples can be differentiated significantly based on pattern recognition. Crystallization with additives therefore seems to be a promising tool for organic wheat authentication. © 2014 Society of Chemical Industry.
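    The k-nearest-neighbor classification with cross-validation described above can be sketched generically: classify each sample by majority vote of its k nearest neighbors, estimating accuracy by leave-one-out cross-validation. The two-class toy data are invented; the paper's actual features are crystal-pattern structure variables.

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    d = np.linalg.norm(X_train - x, axis=1)
    nearest = y_train[np.argsort(d)[:k]]
    values, counts = np.unique(nearest, return_counts=True)
    return values[np.argmax(counts)]

def loocv_accuracy(X, y, k=3):
    """Leave-one-out cross-validation accuracy of the k-NN classifier."""
    hits = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        hits += knn_predict(X[mask], y[mask], X[i], k) == y[i]
    return hits / len(y)

# Toy two-class data ("organic" = 0, "non-organic" = 1), well separated:
X = np.array([[0.0, 0.1], [0.1, 0.0], [0.2, 0.1],
              [1.0, 1.1], [1.1, 1.0], [0.9, 1.0]])
y = np.array([0, 0, 0, 1, 1, 1])
acc = loocv_accuracy(X, y, k=3)
```

    With overlapping classes, the cross-validated accuracy drops below 1 and plays the role of the ~84%/~95% figures reported in the abstract.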

  20. Evaluation of magnetic nanoparticle samples made from biocompatible ferucarbotran by time-correlation magnetic particle imaging reconstruction method

    PubMed Central

    2013-01-01

    Background Molecular imaging using magnetic nanoparticles (MNPs), known as magnetic particle imaging (MPI), has attracted interest for the early diagnosis of cancer and cardiovascular disease. However, because a steep local magnetic field distribution is required to obtain a well-defined image, sophisticated hardware is needed; it is therefore desirable to achieve excellent image quality even with low-performance hardware. In this study, the spatial resolution of MPI was evaluated using an image reconstruction method based on the correlation information of the magnetization signal in the time domain, applied to MNP samples made from biocompatible ferucarbotran with adjusted particle diameters. Methods The magnetization characteristics and particle diameters of four types of MNP samples made from ferucarbotran were evaluated. A numerical analysis based on our proposed method, which calculates the image intensity from correlation information between the magnetization signal generated by the MNPs and the system function, was performed, and the obtained image quality was compared with that of the prototype in terms of image resolution and image artifacts. Results MNP samples obtained by adjusting ferucarbotran showed properties superior to conventional ferucarbotran samples, and the numerical analysis showed that the same image quality could be obtained using a gradient magnetic field generator with 0.6 times the performance. However, because the proposed method theoretically introduces image blurring, an algorithm will be required to improve performance. Conclusions MNP samples obtained by adjusting ferucarbotran showed magnetizing properties superior to conventional ferucarbotran samples, and by using such samples, comparable image quality (spatial resolution) could be obtained with a lower gradient magnetic field intensity. PMID:23734917

  1. Development of a general method for quantifying IgG-based therapeutic monoclonal antibodies in human plasma using protein G purification coupled with a two internal standard calibration strategy using LC-MS/MS.

    PubMed

    Chiu, Huai-Hsuan; Liao, Hsiao-Wei; Shao, Yu-Yun; Lu, Yen-Shen; Lin, Ching-Hung; Tsai, I-Lin; Kuo, Ching-Hua

    2018-08-17

    Monoclonal antibody (mAb) drugs have generated much interest in recent years for treating various diseases. Immunoglobulin G (IgG) represents a high percentage of the mAb drugs approved by the Food and Drug Administration (FDA). To facilitate therapeutic drug monitoring and pharmacokinetic/pharmacodynamic studies, we developed a general liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to quantify the concentration of IgG-based mAbs in human plasma. Three IgG-based drugs (bevacizumab, nivolumab and pembrolizumab) were selected to demonstrate the method. Protein G beads were used for sample pretreatment due to their universal ability to capture IgG-based drugs. Surrogate peptides obtained after trypsin digestion were quantified by LC-MS/MS. To correct for sample preparation errors and matrix effects during LC-MS/MS analysis, we used a two-internal-standard (IS) approach comprising the IgG-based drug IS tocilizumab and a post-column infused IS. Using two internal standards effectively improved quantification accuracy, which was within 15% for all mAb drugs tested at three different concentrations. The method was validated in terms of precision, accuracy, linearity and sensitivity for the three demonstration mAb drugs. Its successful application to clinical samples demonstrated its applicability in clinical analysis. It is anticipated that this general method could be applied to other mAb-based drugs for use in precision medicine and clinical studies. Copyright © 2018 Elsevier B.V. All rights reserved.
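    A hedged sketch of internal-standard calibration as commonly used in such LC-MS/MS assays: fit the calibration line on the analyte/IS peak-area ratio so that preparation losses and matrix effects cancel to first order, then invert it for unknowns. The peak areas and response factor below are synthetic, not the paper's data.

```python
import numpy as np

def fit_is_calibration(conc, analyte_area, is_area):
    """Linear calibration on the analyte / internal-standard
    peak-area ratio versus nominal concentration."""
    ratio = np.asarray(analyte_area) / np.asarray(is_area)
    slope, intercept = np.polyfit(conc, ratio, 1)
    return slope, intercept

def quantify(slope, intercept, analyte_area, is_area):
    """Invert the calibration line for an unknown sample."""
    return (analyte_area / is_area - intercept) / slope

# Synthetic calibration points: ratio = 0.02 * concentration, no noise
conc = np.array([10.0, 50.0, 100.0, 200.0])
analyte = 0.02 * conc * 1e5          # hypothetical analyte peak areas
istd = np.full(4, 1e5)               # hypothetical IS peak areas
slope, intercept = fit_is_calibration(conc, analyte, istd)
c_unknown = quantify(slope, intercept, analyte_area=3.0e5, is_area=1e5)
```

    A second, post-column infused IS (as in the abstract) corrects ionization drift within the run; the ratio arithmetic is the same.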

  2. Molecular dynamics based enhanced sampling of collective variables with very large time steps.

    PubMed

    Chen, Pei-Yang; Tuckerman, Mark E

    2018-01-14

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
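    As a rough illustration of the multiple time step idea the abstract builds on (and not the authors' isokinetic Nosé-Hoover scheme), a minimal reversible RESPA step applies the slow force with the large outer step and integrates the fast force with several velocity-Verlet substeps. The force splitting and parameters below are invented for a harmonic toy problem.

```python
def respa_step(x, v, f_slow, f_fast, dt_outer, n_inner, m=1.0):
    """One reversible RESPA step: slow force kicked with the large
    outer step, fast force integrated by n_inner velocity-Verlet
    substeps (toy sketch, not the paper's isokinetic scheme)."""
    dt = dt_outer / n_inner
    v += 0.5 * dt_outer * f_slow(x) / m      # slow half-kick
    for _ in range(n_inner):                 # fast inner loop
        v += 0.5 * dt * f_fast(x) / m
        x += dt * v
        v += 0.5 * dt * f_fast(x) / m
    v += 0.5 * dt_outer * f_slow(x) / m      # slow half-kick
    return x, v

# Harmonic test: stiff "fast" spring plus soft "slow" spring
f_fast = lambda x: -100.0 * x
f_slow = lambda x: -1.0 * x
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = respa_step(x, v, f_slow, f_fast, dt_outer=0.05, n_inner=10)
energy = 0.5 * v * v + 0.5 * 101.0 * x * x   # should stay near 50.5
```

    Pushing dt_outer toward half the fast period triggers the resonance instability the paper's resonance-free integrators are designed to remove.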

  3. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    NASA Astrophysics Data System (ADS)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.

  4. Model-based Bayesian inference for ROC data analysis

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Bae, K. Ty

    2013-03-01

    This paper presents a study of model-based Bayesian inference for Receiver Operating Characteristic (ROC) data. The model is a simple version of a general non-linear regression model. Unlike the Dorfman model, it uses a probit link function with a zero-one covariate variable to express the binormal distributions in a single formula. The model also includes a scale parameter. Bayesian inference is implemented by the Markov Chain Monte Carlo (MCMC) method, carried out with Bayesian analysis Using Gibbs Sampling (BUGS). In contrast to classical statistical theory, the Bayesian approach treats model parameters as random variables characterized by prior distributions. With a substantial number of simulated samples generated by the sampling algorithm, the posterior distributions of the parameters, and hence the parameters themselves, can be accurately estimated. MCMC-based BUGS adopts the Adaptive Rejection Sampling (ARS) protocol, which requires that the probability density function (pdf) from which samples are drawn be log-concave with respect to the targeted parameters. Our study corrects a common misconception and proves that the pdf of this regression model is log-concave with respect to its scale parameter. Therefore, ARS's requirement is satisfied and a Gaussian prior, which is conjugate and possesses many analytic and computational advantages, is assigned to the scale parameter. A cohort of 20 simulated data sets with 20 simulations from each data set is used in our study. Output analysis and convergence diagnostics for the MCMC method are assessed with the CODA package. Models and methods using a continuous Gaussian prior and a discrete categorical prior are compared. Intensive simulations and performance measures illustrate our practice of model-based Bayesian inference using the MCMC method.
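    The BUGS/ARS machinery itself is not reproduced here; as a toy stand-in for the generic MCMC loop, a random-walk Metropolis sampler targets the posterior of the scale parameter of a zero-mean Gaussian likelihood under a Gaussian prior (echoing the conjugacy-motivated choice in the abstract). All settings are illustrative.

```python
import math, random

random.seed(0)
data = [random.gauss(0, 2.0) for _ in range(200)]  # synthetic observations, true sigma = 2

def log_post(sig, n, ss, prior_mu=1.0, prior_sd=1.0):
    """Log-posterior (up to a constant) of the scale parameter sigma:
    zero-mean Gaussian likelihood plus Gaussian prior on sigma."""
    if sig <= 0:
        return -math.inf
    return (-n * math.log(sig) - ss / (2.0 * sig * sig)
            - (sig - prior_mu) ** 2 / (2.0 * prior_sd ** 2))

n, ss = len(data), sum(x * x for x in data)
sig, samples = 1.0, []
for _ in range(20000):                       # random-walk Metropolis
    prop = sig + random.gauss(0.0, 0.1)
    if math.log(random.random()) < log_post(prop, n, ss) - log_post(sig, n, ss):
        sig = prop
    samples.append(sig)
post_mean = sum(samples[5000:]) / len(samples[5000:])  # discard burn-in
```

    With 200 observations the posterior mean lands near the true scale of 2, the prior exerting only a mild pull toward its mean of 1.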

  5. Least squares polynomial chaos expansion: A review of sampling strategies

    NASA Astrophysics Data System (ADS)

    Hadigol, Mohammad; Doostan, Alireza

    2018-04-01

    As non-intrusive polynomial chaos expansion (PCE) techniques have gained growing popularity among researchers, we here provide a comprehensive review of major sampling strategies for least-squares-based PCE. Traditional sampling methods, such as Monte Carlo, Latin hypercube, quasi-Monte Carlo, optimal design of experiments (ODE), and Gaussian quadratures, as well as more recent techniques, such as coherence-optimal and randomized quadratures, are discussed. We also propose a hybrid sampling method, dubbed alphabetic-coherence-optimal, that employs the so-called alphabetic optimality criteria used in the context of ODE in conjunction with coherence-optimal samples. A comparison of the empirical performance of the selected sampling methods applied to three numerical examples, including high-order PCEs, high-dimensional problems, and low oversampling ratios, is presented to provide a road map for practitioners seeking the most suitable sampling technique for a problem at hand. We observed that the alphabetic-coherence-optimal technique outperforms the other sampling methods, especially when high-order ODE are employed and/or the oversampling ratio is low.
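    A minimal 1-D example of the least-squares PCE fit the review surveys: sample the input by plain Monte Carlo, build the design matrix from probabilists' Hermite polynomials (orthogonal under the standard normal), and solve in the least-squares sense. The target function is chosen to be exactly representable; this is a sketch of the generic approach, not any specific method from the paper.

```python
import numpy as np

def pce_lstsq(xi, y, order):
    """Least-squares PCE in 1-D: design-matrix column k holds the
    probabilists' Hermite polynomial He_k evaluated at the samples."""
    cols = [np.polynomial.hermite_e.hermeval(xi, np.eye(order + 1)[k])
            for k in range(order + 1)]
    A = np.stack(cols, axis=1)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

rng = np.random.default_rng(0)
xi = rng.standard_normal(200)             # plain Monte Carlo input samples
y = 1.0 + 2.0 * xi + 0.5 * (xi**2 - 1.0)  # exactly 1*He0 + 2*He1 + 0.5*He2
coef = pce_lstsq(xi, y, order=2)
```

    The alternative sampling strategies reviewed in the paper differ only in how the points `xi` (and any weights) are generated; the least-squares solve is unchanged.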

  6. Inter-class sparsity based discriminative least square regression.

    PubMed

    Wen, Jie; Xu, Yong; Li, Zuoyong; Ma, Zhongli; Xu, Yuanrong

    2018-06-01

    Least squares regression is a very popular supervised classification method. However, two main issues greatly limit its performance. The first is that it only focuses on fitting the input features to the corresponding output labels while ignoring the correlations among samples. The second is that the label matrix used, i.e., the zero-one label matrix, is inappropriate for classification. To solve these problems and improve performance, this paper presents a novel method, inter-class sparsity based discriminative least square regression (ICS_DLSR), for multi-class classification. Different from other methods, the proposed method requires the transformed samples to share a common sparsity structure within each class. To this end, an inter-class sparsity constraint is introduced into the least squares regression model so that the margins of samples from the same class can be greatly reduced while those of samples from different classes are enlarged. In addition, an error term with a row-sparsity constraint is introduced to relax the strict zero-one label matrix, which allows the method to be more flexible in learning the discriminative transformation matrix. These factors encourage the method to learn a more compact and discriminative transformation for regression, giving it the potential to perform better than other methods. Extensive experimental results show that the proposed method achieves the best performance in comparison with other methods for multi-class classification. Copyright © 2018 Elsevier Ltd. All rights reserved.

  7. A new method for determining the acid number of biodiesel based on coulometric titration.

    PubMed

    Barbieri Gonzaga, Fabiano; Pereira Sobral, Sidney

    2012-08-15

    A new method is proposed for determining the acid number (AN) of biodiesel using coulometric titration with potentiometric detection, basically employing a potentiostat/galvanostat and an electrochemical cell containing a platinum electrode, a silver electrode, and a combination pH electrode. The method involves a sequential application of a constant current between the platinum (cathode) and silver (anode) electrodes, followed by measuring the potential of the combination pH electrode, using an isopropanol/water mixture as solvent and LiCl as the supporting electrolyte. A preliminary evaluation of the new method, using acetic acid for doping a biodiesel sample, showed an average recovery of 100.1%. Compared to a volumetric titration-based method for determining the AN of several biodiesel samples (ranging from about 0.18 to 0.95 mg g⁻¹), the new method produced statistically similar results with better repeatability. Compared to other works reported in the literature, the new method presented an average repeatability up to 3.2 times better and employed a sample size up to 20 times smaller. Copyright © 2012 Elsevier B.V. All rights reserved.
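    The coulometric principle reduces to Faraday's law: the hydroxide generated at the cathode up to the equivalence point is Q/F moles, which converts directly to the conventional mg-KOH-per-gram acid number. A sketch with illustrative numbers (not values from the paper):

```python
FARADAY = 96485.33  # C/mol
M_KOH = 56.106      # g/mol

def acid_number(current_a, time_s, sample_mass_g):
    """Acid number (mg KOH per g of sample) from the charge needed
    to reach the equivalence point: moles of OH- generated = Q / F."""
    moles_oh = current_a * time_s / FARADAY
    return moles_oh * M_KOH * 1000.0 / sample_mass_g

# 10 mA applied for 100 s to titrate a 1.00 g sample:
an = acid_number(0.010, 100.0, 1.00)   # ~0.58 mg KOH/g
```

    Because charge is measured rather than titrant volume, no standardized KOH solution is needed, which is what enables the much smaller sample sizes reported above.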

  8. Rapid classification of hairtail fish and pork freshness using an electronic nose based on the PCA method.

    PubMed

    Tian, Xiu-Ying; Cai, Qiang; Zhang, Yong-Ming

    2012-01-01

    We report a method for building a simple and reproducible electronic nose based on commercially available metal oxide sensors (MOS) to monitor the freshness of hairtail fish and pork stored at 15, 10, and 5 °C. After assembly in the laboratory, the proposed product was tested by a manufacturer. Sample delivery was based on the dynamic headspace method, and two features were extracted from the transient response of each sensor using an unsupervised principal component analysis (PCA) method. The compensation method and pattern recognition based on PCA are discussed in the current paper. PCA compensation can be used for all storage temperatures; however, pattern recognition differs according to storage conditions. Total volatile basic nitrogen (TVBN) and aerobic bacterial counts of the samples were measured simultaneously as the standard indicators of hairtail fish and pork freshness. The PCA models based on TVBN and aerobic bacterial counts were used to classify hairtail fish samples as "fresh" (TVBN ≤ 25 g and microbial counts ≤ 10⁶ cfu/g) or "spoiled" (TVBN ≥ 25 g and microbial counts ≥ 10⁶ cfu/g) and pork samples as "fresh" (TVBN ≤ 15 g and microbial counts ≤ 10⁶ cfu/g) or "spoiled" (TVBN ≥ 15 g and microbial counts ≥ 10⁶ cfu/g). Good correlation coefficients between the responses of the electronic nose and the TVBN and aerobic bacterial counts were obtained: 0.97 and 0.91 for hairtail fish, and 0.81 and 0.88 for pork, respectively. Through laboratory simulation and field application, we determined that the electronic nose could help ensure the shelf life of hairtail fish and pork, especially when rapid measurements are needed. The results also showed that the electronic nose could analyze the process and level of spoilage for hairtail fish and pork.
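    A minimal sketch of the PCA projection underlying such classification, assuming only that mean-centered sensor responses are projected onto the leading singular vectors; the toy response matrix below is invented, not the paper's measurements.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Scores of mean-centered data on the leading principal
    components, computed via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

# Toy e-nose response matrix: rows = samples, columns = sensor features.
# Samples 0-1 mimic "fresh", samples 2-3 mimic "spoiled".
X = np.array([[1.0, 1.1, 0.9],
              [1.2, 1.0, 1.1],
              [5.0, 5.2, 4.9],
              [5.1, 4.8, 5.0]])
scores = pca_scores(X, n_components=1)
```

    On the first principal component the two groups fall on opposite sides of zero (the sign itself is arbitrary), which is what makes a simple fresh/spoiled decision boundary possible.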

  9. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of pesticides in water by Carbopak-B solid-phase extraction and high-performance liquid chromatography

    USGS Publications Warehouse

    Werner, Stephen L.; Burkhardt, Mark R.; DeRusseau, Sabrina N.

    1996-01-01

    In accordance with the needs of the National Water-Quality Assessment Program (NAWQA), the U.S. Geological Survey has developed and implemented a graphitized carbon-based solid-phase extraction and high-performance liquid chromatographic analytical method. The method is used to determine 41 pesticides and pesticide metabolites that are not readily amenable to gas chromatography or other high-temperature analytical techniques. Pesticides are extracted from filtered environmental water samples using a 0.5-gram graphitized carbon-based solid-phase cartridge, eluted from the cartridge into two analytical fractions, and analyzed using high-performance liquid chromatography with photodiode-array detection. The upper concentration limit is 1.6 micrograms per liter (μg/L) for most compounds. Single-operator method detection limits in organic-free water samples ranged from 0.006 to 0.032 μg/L. Recoveries in organic-free water samples ranged from 37 to 88 percent. Recoveries in ground- and surface-water samples ranged from 29 to 94 percent. An optional on-site extraction procedure allows samples to be collected and processed at remote sites where it is difficult to ship samples to the laboratory within the recommended pre-extraction holding time of 7 days.

  10. Suitability and setup of next-generation sequencing-based method for taxonomic characterization of aquatic microbial biofilm.

    PubMed

    Bakal, Tomas; Janata, Jiri; Sabova, Lenka; Grabic, Roman; Zlabek, Vladimir; Najmanova, Lucie

    2018-06-16

    A robust and widely applicable method for sampling aquatic microbial biofilm and processing the samples is presented. The method is based on next-generation sequencing of the V4-V5 variable regions of the 16S rRNA gene and further statistical analysis of the sequencing data, which could be useful not only for investigating the taxonomic composition of biofilm bacterial consortia but also for assessing aquatic ecosystem health. Five artificial materials commonly used for biofilm growth (glass, stainless steel, aluminum, polypropylene, polyethylene) were tested to determine the one giving the most robust and reproducible results. The effect of the sampler material on total microbial composition was not statistically significant; however, the non-plastic materials (glass, metal) gave more stable outputs without irregularities among sample parallels. The bias of the method is assessed with respect to the employment of a non-quantitative step (PCR amplification) to obtain quantitative results (relative abundance of identified taxa), an aspect often overlooked in ecological and medical studies. We document that sequencing a mixture of three merged primary PCR reactions for each sample, and evaluating median values from three technical replicates per sample, makes it possible to overcome this bias and gives robust, repeatable results that distinguish well among sampling localities and seasons.
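    The replicate-median idea can be sketched in a few lines: normalize each technical replicate to relative abundances, then take the per-taxon median so that a single aberrant PCR reaction does not dominate the estimate. The counts below are invented toy numbers illustrating the assumed workflow.

```python
import numpy as np

def robust_abundance(replicates):
    """Median relative abundance per taxon across technical
    (PCR/sequencing) replicates, damping amplification bias."""
    rel = np.asarray(replicates, dtype=float)
    rel = rel / rel.sum(axis=1, keepdims=True)  # normalize each replicate
    return np.median(rel, axis=0)

# Rows = three technical replicates, columns = taxa;
# replicate 3 has a PCR outlier inflating taxon 0.
reps = [[50, 30, 20],
        [52, 28, 20],
        [80, 10, 10]]
abund = robust_abundance(reps)
```

    A mean over the same replicates would be pulled toward the outlier; the median returns the profile of the two concordant replicates.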

  11. Efficient exploration of chemical space by fragment-based screening.

    PubMed

    Hall, Richard J; Mortenson, Paul N; Murray, Christopher W

    2014-01-01

    Screening methods seek to sample a vast chemical space in order to identify starting points for further chemical optimisation. Fragment-based drug discovery exploits the superior sampling of chemical space that can be achieved when molecular weight is restricted. Here we show that commercially available fragment space is still relatively poorly sampled, and we argue for highly sensitive screening methods to allow the detection of smaller fragments. We analyse the properties of our fragment library versus those of the X-ray hits derived from the library. We particularly consider properties related to the degree of planarity of the fragments. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Extraction efficiency and implications for absolute quantitation of propranolol in mouse brain, liver and kidney thin tissue sections using droplet-based liquid microjunction surface sampling-HPLC ESI-MS/MS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa

    Currently, the absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol-dosed mouse brain (10-μm-thick), kidney (10-μm-thick) and liver (8-, 10-, 16- and 24-μm-thick) were obtained. The absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separation and tandem mass spectrometric analysis. These values were used to determine the propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: The extraction efficiency of propranolol from 10-μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45% and ~63%. Extraction efficiency decreased from ~65% to ~36% as liver thickness increased from 8 μm to 24 μm. Randomly selecting half of the samples as standards, the precision and accuracy of the propranolol concentrations obtained for the other half of the samples were determined as quality control metrics. The resulting precision (±15%) and accuracy (±3%) values were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method and that it depended on the organ and tissue thickness. However, once the extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. Furthermore, this means that once the extraction efficiency is calibrated for a given tissue type and drug, the droplet-based approach provides a non-labor-intensive and high-throughput means of acquiring spatially resolved quantitative analysis of multiple samples of the same type.

  13. Extraction efficiency and implications for absolute quantitation of propranolol in mouse brain, liver and kidney thin tissue sections using droplet-based liquid microjunction surface sampling-HPLC ESI-MS/MS

    DOE PAGES

    Kertesz, Vilmos; Weiskittel, Taylor M.; Vavek, Marissa; ...

    2016-06-22

    Currently, absolute quantitation aspects of droplet-based surface sampling for thin tissue analysis using a fully automated autosampler/HPLC-ESI-MS/MS system are not fully evaluated. Knowledge of extraction efficiency and its reproducibility is required to judge the potential of the method for absolute quantitation of analytes from thin tissue sections. Methods: Adjacent thin tissue sections of propranolol dosed mouse brain (10- μm-thick), kidney (10- μm-thick) and liver (8-, 10-, 16- and 24- μm-thick) were obtained. Absolute concentration of propranolol was determined in tissue punches from serial sections using standard bulk tissue extraction protocols and subsequent HPLC separations and tandem mass spectrometric analysis. Thesemore » values were used to determine propranolol extraction efficiency from the tissues with the droplet-based surface sampling approach. Results: Extraction efficiency of propranolol using 10- μm-thick brain, kidney and liver thin tissues using droplet-based surface sampling varied between ~45-63%. Extraction efficiency decreased from ~65% to ~36% with liver thickness increasing from 8 μm to 24 μm. Randomly selecting half of the samples as standards, precision and accuracy of propranolol concentrations obtained for the other half of samples as quality control metrics were determined. Resulting precision ( ±15%) and accuracy ( ±3%) values, respectively, were within acceptable limits. In conclusion, comparative quantitation of adjacent mouse thin tissue sections of different organs and of various thicknesses by droplet-based surface sampling and by bulk extraction of tissue punches showed that extraction efficiency was incomplete using the former method, and that it depended on the organ and tissue thickness. However, once extraction efficiency was determined and applied, the droplet-based approach provided the required quantitation accuracy and precision for assay validations. 
Furthermore, once the extraction efficiency has been calibrated for a given tissue type and drug, the droplet-based approach provides a labor-saving, high-throughput means of acquiring spatially resolved quantitative analyses of multiple samples of the same type.
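The efficiency-correction step described above amounts to simple arithmetic. The sketch below is illustrative only (not the authors' code), and all numeric values are hypothetical:

```python
# Illustrative sketch: a calibrated extraction efficiency converts a
# droplet-sampled amount into an estimate of the amount actually
# present in the tissue. Numbers are hypothetical.

def corrected_amount(measured, extraction_efficiency):
    """Divide the measured amount by the calibrated extraction
    efficiency to estimate the true amount in the section."""
    if not 0.0 < extraction_efficiency <= 1.0:
        raise ValueError("extraction efficiency must be in (0, 1]")
    return measured / extraction_efficiency

# e.g. 6.5 ng recovered at a calibrated ~65% efficiency:
print(corrected_amount(6.5, 0.65))
```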

  14. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    PubMed

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-10-01

    To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease.
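The core of an LQAS survey is a simple count-against-threshold decision rule. The sketch below uses a hypothetical decision value, not the thresholds actually used in Viet Nam:

```python
# Generic LQAS decision rule (sample size and decision value d are
# hypothetical, not the study's parameters).

def lqas_high_prevalence(positive_count, decision_value):
    """Classify a lot (here, a commune) as high-prevalence when the
    number of positive cases exceeds the decision value d."""
    return positive_count > decision_value

# Active-trachoma counts among sampled schoolchildren, with d = 7:
communes = {"A": 3, "B": 12, "C": 7}
flagged = [name for name, pos in communes.items()
           if lqas_high_prevalence(pos, 7)]
print(flagged)  # communes classified as priority areas
```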

  15. Comparison of PCR-based methods for the simultaneous detection of Neisseria meningitidis, Haemophilus influenzae, and Streptococcus pneumoniae in clinical samples.

    PubMed

    de Filippis, Ivano; de Andrade, Claudia Ferreira; Caldeira, Nathalia; de Azevedo, Aline Carvalho; de Almeida, Antonio Eugenio

    2016-01-01

Several in-house PCR-based assays have been described for the detection of bacterial meningitis caused by Neisseria meningitidis, Streptococcus pneumoniae, and Haemophilus influenzae from clinical samples. PCR-based methods targeting different bacterial genes are frequently used by different laboratories worldwide, but no standard method has ever been established. The aim of our study was to compare several in-house PCR-based tests and a commercial one for the detection of bacterial pathogens causing meningitis and invasive disease in humans. A total of 110 isolates and 134 clinical samples (99 cerebrospinal fluid and 35 blood samples) collected from suspected cases of invasive disease were analyzed. Specific sets of primers frequently used for PCR diagnosis of the three pathogens were used and compared with the results achieved using the multiplex approach described here. Several different gene targets were used for each microorganism, namely ctrA, crgA and nspA for N. meningitidis, ply for S. pneumoniae, and P6 and bexA for H. influenzae. All of the methods used were fast, specific and sensitive, while some of the targets used for the in-house PCR assays detected lower concentrations of genomic DNA than the commercial method. An additional PCR reaction is described for the differentiation of capsulated and non-capsulated H. influenzae strains, while the commercial method only detects capsulated strains. The in-house PCR methods compared here proved to be rapid, sensitive, highly specific, and cheaper than commercial methods, and could easily be adopted by public laboratories in developing countries for diagnostic purposes. The best results were achieved using primers targeting the genes nspA, ply, and P6, which were able to detect the lowest DNA concentrations for each specific target. Copyright © 2016 Elsevier Editora Ltda. All rights reserved.

  16. Evaluation of an anion exchange resin-based method for concentration of F-RNA coliphages (enteric virus indicators) from water samples.

    PubMed

    Pérez-Méndez, A; Chandler, J C; Bisha, B; Goodridge, L D

    2014-08-01

Enteric viral contaminants in water represent a public health concern, thus methods for detecting these viruses or their indicator microorganisms are needed. Because enteric viruses and their viral indicators are often found at low concentrations in water, their detection requires upfront concentration methods. In this study, a strong basic anion exchange resin was evaluated as an adsorbent material for the concentration of F-RNA coliphages (MS2, Qβ, GA, and HB-P22). These coliphages are recognized as enteric virus surrogates and fecal indicator organisms. Following adsorption of the coliphages from 50-ml water samples, direct RNA isolation and real-time RT-PCR detection were performed. In water samples containing 10⁵ pfu/ml of the F-RNA coliphages, the anion exchange resin (IRA-900) adsorbed over 96.7% of the coliphages present, improving real-time RT-PCR detection by 5-7 cycles compared to direct testing. F-RNA coliphage RNA recovery using the integrated method ranged from 12.6% to 77.1%. Resin-based concentration of samples with low levels of the F-RNA coliphages allowed for 10⁰ pfu/ml (MS2 and Qβ) and 10⁻¹ pfu/ml (GA and HB-P22) to be detected. The resin-based method offers considerable advantages in cost, speed, simplicity and field adaptability. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Aircraft digital control design methods

    NASA Technical Reports Server (NTRS)

    Powell, J. D.; Parsons, E.; Tashker, M. G.

    1976-01-01

Variations in design methods for aircraft digital flight control are evaluated and compared. The methods fall into two categories: those where the design is done in the continuous domain (the s-plane) and those where the design is done in the discrete domain (the z-plane). Design method fidelity is evaluated by examining closed-loop root movement and the frequency response of the discretely controlled continuous aircraft. It was found that all methods provided acceptable performance for sample rates greater than 10 cps, except the uncompensated s-plane design method, which was acceptable only above 20 cps. A design procedure based on optimal control methods was proposed that provided the best fidelity at very slow sample rates and required no design iterations when changing sample rates.
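The s-plane/z-plane split above rests on the standard pole-mapping relation z = e^(sT) (textbook background, not taken from this report). A minimal sketch, with an invented example mode:

```python
import cmath

def s_to_z(s_pole, sample_rate_hz):
    """Map a continuous-domain (s-plane) pole to its discrete-domain
    (z-plane) image via z = exp(s*T), where T is the sample period."""
    T = 1.0 / sample_rate_hz
    return cmath.exp(s_pole * T)

# A hypothetical lightly damped mode at s = -1 + 6j rad/s, sampled
# at 10 cps (10 Hz):
z = s_to_z(complex(-1.0, 6.0), 10.0)
print(abs(z))  # magnitude below 1 means the discrete pole is stable
```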

  18. Can the Roche hemolysis index be used for automated determination of cell-free hemoglobin? A comparison to photometric assays.

    PubMed

    Petrova, Darinka Todorova; Cocisiu, Gabriela Ariadna; Eberle, Christoph; Rhode, Karl-Heinz; Brandhorst, Gunnar; Walson, Philip D; Oellerich, Michael

    2013-09-01

The aim of this study was to develop a novel method for automated quantification of cell-free hemoglobin (fHb) based on the hemolysis index (HI; Roche Diagnostics). The novel fHb method based on the HI was correlated with fHb measured using the triple-wavelength methods of both Harboe [fHb, g/L = (0.915 * HI + 2.634)/100] and Fairbanks et al. [fHb, g/L = (0.917 * HI + 2.131)/100]. fHb concentrations were estimated from the HI using the Roche Modular automated platform in self-made and commercially available quality controls, as well as in samples from a proficiency testing scheme (INSTAND). The fHb results from the Roche automated HI were then compared to results obtained using the traditional spectrophotometric assays for one hundred plasma samples with varying degrees of hemolysis, lipemia and/or bilirubinemia. The novel method using automated HI quantification on the Roche Modular clinical chemistry platform correlated well with results using the classical methods in the 100 patient samples (Harboe: r = 0.9284; Fairbanks et al.: r = 0.9689), and recovery was good for self-made controls. However, commercially available quality controls showed poor recovery due to an unidentified matrix problem. The novel method produced reliable determination of fHb in samples without interferences. However, poor recovery using commercially available fHb quality control samples currently greatly limits its usefulness. © 2013.
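The two regression formulas quoted in the abstract translate directly into code; the functions below simply transcribe them (function names are ours):

```python
# The two HI-to-fHb conversion formulas quoted in the study,
# transcribed directly from the abstract.

def fhb_harboe(hi):
    """Cell-free hemoglobin (g/L) from the hemolysis index,
    Harboe regression: (0.915 * HI + 2.634) / 100."""
    return (0.915 * hi + 2.634) / 100.0

def fhb_fairbanks(hi):
    """Cell-free hemoglobin (g/L), Fairbanks et al. regression:
    (0.917 * HI + 2.131) / 100."""
    return (0.917 * hi + 2.131) / 100.0

print(fhb_harboe(50))      # estimate for a sample with HI = 50
print(fhb_fairbanks(50))
```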

  19. A multiresidue method for determination of trace levels of pesticides in air and water.

    PubMed

    Millet, M; Wortham, H; Sanusi, A; Mirabel, P

    1996-11-01

A multiresidue analytical method is described for the analysis of 13 pesticides in fogwater, rainwater, gas, and particles. This method is based upon solid-liquid extraction using Sep-Pak tC18 light cartridges for aqueous samples, Soxhlet extraction for gas (adsorbed on XAD-2) and particles (on glass fiber filters), HPLC-based fractionation of the extracted residues using a silica column and a linear gradient of n-hexane/tert-butyl methyl ether, followed by GC-ECD and HPLC-UV analyses of each fraction. Prior to analysis with GC-ECD, a methylation procedure using BF3/methanol was developed for the analysis of the fraction which contains chlorophenoxy acid herbicides. The recoveries of the extraction procedure for liquid samples and of the methylation were greater than 92% and 97%, with standard deviations lower than 8% and 5%, respectively. The detection limits varied between 0.1 and 0.01 µg ml⁻¹ for the 13 pesticides studied, with a standard deviation less than 9%. This method was used for the determination of pesticides in 18 fogwater samples (soluble + insoluble), 31 rainwater samples, and 17 air (gas + particles) samples collected between 1991 and 1993 in Colmar (east of France).

  20. Evaluation of an alternate method for sampling benthic macroinvertebrates in low-gradient streams sampled as part of the National Rivers and Streams Assessment.

    PubMed

    Flotemersch, Joseph E; North, Sheila; Blocksom, Karen A

    2014-02-01

Benthic macroinvertebrates are sampled in streams and rivers as one of the assessment elements of the US Environmental Protection Agency's National Rivers and Streams Assessment. In a 2006 report, the recommendation was made that different yet comparable methods be evaluated for different types of streams (e.g., low gradient vs. high gradient). Consequently, a research element was added to the 2008-2009 National Rivers and Streams Assessment to conduct a side-by-side comparison of the standard macroinvertebrate sampling method with an alternate method specifically designed for low-gradient wadeable streams and rivers that focused more on stream-edge habitat. Samples were collected using each method at 525 sites in five of nine aggregate ecoregions located in the conterminous USA. Methods were compared using the benthic macroinvertebrate multimetric index developed for the 2006 Wadeable Streams Assessment. Statistical analysis did not reveal any trends suggesting that the overall assessment of low-gradient streams on a regional or national scale would change if the alternate method were used rather than the standard sampling method, regardless of the gradient cutoff used to define low-gradient streams. Based on these results, the National Rivers and Streams Survey should continue to use the standard field method for sampling all streams.

  1. Methods of sampling airborne fungi in working environments of waste treatment facilities.

    PubMed

    Černá, Kristýna; Wittlingerová, Zdeňka; Zimová, Magdaléna; Janovský, Zdeněk

    2016-01-01

The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. The membrane filter (MF) method was compared with the surface air system (SAS) method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. The total number of colony-forming units (CFU)/m³ of airborne fungi was dependent on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.001). Detected concentrations of airborne fungi ranged 2×10²-1.7×10⁶ CFU/m³ when using the MF method, and 3×10²-6.4×10⁴ CFU/m³ when using the SAS method. Both methods showed comparable sensitivity to the fluctuations of the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast indicative determination of the concentration of airborne fungi. The MF method is suitable for thorough assessment of working environment contamination by airborne fungi. We therefore recommend the MF method for the implementation of a uniform standard methodology of airborne fungi sampling in working environments of waste treatment facilities. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
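The CFU/m³ values reported above come from a standard scaling of plate counts by the sampled air volume. A generic sketch, with an invented sample volume (the study's sampling parameters are not reproduced here):

```python
# Generic conversion from a plate colony count to CFU per cubic metre
# of air for a volumetric sampler. The sampled volume is hypothetical.

def cfu_per_m3(colony_count, sampled_volume_litres):
    """Scale colonies counted on a plate to colony-forming units per
    cubic metre of sampled air (1 m3 = 1000 L)."""
    return colony_count * 1000.0 / sampled_volume_litres

# 180 colonies grown from a 100 L air sample:
print(cfu_per_m3(180, 100.0))  # -> 1800.0 CFU/m3
```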

  2. Identification of discriminant proteins through antibody profiling, methods and apparatus for identifying an individual

    DOEpatents

    Apel, William A.; Thompson, Vicki S; Lacey, Jeffrey A.; Gentillon, Cynthia A.

    2016-08-09

A method for determining a plurality of proteins for discriminating and positively identifying an individual based on a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins, forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.
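The matching step can be pictured with a minimal nearest-centroid classifier, a simple stand-in for the discriminant analysis the patent describes (this is an illustration, not the patented procedure; all profiles are invented):

```python
# Minimal nearest-centroid sketch of matching an unknown antibody
# profile against known individuals' profiles. Invented data.
import math

def centroid(profiles):
    """Column-wise mean of a list of equal-length binding profiles."""
    return [sum(col) / len(profiles) for col in zip(*profiles)]

def nearest_individual(known, unknown):
    """Return the identity whose mean antibody-binding profile lies
    closest (Euclidean distance) to the unknown sample's profile."""
    return min(known,
               key=lambda name: math.dist(centroid(known[name]), unknown))

# Hypothetical three-protein binding intensities for two individuals:
known = {
    "alice": [[0.9, 0.1, 0.8], [0.8, 0.2, 0.9]],
    "bob":   [[0.1, 0.9, 0.2], [0.2, 0.8, 0.1]],
}
print(nearest_individual(known, [0.85, 0.15, 0.85]))
```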

  3. Identification of discriminant proteins through antibody profiling, methods and apparatus for identifying an individual

    DOEpatents

    Thompson, Vicki S; Lacey, Jeffrey A; Gentillon, Cynthia A; Apel, William A

    2015-03-03

A method for determining a plurality of proteins for discriminating and positively identifying an individual based on a biological sample. The method may include profiling a biological sample from a plurality of individuals against a protein array including a plurality of proteins. The protein array may include proteins attached to a support in a preselected pattern such that locations of the proteins are known. The biological sample may be contacted with the protein array such that a portion of antibodies in the biological sample reacts with and binds to the proteins, forming immune complexes. A statistical analysis method, such as discriminant analysis, may be performed to determine discriminating proteins for distinguishing individuals. Proteins of interest may be used to form a protein array. Such a protein array may be used, for example, to compare a forensic sample from an unknown source with a sample from a known source.

  4. Rapid detection of Naegleria fowleri in water distribution pipeline biofilms and drinking water samples.

    PubMed

Puzon, Geoffrey J; Lancaster, James A; Wylie, Jason T; Plumb, Jason J

    2009-09-01

Rapid detection of pathogenic Naegleria fowleri in water distribution networks is critical for water utilities. Current detection methods rely on sampling drinking water followed by culturing and molecular identification of purified strains. This culture-based method takes an extended amount of time (days), detects both nonpathogenic and pathogenic species, and does not account for N. fowleri cells associated with pipe wall biofilms. In this study, a total DNA extraction technique coupled with a real-time PCR method using primers specific for N. fowleri was developed and validated. The method readily detected N. fowleri without preculturing, with the lowest detection limit for N. fowleri cells spiked in biofilm being one cell (66% detection rate) and five cells (100% detection rate). For drinking water, the detection limit was five cells (66% detection rate) and 10 cells (100% detection rate). By comparison, culture-based methods were less sensitive for detection of cells spiked into both biofilm (66% detection for <10 cells) and drinking water (0% detection for <10 cells). In mixed cultures of N. fowleri and nonpathogenic Naegleria, the method identified N. fowleri in 100% of all replicates, whereas tests with the current consensus primers detected N. fowleri in only 5% of all replicates. Application of the new method to drinking water and pipe wall biofilm samples obtained from a distribution network enabled the detection of N. fowleri in under 6 h, versus 3+ days for the culture-based method. Further, comparison of the real-time PCR data from the field samples and the standard curves enabled an approximation of N. fowleri cell numbers in the biofilm and drinking water. The use of such a method will further aid water utilities in detecting and managing the persistence of N. fowleri in water distribution networks.

  5. Automated acid and base number determination of mineral-based lubricants by fourier transform infrared spectroscopy: commercial laboratory evaluation.

    PubMed

    Winterfield, Craig; van de Voort, F R

    2014-12-01

    The Fluid Life Corporation assessed and implemented Fourier transform infrared spectroscopy (FTIR)-based methods using American Society for Testing and Materials (ASTM)-like stoichiometric reactions for determination of acid and base number for in-service mineral-based oils. The basic protocols, quality control procedures, calibration, validation, and performance of these new quantitative methods are assessed. ASTM correspondence is attained using a mixed-mode calibration, using primary reference standards to anchor the calibration, supplemented by representative sample lubricants analyzed by ASTM procedures. A partial least squares calibration is devised by combining primary acid/base reference standards and representative samples, focusing on the main spectral stoichiometric response with chemometrics assisting in accounting for matrix variability. FTIR(AN/BN) methodology is precise, accurate, and free of most interference that affects ASTM D664 and D4739 results. Extensive side-by-side operational runs produced normally distributed differences with mean differences close to zero and standard deviations of 0.18 and 0.26 mg KOH/g, respectively. Statistically, the FTIR methods are a direct match to the ASTM methods, with superior performance in terms of analytical throughput, preparation time, and solvent use. FTIR(AN/BN) analysis is a viable, significant advance for in-service lubricant analysis, providing an economic means of trending samples instead of tedious and expensive conventional ASTM(AN/BN) procedures. © 2014 Society for Laboratory Automation and Screening.

  6. A Nonlinear Framework of Delayed Particle Smoothing Method for Vehicle Localization under Non-Gaussian Environment.

    PubMed

    Xiao, Zhu; Havyarimana, Vincent; Li, Tong; Wang, Dong

    2016-05-13

In this paper, a novel nonlinear smoothing framework, the non-Gaussian delayed particle smoother (nGDPS), is proposed, which enables vehicle state estimation (VSE) with high accuracy while taking into account the non-Gaussianity of the measurement and process noises. Within the proposed method, the multivariate Student's t-distribution is adopted to compute the probability density function (PDF) of the process and measurement noises, which are assumed to be non-Gaussian distributed. A computation approach based on the Ensemble Kalman Filter (EnKF) is designed to cope with the mean and covariance matrix of the proposal non-Gaussian distribution. A delayed Gibbs sampling algorithm, which incorporates smoothing of the sampled trajectories over a fixed delay, is proposed to deal with the sample degeneracy of particles. The performance is investigated on real-world data collected by low-cost on-board vehicle sensors. The comparison study based on the real-world experiments and the statistical analysis demonstrates that the proposed nGDPS significantly improves vehicle state accuracy and outperforms the existing filtering and smoothing methods.
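The heavy-tailed noise assumption can be illustrated with the standard construction of a Student's t variate, t = Z / sqrt(V/df) with Z standard normal and V chi-square; this is textbook background, not the paper's algorithm:

```python
# Sketch of drawing Student's t noise samples using only the standard
# library; chi-square(df) is obtained as Gamma(k=df/2, theta=2).
import math
import random

def student_t_sample(df, rng):
    """Draw one Student's t variate as Z / sqrt(V/df)."""
    z = rng.gauss(0.0, 1.0)
    v = rng.gammavariate(df / 2.0, 2.0)  # chi-square with df d.o.f.
    return z / math.sqrt(v / df)

rng = random.Random(42)
noise = [student_t_sample(4, rng) for _ in range(5)]
print(noise)  # heavy-tailed noise samples, df = 4
```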

  7. Wave transmission approach based on modal analysis for embedded mechanical systems

    NASA Astrophysics Data System (ADS)

    Cretu, Nicolae; Nita, Gelu; Ioan Pop, Mihail

    2013-09-01

An experimental method for determining the phase velocity in small solid samples is proposed. The method is based on measuring the resonant frequencies of a binary or ternary solid elastic system comprising the small sample of interest and a gauge material of manageable size. The wave transmission matrix of the combined system is derived and the theoretical values of its eigenvalues are used to determine the expected eigenfrequencies that, equated with the measured values, allow for the numerical estimation of the phase velocities in both materials. The known phase velocity of the gauge material is then used to assess the accuracy of the method. Using computer simulation and the experimental values for phase velocities, the theoretical values for the eigenfrequencies of the eigenmodes of the embedded elastic system are obtained, to validate the method. We conclude that the proposed experimental method may be reliably used to determine the elastic properties of small solid samples whose geometries do not allow a direct measurement of their resonant frequencies.
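The basic idea of inverting resonant frequencies for phase velocity can be seen in the single-rod textbook case, v = 2·L·f for the fundamental longitudinal mode of a free-free rod. This is a simplified stand-in, not the paper's transmission-matrix treatment, and the numbers are invented:

```python
# Textbook free-free rod relation: fundamental longitudinal resonance
# f = v / (2 L), inverted for the phase velocity v. A simplified
# stand-in for the transmission-matrix inversion described above.

def phase_velocity_from_resonance(length_m, fundamental_hz):
    """Phase velocity v = 2 * L * f from the fundamental resonance."""
    return 2.0 * length_m * fundamental_hz

# A hypothetical 0.5 m gauge rod resonating at 5120 Hz:
print(phase_velocity_from_resonance(0.5, 5120.0))  # -> 5120.0 m/s
```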

  8. A nanosilver-based spectrophotometric method for determination of malachite green in surface water samples.

    PubMed

    Sahraei, R; Farmany, A; Mortazavi, S S; Noorizadeh, H

    2013-07-01

A new spectrophotometric method is reported for the determination of nanomolar levels of malachite green in surface water samples. The method is based on the catalytic effect of silver nanoparticles on the oxidation of malachite green by hexacyanoferrate (III) in acetate-acetic acid medium. The absorbance is measured at 610 nm with the fixed-time method. Under the optimum conditions, the linear range was 8.0×10⁻⁹-2.0×10⁻⁷ mol L⁻¹ malachite green with a correlation coefficient of 0.996. The limit of detection (S/N = 3) was 2.0×10⁻⁹ mol L⁻¹. The relative standard deviation for ten replicate determinations of 1.0×10⁻⁸ mol L⁻¹ malachite green was 1.86%. The method features good accuracy and reproducibility for malachite green determination in surface water samples without any pre-concentration or separation step.
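The calibration arithmetic behind such a fixed-time method is a linear fit of absorbance against concentration, then inversion for the unknown. The sketch below uses invented, perfectly linear calibration points, not the study's data:

```python
# Sketch of linear calibration over the method's linear range and
# back-calculation of an unknown. Calibration points are invented.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def concentration(absorbance, slope, intercept):
    """Invert the calibration line for an unknown sample."""
    return (absorbance - intercept) / slope

conc = [1e-8, 5e-8, 1e-7, 2e-7]   # mol/L, within the linear range
absb = [0.05, 0.25, 0.50, 1.00]   # hypothetical A(610 nm) readings
m, b = fit_line(conc, absb)
print(concentration(0.40, m, b))  # back-calculated unknown, mol/L
```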

  9. Evaluation of common methods for sampling invertebrate pollinator assemblages: net sampling out-perform pan traps.

    PubMed

    Popic, Tony J; Davila, Yvonne C; Wardle, Glenda M

    2013-01-01

Methods for sampling ecological assemblages strive to be efficient, repeatable, and representative. Unknowingly, common methods may be limited in terms of revealing species function and so of less value for comparative studies. The global decline in pollination services has stimulated surveys of flower-visiting invertebrates, using pan traps and net sampling. We explore the relative merits of these two methods in terms of species discovery, quantifying abundance, function, and composition, and responses of species to changing floral resources. Using a spatially-nested design we sampled across a 5000 km² area of arid grasslands, including 432 hours of net sampling and 1296 pan trap-days, between June 2010 and July 2011. Net sampling yielded 22% more species and 30% higher abundance than pan traps, and better reflected the spatio-temporal variation of floral resources. Species composition differed significantly between methods; from 436 total species, 25% were sampled by both methods, 50% only by nets, and the remaining 25% only by pans. Apart from being less comprehensive, if pan traps do not sample flower-visitors, the link to pollination is questionable. By contrast, net sampling functionally linked species to pollination through behavioural observations of flower-visitation interaction frequency. Netted specimens are also necessary for evidence of pollen transport. Benefits of net-based sampling outweighed minor differences in overall sampling effort. As pan traps and net sampling methods are not equivalent for sampling invertebrate-flower interactions, we recommend net sampling of invertebrate pollinator assemblages, especially if datasets are intended to document declines in pollination and guide measures to retain this important ecosystem service.
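The 25%/50%/25% composition split is plain set arithmetic over the species lists from each method. The miniature species sets below are invented stand-ins chosen to reproduce the same shares:

```python
# Set arithmetic on species lists from the two methods. These tiny
# species sets are invented; the real survey recorded 436 species.

net = {"bee_a", "bee_b", "fly_a", "fly_b", "wasp_a", "ant_a"}
pan = {"bee_a", "fly_a", "beetle_a", "beetle_b"}

both = net & pan          # sampled by both methods
net_only = net - pan      # only by nets
pan_only = pan - net      # only by pans
total = len(net | pan)

print(len(both), len(net_only), len(pan_only), total)  # 2 4 2 8
# i.e. 25% shared, 50% net-only, 25% pan-only of 8 species
```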

  11. Examining the Effectiveness of Problem-Based Learning in the Teaching of Information Technology: A Comparison with Lectured-Based Learning

    ERIC Educational Resources Information Center

    Liu, YuFing

    2013-01-01

    This paper applies a quasi-experimental research method to compare the difference in students' approaches to learning and their learning achievements between the group that follows the problem based learning (PBL) teaching method with computer support and the group that follows the non-PBL teaching methods. The study sample consisted of 68 junior…

  12. Development, validation and application of an ICP-MS/MS method to quantify minerals and (ultra-)trace elements in human serum.

    PubMed

    Meyer, Sören; Markova, Mariya; Pohl, Gabriele; Marschall, Talke A; Pivovarova, Olga; Pfeiffer, Andreas F H; Schwerdtle, Tanja

    2018-09-01

Multi-element determination in human samples is very challenging. Especially in human intervention studies, sample volumes are often limited to a few microliters, and due to the high number of samples, high throughput is indispensable. Here, we present a state-of-the-art ICP-MS/MS-based method for the analysis of essential (trace) elements, namely Mg, Ca, Fe, Cu, Zn, Mo, Se and I, as well as food-relevant toxic elements such as As and Cd. The developed method was validated regarding linearity of the calibration curves, method LODs and LOQs, selectivity, trueness and precision. The established method was applied to quantify the serum element concentrations of participants in a human intervention study (LeguAN). The participants received isocaloric diets, either rich in plant protein or in animal protein. While the serum concentrations of Mg and Mo increased in participants receiving the plant protein-based diet (above all legumes), the Se concentration in serum decreased. In contrast, the animal protein-based diet, rich in meat and dairy products, resulted in an increased Se concentration in serum. Copyright © 2018 Elsevier GmbH. All rights reserved.
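Method LODs and LOQs of the kind validated above are commonly computed with the 3σ and 10σ conventions. This is a generic sketch of those conventions with invented blank readings, not the study's exact procedure:

```python
# Generic 3*sigma / 10*sigma LOD and LOQ estimation from replicate
# blank measurements and a calibration slope. Blank values invented.
import statistics

def lod_loq(blank_signals, calibration_slope):
    """Return (LOD, LOQ) as 3*sd and 10*sd of the blank signal,
    converted to concentration units via the calibration slope."""
    sd = statistics.stdev(blank_signals)
    return 3.0 * sd / calibration_slope, 10.0 * sd / calibration_slope

blanks = [0.10, 0.12, 0.11, 0.09, 0.13]  # replicate blank readings
lod, loq = lod_loq(blanks, calibration_slope=2.0)
print(lod, loq)
```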

  13. Validated HPLC Determination of 4-Dimethylaminoantipyrine in Different Suppository Bases

    PubMed Central

    Kalmár, É; Kormányos, B.; Szakonyi, G.; Dombi, G.

    2014-01-01

Suppositories are important tools for individual therapy, especially in paediatrics, and an instrumental assay method has become necessary for the quality control of dosage units. The aim of this work was to develop a rapid, effective high-performance liquid chromatography method to assay aminophenazone in extemporaneous suppositories prepared with two different suppository bases, adeps solidus and massa macrogoli. With a novel sample preparation method developed by the authors, 4-dimethylaminoantipyrine was determined in these suppository bases with 95-105% recovery. The measurements were carried out on a Shimadzu Prominence ultra high-performance liquid chromatography system equipped with a 20 μl sample loop. The separation was achieved on a Hypersil ODS column, with methanol-sodium acetate buffer (0.05 M, pH 5.5±0.05; 60:40, v/v) as the mobile phase at a flow rate of 1.5 ml/min. The chromatograms were acquired at 253 nm. The chromatographic method was fully validated in accordance with current guidelines. The presented data demonstrate the successful development of a rapid, efficient and robust sample preparation and high-performance liquid chromatography method for the routine quality control of the dosage units of suppositories containing 4-dimethylaminoantipyrine. PMID:24799736

  14. Evaluation of Respondent-Driven Sampling

    PubMed Central

    McCreesh, Nicky; Frost, Simon; Seeley, Janet; Katongole, Joseph; Tarsh, Matilda Ndagire; Ndunguse, Richard; Jichi, Fatima; Lunel, Natasha L; Maher, Dermot; Johnston, Lisa G; Sonnenberg, Pam; Copas, Andrew J; Hayes, Richard J; White, Richard G

    2012-01-01

    Background Respondent-driven sampling is a novel variant of link-tracing sampling for estimating the characteristics of hard-to-reach groups, such as HIV prevalence in sex-workers. Despite its use by leading health organizations, the performance of this method in realistic situations is still largely unknown. We evaluated respondent-driven sampling by comparing estimates from a respondent-driven sampling survey with total-population data. Methods Total-population data on age, tribe, religion, socioeconomic status, sexual activity and HIV status were available on a population of 2402 male household-heads from an open cohort in rural Uganda. A respondent-driven sampling (RDS) survey was carried out in this population, employing current methods of sampling (RDS sample) and statistical inference (RDS estimates). Analyses were carried out for the full RDS sample and then repeated for the first 250 recruits (small sample). Results We recruited 927 household-heads. Full and small RDS samples were largely representative of the total population, but both samples under-represented men who were younger, of higher socioeconomic status, and with unknown sexual activity and HIV status. Respondent-driven-sampling statistical-inference methods failed to reduce these biases. Only 31%-37% (depending on method and sample size) of RDS estimates were closer to the true population proportions than the RDS sample proportions. Only 50%-74% of respondent-driven-sampling bootstrap 95% confidence intervals included the population proportion. Conclusions Respondent-driven sampling produced a generally representative sample of this well-connected non-hidden population. However, current respondent-driven-sampling inference methods failed to reduce bias when it occurred. Whether the data required to remove bias and measure precision can be collected in a respondent-driven sampling survey is unresolved. 
Respondent-driven sampling should be regarded as a (potentially superior) form of convenience-sampling method, and caution is required when interpreting findings based on the sampling method. PMID:22157309
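
    As background to the inference methods evaluated above, the widely used RDS-II (Volz-Heckathorn) estimator weights each respondent inversely to their reported network degree. A minimal sketch in Python with made-up outcomes and degrees; the survey's actual estimators and bootstrap intervals are not reproduced here:

```python
def rds_ii_proportion(outcomes, degrees):
    """RDS-II (Volz-Heckathorn) estimate of a population proportion.

    Each respondent is weighted by the inverse of their reported
    network degree, down-weighting well-connected people who are
    more likely to be recruited into the sample.
    """
    if len(outcomes) != len(degrees):
        raise ValueError("outcomes and degrees must have equal length")
    weights = [1.0 / d for d in degrees]
    weighted_sum = sum(w * x for w, x in zip(weights, outcomes))
    return weighted_sum / sum(weights)

# Five hypothetical respondents: binary outcome, reported degree.
est = rds_ii_proportion([1, 0, 1, 0, 0], [2, 10, 4, 5, 20])
```

    In this toy sample the unweighted proportion is 0.40, while the degree-weighted estimate is about 0.68, because the two positive respondents reported small networks.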

  15. Synchronization of Hierarchical Time-Varying Neural Networks Based on Asynchronous and Intermittent Sampled-Data Control.

    PubMed

    Xiong, Wenjun; Patel, Ragini; Cao, Jinde; Zheng, Wei Xing

    In this brief, our purpose is to apply asynchronous and intermittent sampled-data control methods to achieve the synchronization of hierarchical time-varying neural networks. The asynchronous and intermittent sampled-data controllers are proposed for two reasons: 1) the controllers may not transmit the control information simultaneously and 2) the controllers cannot always exist at any time. The synchronization is then discussed for a kind of hierarchical time-varying neural networks based on the asynchronous and intermittent sampled-data controllers. Finally, the simulation results are given to illustrate the usefulness of the developed criteria.

  16. Using lot quality-assurance sampling and area sampling to identify priority areas for trachoma control: Viet Nam.

    PubMed Central

    Myatt, Mark; Mai, Nguyen Phuong; Quynh, Nguyen Quang; Nga, Nguyen Huy; Tai, Ha Huy; Long, Nguyen Hung; Minh, Tran Hung; Limburg, Hans

    2005-01-01

    OBJECTIVE: To report on the use of lot quality-assurance sampling (LQAS) surveys undertaken within an area-sampling framework to identify priority areas for intervention with trachoma control activities in Viet Nam. METHODS: The LQAS survey method for the rapid assessment of the prevalence of active trachoma was adapted for use in Viet Nam with the aim of classifying individual communes by the prevalence of active trachoma among children in primary school. School-based sampling was used; school sites to be sampled were selected using an area-sampling approach. A total of 719 communes in 41 districts in 18 provinces were surveyed. FINDINGS: Survey staff found the LQAS survey method both simple and rapid to use after initial problems with area-sampling methods were identified and remedied. The method yielded a finer spatial resolution of prevalence than had been previously achieved in Viet Nam using semiquantitative rapid assessment surveys and multistage cluster-sampled surveys. CONCLUSION: When used with area-sampling techniques, the LQAS survey method has the potential to form the basis of survey instruments that can be used to efficiently target resources for interventions against active trachoma. With additional work, such methods could provide a generally applicable tool for effective programme planning and for the certification of the elimination of trachoma as a blinding disease. PMID:16283052
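
    An LQAS classification rule of the kind described can be characterized by its binomial operating errors: a commune is flagged as high prevalence when the number of cases in a sample of n children reaches a decision value d. A sketch with hypothetical thresholds (n, d and the prevalence cut-offs below are illustrative, not the values used in Viet Nam):

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def lqas_errors(n, d, p_low, p_high):
    """Operating characteristics of the rule: flag a commune as high
    prevalence when >= d of n sampled children show active trachoma.

    Returns (alpha, beta): the chance of flagging a truly
    low-prevalence commune, and of missing a truly high-prevalence one.
    """
    alpha = 1.0 - binom_cdf(d - 1, n, p_low)
    beta = binom_cdf(d - 1, n, p_high)
    return alpha, beta

alpha, beta = lqas_errors(n=50, d=10, p_low=0.10, p_high=0.30)
```

    Tabulating alpha and beta over candidate (n, d) pairs is how an LQAS design balances survey cost against misclassification risk.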

  17. A quantitative evaluation of two methods for preserving hair samples

    USGS Publications Warehouse

    Roon, David A.; Waits, L.P.; Kendall, K.C.

    2003-01-01

    Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.

  18. Separation techniques for the clean-up of radioactive mixed waste for ICP-AES/ICP-MS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swafford, A.M.; Keller, J.M.

    1993-03-17

    Two separation techniques were investigated for the clean-up of typical radioactive mixed waste samples requiring elemental analysis by Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) or Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). These measurements frequently involve regulatory or compliance criteria which include the determination of elements on the EPA Target Analyte List (TAL). These samples usually consist of both an aqueous phase and a solid phase which is mostly an inorganic sludge. Frequently, samples taken from the waste tanks contain high levels of uranium and thorium which can cause spectral interferences in ICP-AES or ICP-MS analysis. The removal of these interferences is necessary to determine the presence of the EPA TAL elements in the sample. Two clean-up methods were studied on simulated aqueous waste samples containing the EPA TAL elements. The first method studied was a classical procedure based upon liquid-liquid extraction using tri-n-octylphosphine oxide (TOPO) dissolved in cyclohexane. The second method investigated was based on more recently developed techniques using extraction chromatography, specifically the use of a commercially available Eichrom TRU·Spec™ column. Literature on these two methods indicates the efficient removal of uranium and thorium from properly prepared samples and provides considerable qualitative information on the extraction behavior of many other elements. However, there is a lack of quantitative data on the extraction behavior of elements on the EPA Target Analyte List. Experimental studies on these two methods consisted of determining whether any of the analytes were extracted by these methods and the recoveries obtained. Both methods produced similar results; the EPA target analytes were only slightly or not extracted. Advantages and disadvantages of each method were evaluated and found to be comparable.

  19. Ear recognition from one sample per person.

    PubMed

    Chen, Long; Mu, Zhichun; Zhang, Baoqing; Zhang, Yi

    2015-01-01

    Biometrics has the advantages of efficiency and convenience in identity authentication. As one of the most promising biometric-based methods, ear recognition has received broad attention and research. Previous studies have achieved remarkable performance with multiple samples per person (MSPP) in the gallery. However, most conventional methods are insufficient when there is only one sample per person (OSPP) available in the gallery. To solve the OSPP problem by maximizing the use of a single sample, this paper proposes a hybrid multi-keypoint descriptor sparse representation-based classification (MKD-SRC) ear recognition approach based on 2D and 3D information. Because most 3D sensors capture 3D data together with the corresponding 2D data, it is sensible to use both types of information. First, the ear region is extracted from the profile. Second, keypoints are detected and described for both the 2D texture image and the 3D range image. Then, the hybrid MKD-SRC algorithm is used to complete the recognition with only OSPP in the gallery. Experimental results on a benchmark dataset have demonstrated the feasibility and effectiveness of the proposed method in resolving the OSPP problem. A rank-one recognition rate of 96.4% is achieved for a gallery of 415 subjects, and the computation time is satisfactory compared to conventional methods.

  20. Molecular DNA-based detection of ionising radiation in meat.

    PubMed

    Şakalar, Ergün

    2017-05-01

    Ionising radiation induces molecular alterations, such as formation of ions, free radicals, and new stable molecules, and cleavage of the chemical bonds of the molecules present in food. Irradiation-treated meat should be labelled to control the process and to ensure free consumer choice. Therefore, sensitive analytical methods are required to detect the irradiation dose. Meat samples were exposed to radiation doses of 0, 0.272, 0.497, 1.063, 3.64, 8.82 and 17.42 kGy in an industrial ⁶⁰Co gamma cell. Primers were designed to amplify 998, 498 and 250-base pair (bp) regions of the 18S rRNA gene of nuclear DNA from the irradiated samples. A new DNA-based method was developed to quantify the radiation dose in unstored meat and in meat stored at -20 °C for 3 and 6 months. The method detected irradiation in stored and unstored meat samples with dose limits of 1.063 and 3.64 kGy, respectively. The level of irradiation can be detected using primer pairs that target differently sized sequences for DNA amplification by PCR. This method can be widely used for the analysis of not only meat samples, but also all biological materials containing DNA. © 2016 Society of Chemical Industry.

  1. Measuring Blood Glucose Concentrations in Photometric Glucometers Requiring Very Small Sample Volumes.

    PubMed

    Demitri, Nevine; Zoubir, Abdelhak M

    2017-01-01

    Glucometers present an important self-monitoring tool for diabetes patients and, therefore, must exhibit high accuracy as well as good usability features. Based on an invasive photometric measurement principle that drastically reduces the volume of the blood sample needed from the patient, we present a framework that is capable of dealing with small blood samples, while maintaining the required accuracy. The framework consists of two major parts: 1) image segmentation; and 2) convergence detection. Step 1 is based on iterative mode-seeking methods to estimate the intensity value of the region of interest. We present several variations of these methods and give theoretical proofs of their convergence. Our approach is able to deal with changes in the number and position of clusters without any prior knowledge. Furthermore, we propose a method based on sparse approximation to decrease the computational load, while maintaining accuracy. Step 2 is achieved by employing temporal tracking and prediction, herewith decreasing the measurement time, and, thus, improving usability. Our framework is tested on several real datasets with different characteristics. We show that we are able to estimate the underlying glucose concentration from much smaller blood samples than is currently state of the art with sufficient accuracy according to the most recent ISO standards and reduce measurement time significantly compared to state-of-the-art methods.
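
    Step 1's iterative mode-seeking can be illustrated with a plain one-dimensional mean-shift pass over pixel intensities; the paper's specific variants and sparse-approximation speed-up are not reproduced here, and the bandwidth and intensity values below are made up:

```python
from math import exp

def mean_shift_mode(samples, start, bandwidth=5.0, tol=1e-6, max_iter=200):
    """Shift a point to the Gaussian-kernel-weighted mean of the data
    around it, repeating until it settles on a density mode."""
    x = float(start)
    for _ in range(max_iter):
        weights = [exp(-0.5 * ((s - x) / bandwidth) ** 2) for s in samples]
        new_x = sum(w * s for w, s in zip(weights, samples)) / sum(weights)
        if abs(new_x - x) < tol:
            return new_x
        x = new_x
    return x

# Two intensity clusters; different starts converge to different modes.
intensities = [10, 11, 12, 13, 50, 51, 52]
low_mode = mean_shift_mode(intensities, start=14)
high_mode = mean_shift_mode(intensities, start=48)
```

    Because the update needs no fixed cluster count, the same iteration copes with changes in the number and position of intensity clusters, which is the property the framework relies on.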

  2. Detection of sunn pest-damaged wheat samples using visible/near-infrared spectroscopy based on pattern recognition.

    PubMed

    Basati, Zahra; Jamshidi, Bahareh; Rasekh, Mansour; Abbaspour-Gilandeh, Yousef

    2018-05-30

    The presence of sunn pest-damaged grains in a wheat mass reduces the quality of flour and bread produced from it. Therefore, it is essential to assess the quality of the samples in wheat collection and storage centers and flour mills. In this research, the capability of visible/near-infrared (Vis/NIR) spectroscopy combined with pattern recognition methods was investigated for discriminating wheat samples with different percentages of sunn pest-damaged grains. To this end, various samples belonging to five classes (healthy and 5%, 10%, 15% and 20% unhealthy) were analyzed using Vis/NIR spectroscopy (wavelength range of 350-1000 nm) based on both supervised and unsupervised pattern recognition methods. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were used as the unsupervised techniques, and soft independent modeling of class analogies (SIMCA) and partial least squares-discriminant analysis (PLS-DA) as the supervised methods. The results showed that Vis/NIR spectra of healthy samples were correctly clustered using both PCA and HCA. Due to the high overlap between the four unhealthy classes (5%, 10%, 15% and 20%), it was not possible to discriminate all the unhealthy samples into individual classes. However, when considering only the two main categories of healthy and unhealthy, an acceptable degree of separation between the classes was obtained after classification with the supervised pattern recognition methods SIMCA and PLS-DA. SIMCA based on PCA modeling correctly classified samples into the two classes of healthy and unhealthy with a classification accuracy of 100%. Moreover, the wavelengths of 839 nm, 918 nm and 995 nm had more power than other wavelengths to discriminate the two classes of healthy and unhealthy. It was also concluded that PLS-DA provides excellent classification results for healthy and unhealthy samples (R² = 0.973 and RMSECV = 0.057). 
    Therefore, Vis/NIR spectroscopy based on pattern recognition techniques can be useful for rapidly distinguishing healthy wheat samples from those damaged by sunn pest in maintenance and processing centers. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. A Classification of Remote Sensing Image Based on Improved Compound Kernels of Svm

    NASA Astrophysics Data System (ADS)

    Zhao, Jianing; Gao, Wanlin; Liu, Zili; Mou, Guifen; Lu, Lin; Yu, Lina

    Classification of remote sensing (RS) images based on SVM, which is developed from statistical learning theory, achieves high accuracy even with a small number of training samples, which makes SVM methods well suited to RS classification. The traditional RS classification method combines visual interpretation with computer classification; SVM-based classification improves accuracy considerably while saving much of the labor and time spent interpreting images and collecting training samples. Kernel functions play an important part in the SVM algorithm. The proposed approach uses an improved compound kernel function and therefore attains higher classification accuracy on RS images. Moreover, the compound kernel improves the generalization and learning ability of the kernel.
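
    A compound kernel can be illustrated as a convex combination of a local RBF kernel and a global polynomial kernel; since non-negative sums of Mercer kernels remain Mercer kernels, the combination is admissible for SVM training. A sketch with illustrative weights and parameters, not those of the paper's improved kernel:

```python
from math import exp

def rbf_kernel(x, y, gamma=0.5):
    return exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, y)))

def poly_kernel(x, y, degree=2, coef0=1.0):
    return (sum(a * b for a, b in zip(x, y)) + coef0) ** degree

def compound_kernel(x, y, weight=0.6):
    """Convex combination of a local (RBF) and a global (polynomial)
    kernel. Non-negative sums of Mercer kernels are Mercer kernels,
    so the mix is still a valid SVM kernel."""
    return weight * rbf_kernel(x, y) + (1.0 - weight) * poly_kernel(x, y)

k = compound_kernel([1.0, 0.0], [0.5, 0.5])
```

    The mixing weight trades the RBF term's local interpolation against the polynomial term's global generalization, which is the intuition behind compound kernels for RS imagery.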

  4. Restricted Boltzmann machines based oversampling and semi-supervised learning for false positive reduction in breast CAD.

    PubMed

    Cao, Peng; Liu, Xiaoli; Bao, Hang; Yang, Jinzhu; Zhao, Dazhe

    2015-01-01

    The false-positive reduction (FPR) is a crucial step in the computer aided detection system for the breast. The issues of imbalanced data distribution and the limitation of labeled samples complicate the classification procedure. To overcome these challenges, we propose oversampling and semi-supervised learning methods based on the restricted Boltzmann machines (RBMs) to solve the classification of imbalanced data with a few labeled samples. To evaluate the proposed method, we conducted a comprehensive performance study and compared its results with the commonly used techniques. Experiments on benchmark dataset of DDSM demonstrate the effectiveness of the RBMs based oversampling and semi-supervised learning method in terms of geometric mean (G-mean) for false positive reduction in Breast CAD.
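
    The G-mean used for evaluation is simply the geometric mean of sensitivity and specificity. A sketch with hypothetical confusion-matrix counts (not figures from the DDSM experiments):

```python
from math import sqrt

def g_mean(tp, fn, tn, fp):
    """Geometric mean of sensitivity and specificity. Unlike plain
    accuracy, it collapses to zero if either class is ignored, which
    suits the heavily imbalanced data of false-positive reduction."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sqrt(sensitivity * specificity)

# Hypothetical counts: 90/100 lesions found, 800/1000 normals kept.
score = g_mean(tp=90, fn=10, tn=800, fp=200)
```

    A classifier that labels everything as normal would score 100% specificity but zero sensitivity, and hence a G-mean of zero, which is why the metric is preferred here over accuracy.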

  5. Ultrasonic-energy enhance the ionic liquid-based dual microextraction to preconcentrate the lead in ground and stored rain water samples as compared to conventional shaking method.

    PubMed

    Nizamani, Sooraj; Kazi, Tasneem G; Afridi, Hassan I

    2018-01-01

    An efficient preconcentration technique based on an ultrasonic-assisted ionic liquid-based dual microextraction (UA-ILDµE) method has been developed to preconcentrate lead (Pb²⁺) in ground and stored rain water. In the proposed method, Pb²⁺ was complexed with a chelating agent (dithizone), and an ionic liquid (1-butyl-3-methylimidazolium hexafluorophosphate) was used for extraction. Ultrasonic irradiation and an electrical shaking system were applied to enhance the dispersion and extraction of the Pb²⁺ complex in aqueous samples. In the second, dual-microextraction (DµE) phase, the Pb²⁺ complex enriched in the ionic liquid was back-extracted into acidic aqueous solution and finally determined by flame atomic absorption spectrometry. The major analytical parameters that influence the extraction efficiency of the developed method, such as pH, concentration of ligand, volume of ionic liquid and samples, shaking time in the thermostatic electrical shaker and ultrasonic bath, volume of back-extracting HNO₃, matrix effect, and centrifugation time and rate, were optimized. At a sample volume of 25 mL, the calculated preconcentration factor was 62.2. The limit of detection of the proposed procedure for Pb²⁺ ions was found to be 0.54 μg L⁻¹. The developed method was validated by the analysis of the certified water sample SRM 1643e and by the standard addition method in a real water sample. The extraction recovery of Pb²⁺ was enhanced by ≥2% with a shaking time of 80 s in the ultrasonic bath compared to the thermostatic electrical shaker, where up to 10 min was required for optimum recovery. The developed procedure was successfully used for the enrichment of Pb²⁺ in ground and stored rain water (surface water) samples from an endemic region of Pakistan. 
The resulting data indicated that the ground water samples were highly contaminated with Pb²⁺, while some of the surface water samples also had Pb²⁺ values above the permissible limit of the WHO. The concentration of Pb²⁺ in surface and ground water samples was found in the ranges of 17.5-24.5 and 25.6-99.1 μg L⁻¹, respectively. Copyright © 2017 Elsevier B.V. All rights reserved.
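
    The standard-addition validation mentioned above reduces to fitting a line to signal versus added concentration and reading off the x-intercept. A generic sketch with hypothetical spike levels and absorbances, not the study's measurements:

```python
def least_squares(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def standard_addition(added, signals):
    """Analyte concentration in the unspiked sample: the magnitude of
    the x-intercept of the signal-vs-added-concentration line,
    i.e. intercept / slope for a linear response."""
    slope, intercept = least_squares(added, signals)
    return intercept / slope

# Hypothetical spikes (ug/L) and instrument responses.
conc = standard_addition([0.0, 10.0, 20.0, 30.0], [0.20, 0.30, 0.40, 0.50])
```

    Because the calibration is built inside the sample itself, the approach compensates for the matrix effects that the abstract lists among the optimized parameters.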

  6. Problem-Based Learning Method: Secondary Education 10th Grade Chemistry Course Mixtures Topic

    ERIC Educational Resources Information Center

    Üce, Musa; Ates, Ismail

    2016-01-01

    In this research, the aim was to determine student achievement by comparing the problem-based learning method with the teacher-centered traditional method in teaching the mixtures topic of the 10th grade chemistry lesson. A pretest-posttest control group research design was implemented. The research sample includes two classes (48 students in total) of an Anatolian High School…

  7. Monolithic methacrylate packed 96-tips for high throughput bioanalysis.

    PubMed

    Altun, Zeki; Skoglund, Christina; Abdel-Rehim, Mohamed

    2010-04-16

    In the pharmaceutical industry, the growing number of samples to be analyzed requires high-throughput and fully automated analytical techniques. Commonly used sample-preparation methods are solid-phase extraction (SPE), liquid-liquid extraction (LLE) and protein precipitation. In this paper we discuss a new SPE-based sample-preparation technique for high-throughput drug extraction developed and used by our group. This new sample-preparation method is based on a monolithic methacrylate polymer as the packing sorbent for a 96-tip robotic device. Using this device, a 96-well plate can be handled in 2-4 min. The key aspect of the monolithic phase is that monolithic material offers both good binding capacity and low back-pressure compared to, e.g., silica phases. The paper presents the successful application of monolithic 96-tips and LC-MS/MS to the sample preparation of busulphan, roscovitine, metoprolol, pindolol and local anaesthetics from human plasma samples and cyclophosphamide from mouse blood samples. Copyright 2009 Elsevier B.V. All rights reserved.

  8. IMPLICIT DUAL CONTROL BASED ON PARTICLE FILTERING AND FORWARD DYNAMIC PROGRAMMING.

    PubMed

    Bayard, David S; Schumitzky, Alan

    2010-03-01

    This paper develops a sampling-based approach to implicit dual control. Implicit dual control methods synthesize stochastic control policies by systematically approximating the stochastic dynamic programming equations of Bellman, in contrast to explicit dual control methods that artificially induce probing into the control law by modifying the cost function to include a term that rewards learning. The proposed implicit dual control approach is novel in that it combines a particle filter with a policy-iteration method for forward dynamic programming. The integration of the two methods provides a complete sampling-based approach to the problem. Implementation of the approach is simplified by making use of a specific architecture denoted as an H-block. Practical suggestions are given for reducing computational loads within the H-block for real-time applications. As an example, the method is applied to the control of a stochastic pendulum model having unknown mass, length, initial position and velocity, and unknown sign of its dc gain. Simulation results indicate that active controllers based on the described method can systematically improve closed-loop performance with respect to other more common stochastic control approaches.
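
    The particle-filter half of the architecture can be grounded with a minimal bootstrap particle filter. The sketch below merely estimates a static unknown parameter (say, an unknown mass) from noisy measurements, which is far simpler than the dual-control H-block in the paper; all numbers are illustrative:

```python
import random
from math import exp

def particle_filter(observations, n_particles=2000, obs_std=0.5, seed=7):
    """Bootstrap particle filter for a static unknown parameter with
    observations y_t = theta + noise: weight particles by likelihood,
    resample, jitter, repeat."""
    rng = random.Random(seed)
    particles = [rng.uniform(0.0, 10.0) for _ in range(n_particles)]
    for y in observations:
        # Gaussian likelihood of the observation under each particle.
        weights = [exp(-0.5 * ((y - p) / obs_std) ** 2) for p in particles]
        # Multinomial resampling proportional to weight.
        particles = rng.choices(particles, weights=weights, k=n_particles)
        # Small jitter keeps a static parameter from degenerating.
        particles = [p + rng.gauss(0.0, 0.02) for p in particles]
    return sum(particles) / n_particles

true_theta = 3.0  # e.g. an unknown pendulum mass
data_rng = random.Random(1)
observations = [true_theta + data_rng.gauss(0.0, 0.5) for _ in range(30)]
estimate = particle_filter(observations)
```

    In the implicit dual-control setting, the particle cloud after each step is what the forward dynamic programming stage propagates, so control actions are scored against the whole posterior rather than a point estimate.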

  9. Luma-chroma space filter design for subpixel-based monochrome image downsampling.

    PubMed

    Fang, Lu; Au, Oscar C; Cheung, Ngai-Man; Katsaggelos, Aggelos K; Li, Houqiang; Zou, Feng

    2013-10-01

    In general, subpixel-based downsampling can achieve higher apparent resolution of the down-sampled images on LCD or OLED displays than pixel-based downsampling. With the frequency domain analysis of subpixel-based downsampling, we discover special characteristics of the luma-chroma color transform choice for monochrome images. With these, we model the anti-aliasing filter design for subpixel-based monochrome image downsampling as a human visual system-based optimization problem with a two-term cost function and obtain a closed-form solution. One cost term measures the luminance distortion and the other term measures the chrominance aliasing in our chosen luma-chroma space. Simulation results suggest that the proposed method can achieve sharper down-sampled gray/font images compared with conventional pixel and subpixel-based methods, without noticeable color fringing artifacts.

  10. Testing non-inferiority of a new treatment in three-arm clinical trials with binary endpoints.

    PubMed

    Tang, Nian-Sheng; Yu, Bin; Tang, Man-Lai

    2014-12-18

    A two-arm non-inferiority trial without a placebo is usually adopted to demonstrate that an experimental treatment is not worse than a reference treatment by a small pre-specified non-inferiority margin due to ethical concerns. Selection of the non-inferiority margin and establishment of assay sensitivity are two major issues in the design, analysis and interpretation for two-arm non-inferiority trials. Alternatively, a three-arm non-inferiority clinical trial including a placebo is usually conducted to assess the assay sensitivity and internal validity of a trial. Recently, some large-sample approaches have been developed to assess the non-inferiority of a new treatment based on the three-arm trial design. However, these methods behave badly with small sample sizes in the three arms. This manuscript aims to develop some reliable small-sample methods to test three-arm non-inferiority. Saddlepoint approximation, exact and approximate unconditional, and bootstrap-resampling methods are developed to calculate p-values of the Wald-type, score and likelihood ratio tests. Simulation studies are conducted to evaluate their performance in terms of type I error rate and power. Our empirical results show that the saddlepoint approximation method generally behaves better than the asymptotic method based on the Wald-type test statistic. For small sample sizes, approximate unconditional and bootstrap-resampling methods based on the score test statistic perform better in the sense that their corresponding type I error rates are generally closer to the prespecified nominal level than those of other test procedures. Both approximate unconditional and bootstrap-resampling test procedures based on the score test statistic are generally recommended for three-arm non-inferiority trials with binary outcomes.
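
    Of the resampling procedures compared, the bootstrap is the simplest to illustrate. Below is a sketch of a percentile-bootstrap interval for the rate difference between experimental and reference arms, checked against a fixed non-inferiority margin; the counts and margin are hypothetical, and the paper's score-statistic and saddlepoint versions are more involved:

```python
import random

def boot_noninferiority(x_exp, n_exp, x_ref, n_ref, margin=0.10,
                        n_boot=4000, alpha=0.05, seed=11):
    """Percentile-bootstrap check of non-inferiority for binary
    endpoints: resample each arm with replacement, collect the rate
    difference, and test whether the lower confidence bound clears
    the -margin threshold."""
    rng = random.Random(seed)
    exp_arm = [1] * x_exp + [0] * (n_exp - x_exp)
    ref_arm = [1] * x_ref + [0] * (n_ref - x_ref)
    diffs = []
    for _ in range(n_boot):
        p_exp = sum(rng.choices(exp_arm, k=n_exp)) / n_exp
        p_ref = sum(rng.choices(ref_arm, k=n_ref)) / n_ref
        diffs.append(p_exp - p_ref)
    diffs.sort()
    lower = diffs[int(alpha / 2 * n_boot)]
    upper = diffs[int((1 - alpha / 2) * n_boot) - 1]
    return lower, upper, lower > -margin

# 234/300 responders vs 240/300: observed difference is -0.02.
lower, upper, non_inferior = boot_noninferiority(234, 300, 240, 300)
```

    With small arms the bootstrap distribution is discrete and skewed, which is exactly the regime where the paper finds score-based resampling tests hold their type I error better than Wald-type asymptotics.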

  11. Detection of Multi-walled Carbon Nanotubes and Carbon Nanodiscs on Workplace Surfaces at a Small-Scale Producer.

    PubMed

    Hedmer, Maria; Ludvigsson, Linus; Isaxon, Christina; Nilsson, Patrik T; Skaug, Vidar; Bohgard, Mats; Pagels, Joakim H; Messing, Maria E; Tinnerberg, Håkan

    2015-08-01

    The industrial use of novel-manufactured nanomaterials such as carbon nanotubes and carbon nanodiscs is increasing globally. Occupational exposure can occur during production, downstream use, and disposal. The health effects of many nanomaterials are not yet fully characterized and to handle nano-objects, their aggregates and agglomerates >100nm (NOAA), a high degree of control measures and personal protective equipment are required. The emission of airborne NOAA during production and handling can contaminate workplace surfaces with dust, which can be resuspended resulting in secondary inhalation exposures and dermal exposures. This study surveys the presence of carbon-based nanomaterials, such as multi-walled carbon nanotubes (MWCNTs) and carbon nanodiscs, as surface contamination at a small-scale producer using a novel tape sampling method. Eighteen different surfaces at a small-scale producer were sampled with an adhesive tape sampling method. The surfaces selected were associated with the production and handling of MWCNT powder in the near-field zone. Surfaces in the far-field zone were also sampled. In addition, tape stripping of the skin was performed on one worker. The tape samples were analysed with scanning electron microscopy to detect the carbon-based NOAA. Air sampling with a personal impactor was also performed on a worker who was producing MWCNTs the same day as the tape samples were collected. MWCNTs were detected in 50% of the collected tape samples and carbon nanodiscs in 17%. MWCNTs and carbon nanodiscs were identified in all parts of the workplace, thus, increasing the risk for secondary inhalation and dermal exposure of the workers. Both airborne MWCNTs and carbon nanodiscs were detected in the personal impactor samples. The tape-strip samples from the worker showed no presence of carbon-containing nanoparticles. 
Tape sampling is a functional method for detecting surface contamination of carbon-based NOAA and for exposure control during production at potentially any workplace that produces or handles such manufactured nanomaterials. With the tape method, it is possible to monitor whether a potential for secondary inhalation exposure or dermal exposure exists through resuspension of dust deposited on workplace surfaces. By means of air sampling, we could confirm that carbon nanodiscs were resuspended into the air at the workplace even though they were not handled during that particular work shift. MWCNTs were detected in the air samples, but could have derived either from resuspension or from the work tasks with MWCNTs performed during the air sampling. Tape sampling is a complementary method to air sampling, and together these two methods provide a better view of the hygienic situation in workplaces where NOAA can be emitted into work environments. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  12. The effect on cadaver blood DNA identification by the use of targeted and whole body post-mortem computed tomography angiography.

    PubMed

    Rutty, Guy N; Barber, Jade; Amoroso, Jasmin; Morgan, Bruno; Graham, Eleanor A M

    2013-12-01

    Post-mortem computed tomography angiography (PMCTA) involves the injection of contrast agents. This could have both a dilution effect on biological fluid samples and could affect subsequent post-contrast analytical laboratory processes. We undertook a small sample study of 10 targeted and 10 whole body PMCTA cases to consider whether or not these two methods of PMCTA could affect post-PMCTA cadaver blood based DNA identification. We used standard methodology to examine DNA from blood samples obtained before and after the PMCTA procedure. We illustrate that neither of these PMCTA methods had an effect on the alleles called following short tandem repeat based DNA profiling, and therefore the ability to undertake post-PMCTA blood based DNA identification.

  13. Modeling the uncertainty of estimating forest carbon stocks in China

    NASA Astrophysics Data System (ADS)

    Yue, T. X.; Wang, Y. F.; Du, Z. P.; Zhao, M. W.; Zhang, L. L.; Zhao, N.; Lu, M.; Larocque, G. R.; Wilson, J. P.

    2015-12-01

    Earth surface systems are controlled by a combination of global and local factors, which cannot be understood without accounting for both the local and global components. The system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory is able to accurately estimate forest carbon stocks at sample plots, but these sample plots are too sparse to support the spatial simulation of carbon stocks with required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks. Satellite remote-sensing can supply spatially continuous information about the surface of forest carbon stocks, which is impossible from ground-based investigations, but their description has considerable uncertainty. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the Kriging method for spatial interpolation of ground sample plots and a satellite-observation-based approach as well as an approach for fusing the ground sample plots with satellite observations and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of estimating carbon stocks. The data fusion had the lowest uncertainty by using an existing method for high accuracy surface modeling to fuse the ground sample plots with the satellite observations (HASM-SOA). The estimates produced with HASM-SOA were 26.1 and 28.4 % more accurate than the satellite-based approach and spatial interpolation of the sample plots, respectively. Forest carbon stocks of 7.08 Pg were estimated for China during the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008, using the preferred HASM-SOA method.
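
    HASM-SOA itself is not reproduced here, but the reason fusing two data sources reduces uncertainty can be shown with generic inverse-variance weighting of a plot-based and a satellite-based estimate; all values below are hypothetical:

```python
def fuse_estimates(est_a, var_a, est_b, var_b):
    """Inverse-variance weighted fusion of two independent estimates.
    The fused variance 1/(1/var_a + 1/var_b) is never larger than
    either input variance, which is why combining sparse but precise
    ground plots with dense but noisy satellite maps helps."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)

# Hypothetical carbon density (Mg C/ha): precise plots, noisy satellite.
fused, fused_var = fuse_estimates(110.0, 25.0, 90.0, 100.0)
```

    The fused estimate lands nearer the lower-variance source (106 here), mirroring the paper's finding that fusion beats either the interpolated plots or the satellite product alone.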

  14. Reliability of confidence intervals calculated by bootstrap and classical methods using the FIA 1-ha plot design

    Treesearch

    H. T. Schreuder; M. S. Williams

    2000-01-01

    In simulation sampling from forest populations using sample sizes of 20, 40, and 60 plots respectively, confidence intervals based on the bootstrap (accelerated, percentile, and t-distribution based) were calculated and compared with those based on the classical t confidence intervals for mapped populations and subdomains within those populations. A 68.1 ha mapped...
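
    The comparison above can be illustrated with a percentile-bootstrap interval next to a classical normal-approximation interval for the mean of plot-level values. The plot values below are made up, and the study's accelerated and t-distribution bootstrap variants are not shown:

```python
import random
from statistics import NormalDist, mean, stdev

def percentile_bootstrap_ci(sample, n_boot=5000, alpha=0.05, seed=3):
    """Percentile-bootstrap confidence interval for the mean."""
    rng = random.Random(seed)
    means = sorted(mean(rng.choices(sample, k=len(sample)))
                   for _ in range(n_boot))
    return (means[int(alpha / 2 * n_boot)],
            means[int((1 - alpha / 2) * n_boot) - 1])

def classical_ci(sample, alpha=0.05):
    """Normal-approximation interval (a t quantile would be the
    classical choice at these sample sizes; the stdlib only ships
    the normal distribution)."""
    z = NormalDist().inv_cdf(1 - alpha / 2)
    half = z * stdev(sample) / len(sample) ** 0.5
    return mean(sample) - half, mean(sample) + half

# Hypothetical per-plot values for a sample of 20 plots.
plots = [12.1, 9.8, 15.3, 11.0, 13.7, 10.4, 14.2, 12.9, 9.5, 13.1,
         11.8, 12.5, 10.9, 14.8, 12.0, 11.3, 13.4, 10.1, 12.7, 13.9]
boot_lo, boot_hi = percentile_bootstrap_ci(plots)
classic_lo, classic_hi = classical_ci(plots)
```

    Comparing how often each interval covers the known population value over repeated simulated samples is essentially the reliability exercise the study performs at n = 20, 40 and 60 plots.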

  15. Introduction of a New Diagnostic Method for Breast Cancer Based on Fine Needle Aspiration (FNA) Test Data and Combining Intelligent Systems

    PubMed Central

    Fiuzy, Mohammad; Haddadnia, Javad; Mollania, Nasrin; Hashemian, Maryam; Hassanpour, Kazem

    2012-01-01

    Background Accurate diagnosis of breast cancer is of prime importance. The Fine Needle Aspiration (FNA) test, which has been used for several years in Europe, is a simple, inexpensive, noninvasive and accurate technique for detecting breast cancer. Selecting suitable features from the Fine Needle Aspiration results is the most important diagnostic problem in early stages of breast cancer. In this study, we introduced a new algorithm that can detect breast cancer by combining artificial intelligent systems with Fine Needle Aspiration (FNA). Methods We studied the features of the Wisconsin Diagnostic Breast Cancer database, which contains 569 FNA test samples (212 malignant and 357 benign). In this research, we combined artificial intelligence approaches, such as an evolutionary algorithm (EA) with a genetic algorithm (GA), and also used exact classifier systems (here, fuzzy C-means (FCM)) to separate malignant from benign samples. Furthermore, we examined artificial neural networks (NN) to identify the model and structure. This research proposes a new algorithm for an accurate diagnosis of breast cancer. Results According to the Wisconsin Diagnostic Breast Cancer (WDBC) database, 62.75% of samples were benign, and 37.25% were malignant. After applying the proposed algorithm, we achieved a high detection accuracy of about 96.579% on 205 patients who were diagnosed as having breast cancer. The method had 93% sensitivity, 73% specificity, 65% positive predictive value, and 95% negative predictive value. If done by experts, Fine Needle Aspiration (FNA) can be a reliable replacement for open biopsy in palpable breast masses. Evaluation of FNA samples during aspiration can decrease insufficient samples. FNA can be the first line of diagnosis in women with breast masses, at least in deprived regions, and may increase health standards and clinical supervision of patients. 
Conclusion Such a smart, economical, non-invasive, rapid and accurate system can be introduced as a useful diagnostic tool for the comprehensive treatment of breast cancer. Another advantage of this method is the possibility of diagnosing breast abnormalities. PMID:25352966
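The classifier stage of the pipeline above is Fuzzy C-Means, which assigns each sample a graded membership in every cluster rather than a hard label. A minimal FCM sketch on toy 2-D points (not the WDBC features, and without the GA/EA feature-selection stage the paper combines it with):

```python
import math
import random

def fuzzy_c_means(points, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy c-means: returns cluster centers and the
    membership matrix u, where u[i][j] is the degree to which
    point i belongs to cluster j (rows sum to 1)."""
    rng = random.Random(seed)
    n, d = len(points), len(points[0])
    # random initial memberships, normalized per point
    u = []
    for _ in range(n):
        row = [rng.random() + 1e-9 for _ in range(c)]
        s = sum(row)
        u.append([v / s for v in row])
    centers = [[0.0] * d for _ in range(c)]
    for _ in range(iters):
        # centers: means weighted by memberships raised to the fuzzifier m
        for j in range(c):
            w = [u[i][j] ** m for i in range(n)]
            tot = sum(w)
            centers[j] = [sum(w[i] * points[i][k] for i in range(n)) / tot
                          for k in range(d)]
        # memberships: standard inverse-distance update
        for i in range(n):
            dists = [max(math.dist(points[i], centers[j]), 1e-12)
                     for j in range(c)]
            for j in range(c):
                u[i][j] = 1.0 / sum((dists[j] / dists[k]) ** (2.0 / (m - 1.0))
                                    for k in range(c))
    return centers, u

pts = [(0.1, 0.2), (0.0, -0.1), (0.2, 0.0),
       (5.1, 4.9), (4.8, 5.2), (5.0, 5.0)]
centers, u = fuzzy_c_means(pts)
labels = [max(range(2), key=lambda j: u[i][j]) for i in range(len(pts))]
```

A hard label is recovered by taking each point's maximum membership, but the graded memberships themselves are what make FCM an "exact" (soft) classifier in the paper's terminology.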

  16. [Study on Application of NIR Spectral Information Screening in Identification of Maca Origin].

    PubMed

    Wang, Yuan-zhong; Zhao, Yan-li; Zhang, Ji; Jin, Hang

    2016-02-01

    The medicinal and edible plant Maca is rich in various nutrients and has great medicinal value. Based on near infrared diffuse reflectance spectra, 139 Maca samples collected from Peru and Yunnan were used to identify their geographical origins. Multiplicative signal correction (MSC) coupled with second derivative (SD) and Norris derivative (ND) filtering was employed for spectral pretreatment. The spectral range (7,500-4,061 cm⁻¹) was chosen by spectral standard deviation. Combined with principal component analysis-Mahalanobis distance (PCA-MD), the appropriate number of principal components was selected as 5. Based on the selected spectral range and number of principal components, two abnormal samples were eliminated by a modular group iterative singular sample diagnosis method. Four methods were then used to filter spectral variable information: competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MC-UVE), genetic algorithm (GA) and subwindow permutation analysis (SPA). The filtered spectral variable information was evaluated by model population analysis (MPA). The results showed that RMSECV(SPA) > RMSECV(CARS) > RMSECV(MC-UVE) > RMSECV(GA), at 2.14, 2.05, 2.02, and 1.98, with 250, 240, 250 and 70 spectral variables, respectively. Based on the filtered spectral variables, partial least squares discriminant analysis (PLS-DA) was used to build the model, with 97 randomly selected samples as the training set and the other 40 samples as the validation set. The results showed that R²: GA > MC-UVE > CARS > SPA, and RMSEC and RMSEP: GA < MC-UVE < CARS
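The MSC pretreatment step mentioned above regresses each spectrum on the mean spectrum and removes the fitted additive and multiplicative scatter terms. A minimal sketch on synthetic spectra (illustrative data only, not the Maca measurements):

```python
def msc(spectra):
    """Multiplicative signal (scatter) correction against the mean
    spectrum: fit s ≈ a + b * ref by least squares for each spectrum s,
    then return (s - a) / b."""
    n, p = len(spectra), len(spectra[0])
    ref = [sum(s[k] for s in spectra) / n for k in range(p)]
    ref_mean = sum(ref) / p
    corrected = []
    for s in spectra:
        s_mean = sum(s) / p
        # least-squares slope b = cov(ref, s) / var(ref), intercept a
        cov = sum((ref[k] - ref_mean) * (s[k] - s_mean) for k in range(p))
        var = sum((ref[k] - ref_mean) ** 2 for k in range(p))
        b = cov / var
        a = s_mean - b * ref_mean
        corrected.append([(s[k] - a) / b for k in range(p)])
    return corrected

import math
base = [math.sin(k / 5.0) for k in range(30)]
# three spectra that are affine (offset + gain) distortions of one signal
spectra = [[0.5 + 1.2 * v for v in base],
           [-0.3 + 0.8 * v for v in base],
           [0.1 + 1.0 * v for v in base]]
out = msc(spectra)
```

When the spectra differ only by offset and gain, MSC maps all of them onto the reference spectrum exactly, which is why it precedes derivative filtering in pipelines like the one above.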

  17. Analysis of spatial distribution of land cover maps accuracy

    NASA Astrophysics Data System (ADS)

    Khatami, R.; Mountrakis, G.; Stehman, S. V.

    2017-12-01

    Land cover maps have become one of the most important products of remote sensing science. However, classification errors will exist in any classified map and affect the reliability of subsequent map usage. Moreover, classification accuracy often varies over different regions of a classified map. These variations in accuracy affect the reliability of subsequent analyses of different regions based on the classified maps. The traditional approach to map accuracy assessment, based on an error matrix, does not capture the spatial variation in classification accuracy. Here, per-pixel accuracy prediction methods are proposed based on interpolating accuracy values from a test sample to produce wall-to-wall accuracy maps. Different accuracy prediction methods were developed based on four factors: predictive domain (spatial versus spectral), interpolation function (constant, linear, Gaussian, and logistic), incorporation of class information (interpolating each class separately versus grouping them together), and sample size. This research is the first to use the spectral domain as an explanatory feature space for interpolating classification accuracy. Performance of the prediction methods was evaluated using 26 test blocks, with 10 km × 10 km dimensions, dispersed throughout the United States. The performance of the predictions was evaluated using the area under the curve (AUC) of the receiver operating characteristic. Relative to existing accuracy prediction methods, our proposed methods resulted in improvements of AUC of 0.15 or greater. 
Evaluation of the four factors comprising the accuracy prediction methods demonstrated that: i) interpolations should be done separately for each class instead of grouping all classes together; ii) if an all-classes approach is used, the spectral domain will result in substantially greater AUC than the spatial domain; iii) for the smaller sample size and per-class predictions, the spectral and spatial domain yielded similar AUC; iv) for the larger sample size (i.e., very dense spatial sample) and per-class predictions, the spatial domain yielded larger AUC; v) increasing the sample size improved accuracy predictions with a greater benefit accruing to the spatial domain; and vi) the function used for interpolation had the smallest effect on AUC.

  18. A quick method based on SIMPLISMA-KPLS for simultaneously selecting outlier samples and informative samples for model standardization in near infrared spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Li-Na; Ma, Chang-Ming; Chang, Ming; Zhang, Ren-Cheng

    2017-12-01

    A novel method based on SIMPLe-to-use Interactive Self-modeling Mixture Analysis (SIMPLISMA) and Kernel Partial Least Squares (KPLS), named SIMPLISMA-KPLS, is proposed in this paper for the simultaneous selection of outlier samples and informative samples. It is a quick algorithm for model standardization (also known as model transfer) in near infrared (NIR) spectroscopy. NIR data of corn samples analyzed for protein content are used to evaluate the proposed method. Piecewise direct standardization (PDS) is employed for model transfer, and SIMPLISMA-PDS-KPLS is compared with KS-PDS-KPLS in terms of prediction accuracy for protein content and the calculation speed of each algorithm. The conclusions are that SIMPLISMA-KPLS can be used as an alternative sample selection method for model transfer. Although its accuracy is similar to that of Kennard-Stone (KS), it differs from KS in that it employs concentration information in the selection procedure. This ensures that analyte information is involved in the analysis and that the spectra (X) of the selected samples are correlated with the concentration (y). It can also eliminate outlier samples at the same time, through validation of the calibration. The running-time statistics make clear that the sample selection process is more rapid when using KPLS. The quick SIMPLISMA-KPLS algorithm is beneficial for improving the speed of online measurement using NIR spectroscopy.
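The baseline the paper compares against is Kennard-Stone, which selects calibration samples purely from the spectra (X) by maximizing coverage of the design space; unlike SIMPLISMA-KPLS it never looks at the concentration (y). A minimal KS sketch on 1-D toy "spectra":

```python
import math

def kennard_stone(X, k):
    """Kennard-Stone selection: start with the two most distant samples,
    then repeatedly add the sample whose minimum distance to the
    already-selected set is largest. Returns selected indices."""
    n = len(X)
    d = [[math.dist(X[i], X[j]) for j in range(n)] for i in range(n)]
    # seed with the most distant pair
    i0, j0 = max(((i, j) for i in range(n) for j in range(i + 1, n)),
                 key=lambda p: d[p[0]][p[1]])
    selected = [i0, j0]
    remaining = [i for i in range(n) if i not in selected]
    while len(selected) < k:
        nxt = max(remaining, key=lambda r: min(d[r][s] for s in selected))
        selected.append(nxt)
        remaining.remove(nxt)
    return selected

# ten evenly spaced 1-D samples: KS picks the extremes, then the middle
X = [(float(i),) for i in range(10)]
sel = kennard_stone(X, 3)
```

This max–min behavior is why KS gives good design-space coverage but, as the abstract notes, cannot guarantee that the selected spectra carry analyte information, which is the gap SIMPLISMA-KPLS addresses.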

  19. The Preference for Internet-Based Psychological Interventions by Individuals Without Past or Current Use of Mental Health Treatment Delivered Online: A Survey Study With Mixed-Methods Analysis

    PubMed Central

    Mattsson, Susanne; Olsson, Erik Martin Gustaf

    2016-01-01

    Background The use of the Internet has the potential to increase access to evidence-based mental health services for a far-reaching population at a low cost. However, low take-up rates in routine care indicate that barriers for implementing Internet-based interventions have not yet been fully identified. Objective The aim of this study was to evaluate the preference for Internet-based psychological interventions as compared to treatment delivered face to face among individuals without past or current use of mental health treatment delivered online. A further aim was to investigate predictors of treatment preference and to complement the quantitative analyses with qualitative data about the perceived advantages and disadvantages of Internet-based interventions. Methods Two convenience samples were used. Sample 1 was recruited in an occupational setting (n=231) and Sample 2 consisted of individuals previously treated for cancer (n=208). Data were collected using a paper-and-pencil survey and analyzed using mixed methods. Results The preference for Internet-based psychological interventions was low in both Sample 1 (6.5%) and Sample 2 (2.6%). Most participants preferred psychological interventions delivered face to face. Use of the Internet to search for and read health-related information was a significant predictor of treatment preference in both Sample 1 (odds ratio [OR] 2.82, 95% CI 1.18-6.75) and Sample 2 (OR 3.52, 95% CI 1.33-9.29). Being born outside of Sweden was a significant predictor of preference for Internet-based interventions, but only in Sample 2 (OR 6.24, 95% CI 1.29-30.16). Similar advantages and disadvantages were mentioned in both samples. Perceived advantages of Internet-based interventions included flexibility regarding time and location, low effort, accessibility, anonymity, credibility, user empowerment, and improved communication between therapist and client. 
Perceived disadvantages included anonymity, low credibility, impoverished communication between therapist and client, fear of negative side effects, requirements of computer literacy, and concerns about confidentiality. Conclusions Internet-based interventions were reported as the preferred choice by a minority of participants. The results suggest that Internet-based interventions have specific advantages that may facilitate help-seeking among some individuals and some disadvantages that may restrict its use. Initiatives to increase treatment acceptability may benefit from addressing the advantages and disadvantages reported in this study. PMID:27302200
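The odds ratios with 95% confidence intervals reported above come from logistic regression; the same quantity can be illustrated from a 2×2 table with the standard Wald interval. A minimal sketch with made-up counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome,   b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_ = (a * d) / (b * c)
    # standard error of log(OR)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# hypothetical counts: 10/100 online-searchers vs 5/100 non-searchers
# prefer Internet-based treatment
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```

An OR above 1 with a CI excluding 1 indicates a significant association, which is how predictors such as Internet health-information use were flagged in both samples.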

  20. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site—Working towards a toolbox for better assessment

    EPA Science Inventory

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. En...

Top