NASA Astrophysics Data System (ADS)
Aspinall, M. D.; Joyce, M. J.; Mackin, R. O.; Jarrah, Z.; Boston, A. J.; Nolan, P. J.; Peyton, A. J.; Hawkes, N. P.
2009-01-01
A unique, digital time pick-off method, known as sample-interpolation timing (SIT), is described. This method demonstrates the possibility of improved timing resolution for the digital measurement of time of flight compared with digital replica-analogue time pick-off methods for signals sampled at relatively low rates. Three analogue timing methods have been replicated in the digital domain (leading-edge, crossover and constant-fraction timing) for pulse data sampled at 8 GSa s⁻¹. Events arising from the ⁷Li(p, n)⁷Be reaction have been detected with an EJ-301 organic liquid scintillator and recorded with a fast digital sampling oscilloscope. Sample-interpolation timing was developed solely for the digital domain and thus performs more efficiently on digital signals than analogue time pick-off methods replicated digitally, especially for fast signals sampled at rates that current affordable and portable devices can achieve. Sample interpolation can be applied to any analogue timing method replicated digitally and thus also has the potential to exploit the generic capabilities of analogue techniques with the benefits of operating in the digital domain. A threshold in sampling rate with respect to the signal pulse width is observed beyond which further improvements in timing resolution are not attained. This advance is relevant to many applications in which time-of-flight measurement is essential.
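As a minimal illustration of how sub-sample timing can be recovered by interpolating between digitized samples (a sketch of the general idea only, not the authors' SIT algorithm; the pulse shape, sampling rate and threshold below are assumptions made for the example):

```python
import numpy as np

def interpolated_threshold_time(samples, dt, threshold):
    """Return the sub-sample time at which a rising pulse crosses `threshold`.

    Linear interpolation between the last sample below and the first sample
    above the threshold gives a time pick-off finer than the sampling period.
    """
    above = np.nonzero(samples >= threshold)[0]
    if len(above) == 0 or above[0] == 0:
        raise ValueError("no threshold crossing found")
    i = above[0]
    # Fraction of a sampling interval between sample i-1 and the crossing.
    frac = (threshold - samples[i - 1]) / (samples[i] - samples[i - 1])
    return (i - 1 + frac) * dt

# Example: a Gaussian-like pulse sampled at 1 GSa/s (dt = 1 ns).
dt = 1e-9
t = np.arange(0, 50e-9, dt)
pulse = np.exp(-((t - 20e-9) / 5e-9) ** 2)
print(interpolated_threshold_time(pulse, dt, 0.5))  # leading-edge pick-off time
```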
Foo, Lee Kien; McGree, James; Duffull, Stephen
2012-01-01
Optimal design methods have been proposed to determine the best sampling times when sparse blood sampling is required in clinical pharmacokinetic studies. However, the optimal blood sampling time points may not be feasible in clinical practice. Sampling windows, a time interval for blood sample collection, have been proposed to provide flexibility in blood sampling times while preserving efficient parameter estimation. Because of the complexity of the population pharmacokinetic models, which are generally nonlinear mixed effects models, there is no analytical solution available to determine sampling windows. We propose a method for determination of sampling windows based on MCMC sampling techniques. The proposed method attains a stationary distribution rapidly and provides time-sensitive windows around the optimal design points. The proposed method is applicable to determine sampling windows for any nonlinear mixed effects model although our work focuses on an application to population pharmacokinetic models. Copyright © 2012 John Wiley & Sons, Ltd.
Lungu, Bwalya; Waltman, W Douglas; Berghaus, Roy D; Hofacre, Charles L
2012-04-01
Conventional culture methods have traditionally been considered the "gold standard" for the isolation and identification of foodborne bacterial pathogens. However, culture methods are labor-intensive and time-consuming. A Salmonella enterica serotype Enteritidis-specific real-time PCR assay that recently received interim approval by the National Poultry Improvement Plan for the detection of Salmonella Enteritidis was evaluated against a culture method that had also received interim National Poultry Improvement Plan approval for the analysis of environmental samples from integrated poultry houses. The method was validated with 422 field samples collected by either the boot sock or drag swab method. The samples were cultured by selective enrichment in tetrathionate broth followed by transfer onto a modified semisolid Rappaport-Vassiliadis medium and then plating onto brilliant green with novobiocin and xylose lysine brilliant Tergitol 4 plates. One-milliliter aliquots of the selective enrichment broths from each sample were collected for DNA extraction by the commercial PrepSEQ nucleic acid extraction assay and analysis by the Salmonella Enteritidis-specific real-time PCR assay. The real-time PCR assay detected no significant differences between the boot sock and drag swab samples. In contrast, the culture method detected a significantly higher number of positive samples from boot socks. The diagnostic sensitivity of the real-time PCR assay for the field samples was significantly higher than that of the culture method. The kappa value obtained was 0.46, indicating moderate agreement between the real-time PCR assay and the culture method. In addition, the real-time PCR method had a turnaround time of 2 days compared with 4 to 8 days for the culture method. The higher sensitivity as well as the reduction in time and labor makes this real-time PCR assay an excellent alternative to conventional culture methods for diagnostic purposes, surveillance, and research studies to improve food safety.
Application of work sampling technique to analyze logging operations.
Edwin S. Miyata; Helmuth M. Steinhilb; Sharon A. Winsauer
1981-01-01
Discusses the advantages and disadvantages of various time study methods for determining efficiency and productivity in logging. The work sampling method is compared with the continuous time-study method, and its feasibility, capabilities, and limitations are given.
Tao, Guohua; Miller, William H
2011-07-14
An efficient time-dependent importance sampling method is developed for the Monte Carlo calculation of time correlation functions via the initial value representation (IVR) of semiclassical (SC) theory. A prefactor-free time-dependent sampling function weights the importance of a trajectory based on the magnitude of its contribution to the time correlation function, and global trial moves are used to facilitate efficient sampling of the phase space of initial conditions. The method can generally be applied to sample rare events efficiently while avoiding being trapped in a local region of phase space. Results presented in the paper for two system-bath models demonstrate the efficiency of this new importance sampling method for full SC-IVR calculations.
Representativeness of direct observations selected using a work-sampling equation.
Sharp, Rebecca A; Mudford, Oliver C; Elliffe, Douglas
2015-01-01
Deciding on appropriate sampling to obtain representative samples of behavior is important but not straightforward, because the relative duration of the target behavior may affect its observation in a given sampling interval. Work-sampling methods, which offer a way to adjust the frequency of sampling according to a priori or ongoing estimates of the behavior to achieve a preselected level of representativeness, may provide a solution. Full-week observations of 7 behaviors were conducted for 3 students with autism spectrum disorder and intellectual disabilities. Work-sampling methods were used to select momentary time samples from the full time-of-interest, which produced representative samples. However, work sampling required impractically high numbers of time samples to obtain representative samples. More practical momentary time samples produced less representative samples, particularly for low-duration behaviors. The utility and limits of work-sampling methods for applied behavior analysis are discussed. © Society for the Experimental Analysis of Behavior.
Comparison of preprocessing methods and storage times for touch DNA samples
Dong, Hui; Wang, Jing; Zhang, Tao; Ge, Jian-ye; Dong, Ying-qiang; Sun, Qi-fan; Liu, Chao; Li, Cai-xia
2017-01-01
Aim: To select appropriate preprocessing methods for different substrates by comparing the effects of four different preprocessing methods on touch DNA samples, and to determine the effect of various storage times on the results of touch DNA sample analysis. Method: Hand touch DNA samples were used to investigate the detection and inspection results of DNA on different substrates. Four preprocessing methods, including the direct cutting method, stubbing procedure, double swab technique, and vacuum cleaner method, were used in this study. DNA was extracted from mock samples with the four preprocessing methods. The best preprocessing protocol determined from the study was further used to compare performance after various storage times. DNA extracted from all samples was quantified and amplified using standard procedures. Results: The amounts of DNA and the numbers of alleles detected on the porous substrates were greater than those on the non-porous substrates. The performance of the four preprocessing methods varied with different substrates. The direct cutting method displayed advantages for porous substrates, and the vacuum cleaner method was advantageous for non-porous substrates. No significant degradation trend was observed as the storage times increased. Conclusion: Different substrates require the use of different preprocessing methods in order to obtain the highest DNA amount and allele number from touch DNA samples. This study provides a theoretical basis for the examination of touch DNA samples and may be used as a reference when dealing with touch DNA samples in casework. PMID:28252870
Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi
2016-02-01
Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
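A minimal sketch of the recommended approach, fitting a parametric distribution to interval-censored retention-time data by maximum likelihood on the interval probabilities; the counts, interval bounds and lognormal choice below are illustrative assumptions, not data from the study:

```python
import numpy as np
from scipy import stats, optimize

# Hypothetical retention-time data: propagules recovered per sampling interval (hours).
bounds = np.array([(0, 2), (2, 4), (4, 8), (8, 12), (12, 24)], float)
counts = np.array([12, 55, 160, 140, 33])

def neg_loglik(params):
    """Interval-censored log-likelihood for a lognormal retention-time model."""
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    cdf = lambda t: stats.lognorm.cdf(t, s=sigma, scale=np.exp(mu))
    p = cdf(bounds[:, 1]) - cdf(bounds[:, 0])   # probability of each interval
    p = np.clip(p, 1e-12, 1.0)
    return -np.sum(counts * np.log(p))

fit = optimize.minimize(neg_loglik, x0=[np.log(6.0), 0.5], method="Nelder-Mead")
mu_hat, sigma_hat = fit.x
print(mu_hat, sigma_hat)
```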
Létant, Sonia E; Murphy, Gloria A; Alfaro, Teneile M; Avila, Julie R; Kane, Staci R; Raber, Ellen; Bunt, Thomas M; Shah, Sanjiv R
2011-09-01
In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples.
Comparison of Techniques for Sampling Adult Necrophilous Insects From Pig Carcasses.
Cruise, Angela; Hatano, Eduardo; Watson, David W; Schal, Coby
2018-02-06
Studies of the pre-colonization interval and mechanisms driving necrophilous insect ecological succession depend on effective sampling of adult insects and knowledge of their diel and successional activity patterns. The number of insects trapped, their diversity, and their diel periodicity were compared among four sampling methods on neonate pigs. Sampling method, time of day, and decomposition age of the pigs significantly affected the number of insects sampled from pigs. We also found significant interactions between sampling method and decomposition day, and between time of sampling and decomposition day. No single method was superior to the other methods during all three decomposition days. Sampling times after noon yielded the largest samples during the first 2 d of decomposition. On day 3 of decomposition, however, all sampling times were equally effective. Therefore, to maximize insect collections from neonate pigs, the method used to sample must vary by decomposition day. The suction trap collected the most species-rich samples, but sticky trap samples were the most diverse when both species richness and evenness were factored into a Shannon diversity index. Repeated sampling during the noon to 18:00 hours period was most effective for obtaining the maximum diversity of trapped insects. The integration of multiple sampling techniques would most effectively sample the necrophilous insect community. However, because all four tested methods were deficient at sampling beetle species, future work should focus on optimizing the most promising methods, alone or in combinations, and incorporate hand collections of beetles. © The Author(s) 2018. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Sommer, D; Enderlein, D; Antakli, A; Schönenbrücher, H; Slaghuis, J; Redmann, T; Lierz, M
2012-01-01
The efficiency of two commercial PCR methods based on real-time technology, the foodproof® Salmonella detection system and the BAX® PCR Assay Salmonella system, was compared to standardized culture methods (EN ISO 6579:2002 - Annex D) for the detection of Salmonella spp. in poultry samples. Four sample matrices (feed, dust, boot swabs, feces) obtained directly from poultry flocks, as well as artificially spiked samples of the same matrices, were used. All samples were first tested for Salmonella spp. using culture methods as the gold standard. In addition, samples spiked with Salmonella Enteritidis were tested to evaluate the sensitivity of both PCR methods. Furthermore, all methods were evaluated in an annual ring trial of the National Salmonella Reference Laboratory of Germany. Salmonella detection in the feed, dust and boot swab matrices was comparable between both PCR systems, whereas the results from feces differed markedly. The quality, especially the freshness, of the fecal samples had an influence on the sensitivity of the real-time PCR and on the results of the culture methods. In fresh fecal samples an initial spiking level of 100 cfu/25 g Salmonella Enteritidis was detected. Fecal samples dried for two days allowed the detection of 14 cfu/25 g. Both real-time PCR protocols appear to be suitable for the detection of Salmonella spp. in all four matrices. The foodproof® system detected eight more samples as positive than the BAX® system, but had a potential false-positive result in one case. In samples dried for seven days, none of the methods was able to detect Salmonella, likely because of lethal cell damage. In general, the advantage of PCR analysis over the culture method is the reduction of working time from 4-5 days to only 2 days. However, especially for the analysis of fecal samples, official validation should be conducted according to the requirements of EN ISO 6579:2002 - Annex D.
An evaluation of flow-stratified sampling for estimating suspended sediment loads
Robert B. Thomas; Jack Lewis
1995-01-01
Abstract - Flow-stratified sampling is a new method for sampling water quality constituents such as suspended sediment to estimate loads. As with selection-at-list-time (SALT) and time-stratified sampling, flow-stratified sampling is a statistical method requiring random sampling, and yielding unbiased estimates of load and variance. It can be used to estimate event...
Transformation-cost time-series method for analyzing irregularly sampled data
NASA Astrophysics Data System (ADS)
Ozken, Ibrahim; Eroglu, Deniz; Stemler, Thomas; Marwan, Norbert; Bagci, G. Baris; Kurths, Jürgen
2015-06-01
Irregular sampling of data sets is one of the challenges often encountered in time-series analysis, since traditional methods cannot be applied and the frequently used interpolation approach can corrupt the data and bias the subsequence analysis. Here we present the TrAnsformation-Cost Time-Series (TACTS) method, which allows us to analyze irregularly sampled data sets without degenerating the quality of the data set. Instead of using interpolation we consider time-series segments and determine how close they are to each other by determining the cost needed to transform one segment into the following one. Using a limited set of operations—with associated costs—to transform the time series segments, we determine a new time series, that is our transformation-cost time series. This cost time series is regularly sampled and can be analyzed using standard methods. While our main interest is the analysis of paleoclimate data, we develop our method using numerical examples like the logistic map and the Rössler oscillator. The numerical data allows us to test the stability of our method against noise and for different irregular samplings. In addition we provide guidance on how to choose the associated costs based on the time series at hand. The usefulness of the TACTS method is demonstrated using speleothem data from the Secret Cave in Borneo that is a good proxy for paleoclimatic variability in the monsoon activity around the maritime continent.
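A heavily simplified sketch of the transformation-cost idea, turning an irregularly sampled series into a regularly spaced cost series; the cost function below (in-order matching, linear time-shift and amplitude penalties, fixed creation/deletion cost) is an illustrative stand-in, not the exact cost definition used by TACTS:

```python
import numpy as np

def transformation_cost(seg_a, seg_b, lam_t=1.0, lam_a=1.0, lam_0=1.0):
    """Simplified transformation cost between two irregularly sampled segments.

    Each segment is an array of (time, amplitude) rows. Points matched in
    temporal order are charged for time shifts and amplitude changes; any
    surplus points in the longer segment are charged a creation/deletion cost.
    """
    a, b = np.asarray(seg_a, float), np.asarray(seg_b, float)
    n = min(len(a), len(b))
    shift_cost = lam_t * np.abs(a[:n, 0] - b[:n, 0]).sum()
    ampl_cost = lam_a * np.abs(a[:n, 1] - b[:n, 1]).sum()
    count_cost = lam_0 * abs(len(a) - len(b))
    return shift_cost + ampl_cost + count_cost

# Sliding a fixed window over an irregular series yields a regularly sampled
# "cost time series" that standard methods can then analyse.
rng_t, rng_v = np.random.default_rng(0), np.random.default_rng(1)
times = np.sort(rng_t.uniform(0, 100, 200))
values = np.sin(0.3 * times) + 0.1 * rng_v.normal(size=200)
series = np.column_stack([times, values])
window = 10.0
edges = np.arange(0, 100, window)
segments = [series[(times >= t0) & (times < t0 + window)] for t0 in edges]
cost_series = [transformation_cost(s1, s2) for s1, s2 in zip(segments, segments[1:])]
print(cost_series[:5])
```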
Sharifdini, Meysam; Mirhendi, Hossein; Ashrafi, Keyhan; Hosseini, Mostafa; Mohebali, Mehdi; Khodadadi, Hossein; Kia, Eshrat Beigom
2015-01-01
This study was performed to evaluate nested polymerase chain reaction (PCR) and real-time PCR methods for detection of Strongyloides stercoralis in fecal samples compared with parasitological methods. A total of 466 stool samples were examined by conventional parasitological methods (formalin ether concentration [FEC] and agar plate culture [APC]). DNA was extracted using an in-house method, and the mitochondrial cytochrome c oxidase subunit 1 and 18S ribosomal genes were amplified by nested PCR and real-time PCR, respectively. Among the 466 samples, 12.7% and 18.2% were found infected with S. stercoralis by FEC and APC, respectively. DNA of S. stercoralis was detected in 18.9% and 25.1% of samples by real-time PCR and nested PCR, respectively. Considering parasitological methods as the diagnostic gold standard, the sensitivity and specificity of nested PCR were 100% and 91.6%, respectively, and those of real-time PCR were 84.7% and 95.8%, respectively. However, considering sequence analyses of the selected nested PCR products, the specificity of nested PCR is increased. In general, molecular methods were superior to parasitological methods; they were more sensitive and more reliable for detection of S. stercoralis. Between the two molecular methods, the sensitivity of nested PCR was higher than that of real-time PCR. PMID:26350449
Robert B. Thomas; Jack Lewis
1993-01-01
Time-stratified sampling of sediment for estimating suspended load is introduced and compared to selection at list time (SALT) sampling. Both methods provide unbiased estimates of load and variance. The magnitude of the variance of the two methods is compared using five storm populations of suspended sediment flux derived from turbidity data. Under like conditions,...
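Both SALT and time-stratified sampling rest on classical stratified estimation; a minimal sketch of the stratified estimator of a total load and its variance (the strata sizes and flux values below are made up for illustration):

```python
import numpy as np

def stratified_total(strata):
    """Estimate a total load and its variance from stratified random samples.

    `strata` is a list of (N_h, y_h) pairs: the number of sampling units in
    stratum h and the measured fluxes for the units actually sampled.
    """
    total, var = 0.0, 0.0
    for N_h, y_h in strata:
        y_h = np.asarray(y_h, float)
        n_h = len(y_h)
        total += N_h * y_h.mean()
        # Finite-population-corrected variance of the stratum total.
        var += N_h ** 2 * (1 - n_h / N_h) * y_h.var(ddof=1) / n_h
    return total, var

# Hypothetical storm: three time strata of 48 ten-minute units each,
# with suspended-sediment flux (kg per unit) measured on a few units per stratum.
strata = [(48, [1.2, 1.5, 1.1]), (48, [6.3, 7.9, 5.4, 6.8]), (48, [2.2, 1.9, 2.5])]
print(stratified_total(strata))
```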
Velasco, Valeria; Sherwood, Julie S.; Rojas-García, Pedro P.; Logue, Catherine M.
2014-01-01
The aim of this study was to compare a real-time PCR assay, with a conventional culture/PCR method, to detect S. aureus, mecA and Panton-Valentine Leukocidin (PVL) genes in animals and retail meat, using a two-step selective enrichment protocol. A total of 234 samples were examined (77 animal nasal swabs, 112 retail raw meat, and 45 deli meat). The multiplex real-time PCR targeted the genes: nuc (identification of S. aureus), mecA (associated with methicillin resistance) and PVL (virulence factor), and the primary and secondary enrichment samples were assessed. The conventional culture/PCR method included the two-step selective enrichment, selective plating, biochemical testing, and multiplex PCR for confirmation. The conventional culture/PCR method recovered 95/234 positive S. aureus samples. Application of real-time PCR on samples following primary and secondary enrichment detected S. aureus in 111/234 and 120/234 samples respectively. For detection of S. aureus, the kappa statistic was 0.68–0.88 (from substantial to almost perfect agreement) and 0.29–0.77 (from fair to substantial agreement) for primary and secondary enrichments, using real-time PCR. For detection of mecA gene, the kappa statistic was 0–0.49 (from no agreement beyond that expected by chance to moderate agreement) for primary and secondary enrichment samples. Two pork samples were mecA gene positive by all methods. The real-time PCR assay detected the mecA gene in samples that were negative for S. aureus, but positive for Staphylococcus spp. The PVL gene was not detected in any sample by the conventional culture/PCR method or the real-time PCR assay. Among S. aureus isolated by conventional culture/PCR method, the sequence type ST398, and multi-drug resistant strains were found in animals and raw meat samples. The real-time PCR assay may be recommended as a rapid method for detection of S. aureus and the mecA gene, with further confirmation of methicillin-resistant S. aureus (MRSA) using the standard culture method. PMID:24849624
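The reported agreement statistics can be reproduced with a standard Cohen's kappa calculation on a 2x2 table of paired results; the split between concordant and discordant positives below is an assumption for illustration, not the study's table:

```python
def cohens_kappa(both_pos, pcr_only, culture_only, both_neg):
    """Cohen's kappa for agreement between two detection methods on the same samples."""
    n = both_pos + pcr_only + culture_only + both_neg
    observed = (both_pos + both_neg) / n
    # Chance agreement from the marginal positive/negative rates of each method.
    p_pcr_pos = (both_pos + pcr_only) / n
    p_cult_pos = (both_pos + culture_only) / n
    expected = p_pcr_pos * p_cult_pos + (1 - p_pcr_pos) * (1 - p_cult_pos)
    return (observed - expected) / (1 - expected)

# Illustrative counts only (234 samples, 111 PCR-positive, 95 culture-positive).
print(cohens_kappa(both_pos=90, pcr_only=21, culture_only=5, both_neg=118))
```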
Velasco, Valeria; Sherwood, Julie S; Rojas-García, Pedro P; Logue, Catherine M
2014-01-01
The aim of this study was to compare a real-time PCR assay, with a conventional culture/PCR method, to detect S. aureus, mecA and Panton-Valentine Leukocidin (PVL) genes in animals and retail meat, using a two-step selective enrichment protocol. A total of 234 samples were examined (77 animal nasal swabs, 112 retail raw meat, and 45 deli meat). The multiplex real-time PCR targeted the genes: nuc (identification of S. aureus), mecA (associated with methicillin resistance) and PVL (virulence factor), and the primary and secondary enrichment samples were assessed. The conventional culture/PCR method included the two-step selective enrichment, selective plating, biochemical testing, and multiplex PCR for confirmation. The conventional culture/PCR method recovered 95/234 positive S. aureus samples. Application of real-time PCR on samples following primary and secondary enrichment detected S. aureus in 111/234 and 120/234 samples respectively. For detection of S. aureus, the kappa statistic was 0.68-0.88 (from substantial to almost perfect agreement) and 0.29-0.77 (from fair to substantial agreement) for primary and secondary enrichments, using real-time PCR. For detection of mecA gene, the kappa statistic was 0-0.49 (from no agreement beyond that expected by chance to moderate agreement) for primary and secondary enrichment samples. Two pork samples were mecA gene positive by all methods. The real-time PCR assay detected the mecA gene in samples that were negative for S. aureus, but positive for Staphylococcus spp. The PVL gene was not detected in any sample by the conventional culture/PCR method or the real-time PCR assay. Among S. aureus isolated by conventional culture/PCR method, the sequence type ST398, and multi-drug resistant strains were found in animals and raw meat samples. The real-time PCR assay may be recommended as a rapid method for detection of S. aureus and the mecA gene, with further confirmation of methicillin-resistant S. aureus (MRSA) using the standard culture method.
Method and apparatus for measuring nuclear magnetic properties
Weitekamp, D.P.; Bielecki, A.; Zax, D.B.; Zilm, K.W.; Pines, A.
1987-12-01
A method for studying the chemical and structural characteristics of materials is disclosed. The method includes placement of a sample material in a high strength polarizing magnetic field to order the sample nuclei. The condition used to order the sample is then removed abruptly and the ordering of the sample allowed to evolve for a time interval. At the end of the time interval, the ordering of the sample is measured by conventional nuclear magnetic resonance techniques. 5 figs.
Method and apparatus for measuring nuclear magnetic properties
Weitekamp, Daniel P.; Bielecki, Anthony; Zax, David B.; Zilm, Kurt W.; Pines, Alexander
1987-01-01
A method for studying the chemical and structural characteristics of materials is disclosed. The method includes placement of a sample material in a high strength polarizing magnetic field to order the sample nuclei. The condition used to order the sample is then removed abruptly and the ordering of the sample allowed to evolve for a time interval. At the end of the time interval, the ordering of the sample is measured by conventional nuclear magnetic resonance techniques.
Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering
NASA Astrophysics Data System (ADS)
Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki
2018-03-01
We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
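A minimal sketch of multiple importance sampling with the balance heuristic, the combination rule referred to above; the one-dimensional integrand and the two strategies (a uniform, stratified-like one and a peaked importance one) are toy assumptions rather than the paper's SSAO setup:

```python
import numpy as np

def mis_estimate(f, strategies, rng):
    """Multiple importance sampling with the balance heuristic.

    `strategies` is a list of (n_samples, sampler, pdf); every pdf must be
    evaluable at samples drawn from any of the strategies.
    """
    counts = np.array([n for n, _, _ in strategies], float)
    total = 0.0
    for i, (n, sampler, _) in enumerate(strategies):
        for _ in range(n):
            x = sampler(rng)
            pdfs = np.array([pdf(x) for _, _, pdf in strategies])
            w = counts[i] * pdfs[i] / np.dot(counts, pdfs)   # balance heuristic
            total += w * f(x) / (counts[i] * pdfs[i])
    return total

# Toy example: a sharply peaked integrand estimated by combining a broad
# uniform strategy with a narrow importance strategy centred on the peak.
rng = np.random.default_rng(0)
f = lambda x: np.exp(-200.0 * (x - 0.9) ** 2)          # true integral ~ 0.1253
uniform = (64, lambda r: r.uniform(0.0, 1.0),
           lambda x: 1.0 if 0.0 <= x <= 1.0 else 0.0)
gauss_pdf = lambda x: np.exp(-(x - 0.9) ** 2 / (2 * 0.05 ** 2)) / (0.05 * np.sqrt(2 * np.pi))
peaked = (64, lambda r: r.normal(0.9, 0.05), gauss_pdf)
print(mis_estimate(f, [uniform, peaked], rng))
```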
Motamedi, Marjan; Mirhendi, Hossein; Zomorodian, Kamiar; Khodadadi, Hossein; Kharazi, Mahboobeh; Ghasemi, Zeinab; Shidfar, Mohammad Reza; Makimura, Koichi
2017-10-01
Following our previous report on evaluation of the beta-tubulin real-time PCR for detection of dermatophytosis, this study aimed to compare the real-time PCR assay with conventional methods for the clinical assessment of its diagnostic performance. Samples from a total of 853 patients with suspected dermatophyte lesions were subjected to direct examination (all samples), culture (499 samples) and real-time PCR (all samples). Fungal DNA was extracted directly from clinical samples using a conical steel bullet, followed by purification with a commercial kit, and subjected to the TaqMan probe-based real-time PCR. The study showed that among the 499 specimens for which all three methods were used, 156 (31.2%), 128 (25.6%) and 205 (41.0%) were found to be positive by direct microscopy, culture and real-time PCR, respectively. Real-time PCR significantly increased the detection rate of dermatophytes compared with microscopy (288 vs 229), with 87% concordance between the two methods. The sensitivity, specificity, positive predictive value, and negative predictive value of the real-time PCR were 87.5%, 85%, 66.5% and 95.2%, respectively. Although real-time PCR performed better on skin than on nail samples, it should not yet fully replace conventional diagnosis. © 2017 Blackwell Verlag GmbH.
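The reported diagnostic indices follow from a standard 2x2 contingency-table calculation; a short sketch (the counts are illustrative and are not the study's table):

```python
def diagnostic_performance(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table of test results."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only: real-time PCR results against a reference standard.
print(diagnostic_performance(tp=196, fp=92, fn=28, tn=537))
```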
Marinelli, L; Cottarelli, A; Solimini, A G; Del Cimmuto, A; De Giusti, M
2017-01-01
In this study we estimated the presence of viable but non-culturable (VBNC) Legionella species in hospital water networks. We also evaluated the time to appearance and the load of Legionella in samples found negative using the standard culture method. A total of 42 samples was obtained from the tap water of five hospital buildings. The samples were tested for Legionella by the standard culture method and were monitored for up to 12 months for the appearance of VBNC Legionella. All 42 samples were negative at the time of collection. Seven of the 42 samples (17.0%) became positive for Legionella at different times of monitoring. The time to the appearance of VBNC Legionella was extremely variable, from 15 days to 9 months after sampling. The most frequent Legionella species observed were Legionella spp. and L. anisa, and only one sample contained L. pneumophila serogroup 1. Our study confirms the presence of VBNC Legionella in samples found negative using the standard culture method and highlights the variable time to its appearance, which can occur several months after sampling. The results are important for risk assessment and risk management of engineered water systems.
Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel
2004-01-01
An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.
Sampling maternal care behaviour in domestic dogs: What's the best approach?
Czerwinski, Veronika H; Smith, Bradley P; Hynd, Philip I; Hazel, Susan J
2017-07-01
Our understanding of the frequency and duration of maternal care behaviours in the domestic dog during the first two postnatal weeks is limited, largely due to the inconsistencies in the sampling methodologies that have been employed. In order to develop a more concise picture of maternal care behaviour during this period, and to help establish the sampling method that best represents these behaviours, we compared a variety of time sampling methods. Six litters were continuously observed for a total of 96 h over postnatal days 3, 6, 9 and 12 (24 h per day). Frequent (dam presence, nursing duration, contact duration) and infrequent maternal behaviours (anogenital licking duration and frequency) were coded using five different time sampling methods: 12-h night (1800-0600 h), 12-h day (0600-1800 h), one hour during the night (1800-0600 h), one hour during the day (0600-1800 h) and one hour at any time. Each one-hour time sampling method consisted of four randomly chosen 15-min periods. Two random sets of four 15-min periods were also analysed to ensure reliability. We then determined which of the time sampling methods, averaged over the three 24-h periods, best represented the frequency and duration of behaviours. As might be expected, frequently occurring behaviours were adequately represented by short (one-hour) sampling periods; however, this was not the case for the infrequent behaviour. Thus, we argue that the time sampling methodology employed must match the behaviour of interest. This caution applies to maternal behaviour in altricial species, such as canids, as well as to all systematic behavioural observations utilising time sampling methodology. Copyright © 2017. Published by Elsevier B.V.
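A small sketch of how momentary time sampling at different intervals can be compared against a continuous record, in the spirit of the comparison described above; the simulated bout structure and the sampling intervals are assumptions made for illustration:

```python
import numpy as np

def momentary_time_sample(record, interval_s):
    """Proportion of momentary samples in which the behaviour was occurring."""
    return record[::interval_s].mean()

# Simulated 12-h continuous record (1 value per second) of a bout-structured behaviour.
rng = np.random.default_rng(3)
seconds = 12 * 3600
record = np.zeros(seconds, dtype=bool)
t = 0.0
while t < seconds:
    t += rng.exponential(1800)            # gap between bouts (s)
    bout = int(rng.exponential(300))      # bout length (s)
    record[int(t):int(t) + bout] = True
    t += bout

true_proportion = record.mean()
for interval in (60, 300, 900):           # sample every 1, 5 or 15 minutes
    print(interval, momentary_time_sample(record, interval), true_proportion)
```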
Shin, Saeam; Kim, Juwon; Kim, Yoonjung; Cho, Sun-Mi; Lee, Kyung-A
2017-10-26
EGFR mutation is an emerging biomarker for treatment selection in non-small-cell lung cancer (NSCLC) patients. However, optimal mutation detection is hindered by complications associated with the biopsy procedure, tumor heterogeneity and limited sensitivity of test methodology. In this study, we evaluated the diagnostic utility of real-time PCR using malignant pleural effusion samples. A total of 77 pleural fluid samples from 77 NSCLC patients were tested using the cobas EGFR mutation test (Roche Molecular Systems). Pleural fluid was centrifuged, and separated cell pellets and supernatants were tested in parallel. Results were compared with Sanger sequencing and/or peptide nucleic acid (PNA)-mediated PCR clamping of matched tumor tissue or pleural fluid samples. All samples showed valid real-time PCR results in one or more DNA samples extracted from cell pellets and supernatants. Compared with other molecular methods, the sensitivity of real-time PCR method was 100%. Concordance rate of real-time PCR and Sanger sequencing plus PNA-mediated PCR clamping was 98.7%. We have confirmed that real-time PCR using pleural fluid had a high concordance rate compared to conventional methods, with no failed samples. Our data demonstrated that the parallel real-time PCR testing using supernatant and cell pellet could offer reliable and robust surrogate strategy when tissue is not available.
Evaluating Composite Sampling Methods of Bacillus Spores at Low Concentrations
Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.; Hutchison, Janine R.
2016-01-01
Restoring all facility operations after the 2001 Amerithrax attacks took years to complete, highlighting the need to reduce remediation time. Some of the most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium single pass composite (SM-SPC): a single cellulose sponge samples multiple coupons with a single pass across each coupon; 2) single medium multi-pass composite (SM-MPC): a single cellulose sponge samples multiple coupons with multiple passes across each coupon; and 3) multi-medium post-sample composite (MM-MPC): a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm²). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted dry wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p < 0.0001) and coupon material (p = 0.0006). Recovery efficiency (RE) was higher overall using the MM-MPC method compared to the SM-SPC and SM-MPC methods. RE with the MM-MPC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, dry wall, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event on clean or dirty surfaces. PMID:27736999
Detecting the sampling rate through observations
NASA Astrophysics Data System (ADS)
Shoji, Isao
2018-09-01
This paper proposes a method to detect the sampling rate of discrete time series of diffusion processes. Using the maximum likelihood estimates of the parameters of a diffusion process, we establish a criterion based on the Kullback-Leibler divergence and thereby estimate the sampling rate. Simulation studies are conducted to check whether the method can detect the sampling rates from data, and their results show a good performance in the detection. In addition, the method is applied to a financial time series sampled on a daily basis and shows that the detected sampling rate is different from the conventional rates.
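As a simplified stand-in for the likelihood/KL-based criterion (assuming, for illustration, an Ornstein-Uhlenbeck process with known drift and diffusion parameters, and selecting the candidate sampling interval with the highest exact transition likelihood; this is not the paper's exact procedure):

```python
import numpy as np

def ou_loglik(x, theta, sigma, dt):
    """Exact Gaussian transition log-likelihood of an OU process sampled at interval dt."""
    a = np.exp(-theta * dt)
    var = sigma ** 2 * (1.0 - a ** 2) / (2.0 * theta)
    resid = x[1:] - a * x[:-1]
    return -0.5 * np.sum(np.log(2 * np.pi * var) + resid ** 2 / var)

# Simulate an OU path at a "true" but unrecorded sampling interval.
rng = np.random.default_rng(42)
theta, sigma, dt_true, n = 2.0, 1.0, 0.05, 5000
x = np.zeros(n)
a = np.exp(-theta * dt_true)
sd = sigma * np.sqrt((1 - a ** 2) / (2 * theta))
for k in range(1, n):
    x[k] = a * x[k - 1] + sd * rng.normal()

# Score candidate sampling intervals and pick the best-supported one.
candidates = np.array([0.01, 0.02, 0.05, 0.1, 0.2])
scores = [ou_loglik(x, theta, sigma, dt) for dt in candidates]
print(candidates[int(np.argmax(scores))])
```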
Tao, Guohua; Miller, William H
2012-09-28
An efficient time-dependent (TD) Monte Carlo (MC) importance sampling method has recently been developed [G. Tao and W. H. Miller, J. Chem. Phys. 135, 024104 (2011)] for the evaluation of time correlation functions using the semiclassical (SC) initial value representation (IVR) methodology. In this TD-SC-IVR method, the MC sampling uses information from both time-evolved phase points as well as their initial values, and only the "important" trajectories are sampled frequently. Even though the TD-SC-IVR was shown in some benchmark examples to be much more efficient than the traditional time-independent sampling method (which uses only initial conditions), the calculation of the SC prefactor, which is computationally expensive, especially for large systems, is still required for accepted trajectories. In the present work, we present an approximate implementation of the TD-SC-IVR method that is completely prefactor-free; it gives the time correlation function as a classical-like magnitude function multiplied by a phase function. Application of this approach to flux-flux correlation functions (which yield reaction rate constants) for the benchmark H + H₂ system shows very good agreement with exact quantum results. Limitations of the approximate approach are also discussed.
RAPID SEPARATION METHOD FOR EMERGENCY WATER AND URINE SAMPLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, S.; Culligan, B.
2008-08-27
The Savannah River Site Environmental Bioassay Lab participated in the 2008 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2008. A new rapid column separation method was used for analysis of actinides and ⁹⁰Sr in the NRIP 2008 emergency water and urine samples. Significant method improvements were applied to reduce analytical times. As a result, much faster analysis times were achieved, less than 3 hours for determination of ⁹⁰Sr and 3-4 hours for actinides. This represents a 25%-33% improvement in analysis times from NRIP 2007 and a ~100% improvement compared to NRIP 2006 report times. Column flow rates were increased by a factor of two, with no significant adverse impact on the method performance. Larger sample aliquots, shorter count times, faster cerium fluoride microprecipitation and streamlined calcium phosphate precipitation were also employed. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and ⁹⁰Sr analyses for NRIP 2008 emergency urine samples. High levels of potential matrix interferences may be present in emergency samples and rugged methods are essential. Extremely high levels of ²¹⁰Po were found to have an adverse effect on the uranium results for the NRIP-08 urine samples, while uranium results for NRIP-08 water samples were not affected. This problem, which was not observed for NRIP-06 or NRIP-07 urine samples, was resolved by using an enhanced ²¹⁰Po removal step, which will be described.
Sun, Chenglu; Li, Wei; Chen, Wei
2017-01-01
For extracting the pressure distribution image and respiratory waveform unobtrusively and comfortably, we proposed a smart mat which utilizes a flexible pressure sensor array, printed electrodes and a novel soft seven-layer structure to monitor this physiological information. However, in order to obtain a high-resolution pressure distribution and a more accurate respiratory waveform, more time is needed to acquire the pressure signal from all the pressure sensors embedded in the smart mat. In order to reduce the sampling time while keeping the same resolution and accuracy, a novel method based on compressed sensing (CS) theory was proposed. By utilizing the CS-based method, the sampling time can be reduced by 40% by acquiring only about one-third of the original sampling points. Several experiments were then carried out to validate the performance of the CS-based method. While less than one-third of the original sampling points were measured, the correlation coefficient between the reconstructed respiratory waveform and the original waveform reached 0.9078, and the accuracy of the respiratory rate (RR) extracted from the reconstructed waveform reached 95.54%. The experimental results demonstrated that the novel method fits the high-resolution smart mat system and is a viable option for reducing the sampling time of the pressure sensor array. PMID:28796188
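A toy sketch of the compressed-sensing idea, reconstructing a signal that is sparse in a cosine basis from roughly one third of its samples via iterative soft thresholding; the basis, sparsity pattern and solver are illustrative choices, not the authors' pipeline:

```python
import numpy as np

def ista(A, y, lam=0.05, n_iter=500):
    """Iterative soft-thresholding for sparse recovery: min 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

# Toy signal that is sparse in a DCT-like basis, measured at random positions.
rng = np.random.default_rng(0)
n, m = 256, 90
basis = np.array([[np.cos(np.pi * (i + 0.5) * j / n) for j in range(n)] for i in range(n)])
coeffs_true = np.zeros(n)
coeffs_true[[3, 17, 40]] = [1.0, -0.7, 0.4]
signal = basis @ coeffs_true
rows = rng.choice(n, size=m, replace=False)       # indices of the measured samples
A, y = basis[rows, :], signal[rows]
coeffs = ista(A, y, lam=0.01)
reconstruction = basis @ coeffs
print(np.corrcoef(signal, reconstruction)[0, 1])  # close to 1 for a good recovery
```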
de Vries, W; Wieggers, H J J; Brus, D J
2010-08-05
Element fluxes through forest ecosystems are generally based on measurements of concentrations in soil solution at regular time intervals at plot locations sampled in a regular grid. Here we present spatially averaged annual element leaching fluxes in three Dutch forest monitoring plots using a new sampling strategy in which both sampling locations and sampling times are selected by probability sampling. Locations were selected by stratified random sampling with compact geographical blocks of equal surface area as strata. In each sampling round, six composite soil solution samples were collected, consisting of five aliquots, one per stratum. The plot-mean concentration was estimated by linear regression, so that the bias due to one or more strata not being represented in the composite samples is eliminated. The sampling times were selected in such a way that the cumulative precipitation surplus of the time interval between two consecutive sampling times was constant, using an estimated precipitation surplus averaged over the past 30 years. The spatially averaged annual leaching flux was estimated by using the modeled daily water flux as an ancillary variable. An important advantage of the new method is that the uncertainty in the estimated annual leaching fluxes due to spatial and temporal variation and the resulting sampling errors can be quantified. Results of this new method were compared with the reference approach, in which daily leaching fluxes were calculated by multiplying daily interpolated element concentrations with daily water fluxes and then aggregating to a year. Results show that the annual fluxes calculated with the reference method for the period 2003-2005, including all plots, elements and depths, lie within the range of the average ±2 times the standard error of the new method in only 53% of the cases. Despite the differences in results, both methods indicate comparable N retention and strong Al mobilization in all plots, with Al leaching being nearly equal to the leaching of SO₄ and NO₃ with fluxes expressed in molc ha⁻¹ yr⁻¹. This illustrates that Al release, which is the clearest signal of soil acidification, is mainly due to the external input of SO₄ and NO₃.
Sampling Methods for Detection and Monitoring of the Asian Citrus Psyllid (Hemiptera: Psyllidae).
Monzo, C; Arevalo, H A; Jones, M M; Vanaclocha, P; Croxton, S D; Qureshi, J A; Stansly, P A
2015-06-01
The Asian citrus psyllid (ACP), Diaphorina citri Kuwayama is a key pest of citrus due to its role as vector of citrus greening disease or "huanglongbing." ACP monitoring is considered an indispensable tool for management of vector and disease. In the present study, datasets collected between 2009 and 2013 from 245 citrus blocks were used to evaluate precision, sensitivity for detection, and efficiency of five sampling methods. The number of samples needed to reach a 0.25 standard error-mean ratio was estimated using Taylor's power law and used to compare precision among sampling methods. Comparison of detection sensitivity and time expenditure (cost) between stem-tap and other sampling methodologies conducted consecutively at the same location were also assessed. Stem-tap sampling was the most efficient sampling method when ACP densities were moderate to high and served as the basis for comparison with all other methods. Protocols that grouped trees near randomly selected locations across the block were more efficient than sampling trees at random across the block. Sweep net sampling was similar to stem-taps in number of captures per sampled unit, but less precise at any ACP density. Yellow sticky traps were 14 times more sensitive than stem-taps but much more time consuming and thus less efficient except at very low population densities. Visual sampling was efficient for detecting and monitoring ACP at low densities. Suction sampling was time consuming and taxing but the most sensitive of all methods for detection of sparse populations. This information can be used to optimize ACP monitoring efforts. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
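Taylor's power law underlies the sample-size calculation mentioned above; a short sketch of the arithmetic (the coefficients a and b are placeholders that would normally be estimated from survey data):

```python
def required_sample_size(mean_density, a, b, precision=0.25):
    """Sample size needed so that SE/mean <= precision, using Taylor's power law.

    Taylor's power law models the variance as s^2 = a * m^b; the number of
    sampling units then follows from n = s^2 / (precision * m)^2.
    """
    variance = a * mean_density ** b
    return variance / (precision * mean_density) ** 2

# Illustrative coefficients only (a and b must be fitted to the survey data).
for m in (0.1, 0.5, 2.0, 10.0):
    print(m, round(required_sample_size(m, a=2.5, b=1.4)))
```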
Krämer, Nadine; Löfström, Charlotta; Vigre, Håkan; Hoorfar, Jeffrey; Bunge, Cornelia; Malorny, Burkhard
2011-03-01
Salmonella is a major zoonotic pathogen which causes outbreaks and sporadic cases of gastroenteritis in humans worldwide. The primary sources of Salmonella are food-producing animals such as pigs and poultry. For risk assessment and hazard analysis and critical control point (HACCP) concepts, it is essential to produce large amounts of quantitative data, which is currently not achievable with the standard culture-based methods for enumeration of Salmonella. This study presents the development of a novel strategy to enumerate low numbers of Salmonella in cork borer samples taken from pig carcasses, as a first concept and proof of principle for a new sensitive and rapid quantification method based on combined enrichment and real-time PCR. The novelty of the approach is the short pre-enrichment step, during which most bacteria are in the log phase of growth. The method consists of an 8 h pre-enrichment of the cork borer sample diluted 1:10 in non-selective buffered peptone water, followed by DNA extraction, and Salmonella detection and quantification by real-time PCR. The limit of quantification was 1.4 colony forming units (CFU)/20 cm² (approximately 10 g) of artificially contaminated sample, with a 95% confidence interval of ± 0.7 log CFU/sample. The precision was similar to that of the standard reference most probable number (MPN) method. A screening of 200 potentially naturally contaminated cork borer samples obtained over seven weeks in a slaughterhouse resulted in 25 Salmonella-positive samples. The analysis of salmonellae within these samples showed that the PCR method had a higher sensitivity for samples with a low contamination level (<6.7 CFU/sample), where 15 of the samples negative with the MPN method were detected with the PCR method and 5 were found to be negative by both methods. For the samples with a higher contamination level (6.7-310 CFU/sample), good agreement between the results obtained with the PCR and MPN methods was observed. The quantitative real-time PCR method can easily be applied to other food and environmental matrices by adaptation of the pre-enrichment time and media. Copyright © 2010 Elsevier B.V. All rights reserved.
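Quantification from a Ct value is conventionally done by inverting a linear standard curve; a generic sketch of that step (the calibration points and the unknown Ct are invented for illustration and are not the calibration reported in the study):

```python
import numpy as np

def fit_standard_curve(log10_cfu, ct):
    """Fit the linear standard curve Ct = slope * log10(CFU) + intercept."""
    slope, intercept = np.polyfit(log10_cfu, ct, 1)
    return slope, intercept

def quantify(ct_sample, slope, intercept):
    """Invert the standard curve to estimate CFU in an unknown sample."""
    return 10 ** ((ct_sample - intercept) / slope)

# Illustrative calibration data from a dilution series of spiked samples.
log10_cfu = np.array([1, 2, 3, 4, 5], float)
ct_values = np.array([33.1, 29.8, 26.4, 23.0, 19.7])
slope, intercept = fit_standard_curve(log10_cfu, ct_values)
print(quantify(27.5, slope, intercept))   # estimated CFU for an unknown sample
```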
Measuring larval nematode contamination on cattle pastures: Comparing two herbage sampling methods.
Verschave, S H; Levecke, B; Duchateau, L; Vercruysse, J; Charlier, J
2015-06-15
Assessing levels of pasture larval contamination is frequently used to study the population dynamics of the free-living stages of parasitic nematodes of livestock. Direct quantification of infective larvae (L3) on herbage is the most applied method to measure pasture larval contamination. However, herbage collection remains labour intensive and there is a lack of studies addressing the variation induced by the sampling method and the required sample size. The aim of this study was (1) to compare two different sampling methods in terms of pasture larval count results and time required to sample, (2) to assess the amount of variation in larval counts at the level of sample plot, pasture and season, respectively and (3) to calculate the required sample size to assess pasture larval contamination with a predefined precision using random plots across pasture. Eight young stock pastures of different commercial dairy herds were sampled in three consecutive seasons during the grazing season (spring, summer and autumn). On each pasture, herbage samples were collected both along a double-crossed W-transect with samples taken every 10 steps (method 1) and from four randomly located plots of 0.16 m² with collection of all herbage within each plot (method 2). The average (± standard deviation (SD)) pasture larval contamination using sampling methods 1 and 2 was 325 (± 479) and 305 (± 444) L3/kg dry herbage (DH), respectively. Large discrepancies in pasture larval counts of the same pasture and season were often seen between methods, but no significant difference (P = 0.38) in larval counts between methods was found. Less time was required to collect samples with method 2. This difference in collection time between methods was most pronounced for pastures with a surface area larger than 1 ha. The variation in pasture larval counts from samples generated by random plot sampling was mainly due to the repeated measurements on the same pasture in the same season (residual variance component = 6.2), rather than to pasture (variance component = 0.55) or season (variance component = 0.15). Using the observed distribution of L3, the required sample size (i.e. number of plots per pasture) for sampling a pasture through random plots with a particular precision was simulated. A higher relative precision was achieved when estimating pasture larval contamination on pastures with a high larval contamination and a low level of aggregation than on pastures with a low larval contamination, for the same sample size. In the future, herbage sampling through random plots across the pasture (method 2) seems a promising method to develop further, as no significant difference in counts between the methods was found and this method was less time consuming. Copyright © 2015 Elsevier B.V. All rights reserved.
Ultrasonic sensor and method of use
Condreva, Kenneth J.
2001-01-01
An ultrasonic sensor system and method of use for measuring transit time through a liquid sample, using one ultrasonic transducer coupled to a precision time interval counter. The timing circuit captures changes in transit time, representing small changes in the velocity of sound transmitted, over necessarily small time intervals (nanoseconds), and uses the transit time changes to identify the presence of non-conforming constituents in the sample.
Frequency-time coherence for all-optical sampling without optical pulse source
Preußler, Stefan; Raoof Mehrpoor, Gilda; Schneider, Thomas
2016-01-01
Sampling is the first step to convert an analogue optical signal into a digital electrical signal. The latter can be further processed and analysed by well-known electrical signal processing methods. Optical pulse sources like mode-locked lasers are commonly incorporated for all-optical sampling, but have several drawbacks. A novel approach for a simple all-optical sampling is to utilise the frequency-time coherence of each signal. The method is based on only using two coupled modulators driven with an electrical sine wave. Since no optical source is required, a simple integration in appropriate platforms, such as Silicon Photonics might be possible. The presented method grants all-optical sampling with electrically tunable bandwidth, repetition rate and time shift. PMID:27687495
Evaluating Composite Sampling Methods of Bacillus spores at Low Concentrations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, Becky M.; Amidan, Brett G.; Anderson, Kevin K.
Restoring facility operations after the 2001 Amerithrax attacks took over three months to complete, highlighting the need to reduce remediation time. The most time-intensive tasks were environmental sampling and sample analyses. Composite sampling allows disparate samples to be combined, with only a single analysis needed, making it a promising method to reduce response times. We developed a statistical experimental design to test three different composite sampling methods: 1) single medium, single-pass composite: a single cellulose sponge samples multiple coupons in one pass; 2) single medium, multi-pass composite: a single cellulose sponge samples multiple coupons with repeated passes; and 3) multi-medium, post-sample composite: a single cellulose sponge samples a single surface, and then multiple sponges are combined during sample extraction. Five spore concentrations of Bacillus atrophaeus Nakamura spores were tested; concentrations ranged from 5 to 100 CFU/coupon (0.00775 to 0.155 CFU/cm², respectively). Study variables included four clean surface materials (stainless steel, vinyl tile, ceramic tile, and painted wallboard) and three grime-coated/dirty materials (stainless steel, vinyl tile, and ceramic tile). Analysis of variance for the clean study showed two significant factors: composite method (p-value < 0.0001) and coupon material (p-value = 0.0008). Recovery efficiency (RE) was higher overall using the post-sample composite (PSC) method compared to single medium compositing from both clean and grime-coated materials. RE with the PSC method for the concentrations tested (10 to 100 CFU/coupon) was similar for ceramic tile, painted wallboard, and stainless steel for clean materials. RE was lowest for vinyl tile with both composite methods. Statistical tests for the dirty study showed RE was significantly higher for vinyl and stainless steel materials, but significantly lower for ceramic tile. These results suggest post-sample compositing can be used to reduce sample analysis time when responding to a Bacillus anthracis contamination event of clean or dirty surfaces.
Son, Na Ry; Seo, Dong Joo; Lee, Min Hwa; Seo, Sheungwoo; Wang, Xiaoyu; Lee, Bog-Hieu; Lee, Jeong-Su; Joo, In-Sun; Hwang, In-Gyun; Choi, Changsun
2014-09-01
The aim of this study was to develop an optimal technique for detecting hepatitis E virus (HEV) in swine livers. Here, three elution buffers and two concentration methods were compared with respect to enhancing recovery of HEV from swine liver samples. Real-time reverse transcription-polymerase chain reaction (RT-PCR) and nested RT-PCR were performed to detect HEV RNA. When phosphate-buffered saline (PBS, pH 7.4) was used to concentrate HEV in swine liver samples using ultrafiltration, real-time RT-PCR detected HEV in 6 of the 26 samples. When threonine buffer was used to concentrate HEV using polyethylene glycol (PEG) precipitation and ultrafiltration, real-time RT-PCR detected HEV in 1 and 3 of the 26 samples, respectively. When glycine buffer was used to concentrate HEV using ultrafiltration and PEG precipitation, real-time RT-PCR detected HEV in 1 and 3 samples of the 26 samples, respectively. When nested RT-PCR was used to detect HEV, all samples tested negative regardless of the type of elution buffer or concentration method used. Therefore, the combination of real-time RT-PCR and ultrafiltration with PBS buffer was the most sensitive and reliable method for detecting HEV in swine livers. Copyright © 2014 Elsevier B.V. All rights reserved.
Time Series Analysis Based on Running Mann Whitney Z Statistics
USDA-ARS?s Scientific Manuscript database
A sensitive and objective time series analysis method based on the calculation of Mann Whitney U statistics is described. This method samples data rankings over moving time windows, converts those samples to Mann-Whitney U statistics, and then normalizes the U statistics to Z statistics using Monte-...
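A minimal sketch of the running-window procedure is given below. It slides two adjacent windows along a series, computes the Mann-Whitney U statistic for each pair, and converts U to a Z score with the standard normal approximation; the Monte Carlo normalization mentioned in the abstract is replaced here by that closed-form approximation, and the window length and test data are illustrative.

```python
import numpy as np
from scipy.stats import mannwhitneyu

def running_mw_z(series, window=10):
    """Slide two adjacent windows along the series, compute the Mann-Whitney
    U statistic for each pair, and convert it to a Z score using the
    normal approximation for U."""
    series = np.asarray(series, dtype=float)
    z_scores = []
    for i in range(len(series) - 2 * window + 1):
        before = series[i:i + window]
        after = series[i + window:i + 2 * window]
        u, _ = mannwhitneyu(before, after, alternative='two-sided')
        n1 = n2 = window
        mu_u = n1 * n2 / 2.0
        sigma_u = np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
        z_scores.append((u - mu_u) / sigma_u)
    return np.array(z_scores)

# Example: a step change halfway through a noisy series shows up as large |Z|
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0, 1, 50), rng.normal(2, 1, 50)])
print(running_mw_z(y, window=10).round(2))
```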
Zhang, Lida; Sun, Da-Wen; Zhang, Zhihang
2017-03-24
Moisture sorption isotherms are commonly determined by the saturated salt slurry method, which suffers from long measurement times, cumbersome labor, and microbial deterioration of samples. Thus, a novel method, the water activity (aw) measurement (AWM) method, has been developed to overcome these drawbacks. The fundamentals and applications of this fast method are introduced with respect to its typical operational steps, the variety of equipment set-ups, and the samples to which it has been applied. Its rapidness and reliability have been evaluated by comparison with conventional methods. This review also discusses factors impairing measurement precision and accuracy, including inappropriate choice of pre-drying/wetting techniques and incomplete moisture uniformity in samples due to inadequate equilibration time. This analysis and the corresponding suggestions can facilitate an improved AWM method with more satisfactory accuracy and time cost.
Kaspar, A; Pfister, K; Nielsen, M K; Silaghi, C; Fink, H; Scheuerle, M C
2017-01-11
Strongylus vulgaris has become a rare parasite in Germany during the past 50 years due to the practice of frequent prophylactic anthelmintic therapy. To date, the emerging development of resistance in Cyathostominae and Parascaris spp. to numerous equine anthelmintics has changed deworming management and the frequency of anthelmintic usage. In this regard, reliable detection of parasitic infections, especially of the highly pathogenic S. vulgaris, is essential. In the current study, two diagnostic methods for the detection of infections with S. vulgaris were compared and information on the occurrence of this parasite in German horses was gained. For this purpose, faecal samples of 501 horses were screened for S. vulgaris with real-time PCR and an additional larval culture was performed on samples from 278 horses. A subset of 26 horses underwent multiple follow-up examinations with both methods in order to evaluate both the persistence of S. vulgaris infections and the reproducibility of each diagnostic method. The real-time PCR revealed S. vulgaris DNA in ten of 501 investigated equine samples (1.9%). The larval culture demonstrated larvae of S. vulgaris in three of the 278 samples (1.1%). A direct comparison of the two methods was possible in 321 samples including 43 follow-up examinations, with the result of 11 S. vulgaris-positive samples by real-time PCR and 4 S. vulgaris-positive samples by larval culture. McNemar's test (p-value = 0.016) revealed a significant difference and the kappa value (0.525) showed moderate agreement between real-time PCR and larval culture. The real-time PCR detected a significantly higher proportion of S. vulgaris-positive samples than larval culture and should thus be considered as a routine diagnostic method for the detection of S. vulgaris in equine samples.
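For readers who want to reproduce the paired comparison, the sketch below uses a 2x2 table reconstructed to be consistent with the reported figures (321 paired samples, 11 PCR-positive, 4 culture-positive, exact McNemar p = 0.016, kappa = 0.525), which requires that all culture-positives were also PCR-positive; the table is a reconstruction, not taken verbatim from the paper.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar
from sklearn.metrics import cohen_kappa_score

# 2x2 paired table reconstructed from the reported figures (321 samples,
# 11 PCR-positive, 4 culture-positive, all culture-positives also PCR-positive):
# rows = real-time PCR (+/-), columns = larval culture (+/-)
table = np.array([[4, 7],
                  [0, 310]])

print(mcnemar(table, exact=True))   # exact binomial test on the discordant cells

# Cohen's kappa from the equivalent per-sample calls
pcr     = [1] * 4 + [1] * 7 + [0] * 0 + [0] * 310
culture = [1] * 4 + [0] * 7 + [1] * 0 + [0] * 310
print(round(cohen_kappa_score(pcr, culture), 3))
```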
Effects of yeast, fermentation time, and preservation methods on tarhana.
Gurbuz, Ozan; Gocmen, Duygu; Ozmen, Nese; Dagdelen, Fatih
2010-01-01
The physicochemical properties of tarhana soup produced with different dough treatments, fermentation times, and preservation methods were examined. Tarhana doughs were prepared with yogurt (control) or baker's yeast (Saccharomyces cerevisiae) and fermented for 3 days. Samples were taken at 24, 48, and 72 hr. Samples were then preserved via one of four methods: sun dried, dried in the shade, vacuum dried, or frozen. Frozen samples produced lower organic acid levels after 72 hr of fermentation in both control (0.68 g/100 g) and yeast (0.61 g/100 g) applications than samples that were dried (0.94 g/100 g control samples; 0.81 g/100 g samples with yeast). Increasing fermentation time had a significant effect on the formation of organic acid in the tarhana (p < .01). At 72 hr of fermentation, total acidity increased 11%, 17%, and 23% for tarhana samples vacuum-dried, sun-dried, and dried in the shade, respectively. Preservation methods also affected the moisture, ash, crude protein, total acidity, pH, salt, fat, reducing sugar levels, and the sensory assessment of tarhana soup (p < .01). Sensory characteristics were not significantly affected by baker's yeast in any of the preservation methods used (p > .01). However, sensory scores for tarhana prepared from the samples dried in a sheltered area showed a reduction in color desirability as the fermentation time increased. The soup prepared from frozen tarhana (72 hr fermentation, with yeast) had the highest scores with respect to color, mouth feel, flavor, and overall acceptability. Vacuum-dried samples' scores in these areas were also high in comparison to the two other drying methods.
Multirate sampled-data yaw-damper and modal suppression system design
NASA Technical Reports Server (NTRS)
Berg, Martin C.; Mason, Gregory S.
1990-01-01
A multirate control law synthesis algorithm based on an infinite-time quadratic cost function was developed, along with a method for analyzing the robustness of multirate systems. A generalized multirate sampled-data control law structure (GMCLS) was introduced. A new infinite-time-based parameter optimization multirate sampled-data control law synthesis method and solution algorithm were developed. A singular-value-based method for determining gain and phase margins for multirate systems was also developed. The finite-time-based parameter optimization multirate sampled-data control law synthesis algorithm originally intended to be applied to the aircraft problem was instead demonstrated by application to a simpler problem involving the control of the tip position of a two-link robot arm. The GMCLS, the infinite-time-based parameter optimization multirate control law synthesis method and solution algorithm, and the singular-value-based method for determining gain and phase margins were all demonstrated by application to the aircraft control problem originally proposed for this project.
Edagawa, Akiko; Kimura, Akio; Kawabuchi-Kurata, Takako; Adachi, Shinichi; Furuhata, Katsunori; Miyamoto, Hiroshi
2015-10-19
We investigated Legionella contamination in bath water samples, collected from 68 bathing facilities in Japan, by culture, culture with amoebic co-culture, real-time quantitative PCR (qPCR), and real-time qPCR with amoebic co-culture. Using the conventional culture method, Legionella pneumophila was detected in 11 samples (11/68, 16.2%). Contrary to our expectation, the culture method with the amoebic co-culture technique did not increase the detection rate of Legionella (4/68, 5.9%). In contrast, a combination of the amoebic co-culture technique followed by qPCR successfully increased the detection rate (57/68, 83.8%) compared with real-time qPCR alone (46/68, 67.6%). Using real-time qPCR after culture with amoebic co-culture, more than 10-fold higher bacterial numbers were observed in 30 samples (30/68, 44.1%) compared with the same samples without co-culture. On the other hand, higher bacterial numbers were not observed after propagation by amoebae in 32 samples (32/68, 47.1%). Legionella was not detected in the remaining six samples (6/68, 8.8%), irrespective of the method. These results suggest that application of the amoebic co-culture technique prior to real-time qPCR may be useful for the sensitive detection of Legionella from bath water samples. Furthermore, a combination of amoebic co-culture and real-time qPCR might be useful to detect viable and virulent Legionella because their ability to invade and multiply within free-living amoebae is considered to correlate with their pathogenicity for humans. This is the first report evaluating the efficacy of the amoebic co-culture technique for detecting Legionella in bath water samples.
Pope, Misty L.; Bussen, Michelle; Feige, Mary Ann; Shadix, Lois; Gonder, Sharon; Rodgers, Crystal; Chambers, Yildiz; Pulz, Jessica; Miller, Ken; Connell, Kevin; Standridge, Jon
2003-01-01
Escherichia coli is a routinely used microbiological indicator of water quality. To determine whether holding time and storage conditions had an effect on E. coli densities in surface water, studies were conducted in three phases, encompassing 24 sites across the United States and four commonly used monitoring methods. During all three phases of the study, E. coli samples were analyzed at time 0 and at 8, 24, 30, and 48 h after sample collection. During phase 1, when 4°C samples were evaluated by Colilert or by placing a membrane onto mFC medium followed by transfer to nutrient agar containing 4-methylumbelliferyl-β-d-glucuronide (mFC/NA-MUG), three of four sites showed no significant differences throughout the 48-h study. During phase 2, five of seven sites showed no significant difference between time 0 and 24 h by membrane filtration (mFC/NA-MUG). When evaluated by the Colilert method, five of seven sites showed no significant difference in E. coli density between time 0 and 48 h. During phase 3, 8 of 13 sites showed no significant differences in E. coli densities between time 0 and the 48-h holding time, regardless of method. Based on the results of these studies, it appears that if samples are held below 10°C and are not allowed to freeze, most surface water E. coli samples analyzed by commonly used methods beyond 8 h after sample collection can generate E. coli data comparable to those generated within 8 h of sample collection. Notwithstanding this conclusion, E. coli samples collected from surface waters should always be analyzed as soon as possible. PMID:14532081
An evaluation of long-term preservation methods for brown bear (Ursus arctos) faecal DNA samples
Murphy, M.A.; Waits, L.P.; Kendall, K.C.; Wasser, S.K.; Higbee, J.A.; Bogden, R.
2002-01-01
Relatively few large-scale faecal DNA studies have been initiated due to difficulties in amplifying low quality and quantity DNA template. To improve brown bear faecal DNA PCR amplification success rates and to determine post collection sample longevity, five preservation methods were evaluated: 90% ethanol, DETs buffer, silica-dried, oven-dried stored at room temperature, and oven-dried stored at -20 °C. Preservation effectiveness was evaluated for 50 faecal samples by PCR amplification of a mitochondrial DNA (mtDNA) locus (~146 bp) and a nuclear DNA (nDNA) locus (~200 bp) at time points of one week, one month, three months and six months. Preservation method and storage time significantly impacted mtDNA and nDNA amplification success rates. For mtDNA, all preservation methods had ≥75% success at one week, but storage time had a significant impact on the effectiveness of the silica preservation method. Ethanol preserved samples had the highest success rates for both mtDNA (86.5%) and nDNA (84%). Nuclear DNA amplification success rates ranged from 26-88%, and storage time had a significant impact on all methods but ethanol. Preservation method and storage time should be important considerations for researchers planning projects utilizing faecal DNA. We recommend preservation of faecal samples in 90% ethanol when feasible, although when collecting in remote field conditions or for both DNA and hormone assays a dry collection method may be advantageous.
Interval sampling methods and measurement error: a computer simulation.
Wirth, Oliver; Slaven, James; Taylor, Matthew A
2014-01-01
A simulation study was conducted to provide a more thorough account of measurement error associated with interval sampling methods. A computer program simulated the application of momentary time sampling, partial-interval recording, and whole-interval recording methods on target events randomly distributed across an observation period. The simulation yielded measures of error for multiple combinations of observation period, interval duration, event duration, and cumulative event duration. The simulations were conducted up to 100 times to yield measures of error variability. Although the present simulation confirmed some previously reported characteristics of interval sampling methods, it also revealed many new findings that pertain to each method's inherent strengths and weaknesses. The analysis and resulting error tables can help guide the selection of the most appropriate sampling method for observation-based behavioral assessments. © Society for the Experimental Analysis of Behavior.
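A minimal simulation in the spirit of the study is sketched below: events of fixed duration are placed at random in an observation period and the session is scored with momentary time sampling, partial-interval, and whole-interval recording. All session parameters are illustrative, and the original simulation varied them systematically and repeated each condition up to 100 times.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(obs_period=600.0, interval=10.0, n_events=20, event_dur=5.0, dt=0.01):
    """Place events of fixed duration at random onsets, score the session with
    momentary time sampling (MTS), partial-interval (PIR) and whole-interval
    (WIR) recording, and compare with the true proportion of time occupied."""
    t = np.arange(0.0, obs_period, dt)
    on = np.zeros_like(t, dtype=bool)
    for s in rng.uniform(0, obs_period - event_dur, n_events):
        on |= (t >= s) & (t < s + event_dur)

    n_int = int(obs_period / interval)
    per_interval = on.reshape(n_int, -1)          # one row per observation interval
    true_prop = on.mean()
    mts = per_interval[:, -1].mean()              # behaviour at the last instant of each interval
    pir = per_interval.any(axis=1).mean()         # behaviour at any time in the interval
    wir = per_interval.all(axis=1).mean()         # behaviour throughout the interval
    return round(true_prop, 3), round(mts, 3), round(pir, 3), round(wir, 3)

print(simulate())  # PIR overestimates, WIR underestimates, MTS is roughly unbiased
```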
Espino, L; Way, M O; Wilson, L T
2008-02-01
Commercial rice, Oryza sativa L., fields in southeastern Texas were sampled during 2003 and 2004, and visual samples were compared with sweep net samples. Fields were sampled at different stages of panicle development, times of day, and by different operators. Significant differences were found between perimeter and within-field sweep net samples, indicating that samples taken 9 m from the field margin overestimate within-field Oebalus pugnax (F.) (Hemiptera: Pentatomidae) populations. Time of day did not significantly affect the number of O. pugnax caught with the sweep net; however, there was a trend to capture more insects during morning than afternoon. For all sampling methods evaluated during this study, O. pugnax was found to have an aggregated spatial pattern at most densities. When comparing sweep net with visual sampling methods, one sweep of the "long stick" and two sweeps of the "sweep stick" correlated well with the sweep net (r² = 0.639 and r² = 0.815, respectively). This relationship was not affected by time of day of sampling, stage of panicle development, type of planting or operator. Relative cost-reliability, which incorporates probability of adoption, indicates the visual methods are more cost-reliable than the sweep net for sampling O. pugnax.
40 CFR 60.496 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Surface Coating Industry § 60.496 Test methods and procedures. (a) The reference methods in appendix A to...) Method 4 for stack gas moisture. (b) For Method 24, the coating sample must be a 1-litre sample collected... volume must be 0.003 dscm except that shorter sampling times or smaller volumes, when necessitated by...
40 CFR 60.496 - Test methods and procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Surface Coating Industry § 60.496 Test methods and procedures. (a) The reference methods in appendix A to...) Method 4 for stack gas moisture. (b) For Method 24, the coating sample must be a 1-litre sample collected... volume must be 0.003 dscm except that shorter sampling times or smaller volumes, when necessitated by...
40 CFR 60.496 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Surface Coating Industry § 60.496 Test methods and procedures. (a) The reference methods in appendix A to...) Method 4 for stack gas moisture. (b) For Method 24, the coating sample must be a 1-litre sample collected... volume must be 0.003 dscm except that shorter sampling times or smaller volumes, when necessitated by...
40 CFR 60.496 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Surface Coating Industry § 60.496 Test methods and procedures. (a) The reference methods in appendix A to...) Method 4 for stack gas moisture. (b) For Method 24, the coating sample must be a 1-litre sample collected... volume must be 0.003 dscm except that shorter sampling times or smaller volumes, when necessitated by...
40 CFR 60.496 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Surface Coating Industry § 60.496 Test methods and procedures. (a) The reference methods in appendix A to...) Method 4 for stack gas moisture. (b) For Method 24, the coating sample must be a 1-litre sample collected... volume must be 0.003 dscm except that shorter sampling times or smaller volumes, when necessitated by...
Spectrometer Sensitivity Investigations on the Spectrometric Oil Analysis Program.
1983-04-22
[Scanned table-of-contents fragments; recoverable section titles: Acid Dissolution Method (ADM); Analysis of Samples; Particle Transport Efficiency of the Rotating Disk; A/E35U-3 Acid Dissolution Method; Burn Time; Effect of Burn Time; Direct Sample Introduction.]
Goeman, Valerie R; Tinkler, Stacy H; Hammac, G Kenitra; Ruple, Audrey
2018-04-01
Environmental surveillance for Salmonella enterica can be used for early detection of contamination; thus routine sampling is an integral component of infection control programs in hospital environments. At the Purdue University Veterinary Teaching Hospital (PUVTH), the technique regularly employed in the large animal hospital for sample collection uses sterile gauze sponges for environmental sampling, which has proven labor-intensive and time-consuming. Alternative sampling methods use Swiffer brand electrostatic wipes for environmental sample collection, which are reportedly effective and efficient. It was hypothesized that use of Swiffer wipes for sample collection would be more efficient and less costly than the use of gauze sponges. A head-to-head comparison between the 2 sampling methods was conducted in the PUVTH large animal hospital and relative agreement, cost-effectiveness, and sampling efficiency were compared. There was fair agreement in culture results between the 2 sampling methods, but Swiffer wipes required less time and less physical effort to collect samples and were more cost-effective.
Evaluation on determination of iodine in coal by energy dispersive X-ray fluorescence
Wang, B.; Jackson, J.C.; Palmer, C.; Zheng, B.; Finkelman, R.B.
2005-01-01
A quick and inexpensive method for determining relatively high iodine concentrations in coal samples was evaluated. Energy dispersive X-ray fluorescence (EDXRF) provided a detection limit of about 14 ppm (3 times the standard deviation of the blank sample), without any complex sample preparation. An analytical relative standard deviation of 16% was readily attainable for coal samples. Under optimum conditions, iodine concentrations higher than 5 ppm in coal samples can be determined using this EDXRF method. For the time being, because the iodine concentrations of coal samples are generally lower than 5 ppm, except for some high-iodine coals, this method cannot effectively be used for iodine determination. More work is needed for this method to meet the requirements of determining iodine in coal samples. Copyright © 2005 by The Geochemical Society of Japan.
NASA Astrophysics Data System (ADS)
Zhang, Qian; Harman, Ciaran J.; Kirchner, James W.
2018-02-01
River water-quality time series often exhibit fractal scaling, which here refers to autocorrelation that decays as a power law over some range of scales. Fractal scaling presents challenges to the identification of deterministic trends because (1) fractal scaling has the potential to lead to false inference about the statistical significance of trends and (2) the abundance of irregularly spaced data in water-quality monitoring networks complicates efforts to quantify fractal scaling. Traditional methods for estimating fractal scaling - in the form of spectral slope (β) or other equivalent scaling parameters (e.g., Hurst exponent) - are generally inapplicable to irregularly sampled data. Here we consider two types of estimation approaches for irregularly sampled data and evaluate their performance using synthetic time series. These time series were generated such that (1) they exhibit a wide range of prescribed fractal scaling behaviors, ranging from white noise (β = 0) to Brown noise (β = 2) and (2) their sampling gap intervals mimic the sampling irregularity (as quantified by both the skewness and mean of gap-interval lengths) in real water-quality data. The results suggest that none of the existing methods fully account for the effects of sampling irregularity on β estimation. First, the results illustrate the danger of using interpolation for gap filling when examining autocorrelation, as the interpolation methods consistently underestimate or overestimate β under a wide range of prescribed β values and gap distributions. Second, the widely used Lomb-Scargle spectral method also consistently underestimates β. A previously published modified form, using only the lowest 5 % of the frequencies for spectral slope estimation, has very poor precision, although the overall bias is small. Third, a recent wavelet-based method, coupled with an aliasing filter, generally has the smallest bias and root-mean-squared error among all methods for a wide range of prescribed β values and gap distributions. The aliasing method, however, does not itself account for sampling irregularity, and this introduces some bias in the result. Nonetheless, the wavelet method is recommended for estimating β in irregular time series until improved methods are developed. Finally, all methods' performances depend strongly on the sampling irregularity, highlighting that the accuracy and precision of each method are data specific. Accurately quantifying the strength of fractal scaling in irregular water-quality time series remains an unresolved challenge for the hydrologic community and for other disciplines that must grapple with irregular sampling.
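The spectral-slope estimation discussed above can be illustrated with a small sketch: a series with a prescribed slope β is generated on a regular grid, thinned to mimic irregular sampling, and β is then recovered from the slope of the Lomb-Scargle periodogram on a log-log scale. All parameter values are illustrative, and the simple log-log fit used here typically shows the underestimation bias the abstract reports.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(7)

# Synthetic series with prescribed spectral slope beta (power ~ f^-beta),
# generated on a fine regular grid and then sampled at irregular times.
beta_true, n = 1.0, 4096
freqs_full = np.fft.rfftfreq(n, d=1.0)
amp = np.zeros(n // 2 + 1)
amp[1:] = freqs_full[1:] ** (-beta_true / 2.0)
phases = np.exp(2j * np.pi * rng.random(n // 2 + 1))
y_full = np.fft.irfft(amp * phases, n)

keep = np.sort(rng.choice(n, size=400, replace=False))   # irregular sampling gaps
t, y = keep.astype(float), y_full[keep]
y = y - y.mean()

# Lomb-Scargle periodogram (scipy expects angular frequencies)
f = np.linspace(1.0 / n, 0.25, 500)
pgram = lombscargle(t, y, 2 * np.pi * f)

# beta is the negative slope of log(power) vs log(frequency)
slope, _ = np.polyfit(np.log10(f), np.log10(pgram), 1)
print(f"prescribed beta = {beta_true}, Lomb-Scargle estimate = {-slope:.2f}")
```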
A new time calibration method for switched-capacitor-array-based waveform samplers
NASA Astrophysics Data System (ADS)
Kim, H.; Chen, C.-T.; Eclov, N.; Ronzhin, A.; Murat, P.; Ramberg, E.; Los, S.; Moses, W.; Choong, W.-S.; Kao, C.-M.
2014-12-01
We have developed a new time calibration method for the DRS4 waveform sampler that enables us to precisely measure the non-uniform sampling interval inherent in the switched-capacitor cells of the DRS4. The method uses the proportionality between the differential amplitude and sampling interval of adjacent switched-capacitor cells responding to a sawtooth-shape pulse. In the experiment, a sawtooth-shape pulse with a 40 ns period generated by a Tektronix AWG7102 is fed to a DRS4 evaluation board for calibrating the sampling intervals of all 1024 cells individually. The electronic time resolution of the DRS4 evaluation board with the new time calibration is measured to be 2.4 ps RMS by using two simultaneous Gaussian pulses with 2.35 ns full-width at half-maximum and applying a Gaussian fit. The time resolution dependencies on the time difference with the new time calibration are measured and compared to results obtained by another method. The new method could be applicable for other switched-capacitor-array technology-based waveform samplers for precise time calibration.
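The proportionality at the heart of the calibration can be sketched as follows: while a sawtooth of known slope ramps linearly, the voltage step recorded between adjacent cells is proportional to the true interval between them, so dividing the differential amplitudes by the slope and rescaling to the known buffer length recovers the per-cell intervals. The nominal rate, ramp slope, interval spread, and noise level below are illustrative assumptions, and a real calibration would average over many sawtooth periods and handle the ramp resets.

```python
import numpy as np

rng = np.random.default_rng(3)

n_cells, f_nominal = 1024, 5e9        # 5 GS/s nominal rate (illustrative)
window = n_cells / f_nominal          # full-buffer time, assumed fixed by the sampling clock

# True, non-uniform cell-to-cell intervals that sum to the full-buffer time
dt_true = 1.0 + 0.15 * rng.standard_normal(n_cells)
dt_true *= window / dt_true.sum()

# Voltage steps the cells record while a sawtooth ramps linearly:
# each step is proportional to the true interval, plus electronic noise.
slope = 0.5e9                                        # ramp slope in V/s (illustrative)
dv = slope * dt_true + 50e-6 * rng.standard_normal(n_cells)

# Calibration: differential amplitudes -> intervals, rescaled to the known window
dt_est = dv / slope
dt_est *= window / dt_est.sum()

print(f"rms interval error: {np.std(dt_est - dt_true) * 1e12:.3f} ps")
```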
Cho, Sanghee; Grazioso, Ron; Zhang, Nan; Aykac, Mehmet; Schmand, Matthias
2011-12-07
The main focus of our study is to investigate how the performance of digital timing methods is affected by sampling rate, anti-aliasing and signal interpolation filters. We used the Nyquist sampling theorem to address some basic questions such as: what is the minimum sampling frequency? How accurate will the signal interpolation be? How do we validate the timing measurements? The preferred sampling rate would be as low as possible, considering the high cost and power consumption of high-speed analog-to-digital converters. However, when the sampling rate is too low, due to the aliasing effect, some artifacts are produced in the timing resolution estimations; the shape of the timing profile is distorted and the FWHM values of the profile fluctuate as the source location changes. Anti-aliasing filters are required in this case to avoid the artifacts, but the timing is degraded as a result. When the sampling rate is marginally over the Nyquist rate, proper signal interpolation is important. A sharp roll-off (higher order) filter is required to separate the baseband signal from its replicates to avoid aliasing, but in return the computational load will be higher. We demonstrated the analysis through a digital timing study using fast LSO scintillation crystals as used in time-of-flight PET scanners. From the study, we observed that there is no significant timing resolution degradation down to a 1.3 GHz sampling frequency, and the computation requirement for the signal interpolation is reasonably low. A so-called sliding test is proposed as a validation tool that checks for constant timing resolution behavior of a given timing pick-off method regardless of source location. Lastly, a performance comparison of several digital timing methods is also shown.
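A minimal sketch of one common digital timing pick-off is shown below: a pulse sampled near the Nyquist limit is interpolated by band-limited (FFT) resampling, and the leading-edge threshold crossing is located by linear interpolation between the two bracketing interpolated samples. The pulse shape, sampling rate, threshold, and upsampling factor are illustrative, and this is not the specific interpolation filter evaluated in the paper.

```python
import numpy as np
from scipy.signal import resample

def leading_edge_time(samples, fs, threshold, upsample=16):
    """Estimate the leading-edge threshold-crossing time of a sampled pulse:
    band-limited (FFT) interpolation first, then linear interpolation
    between the two interpolated samples that bracket the threshold."""
    y = resample(samples, len(samples) * upsample)
    t = np.arange(len(y)) / (fs * upsample)
    idx = np.argmax(y >= threshold)                 # first sample at/above threshold
    y0, y1 = y[idx - 1], y[idx]
    frac = (threshold - y0) / (y1 - y0)
    return t[idx - 1] + frac / (fs * upsample)

# Illustrative pulse: fast rise, exponential decay, sampled at 1.3 GS/s
fs, t_true = 1.3e9, 20.3e-9
t = np.arange(0, 200e-9, 1 / fs)
pulse = np.where(t >= t_true, np.exp(-(t - t_true) / 40e-9) - np.exp(-(t - t_true) / 2e-9), 0.0)
print(f"estimated arrival: {leading_edge_time(pulse, fs, threshold=0.2) * 1e9:.2f} ns")
```

Leading-edge pick-off like this carries amplitude-dependent time walk, which constant-fraction variants mitigate at the cost of extra processing.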
Zhang, Zeng-yan; Ji, Te; Zhu, Zhi-yong; Zhao, Hong-wei; Chen, Min; Xiao, Ti-qiao; Guo, Zhi
2015-01-01
Terahertz radiation is electromagnetic radiation in the range between millimeter waves and the far infrared. Due to its low energy and non-ionizing character, THz pulse imaging is emerging as a novel tool in many fields, such as materials science, chemistry, biological medicine, and food safety. Limited spatial resolution is a significant restricting factor of terahertz imaging technology. Near-field imaging methods have been proposed to improve the spatial resolution of terahertz systems: submillimeter-scale spatial resolution can be achieved if the source size is smaller than the wavelength of the incoming radiation and the source is placed very close to the sample. However, many changes are needed to the traditional terahertz time-domain spectroscopy system, and it is very complex to analyze the sample's physical parameters from the terahertz signal. In this article, a method of inserting a pinhole upstream of the sample is proposed for the first time to improve the spatial resolution of a traditional terahertz time-domain spectroscopy system. The spatial resolution of the system was measured by the knife-edge method, which yields edge-response curves; the moving-stage distance between 10% and 90% of the maximum signal was defined as the spatial resolution of the system. The imaging spatial resolution of the traditional terahertz time-domain spectroscopy system improved dramatically after a pinhole with a diameter of 0.5 mm was inserted 2 mm upstream of the sample. Experimental results show that the spatial resolution improved from 1.276 mm to 0.774 mm, an improvement of about 39%. Through this simple method, the spatial resolution of the traditional terahertz time-domain spectroscopy system was increased from the millimeter scale to the submillimeter scale. A pinhole with a diameter of 1 mm in a polyethylene plate was taken as the sample for the terahertz imaging study. The traditional terahertz time-domain spectroscopy system and the pinhole-inserted terahertz time-domain spectroscopy system were each applied in the imaging experiment. Relative THz power-loss imaging of the samples was used in this article; this approach generally delivers the best signal-to-noise ratio in loss images, and dispersion effects are cancelled. The terahertz imaging results show that the sample's boundary was more distinct after inserting the pinhole in front of the sample. The results also confirm that inserting a pinhole in front of the sample can improve the imaging spatial resolution effectively. A theoretical analysis of the method, which improves the spatial resolution by inserting a pinhole in front of the sample, is given in this article. The analysis also indicates that the smaller the pinhole size, the longer the spatial coherence length of the system and the better the spatial resolution, although the terahertz signal will be reduced accordingly. All the experimental results and theoretical analyses indicate that the method of inserting a pinhole in front of the sample can effectively improve the spatial resolution of a traditional terahertz time-domain spectroscopy system, and it will further expand the applications of terahertz imaging technology.
[DNA quantification of blood samples pre-treated with pyramidon].
Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan
2014-06-01
To study DNA quantification and STR typing of blood samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated with EDTA and blood stains were made on filter paper. The experimental samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h, and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method, and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed with PCR-STR fluorescent technology. For a given DNA extraction method, the amount of sample DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, DNA yields differed significantly among the extraction methods. Full 16-locus STR profiles were obtained in 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for DNA extraction and STR profiling.
Quantifying and Mitigating the Effect of Preferential Sampling on Phylodynamic Inference
Karcher, Michael D.; Palacios, Julia A.; Bedford, Trevor; Suchard, Marc A.; Minin, Vladimir N.
2016-01-01
Phylodynamics seeks to estimate effective population size fluctuations from molecular sequences of individuals sampled from a population of interest. One way to accomplish this task formulates an observed sequence data likelihood exploiting a coalescent model for the sampled individuals’ genealogy and then integrating over all possible genealogies via Monte Carlo or, less efficiently, by conditioning on one genealogy estimated from the sequence data. However, when analyzing sequences sampled serially through time, current methods implicitly assume either that sampling times are fixed deterministically by the data collection protocol or that their distribution does not depend on the size of the population. Through simulation, we first show that, when sampling times do probabilistically depend on effective population size, estimation methods may be systematically biased. To correct for this deficiency, we propose a new model that explicitly accounts for preferential sampling by modeling the sampling times as an inhomogeneous Poisson process dependent on effective population size. We demonstrate that in the presence of preferential sampling our new model not only reduces bias, but also improves estimation precision. Finally, we compare the performance of the currently used phylodynamic methods with our proposed model through clinically-relevant, seasonal human influenza examples. PMID:26938243
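The preferential sampling model described above treats sampling times as an inhomogeneous Poisson process whose intensity depends on effective population size. A minimal sketch of such a sampling process, simulated by thinning, is given below; the power-law intensity lambda(t) = beta0 * Ne(t)**beta1 and all parameter values are illustrative assumptions rather than the paper's exact specification.

```python
import numpy as np

rng = np.random.default_rng(5)

def preferential_sampling_times(ne, t_max, beta0=0.1, beta1=1.0):
    """Draw sampling times from an inhomogeneous Poisson process whose
    intensity depends on effective population size:
    lambda(t) = beta0 * Ne(t)**beta1, simulated by thinning."""
    grid = np.linspace(0.0, t_max, 2001)
    lam_max = (beta0 * ne(grid) ** beta1).max()          # envelope rate for thinning
    n_candidate = rng.poisson(lam_max * t_max)
    candidates = np.sort(rng.uniform(0.0, t_max, n_candidate))
    keep = rng.uniform(0.0, lam_max, n_candidate) < beta0 * ne(candidates) ** beta1
    return candidates[keep]

# Illustrative seasonal effective population size
ne = lambda t: 50.0 + 40.0 * np.sin(2 * np.pi * t)
times = preferential_sampling_times(ne, t_max=3.0)
print(len(times), times[:5].round(3))
```

Sampling times cluster where Ne(t) is large, which is exactly the dependence that biases estimators assuming sampling times carry no information about population size.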
Baek, Hyun Jae; Shin, JaeWook; Jin, Gunwoo; Cho, Jaegeol
2017-10-24
Photoplethysmographic signals are useful for heart rate variability analysis in practical ambulatory applications. While reducing the sampling rate of signals is an important consideration for modern wearable devices that enable 24/7 continuous monitoring, few studies have investigated how to compensate for the low timing resolution of low-sampling-rate signals in accurate heart rate variability analysis. In this study, we evaluated the parabola approximation method against the conventional cubic spline interpolation method for the time, frequency, and nonlinear domain variables of heart rate variability. For each parameter, the intra-class correlation, standard error of measurement, Bland-Altman 95% limits of agreement and root mean squared relative error are presented. The elapsed time taken to compute each interpolation algorithm was also investigated. The results indicated that parabola approximation is a simple, fast, and accurate algorithm-based method for compensating for the low timing resolution of pulse beat intervals. In addition, the method showed performance comparable to the conventional cubic spline interpolation method. Even though the absolute values of the heart rate variability variables calculated using a signal sampled at 20 Hz did not exactly match those calculated using a reference signal sampled at 250 Hz, the parabola approximation method remains a good interpolation method for assessing trends in HRV measurements for low-power wearable applications.
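Parabola approximation refines a peak location by fitting a parabola through the peak sample and its two neighbours; the vertex offset is 0.5*(y0 - y2)/(y0 - 2*y1 + y2) sample periods. The sketch below applies this to a crude synthetic pulse wave sampled at 20 Hz; the waveform and parameters are illustrative, not the study's PPG data.

```python
import numpy as np

def parabola_peak_times(signal, fs):
    """Refine the timing of local maxima in a low-sampling-rate pulse signal
    by fitting a parabola through each peak sample and its two neighbours."""
    signal = np.asarray(signal, dtype=float)
    peaks = np.where((signal[1:-1] > signal[:-2]) & (signal[1:-1] >= signal[2:]))[0] + 1
    times = []
    for i in peaks:
        y0, y1, y2 = signal[i - 1], signal[i], signal[i + 1]
        denom = y0 - 2 * y1 + y2
        delta = 0.0 if denom == 0 else 0.5 * (y0 - y2) / denom
        times.append((i + delta) / fs)
    return np.array(times)

# Example: pulses at 20 Hz sampling whose true peaks fall between samples
fs, f_pulse = 20.0, 1.2
t = np.arange(0, 5, 1 / fs)
ppg = np.sin(2 * np.pi * f_pulse * t) ** 4           # crude pulse-like waveform
beats = parabola_peak_times(ppg, fs)
print(np.diff(beats).round(4))                        # interbeat intervals in seconds
```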
The methodology study of time accelerated irradiation of elastomers
NASA Astrophysics Data System (ADS)
Ito, Masayuki
2005-07-01
This article examines methods for shortening the irradiation time by increasing the dose rate without changing the relationship between dose and the properties of the degraded samples. The samples used were nine kinds of EPDM with different compounding formulas. The samples were exposed to different doses of Co-γ rays, with a maximum dose of 2 MGy. The reference condition, against which two short-time test conditions were compared, was irradiation at 0.33 kGy/h at room temperature. The two methods shown below were studied as the time-accelerated irradiation conditions.
Tulipan, Rachel J; Phillips, Heidi; Garrett, Laura D; Dirikolu, Levent; Mitchell, Mark A
2017-05-01
OBJECTIVE To characterize long-term elution of platinum from carboplatin-impregnated calcium sulfate hemihydrate (CI-CSH) beads in vitro by comparing 2 distinct sample collection methods designed to mimic 2 in vivo environments. SAMPLES 162 CI-CSH beads containing 4.6 mg of carboplatin (2.4 mg of platinum/bead). PROCEDURES For method 1, which mimicked an in vivo environment with rapid and complete fluid exchange, each of 3 plastic 10-mL conical tubes contained 3 CI-CSH beads and 5 mL of PBS solution. Eluent samples were obtained by evacuation of all fluid at 1, 2, 3, 6, 9, and 12 hours and 1, 2, 3, 6, 9, 12, 15, 18, 22, 26, and 30 days. Five milliliters of fresh PBS solution was then added to each tube. For method 2, which mimicked an in vivo environment with no fluid exchange, each of 51 tubes (ie, 3 tubes/17 sample collection times) contained 3 CI-CSH beads and 5 mL of PBS solution. Eluent samples were obtained from the assigned tubes for each time point. All samples were analyzed for platinum content by inductively coupled plasma-mass spectrometry. RESULTS Platinum was released from CI-CSH beads for 22 to 30 days. Significant differences were found in platinum concentration and percentage of platinum eluted from CI-CSH beads over time for each method. Platinum concentrations and elution percentages in method 2 samples were significantly higher than those of method 1 samples, except for the first hour measurements. CONCLUSIONS AND CLINICAL RELEVANCE Sample collection methods 1 and 2 may provide estimates of the minimum and maximum platinum release, respectively, from CI-CSH beads in vivo.
Lindahl, S; Båverud, V; Egenvall, A; Aspán, A; Pringle, J
2013-01-01
Strangles is a contagious equine-specific disease caused by Streptococcus equi subsp. equi. Unfortunately, detection of S. equi can fail in up to 40% of horses with strangles. Whereas recent molecular biologic methods and sampling techniques have improved recovery of S. equi, optimal sampling methods and laboratory analyses remain ill-defined. To determine the yield of S. equi from horses with acute strangles in confirmed outbreaks by field-sampling methods subjected to culture and biochemical identification, and real-time PCR directly and after culture. Fifty-seven horses of varying breeds and ages from 8 strangles outbreaks. Prospective study. Culture with biochemical identification and real-time PCR, directly and after culture, were performed on nasal swabs, nasopharyngeal swabs, and nasopharyngeal lavages. Real-time PCR directly from samples identified the highest number of infected horses, with 45/57 nasal swabs, 41/57 nasopharyngeal swabs, and 48/57 nasopharyngeal lavages S. equi positive. Biochemical identification (highest positives 22/57) was inferior to real-time PCR for S. equi recovery regardless of sampling method. Real-time PCR of nasopharyngeal lavage directly and after culture yielded 52/57 positives, whereas direct real-time PCR of nasopharyngeal lavage combined with either nasopharyngeal swabs or nasal swabs yielded 53/57 positives. Three horses were negative on all samples. Nasopharyngeal lavage analyzed by a combination of real-time PCR directly and after culture or, alternatively, real-time PCR directly on a nasopharyngeal lavage and a nasal/nasopharyngeal swab can identify S. equi in over 90% of acute strangles cases. Copyright © 2013 by the American College of Veterinary Internal Medicine.
High frequency resolution terahertz time-domain spectroscopy
NASA Astrophysics Data System (ADS)
Sangala, Bagvanth Reddy
2013-12-01
A new method for high-frequency-resolution terahertz time-domain spectroscopy is developed based on the characteristic matrix method. This method is useful for studying planar samples or stacks of planar samples. The terahertz radiation was generated by optical rectification in a ZnTe crystal and detected with another ZnTe crystal via the electro-optic sampling method. In this new characteristic-matrix-based method, the spectra of the sample and reference waveforms are modeled using characteristic matrices. We applied the new method to measure the optical constants of air. The terahertz transmission through the layered systems air-Teflon-air-quartz-air and nitrogen gas-Teflon-nitrogen gas-quartz-nitrogen gas was modeled by the characteristic matrix method. A transmission coefficient derived from these models was optimized to fit the experimental transmission coefficient and thereby extract the optical constants of air. The optimization of an error function involving the experimental and theoretical complex transmission coefficients was performed using the patternsearch algorithm in MATLAB. Because this method accounts for the echo waveforms caused by reflections in the layered samples, it allows analysis of longer time-domain waveforms, giving rise to very high frequency resolution in the frequency domain. We present the high-frequency-resolution terahertz time-domain spectroscopy of air and compare the results with literature values. We also fitted the complex susceptibility of air to Lorentzian and Gaussian functions to extract the linewidths.
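The characteristic-matrix formalism referred to above can be sketched with the standard thin-film transfer matrices at normal incidence: each layer contributes a 2x2 matrix, and the product of the matrices gives the stack's amplitude transmission between the two bounding media. The layer indices and thicknesses below are illustrative stand-ins for an air-Teflon-air-quartz-air geometry, not the dimensions used in the thesis, and absorption is neglected.

```python
import numpy as np

def layer_matrix(n, d, freq):
    """Characteristic matrix of a homogeneous layer at normal incidence."""
    delta = 2 * np.pi * n * d * freq / 299792458.0
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]])

def stack_transmission(layers, freq, n_in=1.0, n_out=1.0):
    """Amplitude transmission of a stack of (refractive index, thickness) layers
    between two semi-infinite media, from the product of layer matrices."""
    m = np.eye(2, dtype=complex)
    for n, d in layers:
        m = m @ layer_matrix(n, d, freq)
    b, c = m @ np.array([1.0, n_out])
    return 2 * n_in / (n_in * b + c)

# Illustrative Teflon / air gap / quartz stack (indices and thicknesses assumed)
layers = [(1.43, 2e-3), (1.00027, 5e-3), (1.96, 1e-3)]
for f in (0.5e12, 1.0e12):
    print(f"{f / 1e12:.1f} THz: |t| = {abs(stack_transmission(layers, f)):.3f}")
```

Because the matrices keep all internal reflections, the same expression remains valid for long time-domain records containing the echo pulses, which is what permits the high frequency resolution described in the abstract.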
A multi-threshold sampling method for TOF-PET signal processing
NASA Astrophysics Data System (ADS)
Kim, H.; Kao, C. M.; Xie, Q.; Chen, C. T.; Zhou, L.; Tang, F.; Frisch, H.; Moses, W. W.; Choong, W. S.
2009-04-01
As an approach to realizing all-digital data acquisition for positron emission tomography (PET), we have previously proposed and studied a multi-threshold sampling method to generate samples of a PET event waveform with respect to a few user-defined amplitudes. In this sampling scheme, one can extract both the energy and timing information for an event. In this paper, we report our prototype implementation of this sampling method and the performance results obtained with this prototype. The prototype consists of two multi-threshold discriminator boards and a time-to-digital converter (TDC) board. Each of the multi-threshold discriminator boards takes one input and provides up to eight threshold levels, which can be defined by users, for sampling the input signal. The TDC board employs the CERN HPTDC chip that determines the digitized times of the leading and falling edges of the discriminator output pulses. We connect our prototype electronics to the outputs of two Hamamatsu R9800 photomultiplier tubes (PMTs) that are individually coupled to a 6.25×6.25×25 mm³ LSO crystal. By analyzing waveform samples generated by using four thresholds, we obtain a coincidence timing resolution of about 340 ps and an ~18% energy resolution at 511 keV. We are also able to estimate the decay-time constant from the resulting samples and obtain a mean value of 44 ns with an ~9 ns FWHM. In comparison, using digitized waveforms obtained at a 20 GSps sampling rate for the same LSO/PMT modules we obtain ~300 ps coincidence timing resolution, ~14% energy resolution at 511 keV, and ~5 ns FWHM for the estimated decay-time constant. Details of the results on the timing and energy resolutions by using the multi-threshold method indicate that it is a promising approach for implementing digital PET data acquisition.
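As a toy illustration of how threshold-crossing samples carry both timing and pulse-shape information, the sketch below extrapolates the two lowest rising-edge crossings back to the baseline to estimate an arrival time and fits an exponential to the falling-edge crossings to estimate the decay constant and amplitude. The crossing values are hypothetical, and this is not the reconstruction algorithm used in the paper.

```python
import numpy as np

def analyze_crossings(lead, trail):
    """Toy reconstruction from multi-threshold samples of one pulse.
    lead / trail: lists of (time, threshold) pairs where the pulse crossed
    each threshold on the rising and falling edge, respectively.
    Arrival time: linear extrapolation of the two lowest rising-edge
    crossings back to the baseline. Decay constant and amplitude:
    a log-linear fit to the falling-edge crossings, assuming an
    exponential tail."""
    (t1, v1), (t2, v2) = sorted(lead, key=lambda p: p[1])[:2]
    slope = (v2 - v1) / (t2 - t1)
    t_arrival = t1 - v1 / slope

    tt = np.array([p[0] for p in trail])
    vv = np.array([p[1] for p in trail])
    b, log_a = np.polyfit(tt, np.log(vv), 1)          # ln V = ln A - t / tau
    return t_arrival, -1.0 / b, np.exp(log_a)

# Hypothetical crossings (ns, arbitrary amplitude units) for four thresholds
lead  = [(2.1, 0.05), (2.4, 0.10), (2.8, 0.20), (3.4, 0.40)]
trail = [(40.0, 0.40), (68.0, 0.20), (96.0, 0.10), (124.0, 0.05)]
t0, tau, amp = analyze_crossings(lead, trail)
print(f"arrival ~ {t0:.2f} ns, decay ~ {tau:.1f} ns, amplitude ~ {amp:.2f}")
```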
Piecewise multivariate modelling of sequential metabolic profiling data.
Rantalainen, Mattias; Cloarec, Olivier; Ebbels, Timothy M D; Lundstedt, Torbjörn; Nicholson, Jeremy K; Holmes, Elaine; Trygg, Johan
2008-02-19
Modelling the time-related behaviour of biological systems is essential for understanding their dynamic responses to perturbations. In metabolic profiling studies, the sampling rate and number of sampling points are often restricted due to experimental and biological constraints. A supervised multivariate modelling approach with the objective to model the time-related variation in the data for short and sparsely sampled time-series is described. A set of piecewise Orthogonal Projections to Latent Structures (OPLS) models are estimated, describing changes between successive time points. The individual OPLS models are linear, but the piecewise combination of several models accommodates modelling and prediction of changes which are non-linear with respect to the time course. We demonstrate the method on both simulated and metabolic profiling data, illustrating how time related changes are successfully modelled and predicted. The proposed method is effective for modelling and prediction of short and multivariate time series data. A key advantage of the method is model transparency, allowing easy interpretation of time-related variation in the data. The method provides a competitive complement to commonly applied multivariate methods such as OPLS and Principal Component Analysis (PCA) for modelling and analysis of short time-series data.
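The piecewise idea can be sketched with ordinary PLS regression as a simpler stand-in for OPLS (which is not available in scikit-learn): one low-dimensional model is fitted per pair of successive time points, and its weights summarize the time-related change over that step. The simulated data, number of components, and variable-selection step are all illustrative.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(11)

# Simulated profiling data: 8 samples per time point, 50 variables,
# with a different subset of variables shifting between successive points.
time_points = [0, 1, 2, 3]
n_per, n_var = 8, 50
X, t = [], []
profile = np.zeros(n_var)
for k, tp in enumerate(time_points):
    profile = profile.copy()
    profile[k * 10:(k + 1) * 10] += 1.5          # change specific to this step
    X.append(profile + rng.normal(0, 0.5, (n_per, n_var)))
    t.append(np.full(n_per, tp))
X, t = np.vstack(X), np.concatenate(t)

# One two-component PLS model per pair of successive time points;
# its weights summarise the change over that step.
for a, b in zip(time_points[:-1], time_points[1:]):
    mask = (t == a) | (t == b)
    pls = PLSRegression(n_components=2).fit(X[mask], (t[mask] == b).astype(float))
    top = np.argsort(np.abs(pls.x_weights_[:, 0]))[-5:]
    print(f"step {a}->{b}: most influential variables {sorted(top.tolist())}")
```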
An Improved Manual Method for NOx Emission Measurement.
ERIC Educational Resources Information Center
Dee, L. A.; And Others
The current manual NOx sampling and analysis method was evaluated. Improved time-integrated sampling and rapid analysis methods were developed. In the new method, the sample gas is drawn through a heated bed of uniquely active, crystalline PbO2, where NOx is quantitatively absorbed. Nitrate ion is later extracted with water and the…
Rapid fusion method for the determination of Pu, Np, and Am in large soil samples
Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...
2015-02-14
A new rapid sodium hydroxide fusion method for the preparation of 10-20 g soil samples has been developed by the Savannah River National Laboratory (SRNL). The method enables lower detection limits for plutonium, neptunium, and americium in environmental soil samples. The method also significantly reduces sample processing time and acid fume generation compared to traditional soil digestion techniques using hydrofluoric acid. Ten gram soil aliquots can be ashed and fused using the new method in 1-2 hours, completely dissolving samples, including refractory particles. Pu, Np and Am are separated using stacked 2 mL cartridges of TEVA and DGA Resin and measured using alpha spectrometry. The method can be adapted for measurement by inductively-coupled plasma mass spectrometry (ICP-MS). Two 10 g soil aliquots of fused soil may be combined prior to chromatographic separations to further improve detection limits. Total sample preparation time, including chromatographic separations and alpha spectrometry source preparation, is less than 8 hours.
Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W
2015-06-01
Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods: quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards to determine the most practical methods for sampling the three most prominent species, which included Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time by sampling method interaction indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggests that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods and sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean include walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications, when efficiency is not paramount. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
ERIC Educational Resources Information Center
Collins, Kathleen M. T.; Onwuegbuzie, Anthony J.; Jiao, Qun G.
2007-01-01
A sequential design utilizing identical samples was used to classify mixed methods studies via a two-dimensional model, wherein sampling designs were grouped according to the time orientation of each study's components and the relationship of the qualitative and quantitative samples. A quantitative analysis of 121 studies representing nine fields…
Observational studies of patients in the emergency department: a comparison of 4 sampling methods.
Valley, Morgan A; Heard, Kennon J; Ginde, Adit A; Lezotte, Dennis C; Lowenstein, Steven R
2012-08-01
We evaluate the ability of 4 sampling methods to generate representative samples of the emergency department (ED) population. We analyzed the electronic records of 21,662 consecutive patient visits at an urban, academic ED. From this population, we simulated different models of study recruitment in the ED by using 2 sample sizes (n=200 and n=400) and 4 sampling methods: true random, random 4-hour time blocks by exact sample size, random 4-hour time blocks by a predetermined number of blocks, and convenience or "business hours." For each method and sample size, we obtained 1,000 samples from the population. Using χ² tests, we measured the number of statistically significant differences between the sample and the population for 8 variables (age, sex, race/ethnicity, language, triage acuity, arrival mode, disposition, and payer source). Then, for each variable, method, and sample size, we compared the proportion of the 1,000 samples that differed from the overall ED population to the expected proportion (5%). Only the true random samples represented the population with respect to sex, race/ethnicity, triage acuity, mode of arrival, language, and payer source in at least 95% of the samples. Patient samples obtained using random 4-hour time blocks and business hours sampling systematically differed from the overall ED patient population for several important demographic and clinical variables. However, the magnitude of these differences was not large. Common sampling strategies selected for ED-based studies may affect parameter estimates for several representative population variables. However, the potential for bias for these variables appears small. Copyright © 2012. Published by Mosby, Inc.
Archfield, Stacey A.; LeBlanc, Denis R.
2005-01-01
To evaluate diffusion sampling as an alternative method to monitor volatile organic compound (VOC) concentrations in ground water, concentrations in samples collected by traditional pumped-sampling methods were compared to concentrations in samples collected by diffusion-sampling methods for 89 monitoring wells at or near the Massachusetts Military Reservation, Cape Cod. Samples were analyzed for 36 VOCs. There was no substantial difference between the utility of diffusion and pumped samples to detect the presence or absence of a VOC. In wells where VOCs were detected, diffusion-sample concentrations of tetrachloroethene (PCE) and trichloroethene (TCE) were significantly lower than pumped-sample concentrations. Because PCE and TCE concentrations detected in the wells dominated the calculation of many of the total VOC concentrations, when VOC concentrations were summed and compared by sampling method, visual inspection also showed a downward concentration bias in the diffusion-sample concentration. The degree to which pumped- and diffusion-sample concentrations agreed was not a result of variability inherent within the sampling methods or the diffusion process itself. A comparison of the degree of agreement in the results from the two methods to 13 quantifiable characteristics external to the sampling methods offered only well-screen length as being related to the degree of agreement between the methods; however, there is also evidence to indicate that the flushing rate of water through the well screen affected the agreement between the sampling methods. Despite poor agreement between the concentrations obtained by the two methods at some wells, the degree to which the concentrations agree at a given well is repeatable. A one-time, well-by-well comparison between diffusion- and pumped-sampling methods could determine which wells are good candidates for the use of diffusion samplers. For wells with good method agreement, the diffusion-sampling method is a time-saving and cost-effective alternative to pumped-sampling methods in a long-term monitoring program, such as at the Massachusetts Military Reservation.
Mass load estimation errors utilizing grab sampling strategies in a karst watershed
Fogle, A.W.; Taraba, J.L.; Dinger, J.S.
2003-01-01
Developing a mass load estimation method appropriate for a given stream and constituent is difficult due to inconsistencies in hydrologic and constituent characteristics. The difficulty may be increased in flashy flow conditions such as karst. Many projects undertaken are constrained by budget and manpower and do not have the luxury of sophisticated sampling strategies. The objectives of this study were to: (1) examine two grab sampling strategies with varying sampling intervals and determine the error in mass load estimates, and (2) determine the error that can be expected when a grab sample is collected at a time of day when the diurnal variation is most divergent from the daily mean. Results show grab sampling with continuous flow to be a viable data collection method for estimating mass load in the study watershed. Comparing weekly, biweekly, and monthly grab sampling, monthly sampling produces the best results with this method. However, the time of day the sample is collected is important. Failure to account for diurnal variability when collecting a grab sample may produce unacceptable error in mass load estimates. The best time to collect a sample is when the diurnal cycle is nearest the daily mean.
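A minimal numerical sketch of the kind of error the abstract describes, under simple illustrative assumptions (constant flow, a sinusoidal diurnal concentration cycle); the values, units, and grab times are made up and are not from the study.

```python
# Minimal sketch (not the study's data): effect of grab-sample timing on a
# daily mass load estimate when concentration varies diurnally.
import numpy as np

hours = np.arange(0, 24, 0.25)                           # quarter-hour grid
flow = 50.0 * np.ones_like(hours)                        # L/s, assumed constant here
conc = 2.0 + 0.8 * np.sin(2 * np.pi * (hours - 6) / 24)  # mg/L, diurnal cycle

dt = 0.25 * 3600                                         # time step in seconds
true_load = np.sum(conc * flow * dt) / 1e6               # kg/day (mg -> kg)

for grab_hour in (6, 12, 18):                            # time of day of the grab sample
    grab_conc = np.interp(grab_hour, hours, conc)
    est_load = grab_conc * np.sum(flow * dt) / 1e6       # grab concentration x continuous flow volume
    err = 100 * (est_load - true_load) / true_load
    print(f"grab at {grab_hour:02d}:00 -> load {est_load:.2f} kg/day ({err:+.1f}% error)")
```

The error changes sign with the time of day, which is the point the abstract makes about sampling when the diurnal cycle is far from the daily mean.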
Fatemeh, Dehghan; Reza, Zolfaghari Mohammad; Mohammad, Arjomandzadegan; Salomeh, Kalantari; Reza, Ahmari Gholam; Hossein, Sarmadian; Maryam, Sadrnia; Azam, Ahmadi; Mana, Shojapoor; Negin, Najarian; Reza, Kasravi Alii; Saeed, Falahat
2014-01-01
Objective To analyse molecular detection of coliforms and shorten the time of PCR. Methods Rapid detection of coliforms by amplification of the lacZ and uidA genes in a multiplex PCR reaction was designed and performed in comparison with the most probable number (MPN) method for 16 artificial and 101 field samples. The molecular method was also conducted on coliforms isolated from positive MPN samples; a standard sample for verification of the microbial method (certified reference material); strains isolated from the certified reference material; and standard bacteria. The PCR and electrophoresis parameters were changed to reduce the operation time. Results Results of PCR for the lacZ and uidA genes were similar in all of the standard, operational and artificial samples and showed the 876 bp and 147 bp bands of the lacZ and uidA genes by multiplex PCR. PCR results were confirmed by the MPN culture method with a sensitivity of 86% (95% CI: 0.71-0.93). The total execution time, with a successful change of factors, was reduced to less than two and a half hours. Conclusions The multiplex PCR method with shortened operation time was used for the simultaneous detection of total coliforms and Escherichia coli in the distribution system of Arak city. It is recommended to be used at least as an initial screening test, with positive samples then randomly tested by MPN. PMID:25182727
Sandstrom, Mark W.; Stroppel, Max E.; Foreman, William T.; Schroeder, Michael P.
2001-01-01
A method for the isolation and analysis of 21 parent pesticides and 20 pesticide degradates in natural-water samples is described. Water samples are filtered to remove suspended particulate matter and then are pumped through disposable solid-phase-extraction columns that contain octadecyl-bonded porous silica to extract the analytes. The columns are dried by using nitrogen gas, and adsorbed analytes are eluted with ethyl acetate. Extracted analytes are determined by capillary-column gas chromatography/mass spectrometry with selected-ion monitoring of three characteristic ions. The upper concentration limit is 2 micrograms per liter (µg/L) for most analytes. Single-operator method detection limits in reagent-water samples range from 0.001 to 0.057 µg/L. Validation data also are presented for 14 parent pesticides and 20 degradates that were determined to have greater bias or variability, or shorter holding times, than the other compounds. The estimated maximum holding time for analytes in pesticide-grade water before extraction was 4 days. The estimated maximum holding time for analytes after extraction on the dry solid-phase-extraction columns was 7 days. An optional on-site extraction procedure allows for samples to be collected and processed at remote sites where it is difficult to ship samples to the laboratory within the recommended pre-extraction holding time. The method complements existing U.S. Geological Survey Method O-1126-95 (NWQL Schedules 2001 and 2010) by using identical sample preparation and comparable instrument analytical conditions so that sample extracts can be analyzed by either method to expand the range of analytes determined from one water sample.
Geldsetzer, Pascal; Fink, Günther; Vaikath, Maria; Bärnighausen, Till
2018-02-01
(1) To evaluate the operational efficiency of various sampling methods for patient exit interviews; (2) to discuss under what circumstances each method yields an unbiased sample; and (3) to propose a new, operationally efficient, and unbiased sampling method. Literature review, mathematical derivation, and Monte Carlo simulations. Our simulations show that in patient exit interviews it is most operationally efficient if the interviewer, after completing an interview, selects the next patient exiting the clinical consultation. We demonstrate mathematically that this method yields a biased sample: patients who spend a longer time with the clinician are overrepresented. This bias can be removed by selecting the next patient who enters, rather than exits, the consultation room. We show that this sampling method is operationally more efficient than alternative methods (systematic and simple random sampling) in most primary health care settings. Under the assumption that the order in which patients enter the consultation room is unrelated to the length of time spent with the clinician and the interviewer, selecting the next patient entering the consultation room tends to be the operationally most efficient unbiased sampling method for patient exit interviews. © 2016 The Authors. Health Services Research published by Wiley Periodicals, Inc. on behalf of Health Research and Educational Trust.
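A small Monte Carlo sketch of the length-bias mechanism described above, under illustrative assumptions (a single clinician seeing patients back-to-back, exponentially distributed consultation lengths, a fixed interview duration); it is not the paper's simulation code.

```python
# Minimal Monte Carlo sketch: the interviewer, after each interview, takes the
# next patient to exit. Patients with long consultations are more likely to be
# the one "in the room" when the interviewer becomes free, so the sample is
# length-biased; selecting the next patient to *enter* instead avoids this.
import numpy as np

rng = np.random.default_rng(1)
consult_len = rng.exponential(scale=15.0, size=5000)   # minutes with clinician
exit_times = np.cumsum(consult_len)                    # back-to-back consultations
interview_len = 10.0                                   # minutes per exit interview

sampled = []
t = 0.0
while True:
    idx = np.searchsorted(exit_times, t, side="right")  # patient in consultation at time t
    if idx >= len(exit_times):
        break
    sampled.append(idx)
    t = exit_times[idx] + interview_len                 # interviewer busy until here

print(f"mean consultation, all patients    : {consult_len.mean():.1f} min")
print(f"mean consultation, 'next exit' rule: {consult_len[sampled].mean():.1f} min")
```

The sampled mean comes out well above the population mean, which is the overrepresentation of long consultations the authors derive.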
Detecting chaos in irregularly sampled time series.
Kulp, C W
2013-09-01
Recently, Wiebe and Virgin [Chaos 22, 013136 (2012)] developed an algorithm which detects chaos by analyzing a time series' power spectrum which is computed using the Discrete Fourier Transform (DFT). Their algorithm, like other time series characterization algorithms, requires that the time series be regularly sampled. Real-world data, however, are often irregularly sampled, thus, making the detection of chaotic behavior difficult or impossible with those methods. In this paper, a characterization algorithm is presented, which effectively detects chaos in irregularly sampled time series. The work presented here is a modification of Wiebe and Virgin's algorithm and uses the Lomb-Scargle Periodogram (LSP) to compute a series' power spectrum instead of the DFT. The DFT is not appropriate for irregularly sampled time series. However, the LSP is capable of computing the frequency content of irregularly sampled data. Furthermore, a new method of analyzing the power spectrum is developed, which can be useful for differentiating between chaotic and non-chaotic behavior. The new characterization algorithm is successfully applied to irregularly sampled data generated by a model as well as data consisting of observations of variable stars.
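A hedged sketch of the periodogram stage only: computing the power spectrum of an irregularly sampled series with the Lomb-Scargle periodogram in SciPy, the step for which the DFT is unsuitable. The synthetic signal and frequency grid are illustrative; the paper's chaos-scoring of the spectrum is not reproduced.

```python
# Lomb-Scargle periodogram of an irregularly sampled series (illustrative data).
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(2)
t = np.sort(rng.uniform(0, 100, 800))          # irregular sample times
x = np.sin(1.3 * t) + 0.5 * np.sin(2.7 * t) + 0.1 * rng.standard_normal(t.size)

ang_freqs = np.linspace(0.1, 5.0, 2000)        # angular frequencies to evaluate
power = lombscargle(t, x - x.mean(), ang_freqs)

print(f"dominant angular frequency: {ang_freqs[np.argmax(power)]:.2f} rad per time unit")
```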
Bonetta, Sa; Bonetta, Si; Ferretti, E; Balocco, F; Carraro, E
2010-05-01
This study was designed to define the extent of water contamination by Legionella pneumophila of certain Italian hotels and to compare quantitative real-time PCR with the conventional culture method. Nineteen Italian hotels of different sizes were investigated. In each hotel three hot water samples (boiler, room showers, recycling) and one cold water sample (inlet) were collected. Physico-chemical parameters were also analysed. Legionella pneumophila was detected in 42% and 74% of the hotels investigated by the culture method and by real-time PCR, respectively. In 21% of samples analysed by the culture method, a concentration of >10⁴ CFU l⁻¹ was found, and Leg. pneumophila serogroup 1 was isolated from 10.5% of the hotels. The presence of Leg. pneumophila was significantly influenced by water sample temperature, while no association with water hardness or residual-free chlorine was found. This study showed a high percentage of buildings colonized by Leg. pneumophila. Moreover, real-time PCR proved to be sensitive enough to detect lower levels of contamination than the culture method. This study indicates that the Italian hotels represent a possible source of risk for Legionnaires' disease and confirms the sensitivity of the molecular method. To our knowledge, this is the first report to demonstrate Legionella contamination in Italian hotels using real-time PCR and culture methods.
Detection and monitoring of invasive exotic plants: a comparison of four sampling methods
Cynthia D. Huebner
2007-01-01
The ability to detect and monitor exotic invasive plants is likely to vary depending on the sampling method employed. Methods with strong qualitative thoroughness for species detection often lack the intensity necessary to monitor vegetation change. Four sampling methods (systematic plot, stratified-random plot, modified Whittaker, and timed meander) in hemlock and red...
Molecular dynamics based enhanced sampling of collective variables with very large time steps.
Chen, Pei-Yang; Tuckerman, Mark E
2018-01-14
Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
Molecular dynamics based enhanced sampling of collective variables with very large time steps
NASA Astrophysics Data System (ADS)
Chen, Pei-Yang; Tuckerman, Mark E.
2018-01-01
Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
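For orientation only, a plain two-level r-RESPA multiple time step integrator on a toy one-dimensional system with a stiff "fast" force and a soft "slow" force; the forces and step sizes are made up, and the isokinetic Nosé-Hoover and stochastic resonance-free schemes that the paper combines with enhanced sampling are not reproduced here.

```python
# Standard two-level r-RESPA velocity-Verlet sketch (illustrative, not the paper's scheme).
import numpy as np

def fast_force(x):   # stiff harmonic bond: cheap, evaluated every inner step
    return -400.0 * x

def slow_force(x):   # soft anharmonic background: "expensive", evaluated rarely
    return -0.5 * x**3

def respa_step(x, v, dt_outer, n_inner, mass=1.0):
    """One outer step: slow half-kick, n_inner fast velocity-Verlet steps, slow half-kick."""
    v += 0.5 * dt_outer * slow_force(x) / mass
    dt = dt_outer / n_inner
    for _ in range(n_inner):
        v += 0.5 * dt * fast_force(x) / mass
        x += dt * v
        v += 0.5 * dt * fast_force(x) / mass
    v += 0.5 * dt_outer * slow_force(x) / mass
    return x, v

x, v = 1.0, 0.0
for _ in range(1000):
    x, v = respa_step(x, v, dt_outer=0.05, n_inner=10)
print(f"final state: x={x:.4f}, v={v:.4f}")
```

The resonance limit the abstract refers to is the cap on `dt_outer` in exactly this kind of splitting, which the papers cited above remove with isokinetic constraints.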
Generalized sample entropy analysis for traffic signals based on similarity measure
NASA Astrophysics Data System (ADS)
Shang, Du; Xu, Mengjia; Shang, Pengjian
2017-05-01
Sample entropy is a prevailing method used to quantify the complexity of a time series. In this paper, a modified method of generalized sample entropy and surrogate data analysis is proposed as a new measure to assess the complexity of a complex dynamical system such as traffic signals. The method, based on a similarity distance, presents a different way of matching signal patterns and reveals distinct complexity behaviors. Simulations are conducted on synthetic data and traffic signals to provide a comparative study demonstrating the power of the new method. Compared with previous sample entropy and surrogate data analysis, the new method has two main advantages. The first is that it overcomes the limitation concerning the relationship between the dimension parameter and the length of the series. The second is that the modified sample entropy functions can be used to quantitatively distinguish time series from different complex systems by the similarity measure.
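As a baseline for the modified measure described above, a hedged sketch of standard sample entropy (SampEn) with the usual Chebyshev distance and common parameter choices (m = 2, r = 0.2·std); the paper's similarity-measure variant is not implemented.

```python
# Standard sample entropy sketch; a regular sine scores lower than white noise.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(m):
        # embed the series into overlapping vectors of length m
        emb = np.array([x[i:i + m] for i in range(len(x) - m)])
        # pairwise Chebyshev distances, self-matches excluded
        d = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
        return (np.sum(d <= r) - len(emb)) / 2

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(3)
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))
noisy = rng.standard_normal(1000)
print("SampEn sine :", round(sample_entropy(regular), 3))
print("SampEn noise:", round(sample_entropy(noisy), 3))
```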
O'Reilly, Joseph E; Donoghue, Philip C J
2018-03-01
Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data.
O’Reilly, Joseph E; Donoghue, Philip C J
2018-01-01
Consensus trees are required to summarize trees obtained through MCMC sampling of a posterior distribution, providing an overview of the distribution of estimated parameters such as topology, branch lengths, and divergence times. Numerous consensus tree construction methods are available, each presenting a different interpretation of the tree sample. The rise of morphological clock and sampled-ancestor methods of divergence time estimation, in which times and topology are coestimated, has increased the popularity of the maximum clade credibility (MCC) consensus tree method. The MCC method assumes that the sampled, fully resolved topology with the highest clade credibility is an adequate summary of the most probable clades, with parameter estimates from compatible sampled trees used to obtain the marginal distributions of parameters such as clade ages and branch lengths. Using both simulated and empirical data, we demonstrate that MCC trees, and trees constructed using the similar maximum a posteriori (MAP) method, often include poorly supported and incorrect clades when summarizing diffuse posterior samples of trees. We demonstrate that the paucity of information in morphological data sets contributes to the inability of MCC and MAP trees to accurately summarize the posterior distribution. Conversely, majority-rule consensus (MRC) trees represent a lower proportion of incorrect nodes when summarizing the same posterior samples of trees. Thus, we advocate the use of MRC trees, in place of MCC or MAP trees, in attempts to summarize the results of Bayesian phylogenetic analyses of morphological data. PMID:29106675
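A toy sketch of the majority-rule consensus idea the authors favor: clades are kept if they occur in more than half of the sampled trees. The "trees" below are hypothetical clade sets; a real analysis would read posterior tree samples with a phylogenetics library.

```python
# Majority-rule consensus from clade frequencies (toy, hypothetical data).
from collections import Counter

# each sampled tree represented by the set of clades it contains
sampled_trees = [
    {frozenset("AB"), frozenset("ABC"), frozenset("DE")},
    {frozenset("AB"), frozenset("ABC"), frozenset("CD")},
    {frozenset("AB"), frozenset("ABD"), frozenset("DE")},
    {frozenset("AB"), frozenset("ABC"), frozenset("DE")},
]

clade_counts = Counter(clade for tree in sampled_trees for clade in tree)
n = len(sampled_trees)
majority = {clade: count / n for clade, count in clade_counts.items() if count / n > 0.5}

for clade, freq in sorted(majority.items(), key=lambda kv: -kv[1]):
    print("".join(sorted(clade)), f"support={freq:.2f}")
```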
NASA Astrophysics Data System (ADS)
Ahn, Surl-Hee; Grate, Jay W.; Darve, Eric F.
2017-08-01
Molecular dynamics simulations are useful in obtaining thermodynamic and kinetic properties of bio-molecules, but they are limited by the time scale barrier. That is, we may not obtain properties efficiently because we need to run microseconds or longer simulations using femtosecond time steps. To overcome this time scale barrier, we can use the weighted ensemble (WE) method, a powerful enhanced sampling method that efficiently samples thermodynamic and kinetic properties. However, the WE method requires an appropriate partitioning of phase space into discrete macrostates, which can be problematic when we have a high-dimensional collective space or when little is known a priori about the molecular system. Hence, we developed a new WE-based method, called the "Concurrent Adaptive Sampling (CAS) algorithm," to tackle these issues. The CAS algorithm is not constrained to use only one or two collective variables, unlike most reaction coordinate-dependent methods. Instead, it can use a large number of collective variables and adaptive macrostates to enhance the sampling in the high-dimensional space. This is especially useful for systems in which we do not know what the right reaction coordinates are, in which case we can use many collective variables to sample conformations and pathways. In addition, a clustering technique based on the committor function is used to accelerate sampling of the slowest process in the molecular system. In this paper, we introduce the new method and show results from two-dimensional models and bio-molecules, specifically penta-alanine and a triazine trimer.
Real-time PCR detection of Plasmodium directly from whole blood and filter paper samples
2011-01-01
Background Real-time PCR is a sensitive and specific method for the analysis of Plasmodium DNA. However, prior purification of genomic DNA from blood is necessary since PCR inhibitors and quenching of fluorophores from blood prevent efficient amplification and detection of PCR products. Methods Reagents designed to specifically overcome PCR inhibition and quenching of fluorescence were evaluated for real-time PCR amplification of Plasmodium DNA directly from blood. Whole blood from clinical samples and dried blood spots collected in the field in Colombia were tested. Results Amplification and fluorescence detection by real-time PCR were optimal with 40× SYBR® Green dye and 5% blood volume in the PCR reaction. Plasmodium DNA was detected directly from both whole blood and dried blood spots from clinical samples. The sensitivity and specificity ranged from 93-100% compared with PCR performed on purified Plasmodium DNA. Conclusions The methodology described facilitates high-throughput testing of blood samples collected in the field by fluorescence-based real-time PCR. This method can be applied to a broad range of clinical studies with the advantages of immediate sample testing, lower experimental costs and time-savings. PMID:21851640
Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett; ...
2016-08-22
Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of the M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. Furthermore, this study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett
Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of the M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. Furthermore, this study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX.
Nogly, Przemyslaw; Panneels, Valerie; Nelson, Garrett; Gati, Cornelius; Kimura, Tetsunari; Milne, Christopher; Milathianaki, Despina; Kubo, Minoru; Wu, Wenting; Conrad, Chelsie; Coe, Jesse; Bean, Richard; Zhao, Yun; Båth, Petra; Dods, Robert; Harimoorthy, Rajiv; Beyerlein, Kenneth R.; Rheinberger, Jan; James, Daniel; DePonte, Daniel; Li, Chufeng; Sala, Leonardo; Williams, Garth J.; Hunter, Mark S.; Koglin, Jason E.; Berntsen, Peter; Nango, Eriko; Iwata, So; Chapman, Henry N.; Fromme, Petra; Frank, Matthias; Abela, Rafael; Boutet, Sébastien; Barty, Anton; White, Thomas A.; Weierstall, Uwe; Spence, John; Neutze, Richard; Schertler, Gebhard; Standfuss, Jörg
2016-01-01
Serial femtosecond crystallography (SFX) using X-ray free-electron laser sources is an emerging method with considerable potential for time-resolved pump-probe experiments. Here we present a lipidic cubic phase SFX structure of the light-driven proton pump bacteriorhodopsin (bR) to 2.3 Å resolution and a method to investigate protein dynamics with modest sample requirement. Time-resolved SFX (TR-SFX) with a pump-probe delay of 1 ms yields difference Fourier maps compatible with the dark to M state transition of bR. Importantly, the method is very sample efficient and reduces sample consumption to about 1 mg per collected time point. Accumulation of M intermediate within the crystal lattice is confirmed by time-resolved visible absorption spectroscopy. This study provides an important step towards characterizing the complete photocycle dynamics of retinal proteins and demonstrates the feasibility of a sample efficient viscous medium jet for TR-SFX. PMID:27545823
Hobolth, Asger; Stone, Eric A
2009-09-01
Analyses of serially-sampled data often begin with the assumption that the observations represent discrete samples from a latent continuous-time stochastic process. The continuous-time Markov chain (CTMC) is one such generative model whose popularity extends to a variety of disciplines ranging from computational finance to human genetics and genomics. A common theme among these diverse applications is the need to simulate sample paths of a CTMC conditional on realized data that is discretely observed. Here we present a general solution to this sampling problem when the CTMC is defined on a discrete and finite state space. Specifically, we consider the generation of sample paths, including intermediate states and times of transition, from a CTMC whose beginning and ending states are known across a time interval of length T. We first unify the literature through a discussion of the three predominant approaches: (1) modified rejection sampling, (2) direct sampling, and (3) uniformization. We then give analytical results for the complexity and efficiency of each method in terms of the instantaneous transition rate matrix Q of the CTMC, its beginning and ending states, and the length of sampling time T. In doing so, we show that no method dominates the others across all model specifications, and we give explicit proof of which method prevails for any given Q, T, and endpoints. Finally, we introduce and compare three applications of CTMCs to demonstrate the pitfalls of choosing an inefficient sampler.
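A hedged sketch of the first of the three approaches discussed, in its plainest form: forward (Gillespie-style) simulation of the CTMC from the known start state, keeping only paths that end in the known end state at time T. The rate matrix below is a toy example; the paper's modified rejection, direct, and uniformization samplers are not shown.

```python
# Plain rejection sampling of an endpoint-conditioned CTMC path (toy rate matrix).
import numpy as np

rng = np.random.default_rng(4)
Q = np.array([[-1.0, 0.7, 0.3],
              [ 0.4, -0.9, 0.5],
              [ 0.2, 0.6, -0.8]])   # toy instantaneous rate matrix

def simulate_path(start, T):
    """Forward simulation; returns visited states and jump times up to T."""
    t, state = 0.0, start
    states, times = [start], [0.0]
    while True:
        t += rng.exponential(1.0 / -Q[state, state])   # waiting time in current state
        if t >= T:
            return states, times
        probs = Q[state].clip(min=0.0)
        probs /= probs.sum()
        state = int(rng.choice(len(Q), p=probs))
        states.append(state)
        times.append(t)

def rejection_sample(start, end, T, max_tries=10000):
    for _ in range(max_tries):
        states, times = simulate_path(start, T)
        if states[-1] == end:
            return states, times
    raise RuntimeError("no accepted path; endpoints too unlikely for plain rejection")

states, times = rejection_sample(start=0, end=2, T=1.5)
print("accepted path states:", states)
print("jump times          :", [round(t, 3) for t in times])
```

The inefficiency of this scheme when the endpoints are improbable is exactly what motivates the direct and uniformization alternatives compared in the paper.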
Self-contained cryogenic gas sampling apparatus and method
McManus, G.J.; Motes, B.G.; Bird, S.K.; Kotter, D.K.
1996-03-26
Apparatus for obtaining a whole gas sample is composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method is described for obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant. 3 figs.
Self-contained cryogenic gas sampling apparatus and method
McManus, Gary J.; Motes, Billy G.; Bird, Susan K.; Kotter, Dale K.
1996-01-01
Apparatus for obtaining a whole gas sample, composed of: a sample vessel having an inlet for receiving a gas sample; a controllable valve mounted for controllably opening and closing the inlet; a valve control coupled to the valve for opening and closing the valve at selected times; a portable power source connected for supplying operating power to the valve control; and a cryogenic coolant in thermal communication with the vessel for cooling the interior of the vessel to cryogenic temperatures. A method of obtaining an air sample using the apparatus described above, by: placing the apparatus at a location at which the sample is to be obtained; operating the valve control to open the valve at a selected time and close the valve at a selected subsequent time; and between the selected times maintaining the vessel at a cryogenic temperature by heat exchange with the coolant.
A new method for calculation of the chlorine demand of natural and treated waters.
Warton, Ben; Heitz, Anna; Joll, Cynthia; Kagi, Robert
2006-08-01
Conventional methods of calculating chlorine demand are dose dependent, making intercomparison of samples difficult, especially in cases where the samples contain substantially different concentrations of dissolved organic carbon (DOC), or other chlorine-consuming species. Using the method presented here, the values obtained for chlorine demand are normalised, allowing valid comparison of chlorine demand between samples, independent of the chlorine dose. Since the method is not dose dependent, samples with substantially differing water quality characteristics can be reliably compared. In our method, we dosed separate aliquots of a water sample with different chlorine concentrations, and periodically measured the residual chlorine concentrations in these subsamples. The chlorine decay data obtained in this way were then fitted to first-order exponential decay functions, corresponding to short-term demand (0-4h) and long-term demand (4-168 h). From the derived decay functions, the residual concentrations at a given time within the experimental time window were calculated and plotted against the corresponding initial chlorine concentrations, giving a linear relationship. From this linear function, it was then possible to determine the residual chlorine concentration for any initial concentration (i.e. dose). Thus, using this method, the initial chlorine dose required to give any residual chlorine concentration can be calculated for any time within the experimental time window, from a single set of experimental data.
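A hedged sketch of the calculation described: fit a first-order decay to each dosed aliquot's residual-chlorine time series, evaluate the residual at a chosen time, then regress residual against initial dose to obtain the linear dose-residual relationship. The data are invented, and a single exponential is used for brevity where the paper fits separate short-term and long-term functions.

```python
# Chlorine demand normalisation sketch with made-up decay data.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c0, k):
    return c0 * np.exp(-k * t)

t_hours = np.array([0.5, 1, 2, 4, 24, 72, 168], dtype=float)
doses = np.array([2.0, 4.0, 6.0, 8.0])                     # mg/L initial chlorine
# hypothetical measured residuals for each dose (rows) over time (columns)
residuals = np.array([[1.5, 1.3, 1.1, 0.9, 0.5, 0.3, 0.2],
                      [3.4, 3.1, 2.8, 2.4, 1.6, 1.1, 0.8],
                      [5.3, 5.0, 4.6, 4.1, 3.0, 2.2, 1.7],
                      [7.2, 6.9, 6.5, 5.9, 4.6, 3.6, 2.9]])

eval_time = 24.0                                            # hours
residual_at_t = []
for row in residuals:
    (c0, k), _ = curve_fit(first_order, t_hours, row, p0=(row[0], 0.01))
    residual_at_t.append(first_order(eval_time, c0, k))

slope, intercept = np.polyfit(doses, residual_at_t, 1)      # linear dose-residual relation
target_residual = 0.5                                       # mg/L wanted after 24 h
required_dose = (target_residual - intercept) / slope
print(f"dose needed for {target_residual} mg/L residual at {eval_time:.0f} h: {required_dose:.2f} mg/L")
```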
Drummond, A; Rodrigo, A G
2000-12-01
Reconstruction of evolutionary relationships from noncontemporaneous molecular samples provides a new challenge for phylogenetic reconstruction methods. With recent biotechnological advances there has been an increase in molecular sequencing throughput, and the potential to obtain serial samples of sequences from populations, including rapidly evolving pathogens, is fast being realized. A new method called the serial-sample unweighted pair grouping method with arithmetic means (sUPGMA) is presented that reconstructs a genealogy or phylogeny of sequences sampled serially in time using a matrix of pairwise distances. The resulting tree depicts the terminal lineages of each sample ending at a different level consistent with the sample's temporal order. Since sUPGMA is a variant of UPGMA, it will perform best when sequences have evolved at a constant rate (i.e., according to a molecular clock). On simulated data, this new method performs better than standard cluster analysis under a variety of longitudinal sampling strategies. Serial-sample UPGMA is particularly useful for analysis of longitudinal samples of viruses and bacteria, as well as ancient DNA samples, with the minimal requirement that samples of sequences be ordered in time.
Coes, Alissa L.; Paretti, Nicholas V.; Foreman, William T.; Iverson, Jana L.; Alvarez, David A.
2014-01-01
A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19–23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method.
Coes, Alissa L; Paretti, Nicholas V; Foreman, William T; Iverson, Jana L; Alvarez, David A
2014-03-01
A continuous active sampling method was compared to continuous passive and discrete sampling methods for the sampling of trace organic compounds (TOCs) in water. Results from each method are compared and contrasted in order to provide information for future investigators to use while selecting appropriate sampling methods for their research. The continuous low-level aquatic monitoring (CLAM) sampler (C.I.Agent® Storm-Water Solutions) is a submersible, low flow-rate sampler, that continuously draws water through solid-phase extraction media. CLAM samplers were deployed at two wastewater-dominated stream field sites in conjunction with the deployment of polar organic chemical integrative samplers (POCIS) and the collection of discrete (grab) water samples. All samples were analyzed for a suite of 69 TOCs. The CLAM and POCIS samples represent time-integrated samples that accumulate the TOCs present in the water over the deployment period (19-23 h for CLAM and 29 days for POCIS); the discrete samples represent only the TOCs present in the water at the time and place of sampling. Non-metric multi-dimensional scaling and cluster analysis were used to examine patterns in both TOC detections and relative concentrations between the three sampling methods. A greater number of TOCs were detected in the CLAM samples than in corresponding discrete and POCIS samples, but TOC concentrations in the CLAM samples were significantly lower than in the discrete and (or) POCIS samples. Thirteen TOCs of varying polarity were detected by all of the three methods. TOC detections and concentrations obtained by the three sampling methods, however, are dependent on multiple factors. This study found that stream discharge, constituent loading, and compound type all affected TOC concentrations detected by each method. In addition, TOC detections and concentrations were affected by the reporting limits, bias, recovery, and performance of each method. Published by Elsevier B.V.
Comparisons of discrete and integrative sampling accuracy in estimating pulsed aquatic exposures.
Morrison, Shane A; Luttbeg, Barney; Belden, Jason B
2016-11-01
Most current-use pesticides have short half-lives in the water column, and thus the most relevant exposure scenarios for many aquatic organisms are pulsed exposures. Quantifying exposure using discrete water samples may not be accurate, as few studies are able to sample frequently enough to accurately determine time-weighted average (TWA) concentrations of short aquatic exposures. Integrative sampling methods that continuously sample freely dissolved contaminants over time intervals (such as integrative passive samplers) have been demonstrated to be a promising measurement technique. We conducted several modeling scenarios to test the assumption that integrative methods may require many fewer samples for accurate estimation of peak 96-h TWA concentrations. We compared the accuracies of discrete point samples and integrative samples while varying sampling frequencies across a range of contaminant water half-lives (t50 = 0.5, 2, and 8 d). Differences in the predictive accuracy of discrete point samples and integrative samples were greatest at low sampling frequencies. For example, when the half-life was 0.5 d, discrete point samples required 7 sampling events to ensure median values > 50% and no sampling events reporting highly inaccurate results (defined as < 10% of the true 96-h TWA). Across all water half-lives investigated, integrative sampling required only two samples to prevent highly inaccurate results and to yield median values > 50% of the true concentration. Regardless, the need for integrative sampling diminished as water half-life increased. For an 8-d water half-life, two discrete samples produced accurate estimates and median values greater than those obtained for two integrative samples. Overall, integrative methods are the more accurate method for monitoring contaminants with short water half-lives due to the reduced frequency of extreme values, especially with uncertainties around the timing of pulsed events. However, the acceptability of discrete sampling methods for providing accurate concentration measurements increases with increasing aquatic half-lives. Copyright © 2016 Elsevier Ltd. All rights reserved.
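A hedged sketch of the comparison described: a single pulsed exposure decaying with a given water half-life, the "true" 96-h TWA, and estimates from a few discrete grab samples versus an idealized integrative sampler that accumulates over the whole interval. The numbers are illustrative and not the paper's scenarios, and a real grab design's error depends strongly on whether a sample happens to catch the pulse peak.

```python
# Discrete grabs vs an ideal integrative sampler for a decaying pulse.
import numpy as np

def pulse_concentration(t_hours, c0=10.0, half_life_days=0.5):
    k = np.log(2) / (half_life_days * 24.0)          # first-order decay rate, 1/h
    return c0 * np.exp(-k * t_hours)

t = np.linspace(0, 96, 9601)                         # fine grid over 96 h
c = pulse_concentration(t)
true_twa = np.trapz(c, t) / 96.0                     # "true" time-weighted average

# (a) discrete: average of grab samples taken every 24 h, first grab at the peak
grab_times = np.array([0.0, 24.0, 48.0, 72.0, 96.0])
discrete_twa = pulse_concentration(grab_times).mean()

# (b) integrative: ideal sampler accumulating at a constant rate recovers the TWA
integrative_twa = np.trapz(c, t) / 96.0

print(f"true 96-h TWA       : {true_twa:.3f}")
print(f"discrete (5 grabs)  : {discrete_twa:.3f}")
print(f"integrative sampler : {integrative_twa:.3f}")
```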
A quantitative evaluation of two methods for preserving hair samples
Roon, David A.; Waits, L.P.; Kendall, K.C.
2003-01-01
Hair samples are an increasingly important DNA source for wildlife studies, yet optimal storage methods and DNA degradation rates have not been rigorously evaluated. We tested amplification success rates over a one-year storage period for DNA extracted from brown bear (Ursus arctos) hair samples preserved using silica desiccation and -20 °C freezing. For three nuclear DNA microsatellites, success rates decreased significantly after a six-month time point, regardless of storage method. For a 1000 bp mitochondrial fragment, a similar decrease occurred after a two-week time point. Minimizing delays between collection and DNA extraction will maximize success rates for hair-based noninvasive genetic sampling projects.
2013-01-01
Background Amebiasis caused by Entamoeba histolytica is the third leading cause of death from parasitic diseases worldwide. This pathogenic amoeba is morphologically indistinguishable from E. dispar and E. moshkovskii, the non-pathogenic species. Polymerase chain reaction is the current method of choice approved by the World Health Organization. Real-time PCR is another attractive molecular method for diagnosis of infectious diseases, as post-PCR analyses are eliminated and turnaround times are shorter. The present work aimed to compare the results of Entamoeba species identification using the real-time assay against the established nested PCR method. Methods In this study, a total of 334 human faecal samples were collected from different Orang Asli settlements. Faecal samples were processed by direct wet smear and formalin ethyl acetate concentration methods followed by iodine staining and were microscopically examined for Entamoeba species and other intestinal parasites. Microscopically positive samples were then subjected to nested PCR and real-time PCR. Results The overall prevalence of Entamoeba infection was 19.5% (65/334). SK Posh Piah recorded the highest Entamoeba prevalence (63.3%), while Kampung Kemensah had the lowest prevalence (3.7%). Microscopically positive samples were then tested by real-time PCR and nested PCR for the presence of Entamoeba histolytica, Entamoeba dispar, and Entamoeba moshkovskii infection. Real-time PCR showed higher Entamoeba detection (86.2%) compared to nested PCR (80%), although the McNemar test showed no significant difference between the two methods (p = 0.221). Conclusions This study is the first in Malaysia to report the use of real-time PCR in identifying and differentiating the three Entamoeba infections. It also proved to be more effective than the conventional nested PCR molecular method. PMID:23985047
NASA Astrophysics Data System (ADS)
Chen, Ye; Wolanyk, Nathaniel; Ilker, Tunc; Gao, Shouguo; Wang, Xujing
Methods developed based on bifurcation theory have demonstrated their potential in driving network identification for complex human diseases, including the work by Chen et al. Recently, bifurcation theory has been successfully applied to model cellular differentiation. However, one often faces a technical challenge in driving network prediction: a time-course cellular differentiation study often contains only one sample at each time point, while driving network prediction typically requires multiple samples at each time point to infer the variation and interaction structures of candidate genes for the driving network. In this study, we investigate several methods to identify both the critical time point and the driving network through examination of how each time point affects the autocorrelation and phase locking. We apply these methods to a high-throughput sequencing (RNA-Seq) dataset of 42 subsets of thymocytes and mature peripheral T cells at multiple time points during their differentiation (GSE48138 from GEO). We compare the predicted driving genes with known transcription regulators of cellular differentiation. We will discuss the advantages and limitations of our proposed methods, as well as potential further improvements.
Xiong, Wenjun; Patel, Ragini; Cao, Jinde; Zheng, Wei Xing
In this brief, our purpose is to apply asynchronous and intermittent sampled-data control methods to achieve the synchronization of hierarchical time-varying neural networks. The asynchronous and intermittent sampled-data controllers are proposed for two reasons: 1) the controllers may not transmit the control information simultaneously and 2) the controllers cannot always exist at any time. The synchronization is then discussed for a kind of hierarchical time-varying neural networks based on the asynchronous and intermittent sampled-data controllers. Finally, the simulation results are given to illustrate the usefulness of the developed criteria.
NASA Astrophysics Data System (ADS)
Wang, Yonggang; Liu, Chong
2016-10-01
Field programmable gate arrays (FPGAs) manufactured with more advanced processing technology have faster carry chains and smaller delay elements, which are favorable for the design of tapped delay line (TDL)-style time-to-digital converters (TDCs) in FPGAs. However, new challenges are posed in using them to implement TDCs with a high time precision. In this paper, we propose a bin realignment method and a dual-sampling method for TDC implementation in a Xilinx UltraScale FPGA. The former realigns the disordered time delay taps so that the TDC precision can approach the limit of its delay granularity, while the latter doubles the number of taps in the delay line so that TDC precision beyond the cell-delay limit can be achieved. Two TDC channels were implemented in a Kintex UltraScale FPGA, and the effectiveness of the new methods was evaluated. For fixed time intervals in the range from 0 to 440 ns, the average RMS precision measured by the two TDC channels reaches 5.8 ps using the bin realignment, and it further improves to 3.9 ps by using the dual-sampling method. The time precision has a 5.6% variation in the measured temperature range. Every part of the TDC, including dual-sampling, encoding, and on-line calibration, could run at a 500 MHz clock frequency. The system measurement dead time is only 4 ns.
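The bin realignment and dual-sampling logic above lives in FPGA fabric, so only a companion software step is sketched here: a standard code-density (statistical) calibration that converts raw TDC bin codes to picoseconds once a hit histogram is available. The bin count, clock period, and bin-width spread are illustrative assumptions, not the paper's parameters.

```python
# Code-density calibration sketch for a delay-line TDC (illustrative values).
import numpy as np

rng = np.random.default_rng(5)
n_bins = 128
clock_period_ps = 2000.0                       # 500 MHz sampling clock

# simulate unequal delay-line bin widths and a code-density histogram from
# hits that are uniformly distributed over one clock period
true_widths = rng.uniform(0.5, 1.5, n_bins)
true_widths *= clock_period_ps / true_widths.sum()
hits = rng.choice(n_bins, size=200_000, p=true_widths / clock_period_ps)
hist = np.bincount(hits, minlength=n_bins)

# calibration: estimated width of each bin, and timestamp = centre of the bin
est_widths = hist / hits.size * clock_period_ps
bin_edges = np.concatenate(([0.0], np.cumsum(est_widths)))
bin_centres = 0.5 * (bin_edges[:-1] + bin_edges[1:])

code = 42                                      # a raw TDC code to convert
print(f"code {code} -> {bin_centres[code]:.1f} ps (estimated bin width {est_widths[code]:.2f} ps)")
```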
Lísa, Miroslav; Cífková, Eva; Khalikova, Maria; Ovčačíková, Magdaléna; Holčapek, Michal
2017-11-24
Lipidomic analysis of biological samples in clinical research is a challenging task for analytical methods owing to the large number of samples and their extreme complexity. In this work, we compare direct infusion (DI) and chromatography-mass spectrometry (MS) lipidomic approaches, represented by three analytical methods, in terms of comprehensiveness, sample throughput, and validation results for the lipidomic analysis of biological samples represented by tumor tissue, surrounding normal tissue, plasma, and erythrocytes of kidney cancer patients. Methods are compared in one laboratory using an identical analytical protocol to ensure comparable conditions. An ultrahigh-performance liquid chromatography/MS (UHPLC/MS) method in hydrophilic interaction liquid chromatography mode and a DI-MS method are used for this comparison as the most widely used methods for lipidomic analysis, together with an ultrahigh-performance supercritical fluid chromatography/MS (UHPSFC/MS) method showing promising results in metabolomic analyses. The nontargeted analysis of pooled samples is performed using all tested methods, and 610 lipid species within 23 lipid classes are identified. The DI method provides the most comprehensive results owing to the identification of some polar lipid classes that are not identified by the UHPLC and UHPSFC methods. On the other hand, the UHPSFC method provides excellent sensitivity for less polar lipid classes and the highest sample throughput, within a 10 min method time. The sample consumption of the DI method is 125 times higher than for the other methods, while only 40 μL of organic solvent is used for one sample analysis compared to 3.5 mL and 4.9 mL in the case of the UHPLC and UHPSFC methods, respectively. Methods are validated for the quantitative lipidomic analysis of plasma samples with one internal standard for each lipid class. Results show the applicability of all tested methods for the lipidomic analysis of biological samples, depending on the analysis requirements. Copyright © 2017 Elsevier B.V. All rights reserved.
A comparative evaluation between real time Roche COBas TAQMAN 48 HCV and bDNA Bayer Versant HCV 3.0.
Giraldi, Cristina; Noto, Alessandra; Tenuta, Robert; Greco, Francesca; Perugini, Daniela; Spadafora, Mario; Bianco, Anna Maria Lo; Savino, Olga; Natale, Alfonso
2006-10-01
Hepatitis C virus (HCV) is a common human pathogen with a single-stranded RNA genome of about 9600 nt. This work compared two commercial methods used for HCV viral load determination, the bDNA Bayer Versant HCV 3.0 and the real-time Roche COBAS TaqMan 48 HCV. We compared the reproducibility and linearity of the two methods. Seventy-five plasma samples with genotypes 1 to 4, representative of the population (45% genotype 1; 24% genotype 2; 13% genotype 3; 18% genotype 4), were directly processed with the Versant method, which is based upon signal amplification; the same samples were first extracted (COBAS Ampliprep - TNAI) and then amplified using real-time PCR (COBAS TaqMan 48). The results indicate the same performance for both methods for genotype 1 samples, but for samples with genotypes 2, 3 and 4 the Roche real-time PCR method gave an underestimation with respect to the Bayer bDNA assay.
An improved sampling method of complex network
NASA Astrophysics Data System (ADS)
Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing
2014-12-01
Subnet sampling is an important topic of complex network research. Sampling methods influence the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover the local structure at the same time. The experiments indicate that this novel sampling method can keep the similarity between the sampled subnet and the original network in degree distribution, connectivity rate and average shortest path. The method is applicable to situations where prior knowledge about the degree distribution of the original network is not sufficient.
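A simplified sketch of the general idea only, not the exact RMSC procedure: several random seeds each grow a snowball neighbourhood for a few waves, the union is taken as the sampled subnet, and its degree statistics are compared with the original network as in the paper. The graph model and parameters are illustrative, and networkx is assumed to be available.

```python
# Multi-seed snowball sampling of a synthetic network (illustrative, not RMSC).
import random
import networkx as nx

def multi_seed_snowball(G, n_seeds=5, waves=2, seed=0):
    rng = random.Random(seed)
    sampled = set()
    for s in rng.sample(list(G.nodes()), n_seeds):     # random seeds
        frontier = {s}
        sampled.add(s)
        for _ in range(waves):                          # snowball waves
            frontier = {nbr for u in frontier for nbr in G.neighbors(u)} - sampled
            sampled |= frontier
    return G.subgraph(sampled).copy()

G = nx.barabasi_albert_graph(2000, 3, seed=1)
sub = multi_seed_snowball(G, n_seeds=5, waves=2)

avg_deg = lambda g: sum(d for _, d in g.degree()) / g.number_of_nodes()
print(f"original: {G.number_of_nodes()} nodes, mean degree {avg_deg(G):.2f}")
print(f"subnet  : {sub.number_of_nodes()} nodes, mean degree {avg_deg(sub):.2f}")
```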
Capel, P.D.; Larson, S.J.
1995-01-01
Minimizing the loss of target organic chemicals from environmental water samples between the time of sample collection and isolation is important to the integrity of an investigation. During this sample holding time, there is a potential for analyte loss through volatilization from the water to the headspace; sorption to the walls and cap of the sample bottle; and transformation through biotic and/or abiotic reactions. This paper presents a chemodynamic-based, generalized approach to estimate the most probable loss processes for individual target organic chemicals. The basic premise is that the investigator must know which loss process(es) are important for a particular analyte, based on its chemodynamic properties, when choosing the appropriate method(s) to prevent loss.
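A hedged sketch of one chemodynamic screening calculation of the kind described: equilibrium partitioning of an analyte into bottle headspace from its dimensionless Henry's law constant. The constants and volumes are illustrative inputs, not values or recommendations from the paper.

```python
# Equilibrium headspace-loss screening from a dimensionless Henry's law constant.
def fraction_in_headspace(H_dimensionless, vol_water_mL, vol_headspace_mL):
    """Fraction of analyte mass in the headspace at equilibrium: H*Vg / (H*Vg + Vw)."""
    return (H_dimensionless * vol_headspace_mL) / (
        H_dimensionless * vol_headspace_mL + vol_water_mL)

# e.g. a volatile analyte (H ~ 0.2) vs a semivolatile one (H ~ 1e-4), 5 mL headspace
for name, H in [("volatile (H=0.2)", 0.2), ("semivolatile (H=1e-4)", 1e-4)]:
    f = fraction_in_headspace(H, vol_water_mL=995.0, vol_headspace_mL=5.0)
    print(f"{name}: {100 * f:.3f}% of mass in a 5-mL headspace")
```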
Joyce, Richard; Kuziene, Viktorija; Zou, Xin; Wang, Xueting; Pullen, Frank; Loo, Ruey Leng
2016-01-01
An ultra-performance liquid chromatography quadrupole time-of-flight mass spectrometry (UPLC-qTOF-MS) method using hydrophilic interaction liquid chromatography was developed and validated for simultaneous quantification of 18 free amino acids in urine, with a total acquisition time, including column re-equilibration, of less than 18 min per sample. This method involves simple sample preparation consisting of a 15-fold dilution with acetonitrile to give a final composition of 25% aqueous and 75% acetonitrile, without the need for any derivatization. The dynamic range of our calibration curve is approximately two orders of magnitude (120-fold from the lowest calibration point) with good linearity (r² ≥ 0.995 for all amino acids). Good separation of all amino acids as well as good intra- and inter-day accuracy (<15%) and precision (<15%) were observed using three quality control samples at concentrations in the low, medium and high range of the calibration curve. The limits of detection (LOD) and lower limits of quantification of our method ranged from approximately 1-300 nM and 0.01-0.5 µM, respectively. Amino acids in the prepared urine samples were found to be stable for 72 h at 4 °C, after one freeze-thaw cycle, and for up to 4 weeks at -80 °C. We applied this method to quantify the content of 18 free amino acids in 646 urine samples from a dietary intervention study. We were able to quantify all 18 free amino acids in these urine samples when they were present at a level above the LOD. We found our method to be reproducible (accuracy and precision were typically <10% for QCL, QCM and QCH), and the relatively high sample-throughput nature of this method potentially makes it a suitable alternative for the analysis of urine samples in clinical settings.
Field efficiency and bias of snag inventory methods
Robert S. Kenning; Mark J. Ducey; John C. Brissette; Jeffery H. Gove
2005-01-01
Snags and cavity trees are important components of forests, but can be difficult to inventory precisely and are not always included in inventories because of limited resources. We tested the application of N-tree distance sampling as a time-saving snag sampling method and compared N-tree distance sampling to fixed-area sampling and modified horizontal line sampling in...
Sampling Operations on Big Data
Gadepally, Vijay; Herr, Taylor; Johnson, Luke; Milechin, Lauren; Milosavljevic, Maja; Miller, Benjamin A.
2015-11-29
…process and disseminate information for discovery and exploration under real-time constraints. Common signal processing operations such as sampling and … categories. These include edge sampling methods, where edges are selected by a predetermined criterion, and snowball sampling methods, where algorithms start…
NASA Astrophysics Data System (ADS)
Audet, J.; Martinsen, L.; Hasler, B.; de Jonge, H.; Karydi, E.; Ovesen, N. B.; Kronvang, B.
2014-07-01
Eutrophication of aquatic ecosystems caused by excess concentrations of nitrogen and phosphorus may have harmful consequences for biodiversity and poses a health risk to humans via the water supplies. Reduction of nitrogen and phosphorus losses to aquatic ecosystems involves implementation of costly measures, and reliable monitoring methods are therefore essential to select appropriate mitigation strategies and to evaluate their effects. Here, we compare the performances and costs of three methodologies for the monitoring of nutrients in rivers: grab sampling, time-proportional sampling and passive sampling using flow-proportional samplers. Assuming time-proportional sampling to be the best estimate of the "true" nutrient load, our results showed that the risk of obtaining wrong total nutrient load estimates by passive samplers is high despite costs similar to those of time-proportional sampling. Our conclusion is that for passive samplers to provide a reliable monitoring alternative, further development is needed. Grab sampling was the cheapest of the three methods and was more precise and accurate than passive sampling. We conclude that although monitoring employing time-proportional sampling is costly, its reliability precludes unnecessarily high implementation expenses.
NASA Astrophysics Data System (ADS)
Audet, J.; Martinsen, L.; Hasler, B.; de Jonge, H.; Karydi, E.; Ovesen, N. B.; Kronvang, B.
2014-11-01
Eutrophication of aquatic ecosystems caused by excess concentrations of nitrogen and phosphorus may have harmful consequences for biodiversity and poses a health risk to humans via water supplies. Reduction of nitrogen and phosphorus losses to aquatic ecosystems involves implementation of costly measures, and reliable monitoring methods are therefore essential to select appropriate mitigation strategies and to evaluate their effects. Here, we compare the performances and costs of three methodologies for the monitoring of nutrients in rivers: grab sampling; time-proportional sampling; and passive sampling using flow-proportional samplers. Assuming hourly time-proportional sampling to be the best estimate of the "true" nutrient load, our results showed that the risk of obtaining wrong total nutrient load estimates by passive samplers is high despite costs similar to those of time-proportional sampling. Our conclusion is that for passive samplers to provide a reliable monitoring alternative, further development is needed. Grab sampling was the cheapest of the three methods and was more precise and accurate than passive sampling. We conclude that although monitoring employing time-proportional sampling is costly, its reliability precludes unnecessarily high implementation expenses.
Code of Federal Regulations, 2012 CFR
2012-07-01
... per million dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10... (Reapproved 2008) c. Oxides of nitrogen 53 parts per million dry volume 3-run average (1 hour minimum sample... average (1 hour minimum sample time per run) Performance test (Method 6 or 6c at 40 CFR part 60, appendix...
Code of Federal Regulations, 2011 CFR
2011-07-01
... per million dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10... (Reapproved 2008) c. Oxides of nitrogen 53 parts per million dry volume 3-run average (1 hour minimum sample... average (1 hour minimum sample time per run) Performance test (Method 6 or 6c at 40 CFR part 60, appendix...
Rapid method to determine 226Ra in steel samples
Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...
2017-09-22
The rapid measurement of 226Ra in steel samples is very important in the event of a radiological emergency. 226Ra (T 1/2 = 1600 y) is a natural radionuclide present in the environment and a highly toxic alpha-emitter. Due to its long half-life and tendency to concentrate in bones, 226Ra ingestion or inhalation can lead to significant committed dose to individuals. A new method for the determination of 226Ra in steel samples has been developed at the Savannah River Environmental Laboratory. The new method employs a rugged acid digestion method that includes hydrofluoric acid, followed by a single precipitation step to rapidly preconcentrate the radium and remove most of the dissolved steel sample matrix. Radium is then separated using a combination of cation exchange and extraction chromatography, and 226Ra is measured by alpha spectrometry. This approach has a sample preparation time of ~ 8 h for steel samples, has a very high tracer yield (> 88%), and removes interferences effectively. A 133Ba yield tracer is used so that samples can be counted immediately following the separation method, avoiding lengthy ingrowth times that are required in other methods.
Rapid method to determine 226Ra in steel samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.
The rapid measurement of 226Ra in steel samples is very important in the event of a radiological emergency. 226Ra (T 1/2 = 1600 y) is a natural radionuclide present in the environment and a highly toxic alpha-emitter. Due to its long half-life and tendency to concentrate in bones, 226Ra ingestion or inhalation can lead to significant committed dose to individuals. A new method for the determination of 226Ra in steel samples has been developed at the Savannah River Environmental Laboratory. The new method employs a rugged acid digestion method that includes hydrofluoric acid, followed by a single precipitation step to rapidly preconcentrate the radium and remove most of the dissolved steel sample matrix. Radium is then separated using a combination of cation exchange and extraction chromatography, and 226Ra is measured by alpha spectrometry. This approach has a sample preparation time of ~ 8 h for steel samples, has a very high tracer yield (> 88%), and removes interferences effectively. A 133Ba yield tracer is used so that samples can be counted immediately following the separation method, avoiding lengthy ingrowth times that are required in other methods.
Helicopter TEM parameters analysis and system optimization based on time constant
NASA Astrophysics Data System (ADS)
Xiao, Pan; Wu, Xin; Shi, Zongyang; Li, Jutao; Liu, Lihua; Fang, Guangyou
2018-03-01
The helicopter transient electromagnetic (TEM) method is a common geophysical prospecting technique, widely used in mineral detection, underground water exploration and environmental investigation. In order to develop an efficient helicopter TEM system, it is necessary to analyze and optimize the system parameters. In this paper, a simple and quantitative method is proposed to analyze the system parameters, such as waveform, power, base frequency, measured field and sampling time. A wire loop model is used to define a comprehensive 'time constant domain' that represents a range of time constants, analogous to a range of conductances, and the characteristics of the system parameters in this domain are then obtained. It is found that the distortion caused by the transmitting base frequency is less than 5% when the ratio of the transmitting period to the target time constant is greater than 6. When the sampling time window is less than the target time constant, the distortion caused by the sampling time window is less than 5%. Based on this analysis, a helicopter TEM system, called CASHTEM, was designed, and a flight test was carried out in a known mining area. The test results show that the system has good detection performance, verifying the effectiveness of the method.
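As a rough illustration of the reported design criteria, the sketch below checks a candidate base frequency and final sampling gate against a target time constant; the factor of 6 and the requirement that the sampling window stay below the time constant are taken from the abstract above, while the function name and example numbers are hypothetical.

```python
def check_tem_parameters(base_frequency_hz, last_gate_time_s, target_tau_s):
    """Rules of thumb from the abstract: base-frequency distortion stays below
    5% when (transmitting period / target time constant) > 6, and sampling-window
    distortion stays below 5% when the last gate is shorter than the time constant."""
    period_s = 1.0 / base_frequency_hz
    period_ok = period_s / target_tau_s > 6.0
    gate_ok = last_gate_time_s < target_tau_s
    return period_ok, gate_ok

# Illustrative example: 25 Hz base frequency, 5 ms last gate, 6 ms target time constant
print(check_tem_parameters(25.0, 5e-3, 6e-3))   # (True, True)
```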
3D sensitivity encoded ellipsoidal MR spectroscopic imaging of gliomas at 3T☆
Ozturk-Isik, Esin; Chen, Albert P.; Crane, Jason C.; Bian, Wei; Xu, Duan; Han, Eric T.; Chang, Susan M.; Vigneron, Daniel B.; Nelson, Sarah J.
2010-01-01
Purpose The goal of this study was to implement time efficient data acquisition and reconstruction methods for 3D magnetic resonance spectroscopic imaging (MRSI) of gliomas at a field strength of 3T using parallel imaging techniques. Methods The point spread functions, signal to noise ratio (SNR), spatial resolution, metabolite intensity distributions and Cho:NAA ratio of 3D ellipsoidal, 3D sensitivity encoding (SENSE) and 3D combined ellipsoidal and SENSE (e-SENSE) k-space sampling schemes were compared with conventional k-space data acquisition methods. Results The 3D SENSE and e-SENSE methods resulted in spectral patterns similar to those of the conventional MRSI methods. The Cho:NAA ratios were highly correlated (P<.05 for SENSE and P<.001 for e-SENSE) with the ellipsoidal method, and all methods exhibited significantly different spectral patterns in tumor regions compared with normal-appearing white matter. The geometry factors ranged between 1.2 and 1.3 for both the SENSE and e-SENSE spectra. When corrected for these factors and for differences in data acquisition times, the empirical SNRs were similar to values expected on theoretical grounds. The effective spatial resolution of the SENSE spectra was estimated to be the same as that of the corresponding fully sampled k-space data, while the spectra acquired with ellipsoidal and e-SENSE k-space samplings were estimated to have a 2.36–2.47-fold loss in spatial resolution due to the differences in their point spread functions. Conclusion The 3D SENSE method retained the same spatial resolution as full k-space sampling but with a 4-fold reduction in scan time and an acquisition time of 9.28 min. The 3D e-SENSE method had a similar spatial resolution as the corresponding ellipsoidal sampling with a scan time of 4:36 min. Both parallel imaging methods provided clinically interpretable spectra with volumetric coverage and adequate SNR for evaluating Cho, Cr and NAA. PMID:19766422
Koziel, Jacek A; Nguyen, Lam T; Glanville, Thomas D; Ahn, Heekwon; Frana, Timothy S; Hans van Leeuwen, J
2017-10-01
A passive sampling method, using retracted solid-phase microextraction (SPME)-gas chromatography-mass spectrometry and time-weighted averaging, was developed and validated for tracking marker volatile organic compounds (VOCs) emitted during aerobic digestion of biohazardous animal tissue. The retracted SPME configuration protects the fragile fiber from buffeting by the process gas stream, and it requires less equipment and is potentially more biosecure than conventional active sampling methods. VOC concentrations predicted via a model based on Fick's first law of diffusion were within 6.6-12.3% of experimentally controlled values after accounting for VOC adsorption to the SPME fiber housing. Method detection limits for five marker VOCs ranged from 0.70 to 8.44 ppbv and were statistically equivalent (p>0.05) to those for active sorbent-tube-based sampling. A sampling time of 30 min and a fiber retraction of 5 mm were found to be optimal for the tissue digestion process. Copyright © 2017 Elsevier Ltd. All rights reserved.
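The time-weighted-average concentration behind a retracted SPME fiber is conventionally back-calculated from Fick's first law as C = n·Z/(D·A·t), where n is the mass adsorbed, Z the retraction depth, A the needle opening area, D the gas-phase diffusion coefficient and t the sampling time. The sketch below illustrates that calculation; the diffusion coefficient, needle area and adsorbed mass are invented for the example, and only the 5 mm retraction and 30 min sampling time echo the abstract.

```python
def twa_concentration(mass_ng, retraction_cm, needle_area_cm2, diff_coeff_cm2_s, time_s):
    """Time-weighted-average concentration (ng/cm^3) for a retracted SPME fiber,
    from Fick's first law of diffusion: C = n * Z / (D * A * t)."""
    return (mass_ng * retraction_cm) / (diff_coeff_cm2_s * needle_area_cm2 * time_s)

# Illustrative numbers: 5 ng adsorbed, 0.5 cm retraction, 8.6e-4 cm^2 opening,
# 0.08 cm^2/s diffusion coefficient, 30 min sampling
c = twa_concentration(5.0, 0.5, 8.6e-4, 0.08, 30 * 60)
print(f"TWA concentration ~ {c:.1f} ng/cm^3")
```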
Filla, Robert T; Schrell, Adrian M; Coulton, John B; Edwards, James L; Roper, Michael G
2018-02-20
A method for multiplexed sample analysis by mass spectrometry without the need for chemical tagging is presented. In this new method, each sample is pulsed at unique frequencies, mixed, and delivered to the mass spectrometer while maintaining a constant total flow rate. Reconstructed ion currents are then a time-dependent signal consisting of the sum of the ion currents from the various samples. Spectral deconvolution of each reconstructed ion current reveals the identity of each sample, encoded by its unique frequency, and its concentration encoded by the peak height in the frequency domain. This technique is different from other approaches that have been described, which have used modulation techniques to increase the signal-to-noise ratio of a single sample. As proof of concept of this new method, two samples containing up to 9 analytes were multiplexed. The linear dynamic range of the calibration curve was increased with extended acquisition times of the experiment and longer oscillation periods of the samples. Because of the combination of the samples, salt had little effect on the ability of this method to achieve relative quantitation. Continued development of this method is expected to allow for increased numbers of samples that can be multiplexed.
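A toy numerical illustration of this frequency encoding: two samples are pulsed at distinct frequencies, their ion currents add at the detector, and the amplitude of the summed reconstructed ion current at each assigned frequency recovers the corresponding sample's contribution. The frequencies, record length and concentrations below are invented for the demonstration and are not taken from the study.

```python
import numpy as np

fs = 10.0                          # sampling rate of the reconstructed ion current (Hz)
t = np.arange(0, 200, 1 / fs)      # 200 s record

f_a, f_b = 0.05, 0.12              # assigned pulsing frequencies (Hz), illustrative
conc_a, conc_b = 1.0, 2.5          # relative analyte concentrations, illustrative

# Each sample's delivery oscillates at its own frequency; total flow stays constant
ion_current = (conc_a * 0.5 * (1 + np.cos(2 * np.pi * f_a * t))
               + conc_b * 0.5 * (1 + np.cos(2 * np.pi * f_b * t)))

spectrum = np.abs(np.fft.rfft(ion_current - ion_current.mean()))
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for label, f in (("sample A", f_a), ("sample B", f_b)):
    k = np.argmin(np.abs(freqs - f))
    print(f"{label}: peak at {freqs[k]:.3f} Hz, relative height {spectrum[k]:.0f}")
```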
High-throughput real-time quantitative reverse transcription PCR.
Bookout, Angie L; Cummins, Carolyn L; Mangelsdorf, David J; Pesola, Jean M; Kramer, Martha F
2006-02-01
Extensive detail on the application of the real-time quantitative polymerase chain reaction (QPCR) for the analysis of gene expression is provided in this unit. The protocols are designed for high-throughput, 384-well-format instruments, such as the Applied Biosystems 7900HT, but may be modified to suit any real-time PCR instrument. QPCR primer and probe design and validation are discussed, and three relative quantitation methods are described: the standard curve method, the efficiency-corrected DeltaCt method, and the comparative cycle time, or DeltaDeltaCt method. In addition, a method is provided for absolute quantification of RNA in unknown samples. RNA standards are subjected to RT-PCR in the same manner as the experimental samples, thus accounting for the reaction efficiencies of both procedures. This protocol describes the production and quantitation of synthetic RNA molecules for real-time and non-real-time RT-PCR applications.
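For readers unfamiliar with the comparative cycle method mentioned above, the fold change is computed as 2^(-DeltaDeltaCt) from the Ct values of a target and a reference gene in treated and calibrator samples, assuming near-100% amplification efficiency. The sketch below shows that arithmetic with made-up Ct values.

```python
def fold_change_ddct(ct_target_sample, ct_ref_sample, ct_target_calibrator, ct_ref_calibrator):
    """Relative expression by the comparative Ct (DeltaDeltaCt) method,
    assuming ~100% amplification efficiency for both assays."""
    delta_ct_sample = ct_target_sample - ct_ref_sample
    delta_ct_calibrator = ct_target_calibrator - ct_ref_calibrator
    delta_delta_ct = delta_ct_sample - delta_ct_calibrator
    return 2.0 ** (-delta_delta_ct)

# Hypothetical Ct values: the target amplifies ~1.9 cycles earlier (relative to the
# reference gene) in the treated sample than in the calibrator
print(fold_change_ddct(24.0, 18.0, 26.1, 18.2))   # ~3.7-fold up-regulation
```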
Efficiency of snake sampling methods in the Brazilian semiarid region.
Mesquita, Paula C M D; Passos, Daniel C; Cechin, Sonia Z
2013-09-01
The choice of sampling methods is a crucial step in every field survey in herpetology. In countries where time and financial support are limited, the choice of methods is critical. The methods used to sample snakes often lack objective criteria, and tradition has apparently weighed more heavily in the choice. Consequently, studies using non-standardized methods are frequently found in the literature. We compared four commonly used methods for sampling snake assemblages in a semiarid area in Brazil. We compared the efficacy of each method based on the cost-benefit regarding the number of individuals and species captured, time, and financial investment. We found that pitfall traps were the least effective method in all aspects evaluated and were not complementary to the other methods in terms of species abundance and assemblage structure. We conclude that methods can only be considered complementary if they are standardized to the objectives of the study. The use of pitfall traps in short-term surveys of the snake fauna in areas with shrubby vegetation and stony soil is not recommended.
40 CFR 63.1385 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... applicable emission limits: (1) Method 1 (40 CFR part 60, appendix A) for the selection of the sampling port location and number of sampling ports; (2) Method 2 (40 CFR part 60, appendix A) for volumetric flow rate.... Each run shall consist of a minimum run time of 2 hours and a minimum sample volume of 60 dry standard...
Lau, Yee Ling; Anthony, Claudia; Fakhrurrazi, Siti Aminah; Ibrahim, Jamaiah; Ithoi, Init; Mahmud, Rohela
2013-08-28
Amebiasis, caused by Entamoeba histolytica, is the third leading cause of death from parasitic disease worldwide. This pathogenic amoeba is morphologically indistinguishable from E. dispar and E. moshkovskii, the non-pathogenic species. Polymerase chain reaction is the current method of choice approved by the World Health Organization. Real-time PCR is another attractive molecular method for diagnosis of infectious diseases, as post-PCR analyses are eliminated and turnaround times are shorter. The present work aimed to compare the results of Entamoeba species identification using the real-time assay against the established nested PCR method. In this study, a total of 334 human faecal samples were collected from different Orang Asli settlements. Faecal samples were processed by direct wet smear and formalin ethyl acetate concentration methods followed by iodine staining and were microscopically examined for Entamoeba species and other intestinal parasites. Microscopically positive samples were then subjected to nested PCR and real-time PCR. The overall prevalence of Entamoeba infection was 19.5% (65/334). SK Posh Piah recorded the highest Entamoeba prevalence (63.3%), while Kampung Kemensah had the lowest (3.7%). Microscopically positive samples were then tested by real-time PCR and nested PCR for the presence of Entamoeba histolytica, Entamoeba dispar, and Entamoeba moshkovskii infection. Real-time PCR showed higher Entamoeba detection (86.2%) compared with nested PCR (80%), although the McNemar test showed no significant difference between the two methods (p = 0.221). This study is the first in Malaysia to report the use of real-time PCR in identifying and differentiating the three Entamoeba infections. It also proved more effective than the conventional nested PCR molecular method.
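For paired comparisons of two assays on the same specimens, the McNemar test used above is computed from the 2x2 table of concordant and discordant results. The sketch below shows the calculation with statsmodels on an invented table; the counts are not the study's data.

```python
import numpy as np
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired results for 65 microscopy-positive samples:
# rows = real-time PCR (+, -), columns = nested PCR (+, -)
table = np.array([[50, 6],
                  [2, 7]])

result = mcnemar(table, exact=True)   # exact binomial test on the discordant pairs
print(f"McNemar p-value: {result.pvalue:.3f}")
```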
NASA Astrophysics Data System (ADS)
Kowalska, Małgorzata; Janas, Sławomir; Woźniak, Magdalena
2018-04-01
The aim of this work was to present an alternative method for determination of the total dry mass content in processed cheese. The authors claim that the presented method can be used in industrial quality control laboratories for routine testing and for quick in-process control. For the test purposes, both the reference method of determination of dry mass in processed cheese and the moisture analyzer method were used. The tests were carried out for three different kinds of processed cheese. In accordance with the reference method, the sample was placed on a layer of silica sand and dried at 102 °C for about 4 h. The moisture analyzer test required method validation with regard to the drying temperature range and the mass of the analyzed sample. An optimum drying temperature of 110 °C was determined experimentally. For the Hochland cream processed cheese sample, the total dry mass content obtained using the reference method was 38.92%, whereas using the moisture analyzer method it was 38.74%. The average analysis time for the moisture analyzer method was 9 min. For the sample of processed cheese with tomatoes, the reference method result was 40.37%, and the alternative method result was 40.67%. For the sample of cream processed cheese with garlic, the reference method gave a value of 36.88%, and the alternative method 37.02%. The average time of those determinations was 16 min. The results confirmed that use of the moisture analyzer is effective, and comparable dry mass values were obtained with both methods. According to the authors, the much shorter measurement time of the moisture analyzer method is a key criterion when selecting methods for in-process control and final quality control.
Frosth, Sara; König, Ulrika; Nyman, Ann-Kristin; Aspán, Anna
2017-09-01
Dichelobacter nodosus is the principal cause of ovine footrot, and strain virulence is an important factor in disease severity. Therefore, detection and virulence determination of D. nodosus is important for proper diagnosis of the disease. Today this is possible by real-time PCR analysis. Analysis of large numbers of samples is costly and laborious; therefore, pooling of individual samples is common in surveillance programs. However, pooling can reduce the sensitivity of the method. The aim of this study was to develop a pooling method for real-time PCR analysis that would allow sensitive detection and simultaneous virulence determination of D. nodosus. A total of 225 sheep from 17 flocks were sampled using ESwabs within the Swedish Footrot Control Program in 2014. Samples were first analysed individually and then in pools of five by real-time PCR assays targeting the 16S rRNA and aprV2/B2 genes of D. nodosus. Each pool consisted of four negative and one positive D. nodosus samples with varying amounts of the bacterium. In the individual analysis, 61 (27.1%) samples were positive in the 16S rRNA and the aprV2/B2 PCR assays and 164 (72.9%) samples were negative. All samples positive in the aprV2/B2 PCR assay were of the aprB2 variant. The pooled analysis showed that all 41 pools were also positive for D. nodosus 16S rRNA and the aprB2 variant. The diagnostic sensitivity for pooled and individual samples was therefore similar. Our method includes concentration of the bacteria before DNA extraction, which may account for the maintenance of diagnostic sensitivity. Diagnostic sensitivity in the real-time PCR assays of the pooled samples was comparable to the sensitivity obtained for individually analysed samples. Even sub-clinical infections could be detected in the pooled PCR samples, which is important for control of the disease. This method may therefore be implemented in footrot control programs, where it can replace analysis of individual samples.
Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.
Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A
2017-04-01
Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites; however, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for human plasma, and it entailed a single sample preparation method, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple sample preparation method followed by an LC-MS method with a short run time. This analytical method is therefore useful for both clinical and research purposes.
Reducing acquisition times in multidimensional NMR with a time-optimized Fourier encoding algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Zhiyong; Department of Electronic Science, Fujian Provincial Key Laboratory of Plasma and Magnetic Resonance, Xiamen University, Xiamen, Fujian 361005; Smith, Pieter E. S.
Speeding up the acquisition of multidimensional nuclear magnetic resonance (NMR) spectra is an important topic in contemporary NMR, with central roles in high-throughput investigations and analyses of marginally stable samples. A variety of fast NMR techniques have been developed, including methods based on non-uniform sampling and Hadamard encoding, that overcome the long sampling times inherent to schemes based on fast-Fourier-transform (FFT) methods. Here, we explore the potential of an alternative fast acquisition method that leverages a priori knowledge, to tailor polychromatic pulses and customized time delays for an efficient Fourier encoding of the indirect domain of an NMR experiment. By porting the encoding of the indirect-domain to the excitation process, this strategy avoids potential artifacts associated with non-uniform sampling schemes and uses a minimum number of scans equal to the number of resonances present in the indirect dimension. An added convenience is afforded by the fact that a usual 2D FFT can be used to process the generated data. Acquisitions of 2D heteronuclear correlation NMR spectra on quinine and on the anti-inflammatory drug isobutyl propionic phenolic acid illustrate the new method's performance. This method can be readily automated to deal with complex samples such as those occurring in metabolomics, in in-cell as well as in in vivo NMR applications, where speed and temporal stability are often primary concerns.
Analysis of munitions constituents in groundwater using a field-portable GC-MS.
Bednar, A J; Russell, A L; Hayes, C A; Jones, W T; Tackett, P; Splichal, D E; Georgian, T; Parker, L V; Kirgan, R A; MacMillan, D K
2012-05-01
The use of munitions constituents (MCs) at military installations can produce soil and groundwater contamination that requires periodic monitoring even after training or manufacturing activities have ceased. Traditional groundwater monitoring methods require large volumes of aqueous samples (e.g., 2-4 L) to be shipped under chain of custody, to fixed laboratories for analysis. The samples must also be packed on ice and shielded from light to minimize degradation that may occur during transport and storage. The laboratory's turn-around time for sample analysis and reporting can be as long as 45 d. This process hinders the reporting of data to customers in a timely manner; yields data that are not necessarily representative of current site conditions owing to the lag time between sample collection and reporting; and incurs significant shipping costs for samples. The current work compares a field portable Gas Chromatograph-Mass Spectrometer (GC-MS) for analysis of MCs on-site with traditional laboratory-based analysis using High Performance Liquid Chromatography with UV absorption detection. The field method provides near real-time (within ~1 h of sampling) concentrations of MCs in groundwater samples. Mass spectrometry provides reliable confirmation of MCs and a means to identify unknown compounds that are potential false positives for methods with UV and other non-selective detectors. Published by Elsevier Ltd.
Effect of different drying methods on moisture ratio and rehydration of pumpkin slices.
Seremet Ceclu, Liliana; Botez, Elisabeta; Nistor, Oana-Viorela; Andronoiu, Doina Georgeta; Mocanu, Gabriel-Danut
2016-03-15
This study was carried out to determine the influence of the hot air drying process and combined methods on the physicochemical properties of pumpkin (Cucurbita moschata) samples. The experiments in the hot air chamber were conducted at 50, 60 and 70 °C. The combined method consists of a triple combination of the main drying techniques: in the first stage the samples were dried by hot air convection at 60 °C, followed by hot air ventilation at 40 °C simultaneously with microwave drying. The time required to reduce the moisture content to any given level was highly dependent on the drying conditions. The longest drying time in hot air was 540 min at 50 °C, while the shortest was 189 min for hot air combined with microwave at 40 °C and a power of 315 W. The samples dried by hot air showed a higher rehydration capacity than samples dried by the combined method. Copyright © 2015 Elsevier Ltd. All rights reserved.
NEW COLUMN SEPARATION METHOD FOR EMERGENCY URINE SAMPLES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, S; Brian Culligan, B
2007-08-28
The Savannah River Site Environmental Bioassay Lab participated in the 2007 NRIP Emergency Response program administered by the National Institute for Standards and Technology (NIST) in May, 2007. A new rapid column separation method was applied directly to the NRIP 2007 emergency urine samples, with only minimal sample preparation to reduce preparation time. Calcium phosphate precipitation, previously used to pre-concentrate actinides and Sr-90 in NRIP 2006 urine and water samples, was not used for the NRIP 2007 urine samples. Instead, the raw urine was acidified and passed directly through the stacked resin columns (TEVA+TRU+SR Resins) to separate the actinides and strontium from the NRIP urine samples more quickly. This improvement reduced sample preparation time for the NRIP 2007 emergency urine analyses significantly. This approach works well for small volume urine samples expected during an emergency response event. Based on initial feedback from NIST, the SRS Environmental Bioassay Lab had the most rapid analysis times for actinides and strontium-90 analyses for NRIP 2007 urine samples.
Wells, Beth; Shaw, Hannah; Innocent, Giles; Guido, Stefano; Hotchkiss, Emily; Parigi, Maria; Opsteegh, Marieke; Green, James; Gillespie, Simon; Innes, Elisabeth A; Katzer, Frank
2015-12-15
Waterborne transmission of Toxoplasma gondii is a potential public health risk and there are currently no agreed optimised methods for the recovery, processing and detection of T. gondii oocysts in water samples. In this study modified methods of T. gondii oocyst recovery and DNA extraction were applied to 1427 samples collected from 147 public water supplies throughout Scotland. T. gondii DNA was detected, using real time PCR (qPCR) targeting the 529 bp repeat element, in 8.79% of interpretable samples (124 out of 1411 samples). The samples which were positive for T. gondii DNA originated from a third of the sampled water sources. The samples which were positive by qPCR and some of the negative samples were reanalysed using ITS1 nested PCR (nPCR) and results compared. The 529 bp qPCR was the more sensitive technique and a full analysis of assay performance, by Bayesian analysis using a Markov Chain Monte Carlo method, was completed which demonstrated the efficacy of this method for the detection of T. gondii in water samples. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Gender Differentiation in the New York "Times": 1885 and 1985.
ERIC Educational Resources Information Center
Jolliffe, Lee
A study examined the descriptive language and sex-linked roles ascribed to women and men in articles of the New York "Times" from 1885 and 1985. Seven content analysis methods were applied to four random samples from the "Times"; one sample each for women and men from both years. Samples were drawn using randomly constructed…
40 CFR Table 1 to Subpart III of... - Emission Limitations
Code of Federal Regulations, 2011 CFR
2011-07-01
... determining compliance using this method Cadmium 0.004 milligrams per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of part 60). Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance...
40 CFR Table 1 to Subpart Eeee of... - Emission Limitations
Code of Federal Regulations, 2011 CFR
2011-07-01
... determiningcompliance using this method 1. Cadmium 18 micrograms per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Method 29 of appendix A of this part. 2. Carbon monoxide 40 parts per million by dry volume 3-run average (1 hour minimum sample time per run during performance test), and 12-hour...
40 CFR Table 1 to Subpart III of... - Emission Limitations
Code of Federal Regulations, 2010 CFR
2010-07-01
... determining compliance using this method Cadmium 0.004 milligrams per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of part 60). Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance...
40 CFR Table 1 to Subpart Eeee of... - Emission Limitations
Code of Federal Regulations, 2010 CFR
2010-07-01
... determiningcompliance using this method 1. Cadmium 18 micrograms per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Method 29 of appendix A of this part. 2. Carbon monoxide 40 parts per million by dry volume 3-run average (1 hour minimum sample time per run during performance test), and 12-hour...
Fuji apple storage time rapid determination method using Vis/NIR spectroscopy.
Liu, Fuqi; Tang, Xuxiang
2015-01-01
A rapid method for determining Fuji apple storage time using visible/near-infrared (Vis/NIR) spectroscopy was studied in this paper. Vis/NIR diffuse reflection spectroscopy responses of samples were measured for 6 days. Spectroscopy data were processed by stochastic resonance (SR). Principal component analysis (PCA) was utilized to analyze the original spectroscopy data and the SNR eigenvalues. Results demonstrated that PCA could not fully discriminate Fuji apples using the original spectroscopy data. The signal-to-noise ratio (SNR) spectrum clearly classified all apple samples, and PCA using the SNR spectrum successfully discriminated the apple samples. Therefore, Vis/NIR spectroscopy is effective for rapid discrimination of Fuji apple storage time. The proposed method is also promising for condition safety control and management in food and environmental laboratories.
Fuji apple storage time rapid determination method using Vis/NIR spectroscopy
Liu, Fuqi; Tang, Xuxiang
2015-01-01
A rapid method for determining Fuji apple storage time using visible/near-infrared (Vis/NIR) spectroscopy was studied in this paper. Vis/NIR diffuse reflection spectroscopy responses of samples were measured for 6 days. Spectroscopy data were processed by stochastic resonance (SR). Principal component analysis (PCA) was utilized to analyze the original spectroscopy data and the SNR eigenvalues. Results demonstrated that PCA could not fully discriminate Fuji apples using the original spectroscopy data. The signal-to-noise ratio (SNR) spectrum clearly classified all apple samples, and PCA using the SNR spectrum successfully discriminated the apple samples. Therefore, Vis/NIR spectroscopy is effective for rapid discrimination of Fuji apple storage time. The proposed method is also promising for condition safety control and management in food and environmental laboratories. PMID:25874818
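Principal component analysis of spectra, as used above, is typically run on a matrix with one row per sample and one column per wavelength; scores on the first few components are then inspected for clustering by storage time. The sketch below shows that step with scikit-learn on synthetic spectra; the wavelength grid, storage-day effect and noise level are invented.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 1000, 300)          # nm, illustrative Vis/NIR grid

# Synthetic spectra: 6 storage days x 10 apples, with a day-dependent absorption band
spectra, days = [], []
for day in range(6):
    band = 0.02 * day * np.exp(-((wavelengths - 680) / 30) ** 2)
    for _ in range(10):
        spectra.append(0.5 + band + rng.normal(0, 0.005, wavelengths.size))
        days.append(day)
spectra = np.asarray(spectra)

scores = PCA(n_components=2).fit_transform(spectra - spectra.mean(axis=0))
for day in range(6):
    mean_pc1 = scores[np.array(days) == day, 0].mean()
    print(f"day {day}: mean PC1 score {mean_pc1:+.3f}")
```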
Active learning based segmentation of Crohns disease from abdominal MRI.
Mahapatra, Dwarikanath; Vos, Franciscus M; Buhmann, Joachim M
2016-05-01
This paper proposes a novel active learning (AL) framework, and combines it with semi-supervised learning (SSL) for segmenting Crohn's disease (CD) tissues from abdominal magnetic resonance (MR) images. Robust fully supervised learning (FSL) based classifiers require large amounts of labeled data spanning different disease severities. Obtaining such data is time consuming and requires considerable expertise. SSL methods use a few labeled samples, and leverage the information from many unlabeled samples to train an accurate classifier. AL queries labels for the most informative samples and maximizes the gain from the labeling effort. Our primary contribution is in designing a query strategy that combines novel context information with classification uncertainty and feature similarity. Combining SSL and AL gives a robust segmentation method that: (1) optimally uses few labeled samples and many unlabeled samples; and (2) requires lower training time. Experimental results show our method achieves higher segmentation accuracy than FSL methods with fewer samples and reduced training effort. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Real-time PCR using SYBR Green for the detection of Shigella spp. in food and stool samples.
Mokhtari, W; Nsaibia, S; Gharbi, A; Aouni, M
2013-02-01
Shigella spp. are exquisitely fastidious Gram-negative organisms that are frequently missed by traditional culture methods. For this reason, this work adapted a classical PCR for detection of Shigella in food and stool specimens to real-time PCR using the SYBR Green format. The method includes a melting curve analysis, is more rapid, and provides both qualitative and quantitative data about the targeted pathogen. A total of 117 stool samples from patients with diarrhea and 102 food samples were analyzed in the Public Health Regional Laboratory of Nabeul by traditional culture methods and real-time PCR. To validate the real-time PCR assay, an experiment was conducted with both spiked and naturally contaminated stool samples. All Shigella strains tested were ipaH positive and all non-Shigella strains yielded no amplification products. The melting temperature (T(m) = 81.5 ± 0.5 °C) was consistently specific for the amplicon. Correlation coefficients of standard curves constructed using the quantification cycle (C(q)) versus copy numbers of Shigella showed good linearity (R² = 0.995; slope = 2.952) and the minimum level of detection was 1.5 × 10³ CFU/g feces. All food samples analyzed were negative for Shigella by standard culture methods, whereas ipaH was detected in 8.8% of culture-negative food products. Moreover, the ipaH-specific PCR system increased the detection rate over that by culture alone from 1.7% to 11.1% among patients with diarrhea. The data presented here show that SYBR Green I was suitable for use in the real-time PCR assay, which provided a specific, sensitive and efficient method for the detection and quantification of Shigella spp. in food and stool samples. Copyright © 2012 Elsevier Ltd. All rights reserved.
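Quantification from a standard curve like the one described above is usually done by regressing Cq on log10 copy number and estimating the amplification efficiency as E = 10^(-1/slope) - 1. The sketch below runs that calculation on an invented dilution series, not on the values reported in the abstract.

```python
import numpy as np

# Hypothetical 10-fold dilution series of an ipaH plasmid standard
log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cq = np.array([31.2, 27.9, 24.5, 21.2, 17.8])

slope, intercept = np.polyfit(log10_copies, cq, 1)
efficiency = 10.0 ** (-1.0 / slope) - 1.0
r2 = np.corrcoef(log10_copies, cq)[0, 1] ** 2
print(f"slope = {slope:.2f}, R^2 = {r2:.3f}, efficiency = {efficiency:.0%}")

# Interpolate an unknown sample from its Cq
cq_unknown = 23.0
print(f"estimated copies: {10.0 ** ((cq_unknown - intercept) / slope):.2e}")
```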
Samejima, Keijiro; Otani, Masahiro; Murakami, Yasuko; Oka, Takami; Kasai, Misao; Tsumoto, Hiroki; Kohda, Kohfuku
2007-10-01
A sensitive method for the determination of polyamines in mammalian cells is described, using electrospray ionization and a time-of-flight mass spectrometer. This method was 50-fold more sensitive than the previous method using ionspray ionization and a quadrupole mass spectrometer. The method employed partial purification and derivatization of polyamines, but allowed the measurement of multiple samples containing picomole amounts of polyamines. The time required for data acquisition of one sample was approximately 2 min. The method was successfully applied to the determination of reduced spermidine and spermine contents in cultured cells under inhibition of aminopropyltransferases. In addition, a suitable new internal standard was proposed for tracer experiments using (15)N-labeled polyamines.
Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR
Mobli, Mehdi; Hoch, Jeffrey C.
2017-01-01
Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. PMID:25456315
Near real time vapor detection and enhancement using aerosol adsorption
Novick, Vincent J.; Johnson, Stanley A.
1999-01-01
A vapor sample detection method where the vapor sample contains vapor and ambient air and surrounding natural background particles. The vapor sample detection method includes the steps of generating a supply of aerosol particles that have a particular effective median particle size, mixing the aerosol with the vapor sample to form aerosol and adsorbed vapor suspended in an air stream, impacting the suspended aerosol and adsorbed vapor upon a reflecting element, alternately directing infrared light to the impacted aerosol and adsorbed vapor, detecting and analyzing the alternately directed infrared light in essentially real time using a spectrometer and a microcomputer, and identifying the vapor sample.
Near real time vapor detection and enhancement using aerosol adsorption
Novick, V.J.; Johnson, S.A.
1999-08-03
A vapor sample detection method is described where the vapor sample contains vapor and ambient air and surrounding natural background particles. The vapor sample detection method includes the steps of generating a supply of aerosol particles that have a particular effective median particle size, mixing the aerosol with the vapor sample to form aerosol and adsorbed vapor suspended in an air stream, impacting the suspended aerosol and adsorbed vapor upon a reflecting element, alternately directing infrared light to the impacted aerosol and adsorbed vapor, detecting and analyzing the alternately directed infrared light in essentially real time using a spectrometer and a microcomputer, and identifying the vapor sample. 13 figs.
A novel heterogeneous training sample selection method on space-time adaptive processing
NASA Astrophysics Data System (ADS)
Wang, Qiang; Zhang, Yongshun; Guo, Yiduo
2018-04-01
The ground target detection performance of space-time adaptive processing (STAP) degrades when the clutter power becomes non-homogeneous because training samples are contaminated by target-like signals. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts training sample selection into a convex optimization problem. Firstly, the deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean Hausdorff distance so as to reject contaminated training samples. Thirdly, the cell under test (CUT) and the residual training samples are projected into the orthogonal subspace of the target in the CUT, and the mean Hausdorff distances between the projected CUT and the training samples are calculated. Fourthly, the distances are sorted by value, and the training samples with the larger values are preferentially selected to realize the reduced dimension. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
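For context, the generalized inner product screening that the paper takes as its baseline ranks each training snapshot x_k by x_k^H R^{-1} x_k, where R is the sample covariance of the training set, and treats snapshots with unusually large values as contaminated. The sketch below is a generic illustration of that baseline on synthetic data, not the proposed Hausdorff-distance selection, and all dimensions and values are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
n_dof, n_train = 32, 100                       # space-time degrees of freedom, snapshots

# Synthetic homogeneous clutter snapshots; contaminate the first five with a strong signal
snapshots = (rng.standard_normal((n_train, n_dof))
             + 1j * rng.standard_normal((n_train, n_dof)))
snapshots[:5] += 4.0

R = snapshots.conj().T @ snapshots / n_train   # sample covariance matrix
R_inv = np.linalg.inv(R)

# Generalized inner product x_k^H R^{-1} x_k for each snapshot
gip = np.real(np.einsum('ki,ij,kj->k', snapshots.conj(), R_inv, snapshots))

# Snapshots with the largest GIP are the most likely to be contaminated
suspects = np.sort(np.argsort(gip)[-5:])
print("largest-GIP snapshots:", suspects)
```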
Sample-based engine noise synthesis using an enhanced pitch-synchronous overlap-and-add method.
Jagla, Jan; Maillard, Julien; Martin, Nadine
2012-11-01
An algorithm for the real-time synthesis of internal combustion engine noise is presented. Through the analysis of a recorded engine noise signal with continuously varying engine speed, a dataset of sound samples is extracted, allowing the real-time synthesis of the noise induced by arbitrary evolutions of engine speed. The sound samples are extracted from a recording spanning the entire engine speed range. Each sample is delimited so as to contain the sound emitted during one cycle of the engine plus the overlap necessary to ensure smooth transitions during synthesis. The proposed approach, an extension of the PSOLA method introduced for speech processing, takes advantage of the specific periodicity of engine noise signals to locate the extraction instants of the sound samples. During the synthesis stage, the sound samples corresponding to the target engine speed evolution are concatenated with an overlap-and-add algorithm. It is shown that this method produces high-quality audio restitution with a low computational load. It is therefore well suited for real-time applications.
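The synthesis stage reduces to overlap-and-add of windowed, cycle-length grains placed at hop positions dictated by the target engine speed. The sketch below shows that core operation on synthetic grains; the grain selection, window shape and all parameters are simplifications rather than the authors' exact implementation.

```python
import numpy as np

FS = 44_100                                    # audio sample rate (Hz), illustrative

def synthesize_ola(grains, cycle_periods_s):
    """Concatenate one grain per engine cycle, Hann-windowed and overlap-added."""
    hops = [int(p * FS) for p in cycle_periods_s]
    out = np.zeros(sum(hops) + len(grains[-1]))
    pos = 0
    for grain, hop in zip(grains, hops):
        out[pos:pos + len(grain)] += grain * np.hanning(len(grain))
        pos += hop
    return out

# Toy grains: one noisy "engine cycle" each, for a speed ramp from 1500 to 3000 rpm
rng = np.random.default_rng(1)
periods = 60.0 / np.linspace(1500, 3000, 40)   # seconds per cycle (toy model)
grains = [rng.standard_normal(int(1.5 * p * FS)) for p in periods]
signal = synthesize_ola(grains, periods)
print(f"synthesized {signal.size / FS:.2f} s of audio from {len(grains)} grains")
```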
a New Approach for Accuracy Improvement of Pulsed LIDAR Remote Sensing Data
NASA Astrophysics Data System (ADS)
Zhou, G.; Huang, W.; Zhou, X.; He, C.; Li, X.; Huang, Y.; Zhang, L.
2018-05-01
In remote sensing applications, the accuracy of time interval measurement is one of the most important parameters affecting the quality of pulsed lidar data. Traditional time interval measurement techniques suffer from low measurement accuracy, complicated circuit structure and large error, so high-precision time interval data cannot be obtained with these methods. In order to obtain higher quality remote sensing cloud images based on time interval measurement, a higher accuracy time interval measurement method is proposed. The method is based on charging a capacitor and simultaneously sampling the change in capacitor voltage. Firstly, an approximate model of the capacitor voltage curve during the pulse time of flight is fitted to the sampled data. Then, the whole charging time is obtained from the fitted function. In this method, only a high-speed A/D sampler and a capacitor are required in a single receiving channel, and the collected data are processed directly in the main control unit. The experimental results show that the proposed method achieves an error of less than 3 ps. Compared with other methods, the proposed method improves the time interval accuracy by at least 20 %.
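The core idea, recovering the charging interval from a handful of voltage samples, can be illustrated by fitting an assumed RC charging model V(t) = V0(1 - exp(-(t - t0)/tau)) to the samples and reading off the start time t0. The model form, sampler rate, noise level and parameter values below are all assumptions made for the illustration, not the circuit described in the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def charging(t, t0, tau, v0):
    """Assumed RC charging curve, zero before the start time t0."""
    return np.where(t >= t0, v0 * (1.0 - np.exp(-(t - t0) / tau)), 0.0)

rng = np.random.default_rng(0)
fs = 1e9                                        # 1 GSa/s A/D sampler, illustrative
t = np.arange(0.0, 200e-9, 1 / fs)              # 200 ns record
true_t0, true_tau, true_v0 = 37.3e-9, 40e-9, 1.0
samples = charging(t, true_t0, true_tau, true_v0) + rng.normal(0, 0.002, t.size)

popt, _ = curve_fit(charging, t, samples, p0=[30e-9, 30e-9, 0.9])
print(f"estimated start time: {popt[0] * 1e9:.3f} ns (true {true_t0 * 1e9:.3f} ns)")
```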
Shao, Jingyuan; Cao, Wen; Qu, Haibin; Pan, Jianyang; Gong, Xingchu
2018-01-01
The aim of this study was to present a novel analytical quality by design (AQbD) approach for developing an HPLC method to analyze herbal extracts. In this approach, critical method attributes (CMAs) and critical method parameters (CMPs) of the analytical method were determined using the same data collected from screening experiments. The HPLC-ELSD method for separation and quantification of sugars in Codonopsis Radix extract (CRE) samples and Astragali Radix extract (ARE) samples was developed as an example with this AQbD approach. Potential CMAs and potential CMPs were identified from the Analytical Target Profile. After the screening experiments, the retention time of the D-glucose peak of CRE samples, the signal-to-noise ratio of the D-glucose peak of CRE samples, and the retention time of the sucrose peak in ARE samples were considered CMAs. The initial and final composition of the mobile phase, the flow rate, and the column temperature were found to be CMPs using a standard partial regression coefficient method. The probability-based design space was calculated using a Monte Carlo simulation method and verified by experiments. The optimized method was validated to be accurate and precise, and it was then applied in the analysis of CRE and ARE samples. The present AQbD approach is efficient and suitable for analysis objects with complex compositions.
Validated method for quantification of genetically modified organisms in samples of maize flour.
Kunert, Renate; Gach, Johannes S; Vorauer-Uhl, Karola; Engel, Edwin; Katinger, Hermann
2006-02-08
Sensitive and accurate testing for trace amounts of biotechnology-derived DNA from plant material is the prerequisite for detection of 1% or 0.5% genetically modified ingredients in food products or raw materials thereof. Compared with ELISA detection of expressed proteins, real-time PCR (RT-PCR) amplification offers easier sample preparation and lower detection limits. Of the different methods of DNA preparation, the CTAB method was chosen for its high flexibility in starting material and its generation of sufficient DNA of relevant quality. Previous RT-PCR data generated with the SYBR green detection method showed that the method is highly sensitive to sample matrices and genomic DNA content, influencing the interpretation of results. Therefore, this paper describes a real-time DNA quantification based on the TaqMan probe method, indicating high accuracy and sensitivity with detection limits lower than 18 copies per sample, applicable and comparable to highly purified plasmid standards as well as complex matrices of genomic DNA samples. The results were evaluated with ValiData for homogeneity of variance, linearity, accuracy of the standard curve, and standard deviation.
Piecewise SALT sampling for estimating suspended sediment yields
Robert B. Thomas
1989-01-01
A probability sampling method called SALT (Selection At List Time) has been developed for collecting and summarizing data on delivery of suspended sediment in rivers. It is based on sampling and estimating yield using a suspended-sediment rating curve for high discharges and simple random sampling for low flows. The method gives unbiased estimates of total yield and...
Frison, Severine; Kerac, Marko; Checchi, Francesco; Nicholas, Jennifer
2017-01-01
The assessment of the prevalence of acute malnutrition in children under five is widely used for the detection of emergencies, planning interventions, advocacy, and monitoring and evaluation. This study examined PROBIT methods, which convert the parameters (mean and standard deviation (SD)) of a normally distributed variable to a cumulative probability below any cut-off, to estimate acute malnutrition in children under five using Middle-Upper Arm Circumference (MUAC). We assessed the performance of: PROBIT Method I, with mean MUAC from the survey sample and MUAC SD from a database of previous surveys; and PROBIT Method II, with mean and SD of MUAC observed in the survey sample. Specifically, we generated sub-samples from 852 survey datasets, simulating 100 surveys for eight sample sizes. Overall, the methods were tested on 681 600 simulated surveys. PROBIT methods relying on sample sizes as small as 50 performed better than the classic method for estimating and classifying the prevalence of acute malnutrition. They had better precision in the estimation of acute malnutrition for all sample sizes and better coverage for smaller sample sizes, while having relatively little bias. They classified situations accurately for a threshold of 5% acute malnutrition. Both PROBIT methods had similar outcomes. PROBIT methods have a clear advantage over the classic method in the assessment of acute malnutrition prevalence based on MUAC. Their use would require much lower sample sizes, thus enabling great time and resource savings and permitting timely and/or locally relevant prevalence estimates of acute malnutrition for a swift and well-targeted response.
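The PROBIT conversion itself is simply the normal cumulative probability of MUAC falling below the cut-off, given the mean and SD. A minimal sketch, assuming a 125 mm cut-off and invented survey parameters, is shown below.

```python
from scipy.stats import norm

def probit_prevalence(mean_muac_mm, sd_muac_mm, cutoff_mm=125.0):
    """Estimated prevalence of acute malnutrition: the cumulative normal
    probability that MUAC falls below the cut-off."""
    return norm.cdf((cutoff_mm - mean_muac_mm) / sd_muac_mm)

# Method II style: mean and SD both taken from the survey sample (invented values)
print(f"{probit_prevalence(mean_muac_mm=148.0, sd_muac_mm=14.0):.1%}")   # ~5.0%
```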
Modulation Based on Probability Density Functions
NASA Technical Reports Server (NTRS)
Williams, Glenn L.
2009-01-01
A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
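The underlying observation is easy to reproduce numerically: sampling a sinusoid over one half cycle and histogramming the amplitudes yields the characteristic U-shaped (arcsine) PDF. The sketch below is a generic illustration of that step only, not of the proposed modulation scheme, and the carrier frequency and rates are arbitrary.

```python
import numpy as np

f0 = 1_000.0                                  # carrier frequency (Hz), illustrative
fs = 1_000_000.0                              # sampling rate (Hz), illustrative
t = np.arange(0.0, 0.5 / f0, 1 / fs)          # exactly one half cycle
samples = np.cos(2 * np.pi * f0 * t)          # amplitudes sweep from +1 to -1

# The histogram of sample amplitudes approximates the waveform's PDF on this interval
pdf, edges = np.histogram(samples, bins=10, range=(-1, 1), density=True)
print(np.round(pdf, 2))    # U-shaped: density is highest near the +/-1 extremes
```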
NASA Astrophysics Data System (ADS)
Chepigin, A.; Leonte, M.; Colombo, F.; Kessler, J. D.
2014-12-01
Dissolved methane, ethane, propane, and butane concentrations in natural waters are traditionally measured using a headspace equilibration technique and gas chromatograph with flame ionization detector (GC-FID). While a relatively simple technique, headspace equilibration suffers from slow equilibration times and loss of sensitivity due to concentration dilution with the pure gas headspace. Here we present a newly developed pre-concentration system and auto-analyzer for use with a GC-FID. This system decreases the time required for each analysis by eliminating the headspace equilibration time, increases the sensitivity and precision with a rapid pre-concentration step, and minimized operator time with an autoanalyzer. In this method, samples are collected from Niskin bottles in newly developed 1 L plastic sample bags rather than glass vials. Immediately following sample collection, the sample bags are placed in an incubator and individually connected to a multiport sampling valve. Water is pumped automatically from the desired sample bag through a small (6.5 mL) Liqui-Cel® membrane contactor where the dissolved gas is vacuum extracted and directly flushed into the GC sample loop. The gases of interest are preferentially extracted with the Liqui-Cel and thus a natural pre-concentration effect is obtained. Daily method calibration is achieved in the field with a five-point calibration curve that is created by analyzing gas standard-spiked water stored in 5 L gas-impermeable bags. Our system has been shown to substantially pre-concentrate the dissolved gases of interest and produce a highly linear response of peak areas to dissolved gas concentration. The system retains the high accuracy, precision, and wide range of measurable concentrations of the headspace equilibration method while simultaneously increasing the sensitivity due to the pre-concentration step. The time and labor involved in the headspace equilibration method is eliminated and replaced with the immediate and automatic analysis of a maximum of 13 sequential samples. The elapsed time between sample collection and analysis is reduced from approximately 12 hrs to < 10 min, enabling dynamic and highly resolved sampling plans.
Geologic and hydraulic characteristics of selected shaly geologic units in Oklahoma
Becker, C.J.; Overton, M.D.; Johnson, K.S.; Luza, K.V.
1997-01-01
Information was collected on the geologic and hydraulic characteristics of three shale-dominated units in Oklahoma: the Dog Creek Shale and Chickasha Formation in Canadian County, the Hennessey Group in Oklahoma County, and the Boggy Formation in Pittsburg County. The purpose of this project was to gain insight into the characteristics controlling fluid flow in shaly units that could be targeted for confinement of hazardous waste in the State and to evaluate methods of measuring hydraulic characteristics of shales. Permeameter results may not indicate in-place small-scale hydraulic characteristics, due to pretest disturbance and deterioration of core samples. The Dog Creek Shale and Chickasha Formation hydraulic conductivities measured by permeameter methods ranged from 2.8 × 10⁻¹¹ to 3.0 × 10⁻⁷ meters per second in nine samples and specific storage from 3.3 × 10⁻⁴ to 1.6 × 10⁻³ per meter in four samples. Hennessey Group hydraulic conductivities ranged from 4.0 × 10⁻¹² to 4.0 × 10⁻¹⁰ meters per second in eight samples. Hydraulic conductivity in the Boggy Formation ranged from 1.7 × 10⁻¹² to 1.0 × 10⁻⁸ meters per second in 17 samples. The hydraulic properties of isolated borehole intervals of average length 4.5 meters in the Hennessey Group and the Boggy Formation were evaluated by a pressurized slug-test method. Hydraulic conductivities obtained with this method tend to be low because intervals with features that transmitted large volumes of water were not tested. Hennessey Group hydraulic conductivities measured by this method ranged from 3.0 × 10⁻¹³ to 1.1 × 10⁻⁹ meters per second; the specific storage values are small and may be unreliable. Boggy Formation hydraulic conductivities ranged from 2.0 × 10⁻¹³ to 2.7 × 10⁻¹⁰ meters per second, and specific storage values in these tests also are small and may be unreliable. A substantially higher hydraulic conductivity of 3.0 × 10⁻⁸ meters per second was measured in one borehole 30 meters deep in the Boggy Formation using an open hole slug-test method.
Achieving Rigorous Accelerated Conformational Sampling in Explicit Solvent.
Doshi, Urmi; Hamelberg, Donald
2014-04-03
Molecular dynamics simulations can provide valuable atomistic insights into biomolecular function. However, the accuracy of molecular simulations on general-purpose computers depends on the time scale of the events of interest. Advanced simulation methods, such as accelerated molecular dynamics, have shown tremendous promise in sampling the conformational dynamics of biomolecules, where standard molecular dynamics simulations are nonergodic. Here we present a sampling method based on accelerated molecular dynamics in which rotatable dihedral angles and nonbonded interactions are boosted separately. This method (RaMD-db) is a different implementation of the dual-boost accelerated molecular dynamics, introduced earlier. The advantage is that this method speeds up sampling of the conformational space of biomolecules in explicit solvent, as the degrees of freedom most relevant for conformational transitions are accelerated. We tested RaMD-db on one of the most difficult sampling problems - protein folding. Starting from fully extended polypeptide chains, two fast folding α-helical proteins (Trpcage and the double mutant of C-terminal fragment of Villin headpiece) and a designed β-hairpin (Chignolin) were completely folded to their native structures in very short simulation time. Multiple folding/unfolding transitions could be observed in a single trajectory. Our results show that RaMD-db is a promisingly fast and efficient sampling method for conformational transitions in explicit solvent. RaMD-db thus opens new avenues for understanding biomolecular self-assembly and functional dynamics occurring on long time and length scales.
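For reference, the boost potential commonly used in accelerated MD adds dV = (E - V)^2 / (alpha + E - V) whenever the potential V falls below a threshold E, and nothing otherwise; dual-boost schemes such as the one described here apply this form separately to the dihedral term and to the remaining (e.g. nonbonded) terms. The function below is a generic sketch of that expression with made-up parameter values, not the authors' implementation.

```python
def amd_boost(v, e_threshold, alpha):
    """Accelerated-MD boost energy added when the potential v lies below the
    threshold E: dV = (E - v)^2 / (alpha + E - v), else 0."""
    if v >= e_threshold:
        return 0.0
    gap = e_threshold - v
    return gap * gap / (alpha + gap)

# Dual-boost style usage with illustrative numbers (kcal/mol)
dihedral_boost = amd_boost(v=60.0, e_threshold=80.0, alpha=16.0)          # ~11.1
nonbonded_boost = amd_boost(v=-9500.0, e_threshold=-9300.0, alpha=200.0)  # 100.0
print(dihedral_boost, nonbonded_boost)
```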
Gorecki, Jerzy; Díez, Sergi; Macherzynski, Mariusz; Kalisinska, Elżbieta; Golas, Janusz
2013-10-15
Improvements to the application of a combined solid-phase microextraction followed by gas chromatography coupled to pyrolysis and atomic fluorescence spectrometry method (SPME-GC-AFS) for methylmercury (MeHg) determination in biota samples are presented. Our new method includes improvements in the methodology of determination and the quantification technique. A shaker instead of a stirrer was used in order to reduce the possibility of sample contamination and to simplify cleaning procedures; the optimal rotation frequency and shaking time were set at 800 rpm and 10 min, respectively. Moreover, the GC-AFS system was equipped with a valve and an argon heater to eliminate the decrease in analytical signal caused by the moisture released from the SPME fiber. For its determination, MeHg was first extracted from biota samples with a 25% KOH solution (3 h) and then quantified by two methods: a conventional double standard addition method (AC) and a modified matrix-matched calibration (MQ), which is two times faster than the AC method. Both procedures were successfully tested with certified reference materials and applied for the first time to the determination of MeHg in muscle samples of goosander (Mergus merganser) and liver samples of white-tailed eagle (Haliaeetus albicilla), with values ranging from 1.19 to 3.84 mg/kg dry weight (dw) and from 0.69 to 6.23 mg/kg dw, respectively. Copyright © 2013 Elsevier B.V. All rights reserved.
Rocha, C F D; Van Sluys, M; Hatano, F H; Boquimpani-Freitas, L; Marra, R V; Marques, R V
2004-11-01
Studies on anurans in restinga habitats are few and, as a result, there is little information on which methods are more efficient for sampling them in this environment. Ten methods are usually used for sampling anuran communities in tropical and sub-tropical areas. In this study we evaluate which methods are more appropriate for this purpose in the restinga environment of Parque Nacional da Restinga de Jurubatiba. We analyzed six methods among those usually used for anuran samplings. For each method, we recorded the total amount of time spent (in min.), the number of researchers involved, and the number of species captured. We calculated a capture efficiency index (time necessary for a researcher to capture an individual frog) in order to make the data obtained comparable. Of the methods analyzed, the species inventory (9.7 min/searcher/individual - MSI; richness = 6; abundance = 23) and the breeding site survey (9.5 MSI; richness = 4; abundance = 22) were the most efficient. The visual encounter inventory (45.0 MSI) and patch sampling (65.0 MSI) methods were of comparatively lower efficiency in the restinga, whereas the plot sampling and the pit-fall traps with drift-fence methods resulted in no frog capture. We conclude that there is a considerable difference in efficiency of methods used in the restinga environment and that the complete species inventory method is highly efficient for sampling frogs in the restinga studied and may be so in other restinga environments. Methods that are usually efficient in forested areas seem to be of little value in open restinga habitats.
Fransz, Duncan P; Huurnink, Arnold; de Boode, Vosse A; Kingma, Idsart; van Dieën, Jaap H
2015-01-01
Time to stabilization (TTS) is the time it takes for an individual to return to a baseline or stable state following a jump or hop landing. A large variety exists in methods to calculate the TTS. These methods can be described based on four aspects: (1) the input signal used (vertical, anteroposterior, or mediolateral ground reaction force), (2) the signal processing (smoothing by sequential averaging, a moving root-mean-square window, or fitting an unbounded third-order polynomial), (3) the stable state (threshold), and (4) the definition of when the (processed) signal is considered stable. Furthermore, differences exist with regard to the sample rate, filter settings and trial length. Twenty-five healthy volunteers performed ten 'single leg drop jump landing' trials. For each trial, TTS was calculated according to 18 previously reported methods. Additionally, the effects of sample rate (1000, 500, 200 and 100 samples/s), filter settings (no filter, 40, 15 and 10 Hz), and trial length (20, 14, 10, 7, 5 and 3 s) were assessed. The TTS values varied considerably across the calculation methods. The maximum effects of alterations in the processing settings, averaged over calculation methods, were 2.8% (SD 3.3%) for sample rate, 8.8% (SD 7.7%) for filter settings, and 100.5% (SD 100.9%) for trial length. Differences in TTS calculation methods are affected differently by sample rate, filter settings and trial length. The effects of differences in sample rate and filter settings are generally small, while trial length has a large effect on TTS values. Copyright © 2014 Elsevier B.V. All rights reserved.
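As an illustration, a minimal sketch of one common family of TTS calculations (moving root-mean-square smoothing with a threshold band around a baseline); the window length, baseline estimate, and threshold fraction below are placeholder assumptions, not settings from any of the 18 reported methods:

    import numpy as np

    def time_to_stabilization(grf, fs=1000, window_s=0.25, threshold_frac=0.05):
        """Estimate TTS (s) from a vertical ground reaction force trace.

        grf: 1-D array sampled at fs (samples/s), starting at ground contact.
        The signal is smoothed with a moving RMS window; TTS is the first time
        after which the processed signal stays within +/- threshold_frac of the
        baseline value for the remainder of the trial.
        """
        grf = np.asarray(grf, dtype=float)
        n_win = max(1, int(window_s * fs))
        # moving root-mean-square smoothing
        rms = np.sqrt(np.convolve(grf**2, np.ones(n_win) / n_win, mode="same"))
        baseline = rms[-n_win:].mean()          # stable-state estimate from the trial end
        band = threshold_frac * baseline
        inside = np.abs(rms - baseline) <= band
        # first index after which the signal never leaves the band
        for i in range(len(inside)):
            if inside[i:].all():
                return i / fs
        return np.nan

In many published variants the baseline is instead the subject's body weight or a fixed fraction thereof, which is exactly the kind of definitional difference (aspects 3 and 4 above) that the study quantifies.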
An evaluation of methods for estimating decadal stream loads
NASA Astrophysics Data System (ADS)
Lee, Casey J.; Hirsch, Robert M.; Schwarz, Gregory E.; Holtschlag, David J.; Preston, Stephen D.; Crawford, Charles G.; Vecchia, Aldo V.
2016-11-01
Effective management of water resources requires accurate information on the mass, or load, of water-quality constituents transported from upstream watersheds to downstream receiving waters. Despite this need, no single method has been shown to consistently provide accurate load estimates among different water-quality constituents, sampling sites, and sampling regimes. We evaluate the accuracy of several load estimation methods across a broad range of sampling and environmental conditions. This analysis uses random sub-samples drawn from temporally-dense data sets of total nitrogen, total phosphorus, nitrate, and suspended-sediment concentration, and includes measurements of specific conductance, which was used as a surrogate for dissolved solids concentration. Methods considered include linear interpolation and ratio estimators, regression-based methods historically employed by the U.S. Geological Survey, and newer flexible techniques including Weighted Regressions on Time, Season, and Discharge (WRTDS) and a generalized non-linear additive model. No single method is identified to have the greatest accuracy across all constituents, sites, and sampling scenarios. Most methods provide accurate estimates of specific conductance (used as a surrogate for total dissolved solids or specific major ions) and total nitrogen; lower accuracy is observed for the estimation of nitrate, total phosphorus and suspended sediment loads. Methods that allow for flexibility in the relation between concentration and flow conditions, specifically Beale's ratio estimator and WRTDS, exhibit greater estimation accuracy and lower bias. Evaluation of methods across simulated sampling scenarios indicates that (1) high-flow sampling is necessary to produce accurate load estimates, (2) extrapolation of sample data through time or across more extreme flow conditions reduces load estimate accuracy, and (3) WRTDS and methods that use a Kalman filter or smoothing to correct for departures between individual modeled and observed values benefit most from more frequent water-quality sampling.
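As an illustration of one of the simpler estimators evaluated, a minimal sketch of the Beale bias-corrected ratio estimator for a period load computed from n sampled days and a complete daily discharge record (the function and variable names are illustrative, not taken from the study):

    import numpy as np

    def beale_annual_load(conc, q_sampled, q_all_days):
        """Beale bias-corrected ratio estimate of total load.

        conc       : concentrations on sampled days (mass/volume)
        q_sampled  : discharge on those sampled days (volume/day)
        q_all_days : discharge for every day of the estimation period (volume/day)
        Returns the estimated total load (mass) over the whole period.
        """
        conc = np.asarray(conc, float)
        q_s = np.asarray(q_sampled, float)
        q_all = np.asarray(q_all_days, float)
        n = len(conc)
        load_s = conc * q_s                        # daily loads on sampled days
        l_bar, q_bar = load_s.mean(), q_s.mean()
        s_lq = np.cov(load_s, q_s, ddof=1)[0, 1]   # sample covariance of load and flow
        s_qq = np.var(q_s, ddof=1)                 # sample variance of flow
        correction = (1 + s_lq / (n * l_bar * q_bar)) / (1 + s_qq / (n * q_bar**2))
        daily_load_est = q_all.mean() * (l_bar / q_bar) * correction
        return daily_load_est * len(q_all)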
Alum, Absar; Rock, Channah; Abbaszadegan, Morteza
2014-01-01
For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in residual biosolids. The current EPA methods for the detection of these groups of pathogens in biosolids include discrete steps. Therefore, a separate sample is processed independently to quantify the number of each group of the pathogens in biosolids. The aim of the study was to develop a unified method for simultaneous processing of a single biosolids sample to recover bacterial, viral, and helminth pathogens. In the first stage of developing a simultaneous method, nine eluents were compared for their efficiency to recover viruses from a 100 g spiked biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments were performed to optimize the performance of the glycine-based eluent under various procedural factors such as solids-to-eluent ratio, stir time, and centrifugation conditions. Finally, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked in duplicate samples of biosolids collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method compared to 34% and 68% by the EPA method, respectively. The unified sample processing method significantly reduces the time required for processing biosolids samples for different groups of pathogens; it is less impacted by the intrinsic variability of samples, while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.
Phase II Trials for Heterogeneous Patient Populations with a Time-to-Event Endpoint.
Jung, Sin-Ho
2017-07-01
In this paper, we consider a single-arm phase II trial with a time-to-event end-point. We assume that the study population has multiple subpopulations with different prognosis, but the study treatment is expected to be similarly efficacious across the subpopulations. We review a stratified one-sample log-rank test and present its sample size calculation method under some practical design settings. Our sample size method requires specification of the prevalence of subpopulations. We observe that the power of the resulting sample size is not very sensitive to misspecification of the prevalence.
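For context, a common textbook form of the stratified one-sample log-rank statistic is sketched below (a generic form; the paper's exact formulation and sample size derivation may differ):

    Z = \frac{\sum_j (O_j - E_j)}{\sqrt{\sum_j E_j}}, \qquad E_j = \sum_{i \in \text{stratum } j} \Lambda_{0j}(X_i)

where O_j is the observed number of events in stratum j, Λ_{0j} is the cumulative hazard of the historical control (null) survival curve for that stratum, and X_i are the observed follow-up times. Under the null hypothesis Z is approximately standard normal, and the sample size is chosen so that the test attains the desired power at the alternative, with strata weighted by their assumed prevalences.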
Method and apparatus for measuring the gas permeability of a solid sample
Carstens, D.H.W.
1984-01-27
The disclosure is directed to an apparatus and method for measuring the permeability of a gas in a sample. The gas is allowed to reach a steady flow rate through the sample. A measurable amount of the gas is collected during a given time period and then delivered to a sensitive quadrupole. The quadrupole signal, adjusted for background, is proportional to the amount of gas collected during the time period. The quadrupole can be calibrated with a standard helium leak. The gas can be deuterium and the sample can be polyvinyl alcohol.
[Determination of benzo(alpha)pyrene in food with microwave-assisted extraction].
Zhou, Na; Luo, He-Dong; Li, Na; Li, Yao-Qun
2014-03-01
Coupling derivative technique and constant-energy synchronous fluorescence scanning technique, a method of determining benzo[alpha] pyrene in foods by second derivative constant-energy synchronous spectrofluorimetry after microwave-assisted treatment of samples was established using domestic microwave oven. The main factors of influencing the efficiency of microwave extraction were discussed, including the extraction solvent types and amounts, the microwave extraction time, microwave radiation power and cooling time. And the comparison with ultrasonic extraction was made. Low-fat food samples, which were just microwave-extracted with mixed-solvents, could be analyzed immediately by the spectrofluorimetric technique. For high-fat food samples, microwave-assisted saponification and extraction were made at the same time, thus simplifying operation steps and reducing sample analysis time. So the whole sample analysis process could be completed within one hour. This method was simple, rapid and inexpensive. In consequence, it was applied to determine benzo(a)pyrene in food with good reproducibility and the recoveries of benzo(alpha) pyrene ranged from 90.0% to 105.0% for the low fat samples and 83.3% to 94.6% for high-fat samples.
40 CFR Table 1 to Subpart Cccc of... - Emission Limitations
Code of Federal Regulations, 2011 CFR
2011-07-01
... per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of this part). Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10, 10A, or 10B of appendix A of this...
40 CFR Table 1 to Subpart Cccc of... - Emission Limitations
Code of Federal Regulations, 2010 CFR
2010-07-01
... per dry standard cubic meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of this part). Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10, 10A, or 10B of appendix A of this...
40 CFR Table 2 to Subpart Dddd of... - Model Rule-Emission Limitations
Code of Federal Regulations, 2010 CFR
2010-07-01
... meter 3-run average (1 hour minimum sample time per run) Performance test (Method 29 of appendix A of this part) Carbon monoxide 157 parts per million by dry volume 3-run average (1 hour minimum sample time per run) Performance test (Method 10, 10A, or 10B, of appendix A of this part) Dioxins/furans...
NASA Astrophysics Data System (ADS)
Meng, Su; Chen, Jie; Sun, Jian
2017-10-01
This paper investigates the problem of observer-based output feedback control for networked control systems with non-uniform sampling and time-varying transmission delay. The sampling intervals are assumed to vary within a given interval. The transmission delay belongs to a known interval. A discrete-time model is first established, which contains time-varying delay and norm-bounded uncertainties coming from non-uniform sampling intervals. It is then converted to an interconnection of two subsystems in which the forward channel is delay-free. The scaled small gain theorem is used to derive the stability condition for the closed-loop system. Moreover, the observer-based output feedback controller design method is proposed by utilising a modified cone complementary linearisation algorithm. Finally, numerical examples illustrate the validity and superiority of the proposed method.
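A generic sketch of the kind of uncertain discrete-time model such an approach produces (the notation is illustrative and is not taken from the paper):

    x_{k+1} = (\Phi + \Delta\Phi_k)\,x_k + (\Gamma + \Delta\Gamma_k)\,u_{k-d_k}, \qquad \left\| \begin{bmatrix} \Delta\Phi_k & \Delta\Gamma_k \end{bmatrix} \right\| \le \delta

where Φ and Γ come from discretizing the plant at a nominal sampling interval, the norm-bounded terms ΔΦ_k and ΔΓ_k absorb the deviation of the actual sampling interval from its nominal value, and d_k is the time-varying transmission delay in samples; isolating the delayed term as a separate subsystem is what makes the scaled small gain theorem applicable.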
2013-01-01
Background Molecular imaging using magnetic nanoparticles (MNPs), known as magnetic particle imaging (MPI), has attracted interest for the early diagnosis of cancer and cardiovascular disease. However, because a steep local magnetic field distribution is needed to obtain a defined image, sophisticated hardware is required. Therefore, it is desirable to realize excellent image quality even with low-performance hardware. In this study, the spatial resolution of MPI was evaluated using an image reconstruction method based on the correlation information of the magnetization signal in the time domain and by applying MNP samples made from biocompatible ferucarbotran with adjusted particle diameters. Methods The magnetization characteristics and particle diameters of four types of MNP samples made from ferucarbotran were evaluated. A numerical analysis based on our proposed method, which calculates the image intensity from correlation information between the magnetization signal generated from MNPs and the system function, was performed, and the obtained image quality was compared with that of the prototype in terms of image resolution and image artifacts. Results MNP samples obtained by adjusting ferucarbotran showed superior properties to conventional ferucarbotran samples, and numerical analysis showed that the same image quality could be obtained using a gradient magnetic field generator with 0.6 times the performance. However, because the proposed method theoretically introduces some image blurring, an additional algorithm will be required to improve performance. Conclusions MNP samples obtained by adjusting ferucarbotran showed magnetizing properties superior to conventional ferucarbotran samples, and by using such samples, comparable image quality (spatial resolution) could be obtained with a lower gradient magnetic field intensity. PMID:23734917
Balancing a U-Shaped Assembly Line by Applying Nested Partitions Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhagwat, Nikhil V.
2005-01-01
In this study, we applied the Nested Partitions method to a U-line balancing problem and conducted experiments to evaluate the application. From the results, it is quite evident that the Nested Partitions method provided near-optimal solutions (optimal in some cases). Moreover, the execution time is quite short as compared to the Branch and Bound algorithm. However, for larger data sets, the algorithm took significantly longer times for execution. One of the reasons could be the way in which the random samples are generated. In the present study, a random sample is a solution in itself which requires assignment of tasks to various stations. The time taken to assign tasks to stations is directly proportional to the number of tasks. Thus, if the number of tasks increases, the time taken to generate random samples for the different regions also increases. The performance index for the Nested Partitions method in the present study was the number of stations in the random solutions (samples) generated. The total idle time for the samples can be used as another performance index. The ULINO method is known to have used a combination of bounds to come up with good solutions. This approach of combining different performance indices can be used to evaluate the random samples and obtain even better solutions. Here, we used deterministic time values for the tasks. In industries where the majority of tasks are performed manually, the stochastic version of the problem could be of vital importance. Experimenting with different objective functions (the number of stations was used in this study) could be of some significance to industries wherein the cost associated with creation of a new station is not the same. For such industries, the results obtained by using the present approach will not be of much value. Labor costs, task incompletion costs, or a combination of those can be effectively used as alternate objective functions.
Kim, Min-Kyu; Hong, Seong-Kwan; Kwon, Oh-Kyong
2015-12-26
This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4-bit resolution after the first 12-bit A/D conversion, reducing the noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform the complex calculations required for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated using a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB.
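The reported figures are consistent with the stated one-over-square-root-of-N averaging behaviour (a back-of-the-envelope check, not a number taken from the paper):

    \sigma_N = \sigma_1/\sqrt{N}, \qquad 848.3\ \mu\mathrm{V} / 270.4\ \mu\mathrm{V} \approx 3.1 \approx \sqrt{10}

which suggests the measured noise reduction corresponds to averaging on the order of ten samplings per pixel.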
Recording 2-D Nutation NQR Spectra by Random Sampling Method
Sinyavsky, Nikolaj; Jadzyn, Maciej; Ostafin, Michal; Nogaj, Boleslaw
2010-01-01
The method of random sampling was introduced for the first time in the nutation nuclear quadrupole resonance (NQR) spectroscopy where the nutation spectra show characteristic singularities in the form of shoulders. The analytic formulae for complex two-dimensional (2-D) nutation NQR spectra (I = 3/2) were obtained and the condition for resolving the spectral singularities for small values of an asymmetry parameter η was determined. Our results show that the method of random sampling of a nutation interferogram allows significant reduction of time required to perform a 2-D nutation experiment and does not worsen the spectral resolution. PMID:20949121
Off-line real-time FTIR analysis of a process step in imipenem production
NASA Astrophysics Data System (ADS)
Boaz, Jhansi R.; Thomas, Scott M.; Meyerhoffer, Steven M.; Staskiewicz, Steven J.; Lynch, Joseph E.; Egan, Richard S.; Ellison, Dean K.
1992-08-01
We have developed an FT-IR method, using a Spectra-Tech Monit-IR 400 system, to monitor off-line the completion of a reaction in real time. The reaction is moisture-sensitive and analysis by more conventional methods (normal-phase HPLC) is difficult to reproduce. The FT-IR method is based on the shift of a diazo band when a conjugated beta-diketone is transformed into a silyl enol ether during the reaction. The reaction mixture is examined directly by IR and does not require sample workup. Data acquisition time is less than one minute. The method has been validated for specificity, precision and accuracy. The results obtained by the FT-IR method for known mixtures and in-process samples compare favorably with those from a normal-phase HPLC method.
Implementation of unsteady sampling procedures for the parallel direct simulation Monte Carlo method
NASA Astrophysics Data System (ADS)
Cave, H. M.; Tseng, K.-C.; Wu, J.-S.; Jermy, M. C.; Huang, J.-C.; Krumdieck, S. P.
2008-06-01
An unsteady sampling routine for a general parallel direct simulation Monte Carlo method called PDSC is introduced, allowing the simulation of time-dependent flow problems in the near continuum range. A post-processing procedure called DSMC rapid ensemble averaging method (DREAM) is developed to improve the statistical scatter in the results while minimising both memory and simulation time. This method builds an ensemble average of repeated runs over a small number of sampling intervals prior to the sampling point of interest by restarting the flow using either a Maxwellian distribution based on macroscopic properties for near-equilibrium flows (DREAM-I) or the instantaneous particle data output by the original unsteady sampling of PDSC for strongly non-equilibrium flows (DREAM-II). The method is validated by simulating shock tube flow and the development of simple Couette flow. Unsteady PDSC is found to accurately predict the flow field in both cases with significantly reduced run-times over single-processor code, and DREAM greatly reduces the statistical scatter in the results while maintaining accurate particle velocity distributions. Simulations are then conducted of two applications involving the interaction of shocks over wedges. The results of these simulations are compared to experimental data and simulations from the literature where these are available. In general, it was found that 10 ensembled runs of DREAM processing could reduce the statistical uncertainty in the raw PDSC data by 2.5-3.3 times, based on the limited number of cases in the present study.
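The reported reduction in scatter is in line with simple ensemble-averaging statistics (an order-of-magnitude check assuming approximately independent runs, not a figure from the paper):

    \sigma_{\mathrm{DREAM}} \approx \sigma_{\mathrm{single}}/\sqrt{M}, \qquad M = 10 \ \Rightarrow\ \sqrt{10} \approx 3.2

which brackets the observed 2.5-3.3-fold reduction for 10 ensembled runs.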
Farnham, David J; Gibson, Rebecca A; Hsueh, Diana Y; McGillis, Wade R; Culligan, Patricia J; Zain, Nina; Buchanan, Rob
2017-02-15
To protect recreational water users from waterborne pathogen exposure, it is crucial that waterways are monitored for the presence of harmful bacteria. In NYC, a citizen science campaign is monitoring waterways impacted by inputs of storm water and untreated sewage during periods of rainfall. However, the spatial and temporal scales over which the monitoring program can sample are constrained by cost and time, thus hindering the construction of databases that benefit both scientists and citizens. In this study, we first illustrate the scientific value of a citizen scientist monitoring campaign by using the data collected through the campaign to characterize the seasonal variability of sampled bacterial concentration as well as its response to antecedent rainfall. Second, we examine the efficacy of the HyServe Compact Dry ETC method, a lower-cost and time-efficient alternative to the EPA-approved IDEXX Enterolert method for fecal indicator monitoring, through a paired sample comparison of IDEXX and HyServe (a total of 424 paired samples). The HyServe and IDEXX methods return the same result for over 80% of the samples with regard to whether a water sample is above or below the EPA's recreational water quality criterion for a single sample of 110 enterococci per 100 mL. The HyServe method classified as unsafe 90% of the 119 water samples that were classified as having unsafe enterococci concentrations by the more established IDEXX method. This study seeks to encourage other scientists to engage with citizen scientist communities and to also pursue the development of cost- and time-efficient methodologies to sample environmental variables that are not easily collected or analyzed in an automated manner. Copyright © 2016 Elsevier B.V. All rights reserved.
Yoshida, Yasuyuki; Takata, Kazuyuki; Takai, Hiroki; Kawahara, Keisuke; Kuzuya, Akinori; Ohya, Yuichi
2017-10-01
For clinical application of biodegradable injectable polymer (IP) systems, quick extemporaneous preparation of IP formulations and a longer duration of the gel state after injection into the body are important development targets. Previously, we reported temperature-responsive covalent gelation systems via a bio-orthogonal thiol-ene reaction using a 'mixing strategy' of an amphiphilic biodegradable tri-block copolymer (tri-PCG) bearing acryloyl groups on both termini (tri-PCG-Acryl) with a reactive polythiol. In other previous work, we developed the 'freeze-dry with PEG/dispersion' method for quick extemporaneous preparation of biodegradable IP formulations. In this study, we applied this quick preparative method to the temperature-triggered covalent gelation system. The instant formulation (D-sample) could be prepared by 'freeze-dry with PEG/dispersion' simply by mixing a tri-PCG-Acryl micelle dispersion and a tri-PCG/DPMP micelle dispersion with PEG; the formulation can be prepared within 30 s from the dried samples. The obtained D-sample showed irreversible gelation and a long duration of the gel state, essentially the same as the formulations prepared by the usual heating-dissolution method (S-sample). Interestingly, the D-sample could maintain its sol state for a longer time (24 h) after preparation at room temperature than the S-sample, which became a gel within 3 h of preparation. The IP system showed good biocompatibility and a long duration of the gel state after subcutaneous implantation. These characteristics of D-samples, quick extemporaneous preparation and high stability in the sol state before injection, would be very convenient in a clinical setting.
Pines, Alexander; Samoson, Ago
1990-01-01
An improved NMR apparatus and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus spins the sample about an axis whose angle is mechanically varied such that the time average of two or more Legendre polynomials is zero.
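A sketch of the averaging condition implied here, in the notation standard for dynamic-angle spinning (the specific angles quoted are approximate and given only for illustration):

    \langle P_2(\cos\theta(t))\rangle_t = \langle P_4(\cos\theta(t))\rangle_t = 0, \quad P_2(x)=\tfrac12(3x^2-1), \quad P_4(x)=\tfrac18(35x^4-30x^2+3)

For example, hopping the spinning axis between roughly 37.4° and 79.2° for equal times nulls both averages, removing the second- and fourth-rank anisotropic broadenings that spinning at a single (magic) angle cannot eliminate simultaneously.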
NASA Technical Reports Server (NTRS)
Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III
1994-01-01
NASA Langley Research Center has, for several years, conducted research in the area of time-correlated gust loads for linear and nonlinear aircraft. The results of this work led NASA to recommend that the Matched-Filter-Based One-Dimensional Search Method be used for gust load analyses of nonlinear aircraft. This manual describes this method, describes a FORTRAN code which performs this method, and presents example calculations for a sample nonlinear aircraft model. The name of the code is MFD1DS (Matched-Filter-Based One-Dimensional Search). The program source code, the example aircraft equations of motion, a sample input file, and a sample program output are all listed in the appendices.
Evaluating performance of stormwater sampling approaches using a dynamic watershed model.
Ackerman, Drew; Stein, Eric D; Ritter, Kerry J
2011-09-01
Accurate quantification of stormwater pollutant levels is essential for estimating overall contaminant discharge to receiving waters. Numerous sampling approaches exist that attempt to balance accuracy against the costs associated with the sampling method. This study employs a novel and practical approach of evaluating the accuracy of different stormwater monitoring methodologies using stormflows and constituent concentrations produced by a fully validated continuous simulation watershed model. A major advantage of using a watershed model to simulate pollutant concentrations is that a large number of storms representing a broad range of conditions can be applied in testing the various sampling approaches. Seventy-eight distinct methodologies were evaluated by "virtual samplings" of 166 simulated storms of varying size, intensity and duration, representing 14 years of storms in Ballona Creek near Los Angeles, California. The 78 methods can be grouped into four general strategies: volume-paced compositing, time-paced compositing, pollutograph sampling, and microsampling. The performance of each sampling strategy was evaluated by comparing (1) the median relative error between the virtually sampled and the true modeled event mean concentration (EMC) of each storm (accuracy), (2) the median absolute deviation about the median (MAD) of the relative error (precision), and (3) the percentage of storms for which sampling methods were within 10% of the true EMC (a combined measure of accuracy and precision). Finally, costs associated with site setup, sampling, and laboratory analysis were estimated for each method. Pollutograph sampling consistently outperformed the other three methods both in terms of accuracy and precision, but was the most costly method evaluated. Time-paced sampling consistently underestimated, while volume-paced sampling overestimated, the storm EMCs. Microsampling performance approached that of pollutograph sampling at a substantial cost savings. The most efficient method for routine stormwater monitoring in terms of a balance between performance and cost was volume-paced microsampling, with variable sample pacing to ensure that the entirety of the storm was captured. Pollutograph sampling is recommended if the data are to be used for detailed analysis of runoff dynamics.
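For reference, the true value against which each virtual sampling method was scored, the storm event mean concentration, is simply a flow-weighted average; a minimal sketch (array names are illustrative):

    import numpy as np

    def event_mean_concentration(conc, flow, dt):
        """Flow-weighted EMC for one storm.

        conc : pollutant concentration at each model time step (e.g., mg/L)
        flow : discharge at each time step (e.g., L/s)
        dt   : time-step length (s)
        """
        conc, flow = np.asarray(conc, float), np.asarray(flow, float)
        mass = np.sum(conc * flow * dt)     # total pollutant mass over the storm
        volume = np.sum(flow * dt)          # total runoff volume
        return mass / volume

    def relative_error(emc_sampled, emc_true):
        """Signed relative error used to compare sampling strategies."""
        return (emc_sampled - emc_true) / emc_true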
Soejima, Mikiko; Tsuchiya, Yuji; Egashira, Kouichi; Kawano, Hiroyuki; Sagawa, Kimitaka; Koda, Yoshiro
2010-06-01
Anhaptoglobinemic patients run the risk of severe anaphylactic transfusion reaction because they produce serum haptoglobin (Hp) antibodies. Being homozygous for the Hp gene deletion (HP(del)) is the only known cause of congenital anhaptoglobinemia, and clinical diagnosis of HP(del) before transfusion is important to prevent anaphylactic shock. We recently developed a 5'-nuclease (TaqMan) real-time polymerase chain reaction (PCR) method. A SYBR Green I-based duplex real-time PCR assay using two forward primers and a common reverse primer followed by melting curve analysis was developed to determine HP(del) zygosity in a single tube. In addition, to obviate initial DNA extraction, we examined serially diluted blood samples as PCR templates. Allelic discrimination of HP(del) yielded optimal results at blood sample dilutions of 1:64 to 1:1024. The results from 2231 blood samples were fully concordant with those obtained by the TaqMan-based real-time PCR method. The detection rate of the HP(del) allele by the SYBR Green I-based method is comparable with that using the TaqMan-based method. This method is readily applicable due to its low initial cost and analyzability using economical real-time PCR machines and is suitable for high-throughput analysis as an alternative method for allelic discrimination of HP(del).
Code of Federal Regulations, 2011 CFR
2011-07-01
...) (grains per dry standard cubic foot (gr/dscf)) 115 (0.05) 69 (0.03) 34 (0.015) 3-run average (1-hour minimum sample time per run) EPA Reference Method 5 of appendix A-3 of part 60, or EPA Reference Method...-run average (1-hour minimum sample time per run) EPA Reference Method 10 or 10B of appendix A-4 of...
Code of Federal Regulations, 2010 CFR
2010-07-01
...) (grains per dry standard cubic foot (gr/dscf)) 115 (0.05) 69 (0.03) 34 (0.015) 3-run average (1-hour minimum sample time per run) EPA Reference Method 5 of appendix A-3 of part 60, or EPA Reference Method...-run average (1-hour minimum sample time per run) EPA Reference Method 10 or 10B of appendix A-4 of...
Han, Yang; Hou, Shao-Yang; Ji, Shang-Zhi; Cheng, Juan; Zhang, Meng-Yue; He, Li-Juan; Ye, Xiang-Zhong; Li, Yi-Min; Zhang, Yi-Xuan
2017-11-15
A novel method, real-time reverse transcription PCR (real-time RT-PCR) coupled with probe-melting curve analysis, has been established to detect two kinds of samples within one fluorescence channel. Besides a conventional TaqMan probe, this method employs a second, specially designed melting probe with a 5' terminus modification that carries the same fluorescent label. By using an asymmetric PCR method, the melting probe can effectively detect an additional sample in the melting stage while having little influence on the amplification detection. Thus, the method allows both the amplification stage and the melting stage to be used for detecting samples in one reaction. A demonstration using simultaneous detection of human immunodeficiency virus (HIV) and hepatitis C virus (HCV) in one channel as a model system is presented in this study. The sensitivity of detection by real-time RT-PCR coupled with probe-melting analysis was shown to be equal to that of conventional real-time RT-PCR. Because real-time RT-PCR coupled with probe-melting analysis can double the detection throughput within one fluorescence channel, it is expected to be a good solution to the low-throughput problem of current real-time PCR. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Zhengang, Lu; Hongyang, Yu; Xi, Yang
2017-05-01
The modular multilevel converter (MMC) has become one of the most attractive topologies in recent years for medium- and high-voltage industrial applications, such as high-voltage dc transmission (HVDC) and medium-voltage variable-speed motor drives. The wide adoption of MMCs in industry is mainly due to their flexible expandability, transformer-less configuration, common dc bus, and high reliability through redundancy. However, as the number of submodules in an MMC grows, testing the MMC controller requires more time and effort. Hardware-in-the-loop (HIL) testing based on a real-time simulator can save much of the time and cost of MMC testing, and owing to its flexibility, HIL testing has become increasingly popular in industry. The MMC modelling method remains an important issue for MMC HIL testing. Specifically, the VSC model should realistically reflect the nonlinear device switching characteristics, switching and conduction losses, tailing current, and diode reverse-recovery behaviour of a realistic converter. In this paper, a half-bridge MMC modelling method with embedded IGBT switching characteristic curves is proposed. The method is based on switching-curve look-up and simple circuit calculation, and it is simple to implement. Based on the proposed method, an FPGA real-time simulation is carried out with a 200 ns sample time. The real-time simulation results show that the proposed method is correct.
An Alu-based, MGB Eclipse real-time PCR method for quantitation of human DNA in forensic samples.
Nicklas, Janice A; Buel, Eric
2005-09-01
The forensic community needs quick, reliable methods to quantitate human DNA in crime scene samples to replace the laborious and imprecise slot blot method. A real-time PCR based method has the possibility of allowing development of a faster and more quantitative assay. Alu sequences are primate-specific and are found in many copies in the human genome, making these sequences an excellent target or marker for human DNA. This paper describes the development of a real-time Alu sequence-based assay using MGB Eclipse primers and probes. The advantages of this assay are simplicity, speed, less hands-on-time and automated quantitation, as well as a large dynamic range (128 ng/microL to 0.5 pg/microL).
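A minimal sketch of the standard-curve quantitation step that such a real-time PCR assay relies on (the dilution series, Ct values, and efficiency below are illustrative placeholders, not the assay's validated parameters):

    import numpy as np

    # Ct values measured for a human DNA dilution series (ng/uL) -- illustrative numbers
    std_conc = np.array([128, 16, 2, 0.25, 0.03125])
    std_ct   = np.array([15.1, 18.3, 21.4, 24.6, 27.7])

    slope, intercept = np.polyfit(np.log10(std_conc), std_ct, 1)   # Ct = slope*log10(C) + intercept
    efficiency = 10 ** (-1.0 / slope) - 1                          # ~1.0 corresponds to 100% efficient PCR

    def quantify(ct_unknown):
        """Interpolate an unknown sample's DNA concentration (ng/uL) from its Ct."""
        return 10 ** ((ct_unknown - intercept) / slope)

The wide dynamic range reported above corresponds to the span of such a standard curve, with automated quantitation replacing visual comparison against slot blot standards.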
Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.
Mobli, Mehdi; Hoch, Jeffrey C
2014-11-01
Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
Method and apparatus for time dispersive spectroscopy
Tarver, III, Edward E.; Siems, William F.
2003-06-17
Methods and apparatus are described for time dispersive spectroscopy. In particular, a modulated flow of ionized molecules of a sample are introduced into a drift region of an ion spectrometer. The ions are subsequently detected by an ion detector to produce an ion detection signal. The ion detection signal can be modulated to obtain a signal useful in assaying the chemical constituents of the sample.
Yong, Dongeun; Ki, Chang-Seok; Kim, Jae-Seok; Seong, Moon-Woo; Lee, Hyukmin
2016-01-01
Background Real-time reverse transcription PCR (rRT-PCR) of sputum samples is commonly used to diagnose Middle East respiratory syndrome coronavirus (MERS-CoV) infection. Owing to the difficulty of extracting RNA from sputum containing mucus, sputum homogenization is desirable prior to nucleic acid isolation. We determined optimal homogenization methods for isolating viral nucleic acids from sputum. Methods We evaluated the following three sputum-homogenization methods: proteinase K and DNase I (PK-DNase) treatment, phosphate-buffered saline (PBS) treatment, and N-acetyl-L-cysteine and sodium citrate (NALC) treatment. Sputum samples were spiked with inactivated MERS-CoV culture isolates. RNA was extracted from pretreated, spiked samples using the easyMAG system (bioMérieux, France). Extracted RNAs were then subjected to rRT-PCR for MERS-CoV diagnosis (DiaPlex Q MERS-coronavirus, SolGent, Korea). Results While analyzing 15 spiked sputum samples prepared in technical duplicate, false-negative results were obtained with five (16.7%) and four samples (13.3%), respectively, by using the PBS and NALC methods. The range of threshold cycle (Ct) values observed when detecting upE in sputum samples was 31.1–35.4 with the PK-DNase method, 34.7–39.0 with the PBS method, and 33.9–38.6 with the NALC method. Compared with the control, which were prepared by adding a one-tenth volume of 1:1,000 diluted viral culture to PBS solution, the ranges of Ct values obtained by the PBS and NALC methods differed significantly from the mean control Ct of 33.2 (both P<0.0001). Conclusions The PK-DNase method is suitable for homogenizing sputum samples prior to RNA extraction. PMID:27374711
Advancing Explosives Detection Capabilities: Vapor Detection
Atkinson, David
2018-05-11
A new, PNNL-developed method provides direct, real-time detection of trace amounts of explosives such as RDX, PETN and C-4. The method selectively ionizes a sample before passing the sample through a mass spectrometer to detect explosive vapors. The method could be used at airports to improve aviation security.
Liu, Ya-Fei; Yuan, Hong-Fu; Song, Chun-Feng; Xie, Jin-Chun; Li, Xiao-Yu; Yan, De-Lin
2014-11-01
A new method is proposed for the fast determination of the induction period of gasoline using Fourier transform attenuated total reflection infrared spectroscopy (ATR-FTIR). A dedicated analysis system with spectral measurement, data processing, display and storage functions was designed and integrated using a Fourier transform infrared spectrometer module and chemometric software. The sample presentation accessory, which offers a constant optical path and convenient sample injection and cleaning, consists of a nine-reflection zinc selenide (ZnSe) attenuated total reflectance (ATR) crystal coated with a diamond film and a stainless steel lid with a sealing device. The influence of the number of spectral scans and of repeated sample loadings on the spectral signal-to-noise ratio was studied; the optimum number of spectral scans is 15 and the optimum number of sample loadings is 4. Sixty-four different gasoline samples were collected from the Beijing-Tianjin area and their induction period values were determined as reference data by the standard method GB/T 8018-87. The infrared spectra of these samples were collected under the operating conditions mentioned above using the dedicated fast analysis system. Spectra were pretreated using mean centering and the first derivative to reduce the influence of spectral noise and baseline shift. A PLS calibration model for the induction period was established by correlating the known induction period values of the samples with their spectra. The correlation coefficient (R2), standard error of calibration (SEC) and standard error of prediction (SEP) of the model are 0.897, 68.3 and 91.9 minutes, respectively. The relative deviation of the model for gasoline induction period prediction is less than 5%, which meets the repeatability tolerance requirements of the GB method. The new method is simple and fast; it takes no more than 3 minutes to analyze one sample. The method is therefore feasible for fast determination of the gasoline induction period and is of practical value for fuel quality evaluation.
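A minimal sketch of the chemometric pipeline described (first-derivative pretreatment, mean centering, and PLS calibration); the function names, derivative window, and number of latent variables are assumptions for illustration, not the paper's settings:

    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.cross_decomposition import PLSRegression

    # X: (n_samples, n_wavenumbers) ATR-FTIR absorbance spectra
    # y: (n_samples,) induction period in minutes from GB/T 8018-87
    def build_model(X, y, n_components=6):
        X_d1 = savgol_filter(X, window_length=11, polyorder=2, deriv=1, axis=1)  # 1st derivative
        train_mean = X_d1.mean(axis=0)
        pls = PLSRegression(n_components=n_components)   # PLSRegression also centers internally
        pls.fit(X_d1 - train_mean, y)                     # explicit centering mirrors the described pretreatment
        return pls, train_mean

    def predict(pls, train_mean, X_new):
        X_d1 = savgol_filter(X_new, window_length=11, polyorder=2, deriv=1, axis=1)
        return pls.predict(X_d1 - train_mean).ravel()

Model quality would then be summarized, as in the abstract, by R2 and SEC on the calibration set and SEP on an independent prediction set.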
Flow injection trace gas analysis method for on-site determination of organoarsenicals
Aldstadt, III, Joseph H.
1997-01-01
A method for real-time determination of the concentration of Lewisite in the ambient atmosphere, the method includes separating and collecting a Lewisite sample from the atmosphere in a collection chamber, converting the collected Lewisite to an arsenite ion solution sample, pumping the arsenite ion containing sample to an electrochemical detector connected to the collection chamber, and electrochemically detecting the converted arsenite ions in the sample, whereby the concentration of arsenite ions detected is proportional to the concentration of Lewisite in the atmosphere.
Convenience Samples and Caregiving Research: How Generalizable Are the Findings?
ERIC Educational Resources Information Center
Pruchno, Rachel A.; Brill, Jonathan E.; Shands, Yvonne; Gordon, Judith R.; Genderson, Maureen Wilson; Rose, Miriam; Cartwright, Francine
2008-01-01
Purpose: We contrast characteristics of respondents recruited using convenience strategies with those of respondents recruited by random digit dial (RDD) methods. We compare sample variances, means, and interrelationships among variables generated from the convenience and RDD samples. Design and Methods: Women aged 50 to 64 who work full time and…
John F. Caratti
2006-01-01
The FIREMON Point Intercept (PO) method is used to assess changes in plant species cover or ground cover for a macroplot. This method uses a narrow diameter sampling pole or sampling pins, placed at systematic intervals along line transects to sample within plot variation and quantify statistically valid changes in plant species cover and height over time. Plant...
EVALUATION OF DIOXIN EMISSIONS MONITORING SYSTEMS
Continuous samplers and real or semi-real-time continuous monitors for polychlorinated dibenzodioxins and furans provide significant advantages relative to conventional methods of extractive sampling. Continuous samplers collect long term samples over a time period of days to wee...
Aulenbach, Brent T.
2010-01-01
Bacteria holding-time experiments of up to 62 h were performed on five surface-water samples from four urban stream sites in the vicinity of Atlanta, GA, USA, that had relatively high densities of coliform bacteria (Escherichia coli densities were all well above the US Environmental Protection Agency criterion of 126 colonies per 100 mL for recreational waters). Holding-time experiments were done for fecal coliform using the membrane filtration modified fecal coliform (mFC) agar method and for total coliform and E. coli using the Colilert®-18 Quanti-Tray® method. The precisions of these analytical methods were quantified. Precisions determined for fecal coliform indicated that the upper bound of the ideal range of counts could reasonably be extended upward, which would improve precision. For the Colilert®-18 method, analytical precisions were similar to the theoretical precisions for this method. Fecal and total coliform densities did not change significantly with holding times up to about 27 h. Limited information indicated that fecal coliform densities might be stable for holding times of up to 62 h, whereas total coliform densities might not be stable for holding times greater than about 27 h. E. coli densities were stable for holding times of up to 18 h, a shorter period than indicated by previous studies. These results should be applicable to non-regulatory monitoring sampling designs for similar urban surface-water sample types.
Apparatus and method for detecting full-capture radiation events
Odell, D.M.C.
1994-10-11
An apparatus and method are disclosed for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events. 4 figs.
Apparatus and method for detecting full-capture radiation events
Odell, Daniel M. C.
1994-01-01
An apparatus and method for sampling the output signal of a radiation detector and distinguishing full-capture radiation events from Compton scattering events. The output signal of a radiation detector is continuously sampled. The samples are converted to digital values and input to a discriminator where samples that are representative of events are identified. The discriminator transfers only event samples, that is, samples representing full-capture events and Compton events, to a signal processor where the samples are saved in a three-dimensional count matrix with time (from the time of onset of the pulse) on the first axis, sample pulse current amplitude on the second axis, and number of samples on the third axis. The stored data are analyzed to separate the Compton events from full-capture events, and the energy of the full-capture events is determined without having determined the energies of any of the individual radiation detector events.
da Costa, Nuno Maçarico; Hepp, Klaus; Martin, Kevan A C
2009-05-30
Synapses can only be morphologically identified by electron microscopy and this is often a very labor-intensive and time-consuming task. When quantitative estimates are required for pathways that contribute a small proportion of synapses to the neuropil, the problems of accurate sampling are particularly severe and the total time required may become prohibitive. Here we present a sampling method devised to count the percentage of rarely occurring synapses in the neuropil using a large sample (approximately 1000 sampling sites), with the strong constraint of doing it in reasonable time. The strategy, which uses the unbiased physical disector technique, resembles that used in particle physics to detect rare events. We validated our method in the primary visual cortex of the cat, where we used biotinylated dextran amine to label thalamic afferents and measured the density of their synapses using the physical disector method. Our results show that we could obtain accurate counts of the labeled synapses, even when they represented only 0.2% of all the synapses in the neuropil.
Validation of PCR methods for quantitation of genetically modified plants in food.
Hübner, P; Waiblinger, H U; Pietsch, K; Brodmann, P
2001-01-01
For enforcement of the recently introduced labeling threshold for genetically modified organisms (GMOs) in food ingredients, quantitative detection methods such as quantitative competitive PCR (QC-PCR) and real-time PCR are applied by official food control laboratories. The experiences of 3 European food control laboratories in validating such methods were compared to describe realistic performance characteristics of quantitative PCR detection methods. The limit of quantitation (LOQ) of GMO-specific real-time PCR was experimentally determined to be 30-50 target molecules, which is close to the theoretical prediction. Starting PCR with 200 ng genomic plant DNA, the LOQ depends primarily on the genome size of the target plant and ranges from 0.02% for rice to 0.7% for wheat. The precision of quantitative PCR detection methods, expressed as relative standard deviation (RSD), varied from 10 to 30%. Using test samples containing Bt176 corn and applying Bt176-specific QC-PCR, mean values deviated from true values by -7 to 18%, with an average of 2 ± 10%. Ruggedness of real-time PCR detection methods was assessed in an interlaboratory study analyzing commercial, homogeneous food samples. Roundup Ready soybean DNA contents were determined in the range of 0.3 to 36%, relative to soybean DNA, with RSDs of about 25%. Taking the precision of quantitative PCR detection methods into account, suitable sample plans and sample sizes for GMO analysis are suggested. Because quantitative GMO detection methods measure GMO contents of samples in relation to reference material (calibrants), high priority must be given to international agreements and standardization on certified reference materials.
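The genome-size dependence of the LOQ noted above follows from simple copy-number arithmetic (a sketch; the genome mass used is an approximate literature value, not a figure from this study):

    \mathrm{LOQ}(\%) \approx 100 \cdot \frac{N_{\mathrm{LOQ}} \cdot m_{1C}}{m_{\mathrm{template}}}

For example, with N_LOQ of about 50 target copies, a rice 1C genome mass of roughly 0.5 pg, and 200 ng of template DNA, the LOQ is about 100 × (50 × 0.5 pg)/(200 ng), i.e., on the order of 0.01-0.02%, in line with the value reported for rice; species with larger genomes, such as wheat, have proportionally higher LOQs.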
ERIC Educational Resources Information Center
Alvero, Alicia M.; Struss, Kristen; Rappaport, Eva
2008-01-01
Partial-interval (PIR), whole-interval (WIR), and momentary time sampling (MTS) estimates were compared against continuous measures of safety performance for three postural behaviors: feet, back, and shoulder position. Twenty-five samples of safety performance across five undergraduate students were scored using a second-by-second continuous…
Nascimento, Paloma Andrade Martins; Barsanelli, Paulo Lopes; Rebellato, Ana Paula; Pallone, Juliana Azevedo Lima; Colnago, Luiz Alberto; Pereira, Fabíola Manhas Verbi
2017-03-01
This study shows the use of time-domain (TD)-NMR transverse relaxation (T2) data and chemometrics in the nondestructive determination of fat content for powdered food samples such as commercial dried milk products. Most proposed NMR spectroscopy methods for measuring fat content correlate free induction decay or echo intensities with the sample's mass. The need for the sample's mass limits the analytical frequency of NMR determination, because weighing the samples is an additional step in this procedure. Therefore, the method proposed here is based on a multivariate model of T2 decay, measured with Carr-Purcell-Meiboom-Gill pulse sequence and reference values of fat content. The TD-NMR spectroscopy method shows high correlation (r = 0.95) with the lipid content, determined by the standard extraction method of Bligh and Dyer. For comparison, fat content determination was also performed using a multivariate model with near-IR (NIR) spectroscopy, which is also a nondestructive method. The advantages of the proposed TD-NMR method are that it (1) minimizes toxic residue generation, (2) performs measurements with high analytical frequency (a few seconds per analysis), and (3) does not require sample preparation (such as pelleting, needed for NIR spectroscopy analyses) or weighing the samples.
Method for determining the concentration of atomic species in gases and solids
Loge, G.W.
1998-02-03
A method is described for determining the concentration of atomic species in gases and solids. The method involves measuring at least two emission intensities from a species in a sample that is excited by incident laser radiation, which generates a plasma therein; after a sufficient time period has elapsed, an instantaneous temperature becomes established within the sample during a second time period. The concentration of the atomic species to be determined is then derived from the known emission intensity of a predetermined concentration of that species in the sample at the measured temperature (a quantity measured prior to the determination of the unknown concentration) and the actual measured emission from the unknown species, or from this latter emission and the emission intensity of a species having a known concentration within the sample, such as nitrogen for gaseous air samples. 4 figs.
High-speed fixed-target serial virus crystallography
Roedig, Philip; Ginn, Helen M.; Pakendorf, Tim; ...
2017-06-19
Here, we report a method for serial X-ray crystallography at X-ray free-electron lasers (XFELs), which allows for full use of the current 120-Hz repetition rate of the Linac Coherent Light Source (LCLS). Using a micropatterned silicon chip in combination with the high-speed Roadrunner goniometer for sample delivery, we were able to determine the crystal structures of the picornavirus bovine enterovirus 2 (BEV2) and the cytoplasmic polyhedrosis virus type 18 polyhedrin, with total data collection times of less than 14 and 10 min, respectively. Our method requires only micrograms of sample and should therefore broaden the applicability of serial femtosecond crystallography to challenging projects for which only limited sample amounts are available. By synchronizing the sample exchange to the XFEL repetition rate, our method allows for most efficient use of the limited beam time available at XFELs and should enable a substantial increase in sample throughput at these facilities.
Development of a Methodology for Assessing Aircrew Workloads.
1981-11-01
Table-of-contents fragments: Workload Feasibility Study; Subjects; Equipment; Data Analysis. Keywords: analysis; simulation; standard time systems; switching synthetic time systems; task activities; task interference; time study; tracking; workload; work sampling. Conventional methods, including standard data systems, information content analysis, work sampling and job evaluation, were found to be deficient in accounting…
Improvement of sampling plans for Salmonella detection in pooled table eggs by use of real-time PCR.
Pasquali, Frédérique; De Cesare, Alessandra; Valero, Antonio; Olsen, John Emerdhal; Manfreda, Gerardo
2014-08-01
Eggs and egg products have been described as the most critical food vehicles of salmonellosis. The prevalence and level of contamination of Salmonella on table eggs are low, which severely affects the sensitivity of the sampling plans applied voluntarily in some European countries, where one to five pools of 10 eggs are tested by the culture-based reference method ISO 6579:2004. In the current study we compared the testing sensitivity of the reference culture method ISO 6579:2004 and an alternative real-time PCR method on Salmonella-contaminated egg pools of different sizes (4-9 uninfected eggs mixed with one contaminated egg) and contamination levels (10⁰-10¹, 10¹-10², 10²-10³ CFU/eggshell). Two hundred and seventy samples, corresponding to 15 replicates per pool size and inoculum level, were tested. At the lowest contamination level, real-time PCR detected Salmonella in 40% of contaminated pools vs 12% using ISO 6579. The results were used in a Monte Carlo simulation to estimate the lowest number of sample units that must be tested in order to have 95% certainty of not falsely accepting a contaminated lot. According to this simulation, at least 16 pools of 10 eggs each must be tested by ISO 6579 to obtain this confidence level, whereas the minimum number of pools to be tested was reduced to 8 pools of 9 eggs each when real-time PCR was applied as the analytical method. This result underlines the importance of including analytical methods with higher sensitivity in order to improve the efficiency of sampling and reduce the number of samples to be tested. Copyright © 2013 Elsevier B.V. All rights reserved.
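The step from per-pool sensitivity to a minimum number of pools can be illustrated with a toy simulation. The sketch below is not the authors' Monte Carlo model, which also accounts for pool size, prevalence and contamination level; the per-pool sensitivities are assumptions loosely based on the reported 40% and 12% detection rates at the lowest contamination level, so the resulting pool counts will differ from the paper's 8 and 16.

```python
import numpy as np

rng = np.random.default_rng(0)

def min_pools_needed(p_detect, confidence=0.95, max_pools=60, n_sim=50_000):
    """Smallest n such that at least one of n contaminated pools tests positive
    with probability >= confidence, each pool detected independently."""
    for n in range(1, max_pools + 1):
        positives = rng.random((n_sim, n)) < p_detect
        if positives.any(axis=1).mean() >= confidence:
            return n
    return None

print(min_pools_needed(0.40))   # assumed per-pool sensitivity, PCR-like
print(min_pools_needed(0.12))   # assumed per-pool sensitivity, culture-like
```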
NASA Astrophysics Data System (ADS)
Sabatini, Francesca; Lluveras-Tenorio, Anna; Degano, Ilaria; Kuckova, Stepanka; Krizova, Iva; Colombini, Maria Perla
2016-11-01
This study deals with the identification of anthraquinoid molecular markers in standard dyes, reference lakes, and paint model systems using a micro-invasive and nondestructive technique, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-ToF-MS). Red anthraquinoid lakes, such as madder lake, carmine lake, and Indian lac, have been the most widely used for painting purposes since ancient times. From an analytical point of view, identifying lakes in paint samples is challenging, and developing methods that maximize the information achievable while minimizing the amount of sample needed is of paramount importance. The employed method was tested on less than 0.5 mg of reference samples and required minimal sample preparation, entailing a hydrofluoric acid extraction. The method is fast and versatile because of the possibility to re-analyze the same sample (once it has been spotted on the steel plate), testing both positive and negative modes in a few minutes. The MALDI mass spectra collected in the two analysis modes were studied and compared with LDI and simulated mass spectra in order to highlight the peculiar behavior of the anthraquinones in the MALDI process. Both ionization modes were assessed for each species. The effect of the different paint binders on dye identification was also evaluated through the analyses of paint model systems. In the end, the method was successful in detecting madder lake in archeological samples from Greek wall paintings and on an Italian funerary clay vessel, demonstrating its capability to identify dyes in small amounts of highly degraded samples.
NASA Astrophysics Data System (ADS)
He, Jiayi; Shang, Pengjian; Xiong, Hui
2018-06-01
Stocks, as the concrete manifestation of financial time series with plenty of potential information, are often used in the study of financial time series. In this paper, we utilize stock data to recognize their patterns through the dissimilarity matrix based on modified cross-sample entropy, and three-dimensional perceptual maps of the results are then provided through the multidimensional scaling method. Two modified multidimensional scaling methods are proposed in this paper, namely multidimensional scaling based on Kronecker-delta cross-sample entropy (MDS-KCSE) and multidimensional scaling based on permutation cross-sample entropy (MDS-PCSE). These two methods use Kronecker-delta-based cross-sample entropy and permutation-based cross-sample entropy to replace the distance or dissimilarity measurement in classical multidimensional scaling (MDS). Multidimensional scaling based on Chebyshev distance (MDSC) is employed to provide a reference for comparisons. Our analysis reveals clear clustering both in synthetic data and in 18 indices from diverse stock markets. It implies that time series generated by the same model are more likely to have similar irregularity than others, and that differences between stock indices, which are caused by the country or region and by different financial policies, are reflected in the irregularity of the data. In the synthetic data experiments, not only can time series generated by different models be distinguished, but series generated under different parameters of the same model can also be detected. In the financial data experiment, the stock indices are clearly divided into five groups. Through analysis, we find that they correspond to five regions: Europe, North America, South America, Asia-Pacific (excluding mainland China), and mainland China and Russia. The results also demonstrate that MDS-KCSE and MDS-PCSE provide more effective divisions in experiments than MDSC.
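A minimal sketch of the final mapping step, assuming the dissimilarity matrix has already been computed: multidimensional scaling applied to a precomputed dissimilarity. Here the Chebyshev distance (the paper's MDSC reference) stands in for the Kronecker-delta and permutation cross-sample entropy dissimilarities, and the six synthetic random walks are placeholders for stock-index series.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(1)

# Six placeholder "index" series: three from each of two generating models.
series = np.vstack([np.cumsum(rng.normal(size=500)) for _ in range(3)] +
                   [np.cumsum(rng.laplace(size=500)) for _ in range(3)])

# Pairwise Chebyshev distances play the role of the dissimilarity matrix.
D = squareform(pdist(series, metric="chebyshev"))

coords = MDS(n_components=3, dissimilarity="precomputed",
             random_state=0).fit_transform(D)
print(coords.round(2))   # coordinates for a three-dimensional perceptual map
```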
Hu, Zheng; Lin, Jun; Chen, Zhong-Sheng; Yang, Yong-Min; Li, Xue-Jun
2015-01-22
High-speed blades are often prone to fatigue due to severe blade vibrations. In particular, synchronous vibrations can cause irreversible damage to the blade. Blade tip-timing (BTT) methods have become a promising way to monitor blade vibrations. However, synchronous vibrations cannot be adequately monitored with uniform BTT sampling. Therefore, non-equally mounted probes have been used, which results in non-uniformity of the sampled signal. Since under-sampling is an intrinsic drawback of BTT methods, how to analyze non-uniformly under-sampled BTT signals is a big challenge. In this paper, a novel reconstruction method for non-uniformly under-sampled BTT data is presented. The method is based on the periodically non-uniform sampling theorem. Firstly, a mathematical model of the non-uniform BTT sampling process is built. It can be treated as the sum of certain uniform sample streams. For each stream, an interpolating function is required to prevent aliasing in the reconstructed signal. Secondly, simultaneous equations of all interpolating functions in each sub-band are built, and corresponding solutions are ultimately derived to remove unwanted replicas of the original signal caused by the sampling, which may overlay the original signal. Finally, numerical simulations and experiments are carried out to validate the feasibility of the proposed method. The results demonstrate that the accuracy of the reconstructed signal depends on the sampling frequency, the blade vibration frequency, the blade vibration bandwidth, the probe static offset and the number of samples. In practice, both types of blade vibration signals can be reconstructed from non-uniform BTT data acquired from only two probes.
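A much simpler baseline than the paper's sub-band reconstruction is a least-squares sinusoid fit at an assumed known engine-order frequency, which already shows how amplitude and phase of a synchronous vibration can be recovered from non-uniformly spaced probe samples. All numerical values below (rotor speed, probe offsets, vibration parameters) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

f_vib, amp, phase = 123.0, 0.8, 0.6      # assumed synchronous vibration
rotor_period = 1 / 25.0                  # assumed rotor speed: 25 rev/s

# Two probes at non-equal angular positions give non-uniform sample times.
probe_offsets = np.array([0.0, 0.137]) * rotor_period
t = np.sort(np.concatenate([n * rotor_period + probe_offsets
                            for n in range(200)]))
x = amp * np.sin(2 * np.pi * f_vib * t + phase) + 0.05 * rng.normal(size=t.size)

# Least-squares fit of x(t) ~ a*sin(2*pi*f*t) + b*cos(2*pi*f*t) at known f.
M = np.column_stack([np.sin(2 * np.pi * f_vib * t),
                     np.cos(2 * np.pi * f_vib * t)])
a, b = np.linalg.lstsq(M, x, rcond=None)[0]
print(np.hypot(a, b), np.arctan2(b, a))  # recovered amplitude and phase
```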
Capillary microextraction: A new method for sampling methamphetamine vapour.
Nair, M V; Miskelly, G M
2016-11-01
Clandestine laboratories pose a serious health risk to first responders, investigators, decontamination companies, and the public who may be inadvertently exposed to methamphetamine and other chemicals used in its manufacture. Therefore there is an urgent need for reliable methods to detect and measure methamphetamine at such sites. The most common method for determining methamphetamine contamination at former clandestine laboratory sites is selected surface wipe sampling, followed by analysis with gas chromatography-mass spectrometry (GC-MS). We are investigating the use of sampling for methamphetamine vapour to complement such wipe sampling. In this study, we report the use of capillary microextraction (CME) devices for sampling airborne methamphetamine, and compare their sampling efficiency with a previously reported dynamic SPME method. The CME devices consisted of PDMS-coated glass filter strips inside a glass tube. The devices were used to dynamically sample methamphetamine vapour in the range of 0.42-4.2 μg m⁻³, generated by a custom-built vapour dosing system, for 1-15 min, and methamphetamine was analysed using a GC-MS fitted with a ChromatoProbe thermal desorption unit. The devices showed good reproducibility (RSD < 15%), and a curvilinear pre-equilibrium relationship between sampling times and peak area, which can be utilised for calibration. Under identical sampling conditions, the CME devices were approximately 30 times more sensitive than the dynamic SPME method. The CME devices could be stored for up to 3 days after sampling prior to analysis. Consecutive sampling of methamphetamine and its isotopic substitute, d-9 methamphetamine, showed no competitive displacement. This suggests that CME devices, pre-loaded with an internal standard, could be a feasible method for sampling airborne methamphetamine at former clandestine laboratories. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Spatial-dependence recurrence sample entropy
NASA Astrophysics Data System (ADS)
Pham, Tuan D.; Yan, Hong
2018-03-01
Measuring complexity in terms of the predictability of time series is a major area of research in science and engineering, and its applications are spreading throughout many scientific disciplines, where the analysis of physiological signals is perhaps the most widely reported in the literature. Sample entropy is a popular measure for quantifying signal irregularity. However, the sample entropy does not take sequential information, which is inherently useful, into its calculation of sample similarity. Here, we develop a method that is based on the mathematical principle of the sample entropy and enables the capture of sequential information of a time series in the context of spatial dependence provided by the binary-level co-occurrence matrix of a recurrence plot. Experimental results on time-series data of the Lorenz system, physiological signals of gait maturation in healthy children, and gait dynamics in Huntington's disease show the potential of the proposed method.
Multivariate survivorship analysis using two cross-sectional samples.
Hill, M E
1999-11-01
As an alternative to survival analysis with longitudinal data, I introduce a method that can be applied when one observes the same cohort in two cross-sectional samples collected at different points in time. The method allows for the estimation of log-probability survivorship models that estimate the influence of multiple time-invariant factors on survival over a time interval separating two samples. This approach can be used whenever the survival process can be adequately conceptualized as an irreversible single-decrement process (e.g., mortality, the transition to first marriage among a cohort of never-married individuals). Using data from the Integrated Public Use Microdata Series (Ruggles and Sobek 1997), I illustrate the multivariate method through an investigation of the effects of race, parity, and educational attainment on the survival of older women in the United States.
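The core of the approach can be sketched with invented numbers: observe the same birth cohort in two cross-sections, form the survivorship ratio for each covariate group, and regress its logarithm on the covariates. The counts and the schooling coding below are hypothetical, and ordinary least squares on log proportions stands in for the full log-probability model.

```python
import numpy as np

# Hypothetical counts for one birth cohort, by schooling, in two censuses.
count_t1 = np.array([5200.0, 4300.0, 2100.0])   # <HS, HS, College at time 1
count_t2 = np.array([3900.0, 3600.0, 1850.0])   # same cohort at time 2
years_of_schooling = np.array([10.0, 12.0, 16.0])

survivorship = count_t2 / count_t1               # survival over the interval

# Log-probability survivorship model: log S = b0 + b1 * schooling
X = np.column_stack([np.ones_like(years_of_schooling), years_of_schooling])
beta, *_ = np.linalg.lstsq(X, np.log(survivorship), rcond=None)
print(beta)   # intercept and effect of schooling on log survival probability
```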
Song, Yuqiao; Liao, Jie; Dong, Junxing; Chen, Li
2015-09-01
The seeds of grapevine (Vitis vinifera) are a byproduct of wine production. To examine the potential value of grape seeds, grape seeds from seven sources were subjected to fingerprinting using direct analysis in real time coupled with time-of-flight mass spectrometry combined with chemometrics. Firstly, we listed all reported components (56 components) from grape seeds and calculated the precise m/z values of the deprotonated ions [M-H]⁻. Secondly, the experimental conditions were systematically optimized based on the peak areas of total ion chromatograms of the samples. Thirdly, the seven grape seed samples were examined using the optimized method. Information about 20 grape seed components was utilized to represent characteristic fingerprints. Finally, hierarchical clustering analysis and principal component analysis were performed to analyze the data. Grape seeds from seven different sources were classified into two clusters; hierarchical clustering analysis and principal component analysis yielded similar results. The results of this study lay the foundation for appropriate utilization and exploitation of grape seed samples. Due to the absence of complicated sample preparation methods and chromatographic separation, the method developed in this study represents one of the simplest and least time-consuming methods for grape seed fingerprinting. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
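A minimal sketch of the chemometric step, assuming a fingerprint matrix of peak areas has already been extracted from the DART-ToF-MS spectra: principal component analysis for the perceptual view and Ward hierarchical clustering for the grouping. The fingerprint values below are synthetic placeholders, not measured peak areas.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(3)

# Placeholder fingerprints: 7 seed samples x 20 marker-ion areas.
fingerprints = np.vstack([rng.normal(0.0, 1.0, 20) for _ in range(4)] +
                         [rng.normal(2.0, 1.0, 20) for _ in range(3)])

scores = PCA(n_components=2).fit_transform(fingerprints)
clusters = fcluster(linkage(fingerprints, method="ward"),
                    t=2, criterion="maxclust")
print(scores.round(2))
print(clusters)   # two clusters, as reported for the seven seed sources
```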
Volatile organic compounds: sampling methods and their worldwide profile in ambient air.
Kumar, Anuj; Víden, Ivan
2007-08-01
The atmosphere is a particularly difficult analytical system because of the very low levels of substances to be analysed, sharp variations in pollutant levels with time and location, differences in wind, temperature and humidity. This makes the selection of an efficient sampling technique for air analysis a key step to reliable results. Generally, methods for volatile organic compounds sampling include collection of the whole air or preconcentration of samples on adsorbents. All the methods vary from each other according to the sampling technique, type of sorbent, method of extraction and identification technique. In this review paper we discuss various important aspects for sampling of volatile organic compounds by the widely used and advanced sampling methods. Characteristics of various adsorbents used for VOCs sampling are also described. Furthermore, this paper makes an effort to comprehensively review the concentration levels of volatile organic compounds along with the methodology used for analysis, in major cities of the world.
Sung, Heungsup; Yong, Dongeun; Ki, Chang Seok; Kim, Jae Seok; Seong, Moon Woo; Lee, Hyukmin; Kim, Mi Na
2016-09-01
Real-time reverse transcription PCR (rRT-PCR) of sputum samples is commonly used to diagnose Middle East respiratory syndrome coronavirus (MERS-CoV) infection. Owing to the difficulty of extracting RNA from sputum containing mucus, sputum homogenization is desirable prior to nucleic acid isolation. We determined optimal homogenization methods for isolating viral nucleic acids from sputum. We evaluated the following three sputum-homogenization methods: proteinase K and DNase I (PK-DNase) treatment, phosphate-buffered saline (PBS) treatment, and N-acetyl-L-cysteine and sodium citrate (NALC) treatment. Sputum samples were spiked with inactivated MERS-CoV culture isolates. RNA was extracted from pretreated, spiked samples using the easyMAG system (bioMérieux, France). Extracted RNAs were then subjected to rRT-PCR for MERS-CoV diagnosis (DiaPlex Q MERS-coronavirus, SolGent, Korea). While analyzing 15 spiked sputum samples prepared in technical duplicate, false-negative results were obtained with five (16.7%) and four samples (13.3%), respectively, by using the PBS and NALC methods. The range of threshold cycle (Ct) values observed when detecting upE in sputum samples was 31.1-35.4 with the PK-DNase method, 34.7-39.0 with the PBS method, and 33.9-38.6 with the NALC method. Compared with the control, which was prepared by adding a one-tenth volume of 1:1,000 diluted viral culture to PBS solution, the ranges of Ct values obtained by the PBS and NALC methods differed significantly from the mean control Ct of 33.2 (both P<0.0001). The PK-DNase method is suitable for homogenizing sputum samples prior to RNA extraction.
Rashed-Ul Islam, S M; Jahan, Munira; Tabassum, Shahina
2015-01-01
Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples, consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals, were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected in 81% of samples by the one-step PCR method, with a median HBV DNA viral load (VL) of 7.50 × 10³ IU/ml. In contrast, 72% of samples were detected by the two-step PCR system, with a median HBV DNA of 3.71 × 10³ IU/ml. The one-step method showed strong linear correlation with the two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement on a Bland-Altman plot, with a mean difference of 0.61 log₁₀ IU/ml and limits of agreement of -1.82 to 3.03 log₁₀ IU/ml. The intra-assay and interassay coefficients of variation (CV%) of plasma samples (4-7 log₁₀ IU/ml) for the one-step PCR method ranged between 0.33 to 0.59 and 0.28 to 0.48, respectively, thus demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource-limited setting. Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15.
Holt, Martin; de Wit, John; Brown, Graham; Maycock, Bruce; Fairley, Christopher; Prestage, Garrett
2014-01-01
Background Behavioural surveillance and research among gay and other men who have sex with men (GMSM) commonly relies on non-random recruitment approaches. Methodological challenges limit their ability to accurately represent the population of adult GMSM. We compared the social and behavioural profiles of GMSM recruited via venue-based, online, and respondent-driven sampling (RDS) and discussed their utility for behavioural surveillance. Methods Data from four studies were selected to reflect each recruitment method. We compared demographic characteristics and the prevalence of key indicators, including sexual and HIV testing practices, obtained from samples recruited through different methods, and population estimates from respondent-driven sampling partition analysis. Results Overall, the socio-demographic profile of GMSM was similar across samples, with some differences observed in age and sexual identification. Men recruited through time-location sampling appeared more connected to the gay community and reported a greater number of sexual partners, but engaged in less unprotected anal intercourse with regular (UAIR) or casual partners (UAIC). The RDS sample overestimated the proportion of HIV-positive men and appeared to recruit men with an overall higher number of sexual partners. A single-website survey recruited a sample whose characteristics differed considerably from the population estimates with regard to age, ethnic diversity and behaviour. Data acquired through time-location sampling underestimated the rates of UAIR and UAIC, while RDS and online sampling both generated samples that underestimated UAIR. Simulated composite samples combining recruits from time-location and multi-website online sampling may produce characteristics more consistent with the population estimates, particularly with regard to sexual practices. Conclusion Respondent-driven sampling produced the sample that was most consistent with population estimates, but this methodology is complex and logistically demanding. Time-location and online recruitment are more cost-effective and easier to implement; using these approaches in combination may offer the potential to recruit a more representative sample of GMSM. PMID:25409440
Sub-microsecond-resolution probe microscopy
Ginger, David; Giridharagopal, Rajiv; Moore, David; Rayermann, Glennis; Reid, Obadiah
2014-04-01
Methods and apparatus are provided herein for time-resolved analysis of the effect of a perturbation (e.g., a light or voltage pulse) on a sample. By operating in the time domain, the provided method enables sub-microsecond time-resolved measurement of transient, or time-varying, forces acting on a cantilever.
Sabrina, Rabehi; Mossadak, Hamdi Taha; Bakir, Mamache; Asma, Meghezzi; Khaoula, Boushaba
2018-01-01
Aim: The aim of this study was to detect Brucella spp. DNA in milk samples collected from seronegative cows using a real-time polymerase chain reaction (PCR) assay, for diagnosis of brucellosis in seronegative dairy cows to prevent transmission of the disease to humans and to reduce economic losses in animal production. Materials and Methods: In this study, 65 milk samples were investigated for the detection of Brucella spp. The detection of the IS711 gene in all samples was done by a real-time PCR assay using the comparative cycle threshold method. Results: The results show that of the 65 DNA samples tested, 2 (3.08%) were positive for Brucella infection. The mean cycle threshold values of the IS711 real-time PCR test were 37.97 and 40.48, indicating a positive reaction. Conclusion: The results of the present study indicated that real-time PCR appears to offer several advantages over serological tests. For this reason, real-time PCR should be validated on representative numbers of Brucella-infected and Brucella-free samples before being implemented in routine diagnosis of human and animal brucellosis for controlling this disease. PMID:29657430
Zhu, Xiaoqiang; Huang, Zhengxu; Gao, Wei; Li, Xue; Li, Lei; Zhu, Hui; Mo, Ting; Huang, Bao; Zhou, Zhen
2016-07-13
The eutrophication of surface water sources and climate change have resulted in an annual explosion of cyanobacterial blooms in many irrigation and drinking water resources. To decrease health risks to the public, a rapid real-time method for the synchronous determination of two usually harmful microcystins (MC-RR and MC-LR) in environmental water samples was developed by employing a paper spray ionization method coupled with a time-of-flight mass spectrometer system. With this approach, direct analysis of microcystin mixtures without sample preparation has been achieved. Rapid detection was performed, simulating the release process of microcystins in reservoir water samples, with a routine detection frequency of every three minutes. The identification time for microcystins was reduced from several hours to a few minutes. The limit of detection is 1 μg/L, and the limit of quantitation is 3 μg/L. This method is capable of rapid, direct, and high-throughput determination of microcystins, and it should be of significant interest for environmental and food safety applications.
Church, Deirdre L; Ambasta, Anshula; Wilmer, Amanda; Williscroft, Holly; Ritchie, Gordon; Pillai, Dylan R; Champagne, Sylvie; Gregson, Daniel G
2015-01-01
BACKGROUND: Pneumocystis jirovecii (PJ), a pathogenic fungus, causes severe interstitial Pneumocystis pneumonia (PCP) among immunocompromised patients. A laboratory-developed real-time polymerase chain reaction (PCR) assay was validated for PJ detection to improve diagnosis of PCP. METHODS: Forty stored bronchoalveolar lavage (BAL) samples (20 known PJ positive [PJ+] and 20 known PJ negative [PJ−]) were initially tested using the molecular assay. Ninety-two sequentially collected BAL samples were then analyzed using an immunofluorescence assay (IFA) and secondarily tested using the PJ real-time PCR assay. Discrepant results were resolved by retesting BAL samples using another real-time PCR assay with a different target. PJ real-time PCR assay performance was compared with the existing gold standard (ie, IFA) and a modified gold standard, in which a true positive was defined as a sample that tested positive in two of three methods in a patient suspected to have PCP. RESULTS: Ninety of 132 (68%) BAL fluid samples were collected from immunocompromised patients. Thirteen of 92 (14%) BAL samples collected were PJ+ when tested using IFA. A total of 40 BAL samples were PJ+ in the present study, including all IFA-positive samples (n=13), all referred PJ+ BAL samples (n=20), and seven additional BAL samples that were IFA negative but positive using the modified gold standard. Compared with IFA, the PJ real-time PCR had sensitivity, specificity, and positive and negative predictive values of 100%, 91%, 65% and 100%, respectively. Compared with the modified gold standard, PJ real-time PCR had sensitivity, specificity, and positive and negative predictive values of 100%. CONCLUSION: PJ real-time PCR improved detection of PJ in immunocompromised patients. PMID:26600815
Evaluating the efficiency of environmental monitoring programs
Levine, Carrie R.; Yanai, Ruth D.; Lampman, Gregory G.; Burns, Douglas A.; Driscoll, Charles T.; Lawrence, Gregory B.; Lynch, Jason; Schoch, Nina
2014-01-01
Statistical uncertainty analyses can be used to improve the efficiency of environmental monitoring, allowing sampling designs to maximize information gained relative to resources required for data collection and analysis. In this paper, we illustrate four methods of data analysis appropriate to four types of environmental monitoring designs. To analyze a long-term record from a single site, we applied a general linear model to weekly stream chemistry data at Biscuit Brook, NY, to simulate the effects of reducing sampling effort and to evaluate statistical confidence in the detection of change over time. To illustrate a detectable difference analysis, we analyzed a one-time survey of mercury concentrations in loon tissues in lakes in the Adirondack Park, NY, demonstrating the effects of sampling intensity on statistical power and the selection of a resampling interval. To illustrate a bootstrapping method, we analyzed the plot-level sampling intensity of forest inventory at the Hubbard Brook Experimental Forest, NH, to quantify the sampling regime needed to achieve a desired confidence interval. Finally, to analyze time-series data from multiple sites, we assessed the number of lakes and the number of samples per year needed to monitor change over time in Adirondack lake chemistry using a repeated-measures mixed-effects model. Evaluations of time series and synoptic long-term monitoring data can help determine whether sampling should be re-allocated in space or time to optimize the use of financial and human resources.
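The bootstrapping idea described for the forest-inventory case can be sketched in a few lines: resample the observed plot-level values at a candidate sampling intensity and track how the confidence-interval width shrinks as more plots are retained. The data distribution and sample sizes below are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Placeholder plot-level inventory values, e.g. biomass in Mg/ha (assumed).
plots = rng.lognormal(mean=5.0, sigma=0.4, size=400)

def ci_halfwidth(n, n_boot=2000):
    """Bootstrap 95% confidence-interval half-width of the mean with n plots."""
    means = [rng.choice(plots, size=n, replace=True).mean()
             for _ in range(n_boot)]
    lo, hi = np.percentile(means, [2.5, 97.5])
    return (hi - lo) / 2

for n in (25, 50, 100, 200):
    print(n, round(ci_halfwidth(n), 1))
```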
A New Automated Method and Sample Data Flow for Analysis of Volatile Nitrosamines in Human Urine*
Hodgson, James A.; Seyler, Tiffany H.; McGahee, Ernest; Arnstein, Stephen; Wang, Lanqing
2016-01-01
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and sidestream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85%-111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle. PMID:26949569
NASA Technical Reports Server (NTRS)
Roman, Monserrate C.; Jones, Kathy U.; Oubre, Cherie M.; Castro, Victoria; Ott, Mark C.; Birmele, Michele; Venkateswaran, Kasthuri J.; Vaishampayan, Parag A.
2013-01-01
Current methods for microbial detection: a) labor- and time-intensive cultivation-based approaches that can fail to detect or characterize all cells present; b) require collection of samples on orbit and transportation back to the ground for analysis. Disadvantages of current detection methods: a) unable to perform quick and reliable detection on orbit; b) lengthy sampling intervals; c) no microbe identification.
Kim, Min-Kyu; Hong, Seong-Kwan; Kwon, Oh-Kyong
2015-01-01
This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4-bit after the first 12-bit A/D conversion, reducing noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform complex calculations for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated using a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB. PMID:26712765
Sample entropy applied to the analysis of synthetic time series and tachograms
NASA Astrophysics Data System (ADS)
Muñoz-Diosdado, A.; Gálvez-Coyt, G. G.; Solís-Montufar, E.
2017-01-01
Entropy is a method of non-linear analysis that allows an estimate of the irregularity of a system; however, there are different types of computational entropy, which were considered and tested in order to obtain one that would give an index of signal complexity taking into account the data length of the analysed time series, the computational resources demanded by the method, and the accuracy of the calculation. An algorithm for the generation of fractal time series with a certain value of β was used for the characterization of the different entropy algorithms. We observed significant variation for most of the algorithms in terms of the series size, which could be counterproductive for the study of real signals of different lengths. The chosen method was sample entropy, which shows great independence of the series size. With this method, time series of heart interbeat intervals, or tachograms, of healthy subjects and patients with congestive heart failure were analysed. The calculation of sample entropy was carried out for 24-hour tachograms and for 6-hour subseries corresponding to sleep and wakefulness. The comparison between the two populations shows a significant difference that is accentuated when the patient is sleeping.
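For reference, the quantity computed on the tachograms is the standard sample entropy SampEn(m, r, N): the negative log of the ratio of template matches of length m+1 to those of length m within a tolerance r. The sketch below is a straightforward, unoptimized implementation with assumed default parameters (m = 2, r = 0.2 times the standard deviation), applied to two synthetic series of very different regularity.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Plain SampEn(m, r, N) with r = r_factor * std(x); self-matches excluded."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    n_templates = len(x) - m

    def match_pairs(length):
        templates = np.array([x[i:i + length] for i in range(n_templates)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return ((d <= r).sum() - n_templates) / 2    # unordered pairs, no self

    return -np.log(match_pairs(m + 1) / match_pairs(m))

rng = np.random.default_rng(5)
irregular = rng.normal(size=1000)                      # white noise
regular = np.sin(np.linspace(0, 20 * np.pi, 1000))     # smooth oscillation
print(sample_entropy(irregular), sample_entropy(regular))
```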
Lin, L-H; Tsai, C-Y; Hung, M-H; Fang, Y-T; Ling, Q-D
2011-09-01
Although routine bacterial culture is the traditional reference standard method for the detection of Salmonella infection in children with diarrhoea, it is a time-consuming procedure that usually only gives results after 3-4 days. Some molecular detection methods can improve the turn-around time to within 24 h, but these methods are not applied directly from stool or rectal swab specimens as routine diagnostic methods for the detection of gastrointestinal pathogens. In this study, we tested the feasibility of a bacterial enrichment culture-based real-time PCR assay method for detecting and screening for diarrhoea in children caused by Salmonella. Our results showed that the minimum real-time PCR assay time required to detect enriched bacterial culture from a swab was 3 h. In all children with suspected Salmonella diarrhoea, the enrichment culture-based real-time PCR achieved 85.4% sensitivity and 98.1% specificity, as compared with the 53.7% sensitivity and 100% specificity of detection with the routine bacterial culture method. We suggest that rectal swab sampling followed by enrichment culture-based real-time PCR is suitable as a rapid method for detecting and screening for Salmonella in paediatric patients. © 2011 The Authors. Clinical Microbiology and Infection © 2011 European Society of Clinical Microbiology and Infectious Diseases.
Method and automated apparatus for detecting coliform organisms
NASA Technical Reports Server (NTRS)
Dill, W. P.; Taylor, R. E.; Jeffers, E. L. (Inventor)
1980-01-01
Method and automated apparatus are disclosed for determining the time of detection of metabolically produced hydrogen by coliform bacteria cultured in an electroanalytical cell, measured from the time the cell is inoculated with the bacteria. The detection time data provide bacteria concentration values. The apparatus is sequenced and controlled by a digital computer to discharge a spent sample, clean and sterilize the culture cell, provide a bacteria nutrient into the cell, control the temperature of the nutrient, inoculate the nutrient with a bacteria sample, measure the electrical potential difference produced by the cell, and measure the time of detection from inoculation.
Fast 2D NMR Spectroscopy for In vivo Monitoring of Bacterial Metabolism in Complex Mixtures.
Dass, Rupashree; Grudziąż, Katarzyna; Ishikawa, Takao; Nowakowski, Michał; Dębowska, Renata; Kazimierczuk, Krzysztof
2017-01-01
The biological toolbox is full of techniques developed originally for analytical chemistry. Among them, spectroscopic experiments are a very important source of atomic-level structural information. Nuclear magnetic resonance (NMR) spectroscopy, although very advanced in chemical and biophysical applications, has been used in microbiology only in a limited manner. So far, mostly one-dimensional ¹H experiments have been reported in studies of bacterial metabolism monitored in situ. However, low spectral resolution and limited information on molecular topology limit the usability of these methods. These problems are particularly evident in the case of complex mixtures, where spectral peaks originating from many compounds overlap and make the interpretation of changes in a spectrum difficult or even impossible. Often a suite of two-dimensional (2D) NMR experiments is used to improve resolution and extract structural information from internuclear correlations. However, for a dynamically changing sample, like a bacterial culture, the time-consuming sampling of the so-called indirect time dimensions in 2D experiments is inefficient. Here, we propose a technique known from analytical chemistry and the structural biology of proteins, i.e., time-resolved non-uniform sampling. The method allows the application of 2D (and multi-D) experiments in the case of quickly varying samples. The indirect dimension here is sparsely sampled, resulting in a significant reduction of experimental time. Compared to the conventional approach based on a series of 1D measurements, this method provides extraordinary resolution and is a real-time approach to process monitoring. In this study, we demonstrate the usability of the method on a sample of Escherichia coli culture affected by ampicillin and on a sample of Propionibacterium acnes, an acne-causing bacterium, mixed with a dose of face tonic, which is a complicated, multi-component mixture providing a complex NMR spectrum. Through our experiments we determine the exact concentration and time at which the anti-bacterial agents affect the bacterial metabolism. We show that it is worthwhile to extend the NMR toolbox for microbiology by including 2D z-TOCSY, for total "fingerprinting" of a sample, and 2D ¹³C-edited HSQC, to monitor changes in the concentration of metabolites in selected metabolic pathways.
40 CFR 60.446 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the web substrate. (b) Method 25 shall be used to determine the VOC concentration, in parts per... equivalent, and each effluent gas stream emitted directly to the atmosphere. Methods 1, 2, 3, and 4 shall be... minimum sampling volume must be 0.003 dscm except that shorter sampling times or smaller volumes, when...
40 CFR 60.446 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... the web substrate. (b) Method 25 shall be used to determine the VOC concentration, in parts per... equivalent, and each effluent gas stream emitted directly to the atmosphere. Methods 1, 2, 3, and 4 shall be... minimum sampling volume must be 0.003 dscm except that shorter sampling times or smaller volumes, when...
John F. Caratti
2006-01-01
The FIREMON Line Intercept (LI) method is used to assess changes in plant species cover for a macroplot. This method uses multiple line transects to sample within plot variation and quantify statistically valid changes in plant species cover and height over time. This method is suited for most forest and rangeland communities, but is especially useful for sampling...
Method and apparatus for measuring the NMR spectrum of an orientationally disordered sample
Pines, Alexander; Samoson, Ago
1990-01-01
An improved NMR probe and method are described which substantially improve the resolution of NMR measurements made on powdered, amorphous, or otherwise orientationally disordered samples. The apparatus mechanically varies the orientation of the sample such that the time average of two or more sets of spherical harmonic functions is zero.
Effectiveness of modified 1-hour air-oven moisture methods for determining popcorn moisture
USDA-ARS?s Scientific Manuscript database
Two of the most commonly used approved grain moisture air-oven reference methods are the air-oven method ASAE S352.2, which requires a long heating time (72 h) for unground samples, and the AACC 44-15.02 air-oven method, which dries a ground sample for 1 h, but there is specific moisture measurement ...
Pilliod, David S.; Goldberg, Caren S.; Arkle, Robert S.; Waits, Lisette P.
2013-01-01
Environmental DNA (eDNA) methods for detecting aquatic species are advancing rapidly, but with little evaluation of field protocols or precision of resulting estimates. We compared sampling results from traditional field methods with eDNA methods for two amphibians in 13 streams in central Idaho, USA. We also evaluated three water collection protocols and the influence of sampling location, time of day, and distance from animals on eDNA concentration in the water. We found no difference in detection or amount of eDNA among water collection protocols. eDNA methods had slightly higher detection rates than traditional field methods, particularly when species occurred at low densities. eDNA concentration was positively related to field-measured density, biomass, and proportion of transects occupied. Precision of eDNA-based abundance estimates increased with the amount of eDNA in the water and the number of replicate subsamples collected. eDNA concentration did not vary significantly with sample location in the stream, time of day, or distance downstream from animals. Our results further advance the implementation of eDNA methods for monitoring aquatic vertebrates in stream habitats.
Lesot, Philippe; Kazimierczuk, Krzysztof; Trébosc, Julien; Amoureux, Jean-Paul; Lafon, Olivier
2015-11-01
Unique information about the atom-level structure and dynamics of solids and mesophases can be obtained by the use of multidimensional nuclear magnetic resonance (NMR) experiments. Nevertheless, the acquisition of these experiments often requires long acquisition times. We review here alternative sampling methods, which have been proposed to circumvent this issue in the case of solids and mesophases. Compared to the spectra of solutions, those of solids and mesophases present some specificities because they usually display lower signal-to-noise ratios, non-Lorentzian line shapes, lower spectral resolutions and wider spectral widths. We highlight herein the advantages and limitations of these alternative sampling methods. A first route to accelerate the acquisition time of multidimensional NMR spectra consists in the use of sparse sampling schemes, such as truncated, radial or random sampling ones. These sparsely sampled datasets are generally processed by reconstruction methods differing from the Discrete Fourier Transform (DFT). A host of non-DFT methods have been applied for solids and mesophases, including the G-matrix Fourier transform, the linear least-square procedures, the covariance transform, the maximum entropy and the compressed sensing. A second class of alternative sampling consists in departing from the Jeener paradigm for multidimensional NMR experiments. These non-Jeener methods include Hadamard spectroscopy as well as spatial or orientational encoding of the evolution frequencies. The increasing number of high field NMR magnets and the development of techniques to enhance NMR sensitivity will contribute to widen the use of these alternative sampling methods for the study of solids and mesophases in the coming years. Copyright © 2015 John Wiley & Sons, Ltd.
Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu
2006-11-01
Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, the Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that the sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to the sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
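The sampling-based screening step mentioned above can be illustrated directly: draw inputs, run the model, and rank inputs by Pearson and Spearman correlation with the output. The toy exposure model and input distributions below are assumptions, not the SHEDS testbed; variance-based measures such as Sobol indices or FAST would need dedicated estimators and are not shown.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

rng = np.random.default_rng(6)
n = 5000

# Hypothetical exposure-model inputs (names and distributions assumed).
air_conc = rng.lognormal(0.0, 0.5, n)      # ambient concentration
time_out = rng.uniform(0.5, 6.0, n)        # hours outdoors per day
inh_rate = rng.normal(1.0, 0.15, n)        # inhalation-rate multiplier

# Toy non-linear exposure model with an interaction between inputs.
exposure = air_conc * time_out ** 1.5 * inh_rate

for name, x in [("air_conc", air_conc), ("time_out", time_out),
                ("inh_rate", inh_rate)]:
    print(f"{name:9s} Pearson {pearsonr(x, exposure)[0]:+.2f} "
          f"Spearman {spearmanr(x, exposure)[0]:+.2f}")
```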
Economic and workflow analysis of a blood bank automated system.
Shin, Kyung-Hwa; Kim, Hyung Hoi; Chang, Chulhun L; Lee, Eun Yup
2013-07-01
This study compared the estimated costs and times required for ABO/Rh(D) typing and unexpected antibody screening using an automated system and manual methods. The total cost included direct and labor costs. Labor costs were calculated on the basis of average operator salaries and unit values (in minutes), the unit value being the hands-on time required to test one sample. To estimate unit values, workflows were recorded on video, and the time required for each process was analyzed separately. The unit values of ABO/Rh(D) typing using the manual method were 5.65 and 8.1 min during regular and unsocial working hours, respectively. The unit value was less than 3.5 min when several samples were tested simultaneously. The unit value for unexpected antibody screening was 2.6 min. The unit values using the automated method for ABO/Rh(D) typing, unexpected antibody screening, and both simultaneously were all 1.5 min. The total cost of ABO/Rh(D) typing of only one sample using the automated analyzer was lower than that of testing only one sample using the manual technique but higher than that of testing several samples simultaneously. The total cost of unexpected antibody screening using an automated analyzer was less than that using the manual method. ABO/Rh(D) typing using an automated analyzer incurs a lower unit value and cost than the manual technique when only one sample is tested at a time. Unexpected antibody screening using an automated analyzer always incurs a lower unit value and cost than the manual technique.
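The cost model here is simple enough to state directly: total cost per sample is the direct cost plus the operator salary per minute multiplied by the unit value. In the sketch below, only the unit values come from the abstract; the salary and direct costs are invented placeholders chosen so that the printed figures mirror the reported ordering (automated cheaper than a manual single sample, but costlier than manual batched typing).

```python
# Monetary figures are assumed placeholders; unit values are from the text.
salary_per_min = 0.45          # operator labour cost per minute (assumed)
direct_manual = 1.20           # consumables per sample, manual (assumed)
direct_auto = 2.30             # consumables per sample, automated (assumed)

def total_cost(direct_cost, unit_value_min):
    """Total cost per sample = direct cost + salary x hands-on time."""
    return direct_cost + salary_per_min * unit_value_min

print(total_cost(direct_manual, 5.65))   # manual ABO/Rh(D), single sample
print(total_cost(direct_manual, 3.5))    # manual, several samples batched
print(total_cost(direct_auto, 1.5))      # automated analyzer
```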
Tomuta, Ioan; Iovanov, Rares; Bodoki, Ede; Vonica, Loredana
2014-04-01
Near-Infrared (NIR) spectroscopy is an important component of a Process Analytical Technology (PAT) toolbox and is a key technology for enabling the rapid analysis of pharmaceutical tablets. The aim of this research work was to develop and validate NIR-chemometric methods for the determination not only of active pharmaceutical ingredient content but also of pharmaceutical properties (crushing strength, disintegration time) of meloxicam tablets. The development of the method for active content assay was performed on samples corresponding to 80%, 90%, 100%, 110% and 120% of the nominal meloxicam content, and the development of the methods for pharmaceutical characterization was performed on samples prepared at seven different compression forces (ranging from 7 to 45 kN), using NIR transmission spectra of intact tablets and PLS as the regression method. The results show that the developed methods have good trueness, precision and accuracy and are appropriate for direct active content assay in tablets (ranging from 12 to 18 mg/tablet) and also for predicting the crushing strength and disintegration time of intact meloxicam tablets. The comparative data show that the proposed methods are in good agreement with the reference methods currently used for the characterization of meloxicam tablets (HPLC-UV methods for the assay and European Pharmacopeia methods for determining the crushing strength and disintegration time).
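The regression step can be sketched with scikit-learn's PLS implementation, assuming a matrix of tablet spectra and reference values is already available; the synthetic spectra and the 5-component, 10-fold cross-validation settings below are placeholders, not the study's calibration design.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(7)

# Placeholder NIR transmission spectra (tablets x wavelengths) and reference
# meloxicam contents (mg/tablet), standing in for measured data.
n_tablets, n_wavelengths = 60, 400
content = rng.uniform(12.0, 18.0, n_tablets)
spectra = (np.outer(content, rng.normal(size=n_wavelengths))
           + rng.normal(scale=0.5, size=(n_tablets, n_wavelengths)))

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, spectra, content, cv=10).ravel()
rmsecv = np.sqrt(np.mean((pred - content) ** 2))
print(round(rmsecv, 3))   # cross-validated prediction error, mg/tablet
```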
Laser-Induced Breakdown Spectroscopy Based Protein Assay for Cereal Samples.
Sezer, Banu; Bilge, Gonca; Boyaci, Ismail Hakki
2016-12-14
Protein content is an important quality parameter in terms of price, nutritional value, and labeling of various cereal samples. However, conventional analysis methods, namely Kjeldahl and Dumas, have major drawbacks such as long analysis times, titration mistakes, and dependence on high-purity carrier gas. For this reason, there is an urgent need for rapid, reliable, and environmentally friendly technologies for protein analysis. The present study aims to develop a new method for protein analysis in wheat flour and whole meal by using laser-induced breakdown spectroscopy (LIBS), which is a multielemental, fast, and simple spectroscopic method. Unlike the Kjeldahl and Dumas methods, it has the potential to analyze a large number of samples in a considerably shorter time. In the study, nitrogen peaks in LIBS spectra of wheat flour and whole meal samples with different protein contents were correlated with the results of the standard Dumas method with the aid of chemometric methods. A calibration graph showed good linearity for protein contents between 7.9 and 20.9% and a coefficient of determination (R²) of 0.992. The limit of detection was calculated as 0.26%. The results indicated that LIBS is a promising and reliable method, with high sensitivity, for routine protein analysis in wheat flour and whole meal samples.
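The univariate calibration behind such a LIBS assay can be sketched as a straight-line fit of nitrogen-line intensity against the Dumas protein value, with a 3-sigma detection limit derived from the residuals. The intensities below are invented; only the protein range reflects the abstract.

```python
import numpy as np

# Hypothetical calibration set: Dumas protein (%) vs nitrogen-line intensity.
protein = np.array([7.9, 10.2, 12.5, 14.8, 17.1, 20.9])
n_line = np.array([0.41, 0.55, 0.68, 0.80, 0.92, 1.12])   # arbitrary units

slope, intercept = np.polyfit(protein, n_line, 1)     # signal vs concentration
fit = slope * protein + intercept
r2 = 1 - ((n_line - fit) ** 2).sum() / ((n_line - n_line.mean()) ** 2).sum()

sigma = (n_line - fit).std(ddof=2)   # residual standard deviation of the fit
lod = 3 * sigma / slope              # 3-sigma detection limit, in % protein
print(round(r2, 3), round(lod, 2))
```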
Random phase detection in multidimensional NMR.
Maciejewski, Mark W; Fenwick, Matthew; Schuyler, Adam D; Stern, Alan S; Gorbatyuk, Vitaliy; Hoch, Jeffrey C
2011-10-04
Despite advances in resolution accompanying the development of high-field superconducting magnets, biomolecular applications of NMR require multiple dimensions in order to resolve individual resonances, and the achievable resolution is typically limited by practical constraints on measuring time. In addition to the need for measuring long evolution times to obtain high resolution, the need to distinguish the sign of the frequency constrains the ability to shorten measuring times. Sign discrimination is typically accomplished by sampling the signal with two different receiver phases or by selecting a reference frequency outside the range of frequencies spanned by the signal and then sampling at a higher rate. In the parametrically sampled (indirect) time dimensions of multidimensional NMR experiments, either method imposes an additional factor of 2 sampling burden for each dimension. We demonstrate that by using a single detector phase at each time sample point, but randomly altering the phase for different points, the sign ambiguity that attends fixed single-phase detection is resolved. Random phase detection enables a reduction in experiment time by a factor of 2 for each indirect dimension, amounting to a factor of 8 for a four-dimensional experiment, albeit at the cost of introducing sampling artifacts. Alternatively, for fixed measuring time, random phase detection can be used to double resolution in each indirect dimension. Random phase detection is complementary to nonuniform sampling methods, and their combination offers the potential for additional benefits. In addition to applications in biomolecular NMR, random phase detection could be useful in magnetic resonance imaging and other signal processing contexts.
Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li
2010-07-01
The identification of differences in protein expression resulting from methodical variations is an essential component to the interpretation of true, biologically significant results. We used the Lowry and Bradford methods- two most commonly used methods for protein quantification, to assess whether differential protein expressions are a result of true biological or methodical variations. MATERIAL #ENTITYSTARTX00026; Differential protein expression patterns was assessed by western blot following protein quantification by the Lowry and Bradford methods. We have observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods, using identical samples. Greater variations in protein concentration readings were observed over time and in samples with higher concentrations, with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on Western blot. We show for the first time that methodical variations observed in these protein assay techniques, can potentially translate into differential protein expression patterns, that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider methodical approaches to protein quantification in techniques that report quantitative differences.
Almutairy, Meznah; Torng, Eric
2018-01-01
Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
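Minimal sketches of the two sampling schemes compared above: fixed sampling keeps every w-th k-mer, while minimizer sampling keeps the lexicographically smallest k-mer in each window of w consecutive k-mers. The tie-breaking rule and window convention are simplifying assumptions rather than the authors' exact implementation.

```python
def fixed_sampling(seq, k, w):
    """Keep every w-th k-mer (positions 0, w, 2w, ...)."""
    return {i: seq[i:i + k] for i in range(0, len(seq) - k + 1, w)}

def minimizer_sampling(seq, k, w):
    """Keep the lexicographically smallest k-mer in every window of w consecutive k-mers."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    sampled = {}
    for start in range(len(kmers) - w + 1):
        j = min(range(start, start + w), key=lambda i: (kmers[i], i))   # leftmost minimum on ties
        sampled[j] = kmers[j]
    return sampled

seq = "ACGTACGTTGCAACGTTGCA"
print(sorted(fixed_sampling(seq, k=4, w=3).items()))
print(sorted(minimizer_sampling(seq, k=4, w=3).items()))
```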
Gradient-free MCMC methods for dynamic causal modelling
Sengupta, Biswa; Friston, Karl J.; Penny, Will D.
2015-03-14
Here, we compare the performance of four gradient-free MCMC samplers (random walk Metropolis sampling, slice-sampling, adaptive MCMC sampling and population-based MCMC sampling with tempering) in terms of the number of independent samples they can produce per unit computational time. For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient compared with the random walk Metropolis sampler or slice-sampling; yet adaptive MCMC sampling is more promising in terms of compute time. Slice-sampling yields the highest number of independent samples from the target density, albeit at an almost 1000% increase in computational time in comparison to the most efficient algorithm (i.e., the adaptive MCMC sampler).
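A compact version of the simplest of the four gradient-free schemes compared above, random walk Metropolis; the Gaussian target and proposal scale are illustrative stand-ins for the neural mass model posterior, not the authors' setup.

```python
import numpy as np

def random_walk_metropolis(log_post, theta0, n_samples, step=0.5, seed=0):
    """Gradient-free random walk Metropolis sampler with a symmetric Gaussian proposal."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    chain, logp = [], log_post(theta)
    for _ in range(n_samples):
        prop = theta + step * rng.normal(size=theta.shape)
        logp_prop = log_post(prop)
        if np.log(rng.uniform()) < logp_prop - logp:      # Metropolis acceptance rule
            theta, logp = prop, logp_prop
        chain.append(theta.copy())
    return np.array(chain)

# Toy 2-D Gaussian "posterior" standing in for a neural mass model inversion
log_post = lambda th: -0.5 * np.sum(th ** 2)
samples = random_walk_metropolis(log_post, theta0=[3.0, -3.0], n_samples=5000)
print(samples.mean(axis=0), samples.std(axis=0))
```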
Elges, Sandra; Arnold, Renate; Liesenfeld, Oliver; Kofla, Grzegorz; Mikolajewska, Agata; Schwartz, Stefan; Uharek, Lutz; Ruhnke, Markus
2017-12-01
We prospectively evaluated a multiplex real-time PCR assay (SeptiFast, SF) in a cohort of patients undergoing allo-BMT in comparison with an in-house PCR method (IH-PCR). Overall, 847 blood samples (mean 8 samples/patient) from 104 patients with haematological malignancies were analysed. The majority of patients had acute leukaemia (62%), with a mean age of 52 years (54% female). Pathogens could be detected in 91 of 847 (11%) samples by SF, compared to 38 of 205 (18.5%) samples by blood culture (BC) and 57 of 847 (6.7%) samples by IH-PCR. Coagulase-negative staphylococci (n=41 in SF, n=29 in BC) were the most frequently detected bacteria, followed by Escherichia coli (n=9 in SF, n=6 in BC). Candida albicans (n=17 in SF, n=0 in BC, n=24 in IH-PCR) was the most frequently detected fungal pathogen. SF gave positive results in 5% of samples during surveillance vs 26% of samples during fever episodes. Overall, the majority of blood samples gave negative results in both PCR methods, resulting in 93% overall agreement, a negative predictive value of 0.96 (95% CI: 0.95-0.97), and a positive predictive value of 0.10 (95% CI: -0.01 to 0.21). SeptiFast appeared to be superior to BC and the IH-PCR method. © 2017 Blackwell Verlag GmbH.
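For readers unfamiliar with the bookkeeping, the sketch below shows how overall agreement and the two predictive values follow from a 2x2 contingency table of assay vs. reference results; the counts are invented for illustration and do not reproduce the study's exact figures.

```python
# Hypothetical 2x2 table: SeptiFast result (rows) vs. reference method result (columns)
tp, fp = 9, 82     # SeptiFast positive: reference positive / reference negative (illustrative counts)
fn, tn = 30, 726   # SeptiFast negative: reference positive / reference negative

total = tp + fp + fn + tn
agreement = (tp + tn) / total      # overall (percent) agreement
ppv = tp / (tp + fp)               # positive predictive value
npv = tn / (tn + fn)               # negative predictive value
print(f"agreement = {agreement:.2f}, PPV = {ppv:.2f}, NPV = {npv:.2f}")
```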
NASA Astrophysics Data System (ADS)
Rufino, Marta M.; Baptista, Paulo; Pereira, Fábio; Gaspar, Miguel B.
2018-01-01
In the current work we propose a new method to sample surface sediment during bivalve fishing surveys. Fishing institutes all around the world carry out regular surveys with the aim of monitoring the stocks of commercial species. These surveys often comprise more than one hundred sampling stations and cover large geographical areas. Although superficial sediment grain sizes are among the main drivers of benthic communities and provide crucial information for studies on coastal dynamics, overall there is a strong lack of this type of data, possibly because traditional surface sediment sampling methods use grabs, which require considerable time and effort to be deployed on a regular basis or over large areas. In light of these aspects, we developed an easy and inexpensive method to sample superficial sediments during bivalve fisheries monitoring surveys, without increasing survey time or human resources. The method was successfully evaluated and validated during a typical bivalve survey carried out on the Northwest coast of Portugal, confirming that it did not interfere with the survey objectives. Furthermore, the method was validated by collecting samples using a traditional Van Veen grab (traditional method), which showed a grain size composition similar to that of the samples collected by the new method at the same localities. We recommend that the procedure be implemented on regular bivalve fishing surveys, together with an image analysis system to analyse the collected samples. The new method will provide a substantial quantity of data on surface sediment in coastal areas in an inexpensive and efficient manner, with high potential for application in different fields of research.
Smith, Brian; Menchaca, Leticia
1999-01-01
A method for determination of ¹⁸O/¹⁶O and ²H/¹H ratios and ³H concentrations of xylem and subsurface waters using time series sampling, insulating sampling chambers, and combined ¹⁸O/¹⁶O, ²H/¹H and ³H concentration data on transpired water. The method involves collecting water samples transpired from living plants and correcting the measured isotopic compositions of oxygen (¹⁸O/¹⁶O) and hydrogen (²H/¹H and/or ³H concentrations) to account for evaporative isotopic fractionation in the leafy material of the plant.
Korfmacher, Walter; Luo, Yongyi; Ho, Stacy; Sun, Wei; Shen, Liduo; Wang, Jie; Wu, Zhongtao; Guo, Yang; Snow, Gregory; O'Shea, Thomas
2015-01-01
Serial sampling methods have been used for rat pharmacokinetic (PK) studies for over 20 years. Currently, it is still common to take 200-250 μL of blood at each timepoint when performing a PK study in rats using serial sampling. While several techniques have been employed for collecting blood samples from rats, there are only limited published data comparing these methods. Recently, microsampling (≤ 50 μL) techniques have been reported as an alternative process for collecting blood samples from rats. In this report, five compounds were dosed orally into rats. For three proprietary compounds, jugular vein cannula (JVC) sampling was used to collect whole blood and plasma samples and capillary microsampling (CMS) was used to collect blood samples from the tail vein of the same animal. For the two other compounds, the marketed drugs fluoxetine and glipizide, JVC sampling was used to collect both whole blood and blood CMS samples, while tail-vein sampling from the same rats was also used to collect both whole blood and blood CMS samples. For the three proprietary compounds, the blood AUC as well as the blood concentration-time profile obtained from the tail vein were different from those obtained via JVC sampling. For fluoxetine, the blood total exposure (AUC) was not statistically different when comparing tail-vein sampling to JVC sampling; however, the blood concentration-time profile obtained from the tail vein was different from the one obtained from JVC sampling. For glipizide, the blood AUC and concentration-time profile were not statistically different when comparing tail-vein sampling to JVC sampling. For both fluoxetine and glipizide, the blood concentration profiles obtained from CMS were equivalent to the blood concentration profiles obtained from standard whole blood sampling collected at the same sampling site. The data in this report provide strong evidence that blood CMS is a valuable small-volume blood sampling approach for rats and that it provides results for test compound concentrations that are equivalent to those obtained from traditional whole blood sampling. The data also suggest that, for some compounds, the concentration-time profile obtained for a test compound based on sampling from a rat tail vein may be different from that obtained from rat JVC sampling. In some cases, this shift in the concentration-time profile will result in different PK parameters for the test compound. Based on these observations, it is recommended that a consistent blood sampling method be used for serial microsampling in discovery rat PK studies when testing multiple new chemical entities. If the rat tail vein sampling method is selected for PK screening, then conducting a bridging study on the lead compound is recommended to confirm that the rat PK obtained from JVC sampling is comparable to that from tail-vein sampling. Copyright © 2015 Elsevier Inc. All rights reserved.
A survey method for characterizing daily life experience: the day reconstruction method.
Kahneman, Daniel; Krueger, Alan B; Schkade, David A; Schwarz, Norbert; Stone, Arthur A
2004-12-03
The Day Reconstruction Method (DRM) assesses how people spend their time and how they experience the various activities and settings of their lives, combining features of time-budget measurement and experience sampling. Participants systematically reconstruct their activities and experiences of the preceding day with procedures designed to reduce recall biases. The DRM's utility is shown by documenting close correspondences between the DRM reports of 909 employed women and established results from experience sampling. An analysis of the hedonic treadmill shows the DRM's potential for well-being research.
Evaluation of a new automated instrument for pretransfusion testing.
Morelati, F; Revelli, N; Maffei, L M; Poretti, M; Santoro, C; Parravicini, A; Rebulla, P; Cole, R; Sirchia, G
1998-10-01
A number of automated devices for pretransfusion testing have recently become available. This study evaluated a fully automated device based on column agglutination technology (AutoVue System, Ortho, Raritan, NJ). Some 6747 tests including forward and reverse ABO group, Rh type and phenotype, antibody screen, autocontrol, and crossmatch were performed on random samples from 1069 blood donors, 2063 patients, and 98 newborns and cord blood. Also tested were samples from 168 immunized patients and 53 donors expressing weak or variant A and D antigens. Test results and technician times required for their performance were compared with those obtained by standard methods (manual column agglutination technology, slide, semiautomatic handler). No erroneous conclusions were found in regard to the 5028 ABO group and Rh type or phenotype determinations carried out with the device. The device rejected 1.53 percent of tests for sample inadequacy. Of the remaining 18 tests with discrepant results found with the device and not confirmed with the standard methods, 6 gave such results because of mixed-field reactions, 10 gave negative results with A2 RBCs in reverse ABO grouping, and 2 gave very weak positive reactions in antibody screening and crossmatching. In the samples from immunized patients, the device missed one weak anti-K, whereas standard methods missed five weak antibodies. In addition, 48, 34, and 31 of the 53 weak or variant antigens were detected by the device, the slide method, and the semiautomated handler, respectively. Technician time with the standard methods was 1.6 to 7 times higher than that with the device. The technical performance of the device compared favorably with that of standard methods, with a number of advantages, including in particular the saving of technician time. Sample inadequacy was the most common cause of discrepancy, which suggests that standardization of sample collection can further improve the performance of the device.
Evaluation of AUC(0-4) predictive methods for cyclosporine in kidney transplant patients.
Aoyama, Takahiko; Matsumoto, Yoshiaki; Shimizu, Makiko; Fukuoka, Masamichi; Kimura, Toshimi; Kokubun, Hideya; Yoshida, Kazunari; Yago, Kazuo
2005-05-01
Cyclosporine (CyA) is the most commonly used immunosuppressive agent in patients who undergo kidney transplantation. Dosage adjustment of CyA is usually based on trough levels. Recently, the area under the concentration-time curve during the first 4 h after CyA administration (AUC(0-4)) has been replacing trough levels. The aim of this study was to compare the predictive values obtained using three different methods of AUC(0-4) monitoring. AUC(0-4) was calculated from 0 to 4 h in early and stable renal transplant patients using the trapezoidal rule. The predicted AUC(0-4) was calculated using three different methods: the multiple regression equation reported by Uchida et al.; Bayesian estimation with modified population pharmacokinetic parameters reported by Yoshida et al.; and modified population pharmacokinetic parameters reported by Cremers et al. The predicted AUC(0-4) was assessed on the basis of predictive bias, precision, and correlation coefficient. The predicted AUC(0-4) values obtained using the three methods with three blood samples showed small differences in predictive bias, precision, and correlation coefficient. When AUC(0-4) was predicted from a single blood sample in stable renal transplant patients, the performance of the regression equation reported by Uchida et al. depended on sampling time. On the other hand, Bayesian estimation with the modified pharmacokinetic parameters reported by Yoshida et al., which is not dependent on sampling time, showed only a small difference in the correlation coefficient when a single blood sample was used. The prediction of AUC(0-4) using a regression equation required accurate sampling time. In this study, the prediction of AUC(0-4) using Bayesian estimation did not require accurate sampling time in the AUC(0-4) monitoring of CyA. Thus Bayesian estimation is assumed to be clinically useful in the dosage adjustment of CyA.
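For reference, the observed AUC(0-4) used as the benchmark above is simply a trapezoidal-rule integral of the measured concentration-time points; the sketch below shows the calculation on hypothetical sampling times and concentrations.

```python
import numpy as np

# Hypothetical whole-blood CyA concentrations (ng/mL) at 0, 1, 2, 3 and 4 h post-dose
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])                # h
c = np.array([180.0, 950.0, 1100.0, 820.0, 560.0])     # ng/mL

# Trapezoidal rule: sum of 0.5 * (c_i + c_{i+1}) * (t_{i+1} - t_i)
auc_0_4 = np.trapz(c, t)
print(f"AUC(0-4) = {auc_0_4:.0f} ng*h/mL")
```

Predictive bias and precision are then computed by comparing each method's predicted AUC(0-4) against this observed value across patients.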
Influence of item distribution pattern and abundance on efficiency of benthic core sampling
Behney, Adam C.; O'Shaughnessy, Ryan; Eichholz, Michael W.; Stafford, Joshua D.
2014-01-01
Core sampling is a commonly used method to estimate benthic item density, but little information exists about factors influencing the accuracy and time-efficiency of this method. We simulated core sampling in a Geographic Information System framework by generating points (benthic items) and polygons (core samplers) to assess how sample size (number of core samples), core sampler size (cm²), distribution of benthic items, and item density affected the bias and precision of estimates of density, the detection probability of items, and the time-costs. When items were distributed randomly versus clumped, bias decreased and precision increased with increasing sample size and increased slightly with increasing core sampler size. Bias and precision were only affected by benthic item density at very low values (500–1,000 items/m²). Detection probability (the probability of capturing ≥ 1 item in a core sample if it is available for sampling) was substantially greater when items were distributed randomly as opposed to clumped. Taking more small-diameter core samples was always more time-efficient than taking fewer large-diameter samples. We are unable to present a single, optimal sample size, but provide information for researchers and managers to derive optimal sample sizes dependent on their research goals and environmental conditions.
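A stripped-down Monte Carlo analogue of the simulation design described above, for randomly distributed items only: drop circular "cores" on a field of random points and examine bias and precision of the density estimate as the number of cores grows. The density, plot size, core area and replicate counts are arbitrary choices, not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(42)
true_density = 800            # items per m^2
plot_side = 5.0               # m, square study plot
core_area = 0.005             # m^2 (roughly an 80 mm diameter core)
core_r = np.sqrt(core_area / np.pi)

def density_estimate(n_cores):
    """Drop n_cores random circular cores and estimate item density (items/m^2)."""
    n_items = rng.poisson(true_density * plot_side ** 2)
    items = rng.uniform(0, plot_side, size=(n_items, 2))
    counts = []
    for _ in range(n_cores):
        centre = rng.uniform(core_r, plot_side - core_r, size=2)
        counts.append(np.sum(np.hypot(*(items - centre).T) <= core_r))
    return np.mean(counts) / core_area

for n_cores in (5, 20, 50):
    est = [density_estimate(n_cores) for _ in range(100)]
    bias = np.mean(est) - true_density
    cv = np.std(est) / np.mean(est)
    print(f"{n_cores:>3} cores: bias = {bias:7.1f} items/m^2, CV = {cv:.2f}")
```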
Heller, Melina; Vitali, Luciano; Oliveira, Marcone Augusto Leal; Costa, Ana Carolina O; Micke, Gustavo Amadeu
2011-07-13
The present study aimed to develop a methodology using capillary electrophoresis for the determination of sinapaldehyde, syringaldehyde, coniferaldehyde, and vanillin in whiskey samples. The main objective was to obtain a screening method to differentiate authentic samples from seized samples suspected of being false, using the phenolic aldehydes as chemical markers. The optimized background electrolyte was composed of 20 mmol L(-1) sodium tetraborate with 10% MeOH at pH 9.3. The study examined two kinds of sample stacking, using a long-end injection mode: normal sample stacking (NSM) and sample stacking with matrix removal (SWMR). In SWMR, the optimized injection time of the samples was 42 s (SWMR42); at this time, no matrix effects were observed. Values of r were >0.99 for both methods. The LOD and LOQ were better than 100 and 330 mg mL(-1) for NSM and better than 22 and 73 mg L(-1) for SWMR. The reliability of the CE-UV method for aldehyde analysis in real samples was compared statistically with an LC-MS/MS methodology, and no significant differences were found between the methodologies at a 95% confidence level.
Methyl-CpG island-associated genome signature tags
Dunn, John J
2014-05-20
Disclosed is a method for analyzing the organismic complexity of a sample through analysis of the nucleic acid in the sample. In the disclosed method, through a series of steps, including digestion with a type II restriction enzyme, ligation of capture adapters and linkers and digestion with a type IIS restriction enzyme, genome signature tags are produced. The sequences of a statistically significant number of the signature tags are determined and the sequences are used to identify and quantify the organisms in the sample. Various embodiments of the invention described herein include methods for using single point genome signature tags to analyze the related families present in a sample, methods for analyzing sequences associated with hyper- and hypo-methylated CpG islands, methods for visualizing organismic complexity change in a sampling location over time and methods for generating the genome signature tag profile of a sample of fragmented DNA.
Operational Evaluation of the Rapid Viability PCR Method for ...
This research work has a significant impact on the use of the RV-PCR method to analyze post-decontamination environmental samples during an anthrax event. The method has shown 98% agreement with the traditional culture-based method. With such success, this method, upon validation, will significantly increase laboratory throughput and capacity to analyze a large number of anthrax event samples in a relatively short time.
Flow injection trace gas analysis method for on-site determination of organoarsenicals
Aldstadt, J.H. III
1997-06-24
A method is described for real-time determination of the concentration of Lewisite in the ambient atmosphere. The method includes separating and collecting a Lewisite sample from the atmosphere in a collection chamber, converting the collected Lewisite to an arsenite ion solution sample, pumping the arsenite-ion-containing sample to an electrochemical detector connected to the collection chamber, and electrochemically detecting the converted arsenite ions in the sample, whereby the concentration of arsenite ions detected is proportional to the concentration of Lewisite in the atmosphere.
Monitoring benthic algal communities: A comparison of targeted and coefficient sampling methods
Edwards, Matthew S.; Tinker, M. Tim
2009-01-01
Choosing an appropriate sample unit is a fundamental decision in the design of ecological studies. While numerous methods have been developed to estimate organism abundance, they differ in cost, accuracy and precision. Using both field data and computer simulation modeling, we evaluated the costs and benefits associated with two methods commonly used to sample benthic organisms in temperate kelp forests. One of these methods, the Targeted Sampling method, relies on different sample units, each "targeted" for a specific species or group of species, while the other method relies on coefficients that represent ranges of bottom cover obtained from visual estimates within standardized sample units. Both the field data and the computer simulations suggest that the two methods yield remarkably similar estimates of organism abundance and among-site variability, although the Coefficient method slightly underestimates variability among sample units when abundances are low. In contrast, the two methods differ considerably in the effort needed to sample these communities; the Targeted Sampling requires more time and twice the personnel to complete. We conclude that the Coefficient Sampling method may be better for environmental monitoring programs where changes in mean abundance are of central concern and resources are limiting, but that the Targeted Sampling method may be better for ecological studies where quantitative relationships among species and small-scale variability in abundance are of central concern.
NASA Astrophysics Data System (ADS)
Max-Moerbeck, W.; Richards, J. L.; Hovatta, T.; Pavlidou, V.; Pearson, T. J.; Readhead, A. C. S.
2014-11-01
We present a practical implementation of a Monte Carlo method to estimate the significance of cross-correlations in unevenly sampled time series of data, whose statistical properties are modelled with a simple power-law power spectral density. This implementation builds on published methods; we introduce a number of improvements in the normalization of the cross-correlation function estimate and a bootstrap method for estimating the significance of the cross-correlations. A closely related matter is the estimation of a model for the light curves, which is critical for the significance estimates. We present a graphical and quantitative demonstration that uses simulations to show how common it is to get high cross-correlations for unrelated light curves with steep power spectral densities. This demonstration highlights the dangers of interpreting them as signs of a physical connection. We show that by using interpolation and the Hanning sampling window function we are able to reduce the effects of red-noise leakage and to recover steep simple power-law power spectral densities. We also introduce the use of a Neyman construction for the estimation of the errors in the power-law index of the power spectral density. This method provides a consistent way to estimate the significance of cross-correlations in unevenly sampled time series of data.
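A simplified sketch of the Monte Carlo idea described above: generate pairs of unrelated red-noise (power-law PSD) light curves, sample them unevenly, interpolate onto a regular grid, and build the null distribution of the peak cross-correlation. The power-law index, sampling pattern and linear interpolation are illustrative assumptions; the full published pipeline (Hanning window, red-noise leakage control, Neyman construction) is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)

def power_law_series(n, beta, dt=1.0):
    """Gaussian time series with power spectral density ~ f^(-beta), via random Fourier phases."""
    freqs = np.fft.rfftfreq(n, dt)
    amp = np.zeros_like(freqs)
    amp[1:] = freqs[1:] ** (-beta / 2)
    spectrum = amp * np.exp(1j * rng.uniform(0, 2 * np.pi, size=freqs.size))
    x = np.fft.irfft(spectrum, n)
    return (x - x.mean()) / x.std()

def peak_ccf(n=1000, n_obs=200, beta=2.5):
    """Peak |cross-correlation| of two unrelated, unevenly sampled red-noise light curves."""
    t_reg = np.arange(n)
    t_obs = np.sort(rng.choice(n, size=n_obs, replace=False))        # uneven sampling times
    a = np.interp(t_reg, t_obs, power_law_series(n, beta)[t_obs])    # linear interpolation
    b = np.interp(t_reg, t_obs, power_law_series(n, beta)[t_obs])
    ccf = np.correlate(a - a.mean(), b - b.mean(), mode="full") / (n * a.std() * b.std())
    return np.max(np.abs(ccf))

null_peaks = np.array([peak_ccf() for _ in range(500)])
print("95th / 99th percentile of peak |CCF| for unrelated curves:",
      np.percentile(null_peaks, [95, 99]).round(2))
```

Even unrelated steep-spectrum curves routinely produce high peak correlations, which is the danger the demonstration above highlights.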
Mull, Bonnie J.; Narayanan, Jothikumar; Hill, Vincent R.
2013-01-01
Primary amebic meningoencephalitis (PAM) is a rare and typically fatal infection caused by the thermophilic free-living ameba, Naegleria fowleri. In 2010, the first confirmed case of PAM acquired in Minnesota highlighted the need for improved detection and quantification methods in order to study the changing ecology of N. fowleri and to evaluate potential risk factors for increased exposure. An immunomagnetic separation (IMS) procedure and real-time PCR TaqMan assay were developed to recover and quantify N. fowleri in water and sediment samples. When one liter of lake water was seeded with N. fowleri strain CDC:V212, the method had an average recovery of 46% and detection limit of 14 amebas per liter of water. The method was then applied to sediment and water samples with unknown N. fowleri concentrations, resulting in positive direct detections by real-time PCR in 3 out of 16 samples and confirmation of N. fowleri culture in 6 of 16 samples. This study has resulted in a new method for detection and quantification of N. fowleri in water and sediment that should be a useful tool to facilitate studies of the physical, chemical, and biological factors associated with the presence and dynamics of N. fowleri in environmental systems. PMID:24228172
Olofsson, Madelen A; Bylund, Dan
2015-10-01
A liquid chromatography with electrospray ionization mass spectrometry method was developed to quantitatively and qualitatively analyze 13 hydroxamate siderophores (ferrichrome, ferrirubin, ferrirhodin, ferrichrysin, ferricrocin, ferrioxamine B, D1 , E and G, neocoprogen I and II, coprogen and triacetylfusarinine C). Samples were preconcentrated on-line by a switch-valve setup prior to analyte separation on a Kinetex C18 column. Gradient elution was performed using a mixture of an ammonium formate buffer and acetonitrile. Total analysis time including column conditioning was 20.5 min. Analytes were fragmented by applying collision-induced dissociation, enabling structural identification by tandem mass spectrometry. Limit of detection values for the selected ion monitoring method ranged from 71 pM to 1.5 nM with corresponding values of two to nine times higher for the multiple reaction monitoring method. The liquid chromatography with mass spectrometry method resulted in a robust and sensitive quantification of hydroxamate siderophores as indicated by retention time stability, linearity, sensitivity, precision and recovery. The analytical error of the methods, assessed through random-order, duplicate analysis of soil samples extracted with a mixture of 10 mM phosphate buffer and methanol, appears negligible in relation to between-sample variations. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Gilbert-López, Bienvenida; García-Reyes, Juan F; Lozano, Ana; Fernández-Alba, Amadeo R; Molina-Díaz, Antonio
2010-09-24
In this work we have evaluated the performance of two sample preparation methodologies for the large-scale multiresidue analysis of pesticides in olives using liquid chromatography-electrospray tandem mass spectrometry (LC-MS/MS). The tested sample treatment methodologies were: (1) liquid-liquid partitioning with acetonitrile followed by dispersive solid-phase extraction clean-up using GCB, PSA and C18 sorbents (QuEChERS method, modified for fatty vegetables) and (2) matrix solid-phase dispersion (MSPD) using aminopropyl as sorbent material and a final clean-up performed in the elution step using Florisil. An LC-MS/MS method covering 104 multiclass pesticides was developed to examine the performance of these two protocols. The separation of the compounds from the olive extracts was achieved using a short C18 column (50 mm x 4.6 mm i.d.) with 1.8 μm particle size. The identification and confirmation of the compounds was based on retention time matching along with the presence (and ratio) of two typical MRM transitions. Limits of detection obtained were lower than 10 μg kg(-1) for 89% of the analytes using both sample treatment protocols. Recovery studies performed on olive samples spiked at two concentration levels (10 and 100 μg kg(-1)) yielded average recoveries in the range 70-120% for most analytes when the QuEChERS procedure was employed. When MSPD was the choice for sample extraction, recoveries obtained were in the range 50-70% for most of the target compounds. The proposed methods were successfully applied to the analysis of real olive samples, revealing the presence of some of the target species in the μg kg(-1) range. Besides the evaluation of the sample preparation approaches, we also discuss the use of advanced software features associated with MRM method development that overcome several limitations and drawbacks associated with MS/MS methods (time segment boundaries, tedious method development/manual scheduling and acquisition limitations). This software feature, recently offered by different vendors, is based on an algorithm that associates retention time data with each individual MS/MS transition, so that dwell times and sensitivity are maximized by limiting the number of transitions traced simultaneously at any point in the chromatographic run. Copyright 2010 Elsevier B.V. All rights reserved.
Minimum and Maximum Times Required to Obtain Representative Suspended Sediment Samples
NASA Astrophysics Data System (ADS)
Gitto, A.; Venditti, J. G.; Kostaschuk, R.; Church, M. A.
2014-12-01
Bottle sampling is a convenient method of obtaining suspended sediment measurements for the development of sediment budgets. While these methods are generally considered to be reliable, recent analysis of depth-integrated sampling has identified considerable uncertainty in measurements of grain-size concentration between grain-size classes of multiple samples. Point-integrated bottle sampling is assumed to represent the mean concentration of suspended sediment, but the uncertainty surrounding this method is not well understood. Here we examine at-a-point variability in velocity, suspended sediment concentration, grain-size distribution, and grain-size moments to determine if traditional point-integrated methods provide a representative sample of suspended sediment. We present continuous hour-long observations of suspended sediment from the sand-bedded portion of the Fraser River at Mission, British Columbia, Canada, using a LISST laser-diffraction instrument. Spectral analysis shows no statistically significant peaks in energy density, suggesting the absence of periodic fluctuations in flow and suspended sediment. However, a slope break in the spectra at 0.003 Hz corresponds to a period of 5.5 minutes. This coincides with the threshold between large-scale turbulent eddies that scale with channel width/mean velocity and hydraulic phenomena related to channel dynamics. This suggests that suspended sediment samples taken over a period longer than 5.5 minutes incorporate variability that is larger in scale than turbulent phenomena in this channel. Examination of 5.5-minute periods of our time series indicates that ~20% of the time a stable mean value of volumetric concentration is reached within 30 seconds, a typical bottle sample duration. In ~12% of measurements a stable mean was not reached over the 5.5-minute sample duration. The remaining measurements achieved a stable mean at times distributed evenly over the intervening interval.
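A small sketch of the kind of stability check described above: track the running mean of a point concentration series and report the time after which it stays within a tolerance of the full 5.5-minute mean. The synthetic series and the +/-5% band are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 330, 1.0)                         # 5.5-minute record sampled at 1 Hz
conc = 300 + 40 * rng.standard_normal(t.size)      # synthetic volumetric concentration series

running_mean = np.cumsum(conc) / np.arange(1, t.size + 1)
target = conc.mean()                               # 5.5-minute mean
tol = 0.05 * target                                # assumed +/-5% stability band

outside = np.flatnonzero(np.abs(running_mean - target) > tol)
if outside.size == 0:
    stable_from = t[0]                             # in-band from the first sample
elif outside[-1] + 1 < t.size:
    stable_from = t[outside[-1] + 1]               # mean stays in-band from here on
else:
    stable_from = None                             # no stable mean within the record
print("running mean stable from t =", stable_from, "s")
```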
Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat
2011-05-27
In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for the metabolomics study were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types×2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches were used for data analysis. XCMS followed by principal component analysis (approach 1) and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2) were compared. Several features that were up- or down-regulated in the different types were detected. Copyright © 2011 Elsevier B.V. All rights reserved.
Real-time RT-PCR, a necessary tool to support the diagnosis and surveillance of rotavirus in Mexico.
De La Cruz Hernández, Sergio Isaac; Anaya Molina, Yazmin; Gómez Santiago, Fabián; Terán Vega, Heidi Lizbeth; Monroy Leyva, Elda; Méndez Pérez, Héctor; García Lozano, Herlinda
2018-04-01
Rotavirus produces diarrhea in children under 5 years old. Conventional methods such as polyacrylamide gel electrophoresis (PAGE) and reverse transcription-polymerase chain reaction (RT-PCR) have been used for rotavirus detection. However, these techniques need a multi-step process to get the results. In comparison with conventional methods, real-time RT-PCR is a highly sensitive method that allows results to be obtained in only one day. In this study a real-time RT-PCR assay was tested using a panel of 440 samples from patients with acute gastroenteritis, characterized by PAGE and RT-PCR. The results show that the real-time RT-PCR detected rotavirus in 73% of the rotavirus-negative samples analyzed by PAGE and RT-PCR; thus, the percentage of rotavirus-positive samples increased to 81%. The results indicate that this real-time RT-PCR should be part of routine analysis and serve as a support for the diagnosis of rotavirus in Mexico. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Ibarra-Castanedo, Clemente; Bendada, AbdelHakim; Maldague, Xavier; Loaiza, Humberto; Caicedo, Eduardo
2008-01-01
It is well known that methods of thermographic non-destructive testing based on thermal contrast are strongly affected by non-uniform heating at the surface. Hence, the results obtained from these methods depend considerably on the chosen reference point. The differential absolute contrast (DAC) method was developed to eliminate the need to choose a reference point, defining the thermal contrast with respect to an ideal sound area instead. Although very useful at early times, DAC accuracy decreases when the heat front approaches the sample rear face. We propose a new DAC version that explicitly introduces the sample thickness using thermal quadrupole theory, and show that the new DAC range of validity extends to long times while preserving the validity at short times. This new contrast is used for defect quantification in composite, Plexiglas™ and aluminum samples.
NASA Astrophysics Data System (ADS)
Heinemeier, Jan; Jungner, Högne; Lindroos, Alf; Ringbom, Åsa; von Konow, Thorborg; Rud, Niels
1997-03-01
A method for refining lime mortar samples for 14C dating has been developed. It includes mechanical and chemical separation of mortar carbonate with optical control of the purity of the samples. The method has been applied to a large series of AMS datings on lime mortar from three medieval churches on the Åland Islands, Finland. The datings show convincing internal consistency and confine the construction time of the churches to AD 1280-1380 with a most probable date just before AD 1300. We have also applied the method to the controversial Newport Tower, Rhode Island, USA. Our mortar datings confine the building to colonial time in the 17th century and thus refute claims of Viking origin of the tower. For the churches, a parallel series of datings of organic (charcoal) inclusions in the mortar show less reliable results than the mortar samples, which is ascribed to poor association with the construction time.
Variance of discharge estimates sampled using acoustic Doppler current profilers from moving boats
Garcia, Carlos M.; Tarrab, Leticia; Oberg, Kevin; Szupiany, Ricardo; Cantero, Mariano I.
2012-01-01
This paper presents a model for quantifying the random errors (i.e., variance) of acoustic Doppler current profiler (ADCP) discharge measurements from moving boats for different sampling times. The model focuses on the random processes in the sampled flow field and has been developed using statistical methods currently available for uncertainty analysis of velocity time series. Analysis of field data collected using ADCP from moving boats on three natural rivers of varying sizes and flow conditions shows that, even though the estimate of the integral time scale of the actual turbulent flow field is larger than the sampling interval, the integral time scale of the sampled flow field is on the order of the sampling interval. Thus, an equation for computing the variance error in discharge measurements associated with different sampling times, assuming uncorrelated flow fields, is appropriate. The approach is used to help define optimal sampling strategies by choosing the exposure time required for ADCPs to accurately measure flow discharge.
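The textbook relation behind such a model estimates the integral time scale T_I from the autocorrelation of the sampled series and takes var(mean) ≈ 2·σ²·T_I/T for sampling time T ≫ T_I. The sketch below illustrates this on an AR(1) stand-in for a turbulent velocity record; the cut-off at the first zero crossing and all numbers are assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(11)
dt, n = 1.0, 20000                  # s, sample interval and record length
phi = np.exp(-dt / 10.0)            # AR(1) stand-in with a true integral time scale ~10 s
u = np.zeros(n)
for i in range(1, n):
    u[i] = phi * u[i - 1] + rng.standard_normal() * np.sqrt(1 - phi ** 2)

# autocorrelation and integral time scale (integrate up to the first zero crossing)
u0 = u - u.mean()
acf = np.correlate(u0, u0, mode="full")[n - 1:] / (u0.var() * np.arange(n, 0, -1))
first_zero = np.argmax(acf <= 0)
T_I = np.trapz(acf[:first_zero], dx=dt)
print(f"integral time scale ~ {T_I:.1f} s")

for T in (60, 300, 600):            # candidate sampling (exposure) times, s
    std_mean = np.sqrt(2 * u0.var() * T_I / T)
    print(f"T = {T:4d} s -> standard error of the time-averaged value ~ {std_mean:.3f}")
```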
Miniaturized and direct spectrophotometric multi-sample analysis of trace metals in natural waters.
Albendín, Gemma; López-López, José A; Pinto, Juan J
2016-03-15
Trends in the analysis of trace metals in natural waters are mainly based on the development of sample treatment methods to isolate and pre-concentrate the metal from the matrix in a simpler extract for further instrumental analysis. However, direct analysis is often possible using more accessible techniques such as spectrophotometry. In this case a proper ligand is required to form a complex that absorbs radiation in the ultraviolet-visible (UV-Vis) spectrum. In this sense, the hydrazone derivative, di-2-pyridylketone benzoylhydrazone (dPKBH), forms complexes with copper (Cu) and vanadium (V) that absorb light at 370 and 395 nm, respectively. Although spectrophotometric methods are considered as time- and reagent-consuming, this work focused on its miniaturization by reducing the volume of sample as well as time and cost of analysis. In both methods, a micro-amount of sample is placed into a microplate reader with a capacity for 96 samples, which can be analyzed in times ranging from 5 to 10 min. The proposed methods have been optimized using a Box-Behnken design of experiments. For Cu determination, concentration of phosphate buffer solution at pH 8.33, masking agents (ammonium fluoride and sodium citrate), and dPKBH were optimized. For V analysis, sample (pH 4.5) was obtained using acetic acid/sodium acetate buffer, and masking agents were ammonium fluoride and 1,2-cyclohexanediaminetetraacetic acid. Under optimal conditions, both methods were applied to the analysis of certified reference materials TMDA-62 (lake water), LGC-6016 (estuarine water), and LGC-6019 (river water). In all cases, results proved the accuracy of the method. Copyright © 2015 Elsevier Inc. All rights reserved.
Sabatini, Francesca; Lluveras-Tenorio, Anna; Degano, Ilaria; Kuckova, Stepanka; Krizova, Iva; Colombini, Maria Perla
2016-11-01
This study deals with the identification of anthraquinoid molecular markers in standard dyes, reference lakes, and paint model systems using a micro-invasive and nondestructive technique, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-ToF-MS). Red anthraquinoid lakes, such as madder lake, carmine lake, and Indian lac, have been the most widely used for painting purposes since ancient times. From an analytical point of view, identifying lakes in paint samples is challenging, and developing methods that maximize the information achievable while minimizing the amount of sample needed is of paramount importance. The employed method was tested on less than 0.5 mg of reference samples and required minimal sample preparation, entailing a hydrofluoric acid extraction. The method is fast and versatile because the same sample, once spotted on the steel plate, can be re-analyzed in both positive and negative modes in a few minutes. The MALDI mass spectra collected in the two analysis modes were studied and compared with LDI and simulated mass spectra in order to highlight the peculiar behavior of the anthraquinones in the MALDI process. Both ionization modes were assessed for each species. The effect of the different paint binders on dye identification was also evaluated through the analyses of paint model systems. In the end, the method was successful in detecting madder lake in archeological samples from Greek wall paintings and on an Italian funerary clay vessel, demonstrating its capability to identify dyes in small amounts of highly degraded samples.
Qiu, Junlang; Wang, Fuxin; Zhang, Tianlang; Chen, Le; Liu, Yuan; Zhu, Fang; Ouyang, Gangfeng
2018-01-02
Decreasing the tedious sample preparation duration is one of the most important concerns in environmental analytical chemistry, especially for in vivo experiments. However, due to the slow mass diffusion paths of most conventional methods, ultrafast in vivo sampling remains challenging. Herein, for the first time, we report an ultrafast in vivo solid-phase microextraction (SPME) device based on electrosorption enhancement and a novel custom-made CNT@PPY@pNE fiber for in vivo sampling of ionized acidic pharmaceuticals in fish. This sampling device exhibited excellent robustness, reproducibility, matrix effect-resistant capacity, and quantitative ability. Importantly, the extraction kinetics of the targeted ionized pharmaceuticals were significantly accelerated using the device, which significantly improved the sensitivity of the SPME in vivo sampling method (limits of detection ranged from 0.12 ng·g(-1) to 0.25 ng·g(-1)) and shortened the sampling time (only 1 min). The proposed approach was successfully applied to monitor the concentrations of ionized pharmaceuticals in living fish, which demonstrated that the device and fiber were suitable for ultrafast in vivo sampling and continuous monitoring. In addition, the bioconcentration factor (BCF) values of the pharmaceuticals were derived in tilapia (Oreochromis mossambicus) for the first time, based on the data of ultrafast in vivo sampling. Therefore, we developed and validated an effective and ultrafast SPME sampling device for in vivo sampling of ionized analytes in living organisms, and this state-of-the-art method provides an alternative technique for future in vivo studies.
Kawai, Toshio; Sumino, Kimiaki; Ohashi, Fumiko; Ikeda, Masayuki
2011-01-01
The aim of this study was to facilitate urine sample preparation prior to head-space gas-chromatographic (HS-GC) analysis. Urine samples containing one of five solvents (acetone, methanol, methyl ethyl ketone, methyl isobutyl ketone and toluene) at the levels of biological exposure limits were aspirated into a vacuum tube via a holder, a device commercially available for venous blood collection (the vacuum tube method). The urine sample, 5 ml, was quantitatively transferred to a 20-ml head-space vial prior to HS-GC analysis. The loaded tubes were stored at +4 °C in the dark for up to 3 d. The vacuum tube method facilitated on-site procedures of urine sample preparation for HS-GC with no significant loss of solvents in the sample and no need for skilled hands, while on-site sample preparation time was significantly reduced. Furthermore, no loss of solvents was detected during the 3-d storage, irrespective of whether the solvent was hydrophilic (acetone) or lipophilic (toluene). In a pilot application, the high sealing performance of the vacuum tube method made it possible to confirm that no solvent is lost when sealing is completed within 5 min after urine voiding, and that the allowance time is as long as 30 min in the case of toluene in urine. The use of the holder-vacuum tube device not only saves labor in transferring the sample to an air-tight space, but also facilitates sample storage prior to HS-GC analysis.
van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; van der Hoek, Lucas; Batenburg, Ronald
2017-12-04
Our research is based on a technique for time sampling, an innovative method for measuring the working hours of Dutch general practitioners (GPs), which was deployed in an earlier study. In that study, 1051 GPs were questioned about their activities in real time by sending them one SMS text message every 3 h during 1 week. The required sample size for this study is important for health workforce planners to know if they want to apply this method to target groups who are hard to reach or if fewer resources are available. In this time-sampling method, however, standard power analysis is not sufficient for calculating the required sample size, as it accounts only for sample fluctuation and not for the fluctuation of measurements taken from every participant. We investigated the impact of the number of participants and the frequency of measurements per participant upon the confidence intervals (CIs) for the hours worked per week. Statistical analyses of the time-use data we obtained from GPs were performed. Ninety-five percent CIs were calculated, using equations and simulation techniques, for various numbers of GPs included in the dataset and for various frequencies of measurements per participant. Our results showed that the one-tailed CI, including sample and measurement fluctuation, decreased from 21 to 3 h as the number of GPs increased from one to 50. Because of the form of the equations used to calculate the CIs, precision continued to increase with each additional GP, but the gain became smaller. Likewise, the analyses showed how the number of participants required decreased if more measurements per participant were taken. For example, one measurement per 3-h time slot during the week requires 300 GPs to achieve a CI of 1 h, while one measurement per hour requires 100 GPs to obtain the same result. The sample size needed for time-use research based on a time-sampling technique depends on the design and aim of the study. In this paper, we showed how the precision of the measurement of hours worked each week by GPs varied strongly according to the number of GPs included and the frequency of measurements per GP during the week measured. The best balance between both dimensions will depend upon different circumstances, such as the target group and the budget available.
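A simulation sketch of the trade-off quantified above: the half-width of the CI for mean weekly working hours as a function of the number of GPs and the number of SMS measurements per GP. The assumed between-GP spread of true hours and the per-slot "working right now?" model are invented for illustration and are not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
WEEK_HOURS = 7 * 24

def simulate_ci_halfwidth(n_gps, n_msgs, n_rep=2000):
    """Average half-width of a 95% CI for mean weekly hours, from sampled time-use measurements."""
    halfwidths = []
    for _ in range(n_rep):
        true_hours = rng.normal(45, 8, size=n_gps).clip(0, WEEK_HOURS)   # between-GP variation
        p_work = true_hours / WEEK_HOURS
        # each SMS asks "working right now?"; estimate per-GP hours from the fraction of 'yes'
        est_hours = rng.binomial(n_msgs, p_work) / n_msgs * WEEK_HOURS
        halfwidths.append(1.96 * est_hours.std(ddof=1) / np.sqrt(n_gps))
    return np.mean(halfwidths)

for n_gps in (50, 100, 300):
    for n_msgs in (56, 168):        # one SMS per 3-h slot vs one per hour, over one week
        print(f"{n_gps:4d} GPs, {n_msgs:3d} msgs/GP -> CI half-width ~ "
              f"{simulate_ci_halfwidth(n_gps, n_msgs):.2f} h")
```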
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.
Here, we present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography–mass spectrometry methods. We specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Two-dimensional PMF can then effectively perform a three-dimensional factorization of the three-dimensional TAG mass spectral data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
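The binning-and-factorization idea in miniature: sum the mass spectra within retention-time bins, unfold each sample into one row of a non-negative matrix, and factor it into source contributions and chemical profiles. Plain NMF is used here only as an accessible stand-in for PMF (PMF additionally weights each matrix element by its measurement uncertainty), and the array shapes and bin width are assumptions.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(8)
n_samples, n_scans, n_mz = 40, 1200, 250
chromatograms = rng.random((n_samples, n_scans, n_mz))   # stand-in TAG data: scans x m/z per sample

bin_width = 60                                           # scans per retention-time bin
n_bins = n_scans // bin_width
# sum the mass spectra within each retention-time bin
binned = (chromatograms[:, :n_bins * bin_width, :]
          .reshape(n_samples, n_bins, bin_width, n_mz)
          .sum(axis=2))

# unfold to (samples) x (bin, m/z) and factor into source contributions x profiles
X = binned.reshape(n_samples, n_bins * n_mz)
model = NMF(n_components=6, init="nndsvda", max_iter=500)
contributions = model.fit_transform(X)    # sample-by-factor "time series" of source strengths
profiles = model.components_              # factor-by-(bin, m/z) chemical fingerprints
print(contributions.shape, profiles.shape)
```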
Askenazi, David J; Moore, John F; Fineberg, Naomi; Koralkar, Rajesh; Clevenger, Stephanie; Sharer, Jon Daniel
2014-09-01
Measurement of serum creatinine (SCr) and urine creatinine (UCr) is regularly used in clinical and research settings. For small animal experiments and for studies in which sample collection is sparse (e.g., neonatal cohorts), measuring SCr and UCr using tiny amounts of sample (as low as 10 mcl) would maximize exploration and minimize iatrogenic blood loss. We performed an evaluation in six healthy adults to determine differences between SCr and UCr values across different methodologies, storage environments and time. The study was conducted using 20 mcl of sample. Analyses were done using two-way repeated-measures ANOVA. SCr values showed no significant differences between the LC/MS and Jaffe methods. However, SCr measured by LC/MS was lowest when measured immediately compared to other time points (F = 7.2; P < 0.001). Similarly, Jaffe measurements showed changes in the mean differences over time; however, these were not significant. UCr values were consistently higher using LC/MS than Jaffe (F = 19; P < 0.01), and UCr changed over time (F = 8.7; P < 0.02). In addition, the interaction term for method and time was also significant (F = 5.8; P < 0.04), which reflects the stability of the Jaffe measurements over time whereas the LC/MS measurements declined, especially after being frozen for 1 year (P < 0.001). UCr measured by Jaffe is lower than that measured by LC/MS. UCr measurements by LC/MS vary more over time, mostly due to the sample measured after 1 year; therefore, measurement by LC/MS of urine stored for more than 90 days may provide altered results. © 2014 Wiley Periodicals, Inc.
Révész, Kinga M; Landwehr, Jurate M
2002-01-01
A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H(3)PO(4)/CaCO(3)) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous flow isotope ratio mass spectrometer. Conditions for which the H(3)PO(4)/CaCO(3) reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 °C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was ≤0.1 and ≤0.2 per mill (‰), respectively, although later analysis showed that materials from one specific standard required a reaction time between 34 and 54 h for δ(18)O to achieve this level of precision. Aliquot screening methods were shown to further minimize the total error. The accuracy and precision of the new method were analyzed and confirmed by statistical analysis. The utility of the method was verified by analyzing calcite from Devils Hole, Nevada, for which isotope-ratio values had previously been obtained by the classical method. Devils Hole core DH-11 had recently been re-cut and re-sampled, and isotope-ratio values were obtained using the new method. The results were comparable with those obtained by the classical method, with correlation = +0.96 for both isotope ratios. The consistency of the isotopic results is such that an alignment offset could be identified in the re-sampled core material, and two cutting errors that occurred during re-sampling were then confirmed independently. This result indicates that the new method is a viable alternative to the classical reaction method. In particular, the new method requires less sample material, permitting finer resolution, and allows automation of some processes, resulting in considerable time savings.
A new method to sample stuttering in preschool children.
O'Brian, Sue; Jones, Mark; Pilowsky, Rachel; Onslow, Mark; Packman, Ann; Menzies, Ross
2010-06-01
This study reports a new method for sampling the speech of preschool stuttering children outside the clinic environment. Twenty parents engaged their stuttering children in an everyday play activity in the home with a telephone handset nearby. A remotely located researcher telephoned the parent and recorded the play session with a phone-recording jack attached to a digital audio recorder at the remote location. The parent placed an audio recorder near the child for comparison purposes. Children as young as 2 years complied with the remote method of speech sampling. The quality of the remote recordings was superior to that of the in-home recordings. There was no difference in means or reliability of stutter-count measures made from the remote recordings compared with those made in-home. Advantages of the new method include: (1) cost efficiency of real-time measurement of percent syllables stuttered in naturalistic situations, (2) reduction of bias associated with parent-selected timing of home recordings, (3) standardization of speech sampling procedures, (4) improved parent compliance with sampling procedures, (5) clinician or researcher on-line control of the acoustic and linguistic quality of recordings, and (6) elimination of the need to lend equipment to parents for speech sampling.
Real-Time PCR Method for Detection of Salmonella spp. in Environmental Samples.
Kasturi, Kuppuswamy N; Drgon, Tomas
2017-07-15
The methods currently used for detecting Salmonella in environmental samples require 2 days to produce results and have limited sensitivity. Here, we describe the development and validation of a real-time PCR Salmonella screening method that produces results in 18 to 24 h. Primers and probes specific to the gene invA, group D, and Salmonella enterica serovar Enteritidis organisms were designed and evaluated for inclusivity and exclusivity using a panel of 329 Salmonella isolates representing 126 serovars and 22 non-Salmonella organisms. The invA- and group D-specific sets identified all the isolates accurately. The PCR method had 100% inclusivity and detected 1 to 2 copies of Salmonella DNA per reaction. Primers specific for Salmonella-differentiating fragment 1 (Sdf-1) in conjunction with the group D set had 100% inclusivity for 32 S. Enteritidis isolates and 100% exclusivity for the 297 non-Enteritidis Salmonella isolates. Single-laboratory validation performed on 1,741 environmental samples demonstrated that the PCR method detected 55% more positives than the Vitek immunodiagnostic assay system (VIDAS) method. The PCR results correlated well with the culture results, and the method did not report any false-negative results. The receiver operating characteristic (ROC) analysis documented excellent agreement between the results from the culture and PCR methods (area under the curve, 0.90; 95% confidence interval of 0.76 to 1.0), confirming the validity of the PCR method. IMPORTANCE This validated PCR method detects 55% more positives for Salmonella in half the time required for the reference method, VIDAS. The validated PCR method will help to strengthen public health efforts through rapid screening of Salmonella spp. in environmental samples.
Real-Time PCR Method for Detection of Salmonella spp. in Environmental Samples
Drgon, Tomas
2017-01-01
The methods currently used for detecting Salmonella in environmental samples require 2 days to produce results and have limited sensitivity. Here, we describe the development and validation of a real-time PCR Salmonella screening method that produces results in 18 to 24 h. Primers and probes specific to the gene invA, group D, and Salmonella enterica serovar Enteritidis organisms were designed and evaluated for inclusivity and exclusivity using a panel of 329 Salmonella isolates representing 126 serovars and 22 non-Salmonella organisms. The invA- and group D-specific sets identified all the isolates accurately. The PCR method had 100% inclusivity and detected 1 to 2 copies of Salmonella DNA per reaction. Primers specific for Salmonella-differentiating fragment 1 (Sdf-1) in conjunction with the group D set had 100% inclusivity for 32 S. Enteritidis isolates and 100% exclusivity for the 297 non-Enteritidis Salmonella isolates. Single-laboratory validation performed on 1,741 environmental samples demonstrated that the PCR method detected 55% more positives than the Vitek immunodiagnostic assay system (VIDAS) method. The PCR results correlated well with the culture results, and the method did not report any false-negative results. The receiver operating characteristic (ROC) analysis documented excellent agreement between the results from the culture and PCR methods (area under the curve, 0.90; 95% confidence interval of 0.76 to 1.0), confirming the validity of the PCR method. IMPORTANCE This validated PCR method detects 55% more positives for Salmonella in half the time required for the reference method, VIDAS. The validated PCR method will help to strengthen public health efforts through rapid screening of Salmonella spp. in environmental samples. PMID:28500041
Sample selection via angular distance in the space of the arguments of an artificial neural network
NASA Astrophysics Data System (ADS)
Fernández Jaramillo, J. M.; Mayerle, R.
2018-05-01
In the construction of an artificial neural network (ANN), a proper data splitting of the available samples plays a major role in the training process. This selection of subsets for training, testing and validation affects the generalization ability of the neural network. The number of samples also has an impact on the time required for the design of the ANN and for the training. This paper introduces an efficient and simple method for reducing the set of samples used for training a neural network. The method reduces the time required to calculate the network coefficients, while keeping sample diversity and avoiding overtraining of the ANN due to the presence of similar samples. The proposed method is based on the calculation of the angle between two vectors, each one representing one input of the neural network. When the angle formed between two samples is smaller than a defined threshold, only one input is accepted for the training. The accepted inputs are scattered throughout the sample space. Tidal records are used to demonstrate the proposed method. The results of a cross-validation show that with few inputs the accuracy of the outputs is poor and depends on the selection of the first sample, but as the number of inputs increases the accuracy improves and the differences among scenarios with different starting samples are markedly reduced. A comparison with the K-means clustering algorithm shows that, for this application, the proposed method produces a more accurate network with a smaller number of samples.
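As a rough illustration of the selection rule described in this abstract, the Python sketch below keeps a candidate sample only when the angle it forms with every previously accepted sample exceeds a threshold. The function name and the 5-degree threshold are illustrative assumptions, not values taken from the paper.

import numpy as np

def select_by_angle(samples, threshold_deg=5.0):
    # samples: array of shape (n_samples, n_features); threshold_deg is a placeholder value
    cos_thr = np.cos(np.radians(threshold_deg))
    kept_idx, kept_units = [], []
    for i, x in enumerate(np.asarray(samples, dtype=float)):
        u = x / np.linalg.norm(x)
        # accept only if the angle to every kept sample exceeds the threshold,
        # i.e. the cosine similarity stays below cos(threshold)
        if all(np.dot(u, v) < cos_thr for v in kept_units):
            kept_idx.append(i)
            kept_units.append(u)
    return kept_idx

Because acceptance depends on which sample is encountered first, the selected subset varies with the starting sample, which is consistent with the cross-validation observation reported above.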
Method and system for providing precise multi-function modulation
NASA Technical Reports Server (NTRS)
Davarian, Faramaz (Inventor); Sumida, Joe T. (Inventor)
1989-01-01
A method and system is disclosed which provides precise multi-function digitally implementable modulation for a communication system. The invention provides a modulation signal for a communication system in response to an input signal from a data source. A digitized time response is generated from samples of a time domain representation of a spectrum profile of a selected modulation scheme. The invention generates and stores coefficients for each input symbol in accordance with the selected modulation scheme. The output signal is provided by a plurality of samples, each sample being generated by summing the products of a predetermined number of the coefficients and a predetermined number of the samples of the digitized time response. In a specific illustrative implementation, the samples of the output signals are converted to analog signals, filtered and used to modulate a carrier in a conventional manner. The invention is versatile in that it allows for the storage of the digitized time responses and corresponding coefficient lookup table of a number of modulation schemes, any of which may then be selected for use in accordance with the teachings of the invention.
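The core operation described here, forming each output sample as a sum of products of stored coefficients with samples of a digitized time response, is essentially table-driven pulse shaping. The sketch below is a generic illustration of that idea, not the patented implementation; the pulse table, symbol values, and parameter names are arbitrary.

import numpy as np

def modulate(symbols, pulse, samples_per_symbol):
    # each output sample is a sum of products of symbol coefficients with
    # samples of the stored (digitized) time response
    n_out = (len(symbols) - 1) * samples_per_symbol + len(pulse)
    out = np.zeros(n_out)
    for k, c in enumerate(symbols):
        start = k * samples_per_symbol
        out[start:start + len(pulse)] += c * pulse
    return out

# illustrative usage: a windowed-sinc pulse table and random bipolar symbols
t = np.linspace(-3, 3, 48)
pulse = np.sinc(t) * np.hanning(len(t))
baseband = modulate(np.random.choice([-1.0, 1.0], size=16), pulse, samples_per_symbol=8)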
Reyes-Morales, Fátima; Springer, Monika
2014-04-01
Aquatic macroinvertebrates are the group of organisms most commonly used to determine ecosystem health in water quality studies and freshwater biomonitoring. Nevertheless, the methods and collecting times used in biomonitoring have not yet been sufficiently adapted and tested in tropical aquatic environments. Twelve rivers in the Lago de Atitlán watershed in Guatemala were assessed with different collecting times during the dry season. The method involved the collection of organic and inorganic material, including benthic organisms, from different microhabitats for a pre-established time period (5, 10, 15 min) with a D-frame net. Samples were preserved with 95% ethanol in the field and sorted in the laboratory. As expected, the analysis showed that abundance and taxonomic richness were higher with increasing sampling effort. The water quality categories obtained from the newly proposed BMWP/Atitlán index varied among sampling times. However, the Kruskal-Wallis test showed no significant differences between the categories obtained with the index and the number of taxa collected at 10 and 15 min. Therefore, we recommend a reduction of sampling time, while maintaining the three subsamples in order to include the greatest variety of microhabitats and ensure a representative sample of the aquatic macroinvertebrates.
Videvall, Elin; Strandh, Maria; Engelbrecht, Anel; Cloete, Schalk; Cornwallis, Charlie K
2017-01-01
The gut microbiome of animals is emerging as an important factor influencing ecological and evolutionary processes. A major bottleneck in obtaining microbiome data from large numbers of samples is the time-consuming laboratory procedures required, specifically the isolation of DNA and generation of amplicon libraries. Recently, direct PCR kits have been developed that circumvent conventional DNA extraction steps, thereby streamlining the laboratory process by reducing preparation time and costs. However, the reliability and efficacy of direct PCR for measuring host microbiomes have not yet been investigated other than in humans with 454 sequencing. Here, we conduct a comprehensive evaluation of the microbial communities obtained with direct PCR and the widely used Mo Bio PowerSoil DNA extraction kit in five distinct gut sample types (ileum, cecum, colon, feces, and cloaca) from 20 juvenile ostriches, using 16S rRNA Illumina MiSeq sequencing. We found that direct PCR was highly comparable over a range of measures to the DNA extraction method in cecal, colon, and fecal samples. However, the two methods significantly differed in samples with comparably low bacterial biomass: cloacal and especially ileal samples. We also sequenced 100 replicate sample pairs to evaluate repeatability during both extraction and PCR stages and found that both methods were highly consistent for cecal, colon, and fecal samples (rs > 0.7) but had low repeatability for cloacal (rs = 0.39) and ileal (rs = -0.24) samples. This study indicates that direct PCR provides a fast, cheap, and reliable alternative to conventional DNA extraction methods for retrieving 16S rRNA data, which can aid future gut microbiome studies. IMPORTANCE The microbial communities of animals can have large impacts on their hosts, and the number of studies using high-throughput sequencing to measure gut microbiomes is rapidly increasing. However, the library preparation procedure in microbiome research is both costly and time-consuming, especially for large numbers of samples. We investigated a cheaper and faster direct PCR method designed to bypass the DNA isolation steps during 16S rRNA library preparation and compared it with a standard DNA extraction method. We used both techniques on five different gut sample types collected from 20 juvenile ostriches and sequenced samples with Illumina MiSeq. The methods were highly comparable and highly repeatable in three sample types with high microbial biomass (cecum, colon, and feces), but larger differences and low repeatability were found in the microbiomes obtained from the ileum and cloaca. These results will help microbiome researchers assess library preparation procedures and plan their studies accordingly.
Direct analysis of organic priority pollutants by IMS
NASA Technical Reports Server (NTRS)
Giam, C. S.; Reed, G. E.; Holliday, T. L.; Chang, L.; Rhodes, B. J.
1995-01-01
Many routine methods for monitoring of trace amounts of atmospheric organic pollutants consist of several steps. Typical steps are: (1) collection of the air sample; (2) trapping of organics from the sample; (3) extraction of the trapped organics; and (4) identification of the organics in the extract by GC (gas chromatography), HPLC (High Performance Liquid Chromatography), or MS (Mass Spectrometry). These methods are often cumbersome and time-consuming. A simple and fast method for monitoring atmospheric organics using an IMS (Ion Mobility Spectrometer) is proposed. This method has a short sampling time and does not require extraction of the organics, since the sample is placed directly in the IMS. The purpose of this study was to determine the responses in the IMS to organic 'priority pollutants'. Priority pollutants including representative polycyclic aromatic hydrocarbons (PAHs), phthalates, phenols, chlorinated pesticides, and polychlorinated biphenyls (PCBs) were analyzed in both the positive and negative detection modes at ambient atmospheric pressure. Detection mode and amount detected are presented.
Model-Based Adaptive Event-Triggered Control of Strict-Feedback Nonlinear Systems.
Li, Yuan-Xin; Yang, Guang-Hong
2018-04-01
This paper is concerned with the adaptive event-triggered control problem of nonlinear continuous-time systems in strict-feedback form. By using the event-sampled neural network (NN) to approximate the unknown nonlinear function, an adaptive model and an associated event-triggered controller are designed by exploiting the backstepping method. In the proposed method, the feedback signals and the NN weights are aperiodically updated only when the event-triggered condition is violated. A positive lower bound on the minimum intersample time is guaranteed to avoid an accumulation point of triggering times. The closed-loop stability of the resulting nonlinear impulsive dynamical system is rigorously proved via Lyapunov analysis under an adaptive event sampling condition. Compared with the traditional adaptive backstepping design with a fixed sampling period, the event-triggered method samples the state and updates the NN weights only when necessary. Therefore, the number of transmissions can be significantly reduced. Finally, two simulation examples are presented to show the effectiveness of the proposed control method.
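To make the triggering idea concrete, here is a minimal sketch that is not the paper's backstepping/NN design: the control input is recomputed only when the deviation between the current state and the last sampled state exceeds a threshold proportional to the state norm. The plant, controller, and trigger parameter below are hypothetical.

import numpy as np

def simulate_event_triggered(x0, f, controller, sigma=0.1, dt=1e-3, steps=5000):
    x = np.asarray(x0, dtype=float)
    x_sampled = x.copy()
    u = controller(x_sampled)
    n_events = 0
    for _ in range(steps):
        # triggering condition: resample the state and update the control only when
        # ||x - x_sampled|| exceeds sigma * ||x||
        if np.linalg.norm(x - x_sampled) > sigma * np.linalg.norm(x):
            x_sampled = x.copy()
            u = controller(x_sampled)
            n_events += 1
        x = x + dt * f(x, u)     # forward-Euler step of the plant dynamics
    return x, n_events

# illustrative usage on a scalar nonlinear plant with a proportional controller
state, updates = simulate_event_triggered(
    x0=[1.0], f=lambda x, u: -x**3 + u, controller=lambda xs: -2.0 * xs)

Counting n_events against the number of integration steps shows how aperiodic updating reduces the number of transmissions relative to sampling at every step.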
Rapid method to determine actinides and 89/90Sr in limestone and marble samples
Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...
2016-04-12
A new method for the determination of actinides and radiostrontium in limestone and marble samples has been developed that utilizes a rapid sodium hydroxide fusion to digest the sample. Following rapid pre-concentration steps to remove sample matrix interferences, the actinides and 89/90Sr are separated using extraction chromatographic resins and measured radiometrically. The advantages of sodium hydroxide fusion versus other fusion techniques will be discussed. Lastly, this approach has a sample preparation time for limestone and marble samples of <4 hours.
Ferreira, L; Sánchez-Juanes, F; Muñoz-Bellido, J L; González-Buitrago, J M
2011-07-01
Matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF) mass spectrometry (MS) is a fast and reliable technology for the identification of microorganisms with proteomics approaches. Here, we compare an intact cell method and a protein extraction method before application on the MALDI plate for the direct identification of microorganisms in both urine and blood culture samples from clinical microbiology laboratories. The results show that the intact cell method provides excellent results for urine and is a good initial method for blood cultures. The extraction method complements the intact cell method, improving microorganism identification from blood culture. Thus, we consider that MALDI-TOF MS performed directly on urine and blood culture samples, with the protocols that we propose, is a suitable technique for microorganism identification, as compared with the routine methods used in the clinical microbiology laboratory. © 2010 The Authors. Clinical Microbiology and Infection © 2010 European Society of Clinical Microbiology and Infectious Diseases.
Chapter A5. Section 6.1.F. Wastewater, Pharmaceutical, and Antibiotic Compounds
Lewis, Michael Edward; Zaugg, Steven D.
2003-01-01
The USGS differentiates between samples collected for analysis of wastewater compounds and those collected for analysis of pharmaceutical and antibiotic compounds, based on the analytical schedule for the laboratory method. Currently, only the wastewater laboratory method for field-filtered samples (SH1433) is an approved, routine (production) method. (The unfiltered wastewater method LC 8033 also is available but requires a proposal for custom analysis.) At this time, analysis of samples for pharmaceutical and antibiotic compounds is confined to research studies and is available only on a custom basis.
Determination of benzylpenicillin in pharmaceuticals by capillary zone electrophoresis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoyt, A.M. Jr.; Sepaniak, M.J.
A rapid and direct method is described for the determination of benzylpenicillin (penicillin G) in pharmaceutical preparations. The method involves very little sample preparation, and the total analysis time for duplicate results is less than 30 minutes per sample. The method takes advantage of the speed and separating power of capillary zone electrophoresis (CZE). Detection of penicillin is by absorption at 228 nm. An internal standard is employed to reduce sample injection error. The method was applied successfully to both tablets and injectable preparations. 14 refs., 5 figs., 3 tabs.
Sun, Yanqing; Sun, Liuquan; Zhou, Jie
2013-07-01
This paper studies the generalized semiparametric regression model for longitudinal data where the covariate effects are constant for some and time-varying for others. Different link functions can be used to allow more flexible modelling of longitudinal data. The nonparametric components of the model are estimated using a local linear estimating equation and the parametric components are estimated through a profile estimating function. The method automatically adjusts for heterogeneity of sampling times, allowing the sampling strategy to depend on the past sampling history as well as possibly time-dependent covariates, without specifically modelling such dependence. A K-fold cross-validation bandwidth selection is proposed as a working tool for locating an appropriate bandwidth. A criterion for selecting the link function is proposed to provide a better fit to the data. Large sample properties of the proposed estimators are investigated. Large sample pointwise and simultaneous confidence intervals for the regression coefficients are constructed. Formal hypothesis testing procedures are proposed to check for the covariate effects and whether the effects are time-varying. A simulation study is conducted to examine the finite sample performances of the proposed estimation and hypothesis testing procedures. The methods are illustrated with a data example.
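As an illustration of the bandwidth-selection step only (the full profile estimating equations are beyond a short sketch), the following code chooses a kernel bandwidth for a simple local linear smoother by K-fold cross-validation. The Gaussian kernel, function names, and parameters are assumptions, not the authors' implementation.

import numpy as np

def local_linear_fit(t_train, y_train, t_eval, h):
    # local linear smoother with a Gaussian kernel (generic stand-in)
    fitted = np.empty(len(t_eval))
    for j, t0 in enumerate(t_eval):
        w = np.exp(-0.5 * ((t_train - t0) / h) ** 2)
        X = np.column_stack([np.ones_like(t_train), t_train - t0])
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y_train)
        fitted[j] = beta[0]          # local intercept = fitted value at t0
    return fitted

def kfold_cv_bandwidth(t, y, bandwidths, k=5, seed=0):
    # pick the bandwidth minimizing the K-fold cross-validated squared error
    t, y = np.asarray(t, float), np.asarray(y, float)
    idx = np.random.default_rng(seed).permutation(len(t))
    folds = np.array_split(idx, k)
    scores = []
    for h in bandwidths:
        err = 0.0
        for fold in folds:
            train = np.setdiff1d(idx, fold)
            err += np.sum((y[fold] - local_linear_fit(t[train], y[train], t[fold], h)) ** 2)
        scores.append(err)
    return bandwidths[int(np.argmin(scores))]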
Extending the solvent-free MALDI sample preparation method.
Hanton, Scott D; Parees, David M
2005-01-01
Matrix-assisted laser desorption/ionization (MALDI) mass spectrometry is an important technique to characterize many different materials, including synthetic polymers. MALDI mass spectral data can be used to determine the polymer average molecular weights, repeat units, and end groups. One of the key issues in traditional MALDI sample preparation is making good solutions of the analyte and the matrix. Solvent-free sample preparation methods have been developed to address these issues. Previous results of solvent-free or dry prepared samples show some advantages over traditional wet sample preparation methods. Although the results of the published solvent-free sample preparation methods produced excellent mass spectra, we found the method to be very time-consuming, with significant tool cleaning, which presents a significant possibility of cross contamination. To address these issues, we developed an extension of the solvent-free method that replaces the mortar and pestle grinding with ball milling the sample in a glass vial with two small steel balls. This new method generates mass spectra with equal quality of the previous methods, but has significant advantages in productivity, eliminates cross contamination, and is applicable to liquid and soft or waxy analytes.
[Stability of disintegration in health food].
Ma, Lan; Zhao, Xin; Zhou, Shuang; Yang, Dajin
2012-11-01
To study the changes in disintegration of samples of different formulations stored in an artificial climate chamber or at room temperature, and to provide technical support for health food monitoring. Disintegration was measured according to the methods of the Chinese Pharmacopoeia and the British Pharmacopoeia (Appendix XII A, Disintegration, 2010). The disintegration of non-accelerated samples and of samples accelerated for 1, 2 and 3 months was determined with a disintegration tester. Sample properties, the ingredients of the samples, the proportions of the capsule and the treatment methods had some effect on the stability of disintegration. The disintegration time of health foods changed, particularly after acceleration under the condition of (38 ± 1) degrees C/75% RH. In particular, the disintegration time of soft capsules was significantly prolonged. The composition and properties of the samples were the main factors affecting disintegration.
Truzzi, Cristina; Annibaldi, Anna; Illuminati, Silvia; Finale, Carolina; Scarponi, Giuseppe
2014-05-01
The study compares official spectrophotometric methods for the determination of proline content in honey - those of the International Honey Commission (IHC) and the Association of Official Analytical Chemists (AOAC) - with the original Ough method. Results show that the extra time-consuming treatment stages added by the IHC method with respect to the Ough method are pointless. We demonstrate that the AOAC method proves to be the best in terms of accuracy and time saving. The optimized waiting time for the absorbance recording is set at 35 min from the removal of the reaction tubes from the boiling bath used in the sample treatment. The optimized method was validated in the matrix: linearity up to 1800 mg L(-1), limit of detection 20 mg L(-1), limit of quantification 61 mg L(-1). The method was applied to 43 unifloral honey samples from the Marche region, Italy. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Rinsoz, Thomas; Duquenne, Philippe; Greff-Mirguet, Guylaine; Oppliger, Anne
Traditional culture-dependent methods to quantify and identify airborne microorganisms are limited by factors such as short-duration sampling times and the inability to count non-culturable or non-viable bacteria. Consequently, the quantitative assessment of bioaerosols is often underestimated. Use of the real-time quantitative polymerase chain reaction (Q-PCR) to quantify bacteria in environmental samples presents an alternative method, which should overcome this problem. The aim of this study was to evaluate the performance of a real-time Q-PCR assay as a simple and reliable way to quantify the airborne bacterial load within poultry houses and sewage treatment plants, in comparison with epifluorescence microscopy and culture-dependent methods. The estimates of bacterial load that we obtained from the real-time PCR and epifluorescence methods are comparable; however, our analysis of sewage treatment plants indicates these methods give values 270- to 290-fold greater than those obtained by the "impaction on nutrient agar" method. The culture-dependent method of air impaction on nutrient agar was also inadequate in poultry houses, as was the impinger-culture method, which gave a bacterial load estimate 32-fold lower than that obtained by Q-PCR. Real-time quantitative PCR thus proves to be a reliable, discerning, and simple method that could be used to estimate airborne bacterial load in a broad variety of other environments expected to carry high numbers of airborne bacteria.
Noise and drift analysis of non-equally spaced timing data
NASA Technical Reports Server (NTRS)
Vernotte, F.; Zalamansky, G.; Lantz, E.
1994-01-01
Generally, it is possible to obtain equally spaced timing data from oscillators. The measurement of the drifts and noises affecting oscillators is then performed by using a variance (Allan variance, modified Allan variance, or time variance) or a system of several variances (multivariance method). However, in some cases, several samples, or even several sets of samples, are missing. In the case of millisecond pulsar timing data, for instance, observations are quite irregularly spaced in time. Nevertheless, since some observations are very close together (one minute) and since the timing data sequence is very long (more than ten years), information on both short-term and long-term stability is available. Unfortunately, a direct variance analysis is not possible without interpolating missing data. Different interpolation algorithms (linear interpolation, cubic spline) are used to calculate variances in order to verify that they neither lose information nor add erroneous information. A comparison of the results of the different algorithms is given. Finally, the multivariance method was adapted to the measurement sequence of the millisecond pulsar timing data: the responses of each variance of the system are calculated for each type of noise and drift, with the same missing samples as in the pulsar timing sequence. An estimation of precision, dynamics, and separability of this method is given.
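A minimal sketch of the interpolation step discussed above: irregularly spaced timing residuals are first resampled onto a uniform grid (here by linear interpolation), after which a standard Allan variance can be formed from the gridded phase data. This only illustrates the principle and is not the authors' multivariance method; parameter names are illustrative.

import numpy as np

def allan_variance_from_irregular(t, x, tau, dt):
    # t: irregular observation times; x: timing residuals (phase); tau: averaging time
    t_uniform = np.arange(t[0], t[-1], dt)
    x_uniform = np.interp(t_uniform, t, x)     # fill gaps by linear interpolation
    m = int(round(tau / dt))                   # grid points per averaging interval
    xk = x_uniform[::m]                        # phase sampled every tau
    # Allan variance from second differences of the phase:
    # sigma_y^2(tau) = <(x_{k+2} - 2 x_{k+1} + x_k)^2> / (2 tau^2)
    d2 = xk[2:] - 2.0 * xk[1:-1] + xk[:-2]
    return np.mean(d2 ** 2) / (2.0 * tau ** 2)

Cubic-spline interpolation can be substituted for np.interp to compare how the choice of interpolator affects the estimated noise levels, mirroring the comparison described above.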
Biniarz, Piotr; Łukaszewicz, Marcin
2017-06-01
The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.
ERIC Educational Resources Information Center
Bohnert, Amy M.; Richards, Maryse; Kohl, Krista; Randall, Edin
2009-01-01
Using the Experience Sampling Method (ESM), this cross-sectional study examined mediated and moderated associations between different types of discretionary time activities and depressive symptoms and delinquency among a sample of 246 (107 boys, 139 girls) fifth through eighth grade urban African American adolescents. More time spent in passive…
A method for determining the weak statistical stationarity of a random process
NASA Technical Reports Server (NTRS)
Sadeh, W. Z.; Koper, C. A., Jr.
1978-01-01
A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.
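The segmenting-and-averaging idea can be illustrated with a short sketch: a long record is split into equal sample records (the "equivalent ensemble"), and the spread of the per-record means and variances is compared against a tolerance. The tolerance and the specific comparison are placeholders, not the paper's variance tests.

import numpy as np

def weak_stationarity_check(signal, n_records, rel_tol=0.1):
    # split the long time history into equal, contiguous sample records
    records = np.array_split(np.asarray(signal, dtype=float), n_records)
    means = np.array([r.mean() for r in records])
    variances = np.array([r.var() for r in records])
    # flag approximate weak stationarity when record-to-record spread stays small
    mean_ok = np.ptp(means) <= rel_tol * np.std(signal)
    var_ok = np.ptp(variances) <= rel_tol * np.var(signal)
    return bool(mean_ok and var_ok), means, variances

Comparing the per-record (equivalent-ensemble) averages with the averages over a single record gives the heuristic ergodicity estimate mentioned above.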
2012-01-01
Background Real-time quantitative nucleic acid sequence-based amplification (QT-NASBA) is a sensitive method for detection of sub-microscopic gametocytaemia by measuring gametocyte-specific mRNA. Performing analysis on fresh whole blood samples is often not feasible in remote and resource-poor areas. Convenient methods for sample storage and transport are urgently needed. Methods Real-time QT-NASBA was performed on whole blood spiked with a dilution series of purified in-vitro cultivated gametocytes. The blood was either freshly processed or spotted on filter papers. Gametocyte detection sensitivity for QT-NASBA was determined and controlled by microscopy. Dried blood spot (DBS) samples were subjected to five different storage conditions and the loss of sensitivity over time was investigated. A formula to approximate the loss of Pfs25-mRNA due to different storage conditions and time was developed. Results Pfs25-mRNA was measured in time to positivity (TTP) and correlated well with the microscopic counts and the theoretical concentrations of the dilution series. TTP results constantly indicated higher amounts of RNA in filter paper samples extracted after 24 hours than in immediately extracted fresh blood. Among investigated storage conditions freezing at −20°C performed best with 98.7% of the Pfs25-mRNA still detectable at day 28 compared to fresh blood samples. After 92 days, the RNA detection rate was only slightly decreased to 92.9%. Samples stored at 37°C showed most decay with only 64.5% of Pfs25-mRNA detectable after one month. The calculated theoretical detection limit for 24 h-old DBS filter paper samples was 0.0095 (95% CI: 0.0025 to 0.0380) per μl. Conclusions The results suggest that the application of DBS filter papers for quantification of Plasmodium falciparum gametocytes with real-time QT-NASBA is practical and recommendable. This method proved sensitive enough for detection of sub-microscopic densities even after prolonged storage. Decay rates can be predicted for different storage conditions as well as durations. PMID:22545954
Alles, Susan; Peng, Linda X; Mozola, Mark A
2009-01-01
A modification to Performance-Tested Method (PTM) 070601, Reveal Listeria Test (Reveal), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there was a statistically significant difference in performance between the Reveal and reference culture [U.S. Food and Drug Administration's Bacteriological Analytical Manual (FDA/BAM) or U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS)] methods for only a single food in one trial (pasteurized crab meat) at the 27 h enrichment time point, with more positive results obtained with the FDA/BAM reference method. No foods showed statistically significant differences in method performance at the 30 h time point. Independent laboratory testing of 3 foods again produced a statistically significant difference in results for crab meat at the 27 h time point; otherwise results of the Reveal and reference methods were statistically equivalent. Overall, considering both internal and independent laboratory trials, sensitivity of the Reveal method relative to the reference culture procedures in testing of foods was 85.9% at 27 h and 97.1% at 30 h. Results from 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the Reveal method was more productive than the reference USDA-FSIS culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the Reveal method at the 24 h time point. Overall, sensitivity of the Reveal method at 24 h relative to that of the USDA-FSIS method was 153%. The Reveal method exhibited extremely high specificity, with only a single false-positive result in all trials combined for overall specificity of 99.5%.
Gradient-free MCMC methods for dynamic causal modelling.
Sengupta, Biswa; Friston, Karl J; Penny, Will D
2015-05-15
In this technical note we compare the performance of four gradient-free MCMC samplers (random walk Metropolis sampling, slice-sampling, adaptive MCMC sampling and population-based MCMC sampling with tempering) in terms of the number of independent samples they can produce per unit computational time. For the Bayesian inversion of a single-node neural mass model, both adaptive and population-based samplers are more efficient than the random walk Metropolis sampler or slice-sampling; yet adaptive MCMC sampling is more promising in terms of compute time. Slice-sampling yields the highest number of independent samples from the target density - albeit at an almost 1000% increase in computational time compared with the most efficient algorithm (i.e., the adaptive MCMC sampler). Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
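For readers unfamiliar with the baseline sampler in this comparison, a minimal random-walk Metropolis implementation is sketched below. It is purely illustrative, is not the DCM inversion code, and the step size is an arbitrary choice.

import numpy as np

def random_walk_metropolis(log_target, x0, n_samples, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    logp = log_target(x)
    chain = np.empty((n_samples, x.size))
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal(x.size)
        logp_prop = log_target(proposal)
        # accept with probability min(1, p(proposal)/p(x))
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = proposal, logp_prop
        chain[i] = x
    return chain

# usage: sample a 2-D standard normal target
chain = random_walk_metropolis(lambda z: -0.5 * np.sum(z**2), x0=np.zeros(2), n_samples=2000)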
NASA Technical Reports Server (NTRS)
Marks, Daniel L. (Inventor); Boppart, Stephen A. (Inventor)
2009-01-01
A method of examining a sample comprises exposing the sample to a pump pulse of electromagnetic radiation for a first period of time, exposing the sample to a stimulant pulse of electromagnetic radiation for a second period of time which overlaps in time with at least a portion of the first exposing, to produce a signal pulse of electromagnetic radiation for a third period of time, and interfering the signal pulse with a reference pulse of electromagnetic radiation, to determine which portions of the signal pulse were produced during the exposing of the sample to the stimulant pulse. The first and third periods of time are each greater than the second period of time.
An ultrasonic-accelerated oxidation method for determining the oxidative stability of biodiesel.
Avila Orozco, Francisco D; Sousa, Antonio C; Domini, Claudia E; Ugulino Araujo, Mario Cesar; Fernández Band, Beatriz S
2013-05-01
Biodiesel is considered an alternative energy source because it is produced from fats and vegetable oils by means of transesterification. Furthermore, it consists of fatty acid alkyl esters (FAAS), which have a great influence on biodiesel fuel properties and on the storage lifetime of biodiesel itself. Biodiesel storage stability is directly related to the oxidative stability parameter (induction time - IT), which is determined by means of the Rancimat® method. This method uses conductimetric monitoring and induces the degradation of FAAS by heating the sample at a constant temperature. The European Committee for Standardization established a standard (EN 14214) to determine the oxidative stability of biodiesel, which requires a minimum induction period of 6 h as tested by the Rancimat® method at 110°C. In this research, we aimed to develop a fast and simple alternative method to determine the induction time (IT) based on the ultrasonic-accelerated oxidation of FAAS. The sonodegradation of biodiesel samples was induced by means of an ultrasonic homogenizer fitted with an immersible horn at 480 W of power and 20 duty cycles. UV-Vis spectrometry was used to monitor the FAAS sonodegradation by measuring the absorbance at 270 nm every 2. Biodiesel samples from different feedstocks were studied in this work. In all cases, IT was established as the inflection point of the absorbance versus time curve. The induction time values of all biodiesel samples determined using the proposed method were in accordance with those measured by the Rancimat® reference method, with R(2) = 0.998. Copyright © 2012 Elsevier B.V. All rights reserved.
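One generic way to extract an induction time from a recorded absorbance-versus-time curve is to locate the point of maximum slope, i.e. the inflection point of a sigmoid-shaped oxidation curve. The routine below is only a numerical illustration with an arbitrary smoothing window, not the procedure validated in the paper.

import numpy as np

def induction_time(time_min, absorbance):
    t = np.asarray(time_min, dtype=float)
    a = np.asarray(absorbance, dtype=float)
    d1 = np.gradient(a, t)                                       # first derivative dA/dt
    d1_smooth = np.convolve(d1, np.ones(3) / 3.0, mode="same")   # light smoothing
    return t[int(np.argmax(d1_smooth))]                          # time of maximum slope = inflection point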
Phytoforensics—Using trees to find contamination
Wilson, Jordan L.
2017-09-28
The water we drink, air we breathe, and soil we come into contact with have the potential to adversely affect our health because of contaminants in the environment. Environmental samples can characterize the extent of potential contamination, but traditional methods for collecting water, air, and soil samples below the ground (for example, well drilling or direct-push soil sampling) are expensive and time consuming. Trees are closely connected to the subsurface and sampling tree trunks can indicate subsurface pollutants, a process called phytoforensics. Scientists at the Missouri Water Science Center were among the first to use phytoforensics to screen sites for contamination before using traditional sampling methods, to guide additional sampling, and to show the large cost savings associated with tree sampling compared to traditional methods.
Identifying High-Rate Flows Based on Sequential Sampling
NASA Astrophysics Data System (ADS)
Zhang, Yu; Fang, Binxing; Luo, Hao
We consider the problem of fast identification of high-rate flows in backbone links with possibly millions of flows. Accurate identification of high-rate flows is important for active queue management, traffic measurement and network security such as detection of distributed denial of service attacks. It is difficult to directly identify high-rate flows in backbone links because tracking the possibly millions of flows requires correspondingly large high-speed memories. To reduce the measurement overhead, the deterministic 1-out-of-k sampling technique, which is also implemented in Cisco routers (NetFlow), is adopted. Ideally, a high-rate flow identification method should have a short identification time, low memory cost and low processing cost. Most importantly, it should be able to specify the identification accuracy. We develop two such methods. The first method is based on a fixed sample size test (FSST), which is able to identify high-rate flows with user-specified identification accuracy. However, since FSST has to record every sampled flow during the measurement period, it is not memory efficient. Therefore a second, novel method based on a truncated sequential probability ratio test (TSPRT) is proposed. Through sequential sampling, TSPRT is able to remove the low-rate flows and identify the high-rate flows at an early stage, which reduces the memory cost and identification time, respectively. According to the way the parameters in TSPRT are determined, two versions of TSPRT are proposed: TSPRT-M, which is suitable when low memory cost is preferred, and TSPRT-T, which is suitable when a short identification time is preferred. The experimental results show that TSPRT requires less memory and identification time in identifying high-rate flows while satisfying the accuracy requirement, as compared to previously proposed methods.
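The sequential test at the heart of TSPRT can be illustrated with a plain Wald-style sequential probability ratio test over Bernoulli observations, here taken as whether each sampled packet belongs to the flow under test. All probabilities, error rates, and the truncation length below are hypothetical, and the memory/time tuning of TSPRT-M and TSPRT-T is not reproduced.

import math

def sprt_step(llr, n, packet_is_flow, p0=0.001, p1=0.01, alpha=0.01, beta=0.01, max_obs=1000):
    # log-likelihood-ratio update for one Bernoulli observation
    if packet_is_flow:
        llr += math.log(p1 / p0)
    else:
        llr += math.log((1.0 - p1) / (1.0 - p0))
    n += 1
    upper = math.log((1.0 - beta) / alpha)   # decide "high-rate flow"
    lower = math.log(beta / (1.0 - alpha))   # decide "low-rate flow" (entry can be evicted)
    if llr >= upper:
        return "high-rate", llr, n
    if llr <= lower or n >= max_obs:         # truncation forces a decision
        return "low-rate", llr, n
    return "undecided", llr, n

# usage: feed one observation at a time, carrying (llr, n) between calls
decision, llr, n = sprt_step(0.0, 0, packet_is_flow=True)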
Wójcik-Fatla, Angelina; Stojek, Nimfa Maria; Dutkiewicz, Jacek
2012-01-01
The aim of the present study was: - to compare methods for concentration and isolation of Legionella DNA from water; - to examine the efficacy of various modifications of the PCR test (PCR, semi-nested PCR, and real-time PCR) for the detection of known numbers of Legionella pneumophila in water samples artificially contaminated with a strain of this bacterium and in randomly selected samples of environmental water, in parallel with examination by culture. It was found that filtration is much more effective than centrifugation for the concentration of DNA in water samples, and that the QIAamp DNA Mini Kit is the most efficient for isolation of Legionella DNA from water. The semi-nested PCR and real-time PCR proved to be the most sensitive methods for detection of Legionella DNA in water samples. Both PCR modifications showed a high correlation with recovery of Legionella by culture (p<0.01), while no correlation occurred between the results of one-stage PCR and culture (p>0.1).
Anthrax Sampling and Decontamination: Technology Trade-Offs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Price, Phillip N.; Hamachi, Kristina; McWilliams, Jennifer
2008-09-12
The goal of this project was to answer the following questions concerning response to a future anthrax release (or suspected release) in a building: 1. Based on past experience, what rules of thumb can be determined concerning: (a) the amount of sampling that may be needed to determine the extent of contamination within a given building; (b) what portions of a building should be sampled; (c) the cost per square foot to decontaminate a given type of building using a given method; (d) the time required to prepare for, and perform, decontamination; (e) the effectiveness of a given decontamination method in a given type of building? 2. Based on past experience, what resources will be spent on evaluating the extent of contamination, performing decontamination, and assessing the effectiveness of the decontamination in a building of a given type and size? 3. What are the trade-offs between cost, time, and effectiveness for the various sampling plans, sampling methods, and decontamination methods that have been used in the past?
An active learning representative subset selection method using net analyte signal.
He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi
2018-05-05
To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference of Euclidean norm of net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vector, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying projection matrix with spectra of samples. Scalar value of NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and samples with the largest distance are added to selected set sequentially. Last, the concentration of the analyte is measured such that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced. Copyright © 2018 Elsevier B.V. All rights reserved.
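The selection loop described above can be sketched as follows. For brevity, the net analyte signal is approximated here by projecting each spectrum onto a regression vector estimated from the already-measured samples; this is a simplified stand-in for the full NAS projection matrix, and all names, shapes, and thresholds are illustrative.

import numpy as np

def select_representative(spectra_selected, conc_selected, spectra_candidates, n_new):
    X, y = np.asarray(spectra_selected, float), np.asarray(conc_selected, float)
    # regression vector mapping spectra to concentration (stand-in for the NAS projection)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    b_unit = b / np.linalg.norm(b)
    # NAS-like scalar = projection of each spectrum onto the analyte direction
    nas_selected = X @ b_unit
    candidates = np.asarray(spectra_candidates, float)
    nas_candidates = candidates @ b_unit
    chosen = []
    for _ in range(n_new):
        # distance of each candidate's scalar to the closest already-selected scalar
        d = np.array([np.min(np.abs(nc - nas_selected)) for nc in nas_candidates])
        d[chosen] = -np.inf                       # do not pick the same candidate twice
        idx = int(np.argmax(d))
        chosen.append(idx)
        nas_selected = np.append(nas_selected, nas_candidates[idx])
    return chosen

The candidates returned by the loop are the ones whose reference concentrations would then be measured and added to the calibration set, as described in the abstract.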
Alfano, Robert R.; Demos, Stavros G.; Zhang, Gang
2003-12-16
Method and an apparatus for examining a tissue using the spectral wing emission therefrom induced by visible to infrared photoexcitation. In one aspect, the method is used to characterize the condition of a tissue sample and comprises the steps of (a) photoexciting the tissue sample with substantially monochromatic light having a wavelength of at least 600 nm; and (b) using the resultant far red and near infrared spectral wing emission (SW) emitted from the tissue sample to characterize the condition of the tissue sample. In one embodiment, the substantially monochromatic photoexciting light is a continuous beam of light, and the resultant steady-state far red and near infrared SW emission from the tissue sample is used to characterize the condition of the tissue sample. In another embodiment, the substantially monochromatic photoexciting light is a light pulse, and the resultant time-resolved far red and near infrared SW emission emitted from the tissue sample is used to characterize the condition of the tissue sample. In still another embodiment, the substantially monochromatic photoexciting light is a polarized light pulse, and the parallel and perpendicular components of the resultant polarized time-resolved SW emission emitted from the tissue sample are used to characterize the condition of the tissue sample.
An active learning representative subset selection method using net analyte signal
NASA Astrophysics Data System (ADS)
He, Zhonghai; Ma, Zhenhe; Luan, Jingmin; Cai, Xi
2018-05-01
To guarantee accurate predictions, representative samples are needed when building a calibration model for spectroscopic measurements. However, in general, it is not known whether a sample is representative prior to measuring its concentration, which is both time-consuming and expensive. In this paper, a method to determine whether a sample should be selected into a calibration set is presented. The selection is based on the difference of Euclidean norm of net analyte signal (NAS) vector between the candidate and existing samples. First, the concentrations and spectra of a group of samples are used to compute the projection matrix, NAS vector, and scalar values. Next, the NAS vectors of candidate samples are computed by multiplying projection matrix with spectra of samples. Scalar value of NAS is obtained by norm computation. The distance between the candidate set and the selected set is computed, and samples with the largest distance are added to selected set sequentially. Last, the concentration of the analyte is measured such that the sample can be used as a calibration sample. Using a validation test, it is shown that the presented method is more efficient than random selection. As a result, the amount of time and money spent on reference measurements is greatly reduced.
U/Th dating by SHRIMP RG ion-microprobe mass spectrometry using single ion-exchange beads
NASA Astrophysics Data System (ADS)
Bischoff, James L.; Wooden, Joe; Murphy, Fred; Williams, Ross W.
2005-04-01
We present a new analytical method for U-series isotopes using the SHRIMP RG (Sensitive High mass Resolution Ion MicroProbe) mass spectrometer that utilizes the preconcentration of the U-series isotopes from a sample onto a single ion-exchange bead. Ion-microprobe mass spectrometry is capable of producing Th ionization efficiencies in excess of 2%. Analytical precision is typically better than alpha spectroscopy, but not as good as thermal ionization mass spectroscopy (TIMS) and inductively coupled plasma multicollector mass spectrometry (ICP-MS). Like TIMS and ICP-MS, the method allows analysis of small sample sizes, but also adds the advantage of rapidity of analysis. A major advantage of ion-microprobe analysis is that U and Th isotopes are analyzed in the same bead, simplifying the process of chemical separation. Analytical time on the instrument is ˜60 min per sample, and a single instrument-loading can accommodate 15-20 samples to be analyzed in a 24-h day. An additional advantage is that the method allows multiple reanalyses of the same bead and that samples can be archived for reanalysis at a later time. Because the ion beam excavates a pit only a few μm deep, the mount can later be repolished and reanalyzed numerous times. The described method of preconcentrating a low-concentration sample onto a small conductive substrate to allow ion-microprobe mass spectrometry is potentially applicable to many other systems.
U/Th dating by SHRIMP RG ion-microprobe mass spectrometry using single ion-exchange beads
Bischoff, J.L.; Wooden, J.; Murphy, F.; Williams, Ross W.
2005-01-01
We present a new analytical method for U-series isotopes using the SHRIMP RG (Sensitive High mass Resolution Ion MicroProbe) mass spectrometer that utilizes the preconcentration of the U-series isotopes from a sample onto a single ion-exchange bead. Ion-microprobe mass spectrometry is capable of producing Th ionization efficiencies in excess of 2%. Analytical precision is typically better than alpha spectroscopy, but not as good as thermal ionization mass spectroscopy (TIMS) and inductively coupled plasma multicollector mass spectrometry (ICP-MS). Like TIMS and ICP-MS, the method allows analysis of small sample sizes, but also adds the advantage of rapidity of analysis. A major advantage of ion-microprobe analysis is that U and Th isotopes are analyzed in the same bead, simplifying the process of chemical separation. Analytical time on the instrument is ~60 min per sample, and a single instrument-loading can accommodate 15-20 samples to be analyzed in a 24-h day. An additional advantage is that the method allows multiple reanalyses of the same bead and that samples can be archived for reanalysis at a later time. Because the ion beam excavates a pit only a few μm deep, the mount can later be repolished and reanalyzed numerous times. The described method of preconcentrating a low-concentration sample onto a small conductive substrate to allow ion-microprobe mass spectrometry is potentially applicable to many other systems. Copyright © 2005 Elsevier Ltd.
European validation of Real-Time PCR method for detection of Salmonella spp. in pork meat.
Delibato, Elisabetta; Rodriguez-Lazaro, David; Gianfranceschi, Monica; De Cesare, Alessandra; Comin, Damiano; Gattuso, Antonietta; Hernandez, Marta; Sonnessa, Michele; Pasquali, Frédérique; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Prukner-Radovcic, Estella; Horvatek Tomic, Danijela; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John E; Chemaly, Marianne; Le Gall, Francoise; González-García, Patricia; Lettini, Antonia Anna; Lukac, Maja; Quesne, Segolénè; Zampieron, Claudia; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Proroga, Yolande T R; Capuano, Federico; Manfreda, Gerardo; De Medici, Dario
2014-08-01
The classical microbiological method for detection of Salmonella spp. requires more than five days for final confirmation, and consequently there is a need for an alternative methodology for detection of this pathogen, particularly in food categories with a short shelf-life. This study presents an international (European-level) ISO 16140-based validation of a non-proprietary Real-Time PCR-based method that can generate final results the day following sample analysis. It is based on an ISO-compatible enrichment coupled to an easy and inexpensive DNA extraction and a consolidated Real-Time PCR assay. Thirteen laboratories from seven European countries participated in this trial, and pork meat was selected as the food model. The limit of detection observed was down to 10 CFU per 25 g of sample, showing excellent concordance and accordance values between samples and laboratories (100%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (100%) when the results obtained with the Real-Time PCR-based method were compared to those of the ISO 6579:2002 standard method. The results of this international trial demonstrate that the evaluated Real-Time PCR-based method represents an excellent alternative to the ISO standard. In fact, it shows equally solid performance while dramatically shortening the analytical process, and it can easily be implemented routinely by the Competent Authorities and food industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
Wang, Yi; Zeng, Jie; Yin, Lixue; Zhang, Mei; Hou, Dailun
2016-01-01
The purpose of this study was to evaluate the reliability, effectiveness, and safety of modified right heart contrast transthoracic echocardiography (cTTE) in comparison with the traditional method. We performed modified right heart cTTE using saline mixed with a small sample of the patient's own blood. Samples were agitated with varying intensity. The study protocol involved microscopic analysis and patient evaluation. 1. Microscopic analysis: after two contrast samples had been agitated 10 or 20 times, they were compared for bubble size, bubble number, and red blood cell morphology. 2. Patient analysis: 40 patients with suspected RLS (right-to-left shunt) were enrolled. All patients underwent right heart contrast echocardiography. Oxygen saturation, transit time and duration, presence of RLS, and changes in indirect bilirubin and urobilinogen concentrations were compared afterward. The modified method generated more bubbles (P<0.05), but the differences in bubble size were not significant (P>0.05). Twenty-four patients were diagnosed with RLS (60%) using the modified method compared to 16 patients (40%) with the traditional method. The transit time of the ASb20 group was the shortest (P<0.05). However, the duration time in this group was much longer (P<0.05). Also, in the semi-quantitative analysis the mean rank of RLS was higher after injecting the modified contrast agent agitated 20 times (P<0.05). Modified right heart contrast echocardiography is a reliable, effective and safe method of detecting cardiovascular RLS.
Clegg, Miriam E; Shafat, Amir
2010-08-01
The (13)C octanoic acid breath test (OBT) was first developed as an alternative method to scintigraphy for measuring gastric emptying (GE). There has been much debate about the test duration and how often measurements need to be taken. This study aims to address these issues. For 78 GE tests using the (13)C OBT, the GE lag phase (T(lag)) was calculated while sampling more frequently than the recommended every 15 min. Comparisons between T(lag) values were made using Bland-Altman plots. Similarly, 4 and 6 h test durations were assessed to establish whether they yield the same GE half time (T(half)). From one volunteer, samples were taken every 1 min for the first 30 min and then every 15 min until 6 h. GE times were then calculated using different combinations of sampling times. Evidence of a visible T(lag) was also explored from these data. Findings indicated that taking samples every 5 min for the first 30 min instead of every 15 min did not change the GE T(lag) based on Bland-Altman plots. The correlation between these two methods was also high (r(2) = 0.9957). The findings showed that the difference between the two sampling durations (4 and 6 h) was large and the correlation between the methods was low (r(2) = 0.8335). Samples taken at a rate of one breath per min indicated the lack of a visible T(lag). Sampling for the (13)C OBT should be completed every 15 min for 6 h.
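The Bland-Altman comparison used above to judge agreement between sampling schedules amounts to computing the mean difference (bias) and 95% limits of agreement of paired estimates. Here is a minimal sketch; the T(lag) values in the usage line are made up for illustration.

import numpy as np

def bland_altman(measure_a, measure_b):
    a, b = np.asarray(measure_a, float), np.asarray(measure_b, float)
    diff = a - b
    bias = diff.mean()                       # mean difference between methods
    loa = 1.96 * diff.std(ddof=1)            # half-width of the 95% limits of agreement
    return bias, (bias - loa, bias + loa)

# usage: T(lag) estimates from 15-min sampling vs 5-min sampling (illustrative numbers only)
bias, limits = bland_altman([55, 62, 70, 48], [54, 63, 68, 50])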
Armstrong, Jenna L.; Dills, Russell L.; Yu, Jianbo; Yost, Michael G.; Fenske, Richard A.
2018-01-01
A rapid liquid chromatography tandem mass spectrometry (LC-MS/MS) method has been developed for determination of levels of the organophosphorus (OP) pesticides chlorpyrifos (CPF), azinphos methyl (AZM), and their oxygen analogs chlorpyrifos-oxon (CPF-O) and azinphos methyl-oxon (AZM-O) on common active air sampling matrices. XAD-2 resin and polyurethane foam (PUF) matrices were extracted with acetonitrile containing stable-isotope labeled internal standards (ISTD). Analysis was accomplished in Multiple Reaction Monitoring (MRM) mode, and analytes in unknown samples were identified by retention time (±0.1 min) and qualifier ratio (±30% absolute) as compared to the mean of calibrants. For all compounds, calibration linearity correlation coefficients were ≥0.996. Limits of detection (LOD) ranged from 0.15–1.1 ng/sample for CPF, CPF-O, AZM, and AZM-O on active sampling matrices. Spiked fortification recoveries were 78–113% from XAD-2 active air sampling tubes and 71–108% from PUF active air sampling tubes. Storage stability tests also yielded recoveries ranging from 74–94% after time periods ranging from 2–10 months. The results demonstrate that LC-MS/MS is a sensitive method for determining these compounds from two different matrices at the low concentrations that can result from spray drift and long range transport in non-target areas following agricultural applications. In an inter-laboratory comparison, the limit of quantification (LOQ) for LC-MS/MS was 100 times lower than a typical gas chromatography-mass spectrometry (GC-MS) method. PMID:24328542
Armstrong, Jenna L; Dills, Russell L; Yu, Jianbo; Yost, Michael G; Fenske, Richard A
2014-01-01
A rapid liquid chromatography tandem mass spectrometry (LC-MS/MS) method has been developed for determination of levels of the organophosphorus (OP) pesticides chlorpyrifos (CPF), azinphos methyl (AZM), and their oxygen analogs chlorpyrifos-oxon (CPF-O) and azinphos methyl-oxon (AZM-O) on common active air sampling matrices. XAD-2 resin and polyurethane foam (PUF) matrices were extracted with acetonitrile containing stable-isotope labeled internal standards (ISTD). Analysis was accomplished in Multiple Reaction Monitoring (MRM) mode, and analytes in unknown samples were identified by retention time (±0.1 min) and qualifier ratio (±30% absolute) as compared to the mean of calibrants. For all compounds, calibration linearity correlation coefficients were ≥0.996. Limits of detection (LOD) ranged from 0.15-1.1 ng/sample for CPF, CPF-O, AZM, and AZM-O on active sampling matrices. Spiked fortification recoveries were 78-113% from XAD-2 active air sampling tubes and 71-108% from PUF active air sampling tubes. Storage stability tests also yielded recoveries ranging from 74-94% after time periods ranging from 2-10 months. The results demonstrate that LC-MS/MS is a sensitive method for determining these compounds from two different matrices at the low concentrations that can result from spray drift and long range transport in non-target areas following agricultural applications. In an inter-laboratory comparison, the limit of quantification (LOQ) for LC-MS/MS was 100 times lower than a typical gas chromatography-mass spectrometry (GC-MS) method.
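The acceptance criteria quoted above (retention time within ±0.1 min and qualifier ratio within ±30% absolute of the calibrant mean) can be expressed as a simple check. The numeric example is hypothetical.

def identify_analyte(rt_obs, ratio_obs, rt_ref, ratio_ref, rt_tol=0.1, ratio_tol=30.0):
    # assign the unknown to the analyte only when both tolerances are satisfied
    rt_ok = abs(rt_obs - rt_ref) <= rt_tol
    ratio_ok = abs(ratio_obs - ratio_ref) <= ratio_tol
    return rt_ok and ratio_ok

# usage with illustrative (hypothetical) values
match = identify_analyte(rt_obs=6.32, ratio_obs=41.0, rt_ref=6.28, ratio_ref=38.5)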
Sobhi, Hamid Reza; Yamini, Yadollah; Esrafili, Ali; Abadi, Reza Haji Hosseini Baghdad
2008-07-04
A simple, rapid and efficient microextraction method for the extraction and determination of some fat-soluble vitamins (A, D2, D3) in aqueous samples was developed. For the first time, orthogonal array designs (OADs) were employed to screen the liquid-phase microextraction (LPME) method, in which a few microliters of 1-undecanol were delivered to the surface of the aqueous sample, which was agitated for a selected time. The sample vial was then cooled by inserting it into an ice bath for 5 min. The solidified solvent was transferred into a suitable vial and immediately melted. The extract was then directly injected into a high-performance liquid chromatograph (HPLC) for analysis. Several factors affecting the microextraction efficiency, such as sample solution temperature, stirring speed, volume of the organic solvent, ionic strength and extraction time, were investigated and screened using an OA16 (4(5)) matrix. Under the best conditions (temperature, 55 °C; stirring speed, 1000 rpm; volume of extracting solvent, 15.0 µL; no salt addition; and extraction time, 60 min), detection limits of the method were in the range of 1.0-3.5 µg L(-1). The relative standard deviations (RSDs) for determination of the vitamins at µg L(-1) levels by the proposed method varied in the range of 5.1-10.7%. Dynamic linear ranges of 5-500 µg L(-1) with good correlation coefficients (0.9984
Hensman, James; Lawrence, Neil D; Rattray, Magnus
2013-08-20
Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering. The method can impute data that are missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in Python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
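The hierarchical construction can be sketched directly: each observation is a shared gene-level function plus a replicate-specific deviation, so the joint covariance adds the replicate kernel only for pairs of points from the same replicate, and the time points may be irregular and differ between replicates. The sketch below uses plain NumPy with squared-exponential kernels and hypothetical hyperparameters; it illustrates the covariance structure, not the authors' implementation:

```python
import numpy as np

def rbf(t1, t2, variance, lengthscale):
    """Squared-exponential kernel between two vectors of time points."""
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Irregular sampling times for two replicates of one gene (hours, hypothetical).
times = [np.array([0.0, 1.0, 2.5, 6.0]), np.array([0.5, 2.0, 4.0, 8.0, 12.0])]
rep_id = np.concatenate([np.full(len(t), i) for i, t in enumerate(times)])
t_all = np.concatenate(times)

# Hierarchical covariance: gene-level term for all pairs, replicate-level term
# only within the same replicate, plus observation noise on the diagonal.
K = rbf(t_all, t_all, variance=1.0, lengthscale=3.0)                 # shared gene function
K += rbf(t_all, t_all, 0.3, 1.5) * (rep_id[:, None] == rep_id[None, :])
K += 0.05 * np.eye(len(t_all))                                        # noise

# Draw a few joint samples of the replicated time series from the prior.
rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(len(t_all)), K, size=3)
print(samples.shape)   # (3, 9)
```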
Vital, Pierangeli G; Van Ha, Nguyen Thi; Tuyet, Le Thi Hong; Widmer, Kenneth W
2017-02-01
Surface water samples in Vietnam were collected from the Saigon River, rural and suburban canals, and urban runoff canals in Ho Chi Minh City, Vietnam, and were processed to enumerate Escherichia coli. Quantification was done through membrane filtration and quantitative real-time polymerase chain reaction (PCR). Mean log colony-forming unit (CFU)/100 ml E. coli counts in the dry season for river/suburban canals and urban canals were log 2.8 and 3.7, respectively, using the membrane filtration method, while using TaqMan quantitative real-time PCR they were log 2.4 and 2.8 for river/suburban canals and urban canals, respectively. For the wet season, counts determined by the membrane filtration method in river/suburban canal and urban canal samples had means of log 3.7 and 4.1, respectively, while mean log CFU/100 ml counts determined by quantitative PCR were log 3 and 2, respectively. Additionally, for the wet season, the quantitative PCR counts in urban canal samples were significantly lower than those determined by conventional culture methods. These results show that while quantitative real-time PCR can be used to determine levels of fecal indicator bacteria in surface waters, there are some limitations to its application and it may be impacted by sources of runoff based on surveyed samples.
Surge Block Method for Controlling Well Clogging and Sampling Sediment during Bioremediation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Wei-min; Watson, David B; Luo, Jian
2013-01-01
A surge block treatment method (i.e. inserting a solid rod plunger with a flat seal that closely fits the casing interior into a well and stroking it up and down) was performed for the rehabilitation of wells clogged with biomass and for the collection of time series sediment samples during in situ bioremediation tests for U(VI) immobilization at the U.S. Department of Energy site in Oak Ridge, TN. The clogging caused by biomass growth was controlled by routine surge block treatment applied 18 times over a nearly four year test period. The treatment frequency was dependent on the dosage of electron donor injection and the microbial community developed in the subsurface. Hydraulic tests showed that the apparent aquifer transmissivity at a clogged well with an inner diameter (ID) of 10.16 cm was increased by 8-13 times after the rehabilitation, indicating the effectiveness of the rehabilitation. Simultaneously with the rehabilitation, the surge block method was successfully used for collecting time series sediment samples composed of fine particles (clay and silt) from wells with ID 1.9-10.16 cm for the analysis of mineralogical and geochemical composition and microbial community during the same period. Our results demonstrated that the surge block method provided a cost-effective approach for both well rehabilitation and frequent solid sampling at the same location.
do Nascimento, Cássio; Muller, Katia; Sato, Sandra; Albuquerque Junior, Rubens Ferreira
2012-04-01
Long-term sample storage can affect the intensity of the hybridization signals provided by molecular diagnostic methods that use chemiluminescent detection. The aim of this study was to evaluate the effect of different storage times on the hybridization signals of 13 bacterial species detected by the Checkerboard DNA-DNA hybridization method using whole-genomic DNA probes. Ninety-six subgingival biofilm samples were collected from 36 healthy subjects, and the intensity of hybridization signals was evaluated at 4 different time periods: (1) immediately after collection (n = 24), (2) after storage at -20 °C for 6 months (n = 24), (3) for 12 months (n = 24), and (4) for 24 months (n = 24). The intensity of hybridization signals obtained from groups 1 and 2 was significantly higher than that in the other groups (p < 0.001). No differences were found between groups 1 and 2 (p > 0.05). The Checkerboard DNA-DNA hybridization method was suitable to detect hybridization signals from all groups evaluated, and the intensity of signals decreased significantly after long periods of sample storage.
Zittermann, Sandra I; Stanghini, Brenda; See, Ryan Soo; Melano, Roberto G; Boleszczuk, Peter; Murphy, Allana; Maki, Anne; Mallo, Gustavo V
2016-01-01
Detection of Listeria monocytogenes in food is currently based on enrichment methods. When L. monocytogenes is present with other Listeria species in food, the species compete during the enrichment process. Overgrowth competition of the nonpathogenic Listeria species might result in false-negative results obtained with the current reference methods. This potential issue was noted when 50 food samples artificially spiked with L. monocytogenes were tested with a real-time PCR assay and Canada's current reference method, MFHPB-30. Eleven of the samples studied were from foods naturally contaminated with Listeria species other than those used for spiking. The real-time PCR assay detected L. monocytogenes in all 11 of these samples; however, only 6 of these samples were positive by the MFHPB-30 method. To determine whether L. monocytogenes detection can be affected by other species of the same genus due to competition, an L. monocytogenes strain and a Listeria innocua strain with a faster rate of growth in the enrichment broth were artificially coinoculated at different ratios into ground pork meat samples and cultured according to the MFHPB-30 method. L. monocytogenes was detected only by the MFHPB-30 method when L. monocytogenes/L. innocua ratios were 6.0 or higher. In contrast, using the same enrichments, the real-time PCR assay detected L. monocytogenes at ratios as low as 0.6. Taken together, these findings support the hypothesis that L. monocytogenes can be outcompeted by L. innocua during the MFHPB-30 enrichment phase. However, more reliable detection of L. monocytogenes in this situation can be achieved by a PCR-based method mainly because of its sensitivity.
Montagna, Maria Teresa; De Giglio, Osvalda; Cristina, Maria Luisa; Napoli, Christian; Pacifico, Claudia; Agodi, Antonella; Baldovin, Tatjana; Casini, Beatrice; Coniglio, Maria Anna; D'Errico, Marcello Mario; Delia, Santi Antonino; Deriu, Maria Grazia; Guida, Marco; Laganà, Pasqualina; Liguori, Giorgio; Moro, Matteo; Mura, Ida; Pennino, Francesca; Privitera, Gaetano; Romano Spica, Vincenzo; Sembeni, Silvia; Spagnolo, Anna Maria; Tardivo, Stefano; Torre, Ida; Valeriani, Federica; Albertini, Roberto; Pasquarella, Cesira
2017-06-22
Healthcare facilities (HF) represent an at-risk environment for legionellosis transmission occurring after inhalation of contaminated aerosols. In general, the control of water is preferred to that of air because, to date, there are no standardized sampling protocols. Legionella air contamination was investigated in the bathrooms of 11 HF by active sampling (Surface Air System and Coriolis ® μ) and passive sampling using settling plates. During the 8-hour sampling, hot tap water was sampled three times. All air samples were evaluated using culture-based methods, whereas liquid samples collected using the Coriolis ® μ were also analyzed by real-time PCR. Legionella presence in the air and water was then compared by sequence-based typing (SBT) methods. Air contamination was found in four HF (36.4%) by at least one of the culturable methods. The culturable investigation by Coriolis ® μ did not yield Legionella in any enrolled HF. However, molecular investigation using Coriolis ® μ resulted in eight HF testing positive for Legionella in the air. Comparison of Legionella air and water contamination indicated that Legionella water concentration could be predictive of its presence in the air. Furthermore, a molecular study of 12 L. pneumophila strains confirmed a match between the Legionella strains from air and water samples by SBT for three out of four HF that tested positive for Legionella by at least one of the culturable methods. Overall, our study shows that Legionella air detection cannot replace water sampling because the absence of microorganisms from the air does not necessarily represent their absence from water; nevertheless, air sampling may provide useful information for risk assessment. The liquid impingement technique appears to have the greatest capacity for collecting airborne Legionella if combined with molecular investigations.
Murphy, Helen R; Lee, Seulgi; da Silva, Alexandre J
2017-07-01
Cyclospora cayetanensis is a protozoan parasite that causes human diarrheal disease associated with the consumption of fresh produce or water contaminated with C. cayetanensis oocysts. In the United States, foodborne outbreaks of cyclosporiasis have been linked to various types of imported fresh produce, including cilantro and raspberries. An improved method was developed for identification of C. cayetanensis in produce at the U.S. Food and Drug Administration. The method relies on a 0.1% Alconox produce wash solution for efficient recovery of oocysts, a commercial kit for DNA template preparation, and an optimized TaqMan real-time PCR assay with an internal amplification control for molecular detection of the parasite. A single laboratory validation study was performed to assess the method's performance and compare the optimized TaqMan real-time PCR assay and a reference nested PCR assay by examining 128 samples. The samples consisted of 25 g of cilantro or 50 g of raspberries seeded with 0, 5, 10, or 200 C. cayetanensis oocysts. Detection rates for cilantro seeded with 5 and 10 oocysts were 50.0 and 87.5%, respectively, with the real-time PCR assay and 43.7 and 94.8%, respectively, with the nested PCR assay. Detection rates for raspberries seeded with 5 and 10 oocysts were 25.0 and 75.0%, respectively, with the real-time PCR assay and 18.8 and 68.8%, respectively, with the nested PCR assay. All unseeded samples were negative, and all samples seeded with 200 oocysts were positive. Detection rates using the two PCR methods were statistically similar, but the real-time PCR assay is less laborious and less prone to amplicon contamination and allows monitoring of amplification and analysis of results, making it more attractive to diagnostic testing laboratories. The improved sample preparation steps and the TaqMan real-time PCR assay provide a robust, streamlined, and rapid analytical procedure for surveillance, outbreak response, and regulatory testing of foods for detection of C. cayetanensis.
NASA Astrophysics Data System (ADS)
Shupp, Aaron M.; Rodier, Dan; Rowley, Steven
2007-03-01
Monitoring and controlling Airborne Molecular Contamination (AMC) has become essential in deep ultraviolet (DUV) photolithography for both optimizing yields and protecting tool optics. A variety of technologies have been employed for both real-time and grab-sample monitoring. Real-time monitoring has the advantage of quickly identifying "spikes" and upset conditions, while 2- to 24-hour (or longer) grab sampling allows for extremely low detection limits by concentrating the mass of the target contaminant over a period of time. Employing a combination of both monitoring techniques affords the highest degree of control, lowest detection limits, and the most detailed data possible in terms of speciation. As happens with many technologies, there can be concern regarding the accuracy and agreement between real-time and grab-sample methods. This study utilizes side-by-side comparisons of two different real-time monitors operating in parallel with both liquid impingers and dry sorbent tubes to measure NIST traceable gas standards as well as real world samples. By measuring in parallel, a truly valid comparison is made between methods while verifying the results against a certified standard. The final outcome of this investigation is that a dry sorbent tube grab-sample technique produced results that agreed, in terms of accuracy, with NIST traceable standards as well as with the two real-time techniques, Ion Mobility Spectrometry (IMS) and Pulsed Fluorescence Detection (PFD), while a traditional liquid impinger technique showed discrepancies.
Insights from two industrial hygiene pilot e-cigarette passive vaping studies.
Maloney, John C; Thompson, Michael K; Oldham, Michael J; Stiff, Charles L; Lilly, Patrick D; Patskan, George J; Shafer, Kenneth H; Sarkar, Mohamadi A
2016-01-01
While several reports have been published using research methods of estimating exposure risk to e-cigarette vapors in nonusers, only two have directly measured indoor air concentrations from vaping using validated industrial hygiene sampling methodology. Our first study was designed to measure indoor air concentrations of nicotine, menthol, propylene glycol, glycerol, and total particulates during the use of multiple e-cigarettes in a well-characterized room over a period of time. Our second study was a repeat of the first study, and it also evaluated levels of formaldehyde. Measurements were collected using active sampling, near real-time and direct measurement techniques. Air sampling incorporated industrial hygiene sampling methodology using analytical methods established by the National Institute of Occupational Safety and Health and the Occupational Safety and Health Administration. Active samples were collected over a 12-hr period, for 4 days. Background measurements were taken in the same room the day before and the day after vaping. Panelists (n = 185 Study 1; n = 145 Study 2) used menthol and non-menthol MarkTen prototype e-cigarettes. Vaping sessions (six, 1-hr) included 3 prototypes, with total number of puffs ranging from 36-216 per session. Results of the active samples were below the limit of quantitation of the analytical methods. Near real-time data were below the lowest concentration on the established calibration curves. Data from this study indicate that the majority of chemical constituents sampled were below quantifiable levels. Formaldehyde was detected at consistent levels during all sampling periods. These two studies found that indoor vaping of MarkTen prototype e-cigarette does not produce chemical constituents at quantifiable levels or background levels using standard industrial hygiene collection techniques and analytical methods.
Sparse magnetic resonance imaging reconstruction using the bregman iteration
NASA Astrophysics Data System (ADS)
Lee, Dong-Hoon; Hong, Cheol-Pyo; Lee, Man-Woo
2013-01-01
Magnetic resonance imaging (MRI) reconstruction needs many samples that are sequentially acquired using phase encoding gradients, which directly determines the scan time and makes it long. Therefore, many researchers have studied ways to reduce the scan time, especially compressed sensing (CS), which is used for sparse images and for reconstruction from fewer sampled datasets when the k-space is not fully sampled. Recently, an iterative technique based on the Bregman method was developed for denoising. The Bregman iteration method improves on total variation (TV) regularization by gradually recovering the fine-scale structures that are usually lost in TV regularization. In this study, we investigated sparse sampling image reconstruction using the Bregman iteration for a low-field MRI system to improve its temporal resolution and to validate its usefulness. Images were obtained with a 0.32 T MRI scanner (Magfinder II, SCIMEDIX, Korea) with a phantom and an in-vivo human brain in a head coil. We applied random k-space sampling, and we determined the sampling ratios by using half the fully sampled k-space. The Bregman iteration was used to generate the final images based on the reduced data. We also calculated root-mean-square-error (RMSE) values from error images that were obtained using various numbers of Bregman iterations. Our reconstructed images using the Bregman iteration for sparsely sampled data showed good results compared with the original images. Moreover, the RMSE values showed that the sparsely reconstructed phantom and human images converged to the original images. We confirmed the feasibility of sparse sampling image reconstruction using the Bregman iteration with a low-field MRI system and obtained good results. Although our results used half the sampling ratio, this method will be helpful in increasing the temporal resolution of low-field MRI systems.
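The ingredients of such an experiment, random k-space undersampling, an iterative reconstruction that alternates a sparsity-promoting step with data consistency at the measured k-space locations, and an RMSE comparison against the fully sampled image, can be sketched compactly. The loop below uses simple soft-thresholding as a stand-in for the TV/Bregman update, so it is an illustrative simplification rather than the Bregman solver used in the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simple piecewise-constant phantom (sparse in the image domain).
img = np.zeros((64, 64))
img[16:48, 16:48] = 1.0
img[24:40, 24:40] = 2.0

# Random k-space mask keeping roughly half of the samples.
full_k = np.fft.fft2(img)
mask = rng.random(img.shape) < 0.5
under_k = full_k * mask

recon = np.fft.ifft2(under_k).real           # zero-filled starting point
for _ in range(50):
    # promote sparsity with soft-thresholding (stand-in for the TV/Bregman step)
    recon = np.sign(recon) * np.maximum(np.abs(recon) - 0.02, 0.0)
    # re-impose data consistency at the measured k-space locations
    k = np.fft.fft2(recon)
    k[mask] = under_k[mask]
    recon = np.fft.ifft2(k).real

rmse = np.sqrt(np.mean((recon - img) ** 2))
print(f"RMSE of reconstruction vs. fully sampled image: {rmse:.4f}")
```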
Rackauckas, Christopher; Nie, Qing
2017-01-01
Adaptive time-stepping with high-order embedded Runge-Kutta pairs and rejection sampling provides efficient approaches for solving differential equations. While many such methods exist for solving deterministic systems, little progress has been made for stochastic variants. One challenge in developing adaptive methods for stochastic differential equations (SDEs) is the construction of embedded schemes with direct error estimates. We present a new class of embedded stochastic Runge-Kutta (SRK) methods with strong order 1.5 which have a natural embedding of strong order 1.0 methods. This allows for the derivation of an error estimate which requires no additional function evaluations. Next we derive a general method to reject the time steps without losing information about the future Brownian path termed Rejection Sampling with Memory (RSwM). This method utilizes a stack data structure to do rejection sampling, costing only a few floating point calculations. We show numerically that the methods generate statistically-correct and tolerance-controlled solutions. Lastly, we show that this form of adaptivity can be applied to systems of equations, and demonstrate that it solves a stiff biological model 12.28x faster than common fixed timestep algorithms. Our approach only requires the solution to a bridging problem and thus lends itself to natural generalizations beyond SDEs.
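The core idea of rejection sampling with memory can be illustrated with a scalar Euler-Maruyama stepper: when a step is rejected, the already-generated Brownian increment is split with a Brownian bridge, the unused second half is pushed onto a stack, and the first half is retried, so no part of the sampled path is ever discarded. The sketch below uses a crude full-step-versus-two-half-steps error estimate instead of the embedded SRK pair from the paper, and the drift, diffusion, and tolerance values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(u): return 1.5 * u        # drift of a hypothetical scalar SDE
def g(u): return 0.3 * u        # diffusion

def attempt(u, dt, dW):
    """Euler-Maruyama over one step and over two half steps on the same Brownian path,
    returning the finer solution, a crude error estimate, and the bridged first half of dW."""
    full = u + f(u) * dt + g(u) * dW
    dW1 = 0.5 * dW + 0.5 * np.sqrt(dt) * rng.standard_normal()   # Brownian bridge midpoint
    mid = u + f(u) * (dt / 2) + g(u) * dW1
    fine = mid + f(mid) * (dt / 2) + g(mid) * (dW - dW1)
    return fine, abs(full - fine), dW1

T, tol = 1.0, 1e-3
t, u, dt = 0.0, 1.0, 0.1
stack = []                       # "memory": pieces of the Brownian path not yet consumed
while t < T - 1e-12:
    if stack:
        dt, dW = stack.pop()     # reuse a previously generated increment
    else:
        dt = min(dt, T - t)
        dW = np.sqrt(dt) * rng.standard_normal()
    u_new, err, dW1 = attempt(u, dt, dW)
    if err <= tol:
        t, u = t + dt, u_new     # accept the step
        dt *= 1.5                # cautiously grow the step for the next fresh increment
    else:
        # reject: split the increment, keep the second half for later, retry the first half
        stack.append((dt / 2, dW - dW1))
        stack.append((dt / 2, dW1))

print(f"u(T) ~ {u:.4f} after adaptive stepping with path memory")
```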
do Nascimento, Cássio; dos Santos, Janine Navarro; Pedrazzi, Vinícius; Pita, Murillo Sucena; Monesi, Nadia; Ribeiro, Ricardo Faria; de Albuquerque, Rubens Ferreira
2014-01-01
Molecular diagnostic methods have been largely used in epidemiological or clinical studies to detect and quantify microbial species that may colonize the oral cavity in health or disease. The preservation of genetic material from samples remains the major challenge to ensure the feasibility of these methodologies. Long-term storage may compromise the final result. The aim of this study was to evaluate the effect of storage temperature and time on the microbial detection of oral samples by Checkerboard DNA-DNA hybridization. Saliva and supragingival biofilm were taken from 10 healthy subjects, aliquoted (n=364) and processed according to the proposed protocols: immediate processing, and processing after 2 or 4 weeks, and 6 or 12 months of storage at 4°C, -20°C and -80°C. Both total and individual microbial counts were lower for samples processed after 12 months of storage, irrespective of the temperatures tested. Samples stored up to 6 months at cold temperatures showed counts similar to those immediately processed. The microbial incidence was also significantly reduced in samples stored for 12 months at all temperatures. Temperature and storage time of oral samples have a relevant impact on the detection and quantification of bacterial and fungal species by the Checkerboard DNA-DNA hybridization method. Samples should be processed immediately after collection or within 6 months if conserved at cold temperatures to avoid false-negative results. Copyright © 2013 Elsevier Ltd. All rights reserved.
Beknazarova, Meruyert; Millsteed, Shelby; Robertson, Gemma; Whiley, Harriet; Ross, Kirstin
2017-06-09
Strongyloides stercoralis is a gastrointestinal parasitic nematode with a life cycle that includes free-living and parasitic forms. For both clinical (diagnostic) and environmental evaluation, it is important that we can detect Strongyloides spp. in both human and non-human fecal samples. Real-time PCR is the most feasible method for detecting the parasite in both clinical and environmental samples that have been preserved. However, one of the biggest challenges with PCR detection is DNA degradation during transit from rural and remote areas to the laboratory. This study included a laboratory assessment and field validation of DESS (dimethyl sulfoxide, disodium EDTA, and saturated NaCl) preservation of Strongyloides spp. DNA in fecal samples. The laboratory study investigated the capacity of 1:1 and 1:3 sample to DESS ratios to preserve Strongyloides ratti in spiked canine feces. It was found that both ratios of DESS significantly prevented DNA degradation compared to the untreated sample. This method was then validated by applying it to field-collected canine feces and detecting Strongyloides DNA using PCR. A total of 37 canine feces samples were collected and preserved at the 1:3 ratio (sample:DESS), and of these, 17 were positive for Strongyloides spp. The study shows that both 1:1 and 1:3 sample to DESS ratios were able to preserve Strongyloides spp. DNA in canine feces samples stored at room temperature for up to 56 days. This DESS preservation method presents the most applicable and feasible method for Strongyloides DNA preservation in field-collected feces.
Assays for therapeutic drug monitoring of β-lactam antibiotics: A structured review.
Carlier, Mieke; Stove, Veronique; Wallis, Steven C; De Waele, Jan J; Verstraete, Alain G; Lipman, Jeffrey; Roberts, Jason A
2015-10-01
In some patient groups, including critically ill patients, the pharmacokinetics of β-lactam antibiotics may be profoundly disturbed due to pathophysiological changes in distribution and elimination. Therapeutic drug monitoring (TDM) is a strategy that may help to optimise dosing. The aim of this review was to identify and analyse the published literature on the methods used for β-lactam quantification in TDM programmes. Sixteen reports described methods for the simultaneous determination of three or more β-lactam antibiotics in plasma/serum. Measurement of these antibiotics, due to low frequency of usage relative to some other tests, is generally limited to in-house chromatographic methods coupled to ultraviolet or mass spectrometric detection. Although many published methods state they are fit for TDM, they are inconvenient because of intensive sample preparation and/or long run times. Ideally, methods used for routine TDM should have a short turnaround time (fast run-time and fast sample preparation), a low limit of quantification and a sufficiently high upper limit of quantification. The published assays included a median of 6 analytes [interquartile range (IQR) 4-10], with meropenem and piperacillin being the most frequently measured β-lactam antibiotics. The median run time was 8 min (IQR 5.9-21.3 min). There is also a growing number of methods measuring free concentrations. An assay that measures antibiotics without any sample preparation would be the next step towards real-time monitoring; no such method is currently available. Copyright © 2015 Elsevier B.V. and the International Society of Chemotherapy. All rights reserved.
NASA Astrophysics Data System (ADS)
Romero, Rodrigo; Sienra, Rosario; Richter, Pablo
A rapid analytical approach for the determination of polycyclic aromatic hydrocarbons (PAHs) present in real samples of particulate matter (PM10 filters) was investigated, based on the use of water under subcritical conditions and subsequent determination by GC-MS (SIM). The method avoids the use of large volumes of organic solvents such as dichloromethane, toluene or other unhealthy liquid organic mixtures that are normally used in time-consuming conventional sample preparation methods. By using leaching times <1 h, the method allows determination of PAHs in the range of ng/m3 (detection limits between 0.05 and 0.2 ng/m3 for 1458 m3 of sampled air) with a precision, expressed as RSD, between 5.6% and 11.2%. The main idea behind this approach is to raise the temperature and pressure of water inside a miniaturized laboratory-made extraction unit and thereby decrease its dielectric constant from 80 to nearly 20. This effect increases the solubility of low-polarity hydrocarbons such as PAHs. In this way, an extraction step of a few minutes can be sufficient for a quantitative extraction of airborne particles collected on high-volume PM10 samplers. Parameters such as extraction flow, static or dynamic extraction times and water volume were optimized by using a standard reference material. Technical details are given, and a comparison using real samples is made between the conventional Soxhlet extraction method and the proposed approach. The proposed approach can be used as a quantitative method to characterize low-molecular-weight PAHs and simultaneously as a screening method for high-molecular-weight PAHs, because the recoveries are not quantitative for molecular weights over 202. In the specific case of the Santiago metropolitan area, due to the frequent occurrence of particulate matter during high pollution episodes, this approach was applied as an efficient short-time screening method for urban PAHs. Application of this screening method is recommended especially during the winter, when atmospheric and meteorological conditions in the area deteriorate markedly.
Fachmann, M S R; Löfström, C; Hoorfar, J; Hansen, F; Christensen, J; Mansdal, S; Josefsen, M H
2017-03-01
Salmonella is recognized as one of the most important foodborne bacteria and has wide health and socioeconomic impacts worldwide. Fresh pork meat is one of the main sources of Salmonella, and efficient and fast methods for detection are therefore necessary. Current methods for Salmonella detection in fresh meat usually include >16 h of culture enrichment, in a few cases <12 h, thus requiring at least two working shifts. Here, we report a rapid (<5 h) and high-throughput method for screening of Salmonella in samples from fresh pork meat, consisting of a 3-h enrichment in standard buffered peptone water and a real-time PCR-compatible sample preparation method based on filtration, centrifugation, and enzymatic digestion, followed by fast-cycling real-time PCR detection. The method was validated in an unpaired comparative study against the Nordic Committee on Food Analysis (NMKL) reference culture method 187. Pork meat samples (n = 140) were either artificially contaminated with Salmonella at 0, 1 to 10, or 10 to 100 CFU/25 g of meat or naturally contaminated. Cohen's kappa for the degree of agreement between the rapid method and the reference was 0.64, and the relative accuracy, sensitivity, and specificity for the rapid method were 81.4, 95.1, and 97.9%, respectively. The 50% limits of detection (LOD50s) were 8.8 CFU/25 g for the rapid method and 7.7 CFU/25 g for the reference method. Implementation of this method will enable faster release of Salmonella low-risk meat, providing savings for meat producers, and it will help contribute to improved food safety. IMPORTANCE While the cost of analysis and hands-on time of the presented rapid method were comparable to those of reference culture methods, the fast product release by this method can provide the meat industry with a competitive advantage. Not only will the abattoirs save costs for work hours and cold storage, but consumers and retailers will also benefit from fresher meat with a longer shelf life. Furthermore, the presented sample preparation might be adjusted for application in the detection of other pathogenic bacteria in different sample types. Copyright © 2017 American Society for Microbiology.
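The agreement statistics reported above (Cohen's kappa, relative accuracy, sensitivity, specificity) all follow from the 2x2 table of sample-level results for the two methods. A minimal sketch with hypothetical counts, since the raw table is not reproduced here:

```python
# Agreement between a rapid PCR method and a reference culture method,
# summarised as a 2x2 table (hypothetical counts, not the study's raw data).
both_pos, rapid_only, ref_only, both_neg = 58, 2, 9, 71
n = both_pos + rapid_only + ref_only + both_neg

sensitivity = both_pos / (both_pos + ref_only)      # relative to reference positives
specificity = both_neg / (both_neg + rapid_only)    # relative to reference negatives
accuracy = (both_pos + both_neg) / n                # relative accuracy

# Cohen's kappa: observed agreement corrected for chance agreement.
p_obs = (both_pos + both_neg) / n
p_rapid_pos = (both_pos + rapid_only) / n
p_ref_pos = (both_pos + ref_only) / n
p_exp = p_rapid_pos * p_ref_pos + (1 - p_rapid_pos) * (1 - p_ref_pos)
kappa = (p_obs - p_exp) / (1 - p_exp)

print(f"sensitivity={sensitivity:.3f} specificity={specificity:.3f} "
      f"accuracy={accuracy:.3f} kappa={kappa:.2f}")
```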
Sampling and sample processing in pesticide residue analysis.
Lehotay, Steven J; Cook, Jo Marie
2015-05-13
Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.
Use of Passive Diffusion Samplers for Monitoring Volatile Organic Compounds in Ground Water
Harte, Philip T.; Brayton, Michael J.; Ives, Wayne
2000-01-01
Passive diffusion samplers have been tested at a number of sites where volatile organic compounds (VOC's) are the principal contaminants in ground water. Test results generally show good agreement between concentrations of VOC's in samples collected with diffusion samplers and concentrations in samples collected by purging the water from a well. Diffusion samplers offer several advantages over conventional and low-flow ground-water sampling procedures: * Elimination of the need to purge a well before collecting a sample and to dispose of contaminated water. * Elimination of cross-contamination of samples associated with sampling with non-dedicated pumps or sample delivery tubes. * Reduction in sampling time by as much as 80 percent of that required for 'purge type' sampling methods. * An increase in the frequency and spatial coverage of monitoring at a site because of the associated savings in time and money. The successful use of diffusion samplers depends on the following three primary factors: (1) understanding site conditions and contaminants of interest (defining sample objectives), (2) validating of results of diffusion samplers against more widely acknowledged sampling methods, and (3) applying diffusion samplers in the field.
Shu, Xu; Schaubel, Douglas E
2016-06-01
Times between successive events (i.e., gap times) are of great importance in survival analysis. Although many methods exist for estimating covariate effects on gap times, very few existing methods allow for comparisons between gap times themselves. Motivated by the comparison of primary and repeat transplantation, our interest is specifically in contrasting the gap time survival functions and their integration (restricted mean gap time). Two major challenges in gap time analysis are non-identifiability of the marginal distributions and the existence of dependent censoring (for all but the first gap time). We use Cox regression to estimate the (conditional) survival distributions of each gap time (given the previous gap times). Combining fitted survival functions based on those models, along with multiple imputation applied to censored gap times, we then contrast the first and second gap times with respect to average survival and restricted mean lifetime. Large-sample properties are derived, with simulation studies carried out to evaluate finite-sample performance. We apply the proposed methods to kidney transplant data obtained from a national organ transplant registry. Mean 10-year graft survival of the primary transplant is significantly greater than that of the repeat transplant, by 3.9 months (p=0.023), a result that may lack clinical importance. © 2015, The International Biometric Society.
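The restricted mean quantity being contrasted is simply the area under a survival curve up to a fixed horizon. A minimal sketch computing it from a Kaplan-Meier estimate with plain NumPy and simulated, administratively censored graft survival times; the paper's approach additionally conditions on previous gap times, fits Cox models, and multiply imputes censored gaps, none of which is shown here:

```python
import numpy as np

def km_curve(time, event):
    """Kaplan-Meier survival estimates at the distinct event times."""
    event_times = np.unique(time[event == 1])
    surv, s = [], 1.0
    for t in event_times:
        at_risk = np.sum(time >= t)
        deaths = np.sum((time == t) & (event == 1))
        s *= 1.0 - deaths / at_risk
        surv.append(s)
    return event_times, np.array(surv)

def restricted_mean(time, event, tau):
    """Restricted mean survival time: area under the KM step function on [0, tau]."""
    t, s = km_curve(time, event)
    grid = np.concatenate(([0.0], t[t < tau], [tau]))   # segment boundaries
    vals = np.concatenate(([1.0], s[t < tau]))          # survival value on each segment
    return float(np.sum(vals * np.diff(grid)))

rng = np.random.default_rng(2)
tau = 120.0                                             # 10-year horizon in months
# Hypothetical graft survival times (months), administratively censored at tau.
t1 = rng.exponential(150.0, 300); e1 = (t1 < tau).astype(int); t1 = np.minimum(t1, tau)
t2 = rng.exponential(110.0, 300); e2 = (t2 < tau).astype(int); t2 = np.minimum(t2, tau)

diff = restricted_mean(t1, e1, tau) - restricted_mean(t2, e2, tau)
print(f"difference in 10-year restricted mean survival: {diff:.1f} months")
```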
Rodríguez, Roberto A; Love, David C; Stewart, Jill R; Tajuba, Julianne; Knee, Jacqueline; Dickerson, Jerold W; Webster, Laura F; Sobsey, Mark D
2012-04-01
Methods for detection of two fecal indicator viruses, F+ and somatic coliphages, were evaluated for application to recreational marine water. Marine water samples were collected during the summer of 2007 in Southern California, United States, from transects along Avalon Beach (n=186 samples) and Doheny Beach (n=101 samples). Coliphage detection methods included EPA method 1601 - two-step enrichment (ENR), EPA method 1602 - single agar layer (SAL), and variations of ENR. Variations included comparison of two incubation times (overnight and 5-h incubation) and two final detection steps (lysis zone assay and a rapid latex agglutination assay). A greater number of samples were positive for somatic and F+ coliphages by ENR than by SAL (p<0.01). The standard ENR with overnight incubation and detection by lysis zone assay was the most sensitive method for the detection of F+ and somatic coliphages from marine water, although the method takes up to three days to obtain results. A rapid 5-h enrichment version of ENR also performed well, with more positive samples than SAL, and could be performed in roughly 24 h. Latex agglutination-based detection methods require the least amount of time to perform, although their sensitivity was lower than that of lysis zone-based detection methods. Rapid culture-based enrichment of coliphages in marine water may be possible by further optimizing culture-based methods for saline water conditions to generate higher viral titers than currently available, as well as by increasing the sensitivity of latex agglutination detection methods. Copyright © 2012 Elsevier B.V. All rights reserved.
Coherent amplification of X-ray scattering from meso-structures
Lhermitte, Julien R.; Stein, Aaron; Tian, Cheng; ...
2017-07-10
Small-angle X-ray scattering (SAXS) often includes an unwanted background, which increases the measurement time required to resolve the sample structure. This is undesirable in all experiments, and may make measurement of dynamic or radiation-sensitive samples impossible. Here, we demonstrate a new technique, applicable when the scattering signal is background-dominated, which reduces the requisite exposure time. Our method exploits coherent interference between the sample and a designed, strongly scattering 'amplifier'. A modified angular correlation function is used to extract the symmetry of the interference term; that is, the scattering arising from the interference between the amplifier and the sample. This enables reconstruction of the sample's symmetry, despite the sample scattering itself being well below the intensity of background scattering. Thus, coherent amplification is used to generate a strong scattering term (well above background), from which sample scattering is inferred. We validate this method using lithographically defined test samples.
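Angular correlation analysis of this kind can be sketched on a single ring of scattering intensity: the mean-subtracted intensity is circularly autocorrelated (via the Wiener-Khinchin relation), and the symmetry of the signal appears as power at the corresponding angular Fourier order. The example below uses a hypothetical weak 6-fold signal on a strong isotropic background and a plain autocorrelation, not the modified correlation function or amplifier design of the paper:

```python
import numpy as np

# Intensity around a ring of constant q (hypothetical): a weak 6-fold sample
# signal buried in a strong isotropic background plus noise.
phi = np.linspace(0, 2 * np.pi, 360, endpoint=False)
rng = np.random.default_rng(3)
intensity = 100.0 + 0.5 * np.cos(6 * phi) + rng.normal(0, 1.0, phi.size)

# Angular correlation C(dphi) = <I(phi) I(phi + dphi)>_phi, computed via FFT.
i0 = intensity - intensity.mean()
corr = np.fft.ifft(np.abs(np.fft.fft(i0)) ** 2).real / phi.size

# The symmetry of the signal appears as power at the corresponding Fourier order.
power = np.abs(np.fft.rfft(corr))
print("dominant angular order:", int(np.argmax(power[1:]) + 1))   # expected: 6
```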
NASA Astrophysics Data System (ADS)
Zhang, Jingdong; Zhu, Tao; Zheng, Hua; Kuang, Yang; Liu, Min; Huang, Wei
2017-04-01
The round-trip time of the light pulse limits the maximum detectable frequency response range of vibration in phase-sensitive optical time domain reflectometry (φ-OTDR). We propose a method to break the frequency response range restriction of the φ-OTDR system by modulating the light pulse interval randomly, which enables random sampling for every vibration point along a long sensing fiber. This sub-Nyquist randomized sampling method is suited for detecting sparse, wideband-frequency vibration signals. A resonance vibration signal of up to MHz frequency with dozens of frequency components and a 1.153 MHz single-frequency vibration signal are clearly identified over a sensing range of 9.6 km with a 10 kHz maximum sampling rate.
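Randomizing the pulse interval amounts to nonuniform sampling, for which a Lomb-Scargle periodogram can identify spectral lines well above the Nyquist limit implied by the mean pulse rate. A minimal sketch with a hypothetical 23 kHz vibration probed at a mean rate of about 10 kHz; this illustrates the sub-Nyquist idea only, not the φ-OTDR demodulation chain:

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(4)

# Randomized probe (pulse) times: mean rate ~10 kHz, so uniform sampling at this
# rate could only resolve vibrations below 5 kHz.
n = 1000
t = np.cumsum(rng.uniform(0.5e-4, 1.5e-4, n))          # random intervals around 100 us
f_vib = 23_000.0                                        # hypothetical 23 kHz vibration
y = np.sin(2 * np.pi * f_vib * t) + 0.3 * rng.standard_normal(n)

# Scan frequencies well beyond the mean-rate Nyquist limit.
freqs = np.linspace(1e3, 40e3, 8000)
pgram = lombscargle(t, y - y.mean(), 2 * np.pi * freqs)

print(f"detected vibration frequency: {freqs[np.argmax(pgram)] / 1e3:.2f} kHz")
```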
Code of Federal Regulations, 2014 CFR
2014-07-01
... ambient temperature and pressure and the sampling time. The mass concentrations of both PM10c and PM2.5 in... 25 hours), and the start times of the PM2.5 and PM10c samples are within 10 minutes and the stop times of the samples are also within 10 minutes (see section 10.4 of this appendix). 4.0Accuracy (bias...
Code of Federal Regulations, 2012 CFR
2012-07-01
... ambient temperature and pressure and the sampling time. The mass concentrations of both PM10c and PM2.5 in... 25 hours), and the start times of the PM2.5 and PM10c samples are within 10 minutes and the stop times of the samples are also within 10 minutes (see section 10.4 of this appendix). 4.0Accuracy (bias...
Code of Federal Regulations, 2013 CFR
2013-07-01
... ambient temperature and pressure and the sampling time. The mass concentrations of both PM10c and PM2.5 in... 25 hours), and the start times of the PM2.5 and PM10c samples are within 10 minutes and the stop times of the samples are also within 10 minutes (see section 10.4 of this appendix). 4.0Accuracy (bias...
Code of Federal Regulations, 2011 CFR
2011-07-01
... ambient temperature and pressure and the sampling time. The mass concentrations of both PM10c and PM2.5 in... 25 hours), and the start times of the PM2.5 and PM10c samples are within 10 minutes and the stop times of the samples are also within 10 minutes (see section 10.4 of this appendix). 4.0Accuracy (bias...
[Standard sample preparation method for quick determination of trace elements in plastic].
Yao, Wen-Qing; Zong, Rui-Long; Zhu, Yong-Fa
2011-08-01
A reference sample containing heavy metals at known concentrations, representative of electronic information products (plastic), was prepared by the masterbatch method; its repeatability and precision were determined, and reference sample preparation procedures were established. X-ray fluorescence (XRF) spectroscopy was used to determine the repeatability and uncertainty in the analysis of heavy metals and bromine in the sample. Working curves and measurement procedures for the reference sample were established. The results showed that the method exhibited a very good linear relationship in the 200-2000 mg x kg(-1) concentration range for Hg, Pb, Cr and Br and in the 20-200 mg x kg(-1) range for Cd, and the repeatability over six replicate analyses was good. In tests of the ICB288G and ICB288 circuit boards from the Mitsubishi Heavy Industry Company, the results agreed with the recommended values.
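A working curve in this context is a least-squares line of measured XRF intensity against the known concentrations of the reference samples, from which unknown plastics are then quantified. A minimal sketch with hypothetical Pb intensities, not the calibration data of the study:

```python
import numpy as np

# Hypothetical working curve for Pb: XRF intensities (counts/s) measured on
# masterbatch reference samples of known concentration (mg/kg).
conc = np.array([200.0, 500.0, 1000.0, 1500.0, 2000.0])
intensity = np.array([41.0, 99.0, 205.0, 301.0, 398.0])

slope, intercept = np.polyfit(conc, intensity, 1)      # least-squares working curve
r = np.corrcoef(conc, intensity)[0, 1]

# Predict the Pb concentration of an unknown plastic sample from its intensity.
unknown_intensity = 152.0
unknown_conc = (unknown_intensity - intercept) / slope
print(f"slope={slope:.4f}, r^2={r**2:.4f}, unknown ~ {unknown_conc:.0f} mg/kg")
```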
Riesgo, Ana; Pérez-Porro, Alicia R; Carmona, Susana; Leys, Sally P; Giribet, Gonzalo
2012-03-01
Transcriptome sequencing with next-generation sequencing technologies has the potential for addressing many long-standing questions about the biology of sponges. Transcriptome sequence quality depends on good cDNA libraries, which requires high-quality mRNA. Standard protocols for preserving and isolating mRNA often require optimization for unusual tissue types. Our aim was to assess the efficiency of two preservation modes, (i) flash freezing with liquid nitrogen (LN₂) and (ii) immersion in RNAlater, for the recovery of high-quality mRNA from sponge tissues. We also tested whether the long-term storage of samples at -80 °C affects the quantity and quality of mRNA. We extracted mRNA from nine sponge species and analysed the quantity and quality (A260/230 and A260/280 ratios) of mRNA according to preservation method, storage time, and taxonomy. The quantity and quality of mRNA depended significantly on the preservation method used (LN₂ outperforming RNAlater), the sponge species, and the interaction between them. When preservation was analysed in combination with either storage time or species, the quantity and A260/230 ratio were both significantly higher for LN₂-preserved samples. Interestingly, individual comparisons for each preservation method over time indicated that both methods performed equally efficiently during the first month, but RNAlater lost efficiency for storage times longer than 2 months compared with flash-frozen samples. In summary, we find that for long-term preservation of samples, flash freezing is the preferred method. If LN₂ is not available, RNAlater can be used, but mRNA extraction during the first month of storage is advised. © 2011 Blackwell Publishing Ltd.
40 CFR 60.285 - Test methods and procedures.
Code of Federal Regulations, 2013 CFR
2013-07-01
... determine compliance with the particulate matter standards in § 60.282(a) (1) and (3) as follows: (1) Method 5 shall be used to determine the particulate matter concentration. The sampling time and sample... opacity. (c) The owner or operator shall determine compliance with the particulate matter standard in § 60...
40 CFR 60.285 - Test methods and procedures.
Code of Federal Regulations, 2014 CFR
2014-07-01
... determine compliance with the particulate matter standards in § 60.282(a) (1) and (3) as follows: (1) Method 5 shall be used to determine the particulate matter concentration. The sampling time and sample... opacity. (c) The owner or operator shall determine compliance with the particulate matter standard in § 60...
40 CFR 60.285 - Test methods and procedures.
Code of Federal Regulations, 2012 CFR
2012-07-01
... determine compliance with the particulate matter standards in § 60.282(a) (1) and (3) as follows: (1) Method 5 shall be used to determine the particulate matter concentration. The sampling time and sample... opacity. (c) The owner or operator shall determine compliance with the particulate matter standard in § 60...
40 CFR 60.285 - Test methods and procedures.
Code of Federal Regulations, 2011 CFR
2011-07-01
... determine compliance with the particulate matter standards in § 60.282(a) (1) and (3) as follows: (1) Method 5 shall be used to determine the particulate matter concentration. The sampling time and sample... opacity. (c) The owner or operator shall determine compliance with the particulate matter standard in § 60...
A high performance liquid chromatography (HPLC) method was developed to quantitatively determine phenolic compounds and their isomers in aqueous samples. The HPLC method can analyze a mixture of 15 contaminants in the same analytical run with an analysis time of 25 minutes. The...
Perfluorocarbon tracer method for air-infiltration measurements
Dietz, R.N.
1982-09-23
A method of measuring air infiltration rates suitable for use in rooms of homes and buildings comprises the steps of emitting perfluorocarbons in the room to be measured, sampling the air containing the emitted perfluorocarbons over a period of time, and analyzing the samples at a laboratory or other facility.
John F. Caratti
2006-01-01
The FIREMON Density (DE) method is used to assess changes in plant species density and height for a macroplot. This method uses multiple quadrats and belt transects (transects having a width) to sample within plot variation and quantify statistically valid changes in plant species density and height over time. Herbaceous plant species are sampled with quadrats while...
Procedures are described for analysis of water samples and may be adapted for assessment of solid, particulate and liquid samples. The method uses real-time PCR assay for detecting Toxoplasma gondii DNA using gene-specific primers and probe.
Comparative evaluation of two methods of enumerating enterococci in foods: collaborative study.
Peterz, M; Steneryd, A C
1993-05-01
Two methods of enumerating enterococci in foods were compared in a collaborative study. Thirteen laboratories tested four blind duplicate samples containing different levels of enterococci and two negative control samples. Freeze-dried mixtures of bacteria were used as simulated food samples. The freeze-dried samples were reconstituted and either spread directly on the surface of Slanetz and Bartley medium (SB) and incubated at 44 °C for 48 h, or preincubated in tryptone soya agar at 37 °C for 2 h before being overlaid with SB and incubated at 37 °C for a further 46 h. The numbers of enterococci (CFU) recovered by the two methods were not significantly different except for one sample, where the 37 °C method gave a somewhat higher recovery. The 44 °C method was less time-consuming and less laborious.
Gutiérrez-Fonseca, Pablo E; Lorion, Christopher M
2014-04-01
The use of aquatic macroinvertebrates as bio-indicators in water quality studies has increased considerably over the last decade in Costa Rica, and standard biomonitoring methods have now been formulated at the national level. Nevertheless, questions remain about the effectiveness of different methods of sampling freshwater benthic assemblages, and how sampling intensity may influence biomonitoring results. In this study, we compared the results of qualitative sampling using commonly applied methods with a more intensive quantitative approach at 12 sites in small, lowland streams on the southern Caribbean slope of Costa Rica. Qualitative samples were collected following the official protocol using a strainer during a set time period and macroinvertebrates were field-picked. Quantitative sampling involved collecting ten replicate Surber samples and picking out macroinvertebrates in the laboratory with a stereomicroscope. The strainer sampling method consistently yielded fewer individuals and families than quantitative samples. As a result, site scores calculated using the Biological Monitoring Working Party-Costa Rica (BMWP-CR) biotic index often differed greatly depending on the sampling method. Site water quality classifications using the BMWP-CR index differed between the two sampling methods for 11 of the 12 sites in 2005, and for 9 of the 12 sites in 2006. Sampling intensity clearly had a strong influence on BMWP-CR index scores, as well as perceived differences between reference and impacted sites. Achieving reliable and consistent biomonitoring results for lowland Costa Rican streams may demand intensive sampling and requires careful consideration of sampling methods.
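A BMWP-type biotic index is computed by summing the tolerance scores of the macroinvertebrate families present in a sample, which is why a sampling method that recovers fewer families directly depresses the site score. The sketch below uses placeholder family scores, not the official BMWP-CR table:

```python
# Placeholder tolerance scores for a few macroinvertebrate families; the real
# BMWP-CR table assigns each family an official score (roughly 1-10).
family_scores = {
    "Leptophlebiidae": 10, "Perlidae": 10, "Hydropsychidae": 5,
    "Baetidae": 5, "Elmidae": 4, "Chironomidae": 2, "Physidae": 3,
}

def bmwp_score(families_present):
    """Sum the tolerance scores of the families detected in a sample."""
    return sum(family_scores.get(f, 0) for f in families_present)

quantitative_sample = ["Leptophlebiidae", "Perlidae", "Hydropsychidae",
                       "Baetidae", "Elmidae", "Chironomidae", "Physidae"]
strainer_sample = ["Baetidae", "Hydropsychidae", "Chironomidae"]   # fewer families recovered

print(bmwp_score(quantitative_sample))   # 39
print(bmwp_score(strainer_sample))       # 12
```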
Scheid, Anika; Nebel, Markus E
2012-07-09
Over the past years, statistical and Bayesian approaches have become increasingly appreciated to address the long-standing problem of computational RNA structure prediction. Recently, a novel probabilistic method for the prediction of RNA secondary structures from a single sequence has been studied which is based on generating statistically representative and reproducible samples of the entire ensemble of feasible structures for a particular input sequence. This method samples the possible foldings from a distribution implied by a sophisticated (traditional or length-dependent) stochastic context-free grammar (SCFG) that mirrors the standard thermodynamic model applied in modern physics-based prediction algorithms. Specifically, that grammar represents an exact probabilistic counterpart to the energy model underlying the Sfold software, which employs a sampling extension of the partition function (PF) approach to produce statistically representative subsets of the Boltzmann-weighted ensemble. Although both sampling approaches have the same worst-case time and space complexities, it has been indicated that they differ in performance (both with respect to prediction accuracy and quality of generated samples), where neither of these two competing approaches generally outperforms the other. In this work, we will consider the SCFG based approach in order to perform an analysis on how the quality of generated sample sets and the corresponding prediction accuracy changes when different degrees of disturbances are incorporated into the needed sampling probabilities. This is motivated by the fact that if the results prove to be resistant to large errors on the distinct sampling probabilities (compared to the exact ones), then it will be an indication that these probabilities do not need to be computed exactly, but it may be sufficient and more efficient to approximate them. Thus, it might then be possible to decrease the worst-case time requirements of such an SCFG based sampling method without significant accuracy losses. If, on the other hand, the quality of sampled structures can be observed to strongly react to slight disturbances, there is little hope for improving the complexity by heuristic procedures. We hence provide a reliable test for the hypothesis that a heuristic method could be implemented to improve the time scaling of RNA secondary structure prediction in the worst-case - without sacrificing much of the accuracy of the results. Our experiments indicate that absolute errors generally lead to the generation of useless sample sets, whereas relative errors seem to have only small negative impact on both the predictive accuracy and the overall quality of resulting structure samples. Based on these observations, we present some useful ideas for developing a time-reduced sampling method guaranteeing an acceptable predictive accuracy. We also discuss some inherent drawbacks that arise in the context of approximation. The key results of this paper are crucial for the design of an efficient and competitive heuristic prediction method based on the increasingly accepted and attractive statistical sampling approach. This has indeed been indicated by the construction of prototype algorithms.
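The distinction between absolute and relative disturbances of the sampling probabilities can be illustrated with a small, self-contained sketch (Python, used here purely for illustration since no implementation accompanies the abstract). The toy distribution and perturbation magnitudes are hypothetical; the point is only that an absolute error of fixed size distorts small probabilities far more, in proportion, than a relative error of the same nominal size, which is consistent with the reported observation that absolute errors degrade the generated sample sets while relative errors do not.

```python
import numpy as np

def perturb_absolute(p, eps, rng):
    """Add noise of fixed magnitude eps to every probability (absolute error)."""
    q = np.clip(p + rng.uniform(-eps, eps, size=p.shape), 0.0, None)
    return q / q.sum()                      # renormalise so the values remain a distribution

def perturb_relative(p, eps, rng):
    """Scale every probability by a factor in [1 - eps, 1 + eps] (relative error)."""
    q = p * rng.uniform(1.0 - eps, 1.0 + eps, size=p.shape)
    return q / q.sum()

rng = np.random.default_rng(0)
p_exact = np.array([0.6, 0.25, 0.1, 0.05])  # hypothetical exact sampling probabilities

for name, q in (("absolute", perturb_absolute(p_exact, 0.05, rng)),
                ("relative", perturb_relative(p_exact, 0.05, rng))):
    # Draw a sample set from the disturbed distribution and compare observed frequencies.
    draws = rng.choice(len(p_exact), size=10_000, p=q)
    freq = np.bincount(draws, minlength=len(p_exact)) / draws.size
    print(name, "disturbed:", np.round(q, 3), "observed:", np.round(freq, 3))
```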
Sample size determination for mediation analysis of longitudinal data.
Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying
2018-03-27
Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal designs. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulations under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., within-subject correlation). A larger value of ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the scenarios most commonly encountered in practice have also been published for convenient use. An extensive simulation study showed that the distribution of the product method and the bootstrapping method outperform Sobel's method, but the distribution of the product method is recommended for use in practice because it requires less computation time than the bootstrapping method. An R package has been developed for sample size determination with the product method in longitudinal mediation study design.
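The simulation-based logic behind such sample size tables can be sketched for the simplest case: a single-level X -> M -> Y mediation model tested with Sobel's method. This is only an illustrative stand-in, assuming standard normal errors and no direct effect; it does not reproduce the paper's multilevel longitudinal model, its ICC settings, or the accompanying R package.

```python
import numpy as np
from scipy import stats

def _slope_and_se(X, y):
    """First slope and its standard error from an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]
    resid = y - X1 @ beta
    sigma2 = resid @ resid / (len(y) - X1.shape[1])
    cov = sigma2 * np.linalg.inv(X1.T @ X1)
    return beta[1], np.sqrt(cov[1, 1])

def sobel_power(n, a=0.3, b=0.3, n_sims=2000, alpha=0.05, seed=1):
    """Empirical power of the Sobel test for a simple X -> M -> Y mediation model."""
    rng = np.random.default_rng(seed)
    z_crit = stats.norm.ppf(1 - alpha / 2)
    hits = 0
    for _ in range(n_sims):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)                         # a-path
        y = b * m + rng.normal(size=n)                         # b-path (no direct effect, for simplicity)
        a_hat, sa = _slope_and_se(x[:, None], m)
        b_hat, sb = _slope_and_se(np.column_stack([m, x]), y)  # b-path controls for x
        z = (a_hat * b_hat) / np.sqrt(a_hat**2 * sb**2 + b_hat**2 * sa**2)
        hits += abs(z) > z_crit
    return hits / n_sims

# Increase n until the simulated power crosses 80%
for n in (100, 150, 200, 250):
    print(n, round(sobel_power(n), 3))
```

Running the same loop with a bootstrap or distribution-of-the-product test in place of the Sobel z statistic reproduces the kind of method comparison described above.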
Sample Preparation of Corn Seed Tissue to Prevent Analyte Relocations for Mass Spectrometry Imaging
NASA Astrophysics Data System (ADS)
Kim, Shin Hye; Kim, Jeongkwon; Lee, Young Jin; Lee, Tae Geol; Yoon, Sohee
2017-08-01
Corn seed tissue sections were prepared by the tape support method using an adhesive tape, and mass spectrometry imaging (MSI) was performed. The effect of heat generated during sample preparation was investigated by time-of-flight secondary ion mass spectrometry (TOF-SIMS) imaging of corn seed tissue prepared by the tape support and thaw-mounted methods. Unlike thaw-mounted sample preparation, the tape support method does not cause image distortion because no heat is applied, so the analytes do not migrate on the sample. By applying the tape support method, the corn seed tissue was prepared without structural damage and MSI with accurate spatial information on the analytes was successfully performed.
Method for determining the concentration of atomic species in gases and solids
Loge, Gary W.
1998-01-01
Method for determining the concentration of atomic species in gases and solids. Measurement of at least two emission intensities from a species in a sample that is excited by incident laser radiation, which generates a plasma therein, after a sufficient time period has elapsed and during a second time period, permits an instantaneous temperature to be established within the sample. The concentration of the atomic species to be determined is then derived from the known emission intensity of a predetermined concentration of that species in the sample at the measured temperature, a quantity which is measured prior to the determination of the unknown concentration, and the actual measured emission from the unknown species, or from this latter emission and the emission intensity of a species having a known concentration within the sample, such as nitrogen for gaseous air samples.
Segmental analysis of amphetamines in hair using a sensitive UHPLC-MS/MS method.
Jakobsson, Gerd; Kronstrand, Robert
2014-06-01
A sensitive and robust ultra high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed and validated for quantification of amphetamine, methamphetamine, 3,4-methylenedioxyamphetamine and 3,4-methylenedioxymethamphetamine in hair samples. Segmented hair (10 mg) was incubated in 2 M sodium hydroxide (80°C, 10 min) before liquid-liquid extraction with isooctane, followed by centrifugation and evaporation of the organic phase to dryness. The residue was reconstituted in methanol:formate buffer pH 3 (20:80). The total run time was 4 min and, after optimization of the UHPLC-MS/MS parameters, validation included selectivity, matrix effects, recovery, process efficiency, calibration model and range, lower limit of quantification, precision and bias. The calibration curve ranged from 0.02 to 12.5 ng/mg, and the recovery was between 62 and 83%. During validation the bias was less than ±7% and the imprecision was less than 5% for all analytes. In routine analysis, fortified control samples demonstrated an imprecision <13% and control samples made from authentic hair demonstrated an imprecision <26%. The method was applied to samples from a controlled study of amphetamine intake as well as forensic hair samples previously analyzed with an ultra high performance liquid chromatography time-of-flight mass spectrometry (UHPLC-TOF-MS) screening method. The proposed method was suitable for quantification of these drugs in forensic cases including violent crimes, autopsy cases, drug testing and re-granting of driving licences. This study also demonstrated that if hair samples are divided into several short segments, the time point for intake of a small dose of amphetamine can be estimated, which might be useful when drug-facilitated crimes are investigated. Copyright © 2014 John Wiley & Sons, Ltd.
Strotman, Lindsay N; Lin, Guangyun; Berry, Scott M; Johnson, Eric A; Beebe, David J
2012-09-07
Extraction and purification of DNA is a prerequisite to detection and analytical techniques. While DNA sample preparation methods have improved over the last few decades, current methods are still time consuming and labor intensive. Here we demonstrate a technology termed IFAST (Immiscible Filtration Assisted by Surface Tension), which relies on immiscible phase filtration to reduce the time and effort required to purify DNA. IFAST replaces the multiple wash and centrifugation steps required by traditional DNA sample preparation methods with a single step. To operate, DNA from lysed cells is bound to paramagnetic particles (PMPs) and drawn through an immiscible fluid phase barrier (i.e. oil) by an external handheld magnet. Purified DNA is then eluted from the PMPs. Here, detection of Clostridium botulinum type A (BoNT/A) in food matrices (milk, orange juice), a bioterrorism concern, was used as a model system to establish IFAST's utility in detection assays. Data validated that the DNA purified by IFAST was functional as a qPCR template to amplify the bont/A gene. The sensitivity limit of IFAST was comparable to the commercially available Invitrogen ChargeSwitch® method. Notably, pathogen detection via IFAST required only 8.5 μL of sample and was accomplished in five-fold less time. The simplicity, rapidity and portability of IFAST offer significant advantages when compared to existing DNA sample preparation methods.
Faster protein folding using enhanced conformational sampling of molecular dynamics simulation.
Kamberaj, Hiqmet
2018-05-01
In this study, we applied the swarm particle-like molecular dynamics (SPMD) approach to enhance conformational sampling of replica exchange simulations. In particular, the approach showed significant improvement in the sampling efficiency of conformational phase space when combined with the replica exchange method (REM) in computer simulations of peptide/protein folding. First we introduce the augmented dynamical system of equations and demonstrate the stability of the algorithm. Then, we illustrate the approach using different fully atomistic and coarse-grained model systems, comparing them with the standard replica exchange method. In addition, we applied SPMD simulation to calculate the time correlation functions of the transitions on a two-dimensional surface to demonstrate the enhancement of transition path sampling. Our results showed that the folded structure can be obtained in a shorter simulation time using the new method than with the non-augmented dynamical system: typically in less than 0.5 ns of replica exchange runs when the native folded structure is known, and within a simulation time scale of 40 ns in the case of blind structure prediction. Furthermore, the root mean square deviations from the reference structures were less than 2 Å. To demonstrate the performance of the new method, we also implemented three simulation protocols using the CHARMM software. Comparisons are also performed with the standard targeted molecular dynamics simulation method. Copyright © 2018 Elsevier Inc. All rights reserved.
General constraints on sampling wildlife on FIA plots
Bailey, L.L.; Sauer, J.R.; Nichols, J.D.; Geissler, P.H.; McRoberts, Ronald E.; Reams, Gregory A.; Van Deusen, Paul C.; McWilliams, William H.; Cieszewski, Chris J.
2005-01-01
This paper reviews the constraints on sampling wildlife populations at FIA points. Wildlife sampling programs must have well-defined goals and provide information adequate to meet those goals. Investigators should choose a state variable based on information needs and the spatial sampling scale. We discuss estimation-based methods for three state variables: species richness, abundance, and patch occupancy. All methods incorporate two essential sources of variation: detectability and spatial variation. FIA sampling imposes specific space and time criteria that may need to be adjusted to meet local wildlife objectives.
Jeong, In-Seek; Kwak, Byung-Man; Ahn, Jang-Hyuk; Leem, Donggil; Yoon, Taehyung; Yoon, Changyong; Jeong, Jayoung; Park, Jung-Min; Kim, Jin-Man
2012-10-01
In this study, nonheated saponification was employed as a novel, rapid, and easy sample preparation method for the determination of cholesterol in emulsified foods. Cholesterol content was analyzed using gas chromatography with a flame ionization detector (GC-FID). The cholesterol extraction method was optimized for maximum recovery from baby food and infant formula. Under these conditions, the optimum extraction solvent was 10 mL ethyl ether per 1 to 2 g sample, and the saponification solution was 0.2 mL KOH in methanol. The cholesterol content in the products was determined to be within the certified range of certified reference materials (CRMs), NIST SRM 1544 and SRM 1849. The results of the recovery test performed using spiked materials were in the range of 98.24% to 99.45% with a relative standard deviation (RSD) between 0.83% and 1.61%. This method could be used to reduce sample pretreatment time and is expected to provide an accurate determination of cholesterol in emulsified food matrices such as infant formula and baby food. A novel, rapid, and easy sample preparation method using nonheated saponification was developed for cholesterol detection in emulsified foods. Recovery tests of CRMs were satisfactory, and the recoveries of spiked materials were accurate and precise. This method was effective and decreased the time required for analysis by 5-fold compared to the official method. © 2012 Institute of Food Technologists®
Gosselin, Robert C; Adcock, Dorothy M; Bates, Shannon M; Douxfils, Jonathan; Favaloro, Emmanuel J; Gouin-Thibault, Isabelle; Guillermo, Cecilia; Kawai, Yohko; Lindhoff-Last, Edelgard; Kitchen, Steve
2018-03-01
This guidance document was prepared on behalf of the International Council for Standardization in Haematology (ICSH) for providing haemostasis-related guidance documents for clinical laboratories. This inaugural coagulation ICSH document was developed by an ad hoc committee composed of international clinical and laboratory direct-acting oral anticoagulant (DOAC) experts. The committee developed consensus recommendations for laboratory measurement of DOACs (dabigatran, rivaroxaban, apixaban and edoxaban), which would be germane for laboratories assessing DOAC anticoagulation. This guidance document addresses all phases of laboratory DOAC measurements, including pre-analytical (e.g. preferred sample collection time, preferred sample type, sample stability), analytical (gold standard method, screening and quantifying methods) and post-analytical (e.g. reporting units, quality assurance). The committee addressed the use and limitations of screening tests such as prothrombin time and activated partial thromboplastin time, as well as viscoelastic measurements of clotting blood and point-of-care methods. Additionally, the committee provided recommendations for the proper validation or verification of performance of laboratory assays prior to implementation for clinical use, and for external quality assurance to provide continuous assessment of testing and reporting methods. Schattauer GmbH Stuttgart.
[Classical and molecular methods for identification and quantification of domestic moulds].
Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S
2017-12-01
To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods can be separated into three categories: surface sampling, which is easy to perform but not quantitative; air sampling, which is easy to calibrate but provides time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of the household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores. They are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while waiting for more standardised high-throughput sequencing methods for assessment of mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.
Real-time estimation of BDS/GPS high-rate satellite clock offsets using sequential least squares
NASA Astrophysics Data System (ADS)
Fu, Wenju; Yang, Yuanxi; Zhang, Qin; Huang, Guanwen
2018-07-01
The real-time precise satellite clock product is one of the key prerequisites for real-time Precise Point Positioning (PPP). The accuracy of the 24-hour predicted satellite clock product, with a 15 min sampling interval and an update of 6 h, provided by the International GNSS Service (IGS) is only 3 ns, which cannot meet the needs of all real-time PPP applications. Real-time estimation of high-rate satellite clock offsets is an efficient method for improving the accuracy. In this paper, a sequential least squares method to estimate real-time satellite clock offsets with a high sample rate is proposed; it improves the computational speed by applying an optimized sparse matrix operation to compute the normal equation and by using special measures to take full advantage of modern computer power. The method is first applied to the BeiDou Navigation Satellite System (BDS) and provides real-time estimation with a 1 s sample rate. The results show that the amount of time taken to process a single epoch is about 0.12 s using 28 stations. The Standard Deviation (STD) and Root Mean Square (RMS) of the real-time estimated BDS satellite clock offsets are 0.17 ns and 0.44 ns respectively when compared to German Research Center for Geosciences (GFZ) final clock products. The positioning performance of the real-time estimated satellite clock offsets is evaluated. The RMSs of the real-time BDS kinematic PPP in east, north, and vertical components are 7.6 cm, 6.4 cm and 19.6 cm respectively. The method is also applied to the Global Positioning System (GPS) with a 10 s sample rate and the computational time of most epochs is less than 1.5 s with 75 stations. The STD and RMS of the real-time estimated GPS satellite clocks are 0.11 ns and 0.27 ns, respectively. The accuracies of 5.6 cm, 2.6 cm and 7.9 cm in east, north, and vertical components are achieved for the real-time GPS kinematic PPP.
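The epoch-by-epoch accumulation at the heart of sequential least squares can be sketched generically as follows (Python; a dense toy version, not the authors' optimized sparse-matrix, GNSS-specific implementation, and the per-epoch design matrices here are hypothetical).

```python
import numpy as np

class SequentialLSQ:
    """Accumulate normal equations epoch by epoch: N = sum(A'PA), w = sum(A'Pl)."""

    def __init__(self, n_params):
        self.N = np.zeros((n_params, n_params))
        self.w = np.zeros(n_params)

    def add_epoch(self, A, l, weights=None):
        # A: design matrix for this epoch; l: observed-minus-computed vector
        P = np.eye(len(l)) if weights is None else np.diag(weights)
        self.N += A.T @ P @ A
        self.w += A.T @ P @ l

    def solve(self):
        # Current parameter estimates from N x = w
        return np.linalg.solve(self.N, self.w)

# Toy usage: recover two constant "clock-like" parameters from noisy epoch-wise observations
rng = np.random.default_rng(2)
truth = np.array([1.5, -0.7])
est = SequentialLSQ(2)
for _ in range(100):
    A = rng.normal(size=(4, 2))             # hypothetical per-epoch design matrix
    l = A @ truth + 0.01 * rng.normal(size=4)
    est.add_epoch(A, l)
print(np.round(est.solve(), 3))             # expected to be close to [1.5, -0.7]
```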
Methods of Reverberation Mapping. I. Time-lag Determination by Measures of Randomness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chelouche, Doron; Pozo-Nuñez, Francisco; Zucker, Shay, E-mail: doron@sci.haifa.ac.il, E-mail: francisco.pozon@gmail.com, E-mail: shayz@post.tau.ac.il
A class of methods for measuring time delays between astronomical time series is introduced in the context of quasar reverberation mapping, which is based on measures of randomness or complexity of the data. Several distinct statistical estimators are considered that do not rely on polynomial interpolations of the light curves nor on their stochastic modeling, and do not require binning in correlation space. Methods based on von Neumann’s mean-square successive-difference estimator are found to be superior to those using other estimators. An optimized von Neumann scheme is formulated, which better handles sparsely sampled data and outperforms current implementations of discrete correlation function methods. This scheme is applied to existing reverberation data of varying quality, and consistency with previously reported time delays is found. In particular, the size–luminosity relation of the broad-line region in quasars is recovered with a scatter comparable to that obtained by other works, yet with fewer assumptions made concerning the process underlying the variability. The proposed method for time-lag determination is particularly relevant for irregularly sampled time series, and in cases where the process underlying the variability cannot be adequately modeled.
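A minimal sketch of the underlying idea, assuming nothing about the optimized scheme in the paper: for each trial lag, interleave the two time-shifted light curves into one time-ordered series and score it with von Neumann's mean-square successive-difference statistic; the lag that makes the combined curve smoothest (smallest statistic) is taken as the delay. The toy signal, the flux normalization, and the lag grid below are illustrative assumptions.

```python
import numpy as np

def von_neumann_lag(t1, f1, t2, f2, lags):
    """Return the trial lag minimising von Neumann's mean-square successive-difference
    statistic of the combined, time-ordered light curve (smoother combination = better lag)."""
    z1 = (f1 - f1.mean()) / f1.std()          # normalise fluxes so the two curves are comparable
    z2 = (f2 - f2.mean()) / f2.std()
    scores = []
    for tau in lags:
        t = np.concatenate([t1, t2 - tau])    # shift the second curve back by the trial lag
        f = np.concatenate([z1, z2])
        fs = f[np.argsort(t)]                 # interleave by time; no interpolation or binning
        scores.append(np.mean(np.diff(fs) ** 2))
    scores = np.asarray(scores)
    return np.asarray(lags)[np.argmin(scores)], scores

# Toy example: the second curve echoes the first with a 10-day delay, both irregularly sampled
rng = np.random.default_rng(3)
t1 = np.sort(rng.uniform(0, 200, 80))
t2 = np.sort(rng.uniform(0, 200, 60))
signal = lambda t: np.sin(t / 15.0) + 0.5 * np.sin(t / 41.0)
f1 = signal(t1) + 0.05 * rng.normal(size=t1.size)
f2 = signal(t2 - 10.0) + 0.05 * rng.normal(size=t2.size)
lag, _ = von_neumann_lag(t1, f1, t2, f2, np.arange(-30.0, 31.0, 1.0))
print("estimated delay:", lag)                # expected to be close to +10
```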
40 CFR 60.704 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sites. The control device inlet sampling site for determination of vent stream molar composition or TOC... compliance with the 20 ppmv limit. The sampling site shall be the same as that of the TOC samples, and the samples shall be taken during the same time that the TOC samples are taken. The TOC concentration...
40 CFR 60.664 - Test methods and procedures.
Code of Federal Regulations, 2010 CFR
2010-07-01
... sites. The control device inlet sampling site for determination of vent stream molar composition or TOC... compliance with the 20 ppmv limit. The sampling site shall be the same as that of the TOC samples, and the samples shall be taken during the same time that the TOC samples are taken. The TOC concentration...
Abras, Alba; Ballart, Cristina; Llovet, Teresa; Roig, Carme; Gutiérrez, Cristina; Tebar, Silvia; Berenguer, Pere; Pinazo, María-Jesús; Posada, Elizabeth; Gascón, Joaquim; Schijman, Alejandro G; Gállego, Montserrat; Muñoz, Carmen
2018-01-01
Polymerase chain reaction (PCR) has become a useful tool for the diagnosis of Trypanosoma cruzi infection. The development of automated DNA extraction methodologies and PCR systems is an important step toward the standardization of protocols in routine diagnosis. To date, there are only two commercially available Real-Time PCR assays for the routine laboratory detection of T. cruzi DNA in clinical samples: TCRUZIDNA.CE (Diagnostic Bioprobes Srl) and RealCycler CHAG (Progenie Molecular). Our aim was to evaluate the RealCycler CHAG assay taking into account the whole process. We assessed the usefulness of an automated DNA extraction system based on magnetic particles (EZ1 Virus Mini Kit v2.0, Qiagen) combined with a commercially available Real-Time PCR assay targeting satellite DNA (SatDNA) of T. cruzi (RealCycler CHAG), a methodology used for routine diagnosis in our hospital. It was compared with a well-known strategy combining a commercial DNA isolation kit based on silica columns (High Pure PCR Template Preparation Kit, Roche Diagnostics) with an in-house Real-Time PCR targeting SatDNA. The results of the two methodologies were in almost perfect agreement, indicating they can be used interchangeably. However, when variations in protocol factors were applied (sample treatment, extraction method and Real-Time PCR), the results were less convincing. A comprehensive fine-tuning of the whole procedure is the key to successful results. Guanidine EDTA-blood (GEB) samples are not suitable for DNA extraction based on magnetic particles due to inhibition, at least when samples are not processed immediately. This is the first study to evaluate the RealCycler CHAG assay taking into account the overall process, including three variables (sample treatment, extraction method and Real-Time PCR). Our findings may contribute to the harmonization of protocols between laboratories and to a wider application of Real-Time PCR in molecular diagnostic laboratories associated with health centers.
An apparatus for sequentially combining microvolumes of reagents by infrasonic mixing.
Camien, M N; Warner, R C
1984-05-01
A method employing high-speed infrasonic mixing for obtaining timed samples for following the progress of a moderately rapid chemical reaction is described. Drops of 10 to 50 microliter each of two reagents are mixed to initiate the reaction, followed, after a measured time interval, by mixing with a drop of a third reagent to quench the reaction. The method was developed for measuring the rate of denaturation of covalently closed, circular DNA in NaOH at several temperatures. For this purpose the timed samples were analyzed by analytical ultracentrifugation. The apparatus was tested by determination of the rate of hydrolysis of 2,4-dinitrophenyl acetate in an alkaline buffer. The important characteristics of the method are (i) it requires very small volumes of sample and reagents; (ii) the components of the reaction mixture are pre-equilibrated and mixed with no transfer outside the prescribed constant temperature environment; (iii) the mixing is very rapid; and (iv) satisfactorily precise measurements of relatively short time intervals (approximately 2 sec minimum) between sequential mixings of the components are readily obtainable.
Zhang, Shunqi; Yin, Tao; Ma, Ren; Liu, Zhipeng
2015-08-01
Functional imaging of the electrical characteristics of biological tissue based on the magneto-acoustic effect provides valuable information for early tumor diagnosis, and analysis of the time and frequency characteristics of the magneto-acoustic signal is important for image reconstruction. This paper proposes a wave summing method based on the Green function solution for the acoustic source of the magneto-acoustic effect. Simulations and analyses of the time and frequency characteristics of the magneto-acoustic signal were carried out under a quasi-1D transmission condition for models of different thicknesses, and the simulated signals were verified through experiments. The simulation results showed that the time-frequency characteristics of the magneto-acoustic signal reflect the thickness of the sample. A thin sample (less than one wavelength of the pulse) and a thick sample (larger than one wavelength) showed different summed waveforms and frequency characteristics, owing to the difference in summing thickness. The experimental results verified the theoretical analysis and simulation results. This research lays a foundation for acoustic source and conductivity reconstruction in media of different thicknesses in magneto-acoustic imaging.
Protocol for Detection of Yersinia pestis in Environmental ...
Methods Report: This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, for detecting Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. The protocol gives a step-by-step sample processing procedure for each sample type and includes real-time PCR, traditional microbiological culture, and Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because the protocol is available without restriction to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze large numbers of samples during a wide-area plague incident.
Simulation of cryolipolysis as a novel method for noninvasive fat layer reduction.
Majdabadi, Abbas; Abazari, Mohammad
2016-12-20
Given the problems associated with conventional liposuction methods, the need to develop new fat removal procedures has been recognized. In this study we simulate one of these novel methods, cryolipolysis, which aims to address those drawbacks. Simulation of clinical procedures can contribute considerably to the efficacious performance of the operations. To this end we simulated the temperature distribution in a sample of human body fat. Using Abaqus software we present a graphical display of the temperature-time variations within the medium. The simulation findings indicate that tissue temperature decreases over a cold exposure of about 30 min. The minimum tissue temperature occurs in the shallow layers of the sample, while the temperature in deeper layers remains nearly unchanged. Cold exposure beyond this time (t > 30 min) does not result in considerable further changes. Numerous clinical studies have proved the efficacy of cryolipolysis, and this noninvasive technique has eliminated some of the drawbacks of conventional methods. Our simulation findings clearly support the efficiency of this method, especially for superficial fat layers.
del Río, Vanessa; Larrechi, M Soledad; Callao, M Pilar
2010-06-15
A new concept of flow titration is proposed and demonstrated for the determination of total acidity in plant oils and biodiesel. We use sequential injection analysis (SIA) with a diode array spectrophotometric detector linked to chemometric tools such as multivariate curve resolution-alternating least squares (MCR-ALS). This system is based on the evolution of the basic species of an acid-base indicator, alizarine, when it comes into contact with a sample that contains free fatty acids. The gradual pH change in the reactor coil due to diffusion and reaction phenomena allows the sequential appearance of both species of the indicator in the detector coil, recording a data matrix for each sample. The SIA-MCR-ALS method helps to reduce the amounts of sample, reagents and time consumed. Each determination consumes 0.413 ml of sample, 0.250 ml of indicator and 3 ml of carrier (ethanol) and generates 3.333 ml of waste. The frequency of the analysis is high (12 samples h(-1) including all steps, i.e., cleaning, preparing and analysing). The reagents used are in common laboratory use, and it is not necessary to know their concentrations exactly. The method was applied to determine acidity in plant oil and biodiesel samples. Results obtained by the proposed method compare well with those obtained by the official European Community method, which is time-consuming and uses large amounts of organic solvents.
Xia, Dan; Gao, Lirong; Zheng, Minghui; Tian, Qichang; Huang, Huiting; Qiao, Lin
2016-07-19
Chlorinated paraffins (CPs) are complex technical mixtures containing thousands of isomers. Analyzing CPs in environmental matrices is extremely challenging. CPs have broad, unresolved profiles when analyzed by one-dimensional gas chromatography (GC). Comprehensive two-dimensional GC (GC×GC) can separate CPs with a high degree of orthogonality. A novel method for simultaneously profiling and quantifying short- and medium-chain CPs, using GC×GC coupled with electron capture negative ionization high-resolution time-of-flight mass spectrometry, was developed. The method allowed 48 CP formula congener groups to be analyzed highly selectively in one injection through accurate mass measurements of the [M - Cl](-) ions in full scan mode. The correlation coefficients (R(2)) for the linear calibration curves for different chlorine contents were 0.982 for short-chain CPs and 0.945 for medium-chain CPs. The method was successfully used to determine CPs in sediment and fish samples. By using this method, with enhanced chromatographic separation and high mass resolution, interferences between CP congeners and other organohalogen compounds, such as toxaphene, are minimized. New compounds, with the formulas C9H14Cl6 and C9H13Cl7, were found in sediment and biological samples for the first time. The method was shown to be a powerful tool for the analysis of CPs in environmental samples.
A new method for determining the acid number of biodiesel based on coulometric titration.
Barbieri Gonzaga, Fabiano; Pereira Sobral, Sidney
2012-08-15
A new method is proposed for determining the acid number (AN) of biodiesel using coulometric titration with potentiometric detection, basically employing a potentiostat/galvanostat and an electrochemical cell containing a platinum electrode, a silver electrode, and a combination pH electrode. The method involves a sequential application of a constant current between the platinum (cathode) and silver (anode) electrodes, followed by measuring the potential of the combination pH electrode, using an isopropanol/water mixture as solvent and LiCl as the supporting electrolyte. A preliminary evaluation of the new method, using acetic acid for doping a biodiesel sample, showed an average recovery of 100.1%. Compared to a volumetric titration-based method for determining the AN of several biodiesel samples (ranging from about 0.18 to 0.95 mg g(-1)), the new method produced statistically similar results with better repeatability. Compared to other works reported in the literature, the new method presented an average repeatability up to 3.2 times better and employed a sample size up to 20 times smaller. Copyright © 2012 Elsevier B.V. All rights reserved.
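The abstract does not spell out how the acid number is computed from the coulometric data, but the standard relation follows from Faraday's law if one assumes 100% current efficiency for hydroxide generation: the charge passed to the endpoint gives the moles of titrant, which are then expressed as mg KOH per g of sample. A small illustrative calculation with hypothetical current, time and sample mass:

```python
F = 96485.332        # Faraday constant, C/mol
M_KOH = 56.106       # molar mass of KOH, g/mol

def acid_number(current_A, time_s, sample_mass_g):
    """Acid number in mg KOH per g of sample from the charge passed to the endpoint,
    assuming one electron per hydroxide ion and 100% current efficiency."""
    charge_C = current_A * time_s
    mol_oh = charge_C / F
    return mol_oh * M_KOH * 1000.0 / sample_mass_g

# Hypothetical example: 2.0 mA applied for 150 s to titrate a 0.50 g biodiesel sample
print(round(acid_number(2.0e-3, 150.0, 0.50), 3))   # about 0.349 mg KOH/g
```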
NASA Technical Reports Server (NTRS)
Kim, Hyun Jung; Choi, Sang H.; Bae, Hyung-Bin; Lee, Tae Woo
2012-01-01
The National Aeronautics and Space Administration-invented X-ray diffraction (XRD) methods, including the total defect density measurement method and the spatial wafer mapping method, have confirmed super hetero epitaxy growth for rhombohedral single crystalline silicon germanium (Si1-xGex) on a c-plane sapphire substrate. However, the XRD method cannot observe the surface morphology or roughness because of the method's limited resolution. Therefore the authors used transmission electron microscopy (TEM), with samples prepared in two ways, the focused ion beam (FIB) method and the tripod method, to study the interface between Si1-xGex and the sapphire substrate and the structure of the Si1-xGex itself. The sample preparation for TEM should be as fast as possible so that the sample contains few or no artifacts induced by the preparation. The standard sample preparation method of mechanical polishing often requires a relatively long ion milling time (several hours), which increases the probability of inducing defects in the sample. TEM sampling of the Si1-xGex on sapphire is also difficult because of the sapphire's high hardness and mechanical instability. The FIB method and the tripod method eliminate both problems when performing cross-section TEM sampling of Si1-xGex on c-plane sapphire, which shows the surface morphology, the interface between film and substrate, and the crystal structure of the film. This paper explains the FIB sampling method and the tripod sampling method, and why TEM sampling of Si1-xGex on a sapphire substrate is necessary.
Zhang, Chun-Yun; Hu, Hui-Chao; Chai, Xin-Sheng; Pan, Lei; Xiao, Xian-Ming
2014-02-07
In this paper, we present a novel method for determining the maximal amount of ethane, a minor gas species, adsorbed in a shale sample. The method is based on the time-dependent release of ethane from shale samples measured by headspace gas chromatography (HS-GC). The study includes a mathematical model for fitting the experimental data, calculating the maximal amount of gas adsorbed, and predicting results at other temperatures. The method is a more efficient alternative to the isothermal adsorption method that is in widespread use today. Copyright © 2013 Elsevier B.V. All rights reserved.
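The release model used for the fit is not given in the abstract; purely for illustration, the sketch below assumes a first-order approach to a plateau, m(t) = m_max(1 - exp(-kt)), and fits hypothetical cumulative HS-GC readings to extrapolate the maximal adsorbed amount.

```python
import numpy as np
from scipy.optimize import curve_fit

def release(t, m_max, k):
    """Assumed first-order release: cumulative ethane detected in the headspace at time t."""
    return m_max * (1.0 - np.exp(-k * t))

# Hypothetical HS-GC readings: cumulative ethane (arbitrary units) versus time (min)
t = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 90.0, 120.0])
m = np.array([0.8, 1.5, 2.6, 3.9, 4.6, 5.1, 5.3])

(m_max, k), _ = curve_fit(release, t, m, p0=(6.0, 0.02))
print(f"estimated maximal adsorbed amount: {m_max:.2f} (a.u.), rate constant: {k:.4f} 1/min")
```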
Duvivier, Wilco F; van Beek, Teris A; Pennings, Ed J M; Nielen, Michel W F
2014-04-15
Forensic hair analysis methods are laborious, time-consuming and provide only a rough retrospective estimate of the time of drug intake. Recently, hair imaging methods using matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) were reported, but these methods require the application of MALDI matrix and are performed under vacuum. Direct analysis of entire locks of hair without any sample pretreatment and with improved spatial resolution would thus address a need. Hair samples were attached to stainless steel mesh screens and scanned in the X-direction using direct analysis in real time (DART) ambient ionization orbitrap MS. The DART gas temperature and the accuracy of the probed hair zone were optimized using Δ-9-tetrahydrocannabinol (THC) as a model compound. Since external contamination is a major issue in forensic hair analysis, sub-samples were measured before and after dichloromethane decontamination. The relative intensity of the THC signal in spiked blank hair versus that of quinine as the internal standard showed good reproducibility (26% RSD) and linearity of the method (R(2) = 0.991). With the DART hair scan THC could be detected in hair samples from different chronic cannabis users. The presence of THC was confirmed by quantitative liquid chromatography/tandem mass spectrometry. Zones with different THC content could be clearly distinguished, indicating that the method might be used for retrospective timeline assessments. Detection of THC in decontaminated drug user hair showed that the DART hair scan not only probes THC on the surface of hair, but penetrates deeply enough to measure incorporated THC. A new approach in forensic hair analysis has been developed by probing complete locks of hair using DART-MS. Longitudinal scanning enables detection of incorporated compounds and can be used as pre-screening for THC without sample preparation. The method could also be adjusted for the analysis of other drugs of abuse. Copyright © 2014 John Wiley & Sons, Ltd.
Appearance-based representative samples refining method for palmprint recognition
NASA Astrophysics Data System (ADS)
Wen, Jiajun; Chen, Yan
2012-07-01
Sparse representation can deal with the small-sample problem because it utilizes all of the training samples. However, the discrimination ability degrades when more training samples are used for representation. We propose a novel appearance-based palmprint recognition method that seeks a compromise between discrimination ability and the small-sample problem so as to obtain a proper representation scheme. Under the assumption that the test sample can be well represented by a linear combination of a certain number of training samples, we first select the representative training samples according to their contributions. We then further refine the training samples by an iterative procedure, excluding at each iteration the training sample with the least contribution to the test sample. Experiments on the PolyU multispectral palmprint database and the two-dimensional and three-dimensional palmprint database show that the proposed method outperforms conventional appearance-based palmprint recognition methods. Moreover, we explore the principles governing the key parameters of the proposed algorithm, which facilitates obtaining high recognition accuracy.
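A rough sketch of the refinement loop follows (Python). The contribution measure used here, the increase in reconstruction residual when a training sample is excluded from the least-squares representation, is a stand-in chosen for simplicity; the paper's own contribution definition and parameter settings are not reproduced.

```python
import numpy as np

def refine_training_set(X, y, k_final):
    """Iteratively exclude the training sample (column of X) whose removal increases the
    reconstruction residual of the test sample y the least, until k_final columns remain."""
    idx = list(range(X.shape[1]))
    while len(idx) > k_final:
        residual_if_removed = []
        for j in range(len(idx)):
            rest = idx[:j] + idx[j + 1:]
            coef, *_ = np.linalg.lstsq(X[:, rest], y, rcond=None)
            residual_if_removed.append(np.linalg.norm(y - X[:, rest] @ coef))
        # the sample whose removal hurts least is the one contributing least to representing y
        idx.pop(int(np.argmin(residual_if_removed)))
    return idx

# Toy usage: 20 training samples of dimension 50; keep the 5 most representative for y
rng = np.random.default_rng(4)
X = rng.normal(size=(50, 20))
y = X[:, [1, 3, 7]] @ np.array([0.8, -0.5, 1.2]) + 0.01 * rng.normal(size=50)
print(sorted(refine_training_set(X, y, 5)))   # expected to retain columns 1, 3 and 7
```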
Schwartz, Alanna; Baidjoe, Amrish; Rosenthal, Philip J; Dorsey, Grant; Bousema, Teun; Greenhouse, Bryan
2015-05-01
Extraction and amplification of DNA from dried blood spots (DBS) collected in field studies is commonly used for detection of Plasmodium falciparum. However, there have been few systematic efforts to determine the effects of storage and extraction methods on the sensitivity of DNA amplification. We investigated the effects of storage conditions, length of storage, and DNA extraction methods on amplification via three PCR-based assays using field samples and laboratory controls. Samples stored as DBS for 2 or more years at ambient temperature showed a significant loss of sensitivity that increased with time; after 10 years only 10% samples with parasite densities > 1,000 parasites/μL were detectable by nested polymerase chain reaction (PCR). Conversely, DBS and extracted DNA stored at -20°C showed no loss of sensitivity with time. Samples with low parasite densities amplified more successfully with saponin/Chelex compared with spin-column-based extraction, though the latter method performed better on samples with higher parasite densities stored for 2 years at ambient temperature. DNA extracted via both methods was stable after 20 freeze-thaw cycles. Our results suggest that DBS should be stored at -20°C or extracted immediately, especially if anticipating 2 or more years of storage. © The American Society of Tropical Medicine and Hygiene.
Nichols, Jessica E; Harries, Megan E; Lovestead, Tara M; Bruno, Thomas J
2014-03-21
In this paper we present results of the application of PLOT-cryoadsorption (PLOT-cryo) to the analysis of ignitable liquids in fire debris. We tested ignitable liquids, broadly divided into fuels and solvents (although the majority of the results presented here were obtained with gasoline and diesel fuel), on three substrates: Douglas fir, oak plywood and nylon carpet. We determined that PLOT-cryo allows the analyst to distinguish all of the ignitable liquids tested by use of a very rapid sampling protocol, and performs better (more recovered components, higher efficiency, lower elution solvent volumes) than a conventional purge and trap method. We also tested the effect of latency (the time period between applying the ignitable liquid and ignition), and we tested a variety of sampling times and a variety of PLOT capillary lengths. Reliable results can be obtained with sampling time periods as short as 3 min, and on PLOT capillaries as short as 20 cm. The variability of separate samples was also assessed, a study made possible by the high-throughput nature of the PLOT-cryo method. We also determined that the method performs better than the conventional carbon strip method that is commonly used in fire debris analysis. Published by Elsevier B.V.
Evaluation of direct saponification method for determination of cholesterol in meats.
Adams, M L; Sullivan, D M; Smith, R L; Richter, E F
1986-01-01
A gas chromatographic (GC) method has been developed for determination of cholesterol in meats. The method involves ethanolic KOH saponification of the sample material, homogeneous-phase toluene extraction of the unsaponifiables, derivatization of cholesterol to its trimethylsilyl ether, and quantitation by GC-flame ionization detection using 5-alpha-cholestane as the internal standard. This direct saponification method is compared with the current AOAC official method for determination of cholesterol in 20 different meat products. The direct saponification method eliminates the need for initial lipid extraction, thus offering a 30% savings in labor, and requires fewer solvents than the AOAC method. It produced comparable or slightly higher cholesterol results than the AOAC method in all meat samples examined. Precision, determined by assaying a turkey meat sample 16 times over 4 days, was excellent (CV = 1.74%). Average recovery of cholesterol added to meat samples was 99.8%.
Marino, Anna Maria Fausta; Percipalle, Maurizio; Giunta, Renato Paolo; Salvaggio, Antonio; Caracappa, Giulia; Alfonzetti, Tiziana; Aparo, Alessandra; Reale, Stefano
2017-03-01
We report a rapid and reliable method for the detection of Toxoplasma gondii in meat and animal tissues based on real-time polymerase chain reaction (PCR). Samples were collected from cattle, small ruminants, horses, and pigs raised in or imported into Sicily, Italy. All DNA preparations were assayed by real-time PCR tests targeted to a 98-bp long fragment in the AF 529-bp repeat element and to the B1 gene using specific primers. Diagnostic sensitivity (100%), diagnostic specificity (100%), limit of detection (0.01 pg), efficiency (92-109%), precision (mean coefficient of variation = 0.60%), repeatability (100%), reproducibility (100%), and robustness were evaluated using 240 DNA-extracted samples (120 positive and 120 negative as per the OIE nested PCR method) from different matrices. Positive results were confirmed by the repetition of both real-time and nested PCR assays. Our study demonstrates the viability of a reliable, rapid, and specific real-time PCR on a large scale to monitor contamination with Toxoplasma cysts in meat and animal specimens. This validated method can be used for postmortem detection in domestic and wild animals and for food safety purposes.
Křesinová, Zdena; Linhartová, Lucie; Petrů, Klára; Krejčová, Lucie; Šrédlová, Kamila; Lhotský, Ondřej; Kameník, Zdeněk; Cajthaml, Tomáš
2016-04-01
A rapid and reliable analytical method was developed for the quantitative determination of psychopharmaceuticals, their precursors and by-products in real contaminated samples from a pharmaceutical company in Olomouc (Czech Republic), based on SPE disk extraction and detection by ultra performance liquid chromatography combined with time-of-flight mass spectrometry. The target compounds were quantified in the real whole-water samples (water including suspended particles), both in the presence of suspended particulate matter (SPM) and high concentrations of other organic pollutants. A total of nine compounds were analyzed, consisting of three commonly used antidepressants (tricyclic antidepressants and antipsychotics), one antitussive agent and five by-products or precursors. At first, the SPE disk method was developed for the extraction of water samples (dissolved analytes, recovery 84-104%) and a pressurised liquid extraction technique was verified for solid matrices (sludge samples, recovery 81-95%). In order to evaluate the SPE disk technique for whole-water samples containing SPM, non-contaminated groundwater samples were also loaded with different amounts (100 and 300 mg L(-1)) of real contaminated sludge originating from the same locality. The recoveries from the whole-water samples obtained by the SPE disk method ranged between 67 and 119% after the addition of the most contaminated sludge. The final method was applied to several real groundwater (whole-water) samples from the industrial area and high concentrations (up to 10(3) μg L(-1)) of the target compounds were detected. The results of this study document the feasibility of the SPE disk method for the analysis of groundwater. Copyright © 2016 Elsevier B.V. All rights reserved.
Baker, Laurie L; Mills Flemming, Joanna E; Jonsen, Ian D; Lidgard, Damian C; Iverson, Sara J; Bowen, W Don
2015-01-01
Paired with satellite location telemetry, animal-borne instruments can collect spatiotemporal data describing the animal's movement and environment at a scale relevant to its behavior. Ecologists have developed methods for identifying the area(s) used by an animal (e.g., home range) and those used most intensely (utilization distribution) based on location data. However, few have extended these models beyond their traditional roles as descriptive 2D summaries of point data. Here we demonstrate how the home range method, T-LoCoH, can be expanded to quantify collective sampling coverage by multiple instrumented animals using grey seals (Halichoerus grypus) equipped with GPS tags and acoustic transceivers on the Scotian Shelf (Atlantic Canada) as a case study. At the individual level, we illustrate how time and space-use metrics quantifying individual sampling coverage may be used to determine the rate of acoustic transmissions received. Grey seals collectively sampled an area of 11,308 km(2) and intensely sampled an area of 31 km(2) from June-December. The largest area sampled was in July (2094.56 km(2)) and the smallest area sampled occurred in August (1259.80 km(2)), with changes in sampling coverage observed through time. T-LoCoH provides an effective means to quantify changes in collective sampling effort by multiple instrumented animals and to compare these changes across time. We also illustrate how time and space-use metrics of individual instrumented seal movement calculated using T-LoCoH can be used to account for differences in the amount of time a bioprobe (biological sampling platform) spends in an area.
Daigle, Courtney L; Siegford, Janice M
2014-03-01
Continuous observation is the most accurate way to determine animals' actual time budget and can provide a 'gold standard' representation of resource use, behavior frequency, and duration. Continuous observation is useful for capturing behaviors that are of short duration or occur infrequently. However, collecting continuous data is labor intensive and time consuming, making multiple individual or long-term data collection difficult. Six non-cage laying hens were video recorded for 15 h and behavioral data collected every 2 s were compared with data collected using scan sampling intervals of 5, 10, 15, 30, and 60 min and subsamples of 2 second observations performed for 10 min every 30 min, 15 min every 1 h, 30 min every 1.5 h, and 15 min every 2 h. Three statistical approaches were used to provide a comprehensive analysis to examine the quality of the data obtained via different sampling methods. General linear mixed models identified how the time budget from the sampling techniques differed from continuous observation. Correlation analysis identified how strongly results from the sampling techniques were associated with those from continuous observation. Regression analysis identified how well the results from the sampling techniques were associated with those from continuous observation, changes in magnitude, and whether a sampling technique had bias. Static behaviors were well represented with scan and time sampling techniques, while dynamic behaviors were best represented with time sampling techniques. Methods for identifying an appropriate sampling strategy based upon the type of behavior of interest are outlined and results for non-caged laying hens are presented. Copyright © 2013 Elsevier B.V. All rights reserved.
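The comparison logic can be sketched with simulated data standing in for the video record: generate a continuous 2-s-resolution state sequence, derive time budgets from scans at each interval, and correlate them with the continuous budget. The sticky Markov chain below is only a stand-in for real hen behaviour, and correlation is just one of the three statistical approaches mentioned above.

```python
import numpy as np

rng = np.random.default_rng(5)

# Continuous record: one behavioural state (0-4) every 2 s for 15 h
n_obs = 15 * 3600 // 2
P = np.full((5, 5), 0.002)
np.fill_diagonal(P, 0.992)                      # sticky transitions give realistic behaviour bouts
states = [0]
for _ in range(n_obs - 1):
    row = P[states[-1]]
    states.append(rng.choice(5, p=row / row.sum()))
states = np.array(states)

def time_budget(s):
    """Proportion of observations spent in each of the five states."""
    return np.bincount(s, minlength=5) / s.size

continuous = time_budget(states)
print("continuous:", np.round(continuous, 3))
for interval_min in (5, 10, 15, 30, 60):
    step = interval_min * 60 // 2               # scan interval expressed in 2-s samples
    scan = time_budget(states[::step])
    r = np.corrcoef(continuous, scan)[0, 1]
    print(f"{interval_min:>2}-min scans: {np.round(scan, 3)}, r = {r:.3f}")
```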
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edwards, T.; Hera, K.; Coleman, C.
2011-12-05
Evaluation of Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) recently completed the evaluation of one of these opportunities - the possibility of using an Isolok sampling valve as an alternative to the Hydragard valve for taking DWPF process samples at the Slurry Mix Evaporator (SME). The use of an Isolok for SME sampling has the potential to improve operability, reduce maintenance time, and decrease CPC cycle time. The SME acceptability testing for the Isolok was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 and was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNLRP-2011-00145. RW-0333P QA requirements applied to the task, and the results from the investigation were documented in SRNL-STI-2011-00693. Measurement of the chemical composition of study samples was a critical component of the SME acceptability testing of the Isolok. A sampling and analytical plan supported the investigation with the analytical plan directing that the study samples be prepared by a cesium carbonate (Cs2CO3) fusion dissolution method and analyzed by Inductively Coupled Plasma - Optical Emission Spectroscopy (ICP-OES). The use of the cesium carbonate preparation method for the Isolok testing provided an opportunity for an additional assessment of this dissolution method, which is being investigated as a potential replacement for the two methods (i.e., sodium peroxide fusion and mixed acid dissolution) that have been used at the DWPF for the analysis of SME samples. Earlier testing of the Cs2CO3 method yielded promising results which led to a TTR from Savannah River Remediation, LLC (SRR) to SRNL for additional support and an associated TTQAP to direct the SRNL efforts. A technical report resulting from this work was issued that recommended that the mixed acid method be replaced by the Cs2CO3 method for the measurement of magnesium (Mg), sodium (Na), and zirconium (Zr) with additional testing of the method by DWPF Laboratory being needed before further implementation of the Cs2CO3 method at that laboratory. While the SME acceptability testing of the Isolok does not address any of the open issues remaining after the publication of the recommendation for the replacement of the mixed acid method by the Cs2CO3 method (since those issues are to be addressed by the DWPF Laboratory), the Cs2CO3 testing associated with the Isolok testing does provide additional insight into the performance of the method as conducted by SRNL. The performance is to be investigated by looking to the composition measurement data generated by the samples of a standard glass, the Analytical Reference Glass - 1 (ARG-1), that were prepared by the Cs2CO3 method and included in the SME acceptability testing of the Isolok. The measurements of these samples were presented as part of the study results, but no statistical analysis of these measurements was conducted as part of those results. It is the purpose of this report to provide that analysis, which was supported using JMP Version 7.0.2.
Boker, Steven M; Xu, Minquan; Rotondo, Jennifer L; King, Kadijah
2002-09-01
Cross-correlation and most other longitudinal analyses assume that the association between 2 variables is stationary. Thus, a sample of occasions of measurement is expected to be representative of the association between variables regardless of the time of onset or number of occasions in the sample. The authors propose a method to analyze the association between 2 variables when the assumption of stationarity may not be warranted. The method results in estimates of both the strength of peak association and the time lag when the peak association occurred for a range of starting values of elapsed time from the beginning of an experiment.
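The abstract describes estimating both the strength of peak association and the lag at which that peak occurs over a range of starting times. A hedged sketch of that idea, using a simple moving-window lagged correlation on synthetic signals; the window length, lag range, and data are assumptions made for illustration, and the authors' estimator is more elaborate than this.

```python
import numpy as np

def windowed_peak_xcorr(x, y, window, max_lag):
    """For each window start, find the lag (within +/- max_lag) at which the
    correlation between x and a lag-shifted y is strongest, and record the
    peak correlation and its lag.  A simple illustration of tracking a
    possibly non-stationary association over elapsed time."""
    results = []
    for start in range(max_lag, len(x) - window - max_lag):
        xs = x[start:start + window]
        best_r, best_lag, best_abs = 0.0, 0, -1.0
        for lag in range(-max_lag, max_lag + 1):
            ys = y[start + lag:start + lag + window]
            r = np.corrcoef(xs, ys)[0, 1]
            if abs(r) > best_abs:
                best_r, best_lag, best_abs = r, lag, abs(r)
        results.append((start, best_r, best_lag))
    return results

# Hypothetical example: y lags x by 5 samples in the first half, 15 in the second.
rng = np.random.default_rng(1)
x = rng.standard_normal(400)
y = np.concatenate([np.roll(x[:200], 5), np.roll(x[200:], 15)])
y = y + 0.1 * rng.standard_normal(400)
for start, r, lag in windowed_peak_xcorr(x, y, window=60, max_lag=20)[::80]:
    print(f"window at t={start:3d}: peak r={r:+.2f} at lag={lag:+d}")
```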
Validation of a standardized extraction method for formalin-fixed paraffin-embedded tissue samples.
Lagheden, Camilla; Eklund, Carina; Kleppe, Sara Nordqvist; Unger, Elizabeth R; Dillner, Joakim; Sundström, Karin
2016-07-01
Formalin-fixed paraffin-embedded (FFPE) samples can be DNA-extracted and used for human papillomavirus (HPV) genotyping. The xylene-based gold standard for extracting FFPE samples is laborious, suboptimal and involves health hazards for the personnel involved. The aim was to compare extraction with the standard xylene method to a xylene-free method used in an HPV LabNet Global Reference Laboratory at the Centers for Disease Control (CDC), based on a commercial method with an extra heating step. Fifty FFPE samples were randomly selected from a national audit of all cervical cancer cases diagnosed in Sweden during 10 years. For each case-block, a blank-block was sectioned as a control for contamination. For xylene extraction, the standard WHO Laboratory Manual protocol was used. For the CDC method, the manufacturer's protocol was followed except for an extra heating step, 120°C for 20 min. Samples were extracted and tested in parallel with β-globin real-time PCR, HPV16 real-time PCR and HPV typing using modified general primers (MGP)-PCR and Luminex assays. For a valid result, the blank-block had to be β-globin-negative in all tests and the case-block positive for β-globin. Overall, detection was improved with the heating method and the proportion of HPV-positive samples increased from 70% to 86% (p=0.039). For all samples where HPV type concordance could be evaluated, there was 100% type concordance. A xylene-free and robust extraction method for HPV-DNA typing in FFPE material is currently in great demand. Our proposed standardized protocol appears to be generally useful. Copyright © 2016. Published by Elsevier B.V.
Puzon, Geoffrey J; Lancaster, James A; Wylie, Jason T; Plumb, Iason J
2009-09-01
Rapid detection of pathogenic Naegleria fowleri in water distribution networks is critical for water utilities. Current detection methods rely on sampling drinking water followed by culturing and molecular identification of purified strains. This culture-based method takes an extended amount of time (days), detects both nonpathogenic and pathogenic species, and does not account for N. fowleri cells associated with pipe wall biofilms. In this study, a total DNA extraction technique coupled with a real-time PCR method using primers specific for N. fowleri was developed and validated. The method readily detected N. fowleri without preculturing, with the lowest detection limit for N. fowleri cells spiked in biofilm being one cell (66% detection rate) and five cells (100% detection rate). For drinking water, the detection limit was five cells (66% detection rate) and 10 cells (100% detection rate). By comparison, culture-based methods were less sensitive for detection of cells spiked into both biofilm (66% detection for <10 cells) and drinking water (0% detection for <10 cells). In mixed cultures of N. fowleri and nonpathogenic Naegleria, the method identified N. fowleri in 100% of all replicates, whereas tests with the current consensus primers detected N. fowleri in only 5% of all replicates. Application of the new method to drinking water and pipe wall biofilm samples obtained from a distribution network enabled the detection of N. fowleri in under 6 h, versus 3+ days for the culture-based method. Further, comparison of the real-time PCR data from the field samples and the standard curves enabled an approximation of N. fowleri cells in the biofilm and drinking water. The use of such a method will further aid water utilities in detecting and managing the persistence of N. fowleri in water distribution networks.
Rakkiyappan, R; Sakthivel, N; Cao, Jinde
2015-06-01
This study examines the exponential synchronization of complex dynamical networks with control packet loss and additive time-varying delays. Additionally, a sampled-data controller with a time-varying sampling period is considered; the sampling period is assumed to switch between m different values in a random way with given probability. Then, a novel Lyapunov-Krasovskii functional (LKF) with triple integral terms is constructed and, by using Jensen's inequality and a reciprocally convex approach, sufficient conditions under which the dynamical network is exponentially mean-square stable are derived. When applying Jensen's inequality to partition double integral terms in the derivation of linear matrix inequality (LMI) conditions, a new kind of linear combination of positive functions weighted by the inverses of squared convex parameters appears. In order to handle such a combination, an effective method is introduced by extending the lower bound lemma. To design the sampled-data controller, the synchronization error system is represented as a switched system. Based on the derived LMI conditions and the average dwell-time method, sufficient conditions for the synchronization of the switched error system are derived in terms of LMIs. Finally, a numerical example is employed to show the effectiveness of the proposed methods. Copyright © 2015 Elsevier Ltd. All rights reserved.
Muthukrishnan, Madhanmohan; Singanallur, Nagendrakumar B; Ralla, Kumar; Villuppanoor, Srinivasan A
2008-08-01
Foot-and-mouth disease virus (FMDV) samples transported to the laboratory from remote and inaccessible areas for serodiagnosis pose a major problem in a tropical country like India, where temperature fluctuations are extreme. Inadequate storage methods lead to spoilage of FMDV samples collected from clinically positive animals in the field. Such samples are declared non-typeable by the typing laboratories, with the consequent loss of valuable epidemiological data. The present study evaluated the usefulness of FTA Classic Cards for the collection, shipment, storage and identification of the FMDV genome by RT-PCR and real-time RT-PCR. The stability of the viral RNA, the absence of infectivity and the ease of processing the sample for molecular methods make the FTA cards a useful option for transport of the FMDV genome for identification and serotyping. The method can be used routinely for FMDV research as it is economical and the cards can be transported easily in envelopes by regular document transport methods. Live virus cannot be isolated from samples collected on FTA cards, which is a limitation. This property can be viewed as an advantage, as it limits the risk of transmission of live virus.
Cai, Ying; Yan, Zhihong; Wang, Lijia; NguyenVan, Manh; Cai, Qingyun
2016-01-15
A magnetic solid-phase extraction (MSPE) protocol combined with a static headspace gas chromatography coupled to mass spectrometry (HS-GC-MS) method has been developed for the extraction and determination of 16 polycyclic aromatic hydrocarbons (PAHs) in drinking water samples. Magnetic nanoparticles (MNPs) were coated with 3-aminopropyltriethoxysilane and modified by cholesterol chloroformate. Transmission electron microscopy, vibrating sample magnetometry, Fourier transform infrared spectrometry and X-ray photoelectron spectroscopy were used to characterize the cholesterol-functionalized sorbents, and the main parameters affecting the extraction as well as HS sampling, such as sorbent amount, extraction time, oven temperature and equilibration time, have been investigated and established. In combination with HS sampling, the MSPE procedure was simple, fast and environmentally friendly, without the need for any organic solvent. Method validation proved the feasibility of the developed sorbents for the quantitation of the investigated analytes at trace levels, with limits of detection (S/N=3) ranging from 0.20 to 7.8 ng/L. Good values for intra- and inter-day precision were obtained (RSDs ≤ 9.9%). The proposed method was successfully applied to drinking water samples. Copyright © 2015 Elsevier B.V. All rights reserved.
Ensemble Sampling vs. Time Sampling in Molecular Dynamics Simulations of Thermal Conductivity
Gordiz, Kiarash; Singh, David J.; Henry, Asegun
2015-01-29
In this report we compare time sampling and ensemble averaging as two different methods available for phase space sampling. For the comparison, we calculate thermal conductivities of solid argon and silicon structures, using equilibrium molecular dynamics. We introduce two different schemes for the ensemble averaging approach, and show that both can reduce the total simulation time as compared to time averaging. It is also found that velocity rescaling is an efficient mechanism for phase space exploration. Although our methodology is tested using classical molecular dynamics, the ensemble generation approaches may find their greatest utility in computationally expensive simulations such as first principles molecular dynamics. For such simulations, where each time step is costly, time sampling can require long simulation times because each time step must be evaluated sequentially and therefore phase space averaging is achieved through sequential operations. On the other hand, with ensemble averaging, phase space sampling can be achieved through parallel operations, since each ensemble is independent. For this reason, particularly when using massively parallel architectures, ensemble sampling can result in much shorter simulation times and exhibits similar overall computational effort.
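A toy illustration of the time-sampling versus ensemble-sampling contrast, assuming a Green-Kubo-style estimate in which conductivity is proportional to the integral of a heat-flux autocorrelation function (the usual equilibrium-MD route; the abstract does not spell out the estimator). The "heat flux" here is stand-in AR(1) noise rather than MD output; the point is only that one long trajectory and many short, independent trajectories give comparable averaged autocorrelations, and the short trajectories can be generated in parallel.

```python
import numpy as np

rng = np.random.default_rng(2)

def autocorr(x, max_lag):
    """Unnormalized autocorrelation <x(0) x(t)> estimated from one trajectory."""
    x = x - x.mean()
    return np.array([np.mean(x[:len(x) - lag] * x[lag:]) for lag in range(max_lag)])

def fake_heat_flux(n_steps):
    """Stand-in for a heat-flux time series from an MD run (AR(1) noise here)."""
    j = np.zeros(n_steps)
    for t in range(1, n_steps):
        j[t] = 0.95 * j[t - 1] + rng.standard_normal()
    return j

max_lag = 200

# Time sampling: one long trajectory, correlation averaged along time.
acf_time = autocorr(fake_heat_flux(100_000), max_lag)

# Ensemble sampling: many short, independent trajectories (e.g. decorrelated by
# velocity rescaling), averaged over the ensemble; same total data as above.
acf_ens = np.mean([autocorr(fake_heat_flux(5_000), max_lag) for _ in range(20)], axis=0)

# In a Green-Kubo estimate, conductivity is proportional to the integrated ACF.
print("integrated ACF (time sampling):    ", np.trapz(acf_time))
print("integrated ACF (ensemble sampling):", np.trapz(acf_ens))
```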
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coleman, Charles J.; Edwards, Thomas B.
2005-04-30
The wet chemistry digestion method development for providing process control elemental analyses of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) Melter Feed Preparation Vessel (MFPV) samples is divided into two phases: Phase I consists of: (1) optimizing digestion methods as a precursor to elemental analyses by ICP-AES techniques; (2) selecting methods with the desired analytical reliability and speed to support the nine-hour or less turnaround time requirement of the WTP; and (3) providing baseline comparison to the laser ablation (LA) sample introduction technique for ICP-AES elemental analyses that is being developed at the Savannah River National Laboratory (SRNL). Phase II consists of: (1) Time-and-Motion study of the selected methods from Phase I with actual Hanford waste or waste simulants in shielded cell facilities to ensure that the methods can be performed remotely and maintain the desired characteristics; and (2) digestion of glass samples prepared from actual Hanford Waste tank sludge for providing comparative results to the LA Phase II study. Based on the Phase I testing discussed in this report, a tandem digestion approach consisting of sodium peroxide fusion digestions carried out in nickel crucibles and warm mixed-acid digestions carried out in plastic bottles has been selected for Time-and-Motion study in Phase II. SRNL experience with performing this analytical approach in laboratory hoods indicates that well-trained cell operator teams will be able to perform the tandem digestions in five hours or less. The selected approach will produce two sets of solutions for analysis by ICP-AES techniques. Four hours would then be allocated for performing the ICP-AES analyses and reporting results to meet the nine-hour or less turnaround time requirement. The tandem digestion approach will need to be performed in two separate shielded analytical cells by two separate cell operator teams in order to achieve the nine-hour or less turnaround time. Because of the simplicity of the warm mixed-acid method, a well-trained cell operator team may in time be able to perform both sets of digestions. However, having separate shielded cells for each of the methods is prudent to avoid overcrowding problems that would impede a minimal turnaround time.
Baumgartner, Jeannine; Zeder, Christophe; Krzystek, Adam; Osei, Jennifer; Haldimann, Max; Zimmermann, Michael B.; Andersson, Maria
2016-01-01
Background: Breast milk iodine concentration (BMIC) may be an indicator of iodine status during lactation, but there are few data comparing different analytical methods or timing of sampling. The aims of this study were: (i) to optimize a new inductively coupled plasma mass spectrometry (ICP-MS) method; and (ii) to evaluate the effect of analytical method and timing of within-feed sample collection on BMIC. Methods: The colorimetric Sandell–Kolthoff method was evaluated with (a) or without (b) alkaline ashing, and ICP-MS was evaluated using a new 129I isotope ratio approach including Tellurium (Te) for mass bias correction (c) or external standard curve (d). From iodine-sufficient lactating women (n = 97), three samples were collected within one breast-feeding session (fore-, mid-, and hind-feed samples) and BMIC was analyzed using (c) and (d). Results: Iodine recovery from NIST SRM1549a whole milk powder for methods (a)–(d) was 67%, 24%, 105%, and 102%, respectively. Intra- and inter-assay coefficients of variation for ICP-MS comparing (c) and (d) were 1.3% versus 5.6% (p = 0.04) and 1.1% versus 2.4% (p = 0.33). The limit of detection (LOD) was lower for (c) (0.26 μg/kg) than it was for (d) (2.54 μg/kg; p = 0.02). Using (c), the median [95% confidence interval (CI) obtained by bootstrap] BMIC (μg/kg) in foremilk (179 [CI 161–206]) and in mid-feed milk (184 [CI 160–220]) were not significantly different (p = 0.017), but were higher than in hindmilk (175 [CI 153–216]; p < 0.001). In foremilk using (d), BMIC was 199 ([CI 182–257]; p < 0.001 vs. (c)). The variation in BMIC comparing (c) and (d) (13%) was greater than variation within feeding (5%; p < 0.001). Conclusions: Because of poor recoveries, (a) and (b) should not be used to measure BMIC. Compared with (d), (c) has the advantages of higher precision and a lower LOD. In iodine-sufficient women, BMIC shows low variation within a breast-feeding session, so timing of sampling is not a major determinant of BMIC. PMID:26563466
Alles, Susan; Peng, Linda X; Mozola, Mark A
2009-01-01
A modification to Performance-Tested Method 010403, GeneQuence Listeria Test (DNAH method), is described. The modified method uses a new media formulation, LESS enrichment broth, in single-step enrichment protocols for both foods and environmental sponge and swab samples. Food samples are enriched for 27-30 h at 30 degrees C, and environmental samples for 24-48 h at 30 degrees C. Implementation of these abbreviated enrichment procedures allows test results to be obtained on a next-day basis. In testing of 14 food types in internal comparative studies with inoculated samples, there were statistically significant differences in method performance between the DNAH method and reference culture procedures for only 2 foods (pasteurized crab meat and lettuce) at the 27 h enrichment time point and for only a single food (pasteurized crab meat) in one trial at the 30 h enrichment time point. Independent laboratory testing with 3 foods showed statistical equivalence between the methods for all foods, and results support the findings of the internal trials. Overall, considering both internal and independent laboratory trials, sensitivity of the DNAH method relative to the reference culture procedures was 90.5%. Results of testing 5 environmental surfaces inoculated with various strains of Listeria spp. showed that the DNAH method was more productive than the reference U.S. Department of Agriculture-Food Safety and Inspection Service (USDA-FSIS) culture procedure for 3 surfaces (stainless steel, plastic, and cast iron), whereas results were statistically equivalent to the reference method for the other 2 surfaces (ceramic tile and sealed concrete). An independent laboratory trial with ceramic tile inoculated with L. monocytogenes confirmed the effectiveness of the DNAH method at the 24 h time point. Overall, sensitivity of the DNAH method at 24 h relative to that of the USDA-FSIS method was 152%. The DNAH method exhibited extremely high specificity, with only 1% false-positive reactions overall.
Galea, Karen S; Mueller, Will; Arfaj, Ayman M; Llamas, Jose L; Buick, Jennifer; Todd, David; McGonagle, Carolyn
2018-05-21
Crude oil may cause adverse dermal effects; therefore, dermal exposure is an exposure route of concern. Galea et al. (2014b) reported on a study comparing recovery (wipe) and interception (cotton glove) dermal sampling methods. The authors concluded that both methods were suitable for assessing dermal exposure to oil-based drilling fluids and crude oil, but that glove samplers may overestimate the amount of fluid transferred to the skin. We describe a study that aimed to further evaluate the wipe sampling method for assessing dermal exposure to crude oil, including extended sample storage periods and sampling efficiency tests undertaken at environmental conditions mimicking typical outdoor conditions in Saudi Arabia. The wipe sampling method was then used to assess laboratory technicians' actual exposure to crude oil during typical petroleum laboratory tasks. Overall, acceptable storage efficiencies up to 54 days were reported, with results suggesting storage stability over time. Sampling efficiencies were also satisfactory at both ambient and elevated temperature and relative humidity conditions for surrogate skin spiked with known masses of crude oil and left up to 4 h prior to wiping, though there was an indication of reduced sampling efficiency over time. Nineteen petroleum laboratory technicians provided a total of 35 pre- and 35 post-activity paired hand wipe samples. Ninety-three percent of the pre-exposure paired hand wipes were less than the analytical limit of detection (LOD), whereas 46% of the post-activity paired hand wipes were less than the LOD. The geometric mean paired post-activity wipe sample measurement was 3.09 µg cm-2 (range 1.76-35.4 µg cm-2). It was considered that dermal exposure most frequently occurred through direct contact with the crude oil (emission) or via deposition. The findings of this study suggest that the wipe sampling method is satisfactory for quantifying laboratory technicians' dermal exposure to crude oil. It is therefore considered that this wipe sampling method may be suitable for quantifying dermal exposure to crude oil for other petroleum workers.
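A small sketch of how a geometric mean like the one reported above can be computed when some wipes fall below the LOD. Substituting LOD/2 for non-detects is one common convention, assumed here purely for illustration; the study's actual handling of non-detects is not stated in the abstract, and all values below are hypothetical.

```python
import numpy as np

def geometric_mean_with_lod(values_ug_cm2, lod_ug_cm2, nd_fraction_of_lod=0.5):
    """Geometric mean of wipe results where non-detects (None) are replaced by a
    fraction of the LOD (LOD/2 here -- a common convention, not necessarily the
    one used in the study)."""
    filled = np.array([v if v is not None else nd_fraction_of_lod * lod_ug_cm2
                       for v in values_ug_cm2], dtype=float)
    return float(np.exp(np.mean(np.log(filled))))

# Hypothetical post-activity hand-wipe results (ug/cm2); None marks < LOD.
post = [None, 2.1, None, 3.4, 1.8, None, 6.0, 35.4, None, 1.76]
print(f"GM = {geometric_mean_with_lod(post, lod_ug_cm2=1.5):.2f} ug/cm2")
```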
Optical method for the characterization of laterally-patterned samples in integrated circuits
Maris, Humphrey J.
2001-01-01
Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.
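A hedged sketch of the timing-to-dimension idea in the simplest picosecond-ultrasonics geometry: if the change in optical response marks the return of the strain pulse from a buried interface, the echo delay corresponds to a round trip and the layer thickness follows from the sound velocity. The velocity and delay below are hypothetical example values; the patented method relates timing to structure dimensions more generally than this one-layer case.

```python
def thickness_from_echo(echo_time_ps, sound_velocity_nm_per_ps):
    """Simplified relation: a strain pulse launched at the surface travels to a
    buried interface and back, so the echo delay is a round trip and the layer
    thickness is d = v * t / 2."""
    return sound_velocity_nm_per_ps * echo_time_ps / 2.0

# Hypothetical numbers: ~6.4 nm/ps (roughly the longitudinal sound velocity of
# aluminum) and a 31 ps echo delay give a layer of roughly 100 nm.
print(f"d = {thickness_from_echo(31.0, 6.4):.1f} nm")
```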
Maris, Humphrey J.
2008-03-04
Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.
Optical method for the characterization of laterally-patterned samples in integrated circuits
Maris, Humphrey J.
2010-08-24
Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.
Optical method for the characterization of laterally patterned samples in integrated circuits
Maris, Humphrey J [Barrington, RI
2009-03-17
Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.
Maris, Humphrey J [Barrington, RI
2011-02-22
Disclosed is a method for characterizing a sample having a structure disposed on or within the sample, comprising the steps of applying a first pulse of light to a surface of the sample for creating a propagating strain pulse in the sample, applying a second pulse of light to the surface so that the second pulse of light interacts with the propagating strain pulse in the sample, sensing from a reflection of the second pulse a change in optical response of the sample, and relating a time of occurrence of the change in optical response to at least one dimension of the structure.
ERIC Educational Resources Information Center
Hofer, Scott M.; Flaherty, Brian P.; Hoffman, Lesa
2006-01-01
The effect of time-related mean differences on estimates of association in cross-sectional studies has not been widely recognized in developmental and aging research. Cross-sectional studies of samples varying in age have found moderate to high levels of shared age-related variance among diverse age-related measures. These findings may be…
Fast and accurate Monte Carlo sampling of first-passage times from Wiener diffusion models.
Drugowitsch, Jan
2016-02-11
We present a new, fast approach for drawing boundary-crossing samples from Wiener diffusion models. Diffusion models are widely applied to model choices and reaction times in two-choice decisions. Samples from these models can be used to simulate the choices and reaction times they predict. These samples, in turn, can be utilized to adjust the models' parameters to match observed behavior from humans and other animals. Usually, such samples are drawn by simulating a stochastic differential equation in discrete time steps, which is slow and leads to biases in the reaction time estimates. Our method instead exploits known expressions for first-passage time densities, which results in unbiased, exact samples and a hundred- to thousand-fold speed increase in typical situations. In its most basic form it is restricted to diffusion models with symmetric boundaries and non-leaky accumulation, but our approach can be extended to also handle asymmetric boundaries or to approximate leaky accumulation.
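For contrast with the exact-sampling approach described above, here is the naive baseline the abstract refers to: discrete-time Euler simulation of the diffusion with symmetric boundaries, which is slow and slightly biases reaction times because boundary crossings between steps are missed. The drift, noise, boundary, and step size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

def fpt_euler(drift, noise_sd, bound, dt=1e-3, t_max=10.0):
    """Naive discrete-time simulation of a Wiener diffusion with symmetric
    boundaries at +/- bound.  Returns (first-passage time, choice)."""
    x, t = 0.0, 0.0
    sqrt_dt = np.sqrt(dt)
    while t < t_max:
        x += drift * dt + noise_sd * sqrt_dt * rng.standard_normal()
        t += dt
        if x >= bound:
            return t, +1   # upper-boundary choice
        if x <= -bound:
            return t, -1   # lower-boundary choice
    return t_max, 0        # no decision within t_max

samples = [fpt_euler(drift=0.5, noise_sd=1.0, bound=1.0) for _ in range(500)]
rts = np.array([t for t, c in samples if c != 0])
p_upper = np.mean([c == +1 for t, c in samples if c != 0])
print(f"mean RT = {rts.mean():.3f}, P(upper) = {p_upper:.2f}")
```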
Glass frit nebulizer for atomic spectrometry
Layman, L.R.
1982-01-01
The nebulization of sample solutions is a critical step in most flame or plasma atomic spectrometric methods. A novel nebulization technique, based on a porous glass frit, has been investigated. Basic operating parameters and characteristics have been studied to determine how this new nebulizer may be applied to atomic spectrometric methods. The results of preliminary comparisons with pneumatic nebulizers indicate several notable differences. The frit nebulizer produces a smaller droplet size distribution and has a higher sample transport efficiency. The mean droplet size is approximately 0.1 µm, and up to 94% of the sample is converted to usable aerosol. The most significant limitations in the performance of the frit nebulizer are the slow sample equilibration time and the requirement for wash cycles between samples. Loss of solute by surface adsorption and contamination of samples by leaching from the glass were both found to be limitations only in unusual cases. This nebulizer shows great promise where sample volume is limited or where measurements require long nebulization times.
Chin, Wai Hoe; Sun, Yi; Høgberg, Jonas; Quyen, Than Linh; Engelsmann, Pia; Wolff, Anders; Bang, Dang Duong
2017-04-01
Salmonellosis, an infectious disease caused by Salmonella spp., is one of the most common foodborne diseases. Isolation and identification of Salmonella by the conventional bacterial culture method is time-consuming. In response to the demand for rapid online or at-site detection of pathogens, in this study we developed a multiplex Direct PCR method for rapid detection of different Salmonella serotypes directly from pork meat samples without any DNA purification steps. An inhibitor-resistant Phusion Pfu DNA polymerase was used to overcome PCR inhibition. Four pairs of primers, including a pair of newly designed primers targeting Salmonella spp. at the subtype level, were incorporated in the multiplex Direct PCR. To maximize the efficiency of the Direct PCR, the ratio between sample and dilution buffer was optimized. The sensitivity and specificity of the multiplex Direct PCR were tested using naturally contaminated pork meat samples for detecting and subtyping of Salmonella spp. Conventional bacterial culture methods were used as the reference to evaluate the performance of the multiplex Direct PCR. Relative accuracy, sensitivity and specificity of 98.8%, 97.6% and 100%, respectively, were achieved by the method. Application of the multiplex Direct PCR to detect Salmonella in pork meat at slaughter reduces the time of detection from 5 to 6 days by conventional bacterial culture and serotyping methods to 14 h (including 12 h enrichment time). Furthermore, the method offers the possibility of miniaturization and integration into a point-of-need Lab-on-a-chip system for rapid online pathogen detection. Copyright © 2016 Elsevier Ltd. All rights reserved.
Viswanathan, Tito
2014-07-29
A method of synthesizing carbon-magnetite nanocomposites. In one embodiment, the method includes the steps of (a) dissolving a first amount of an alkali salt of lignosulfonate in water to form a first solution, (b) heating the first solution to a first temperature, (c) adding a second amount of iron sulfate (FeSO.sub.4) to the first solution to form a second solution, (d) heating the second solution at a second temperature for a first duration of time effective to form a third solution of iron lignosulfonate, (e) adding a third amount of 1N sodium hydroxide (NaOH) to the third solution of iron lignosulfonate to form a fourth solution with a first pH level, (f) heating the fourth solution at a third temperature for a second duration of time to form a first sample, and (g) subjecting the first sample to a microwave radiation for a third duration of time effective to form a second sample containing a plurality of carbon-magnetite nanocomposites.
Taylor, Vivien F; Toms, Andrew; Longerich, Henry P
2002-01-01
The application of open vessel focused microwave acid digestion is described for the preparation of geological and environmental samples for analysis using inductively coupled plasma-mass spectrometry (ICP-MS). The method is compared to conventional closed-vessel high pressure methods which are limited in the use of HF to break down silicates. Open-vessel acid digestion more conveniently enables the use of HF to remove Si from geological and plant samples as volatile SiF4, as well as evaporation-to-dryness and sequential acid addition during the procedure. Rock reference materials (G-2 granite, MRG-1 gabbros, SY-2 syenite, JA-1 andesite, and JB-2 and SRM-688 basalts) and plant reference materials (BCR and IAEA lichens, peach leaves, apple leaves, Durham wheat flour, and pine needles) were digested with results comparable to conventional hotplate digestion. The microwave digestion method gave poor results for granitic samples containing refractory minerals, however fusion was the preferred method of preparation for these samples. Sample preparation time was reduced from several days, using conventional hotplate digestion method, to one hour per sample using our microwave method.
SMA Diagnosis: Detection of SMN1 Deletion with Real-Time mCOP-PCR System Using Fresh Blood DNA.
Niba, Emma Tabe Eko; Ar Rochmah, Mawaddah; Harahap, Nur Imma Fatimah; Awano, Hiroyuki; Morioka, Ichiro; Iijima, Kazumoto; Saito, Toshio; Saito, Kayoko; Takeuchi, Atsuko; Lai, Poh San; Bouike, Yoshihiro; Nishio, Hisahide; Shinohara, Masakazu
2017-12-18
Spinal muscular atrophy (SMA) is one of the most common autosomal recessive disorders. The symptoms are caused by defects of lower motor neurons in the spinal cord. More than 95% of SMA patients are homozygous for survival motor neuron 1 (SMN1) deletion. We previously developed a screening system for SMN1 deletion based on a modified competitive oligonucleotide priming-PCR (mCOP-PCR) technique using dried blood spots (DBS) on filter paper. This system is convenient for mass screening in large populations and/or as a first-tier diagnostic method for patients in remote areas. However, this system was still time-consuming and labor-intensive, because it required a pre-amplification procedure to avoid non-specific amplification and gel electrophoresis to detect the presence or absence of SMN1 deletion. When fresh blood samples are used instead of DBS, or when gel electrophoresis is replaced by real-time PCR, we may have a simpler and more rapid diagnostic method for SMA. The aim was to establish a simpler and more rapid diagnostic method for detection of SMN1 deletion using fresh blood DNA. DNA samples were extracted from fresh blood and stored at 4 °C for 1 month. The samples were assayed using a real-time mCOP-PCR system without pre-amplification procedures. The DNA samples had already been genotyped by PCR-restriction fragment length polymorphism (PCR-RFLP), showing the presence or absence of SMN1 exon 7. The DNA samples were directly subjected to the mCOP-PCR step. The amplification of mCOP-PCR was monitored in a real-time PCR apparatus. The genotyping results of the real-time mCOP-PCR system using fresh blood DNA completely matched those of PCR-RFLP. In this real-time mCOP-PCR system using fresh blood DNA, it took only four hours from extraction of DNA to detection of the presence or absence of SMN1 deletion, whereas PCR-RFLP took more than 12 hours. Our real-time mCOP-PCR system using fresh blood DNA was rapid and accurate, suggesting it may be useful as a first-tier diagnostic method for SMA.
Sampling is the act of selecting items from a specified population in order to estimate the parameters of that population (e.g., selecting soil samples to characterize the properties at an environmental site). Sampling occurs at various levels and times throughout an environmenta...
A STRINGENT COMPARISON OF SAMPLING AND ANALYSIS METHODS FOR VOCS IN AMBIENT AIR
A carefully designed study was conducted during the summer of 1998 to simultaneously collect samples of ambient air by canisters and compare the analysis results to direct sorbent preconcentration results taken at the time of sample collection. A total of 32 1-h sample sets we...
Zboromyrska, Y; Rubio, E; Alejo, I; Vergara, A; Mons, A; Campo, I; Bosch, J; Marco, F; Vila, J
2016-06-01
The current gold standard method for the diagnosis of urinary tract infections (UTI) is urine culture that requires 18-48 h for the identification of the causative microorganisms and an additional 24 h until the results of antimicrobial susceptibility testing (AST) are available. The aim of this study was to shorten the time of urine sample processing by a combination of flow cytometry for screening and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for bacterial identification followed by AST directly from urine. The study was divided into two parts. During the first part, 675 urine samples were processed by a flow cytometry device and a cut-off value of bacterial count was determined to select samples for direct identification by MALDI-TOF-MS at ≥5 × 10(6) bacteria/mL. During the second part, 163 of 1029 processed samples reached the cut-off value. The sample preparation protocol for direct identification included two centrifugation and two washing steps. Direct AST was performed by the disc diffusion method if a reliable direct identification was obtained. Direct MALDI-TOF-MS identification was performed in 140 urine samples; 125 of the samples were positive by urine culture, 12 were contaminated and 3 were negative. Reliable direct identification was obtained in 108 (86.4%) of the 125 positive samples. AST was performed in 102 identified samples, and the results were fully concordant with the routine method among 83 monomicrobial infections. In conclusion, the turnaround time of the protocol described to diagnose UTI was about 1 h for microbial identification and 18-24 h for AST. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
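A minimal sketch of the screening triage described above. The ≥5 × 10(6) bacteria/mL cut-off comes from the abstract; routing below-cut-off samples to the routine culture workflow is an assumption made here for illustration.

```python
# Cut-off for selecting samples for direct MALDI-TOF-MS identification,
# taken from the flow cytometry screening step described in the abstract.
DIRECT_ID_CUTOFF = 5e6  # bacteria per mL

def route_urine_sample(bacteria_per_ml: float) -> str:
    """Triage logic: high-count samples go to direct identification and
    direct disc-diffusion AST; others follow the routine culture workflow
    (assumed here)."""
    if bacteria_per_ml >= DIRECT_ID_CUTOFF:
        return "direct MALDI-TOF-MS identification, then disc-diffusion AST"
    return "routine culture workflow"

for count in (1e5, 4.9e6, 2e7):
    print(f"{count:.1e} bacteria/mL -> {route_urine_sample(count)}")
```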
Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory
2016-05-12
Second, a statistical method is developed to estimate the memory depth of discrete-time and continuously-valued time series from a sample (a practical algorithm to compute the estimator is a work in progress). Third, finitely-valued spatial processes are considered. Keywords: mathematical statistics; time series; Markov chains; random processes.
Rapid determination of 226Ra in emergency urine samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Sherrod L.; Culligan, Brian K.; Hutchison, Jay B.
2014-02-27
A new method has been developed at the Savannah River National Laboratory (SRNL) that can be used for the rapid determination of 226Ra in emergency urine samples following a radiological incident. If a radiological dispersive device event or a nuclear accident occurs, there will be an urgent need for rapid analyses of radionuclides in urine samples to ensure the safety of the public. Large numbers of urine samples will have to be analyzed very quickly. This new SRNL method was applied to 100 mL urine aliquots, however this method can be applied to smaller or larger sample aliquots as needed. The method was optimized for rapid turnaround times; urine samples may be prepared for counting in <3 h. A rapid calcium phosphate precipitation method was used to pre-concentrate 226Ra from the urine sample matrix, followed by removal of calcium by cation exchange separation. A stacked elution method using DGA Resin was used to purify the 226Ra during the cation exchange elution step. This approach combines the cation resin elution step with the simultaneous purification of 226Ra with DGA Resin, saving time. 133Ba was used instead of 225Ra as tracer to allow immediate counting; however, 225Ra can still be used as an option. The rapid purification of 226Ra to remove interferences using DGA Resin was compared with a slightly longer Ln Resin approach. A final barium sulfate micro-precipitation step was used with isopropanol present to reduce solubility; producing alpha spectrometry sources with peaks typically <40 keV FWHM (full width half max). This new rapid method is fast, has very high tracer yield (>90 %), and removes interferences effectively. The sample preparation method can also be adapted to ICP-MS measurement of 226Ra, with rapid removal of isobaric interferences.
Ramos, Inês I; Magalhães, Luís M; Barreiros, Luisa; Reis, Salette; Lima, José L F C; Segundo, Marcela A
2018-01-01
Immunoglobulin G (IgG) represents the major fraction of antibodies in healthy adult human serum, and deviations from physiological levels are a generic marker of disease corresponding to different pathologies. Therefore, screening methods for IgG evaluation are a valuable aid to diagnostics. The present work proposes a rapid, automatic, and miniaturized method based on UV-vis micro-bead injection spectroscopy (μ-BIS) for the real-time determination of human serum IgG with label-free detection. Relying on the attachment of IgG to rec-protein G immobilized on Sepharose 4B, a bioaffinity column is automatically assembled, where IgG is selectively retained and determined by on-column optical density measurement. A "dilution-and-shoot" approach (50 to 200 times) was implemented without further sample treatment because interferences were flushed out of the column upon sample loading, with carryover and cross-contamination minimized by automatically discarding the sorbent (0.2 mg) after each determination. No interference from human serum albumin at 60 mg mL-1 in undiluted sample was found. The method allowed IgG determination in the range 100-300 μg mL-1 (corresponding to 5.0-60 mg mL-1 in undiluted samples), with a detection limit of 33 μg mL-1 (1.7 mg mL-1 for samples, dilution factor of 50). RSD values were <9.4% and <11.7% for intra- and inter-assay precision, respectively, while recovery values for human serum spiked with IgG at high pathological levels were 97.8-101.4%. Comparison to a commercial ELISA kit showed no significant difference for tested samples (n = 8). Moreover, time-to-result decreased from several hours to <5 min and analysis cost decreased 10-fold, showing the potential of the proposed approach as a point-of-care method. Graphical abstract: micro-bead injection spectroscopy method for real-time, automated and label-free determination of total serum human immunoglobulin G (IgG). The method was designed for lab-on-valve (LOV) platforms using a miniaturised protein G bioaffinity separative approach. IgG is separated from serum matrix components and quantified with low non-specific binding in less than 5 min.
NASA Astrophysics Data System (ADS)
Qi, Shengqi; Hou, Deyi; Luo, Jian
2017-09-01
This study presents a numerical model based on field data to simulate groundwater flow in both the aquifer and the well-bore for the low-flow sampling method and the well-volume sampling method. The numerical model was calibrated to match well with field drawdown, and calculated flow regime in the well was used to predict the variation of dissolved oxygen (DO) concentration during the purging period. The model was then used to analyze sampling representativeness and sampling time. Site characteristics, such as aquifer hydraulic conductivity, and sampling choices, such as purging rate and screen length, were found to be significant determinants of sampling representativeness and required sampling time. Results demonstrated that: (1) DO was the most useful water quality indicator in ensuring groundwater sampling representativeness in comparison with turbidity, pH, specific conductance, oxidation reduction potential (ORP) and temperature; (2) it is not necessary to maintain a drawdown of less than 0.1 m when conducting low flow purging. However, a high purging rate in a low permeability aquifer may result in a dramatic decrease in sampling representativeness after an initial peak; (3) the presence of a short screen length may result in greater drawdown and a longer sampling time for low-flow purging. Overall, the present study suggests that this new numerical model is suitable for describing groundwater flow during the sampling process, and can be used to optimize sampling strategies under various hydrogeological conditions.
Swezey, Robert; Shinn, Walter; Green, Carol; Drover, David R.; Hammer, Gregory B.; Schulman, Scott R.; Zajicek, Anne; Jett, David A.; Boss, Gerry R.
2013-01-01
Most hospital laboratories do not measure blood cyanide concentrations, and samples must be sent to reference laboratories. A simple method is needed for measuring cyanide in hospitals. The authors previously developed a method to quantify cyanide based on the high binding affinity of the vitamin B12 analog, cobinamide, for cyanide and a major spectral change observed for cyanide-bound cobinamide. This method is now validated in human blood, and the findings include a mean inter-assay accuracy of 99.1%, precision of 8.75% and a lower limit of quantification of 3.27 µM cyanide. The method was applied to blood samples from children treated with sodium nitroprusside and it yielded measurable results in 88 of 172 samples (51%), whereas the reference laboratory yielded results in only 19 samples (11%). In all 19 samples, the cobinamide-based method also yielded measurable results. The two methods showed reasonable agreement when analyzed by linear regression, but not when analyzed by a standard error of the estimate or paired t-test. Differences in results between the two methods may be because samples were assayed at different times on different sample types. The cobinamide-based method is applicable to human blood, and can be used in hospital laboratories and emergency rooms. PMID:23653045
A new real-time PCR protocol for detection of avian haemosporidians.
Bell, Jeffrey A; Weckstein, Jason D; Fecchio, Alan; Tkach, Vasyl V
2015-07-19
Birds possess the most diverse assemblage of haemosporidian parasites, including three genera: Plasmodium, Haemoproteus, and Leucocytozoon. Currently there are over 200 morphologically identified avian haemosporidian species, although true species richness is unknown due to great genetic diversity and insufficient sampling in highly diverse regions. Studies aimed at surveying haemosporidian diversity involve collecting and screening samples from hundreds to thousands of individuals. Currently, screening relies on microscopy and/or single or nested standard PCR. Although effective, these methods are time- and resource-consuming, and in the case of microscopy require substantial expertise. Here we report a newly developed real-time PCR protocol designed to quickly and reliably detect all three genera of avian haemosporidians in a single biochemical reaction. Using available DNA sequences from avian haemosporidians we designed primers R330F and R480RL, which flank a 182 base pair fragment of conserved mitochondrial rDNA. These primers were initially tested using real-time PCR on samples from Malawi, Africa, previously screened for avian haemosporidians using traditional nested PCR. Our real-time protocol was further tested on 94 samples from the Cerrado biome of Brazil, previously screened using a single PCR assay for haemosporidian parasites. These samples were also amplified using modified nested PCR protocols, allowing for comparisons between the three different screening methods (single PCR, nested PCR, real-time PCR). The real-time PCR protocol successfully identified all three genera of avian haemosporidians from both single and mixed infections previously detected from Malawi. There was no significant difference between the three different screening protocols used for the 94 samples from the Brazilian Cerrado (χ2 = 0.3429, df = 2, P = 0.842). After proving effective, the real-time protocol was used to screen 2113 Brazilian samples, identifying 693 positive samples. Our real-time PCR assay proved as effective as two widely used molecular screening techniques, single PCR and nested PCR. However, the real-time protocol has the distinct advantage of detecting all three genera in a single reaction, which significantly increases efficiency by greatly decreasing screening time and cost. Our real-time PCR protocol is therefore a valuable tool in the quickly expanding field of avian haemosporidian research.
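A sketch of the three-way protocol comparison reported above. The positive/negative counts are hypothetical (the abstract reports only the resulting statistic), and scipy's chi-square test of homogeneity is assumed here to stand in for whatever exact procedure the authors used.

```python
from scipy.stats import chi2_contingency

# Hypothetical positive/negative counts for the 94 Cerrado samples under the
# three screening protocols (illustrative only; the paper reports only the
# resulting statistic, chi-square = 0.3429, df = 2, P = 0.842).
counts = [
    [40, 54],  # single PCR:   positives, negatives
    [43, 51],  # nested PCR
    [44, 50],  # real-time PCR
]
chi2, p, dof, expected = chi2_contingency(counts)
print(f"chi-square = {chi2:.3f}, df = {dof}, P = {p:.3f}")
```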
Comparability among four invertebrate sampling methods, Fountain Creek Basin, Colorado, 2010-2012
Zuellig, Robert E.; Bruce, James F.; Stogner, Sr., Robert W.; Brown, Krystal D.
2014-01-01
The U.S. Geological Survey, in cooperation with Colorado Springs City Engineering and Colorado Springs Utilities, designed a study to determine if sampling method and sample timing resulted in comparable samples and assessments of biological condition. To accomplish this task, annual invertebrate samples were collected concurrently using four sampling methods at 15 U.S. Geological Survey streamflow gages in the Fountain Creek basin from 2010 to 2012. Collectively, the four methods are used by local (U.S. Geological Survey cooperative monitoring program) and State monitoring programs (Colorado Department of Public Health and Environment) in the Fountain Creek basin to produce two distinct sample types for each program that target single and multiple habitats. This study found distinguishable differences between single- and multi-habitat sample types using both community similarities and multi-metric index values, while methods from each program within sample type were comparable. This indicates that the Colorado Department of Public Health and Environment methods were compatible with the cooperative monitoring program methods within multi- and single-habitat sample types. Comparisons between September and October samples found distinguishable differences based on community similarities for both sample types, whereas differences were found only for single-habitat samples when multi-metric index values were considered. At one site, differences between September and October index values from single-habitat samples resulted in opposing assessments of biological condition. Direct application of the results to inform the revision of the existing Fountain Creek basin U.S. Geological Survey cooperative monitoring program is discussed.
Prediction of final error level in learning and repetitive control
NASA Astrophysics Data System (ADS)
Levoci, Peter A.
Repetitive control (RC) is a field that creates controllers to eliminate the effects of periodic disturbances on a feedback control system. The methods have applications in spacecraft problems, to isolate fine pointing equipment from periodic vibration disturbances such as slight imbalances in momentum wheels or cryogenic pumps. A closely related field of control design is iterative learning control (ILC) which aims to eliminate tracking error in a task that repeats, each time starting from the same initial condition. Experiments done on a robot at NASA Langley Research Center showed that the final error levels produced by different candidate repetitive and learning controllers can be very different, even when each controller is analytically proven to converge to zero error in the deterministic case. Real world plant and measurement noise and quantization noise (from analog to digital and digital to analog converters) in these control methods are acted on as if they were error sources that will repeat and should be cancelled, which implies that the algorithms amplify such errors. Methods are developed that predict the final error levels of general first order ILC, of higher order ILC including current cycle learning, and of general RC, in the presence of noise, using frequency response methods. The method involves much less computation than the corresponding time domain approach that involves large matrices. The time domain approach was previously developed for ILC and handles a certain class of ILC methods. Here methods are created to include zero-phase filtering that is very important in creating practical designs. Also, time domain methods are developed for higher order ILC and for repetitive control. Since RC and ILC must be implemented digitally, all of these methods predict final error levels at the sample times. It is shown here that RC can easily converge to small error levels between sample times, but that ILC in most applications will have large and diverging intersample error if in fact zero error is reached at the sample times. This is independent of the ILC law used, and is purely a property of the physical system. Methods are developed to address this issue.
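A hedged sketch of the kind of iterative learning control law discussed above: a first-order ILC update applied to a toy first-order plant, with measurement noise re-entering every repetition so the tracking error settles at a nonzero floor rather than converging to zero. The plant, gains, and noise level are assumptions for illustration, not the thesis's models or its frequency-domain prediction method.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy discrete-time plant: first-order lag y[t] = a*y[t-1] + b*u[t-1].
a, b, n = 0.8, 0.5, 50
y_des = np.sin(np.linspace(0, 2 * np.pi, n))   # desired trajectory
u = np.zeros(n)                                 # initial feedforward input
learning_gain, noise_sd = 0.6, 0.01

def run_trial(u):
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = a * y[t - 1] + b * u[t - 1]
    return y + noise_sd * rng.standard_normal(n)   # measurement noise

for k in range(30):
    e = y_des - run_trial(u)
    # First-order ILC law: next-trial input corrected by a shifted, scaled error.
    u[:-1] += learning_gain * e[1:]
    if k % 10 == 0 or k == 29:
        print(f"iteration {k:2d}: RMS error = {np.sqrt(np.mean(e**2)):.4f}")
```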
Nizamani, Sooraj; Kazi, Tasneem G; Afridi, Hassan I
2018-01-01
An efficient preconcentration technique based on an ultrasonic-assisted ionic liquid-based dual microextraction (UA-ILDµE) method has been developed to preconcentrate lead (Pb2+) in ground and stored rain water. In the proposed method, Pb2+ was complexed with a chelating agent (dithizone), whereas an ionic liquid (1-butyl-3-methylimidazolium hexafluorophosphate) was used for extraction purposes. Ultrasonic irradiation and an electrical shaking system were applied to enhance the dispersion and extraction of the Pb2+ complex in aqueous samples. In the second phase, dual microextraction (DµE), the enriched Pb2+ complex in the ionic liquid was extracted back into an acidic aqueous solution and finally determined by flame atomic absorption spectrometry. Major analytical parameters that influence the extraction efficiency of the developed method, such as pH, ligand concentration, ionic liquid and sample volumes, shaking time in the thermostatic electrical shaker and ultrasonic bath, back-extracting HNO3 volume, matrix effect, and centrifugation time and rate, were optimized. At a sample volume of 25 mL, the calculated preconcentration factor was 62.2. The limit of detection of the proposed procedure for Pb2+ was found to be 0.54 μg L-1. The developed method was validated by analysis of the certified water reference material SRM 1643e and by the standard addition method in a real water sample. The extraction recovery of Pb2+ was enhanced by ≥2% with a shaking time of 80 s in the ultrasonic bath compared with the thermostatic electrical shaker, which required up to 10 min for optimum recovery. The developed procedure was successfully used for the enrichment of Pb2+ in ground and stored rain water (surface water) samples of an endemic region of Pakistan. The resulting data indicated that the groundwater samples were highly contaminated with Pb2+, while some of the surface water samples also had Pb2+ levels above the WHO permissible limit. The concentration of Pb2+ in surface and groundwater samples was in the range of 17.5-24.5 and 25.6-99.1 μg L-1, respectively. Copyright © 2017 Elsevier B.V. All rights reserved.
Sampling methods for terrestrial amphibians and reptiles.
Paul Stephen Corn; R. Bruce Bury
1990-01-01
Methods described for sampling amphibians and reptiles in Douglas-fir forests in the Pacific Northwest include pitfall trapping, time-constrained collecting, and surveys of coarse woody debris. The herpetofauna of this region differ in breeding and nonbreeding habitats and vagility, so that no single technique is sufficient for a community study. A combination of...
40 CFR 53.63 - Test procedure: Wind tunnel inlet aspiration test.
Code of Federal Regulations, 2010 CFR
2010-07-01
... the sampler inlet opening centered in the sampling zone. To meet the maximum blockage limit of § 53.62(c)(1) or for convenience, part of the test sampler may be positioned external to the wind tunnel... = reference method sampler volumetric flow rate; and t = sampling time. (iii) Remove the reference method...
Portable automation of static chamber sample collection for quantifying soil gas flux
USDA-ARS?s Scientific Manuscript database
The collection of soil gas flux using the static chamber method is labor intensive. The number of chambers that can be sampled in a given time period is limited by the spacing between chambers and the availability of trained research technicians. However, the static chamber method can limit spatial ...
Hoppin, Jane A; Ulmer, Ross; Calafat, Antonia M; Barr, Dana B; Baker, Susan V; Meltzer, Helle M; Rønningen, Kjersti S
2006-01-01
Collection of urine samples in human studies involves choices regarding shipping, sample preservation, and storage that may ultimately influence future analysis. As more studies collect and archive urine samples to evaluate environmental exposures in the future, we were interested in assessing the impact of urine preservative, storage temperature, and time since collection on nonpersistent contaminants in urine samples. In spiked urine samples stored in three types of urine vacutainers (no preservative, boric acid, and chlorhexidine), we measured five groups of contaminants to assess the levels of these analytes at five time points (0, 24, 48, and 72 h, and 1 week) and at two temperatures (room temperature and 4 degrees C). The target chemicals were bisphenol A (BPA), metabolites of organophosphate (OP), carbamate, and pyrethroid insecticides, chlorinated phenols, and phthalate monoesters, and were measured using five different mass spectrometry-based methods. Three samples were analyzed at each time point, with the exception of BPA. Repeated measures analysis of variance was used to evaluate effects of storage time, temperature, and preservative. Stability was summarized with percent change in mean concentration from time 0. In general, most analytes were stable under all conditions with changes in mean concentration over time, temperature, and preservative being generally less than 20%, with the exception of the OP metabolites in the presence of boric acid. The effect of storage temperature was less important than time since collection. The precision of the laboratory measurements was high allowing us to observe small differences, which may not be important when categorizing individuals into broader exposure groups.
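A small sketch of the stability summary described above (percent change in mean concentration from time 0), using hypothetical replicate values for a single analyte, preservative, and storage temperature.

```python
import numpy as np

# Hypothetical spiked-urine results (ng/mL): three replicates per time point,
# for one analyte under one preservative and storage temperature.
measurements = {
    0:   [101.0, 98.5, 100.2],
    24:  [99.0, 97.8, 102.1],
    48:  [96.5, 98.0, 95.9],
    72:  [94.2, 97.1, 93.8],
    168: [92.0, 95.5, 90.7],   # 1 week
}

baseline = np.mean(measurements[0])
for hours, reps in measurements.items():
    pct_change = 100.0 * (np.mean(reps) - baseline) / baseline
    print(f"{hours:>3} h: mean = {np.mean(reps):6.1f} ng/mL, "
          f"change from time 0 = {pct_change:+.1f}%")
```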
Alagandula, Ravali; Zhou, Xiang; Guo, Baochuan
2017-01-15
Liquid chromatography/tandem mass spectrometry (LC/MS/MS) is the gold standard of urine drug testing. However, current LC-based methods are time-consuming, limiting the throughput of MS-based testing and increasing the cost. This is particularly problematic for quantification of drugs such as phenobarbital, which are often analyzed in separate runs because they must be negatively ionized. This study examined the feasibility of using a dilute-and-shoot flow-injection method without LC separation to quantify drugs, with phenobarbital as a model system. Briefly, a urine sample containing phenobarbital was first diluted 10-fold, followed by flow injection of the diluted sample into the mass spectrometer. Quantification and detection of phenobarbital were achieved by an electrospray negative ionization MS/MS system operated in the multiple reaction monitoring (MRM) mode with the stable-isotope-labeled drug as internal standard. The dilute-and-shoot flow-injection method developed was linear over a dynamic range of 50-2000 ng/mL of phenobarbital, with a correlation coefficient > 0.9996. The coefficients of variation and relative errors for intra- and inter-assays at four quality control (QC) levels (50, 125, 445 and 1600 ng/mL) were 3.0% and 5.0%, respectively. The total run time to quantify one sample was 2 min, and the sensitivity and specificity of the method did not deteriorate even after 1200 consecutive injections. Our method can accurately and robustly quantify phenobarbital in urine without LC separation. Because of its 2 min run time, the method can process 720 samples per day. This feasibility study shows that the dilute-and-shoot flow-injection method can be a general approach for fast analysis of drugs in urine. Copyright © 2016 John Wiley & Sons, Ltd.
Revesz, Kinga M.; Landwehr, Jurate M.
2002-01-01
A new method was developed to analyze the stable carbon and oxygen isotope ratios of small samples (400 ± 20 µg) of calcium carbonate. This new method streamlines the classical phosphoric acid/calcium carbonate (H3PO4/CaCO3) reaction method by making use of a recently available Thermoquest-Finnigan GasBench II preparation device and a Delta Plus XL continuous flow isotope ratio mass spectrometer. Conditions for which the H3PO4/CaCO3 reaction produced reproducible and accurate results with minimal error had to be determined. When the acid/carbonate reaction temperature was kept at 26 °C and the reaction time was between 24 and 54 h, the precision of the carbon and oxygen isotope ratios for pooled samples from three reference standard materials was ≤0.1 and ≤0.2 per mill or ‰, respectively, although later analysis showed that materials from one specific standard required reaction time between 34 and 54 h for δ18O to achieve this level of precision. Aliquot screening methods were shown to further minimize the total error. The accuracy and precision of the new method were analyzed and confirmed by statistical analysis. The utility of the method was verified by analyzing calcite from Devils Hole, Nevada, for which isotope-ratio values had previously been obtained by the classical method. Devils Hole core DH-11 recently had been re-cut and re-sampled, and isotope-ratio values were obtained using the new method. The results were comparable with those obtained by the classical method with correlation = +0.96 for both isotope ratios. The consistency of the isotopic results is such that an alignment offset could be identified in the re-sampled core material, and two cutting errors that occurred during re-sampling then were confirmed independently. This result indicates that the new method is a viable alternative to the classical reaction method. In particular, the new method requires less sample material permitting finer resolution and allows automation of some processes resulting in considerable time savings.
Zhou, Qingxiang; Fang, Zhi; Liao, Xiangkun
2015-07-01
We describe a highly sensitive micro-solid-phase extraction method for the pre-concentration of six phthalate esters utilizing a TiO2 nanotube array coupled to high-performance liquid chromatography with a variable-wavelength ultraviolet-visible detector. The selected phthalate esters included dimethyl phthalate, diethyl phthalate, dibutyl phthalate, butyl benzyl phthalate, bis(2-ethylhexyl)phthalate and dioctyl phthalate. The factors affecting the enrichment, such as desorption solvent, sample pH, salting-out effect, extraction time and desorption time, were optimized. Under the optimum conditions, the linear range of the proposed method was 0.3-200 μg/L. The limits of detection were 0.04-0.2 μg/L (S/N = 3). The proposed method was successfully applied to the determination of six phthalate esters in water samples and satisfactory spiked recoveries were achieved. These results indicated that the proposed method was appropriate for the determination of trace phthalate esters in environmental water samples. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
McEvoy, Eamon; Donegan, Sheila; Power, Joe; Altria, Kevin
2007-05-09
A rapid and efficient oil-in-water microemulsion liquid chromatography (MELC) method has been optimised and validated for the analysis of paracetamol in a suppository formulation. Excellent linearity, accuracy, precision and assay results were obtained. Lengthy sample pre-treatment/extraction procedures were eliminated due to the solubilising power of the microemulsion, and rapid analysis times were achieved. The method was optimised to achieve rapid analysis time and relatively high peak efficiencies. A standard microemulsion composition of 33 g SDS, 66 g butan-1-ol, 8 g n-octane in 1 L of 0.05% TFA modified with acetonitrile has been shown to be suitable for the rapid analysis of paracetamol in highly hydrophobic preparations under isocratic conditions. Validated assay results and the overall analysis time of the optimised method were compared to British Pharmacopoeia reference methods. Sample preparation and analysis times for the MELC analysis of paracetamol in a suppository were extremely rapid compared to the reference method, and similar assay results were achieved. A gradient MELC method using the same microemulsion has been optimised for the resolution of paracetamol and five of its related substances in approximately 7 min.
Characterizing lentic freshwater fish assemblages using multiple sampling methods
Fischer, Jesse R.; Quist, Michael C.
2014-01-01
Characterizing fish assemblages in lentic ecosystems is difficult, and multiple sampling methods are almost always necessary to gain reliable estimates of indices such as species richness. However, most research focused on lentic fish sampling methodology has targeted recreationally important species, and little to no information is available regarding the influence of multiple methods and timing (i.e., temporal variation) on characterizing entire fish assemblages. Therefore, six lakes and impoundments (48–1,557 ha surface area) were sampled seasonally with seven gear types to evaluate the combined influence of sampling methods and timing on the number of species and individuals sampled. Probabilities of detection for species indicated strong selectivities and seasonal trends that provide guidance on optimal seasons to use gears when targeting multiple species. The evaluation of species richness and number of individuals sampled using multiple gear combinations demonstrated no appreciable benefit over relatively few gears (e.g., up to four) used in optimal seasons. Specifically, over 90% of the species encountered with all gear types and season combinations (N = 19) from six lakes and reservoirs were sampled with nighttime boat electrofishing in the fall and benthic trawling, modified-fyke, and mini-fyke netting during the summer. Our results indicated that the characterization of lentic fish assemblages was highly influenced by the selection of sampling gears and seasons, but did not appear to be influenced by waterbody type (i.e., natural lake, impoundment). The standardization of data collected with multiple methods and seasons to account for bias is imperative for monitoring lentic ecosystems and will provide researchers with increased reliability in their interpretations and decisions made using information on lentic fish assemblages.
Segura, Ferran; Pons, Immaculada; Sanfeliu, Isabel; Nogueras, María-Mercedes
2016-04-01
Rickettsia conorii and Rickettsia massiliae-Bar29 are related to Mediterranean spotted fever (MSF). They are intracellular microorganisms. The shell-vial culture assay (SV) improved Rickettsia culture, but it still has some limitations: blood usually contains a low number of microorganisms, and the samples that contain the highest numbers are non-sterile. The objectives of this study were to optimize SV culture conditions and monitoring methods and to establish antibiotic concentrations useful for non-sterile samples. Twelve SVs were inoculated with each microorganism, incubated at different temperatures and monitored by classical methods and real-time PCR. R. conorii was detected by all methods at all temperatures from the 7th day of incubation onward. R. massiliae-Bar29 was first observed at 28°C. Real-time PCR allowed it to be detected 2-7 days earlier (depending on temperature) than classical methods. The antibiotic concentrations needed for the isolation of these Rickettsia species from non-sterile samples were determined by inoculating SVs with R. conorii, R. massiliae-Bar29, biopsy or tick material, incubating them with different dilutions of antibiotics and monitoring them weekly. To sum up, if MSF is suspected, SVs should be incubated at both 28°C and 32°C for 1-3 weeks and monitored by a sensitive real-time PCR. If the sample is non-sterile, the panel of antibiotics tested can be added. Copyright © 2016 Elsevier GmbH. All rights reserved.
Digital carrier demodulator employing components working beyond normal limits
NASA Technical Reports Server (NTRS)
Hurd, William J. (Inventor); Sadr, Ramin (Inventor)
1990-01-01
In a digital device, having an input comprised of a digital sample stream at a frequency F, a method is disclosed for employing a component designed to work at a frequency less than F. The method, in general, is comprised of the following steps: dividing the digital sample stream into odd and even digital sample streams, each at a frequency of F/2; passing one of the digital sample streams through the component designed to work at a frequency less than F, where the component responds only to the odd or even digital samples in that stream; delaying the other digital sample stream for the time it takes the first digital sample stream to pass through the component; and adding the one digital sample stream, after it has passed through the component, to the other delayed digital sample stream. In the specific example, the component is a finite impulse response filter of the order ((N + 1)/2) and the delaying step comprises passing the other digital sample stream through a shift register for a time (in sampling periods) of ((N + 1)/2) + r, where r is a pipeline delay through the finite impulse response filter.
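The claim describes a split/filter/delay/recombine pattern so that a component rated for F/2 can serve a stream sampled at F. The toy sketch below illustrates that pattern only; it is not the patented demodulator, and the 9-tap moving-average filter and the recombination rule are stand-ins.

```python
import numpy as np

def split_process_recombine(x, fir_taps):
    """Split a full-rate stream into even/odd half-rate streams, filter one
    branch, delay the other by the filter's latency, then add them."""
    even, odd = x[0::2], x[1::2]                  # two streams at F/2
    filtered = np.convolve(even, fir_taps)[:len(even)]
    delay = (len(fir_taps) + 1) // 2              # roughly ((N + 1)/2) samples
    delayed_odd = np.concatenate([np.zeros(delay), odd])[:len(odd)]
    return filtered + delayed_odd

x = np.sin(2 * np.pi * 0.05 * np.arange(256))     # example input sample stream
taps = np.ones(9) / 9.0                           # simple 9-tap FIR stand-in
y = split_process_recombine(x, taps)
```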
Plane-Based Sampling for Ray Casting Algorithm in Sequential Medical Images
Lin, Lili; Chen, Shengyong; Shao, Yan; Gu, Zichun
2013-01-01
This paper proposes a plane-based sampling method to improve the traditional Ray Casting Algorithm (RCA) for the fast reconstruction of a three-dimensional biomedical model from sequential images. In the novel method, the optical properties of all sampling points depend on the intersection points when a ray travels through an equidistant parallel plane cluster of the volume dataset. The results show that the method improves the rendering speed by more than three times compared with the conventional algorithm while the image quality is well preserved. PMID:23424608
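A hedged sketch of the core geometric idea, as far as the abstract states it: sampling points are taken where the ray intersects an equidistant stack of parallel planes rather than at fixed step lengths along the ray. The plane positions and ray below are illustrative.

```python
import numpy as np

def ray_plane_samples(origin, direction, z_planes):
    """Return the points where a ray intersects planes z = const (the ray is
    assumed not to be parallel to the planes)."""
    direction = direction / np.linalg.norm(direction)
    t = (z_planes - origin[2]) / direction[2]      # ray parameter at each plane
    t = t[t > 0]                                   # keep intersections in front
    return origin + np.outer(t, direction)         # sampling points on the ray

pts = ray_plane_samples(np.array([0.0, 0.0, 0.0]),
                        np.array([0.1, 0.2, 1.0]),
                        np.arange(1.0, 6.0, 1.0))  # planes z = 1..5
```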
Jedynak, Łukasz; Jedynak, Maria; Kossykowska, Magdalena; Zagrodzka, Joanna
2017-02-20
An HPLC method with UV detection and separation with the use of a C30 reversed phase analytical column for the determination of chemical purity and assay of menaquinone-7 (MK7) in one chromatographic run was developed. The method is superior to the methods published in the USP Monograph in terms of selectivity, sensitivity and accuracy, as well as time, solvent and sample consumption. The developed methodology was applied to MK7 samples of active pharmaceutical ingredient (API) purity, MK7 samples of lower quality and crude MK7 samples before purification. The comparison of the results revealed that the use of USP methodology could lead to serious overestimation (up to a few percent) of both purity and MK7 assay in menaquinone-7 samples. Copyright © 2016 Elsevier B.V. All rights reserved.
Jiang, Jia-Jia; Duan, Fa-Jie; Li, Yan-Chao; Hua, Xiang-Ning
2014-03-01
Synchronization sampling is very important in an underwater towed array system, where every acquisition node (AN) samples analog signals with its own analog-to-digital converter (ADC). In this paper, a simple and effective synchronization sampling method is proposed to ensure synchronized operation among the different ANs of the underwater towed array system. We first present a master-slave synchronization sampling model, and then design a high-accuracy phase-locked loop to synchronize all delta-sigma ADCs to a reference clock. However, when the master-slave synchronization sampling model is used, both the time-delay (TD) of messages traveling along the wired transmission medium and the jitter of the clocks introduce synchronization sampling error (SSE). Therefore, a simple method is proposed to estimate and compensate for the TD of message transmission, and another effective method is presented to overcome the SSE caused by clock jitter. An experimental system with three ANs is set up, and the related experimental results verify the validity of the synchronization sampling method proposed in this paper.
Gianfranceschi, Monica Virginia; Rodriguez-Lazaro, David; Hernandez, Marta; González-García, Patricia; Comin, Damiano; Gattuso, Antonietta; Delibato, Elisabetta; Sonnessa, Michele; Pasquali, Frederique; Prencipe, Vincenza; Sreter-Lancz, Zuzsanna; Saiz-Abajo, María-José; Pérez-De-Juan, Javier; Butrón, Javier; Kozačinski, Lidija; Tomic, Danijela Horvatek; Zdolec, Nevijo; Johannessen, Gro S; Jakočiūnė, Džiuginta; Olsen, John Elmerdahl; De Santis, Paola; Lovari, Sarah; Bertasi, Barbara; Pavoni, Enrico; Paiusco, Antonella; De Cesare, Alessandra; Manfreda, Gerardo; De Medici, Dario
2014-08-01
The classical microbiological method for detection of Listeria monocytogenes requires around 7 days for final confirmation, and due to the perishable nature of ready-to-eat (RTE) food products, there is a clear need for an alternative methodology for detection of this pathogen. This study presents an international (European-level) ISO 16140-based validation trial of a non-proprietary real-time PCR-based methodology that can generate final results on the day following the analysis. This methodology is based on an ISO-compatible enrichment coupled to a bacterial DNA extraction and a consolidated real-time PCR assay. Twelve laboratories from six European countries participated in this trial, and soft cheese was selected as the food model since it can represent a difficult matrix for bacterial DNA extraction and real-time PCR amplification. The limit of detection observed was down to 10 CFU per 25 g of sample, with excellent concordance and accordance values between samples and laboratories (>75%). In addition, excellent values were obtained for relative accuracy, specificity and sensitivity (82.75%, 96.70% and 97.62%, respectively) when the results obtained with the real-time PCR-based method were compared to those of the ISO 11290-1 standard method. An interesting observation was that L. monocytogenes detection by the real-time PCR method was less affected by the presence of Listeria innocua in the contaminated samples, proving therefore to be more reliable than the reference method. The results of this international trial demonstrate that the evaluated real-time PCR-based method represents an excellent alternative to the ISO standard since it shows higher performance as well as reducing the duration of the analytical process, and can be easily implemented routinely by the competent authorities and food industry laboratories. Copyright © 2014 Elsevier B.V. All rights reserved.
R.B. Ferguson; V. Clark Baldwin
1995-01-01
Estimating tree and stand volume in mature plantations is time consuming, involving much manpower and equipment; however, several sampling and volume-prediction techniques are available. This study showed that a well-constructed, volume-equation method yields estimates comparable to those of the often more time-consuming, height-accumulation method, even though the...
NASA Astrophysics Data System (ADS)
Godsey, S. E.; Kirchner, J. W.
2008-12-01
The mean residence time - the average time that it takes rainfall to reach the stream - is a basic parameter used to characterize catchment processes. Heterogeneities in these processes lead to a distribution of travel times around the mean residence time. By examining this travel time distribution, we can better predict catchment response to contamination events. A catchment system with shorter residence times or narrower distributions will respond quickly to contamination events, whereas systems with longer residence times or longer-tailed distributions will respond more slowly to those same contamination events. The travel time distribution of a catchment is typically inferred from time series of passive tracers (e.g., water isotopes or chloride) in precipitation and streamflow. Variations in the tracer concentration in streamflow are usually damped compared to those in precipitation, because precipitation inputs from different storms (with different tracer signatures) are mixed within the catchment. Mathematically, this mixing process is represented by the convolution of the travel time distribution and the precipitation tracer inputs to generate the stream tracer outputs. Because convolution in the time domain is equivalent to multiplication in the frequency domain, it is relatively straightforward to estimate the parameters of the travel time distribution in either domain. In the time domain, the parameters describing the travel time distribution are typically estimated by maximizing the goodness of fit between the modeled and measured tracer outputs. In the frequency domain, the travel time distribution parameters can be estimated by fitting a power-law curve to the ratio of precipitation spectral power to stream spectral power. Differences between the methods of parameter estimation in the time and frequency domain mean that these two methods may respond differently to variations in data quality, record length and sampling frequency. Here we evaluate how well these two methods of travel time parameter estimation respond to different sources of uncertainty and compare the methods to one another. We do this by generating synthetic tracer input time series of different lengths, and convolving these with specified travel-time distributions to generate synthetic output time series. We then sample both the input and output time series at various sampling intervals and corrupt the time series with realistic error structures. Using these 'corrupted' time series, we infer the apparent travel time distribution, and compare it to the known distribution that was used to generate the synthetic data in the first place. This analysis allows us to quantify how different record lengths, sampling intervals, and error structures in the tracer measurements affect the apparent mean residence time and the apparent shape of the travel time distribution.
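A small synthetic sketch of the convolution model described above: an assumed gamma-shaped travel-time distribution damps a synthetic precipitation tracer series into a stream tracer series, and the spectral-ratio view follows from the same convolution. The gamma parameters and noise are illustrative, not fitted values.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0                                    # days per sample
t = np.arange(0, 365, dt)
c_precip = rng.normal(0.0, 1.0, t.size)     # synthetic input tracer anomaly

shape, scale = 2.0, 30.0                    # assumed gamma parameters (days)
h = t ** (shape - 1) * np.exp(-t / scale)
h /= h.sum() * dt                           # normalized travel-time distribution

c_stream = np.convolve(c_precip, h)[:t.size] * dt   # damped stream output

# Frequency-domain view: the output power is the input power times |H(f)|^2,
# so the ratio of the two spectra constrains the travel-time distribution.
power_in = np.abs(np.fft.rfft(c_precip)) ** 2
power_out = np.abs(np.fft.rfft(c_stream)) ** 2
```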
A novel multi-target regression framework for time-series prediction of drug efficacy.
Li, Haiqing; Zhang, Wei; Chen, Ying; Guo, Yumeng; Li, Guo-Zheng; Zhu, Xiaoxin
2017-01-18
Extracting knowledge from small samples is a challenging pharmacokinetic problem, where statistical methods can be applied. Pharmacokinetic data are special due to small sample sizes and high dimensionality, which makes it difficult to adopt conventional methods to predict the efficacy of a traditional Chinese medicine (TCM) prescription. The main purpose of our study is to obtain some knowledge of the correlation in TCM prescription. Here, a novel method named the Multi-target Regression Framework is proposed to deal with the problem of efficacy prediction. We exploit the correlation between the values of different time sequences and add the predicted targets of previous time points as features to predict the value at the current time. Several experiments are conducted to test the validity of our method, and the results of leave-one-out cross-validation clearly demonstrate the competitiveness of our framework. Compared with linear regression, artificial neural networks, and partial least squares, support vector regression combined with our framework demonstrates the best performance and appears to be more suitable for this task.
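A hedged sketch of the lagged-target idea: to predict the target at time point t, the targets of earlier time points are appended to the feature vector, and one support vector regressor is trained per time point. The data below are synthetic; the TCM dataset and exact features are not given in the abstract.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)
n_samples, n_features, n_times = 30, 5, 4
X = rng.normal(size=(n_samples, n_features))
Y = np.cumsum(rng.normal(size=(n_samples, n_times)), axis=1)  # target per time

models = []
for t in range(n_times):
    feats = np.hstack([X, Y[:, :t]])     # features + targets of previous times
    models.append(SVR(kernel="rbf", C=1.0).fit(feats, Y[:, t]))

# At prediction time, the outputs for earlier time points would be fed forward
# as extra features for the later models.
```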
Sochor, Jiri; Ryvolova, Marketa; Krystofova, Olga; Salas, Petr; Hubalek, Jaromir; Adam, Vojtech; Trnkova, Libuse; Havel, Ladislav; Beklova, Miroslava; Zehnalek, Josef; Provaznik, Ivo; Kizek, Rene
2010-11-29
The aim of this study was to describe the behaviour, kinetics, time courses and limitations of six different fully automated spectrometric methods: DPPH, TEAC, FRAP, DMPD, Free Radicals and Blue CrO5. Absorption curves were measured and absorbance maxima were found. All methods were calibrated using the standard compounds Trolox® and/or gallic acid. Calibration curves were determined (relative standard deviation was within the range from 1.5 to 2.5%). The obtained characteristics were compared and discussed. Moreover, the data obtained were used to optimize and to automate all of the mentioned protocols. The automatic analyzer allowed us to analyse a larger set of samples simultaneously, to decrease the measurement time, to eliminate errors and to provide data of higher quality in comparison to manual analysis. The total time of analysis for one sample was decreased to 10 min for all six methods. In contrast, the total time of manual spectrometric determination was approximately 120 min. The obtained data showed good correlations between the studied methods (R=0.97-0.99).
Maes, Sharon; Huu, Son Nguyen; Heyndrickx, Marc; Weyenberg, Stephanie van; Steenackers, Hans; Verplaetse, Alex; Vackier, Thijs; Sampers, Imca; Raes, Katleen; Reu, Koen De
2017-12-01
Biofilms are an important source of contamination in food companies, yet the composition of biofilms in practice is still mostly unknown. The chemical and microbiological characterization of surface samples taken after cleaning and disinfection is very important to distinguish free-living bacteria from the attached bacteria in biofilms. In this study, sampling methods that are potentially useful for both chemical and microbiological analyses of surface samples were evaluated. In the manufacturing facilities of eight Belgian food companies, surfaces were sampled after cleaning and disinfection using two sampling methods: the scraper-flocked swab method and the sponge stick method. Microbiological and chemical analyses were performed on these samples to evaluate the suitability of the sampling methods for the quantification of extracellular polymeric substance components and microorganisms originating from biofilms in these facilities. The scraper-flocked swab method was most suitable for chemical analyses of the samples because the material in these swabs did not interfere with determination of the chemical components. For microbiological enumerations, the sponge stick method was slightly but not significantly more effective than the scraper-flocked swab method. In all but one of the facilities, at least 20% of the sampled surfaces had more than 10² CFU/100 cm². Proteins were found in 20% of the chemically analyzed surface samples, and carbohydrates and uronic acids were found in 15 and 8% of the samples, respectively. When chemical and microbiological results were combined, 17% of the sampled surfaces were contaminated with both microorganisms and at least one of the analyzed chemical components; thus, these surfaces were characterized as carrying biofilm. Overall, microbiological contamination in the food industry is highly variable by food sector and even within a facility at various sampling points and sampling times.
Parenthood and the Quality of Experience in Daily Life: A Longitudinal Study
ERIC Educational Resources Information Center
Fave, Antonella Delle; Massimini, Fausto
2004-01-01
This longitudinal study analyzes the time budget and the quality of experience reported by new parents. Five primiparous couples were repeatedly administered the Experience Sampling Method. They carried pagers sending random signals 6-8 times a day; at the signal reception, they filled out forms sampling current thoughts, activities, and the quality…
Decoder calibration with ultra small current sample set for intracortical brain-machine interface
NASA Astrophysics Data System (ADS)
Zhang, Peng; Ma, Xuan; Chen, Luyao; Zhou, Jin; Wang, Changyong; Li, Wei; He, Jiping
2018-04-01
Objective. Intracortical brain-machine interfaces (iBMIs) aim to restore efficient communication and movement ability for paralyzed patients. However, frequent recalibration is required for consistency and reliability, and every recalibration requires a relatively large current sample set. The aim of this study is to develop an effective decoder calibration method that can achieve good performance while minimizing recalibration time. Approach. Two rhesus macaques implanted with intracortical microelectrode arrays were trained separately on movement and sensory paradigms. Neural signals were recorded to decode reaching positions or grasping postures. A novel principal component analysis-based domain adaptation (PDA) method was proposed to recalibrate the decoder with only an ultra-small current sample set by taking advantage of large historical data, and the decoding performance was compared with three other calibration methods for evaluation. Main results. The PDA method closed the gap between historical and current data effectively, and made it possible to take advantage of large historical data for decoder recalibration in current data decoding. Using only an ultra-small current sample set (five trials of each category), the decoder calibrated using the PDA method achieved much better and more robust performance in all sessions than the other three calibration methods in both monkeys. Significance. (1) In this study, transfer learning theory was brought into iBMI decoder calibration for the first time. (2) Different from most transfer learning studies, the target data in this study were an ultra-small sample set and were transferred to the source data. (3) By taking advantage of historical data, the PDA method was demonstrated to be effective in reducing recalibration time for both the movement paradigm and the sensory paradigm, indicating viable generalization. By reducing the demand for large current training data, this new method may facilitate the application of intracortical brain-machine interfaces in clinical practice.
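The abstract does not give the PDA algorithm itself, so the following is only an illustrative stand-in for a PCA-based domain-adaptation step: current neural features are projected into the principal subspace of the historical data and re-centered there, so a decoder trained on the large historical set can be reused with only a handful of current trials.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
hist = rng.normal(size=(500, 96))            # many historical trials x channels
curr = rng.normal(loc=0.5, size=(5, 96))     # ultra-small current sample set

pca = PCA(n_components=10).fit(hist)         # subspace of the historical data
hist_z = pca.transform(hist)
curr_z = pca.transform(curr)
curr_z_aligned = curr_z - curr_z.mean(0) + hist_z.mean(0)  # crude mean alignment
# A decoder fit on hist_z could now be applied to curr_z_aligned.
```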
Current Protocols in Pharmacology
2016-01-01
Determination of drug or drug metabolite concentrations in biological samples, particularly in serum or plasma, is fundamental to describing the relationships between administered dose, route of administration, and time after dose to the drug concentrations achieved and to the observed effects of the drug. A well-characterized, accurate analytical method is needed, but it must also be established that the analyte concentration in the sample at the time of analysis is the same as the concentration at sample acquisition. Drugs and metabolites may be susceptible to degradation in samples due to metabolism or to physical and chemical processes, resulting in a lower measured concentration than was in the original sample. Careful examination of analyte stability during processing and storage and adjustment of procedures and conditions to maximize that stability are a critical part of method validation for the analysis, and can ensure the accuracy of the measured concentrations. PMID:27960029
Gardenier, Nicole Ciotti; MacDonald, Rebecca; Green, Gina
2004-01-01
We compared partial-interval recording (PIR) and momentary time sampling (MTS) estimates against continuous measures of the actual durations of stereotypic behavior in young children with autism or pervasive developmental disorder-not otherwise specified. Twenty-two videotaped samples of stereotypy were scored using a low-tech duration recording method, and relative durations (i.e., proportions of observation periods consumed by stereotypy) were calculated. Then 10, 20, and 30s MTS and 10s PIR estimates of relative durations were derived from the raw duration data. Across all samples, PIR was found to grossly overestimate the relative duration of stereotypy. Momentary time sampling both over- and under-estimated the relative duration of stereotypy, but with much smaller errors than PIR (Experiment 1). These results were replicated across 27 samples of low, moderate and high levels of stereotypy (Experiment 2).
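For readers unfamiliar with the two estimators, the sketch below shows how PIR and MTS values are derived from a continuous duration record: PIR scores an interval if the behavior occurred at any point within it, whereas MTS scores an interval only if the behavior is occurring at the final instant. The episode times are invented for illustration, not taken from the study.

```python
import numpy as np

session = 300.0                                  # seconds observed
episodes = [(10, 25), (60, 61), (120, 180)]      # (onset, offset) of behavior

def occurring(t):
    return any(a <= t < b for a, b in episodes)

true_rel = sum(b - a for a, b in episodes) / session   # actual relative duration

def pir(interval):
    edges = np.arange(0, session, interval)
    return np.mean([any(a < e + interval and b > e for a, b in episodes)
                    for e in edges])

def mts(interval):
    edges = np.arange(interval, session + 1e-9, interval)
    return np.mean([occurring(t) for t in edges])

print(true_rel, pir(10), mts(10), mts(20), mts(30))
```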
Peschka, Manuela; Roberts, Paul H; Knepper, Thomas P
2007-10-01
The analysis and presence of clotrimazole, an antifungal agent with logK(OW) > 4, were thoroughly studied in the aquatic environment. For that reason, analytical methods based on gas chromatography-mass spectrometry and liquid chromatography-tandem mass spectrometry were developed and validated to quantify clotrimazole with limits of quantification down to 5 and 1 ng/L, respectively. Both methods were compared in an intercalibration exercise. The complete mass-spectrometric fragmentation pattern could be elucidated with the aid of quadrupole time-of-flight mass spectrometry. Since clotrimazole tends to adsorb to laboratory glassware, studies on its adsorption behaviour were made to ensure the appropriate handling of water samples, e.g. pH, storage time, pretreatment of sampling vessels or material of the vials used for final extracts. The phenomena of adsorption to suspended matter were investigated while analysing different waste-water samples. Application of the methods to various wastewater and surface water samples demonstrated that clotrimazole could only be detected in the low nanogram per litre range in anthropogenically influenced unfiltered water samples after acidification to pH 2.
A Class of Prediction-Correction Methods for Time-Varying Convex Optimization
NASA Astrophysics Data System (ADS)
Simonetto, Andrea; Mokhtari, Aryan; Koppel, Alec; Leus, Geert; Ribeiro, Alejandro
2016-09-01
This paper considers unconstrained convex optimization problems with time-varying objective functions. We propose algorithms with a discrete time-sampling scheme to find and track the solution trajectory based on prediction and correction steps, while sampling the problem data at a constant rate of $1/h$, where $h$ is the length of the sampling interval. The prediction step is derived by analyzing the iso-residual dynamics of the optimality conditions. The correction step adjusts for the distance between the current prediction and the optimizer at each time step, and consists of either one or multiple gradient or Newton steps, which respectively correspond to the gradient trajectory tracking (GTT) or Newton trajectory tracking (NTT) algorithms. Under suitable conditions, we establish that the asymptotic error incurred by both proposed methods behaves as $O(h^2)$, and in some cases as $O(h^4)$, which outperforms the state-of-the-art error bound of $O(h)$ for correction-only methods that use gradient correction steps. Moreover, when the characteristics of the objective function variation are not available, we propose approximate gradient and Newton tracking algorithms (AGT and ANT, respectively) that still attain these asymptotic error bounds. Numerical simulations demonstrate the practical utility of the proposed methods and that they improve upon existing techniques by several orders of magnitude.
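A toy prediction-correction loop on the scalar time-varying objective f(x; t) = 0.5 (x - sin t)^2, sketched under the assumption of a simple derivative-based prediction; the step sizes and test function are illustrative and not from the paper.

```python
import numpy as np

h, alpha, n_corr = 0.1, 0.5, 1          # sampling interval, step size, corrections
ts = np.arange(0.0, 10.0, h)

x, errors = 0.0, []
for t in ts:
    # Prediction: for this f, grad = x - sin t, d/dt grad = -cos t, hessian = 1,
    # so the optimizer drift is approximated by x <- x - h * (d/dt grad) / hess.
    x = x - h * (-np.cos(t))
    t_next = t + h
    for _ in range(n_corr):              # correction: gradient step(s) at t_next
        x -= alpha * (x - np.sin(t_next))
    errors.append(abs(x - np.sin(t_next)))

print(max(errors[len(errors) // 2:]))    # tracking error after the transient
```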
Detection of halogenated flame retardants in polyurethane foam by particle induced X-ray emission
NASA Astrophysics Data System (ADS)
Maley, Adam M.; Falk, Kyle A.; Hoover, Luke; Earlywine, Elly B.; Seymour, Michael D.; DeYoung, Paul A.; Blum, Arlene; Stapleton, Heather M.; Peaslee, Graham F.
2015-09-01
A novel application of particle-induced X-ray emission (PIXE) has been developed to detect the presence of chlorinated and brominated flame retardant chemicals in polyurethane foams. Traditional Gas Chromatography-Mass Spectrometry (GC-MS) methods for the detection and identification of halogenated flame retardants in foams require extensive sample preparation and data acquisition time. The elemental analysis of the halogens in polyurethane foam performed by PIXE offers the opportunity to identify the presence of halogenated flame retardants in a fraction of the time and sample preparation cost. Through comparative GC-MS and PIXE analysis of 215 foam samples, excellent agreement between the two methods was obtained. These results suggest that PIXE could be an ideal rapid screening method for the presence of chlorinated and brominated flame retardants in polyurethane foams.
Coagulation dynamics of a blood sample by multiple scattering analysis
NASA Astrophysics Data System (ADS)
Faivre, Magalie; Peltié, Philippe; Planat-Chrétien, Anne; Cosnier, Marie-Line; Cubizolles, Myriam; Nougier, Christophe; Négrier, Claude; Pouteau, Patrick
2011-05-01
We report a new technique to measure coagulation dynamics on whole-blood samples. The method relies on the analysis of the speckle pattern resulting from a whole-blood sample mixed with coagulation reagent and introduced into a thin chamber illuminated with coherent light. A dynamic study of the speckle reveals a typical behavior due to coagulation. We compare our measured coagulation times to those obtained with a reference method in a medical laboratory.
Predictive sensor method and apparatus
NASA Technical Reports Server (NTRS)
Nail, William L. (Inventor); Koger, Thomas L. (Inventor); Cambridge, Vivien (Inventor)
1990-01-01
A predictive algorithm is used to determine, in near real time, the steady state response of a slow-responding sensor, such as a hydrogen gas sensor of the type which produces an output current proportional to the partial pressure of the hydrogen present. A microprocessor connected to the sensor samples the sensor output at small regular time intervals and predicts the steady state response of the sensor in response to a perturbation in the parameter being sensed, based on the beginning and end samples of the sensor output for the current sample time interval.
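A hedged sketch of one classical way such a predictor can work; the patented algorithm itself is not specified in the abstract. If the sensor settles exponentially, y(t) = y_ss + (y0 - y_ss) exp(-t/tau), then three equally spaced samples y1, y2, y3 determine the steady-state value as y_ss = (y1 y3 - y2^2) / (y1 + y3 - 2 y2), long before the sensor actually settles.

```python
import numpy as np

def predict_steady_state(y1, y2, y3):
    """Steady-state estimate from three equally spaced samples of an
    exponentially settling signal."""
    return (y1 * y3 - y2 ** 2) / (y1 + y3 - 2 * y2)

tau, y_ss, y0 = 30.0, 5.0, 0.0                     # simulated slow sensor
t = np.array([2.0, 4.0, 6.0])                      # equally spaced sample times (s)
y = y_ss + (y0 - y_ss) * np.exp(-t / tau)
print(predict_steady_state(*y))                    # ~5.0 after only 6 seconds
```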
High throughput integrated thermal characterization with non-contact optical calorimetry
NASA Astrophysics Data System (ADS)
Hou, Sichao; Huo, Ruiqing; Su, Ming
2017-10-01
Commonly used thermal analysis tools such as calorimeters and thermal conductivity meters are separate instruments and are limited by low throughput, since only one sample is examined at a time. This work reports an infrared-based optical calorimetry method, with its theoretical foundation, which is able to provide an integrated solution to characterize the thermal properties of materials with high throughput. By taking time-domain temperature information from spatially distributed samples, this method allows a single device (an infrared camera) to determine the thermal properties of both phase-change systems (melting temperature and latent heat of fusion) and non-phase-change systems (thermal conductivity and heat capacity). This method further allows these thermal properties of multiple samples to be determined rapidly, remotely, and simultaneously. In this proof-of-concept experiment, the thermal properties of a panel of 16 samples, including melting temperatures, latent heats of fusion, heat capacities, and thermal conductivities, were determined in 2 min with high accuracy. Given the high thermal, spatial, and temporal resolutions of the advanced infrared camera, this method has the potential to revolutionize the thermal characterization of materials by providing an integrated solution with high throughput, high sensitivity, and short analysis time.
Assessment of PAH-exposure among coke oven workers.
Vähäkangas, K; Pyy, L; Yrjänheikki, E
1992-12-01
Coke oven workers are exposed to high concentrations of polycyclic aromatic hydrocarbons (PAHs). Only recently have methods been developed to try to assess the individual, biologically significant exposure. The only coke oven plant in Finland started to function in 1987, in Raahe, enabling the implementation of a cohort study among the workers to determine the usefulness of some currently available biomonitoring methods, e.g. methods of measuring PAH-DNA adducts. Urine and blood samples were taken several times from a sample of workers, starting from before they worked at the plant. A questionnaire (smoking, diet, former and current occupations) was filled in by the workers at every sampling, and air samples (personal and stationary) were collected at the same time. The mean values of both benzo(a)pyrene diol epoxide (BPDE)-DNA adducts, measured by synchronous fluorescence spectrophotometry (SFS), and of antibodies to these adducts increased somewhat after work at the plant started. However, all the adduct values were low, and no differences between the smokers and non-smokers were detected at any time point. Battery workers had slightly increased mean BPDE-DNA adduct levels compared to non-battery workers. Also, coke oven workers had slightly higher adduct values than age-, sex- and smoking-matched controls.
Inagaki, Shinsuke; Noda, Takumi; Min, Jun Zhe; Toyo'oka, Toshimasa
2007-12-28
An exhaustive analysis of metabolites in hair samples has been performed for the first time using ultra-performance liquid chromatography with electrospray ionization time-of-flight mass spectrometry (UPLC-ESI-TOF-MS). The hair samples were collected from spontaneously hypertensive model rats (SHR/Izm), stroke-prone SHR (SHRSP/Izm) and Wistar Kyoto (WKY/Izm) rats, and were analyzed by UPLC-ESI-TOF-MS; a multivariate statistical analysis method, principal component analysis (PCA), was then used for screening the biomarkers. From the samples derived from the SHRSP/Izm group at weeks 10, 18, 26 and 34, we successfully detected a potential biomarker of stroke, which was present at much higher concentrations compared with those in the other groups. However, a significant difference could not be found at less than 7 weeks, before the rats were subjected to stroke and hypertension. In addition, the present method was applicable to screening not only disease markers but also markers related to aging. The method utilizing hair samples is expected to be quite useful for screening biomarkers of many other diseases, not limited to stroke and hypertension.
NASA Astrophysics Data System (ADS)
Zhang, Yaping; Williams, Brent J.; Goldstein, Allen H.; Docherty, Kenneth S.; Jimenez, Jose L.
2016-11-01
We present a rapid method for apportioning the sources of atmospheric organic aerosol composition measured by gas chromatography-mass spectrometry methods. Here, we specifically apply this new analysis method to data acquired on a thermal desorption aerosol gas chromatograph (TAG) system. Gas chromatograms are divided by retention time into evenly spaced bins, within which the mass spectra are summed. A previous chromatogram binning method was introduced for the purpose of chromatogram structure deconvolution (e.g., major compound classes) (Zhang et al., 2014). Here we extend the method development for the specific purpose of determining aerosol samples' sources. Chromatogram bins are arranged into an input data matrix for positive matrix factorization (PMF), where the sample number is the row dimension and the mass-spectra-resolved eluting time intervals (bins) are the column dimension. Then two-dimensional PMF can effectively do three-dimensional factorization on the three-dimensional TAG mass spectra data. The retention time shift of the chromatogram is corrected by applying the median values of the different peaks' shifts. Bin width affects chemical resolution but does not affect PMF retrieval of the sources' time variations for low-factor solutions. A bin width smaller than the maximum retention shift among all samples requires retention time shift correction. A six-factor PMF comparison among aerosol mass spectrometry (AMS), TAG binning, and conventional TAG compound integration methods shows that the TAG binning method performs similarly to the integration method. However, the new binning method incorporates the entirety of the data set and requires significantly less pre-processing of the data than conventional single compound identification and integration. In addition, while a fraction of the most oxygenated aerosol does not elute through an underivatized TAG analysis, the TAG binning method does have the ability to achieve molecular level resolution on other bulk aerosol components commonly observed by the AMS.
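A minimal sketch of the binning step described above, with scikit-learn's NMF standing in for PMF (real PMF additionally weights the factorization by measurement uncertainties). The synthetic chromatograms, bin width, and factor number are illustrative only.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(4)
n_samples, n_rt, n_mz, bin_width = 40, 600, 50, 20
chroms = rng.random((n_samples, n_rt, n_mz))          # synthetic TAG data

# Sum each chromatogram within evenly spaced retention-time bins, then unroll
# the (bin x m/z) block into one row of the factorization input matrix.
n_bins = n_rt // bin_width
binned = chroms.reshape(n_samples, n_bins, bin_width, n_mz).sum(axis=2)
X = binned.reshape(n_samples, n_bins * n_mz)          # rows: samples

model = NMF(n_components=6, init="nndsvda", max_iter=500).fit(X)
contributions = model.transform(X)                    # factor time variations
profiles = model.components_                          # factor spectral profiles
```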
Polarization Sensitive Coherent Anti-Stokes Raman Spectroscopy of DCVJ in Doped Polymer
NASA Astrophysics Data System (ADS)
Ujj, Laszlo
2014-05-01
Coherent Raman microscopy is an emerging technique for imaging biological samples such as living cells by recording vibrational fingerprints of molecules with high spatial resolution. The race is on to record an entire image in the shortest time possible in order to increase the time resolution of the recorded cellular events. The electronically enhanced, polarization-sensitive version of coherent anti-Stokes Raman scattering is one of the methods that can shorten the recording time and increase the sharpness of an image by enhancing the signal level of particular molecular vibrational modes. In order to show the effectiveness of the method, a model system, the highly fluorescent sample DCVJ in a polymer matrix, is investigated. Polarization-sensitive resonance CARS spectra are recorded and analyzed. Vibrational signatures are extracted with model-independent methods. Details of the measurements and data analysis will be presented. The author gratefully acknowledges the UWF for financial support.
Adaptively biased molecular dynamics: An umbrella sampling method with a time-dependent potential
NASA Astrophysics Data System (ADS)
Babin, Volodymyr; Karpusenka, Vadzim; Moradi, Mahmoud; Roland, Christopher; Sagui, Celeste
We discuss an adaptively biased molecular dynamics (ABMD) method for the computation of a free energy surface for a set of reaction coordinates. The ABMD method belongs to the general category of umbrella sampling methods with an evolving biasing potential. It is characterized by a small number of control parameters and an O(t) numerical cost with simulation time t. The method naturally allows for extensions based on multiple walkers and a replica exchange mechanism. The workings of the method are illustrated with a number of examples, including sugar puckering and free energy landscapes for polymethionine and polyproline peptides and for a short β-turn peptide. ABMD has been implemented into the latest version (Case et al., AMBER 10; University of California: San Francisco, 2008) of the AMBER software package and is freely available to the simulation community.
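A toy, hedged illustration of umbrella sampling with an evolving bias in the spirit of ABMD (not the AMBER implementation): overdamped Langevin dynamics on a one-dimensional double well, with small Gaussian hills deposited along the trajectory so the walker eventually escapes both minima; the accumulated bias then approximates the negative free energy up to a constant.

```python
import numpy as np

rng = np.random.default_rng(5)

def grad_potential(x):
    return 4 * x ** 3 - 4 * x            # V(x) = x^4 - 2 x^2, a double well

centers, height, width = [], 0.05, 0.2   # deposited Gaussian hills

def grad_bias(x):
    c = np.array(centers) if centers else np.zeros(0)
    return np.sum(-height * (x - c) / width ** 2
                  * np.exp(-(x - c) ** 2 / (2 * width ** 2)))

x, dt, kT, traj = -1.0, 1e-3, 0.2, []
for step in range(50_000):
    force = -(grad_potential(x) + grad_bias(x))
    x += force * dt + np.sqrt(2 * kT * dt) * rng.normal()
    if step % 500 == 0:
        centers.append(x)                # time-dependent bias grows here
    traj.append(x)
```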
Adaptive single-pixel imaging with aggregated sampling and continuous differential measurements
NASA Astrophysics Data System (ADS)
Huo, Yaoran; He, Hongjie; Chen, Fan; Tai, Heng-Ming
2018-06-01
This paper proposes an adaptive compressive imaging technique with a single-pixel detector and a single arm. The aggregated sampling (AS) method enables the reduction of the resolution of the reconstructed images, aiming to reduce time and space consumption. A target image with a resolution up to 1024 × 1024 can be reconstructed successfully at a 20% sampling rate. The continuous differential measurement (CDM) method, combined with a ratio factor of significant coefficient (RFSC), improves the imaging quality. Moreover, RFSC reduces the human intervention in parameter setting. This technique enhances the practicability of single-pixel imaging through lower time and space consumption, better imaging quality and less human intervention.
Optical method for determining the mechanical properties of a material
Maris, Humphrey J.; Stoner, Robert J.
1998-01-01
Disclosed is a method for characterizing a sample, comprising the steps of: (a) acquiring data from the sample using at least one probe beam wavelength to measure, for times less than a few nanoseconds, a change in the reflectivity of the sample induced by a pump beam; (b) analyzing the data to determine at least one material property by comparing a background signal component of the data with data obtained for a similar delay time range from one or more samples prepared under conditions known to give rise to certain physical and chemical material properties; and (c) analyzing a component of the measured time dependent reflectivity caused by ultrasonic waves generated by the pump beam using the at least one determined material property. The first step of analyzing may include a step of interpolating between reference samples to obtain an intermediate set of material properties. The material properties may include sound velocity, density, and optical constants. In one embodiment, only a correlation is made with the background signal, and at least one of the structural phase, grain orientation, and stoichiometry is determined.
NASA Astrophysics Data System (ADS)
Ma, Yinbiao; Wei, Xiaojuan
2017-04-01
A novel method for the determination of platinum in waste platinum-loaded carbon catalyst samples was established using inductively coupled plasma optical emission spectrometry after the samples were digested in a microwave oven with aqua regia. Experimental conditions, such as the sample digestion method, digestion time, digestion temperature and interfering ions, were investigated for their influence on the determination. Under the optimized conditions, the linear range of the calibration graph for Pt was 0-200.00 mg L-1, and the recovery was 95.67%-104.29%. The relative standard deviation (RSD) for Pt was 1.78%. The proposed method and atomic absorption spectrometry were applied to the same samples and gave consistent results, showing that the method is suitable for the determination of platinum in waste platinum-loaded carbon catalyst samples.
Chen, Meilian; Lee, Jong-Hyeon; Hur, Jin
2015-10-01
Despite literature evidence suggesting the importance of sampling methods on the properties of sediment pore waters, their effects on the dissolved organic matter (PW-DOM) have been unexplored to date. Here, we compared the effects of two commonly used sampling methods (i.e., centrifuge and Rhizon sampler) on the characteristics of PW-DOM for the first time. The bulk dissolved organic carbon (DOC), ultraviolet-visible (UV-Vis) absorption, and excitation-emission matrixes coupled with parallel factor analysis (EEM-PARAFAC) of the PW-DOM samples were compared for the two sampling methods with the sediments from minimal to severely contaminated sites. The centrifuged samples were found to have higher average values of DOC, UV absorption, and protein-like EEM-PARAFAC components. The samples collected with the Rhizon sampler, however, exhibited generally more humified characteristics than the centrifuged ones, implying a preferential collection of PW-DOM with respect to the sampling methods. Furthermore, the differences between the two sampling methods seem more pronounced in relatively more polluted sites. Our observations were possibly explained by either the filtration effect resulting from the smaller pore size of the Rhizon sampler or the desorption of DOM molecules loosely bound to minerals during centrifugation, or both. Our study suggests that consistent use of one sampling method is crucial for PW-DOM studies and also that caution should be taken in the comparison of data collected with different sampling methods.
Zhang, Fang; Wagner, Anita K; Soumerai, Stephen B; Ross-Degnan, Dennis
2009-02-01
Interrupted time series (ITS) is a strong quasi-experimental research design, which is increasingly applied to estimate the effects of health services and policy interventions. We describe and illustrate two methods for estimating confidence intervals (CIs) around absolute and relative changes in outcomes calculated from segmented regression parameter estimates. We used the multivariate delta method (MDM) and the bootstrapping method (BM) to construct CIs around relative changes in level and trend, and around absolute changes in outcome, based on segmented linear regression analyses of time series data corrected for autocorrelated errors. Using previously published time series data, we estimated CIs around the effect of prescription alerts for interacting medications with warfarin on the rate of prescriptions per 10,000 warfarin users per month. Both the MDM and the BM produced similar results. The BM is preferred for calculating CIs of relative changes in outcomes of time series studies, because it does not require large sample sizes when parameter estimates are obtained correctly from the model. Caution is needed when the sample size is small.
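A hedged sketch of a segmented regression on simulated interrupted time series data with a residual-bootstrap CI for the relative change in level; the autocorrelation correction the study describes (e.g., GLS with AR errors) is omitted here for brevity, and all numbers are simulated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n, break_pt = 48, 24
t = np.arange(n)
post = (t >= break_pt).astype(float)               # level-change indicator
t_post = np.where(t >= break_pt, t - break_pt, 0)  # post-intervention trend
y = 100 + 0.5 * t - 15 * post - 0.8 * t_post + rng.normal(0, 3, n)

X = sm.add_constant(np.column_stack([t, post, t_post]))
fit = sm.OLS(y, X).fit()

def rel_level_change(params):
    # level change relative to the counterfactual level at the interruption
    return params[2] / (params[0] + params[1] * break_pt)

boot = []
for _ in range(1000):                              # residual bootstrap
    y_b = fit.fittedvalues + rng.choice(fit.resid, size=n, replace=True)
    boot.append(rel_level_change(sm.OLS(y_b, X).fit().params))

print(rel_level_change(fit.params), np.percentile(boot, [2.5, 97.5]))
```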
Khoomrung, Sakda; Chumnanpuen, Pramote; Jansa-ard, Suwanee; Nookaew, Intawat; Nielsen, Jens
2012-06-01
We present a fast and accurate method for the preparation of fatty acid methyl esters (FAMEs) using microwave-assisted derivatization of the fatty acids present in yeast samples. The esterification of free/bound fatty acids to FAMEs was completed within 5 min, which is 24 times faster than with conventional heating methods. The developed method was validated in two ways: (1) through comparison with a conventional (hot plate) method and (2) through validation with the standard reference material (SRM) 3275-2, omega-3 and omega-6 fatty acids in fish oil (from the National Institute of Standards and Technology, USA). There were no significant differences (P>0.05) in the yields of FAMEs in either validation. By performing a simple modification of closed-vessel microwave heating, it was possible to carry out the esterification in Pyrex glass tubes kept inside the closed vessel. In this way, we were able to increase the number of sample preparations to several hundred samples per day, as the time for preparation of reused vessels was eliminated. Cell disruption pretreatment steps are not required, since direct FAME preparation provides equally quantitative results. The new microwave-assisted derivatization method facilitates the preparation of FAMEs directly from yeast cells, but the method is likely to also be applicable to other biological samples.
Measurement of Crystalline Silica Aerosol Using Quantum Cascade Laser-Based Infrared Spectroscopy.
Wei, Shijun; Kulkarni, Pramod; Ashley, Kevin; Zheng, Lina
2017-10-24
Inhalation exposure to airborne respirable crystalline silica (RCS) poses major health risks in many industrial environments. There is a need for new sensitive instruments and methods for in-field or near real-time measurement of crystalline silica aerosol. The objective of this study was to develop an approach, using quantum cascade laser (QCL)-based infrared spectroscopy (IR), to quantify airborne concentrations of RCS. Three sampling methods were investigated for their potential for effective coupling with QCL-based transmittance measurements: (i) conventional aerosol filter collection, (ii) focused spot sample collection directly from the aerosol phase, and (iii) dried spot obtained from deposition of liquid suspensions. Spectral analysis methods were developed to obtain IR spectra from the collected particulate samples in the range 750-1030 cm-1. The new instrument was calibrated and the results were compared with standardized methods based on Fourier transform infrared (FTIR) spectrometry. Results show that significantly lower detection limits for RCS (≈330 ng), compared to conventional infrared methods, could be achieved with effective microconcentration and careful coupling of the particulate sample with the QCL beam. These results offer promise for further development of sensitive filter-based laboratory methods and portable sensors for near real-time measurement of crystalline silica aerosol.
Research of mine water source identification based on LIF technology
NASA Astrophysics Data System (ADS)
Zhou, Mengran; Yan, Pengcheng
2016-09-01
Because traditional chemical methods for mine water source identification take a long time, a rapid source identification method for mine water inrush based on laser-induced fluorescence (LIF) technology is proposed. The basic principle of LIF technology is analyzed in detail. The hardware composition of the LIF system is described and the related modules were selected. Fluorescence spectra of coal mine water samples were obtained through fluorescence experiments with the LIF system. Traditional water source identification relies mainly on the ion concentrations representative of the water, but ion concentrations are difficult to obtain from fluorescence spectra. This paper proposes a simple and practical method for rapid identification of water by fluorescence spectra: the spectral distance between unknown water samples and standard samples is measured, and the category of the unknown water sample is then obtained by cluster analysis. Water source identification of unknown samples verified the reliability of the LIF system and addresses the current lack of real-time, online monitoring of water inrush in coal mines, which is of great significance for safe coal mine production.
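A hedged sketch of the spectral-distance idea: each known source is represented by a reference fluorescence spectrum, and an unknown sample is assigned to the source whose spectrum it is closest to. The source names, Gaussian-shaped spectra, and Euclidean metric are illustrative assumptions; the paper does not specify its exact distance measure or clustering settings.

```python
import numpy as np

rng = np.random.default_rng(7)
wavelengths = np.linspace(300, 600, 150)           # nm, illustrative grid

standards = {                                      # hypothetical reference spectra
    "sandstone_water": np.exp(-((wavelengths - 420) / 40) ** 2),
    "limestone_water": np.exp(-((wavelengths - 470) / 35) ** 2),
    "goaf_water": np.exp(-((wavelengths - 520) / 50) ** 2),
}

unknown = standards["limestone_water"] + rng.normal(0, 0.02, wavelengths.size)

distances = {name: np.linalg.norm(unknown - ref) for name, ref in standards.items()}
print(min(distances, key=distances.get))           # predicted source category
```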
NASA Astrophysics Data System (ADS)
Wang, Ting; Plecháč, Petr
2017-12-01
Stochastic reaction networks that exhibit bistable behavior are common in systems biology, materials science, and catalysis. Sampling of stationary distributions is crucial for understanding and characterizing the long-time dynamics of bistable stochastic dynamical systems. However, simulations are often hindered by the insufficient sampling of rare transitions between the two metastable regions. In this paper, we apply the parallel replica method for a continuous time Markov chain in order to improve sampling of the stationary distribution in bistable stochastic reaction networks. The proposed method uses parallel computing to accelerate the sampling of rare transitions. Furthermore, it can be combined with the path-space information bounds for parametric sensitivity analysis. With the proposed methodology, we study three bistable biological networks: the Schlögl model, the genetic switch network, and the enzymatic futile cycle network. We demonstrate the algorithmic speedup achieved in these numerical benchmarks. More significant acceleration is expected when multi-core or graphics processing unit computer architectures and programming tools such as CUDA are employed.
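For context, a plain Gillespie simulation of the Schlögl model (one of the bistable networks named above) is sketched below with illustrative, textbook-style rate constants that are not taken from the paper. A single long run only rarely hops between the two metastable states, which is precisely the sampling problem the parallel replica method is designed to accelerate.

```python
import numpy as np

rng = np.random.default_rng(8)
k1, k2, k3, k4 = 3e-7, 1e-4, 1e-3, 3.5     # assumed rate constants
A, B = 1e5, 2e5                            # buffered species counts

def propensities(x):
    return np.array([k1 * A * x * (x - 1) / 2,        # 2X + A -> 3X
                     k2 * x * (x - 1) * (x - 2) / 6,   # 3X -> 2X + A
                     k3 * B,                           # B -> X
                     k4 * x])                          # X -> B

x, t, t_end, samples = 250, 0.0, 50.0, []
while t < t_end:
    a = propensities(x)
    a0 = a.sum()
    t += rng.exponential(1.0 / a0)                     # time to next reaction
    x += (1, -1, 1, -1)[rng.choice(4, p=a / a0)]       # apply the chosen reaction
    samples.append(x)
# A histogram of `samples` estimates the stationary distribution; very long runs
# (or parallel replicas) are needed to populate both modes.
```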
NASA Technical Reports Server (NTRS)
Goyal, S. S.; Rains, D. W.; Huffaker, R. C.
1988-01-01
A fast, sensitive, simple, and highly reproducible method for routine assay of ammonium ion (NH4+) was developed by using HPLC equipment. The method is based on the reaction of NH4+ with o-phthalaldehyde (OPA) in the presence of 2-mercaptoethanol. After an on-line derivatization, the resulting NH4(+)-OPA product was quantified by using fluorometric or spectrophotometric detection. For fluorometric detection, the excitation and emission wavelengths were 410 and 470 nm, respectively. The spectrophotometric detection was made by measuring absorbance at 410 nm. Results on the effects of OPA-reagent composition and pH, reaction temperature, sample matrix, and linearity of the assay are presented. Even though it took about 2 min from the time of sample injection to the appearance of sample peak, sample injections could be overlapped at an interval of about 1 min. Thus, the actual time needed for analysis was about 1 min per assay. The method can be used in a fully automated mode by using an autosampler injector.
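The throughput arithmetic in this abstract (about 2 min from injection to peak, but roughly 1 min per assay once injections are overlapped at 1-min intervals) can be checked with a trivial sketch; the numbers are those quoted above, and the linear scheduling model is an assumption.

```python
def total_runtime_minutes(n_samples, run_time=2.0, injection_interval=1.0):
    """Time to finish n overlapped injections: the last sample is injected at
    (n - 1) * interval and its peak appears run_time later."""
    if n_samples == 0:
        return 0.0
    return (n_samples - 1) * injection_interval + run_time

for n in (1, 10, 100):
    total = total_runtime_minutes(n)
    print(f"{n:3d} samples -> {total:6.1f} min total, {total / n:.2f} min per assay")
```

For large batches the per-assay time approaches the 1-min injection interval, matching the quoted figure.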
Code of Federal Regulations, 2014 CFR
2014-07-01
Emission-limits table excerpt: oxides of nitrogen limited to 388 parts per million by dry volume, 3-run average (1 hour minimum sample time per run); compliance shown by performance tests using the methods of appendix A of this part (e.g., Method 6 or 6C; see also appendix A-4).
Code of Federal Regulations, 2013 CFR
2013-07-01
Emission-limits table excerpt: oxides of nitrogen limited to 388 parts per million by dry volume, 3-run average (1 hour minimum sample time per run); compliance shown by performance tests using the methods of appendix A of this part (e.g., Method 6 or 6C; see also appendix A-4).
Rutter, A.P.; Hanford, K.L.; Zwers, J.T.; Perillo-Nicholas, A. L.; Schauer, J.J.; Olson, M.L.
2008-01-01
Reactive gaseous mercury (RGM) and particulate mercury (PHg) were collected on sorbent and filter substrates in Milwaukee, WI, between April 2004 and May 2005, and in Riverside, CA, between July 25 and August 7, 2005. The substrates were analyzed for mercury by thermal desorption analysis (TDA) using a purpose-built instrument. Results from this offline TDA method were compared with measurements from a real-time atmospheric mercury analyzer. RGM measurements made with offline TDA agreed well with the commercial real-time method. However, offline TDA reported PHg concentrations 2.7 times higher than the real-time method, indicating that evaporative losses might occur from the real-time instrument during sample collection. TDA combined with reactive mercury collection on filter and sorbent substrates was inexpensive, relatively easy to use, did not introduce biases due to a semicontinuous sample collection strategy, and had a dynamic range appropriate for use in rural and urban locations. The results of this study demonstrate that offline TDA is a feasible method for measuring reactive mercury concentrations in a large network of filter-based samplers. Copyright 2008 Air & Waste Management Association.
Methods of sampling airborne fungi in working environments of waste treatment facilities.
Černá, Kristýna; Wittlingerová, Zdeňka; Zimová, Magdaléna; Janovský, Zdeněk
2016-01-01
The objective of the present study was to evaluate and compare the efficiency of a filter-based sampling method and a high-volume sampling method for sampling airborne culturable fungi present in waste sorting facilities. The membrane filter (MF) method was compared with the surface air system (SAS) method. The selected sampling methods were modified and tested in 2 plastic waste sorting facilities. The total number of colony-forming units (CFU)/m³ of airborne fungi depended on the type of sampling device, on the time of sampling, which was carried out every hour from the beginning of the work shift, and on the type of cultivation medium (p < 0.001). Detected concentrations of airborne fungi ranged from 2×10² to 1.7×10⁶ CFU/m³ with the MF method and from 3×10² to 6.4×10⁴ CFU/m³ with the SAS method. Both methods showed comparable sensitivity to fluctuations in the concentrations of airborne fungi during the work shifts. The SAS method is adequate for a fast, indicative determination of the concentration of airborne fungi. The MF method is suitable for a thorough assessment of working-environment contamination by airborne fungi. We therefore recommend the MF method for the implementation of a uniform standard methodology for sampling airborne fungi in working environments of waste treatment facilities. This work is available in Open Access model and licensed under a CC BY-NC 3.0 PL license.
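Converting plate counts from either sampler into the CFU/m³ concentrations reported above is a simple air-volume normalization, sketched below. The flow rates, sampling times, and colony counts are assumptions for illustration (real SAS counts also typically receive a positive-hole correction, omitted here).

```python
def cfu_per_m3(colonies, flow_l_per_min, sampling_minutes):
    """Airborne concentration = colony count / sampled air volume (m^3)."""
    volume_m3 = flow_l_per_min * sampling_minutes / 1000.0  # litres -> m^3
    return colonies / volume_m3

# Illustrative (assumed) sampler settings and counts.
print("MF example :", cfu_per_m3(colonies=180, flow_l_per_min=2.0,   sampling_minutes=60), "CFU/m^3")
print("SAS example:", cfu_per_m3(colonies=95,  flow_l_per_min=180.0, sampling_minutes=1),  "CFU/m^3")
```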
Adaptive control of theophylline therapy: importance of blood sampling times.
D'Argenio, D Z; Khakmahd, K
1983-10-01
A two-observation protocol for estimating theophylline clearance during a constant-rate intravenous infusion is used to examine the importance of blood sampling schedules with regard to the information content of resulting concentration data. Guided by a theory for calculating maximally informative sample times, population simulations are used to assess the effect of specific sampling times on the precision of resulting clearance estimates and subsequent predictions of theophylline plasma concentrations. The simulations incorporated noise terms for intersubject variability, dosing errors, sample collection errors, and assay error. Clearance was estimated using Chiou's method, least squares, and a Bayesian estimation procedure. The results of these simulations suggest that clinically significant estimation and prediction errors may result when using the above two-point protocol for estimating theophylline clearance if the time separating the two blood samples is less than one population mean elimination half-life.
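The two-sample clearance estimate referred to above (Chiou's method for a constant-rate infusion) can be sketched as follows. The equation is the commonly cited form of the Chiou equation; the infusion rate, assumed volume of distribution, concentrations, and sampling times are illustrative values, not data from the study.

```python
def chiou_clearance(rate_mg_per_h, c1, c2, t1_h, t2_h, vd_l):
    """Two-point clearance estimate during a constant-rate IV infusion
    (Chiou's method): CL = 2*R0/(C1+C2) + 2*Vd*(C1-C2)/((C1+C2)*(t2-t1))."""
    return (2.0 * rate_mg_per_h / (c1 + c2)
            + 2.0 * vd_l * (c1 - c2) / ((c1 + c2) * (t2_h - t1_h)))

# Illustrative numbers only (units: mg/h, mg/L, h, L); Vd taken as a population value.
cl = chiou_clearance(rate_mg_per_h=40.0, c1=8.0, c2=10.0, t1_h=2.0, t2_h=10.0, vd_l=35.0)
print(f"estimated clearance ≈ {cl:.2f} L/h")
```

Because the second term is divided by (t2 - t1), closely spaced samples amplify concentration-measurement noise, which is consistent with the abstract's finding that the two samples should be separated by at least about one population mean elimination half-life.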