Sample records for sample processing methods

  1. Method of evaluation of process of red blood cell sedimentation based on photometry of droplet samples.

    PubMed

    Aristov, Alexander; Nosova, Ekaterina

    2017-04-01

    The paper focuses on research aimed at creating and testing a new approach to evaluating the aggregation and sedimentation of red blood cells for use in clinical laboratory diagnostics. The proposed method is based on photometric analysis of a blood sample formed as a sessile drop. The results of clinical evaluation of this method are given in the paper. Analysis of the processes occurring in the sessile-drop sample during blood cell sedimentation is described. The results of experimental studies evaluating the effect of the droplet sample's focusing properties on light transmittance are also presented. It is shown that this method significantly reduces the required sample volume while providing sufficiently high sensitivity to the studied processes.

  2. Innovative application of the moisture analyzer for determination of dry mass content of processed cheese

    NASA Astrophysics Data System (ADS)

    Kowalska, Małgorzata; Janas, Sławomir; Woźniak, Magdalena

    2018-04-01

    The aim of this work was to present an alternative method for determining the total dry mass content of processed cheese. The authors claim that the presented method can be used in industrial quality control laboratories for routine testing and for quick in-process control. For the tests, both the reference method of dry mass determination in processed cheese and the moisture analyzer method were used. The tests were carried out on three different kinds of processed cheese. In accordance with the reference method, the sample was placed on a layer of silica sand and dried at a temperature of 102 °C for about 4 h. The moisture analyzer test required method validation with regard to the drying temperature range and the mass of the analyzed sample. An optimum drying temperature of 110 °C was determined experimentally. For the Hochland cream processed cheese sample, the total dry mass content obtained using the reference method was 38.92%, whereas the moisture analyzer method gave 38.74%. The average analysis time for the moisture analyzer method was 9 min. For the sample of processed cheese with tomatoes, the reference method result was 40.37% and the alternative method result was 40.67%. For the sample of cream processed cheese with garlic, the reference method gave 36.88% and the alternative method 37.02%. The average time of those determinations was 16 min. The obtained results confirmed that use of the moisture analyzer is effective: consistent dry mass values were obtained with both methods. According to the authors, the fact that the measurement takes far less time with the moisture analyzer method is a key criterion in selecting methods for in-process control and final quality control.

  3. Rapid method for sampling metals for materials identification

    NASA Technical Reports Server (NTRS)

    Higgins, L. E.

    1971-01-01

    A nondamaging process similar to electrochemical machining is useful for obtaining metal samples from places inaccessible to conventional sampling methods, or where such methods would be hazardous or would contaminate the specimens. The process applies to industries where metals or metal alloys play a vital role.

  4. MStern Blotting-High Throughput Polyvinylidene Fluoride (PVDF) Membrane-Based Proteomic Sample Preparation for 96-Well Plates.

    PubMed

    Berger, Sebastian T; Ahmed, Saima; Muntel, Jan; Cuevas Polo, Nerea; Bachur, Richard; Kentsis, Alex; Steen, Judith; Steen, Hanno

    2015-10-01

    We describe a 96-well plate compatible, membrane-based proteomic sample processing method that enables the complete processing of 96 samples (or multiples thereof) within a single workday. This method uses a large-pore hydrophobic PVDF membrane that efficiently adsorbs proteins, resulting in fast liquid transfer through the membrane and significantly reduced sample processing times. Low liquid transfer speeds have prevented a useful 96-well plate implementation of filter-aided sample preparation (FASP), a widely used membrane-based proteomic sample processing method. We validated our approach on whole-cell lysate, as well as on urine and cerebrospinal fluid as clinically relevant body fluids. Without compromising peptide and protein identification, our method uses a vacuum manifold and circumvents the need for digest desalting, making it compatible with standard liquid handling robots. In summary, our new method maintains the strengths of FASP while overcoming one of its major limitations, without compromising protein identification and quantification. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  5. MStern Blotting–High Throughput Polyvinylidene Fluoride (PVDF) Membrane-Based Proteomic Sample Preparation for 96-Well Plates*

    PubMed Central

    Berger, Sebastian T.; Ahmed, Saima; Muntel, Jan; Cuevas Polo, Nerea; Bachur, Richard; Kentsis, Alex; Steen, Judith; Steen, Hanno

    2015-01-01

    We describe a 96-well plate compatible, membrane-based proteomic sample processing method that enables the complete processing of 96 samples (or multiples thereof) within a single workday. This method uses a large-pore hydrophobic PVDF membrane that efficiently adsorbs proteins, resulting in fast liquid transfer through the membrane and significantly reduced sample processing times. Low liquid transfer speeds have prevented a useful 96-well plate implementation of filter-aided sample preparation (FASP), a widely used membrane-based proteomic sample processing method. We validated our approach on whole-cell lysate, as well as on urine and cerebrospinal fluid as clinically relevant body fluids. Without compromising peptide and protein identification, our method uses a vacuum manifold and circumvents the need for digest desalting, making it compatible with standard liquid handling robots. In summary, our new method maintains the strengths of FASP while overcoming one of its major limitations, without compromising protein identification and quantification. PMID:26223766

  6. Changes to Serum Sample Tube and Processing Methodology Does Not Cause Inter-Individual Variation in Automated Whole Serum N-Glycan Profiling in Health and Disease

    PubMed Central

    Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R.; Fernandes, Daryl L.; Satsangi, Jack; Spencer, Daniel I. R.

    2015-01-01

    Introduction: Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot-automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. Methods: 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup were automated using a Hamilton Microlab STARlet liquid handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid (BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. Results: There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay, and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. 
Conclusions: The three different serum sample tubes processed using the described methods cause minimal intra-individual variation in the whole serum N-glycan profile when processed using an automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures. PMID:25831126

  7. A unified method to process biosolids samples for the recovery of bacterial, viral, and helminths pathogens.

    PubMed

    Alum, Absar; Rock, Channah; Abbaszadegan, Morteza

    2014-01-01

    For land application, biosolids are classified as Class A or Class B based on the levels of bacterial, viral, and helminth pathogens in the residual biosolids. The current EPA methods for the detection of these groups of pathogens in biosolids consist of discrete steps: a separate sample is processed independently to quantify each group of pathogens. The aim of the study was to develop a unified method for simultaneous processing of a single biosolids sample to recover bacterial, viral, and helminth pathogens. In the first stage of developing a simultaneous method, nine eluents were compared for their efficiency in recovering viruses from a 100 g spiked biosolids sample. In the second stage, the three top-performing eluents were thoroughly evaluated for the recovery of bacteria, viruses, and helminths. For all three groups of pathogens, the glycine-based eluent provided higher recovery than the beef extract-based eluent. Additional experiments were performed to optimize the performance of the glycine-based eluent under various procedural factors such as solids-to-eluent ratio, stir time, and centrifugation conditions. Finally, the new method was directly compared with the EPA methods for the recovery of the three groups of pathogens spiked into duplicate samples of biosolids collected from different sources. For viruses, the new method yielded up to 10% higher recoveries than the EPA method. For bacteria and helminths, recoveries were 74% and 83% by the new method compared to 34% and 68% by the EPA method, respectively. The unified sample processing method significantly reduces the time required to process biosolids samples for different groups of pathogens; it is less affected by the intrinsic variability of samples, while providing higher yields (P = 0.05) and greater consistency than the current EPA methods.

  8. Purification of crude glycerol from transesterification reaction of palm oil using direct method and multistep method

    NASA Astrophysics Data System (ADS)

    Nasir, N. F.; Mirus, M. F.; Ismail, M.

    2017-09-01

    Crude glycerol produced by the transesterification reaction has limited use unless it undergoes a purification process, as it contains excess methanol, catalyst and soap. Conventional purification of crude glycerol involves high costs and complex processes. This study aimed to determine the effects of two different purification methods: a direct method (comprising ion exchange and methanol removal steps) and a multistep method (comprising neutralization, filtration, ion exchange and methanol removal steps). Two crude glycerol samples were investigated: a self-produced sample from the transesterification of palm oil, and a sample obtained from a biodiesel plant. Samples were analysed using Fourier transform infrared spectroscopy, gas chromatography and high-performance liquid chromatography. For both samples, the results after purification showed that pure glycerol was successfully produced and fatty acid salts were eliminated. The results also indicated the absence of methanol in both samples after the purification process. In short, the combination of four purification steps yielded higher-quality glycerol: the multistep purification method gave a better result than the direct method, as the neutralization and filtration steps helped remove most of the excess salt, fatty acid and catalyst.

  9. Changes to serum sample tube and processing methodology does not cause Intra-Individual [corrected] variation in automated whole serum N-glycan profiling in health and disease.

    PubMed

    Ventham, Nicholas T; Gardner, Richard A; Kennedy, Nicholas A; Shubhakar, Archana; Kalla, Rahul; Nimmo, Elaine R; Fernandes, Daryl L; Satsangi, Jack; Spencer, Daniel I R

    2015-01-01

    Serum N-glycans have been identified as putative biomarkers for numerous diseases. The impact of different serum sample tubes and processing methods on N-glycan analysis has received relatively little attention. This study aimed to determine the effect of different sample tubes and processing methods on the whole serum N-glycan profile in both health and disease. A secondary objective was to describe a robot-automated N-glycan release, labeling and cleanup process for use in a biomarker discovery system. 25 patients with active and quiescent inflammatory bowel disease and controls had three different serum sample tubes taken at the same draw. Two different processing methods were used for three types of tube (with and without gel-separation medium). Samples were randomised and processed in a blinded fashion. Whole serum N-glycan release, 2-aminobenzamide labeling and cleanup were automated using a Hamilton Microlab STARlet liquid handling robot. Samples were analysed using a hydrophilic interaction liquid chromatography/ethylene bridged hybrid (BEH) column on an ultra-high performance liquid chromatography instrument. Data were analysed quantitatively by pairwise correlation and hierarchical clustering using the area under each chromatogram peak. Qualitatively, a blinded assessor attempted to match chromatograms to each individual. There was small intra-individual variation in serum N-glycan profiles from samples collected using different sample processing methods. Intra-individual correlation coefficients were between 0.99 and 1. Unsupervised hierarchical clustering and principal coordinate analyses accurately matched samples from the same individual. Qualitative analysis demonstrated good chromatogram overlay, and a blinded assessor was able to accurately match individuals based on chromatogram profile, regardless of disease status. 
The three different serum sample tubes processed using the described methods cause minimal intra-individual variation in the whole serum N-glycan profile when processed using an automated workstream. This has important implications for N-glycan biomarker discovery studies using different serum processing standard operating procedures.
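
    The quantitative analysis described above (pairwise correlation of chromatogram peak areas followed by unsupervised hierarchical clustering) can be sketched as follows. This is a minimal illustration on synthetic peak-area data, not the study's actual pipeline; the number of individuals, tube types, peaks, and the noise level are all assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(42)

# Hypothetical peak-area vectors: 3 individuals x 3 tube types, 20 peaks each.
# Each individual has a characteristic profile; tube type adds only small noise.
n_individuals, n_tubes, n_peaks = 3, 3, 20
profiles = rng.uniform(1.0, 10.0, size=(n_individuals, n_peaks))
samples = np.vstack([
    profiles[i] * (1.0 + 0.02 * rng.standard_normal(n_peaks))
    for i in range(n_individuals)
    for _ in range(n_tubes)
])

# Pairwise Pearson correlation between whole-chromatogram peak-area vectors.
corr = np.corrcoef(samples)

# Unsupervised hierarchical clustering using (1 - correlation) as the distance.
dist = squareform(1.0 - corr, checks=False)
labels = fcluster(linkage(dist, method="average"), t=3, criterion="maxclust")

# Samples from the same individual should receive the same cluster label,
# regardless of which tube type they came from.
print(labels.reshape(n_individuals, n_tubes))
```

    With tube-to-tube variation this small relative to inter-individual differences, the three samples from each individual cluster together, mirroring the study's finding that processing method contributes little intra-individual variation.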

  10. Establishing Statistical Equivalence of Data from Different Sampling Approaches for Assessment of Bacterial Phenotypic Antimicrobial Resistance

    PubMed Central

    2018-01-01

    To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types, and different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from urine, fecal, and blood human samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the use of a fluoroquinolone (ciprofloxacin) to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. 
Data users (e.g., microbiologists and epidemiologists) may then interpret the practical relevance of the difference. IMPORTANCE: Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and of determining the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether that difference is practically relevant. PMID:29475868

  11. Establishing Statistical Equivalence of Data from Different Sampling Approaches for Assessment of Bacterial Phenotypic Antimicrobial Resistance.

    PubMed

    Shakeri, Heman; Volkova, Victoriya; Wen, Xuesong; Deters, Andrea; Cull, Charley; Drouillard, James; Müller, Christian; Moradijamei, Behnaz; Jaberi-Douraki, Majid

    2018-05-01

    To assess phenotypic bacterial antimicrobial resistance (AMR) in different strata (e.g., host populations, environmental areas, manure, or sewage effluents) for epidemiological purposes, isolates of target bacteria can be obtained from a stratum using various sample types, and different sample processing methods can be applied. The MIC of each target antimicrobial drug for each isolate is measured. Statistical equivalence testing of the MIC data for the isolates allows evaluation of whether different sample types or sample processing methods yield equivalent estimates of the bacterial antimicrobial susceptibility in the stratum. We demonstrate this approach on the antimicrobial susceptibility estimates for (i) nontyphoidal Salmonella spp. from ground or trimmed meat versus cecal content samples of cattle in processing plants in 2013-2014 and (ii) nontyphoidal Salmonella spp. from urine, fecal, and blood human samples in 2015 (U.S. National Antimicrobial Resistance Monitoring System data). We found that the sample types for cattle yielded nonequivalent susceptibility estimates for several antimicrobial drug classes and thus may gauge distinct subpopulations of salmonellae. The quinolone and fluoroquinolone susceptibility estimates for nontyphoidal salmonellae from human blood are nonequivalent to those from urine or feces, conjecturally due to the use of a fluoroquinolone (ciprofloxacin) to treat infections caused by nontyphoidal salmonellae. We also demonstrate statistical equivalence testing for comparing sample processing methods for fecal samples (culturing one versus multiple aliquots per sample) to assess AMR in fecal Escherichia coli. These methods yield equivalent results, except for tetracyclines. Importantly, statistical equivalence testing provides the MIC difference at which the data from two sample types or sample processing methods differ statistically. 
Data users (e.g., microbiologists and epidemiologists) may then interpret the practical relevance of the difference. IMPORTANCE: Bacterial antimicrobial resistance (AMR) needs to be assessed in different populations or strata for the purposes of surveillance and of determining the efficacy of interventions to halt AMR dissemination. To assess phenotypic antimicrobial susceptibility, isolates of target bacteria can be obtained from a stratum using different sample types or employing different sample processing methods in the laboratory. The MIC of each target antimicrobial drug for each of the isolates is measured, yielding the MIC distribution across the isolates from each sample type or sample processing method. We describe statistical equivalence testing for the MIC data for evaluating whether two sample types or sample processing methods yield equivalent estimates of the bacterial phenotypic antimicrobial susceptibility in the stratum. This includes estimating the MIC difference at which the data from the two approaches differ statistically. Data users (e.g., microbiologists, epidemiologists, and public health professionals) can then interpret whether that difference is practically relevant. Copyright © 2018 Shakeri et al.

  12. Sampling Operations on Big Data

    DTIC Science & Technology

    Gadepally, Vijay; Herr, Taylor; Johnson, Luke; Milechin, Lauren; Milosavljevic, Maja; Miller, Benjamin A.

    2015-11-29

    Common signal processing operations such as sampling … process and disseminate information for discovery and exploration under real-time constraints. … These include edge sampling methods, where edges are selected by a predetermined criterion; snowball sampling methods, where algorithms start …

  13. Effect of a traditional processing method on the chemical composition of local white lupin (Lupinus albus L.) seed in North-Western Ethiopia.

    PubMed

    Yeheyis, Likawent; Kijora, Claudia; Wink, Michael; Peters, Kurt J

    2011-01-01

    The effect of a traditional Ethiopian lupin processing method on the chemical composition of lupin seed samples was studied. Two sampling districts, namely Mecha and Sekela, representing the mid- and high-altitude areas of north-western Ethiopia, respectively, were randomly selected. Different types of traditionally processed and marketed lupin seed samples (raw, roasted, and finished) were collected in six replications from each district. Raw samples are unprocessed; roasted samples are roasted over firewood; finished samples are ready for human consumption as a snack. The thousand-seed weight of raw and roasted samples within a study district was similar (P > 0.05), but it was lower (P < 0.01) for finished samples than for raw and roasted samples. The crude fibre content of the finished lupin seed sample from Mecha was lower (P < 0.01) than that of the raw and roasted samples; however, the different lupin samples from Sekela had similar crude fibre contents (P > 0.05). The crude protein and crude fat contents of finished samples within a study district were higher (P < 0.01) than those of raw and roasted samples. Roasting had no effect on the crude protein content of lupin seed samples. The crude ash content of raw and roasted lupin samples within a study district was higher (P < 0.01) than that of finished lupin samples from the respective districts. The quinolizidine alkaloid content of finished lupin samples was lower than that of raw and roasted samples, and there was an interaction effect between location and lupin sample type. The traditional processing method of lupin seeds in Ethiopia thus contributes positively by improving the crude protein and crude fat contents and lowering the alkaloid content of the finished product. 
The study showed the possibility of adopting the traditional processing method to process bitter white lupin for use as a protein supplement in livestock feed in Ethiopia, but further work is needed on the processing method and on animal evaluation.

  14. Rapid thermal processing by stamping

    DOEpatents

    Stradins, Pauls; Wang, Qi

    2013-03-05

    A rapid thermal processing device and methods are provided for the thermal processing of samples such as semiconductor wafers. The device comprises a stamp (35) having a stamping surface, a heater or cooler (40) to bring it to a selected processing temperature, a sample holder (20) for holding a sample (10) in position for intimate contact with the stamping surface, and positioning components (25) for moving the stamp (35) and its stamping surface into and out of intimate, substantially non-pressured contact with the sample. Methods for using and making such devices are also provided. These devices and methods allow inexpensive, efficient, easily controllable thermal processing.

  15. Sampling and sample processing in pesticide residue analysis.

    PubMed

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtaining accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by the agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If the collected samples and test portions do not adequately represent the actual lot from which they came, then all of the cost, time, and effort involved in implementing programs using sophisticated analytical instruments and techniques is wasted, and the results can actually be misleading. This paper briefly reviews the often-neglected but crucial topic of sample collection and processing and puts the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest, and it encourages further studies on sampling and sample mass reduction to produce a representative test portion.
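
    Why shrinking test portions from grams to roughly 100 mg raises representativeness concerns can be illustrated with a toy Monte Carlo: when the analyte is carried by a small fraction of particles, the relative standard deviation (RSD) between repeated test portions grows sharply as portion mass drops. All numbers below (particle mass, contamination fraction, portion sizes) are invented for illustration and are not from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy lot model (all numbers hypothetical): a ground sample composed of ~1 mg
# particles, of which 0.1% carry the pesticide residue and the rest are blank.
p_contaminated = 0.001

def rsd_of_test_portion(portion_particles, trials=2000):
    """Relative standard deviation of analyte content across repeated random
    test portions containing `portion_particles` particles (~mg of sample)."""
    hits = rng.binomial(portion_particles, p_contaminated, size=trials)
    conc = hits / portion_particles
    return conc.std() / conc.mean()

# 10 g, 1 g, and 100 mg test portions: the RSD grows steeply as mass shrinks.
for mass_mg in (10_000, 1_000, 100):
    print(mass_mg, round(rsd_of_test_portion(mass_mg), 2))
```

    The qualitative message matches the abstract: without adequate comminution and mass-reduction procedures, a 100 mg test portion of a heterogeneous lot can carry sampling error that dwarfs the analytical error of the instrument.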

  16. Effect of different processing methods on antioxidant activity of underutilized legumes, Entada scandens seed kernel and Canavalia gladiata seeds.

    PubMed

    Sasipriya, Gopalakrishnan; Siddhuraju, Perumal

    2012-08-01

    The present study set out to determine the antioxidant activity of raw and processed samples of the underutilized legumes Entada scandens (seed kernel) and Canavalia gladiata (seeds). Indigenous processing methods, namely dry heating, autoclaving, and soaking followed by autoclaving in different solutions (plain water, ash, sugar and sodium bicarbonate), were applied to the seed samples. All processing methods other than dry heating showed significant reductions in phenolics (2.9-63%), tannins (26-100%) and flavonoids (14-67%). However, in processed samples of E. scandens, the hydroxyl radical scavenging activity and β-carotene bleaching inhibition activity increased, whereas the 2,2-azinobis(3-ethylbenzothiazoline-6-sulfonic acid) diammonium salt (ABTS·(+)), ferric reducing antioxidant power (FRAP), metal chelating and superoxide anion scavenging activities were similar to those of unprocessed samples. In contrast, in C. gladiata all processing methods except dry heating significantly (P < 0.05) reduced the 2,2'-diphenyl-1-picrylhydrazyl (DPPH·) (20-35%), ABTS·(+) (22-75%), FRAP (34-74%), metal chelating (30-41%), superoxide anion radical scavenging (8-80%), hydroxyl radical scavenging (20-40%) and β-carotene bleaching inhibition (15-69%) activities. In addition, extracts of raw and dry-heated samples protected against DNA damage at 10 μg. All processing methods for E. scandens, and dry heating for C. gladiata, would be suitable for adoption in domestic or industrial processing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  17. Sample processing approach for detection of ricin in surface samples.

    PubMed

    Kane, Staci; Shah, Sanjiv; Erler, Anne Marie; Alfaro, Teneile

    2017-12-01

    With several ricin contamination incidents reported over the past decade, rapid and accurate methods are needed for environmental sample analysis, especially after decontamination. A sample processing method was developed for common surface sampling devices to improve the limit of detection and avoid false negative/positive results in ricin analysis. Potential assay interferents from the sample matrix (bleach residue, sample material, wetting buffer), including reference dust, were tested using a time-resolved fluorescence (TRF) immunoassay. Test results suggested that the sample matrix did not cause the elevated background fluorescence sometimes observed when analyzing post-bleach decontamination samples from ricin incidents. Furthermore, sample particulates (80 mg/mL Arizona Test Dust) did not enhance background fluorescence or interfere with ricin detection by TRF. These results suggest that high background fluorescence in this immunoassay could instead be due to labeled antibody quality and/or quantity issues. Centrifugal ultrafiltration devices were evaluated for concentrating ricin as part of sample processing. Up to 30-fold concentration of ricin was achieved by the devices, which also remove soluble interferents and could function as the front-end sample processing step for other ricin analytical methods. The procedure has the potential to be used with a broader range of environmental sample types and other potential interferents, and to be followed by other ricin analytical methods, although additional verification studies would be required. Published by Elsevier B.V.

  18. A cryopreservation method for Pasteurella multocida from wetland samples

    USGS Publications Warehouse

    Moore, Melody K.; Shadduck, D.J.; Goldberg, Diana R.; Samuel, M.D.

    1998-01-01

    A cryopreservation method and improved isolation techniques for the detection of Pasteurella multocida from wetland samples were developed. Wetland water samples were collected in the field, diluted in dimethyl sulfoxide (DMSO, final concentration 10%), and frozen at -180 °C in a liquid nitrogen vapor shipper. Frozen samples were transported to the laboratory, where they were thawed and processed in Pasteurella multocida selective broth (PMSB) to isolate P. multocida. This method allowed consistent isolation of 2 to 18 organisms/ml from water seeded with known concentrations of P. multocida. The method compared favorably with the standard mouse inoculation method and allowed preservation of the samples until they could be processed in the laboratory.

  19. Detecting the sampling rate through observations

    NASA Astrophysics Data System (ADS)

    Shoji, Isao

    2018-09-01

    This paper proposes a method to detect the sampling rate of discrete time series of diffusion processes. Using the maximum likelihood estimates of the parameters of a diffusion process, we establish a criterion based on the Kullback-Leibler divergence and thereby estimate the sampling rate. Simulation studies are conducted to check whether the method can detect the sampling rates from data and their results show a good performance in the detection. In addition, the method is applied to a financial time series sampled on daily basis and shows the detected sampling rate is different from the conventional rates.
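
    The criterion above can be illustrated with a toy example. The sketch below is an assumption-laden illustration, not the author's implementation: it simulates an Ornstein-Uhlenbeck diffusion with known parameters and recovers the sampling interval by minimizing the negative log-likelihood over candidate rates, which is equivalent, up to a constant, to minimizing the Kullback-Leibler divergence from the true transition density.

```python
import numpy as np

rng = np.random.default_rng(0)

# Known OU parameters; in practice these would first be estimated by
# maximum likelihood from the series itself.
theta, mu, sigma = 2.0, 0.0, 1.0
dt_true, n = 0.5, 20000

# Simulate the discretely observed diffusion using the exact OU transition.
a_true = np.exp(-theta * dt_true)
sd_true = sigma * np.sqrt((1 - np.exp(-2 * theta * dt_true)) / (2 * theta))
x = np.empty(n)
x[0] = mu
for i in range(1, n):
    x[i] = a_true * x[i - 1] + mu * (1 - a_true) + sd_true * rng.standard_normal()

def neg_loglik(dt):
    """Negative log-likelihood of the series under sampling interval dt."""
    a = np.exp(-theta * dt)
    var = sigma**2 * (1 - np.exp(-2 * theta * dt)) / (2 * theta)
    resid = x[1:] - (a * x[:-1] + mu * (1 - a))
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

# The best-scoring candidate interval is the detected sampling rate.
candidates = [0.1, 0.25, 0.5, 1.0, 2.0]
dt_hat = min(candidates, key=neg_loglik)
```

    With 20,000 observations the likelihood at the true interval dominates the other candidates decisively.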

  20. Bioactive lipids in the butter production chain from Parmigiano Reggiano cheese area.

    PubMed

    Verardo, Vito; Gómez-Caravaca, Ana M; Gori, Alessandro; Losi, Giuseppe; Caboni, Maria F

    2013-11-01

    Bovine milk contains hundreds of diverse components, including proteins, peptides, amino acids, lipids, lactose, vitamins and minerals. Specifically, the lipid composition is influenced by different variables such as breed, feed and technological process. In this study the fatty acid and phospholipid compositions of different samples of butter and its by-products from the Parmigiano Reggiano cheese area, produced by industrial and traditional churning processes, were determined. The fatty acid composition of samples manufactured by the traditional method showed higher levels of monounsaturated and polyunsaturated fatty acids compared with industrial samples. In particular, the contents of n-3 fatty acids and conjugated linoleic acids were higher in samples produced by the traditional method than in samples produced industrially. Sample phospholipid composition also varied between the two technological processes. Phosphatidylethanolamine was the major phospholipid in cream, butter and buttermilk samples obtained by the industrial process as well as in cream and buttermilk samples from the traditional process, while phosphatidylcholine was the major phospholipid in traditionally produced butter. This result may be explained by the different churning processes causing different types of membrane disruption. Generally, samples produced traditionally had higher contents of total phospholipids; in particular, butter produced by the traditional method had a total phospholipid content 33% higher than that of industrially produced butter. The samples studied represent the two types of products present in the Parmigiano Reggiano cheese area, where the industrial churning process is widespread compared with the traditional processing of Reggiana cow's milk. This is because Reggiana cow's milk production is lower than that of other breeds and the traditional churning process is time-consuming and economically disadvantageous. 
However, its products have been demonstrated to contain more bioactive lipids compared with products obtained from other breeds and by the industrial process. © 2013 Society of Chemical Industry.

  1. Errors in patient specimen collection: application of statistical process control.

    PubMed

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.
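
    The SPC spreadsheet described above is, at its core, a p-chart. A minimal sketch of the 3-sigma control limits for the proportion of mislabeled samples, using invented monthly counts rather than the study's hospital data:

```python
import math

# Monthly counts of collected samples (n) and mislabeled samples (d);
# illustrative numbers only, not the study's actual data.
n = [400, 380, 420, 410, 390, 405]
d = [3, 2, 5, 1, 4, 12]

p = [di / ni for di, ni in zip(d, n)]
p_bar = sum(d) / sum(n)          # centre line: pooled error proportion

def limits(ni):
    """3-sigma control limits for a p-chart with subgroup size ni."""
    s = math.sqrt(p_bar * (1 - p_bar) / ni)
    return max(0.0, p_bar - 3 * s), p_bar + 3 * s

# Months whose error proportion falls outside the control limits signal
# a process change worth investigating.
out_of_control = [i for i, (pi, ni) in enumerate(zip(p, n))
                  if not (limits(ni)[0] <= pi <= limits(ni)[1])]
```

    Here the sixth month (12 errors in 405 samples) exceeds its upper control limit, which is exactly the kind of excursion the hospitals used the charts to flag.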

  2. Perpendicular distance sampling: an alternative method for sampling downed coarse woody debris

    Treesearch

    Michael S. Williams; Jeffrey H. Gove

    2003-01-01

    Coarse woody debris (CWD) plays an important role in many forest ecosystem processes. In recent years, a number of new methods have been proposed to sample CWD. These methods select individual logs into the sample using some form of unequal probability sampling. One concern with most of these methods is the difficulty in estimating the volume of each log. A new method...

  3. Methods for producing silicon carbide architectural preforms

    NASA Technical Reports Server (NTRS)

    DiCarlo, James A. (Inventor); Yun, Hee (Inventor)

    2010-01-01

    Methods are disclosed for producing architectural preforms and high-temperature composite structures containing high-strength ceramic fibers with reduced preforming stresses within each fiber, with an in-situ grown coating on each fiber surface, with reduced boron within the bulk of each fiber, and with improved tensile creep and rupture resistance properties for each fiber. The methods include the steps of preparing an original sample of a preform formed from a pre-selected high-strength silicon carbide ceramic fiber type, placing the original sample in a processing furnace under a pre-selected preforming stress state and thermally treating the sample in the processing furnace at a pre-selected processing temperature and hold time in a processing gas having a pre-selected composition, pressure, and flow rate. For the high-temperature composite structures, the method includes additional steps of depositing a thin interphase coating on the surface of each fiber and forming a ceramic or carbon-based matrix within the sample.

  4. Use of one-ply composite tissues in an automated optical assay for recovery of Listeria from food contact surfaces and poultry-processing environments.

    PubMed

    Yan, Zhinong; Vorst, Keith L; Zhang, Lei; Ryser, Elliot T

    2007-05-01

    A novel one-ply composite tissue (CT) method using the Soleris (formerly BioSys) optical analysis system was compared with the conventional U.S. Department of Agriculture (USDA) environmental sponge enrichment method for recovery of Listeria from food contact surfaces and poultry-processing environments. Stainless steel and high-density polyethylene plates were inoculated to contain a six-strain L. monocytogenes cocktail at 10(4), 10(2), and 10 CFU per plate, whereas samples from naturally contaminated surfaces and floor drains from a poultry-processing facility were collected with CTs and environmental sponges. CT samples were transferred into Soleris system vials, and presumptive-positive samples were further confirmed. Sponge samples were processed for Listeria using the USDA culture method. L. monocytogenes recovery rates from inoculated stainless steel and polyethylene surfaces were then compared for the two methods in terms of sensitivity, specificity, and positive and negative predictive values. No significant differences (P > 0.05) were found between the two methods for recovery of L. monocytogenes from any of the inoculated stainless steel and polyethylene surfaces or environmental samples. Sensitivity, specificity, and overall accuracy of the CT-Soleris for recovery of Listeria from environmental samples were 83, 97, and 95%, respectively. Listeria was detected 2 to 3 days sooner with the CT-Soleris method than with the USDA culture method, thus supporting the increased efficacy of this new protocol for environmental sampling.
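
    Sensitivity, specificity, and predictive values as used above follow directly from a 2x2 confusion matrix. A minimal sketch with illustrative counts, chosen to roughly match the reported 83%/97% figures rather than taken from the study's raw data:

```python
# 2x2 confusion matrix of a test method against a reference ("gold
# standard") method; counts are illustrative only.
tp, fp, fn, tn = 25, 3, 5, 95

sensitivity = tp / (tp + fn)                 # true positive rate
specificity = tn / (tn + fp)                 # true negative rate
ppv = tp / (tp + fp)                         # positive predictive value
npv = tn / (tn + fn)                         # negative predictive value
accuracy = (tp + tn) / (tp + fp + fn + tn)   # overall agreement
```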

  5. A Hybrid DNA Extraction Method for the Qualitative and Quantitative Assessment of Bacterial Communities from Poultry Production Samples

    PubMed Central

    Rothrock, Michael J.; Hiett, Kelli L.; Gamble, John; Caudill, Andrew C.; Cicconi-Hogan, Kellie M.; Caporaso, J. Gregory

    2014-01-01

    The efficacy of DNA extraction protocols can be highly dependent upon both the type of sample being investigated and the types of downstream analyses performed. Considering that the use of new bacterial community analysis techniques (e.g., microbiomics, metagenomics) is becoming more prevalent in the agricultural and environmental sciences and many environmental samples within these disciplines can be physiochemically and microbiologically unique (e.g., fecal and litter/bedding samples from the poultry production spectrum), appropriate and effective DNA extraction methods need to be carefully chosen. Therefore, a novel semi-automated hybrid DNA extraction method was developed specifically for use with environmental poultry production samples. This method is a combination of the two major types of DNA extraction: mechanical and enzymatic. A two-step intense mechanical homogenization step (using bead-beating specifically formulated for environmental samples) was added to the beginning of the “gold standard” enzymatic DNA extraction method for fecal samples to enhance the removal of bacteria and DNA from the sample matrix and improve the recovery of Gram-positive bacterial community members. Once the enzymatic extraction portion of the hybrid method was initiated, the remaining purification process was automated using a robotic workstation to increase sample throughput and decrease sample processing error. In comparison to the strict mechanical and enzymatic DNA extraction methods, this novel hybrid method provided the best overall combined performance when considering quantitative (using 16S rRNA qPCR) and qualitative (using microbiomics) estimates of the total bacterial communities when processing poultry feces and litter samples. PMID:25548939

  6. Rapid microscale in-gel processing and digestion of proteins using surface acoustic waves.

    PubMed

    Kulkarni, Ketav P; Ramarathinam, Sri H; Friend, James; Yeo, Leslie; Purcell, Anthony W; Perlmutter, Patrick

    2010-06-21

    A new method for in-gel sample processing and tryptic digestion of proteins is described. Sample preparation, rehydration, in situ digestion and peptide extraction from gel slices are dramatically accelerated by treating the gel slice with surface acoustic waves (SAWs). Only 30 minutes total workflow time is required for this new method to produce base peak chromatograms (BPCs) of similar coverage and intensity to those observed for traditional processing and overnight digestion. Simple setup, good reproducibility, excellent peptide recoveries, rapid turnover of samples and high-confidence protein identifications put this technology at the forefront of the next generation of proteomics sample processing tools.

  7. Thermal imaging measurement of lateral diffusivity and non-invasive material defect detection

    DOEpatents

    Sun, Jiangang; Deemer, Chris

    2003-01-01

    A system and method for determining lateral thermal diffusivity of a material sample using a heat pulse; a sample oriented within an orthogonal coordinate system; an infrared camera; and a computer that has a digital frame grabber, and data acquisition and processing software. The mathematical model used within the data processing software is capable of determining the lateral thermal diffusivity of a sample of finite boundaries. The system and method may also be used as a nondestructive method for detecting and locating cracks within the material sample.
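
    One common way to extract a lateral diffusivity from an infrared image sequence is to track how the in-plane width of a heated spot grows with time; for 1-D heat flow the Gaussian variance grows as sigma^2(t) = sigma0^2 + 2*alpha*t. The sketch below fits that growth law on synthetic profiles with assumed parameter values; it is an illustration of the general principle, not necessarily the patented model:

```python
import numpy as np

# Synthetic "thermal camera" rows: a 1-D Gaussian temperature profile whose
# variance grows as sigma^2(t) = sigma0^2 + 2*alpha*t (1-D heat diffusion).
alpha_true = 1.0e-5        # lateral diffusivity, m^2/s (assumed value)
sigma0 = 1.0e-3            # initial spot half-width, m (assumed value)
times = np.array([0.1, 0.2, 0.3, 0.4, 0.5])   # s
xs = np.linspace(-0.02, 0.02, 801)            # lateral position, m

variances = []
for t in times:
    var = sigma0**2 + 2 * alpha_true * t
    profile = np.exp(-xs**2 / (2 * var))      # normalized temperature rise
    # Recover the variance from the image row as a second moment.
    w = profile / profile.sum()
    variances.append(np.sum(w * xs**2))

# Linear fit of variance vs. time; the diffusivity is half the slope.
slope, intercept = np.polyfit(times, variances, 1)
alpha_est = slope / 2
```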

  8. Damage evolution analysis of coal samples under cyclic loading based on single-link cluster method

    NASA Astrophysics Data System (ADS)

    Zhang, Zhibo; Wang, Enyuan; Li, Nan; Li, Xuelong; Wang, Xiaoran; Li, Zhonghui

    2018-05-01

    In this paper, the acoustic emission (AE) response of coal samples under cyclic loading is measured. The results show a good positive relation between AE parameters and stress. The AE signal of coal samples under cyclic loading exhibits an obvious Kaiser effect. The single-link cluster (SLC) method is applied to analyze the spatial evolution characteristics of AE events and the damage evolution process of coal samples. It is found that the subset scale of the SLC structure becomes smaller and smaller as the number of loading cycles increases, and there is a negative linear relationship between the subset scale and the degree of damage. The spatial correlation length ξ of the SLC structure is calculated. The results show that ξ fluctuates around a certain value from the second to the fifth loading cycle but clearly increases in the sixth. Based on the criterion of microcrack density, the failure process of a coal sample is a transformation from small-scale damage to large-scale damage, which explains the changes in the spatial correlation length. This systematic analysis shows that the SLC method is an effective way to study the damage evolution of coal samples under cyclic loading and will provide an important reference for studying coal bursts.
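
    Single-link clustering of AE event locations can be sketched as a union-find over all event pairs closer than a linking distance; the "subset scale" is then simply each cluster's size. A toy illustration with synthetic 3-D coordinates and a hypothetical linking distance:

```python
import numpy as np

# Toy 3-D AE event locations: two tight groups plus one isolated event.
events = np.array([
    [0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.1],   # group A
    [5.0, 5.0, 5.0], [5.1, 5.0, 5.1],                     # group B
    [10.0, 0.0, 0.0],                                     # isolated event
])

def single_link_clusters(pts, d_max):
    """Single-link clustering: events closer than d_max join one subset."""
    n = len(pts)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path compression
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if np.linalg.norm(pts[i] - pts[j]) <= d_max:
                parent[find(i)] = find(j)   # union the two subsets
    roots = {find(i) for i in range(n)}
    return [[i for i in range(n) if find(i) == r] for r in sorted(roots)]

clusters = single_link_clusters(events, d_max=0.5)
subset_scales = sorted(len(c) for c in clusters)
```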

  9. A Rapid Identification Method for Calamine Using Near-Infrared Spectroscopy Based on Multi-Reference Correlation Coefficient Method and Back Propagation Artificial Neural Network.

    PubMed

    Sun, Yangbo; Chen, Long; Huang, Bisheng; Chen, Keli

    2017-07-01

    As a mineral, the traditional Chinese medicine calamine has a similar shape to many other minerals, and investigations of commercially available calamine samples have shown that many fake and inferior calamine goods are sold on the market. The conventional identification method for calamine is complicated; given the large number of calamine samples, a rapid identification method is therefore needed. To establish a qualitative model using near-infrared (NIR) spectroscopy for rapid identification of various calamine samples, large quantities of calamine samples, including crude products, counterfeits and processed products, were collected and correctly identified using physicochemical and powder X-ray diffraction methods. The NIR spectroscopy method was used to analyze these samples by combining the multi-reference correlation coefficient (MRCC) method and the error back propagation artificial neural network algorithm (BP-ANN), so as to realize the qualitative identification of calamine samples. The accuracy rate of the model based on NIR and MRCC methods was 85%; in addition, the model, which takes multiple factors into consideration, can be used to identify crude calamine products, counterfeits and processed products. Furthermore, by inputting the correlation coefficients of multiple references as the spectral feature data of samples into BP-ANN, a BP-ANN model for qualitative identification was established, whose accuracy rate increased to 95%. The MRCC method can thus serve as an NIR-based feature extraction step in BP-ANN modeling.
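
    The MRCC step can be illustrated as follows: correlate the unknown spectrum against each reference spectrum and use the resulting coefficients as features. The sketch uses synthetic Gaussian "bands" rather than real calamine NIR spectra, and omits the downstream BP-ANN classifier:

```python
import numpy as np

# Synthetic spectra on a normalized wavenumber axis; not real calamine data.
wn = np.linspace(0, 1, 200)

def band(center, width):
    return np.exp(-(wn - center)**2 / (2 * width**2))

# Hypothetical reference spectra, one per sample class.
refs = {
    "crude":       band(0.3, 0.05) + 0.5 * band(0.7, 0.05),
    "processed":   band(0.4, 0.05) + 0.5 * band(0.7, 0.05),
    "counterfeit": band(0.8, 0.05),
}

# Unknown sample: crude-like spectrum plus measurement noise.
rng = np.random.default_rng(1)
sample = refs["crude"] + 0.02 * rng.standard_normal(wn.size)

# Correlation coefficient against each reference = the MRCC feature vector.
features = {name: float(np.corrcoef(sample, r)[0, 1]) for name, r in refs.items()}
best = max(features, key=features.get)
```

    In the full method, the vector of correlation coefficients (not just the single best match) is what gets fed to the BP-ANN.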

  10. Methods for Producing High-Performance Silicon Carbide Fibers, Architectural Preforms, and High-Temperature Composite Structures

    NASA Technical Reports Server (NTRS)

    Yun, Hee-Mann (Inventor); DiCarlo, James A. (Inventor)

    2014-01-01

    Methods are disclosed for producing architectural preforms and high-temperature composite structures containing high-strength ceramic fibers with reduced preforming stresses within each fiber, with an in-situ grown coating on each fiber surface, with reduced boron within the bulk of each fiber, and with improved tensile creep and rupture resistance properties for each fiber. The methods include the steps of preparing an original sample of a preform formed from a pre-selected high-strength silicon carbide ceramic fiber type, placing the original sample in a processing furnace under a pre-selected preforming stress state and thermally treating the sample in the processing furnace at a pre-selected processing temperature and hold time in a processing gas having a pre-selected composition, pressure, and flow rate. For the high-temperature composite structures, the method includes additional steps of depositing a thin interphase coating on the surface of each fiber and forming a ceramic or carbon-based matrix within the sample.

  11. MERCURY IN CRUDE OIL PROCESSED IN THE UNITED STATES (2004)

    EPA Science Inventory

    The mean and range of concentrations of mercury in crude oil processed in the U.S. were investigated using two analytical methods. The sample ensemble consisted of 329 samples from 170 separate crude oil streams that are processed by U.S. refineries. Samples were retrieved imme...

  12. Note: A simple image processing based fiducial auto-alignment method for sample registration.

    PubMed

    Robertson, Wesley D; Porto, Lucas R; Ip, Candice J X; Nantel, Megan K T; Tellkamp, Friedjof; Lu, Yinfei; Miller, R J Dwayne

    2015-08-01

    A simple method for the location and auto-alignment of sample fiducials for sample registration using widely available MATLAB/LabVIEW software is demonstrated. The method is robust, easily implemented, and applicable to a wide variety of experiment types for improved reproducibility and increased setup speed. The software uses image processing to locate and measure the diameter and center point of circular fiducials for distance self-calibration and iterative alignment and can be used with most imaging systems. The method is demonstrated to be fast and reliable in locating and aligning sample fiducials, provided here by a nanofabricated array, with accuracy within the optical resolution of the imaging system. The software was further demonstrated to register, load, and sample the dynamically wetted array.
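
    Measuring a circular fiducial's center and diameter from a binary image reduces to a centroid and an equivalent-circle area calculation (A = pi*(d/2)^2). A minimal sketch on a synthetic image; this mimics the idea, not the authors' MATLAB/LabVIEW code:

```python
import numpy as np

# Synthetic binary camera frame containing one circular fiducial.
h, w = 200, 200
yy, xx = np.mgrid[0:h, 0:w]
cx, cy, r = 120.0, 80.0, 25.0                 # assumed true geometry, px
image = (xx - cx)**2 + (yy - cy)**2 <= r**2   # boolean fiducial mask

# Centre: centroid of the "on" pixels.
ys, xs = np.nonzero(image)
center = (xs.mean(), ys.mean())               # (x, y) in pixels

# Diameter: equivalent circle from the pixel area, d = 2*sqrt(A/pi).
diameter = 2.0 * np.sqrt(image.sum() / np.pi)
```

    With a known pixel pitch, the measured diameter against the fabricated fiducial size gives the distance self-calibration mentioned in the abstract.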

  13. Impact of temperature and time storage on the microbial detection of oral samples by Checkerboard DNA-DNA hybridization method.

    PubMed

    do Nascimento, Cássio; dos Santos, Janine Navarro; Pedrazzi, Vinícius; Pita, Murillo Sucena; Monesi, Nadia; Ribeiro, Ricardo Faria; de Albuquerque, Rubens Ferreira

    2014-01-01

    Molecular diagnostic methods have been widely used in epidemiological and clinical studies to detect and quantify microbial species that may colonize the oral cavity in health or disease. Preservation of genetic material from samples remains the major challenge to ensure the feasibility of these methodologies, and long-term storage may compromise the final result. The aim of this study was to evaluate the effect of storage temperature and time on the microbial detection of oral samples by Checkerboard DNA-DNA hybridization. Saliva and supragingival biofilm were taken from 10 healthy subjects, aliquoted (n=364) and processed according to the proposed protocols: immediate processing or processing after 2 or 4 weeks and 6 or 12 months of storage at 4°C, -20°C and -80°C. Both total and individual microbial counts were lower for samples processed after 12 months of storage, irrespective of the temperature tested. Samples stored for up to 6 months at cold temperatures showed counts similar to those of immediately processed samples. Microbial incidence was also significantly reduced in samples stored for 12 months at all temperatures. Storage temperature and time thus have a relevant impact on the detection and quantification of bacterial and fungal species in oral samples by the Checkerboard DNA-DNA hybridization method. Samples should be processed immediately after collection, or within 6 months if kept at cold temperatures, to avoid false-negative results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moran, James; Alexander, Thomas; Aalseth, Craig

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of T behavior in the environment.

  15. Rapid quantitation of atorvastatin in process pharmaceutical powder sample using Raman spectroscopy and evaluation of parameters related to accuracy of analysis.

    PubMed

    Lim, Young-Il; Han, Janghee; Woo, Young-Ah; Kim, Jaejin; Kang, Myung Joo

    2018-07-05

    The purpose of this study was to determine the atorvastatin (ATV) content in process pharmaceutical powder sample using Raman spectroscopy. To establish the analysis method, the influence of the type of Raman measurements (back-scattering or transmission mode), preparation of calibration sample (simple admixing or granulation), sample pre-treatment (pelletization), and spectral pretreatment on the Raman spectra was investigated. The characteristic peak of the active compound was more distinctively detected in transmission Raman mode with a laser spot size of 4mm than in the back-scattering method. Preparation of calibration samples by wet granulation, identical to the actual manufacturing process, provided unchanged spectral patterns for the in process sample, with no changes and/or shifts in the spectrum. Pelletization before Raman analysis remarkably improved spectral reproducibility by decreasing the difference in density between the samples. Probabilistic quotient normalization led to accurate and consistent quantification of the ATV content in the calibration samples (standard error of cross validation: 1.21%). Moreover, the drug content in the granules obtained from five commercial batches were reliably quantified, with no statistical difference (p=0.09) with that obtained by HPLC assay. From these findings, we suggest that transmission Raman analysis may be a fast and non-invasive method for the quantification of ATV in actual manufacturing processes. Copyright © 2018 Elsevier B.V. All rights reserved.
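
    Probabilistic quotient normalization, used above for spectral pretreatment, scales each spectrum by the median of its point-wise quotients against a reference spectrum, correcting sample-to-sample intensity differences (e.g., packing density). A minimal sketch on synthetic spectra, where the scaling is noise-free so the correction is exact:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": one reference shape measured at three different
# effective intensities (e.g., pellet densities); values are illustrative.
reference = np.abs(rng.normal(1.0, 0.2, 500))
scales = np.array([0.8, 1.0, 1.3])
spectra = scales[:, None] * reference

def pqn(spectrum, ref):
    """Probabilistic quotient normalization against a reference spectrum."""
    quotients = spectrum / ref
    return spectrum / np.median(quotients)

# After PQN, all three spectra collapse onto the reference shape.
normalized = np.array([pqn(s, reference) for s in spectra])
```

    In practice the reference is typically the mean (or median) calibration spectrum, and the median quotient makes the scale estimate robust to bands that genuinely change with composition.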

  16. Rapid quantitation of atorvastatin in process pharmaceutical powder sample using Raman spectroscopy and evaluation of parameters related to accuracy of analysis

    NASA Astrophysics Data System (ADS)

    Lim, Young-Il; Han, Janghee; Woo, Young-Ah; Kim, Jaejin; Kang, Myung Joo

    2018-07-01

    The purpose of this study was to determine the atorvastatin (ATV) content in process pharmaceutical powder sample using Raman spectroscopy. To establish the analysis method, the influence of the type of Raman measurements (back-scattering or transmission mode), preparation of calibration sample (simple admixing or granulation), sample pre-treatment (pelletization), and spectral pretreatment on the Raman spectra was investigated. The characteristic peak of the active compound was more distinctively detected in transmission Raman mode with a laser spot size of 4 mm than in the back-scattering method. Preparation of calibration samples by wet granulation, identical to the actual manufacturing process, provided unchanged spectral patterns for the in process sample, with no changes and/or shifts in the spectrum. Pelletization before Raman analysis remarkably improved spectral reproducibility by decreasing the difference in density between the samples. Probabilistic quotient normalization led to accurate and consistent quantification of the ATV content in the calibration samples (standard error of cross validation: 1.21%). Moreover, the drug content in the granules obtained from five commercial batches were reliably quantified, with no statistical difference (p = 0.09) with that obtained by HPLC assay. From these findings, we suggest that transmission Raman analysis may be a fast and non-invasive method for the quantification of ATV in actual manufacturing processes.

  17. An improved filter elution and cell culture assay procedure for evaluating public groundwater systems for culturable enteroviruses.

    PubMed

    Dahling, Daniel R

    2002-01-01

    Large-scale virus studies of groundwater systems require practical and sensitive procedures for both sample processing and viral assay. Filter adsorption-elution procedures have traditionally been used to process large-volume water samples for viruses. In this study, five filter elution procedures using cartridge filters were evaluated for their effectiveness in processing samples. Of the five procedures tested, the third method, which incorporated two separate beef extract elutions (one being an overnight filter immersion in beef extract), recovered 95% of seeded poliovirus compared with recoveries of 36 to 70% for the other methods. For viral enumeration, an expanded roller bottle quantal assay was evaluated using seeded poliovirus. This cytopathic-based method was considerably more sensitive than the standard plaque assay method. The roller bottle system was more economical than the plaque assay for the evaluation of comparable samples. Using roller bottles required less time and manipulation than the plaque procedure and greatly facilitated the examination of large numbers of samples. The combination of the improved filter elution procedure and the roller bottle assay for viral analysis makes large-scale virus studies of groundwater systems practical. This procedure was subsequently field tested during a groundwater study in which large-volume samples (exceeding 800 L) were processed through the filters.

  18. Why minimally invasive skin sampling techniques? A bright scientific future.

    PubMed

    Wang, Christina Y; Maibach, Howard I

    2011-03-01

    There is increasing interest in minimally invasive skin sampling techniques to assay markers of molecular biology and biochemical processes. This overview examines methodology strengths and limitations, and exciting developments pending in the scientific community. Publications were searched via PubMed, the U.S. Patent and Trademark Office Website, the DermTech Website and the CuDerm Website. The keywords used were noninvasive skin sampling, skin stripping, skin taping, detergent method, ring method, mechanical scrub, reverse iontophoresis, glucose monitoring, buccal smear, hair root sampling, mRNA, DNA, RNA, and amino acid. There is strong interest in finding methods to access internal biochemical, molecular, and genetic processes through noninvasive and minimally invasive external means. Minimally invasive techniques include the widely used skin tape stripping, the abrasion method that includes scraping and detergent, and reverse iontophoresis. The first 2 methods harvest largely the stratum corneum. Hair root sampling (material deeper than the epidermis), buccal smear, shave biopsy, punch biopsy, and suction blistering are also methods used to obtain cellular material for analysis, but involve some degree of increased invasiveness and thus are only briefly mentioned. Existing and new sampling methods are being refined and validated, offering exciting, different noninvasive means of quickly and efficiently obtaining molecular material with which to monitor bodily functions and responses, assess drug levels, and follow disease processes without subjecting patients to unnecessary discomfort and risk.

  19. Impact of Processing Method on Recovery of Bacteria from Wipes Used in Biological Surface Sampling

    PubMed Central

    Olson, Nathan D.; Filliben, James J.; Morrow, Jayne B.

    2012-01-01

    Environmental sampling for microbiological contaminants is a key component of hygiene monitoring and risk characterization practices utilized across diverse fields of application. However, confidence in surface sampling results, both in the field and in controlled laboratory studies, has been undermined by large variation in sampling performance results. Sources of variation include controlled parameters, such as sampling materials and processing methods, which often differ among studies, as well as random and systematic errors; however, the relative contributions of these factors remain unclear. The objective of this study was to determine the relative impacts of sample processing methods, including extraction solution and physical dissociation method (vortexing and sonication), on recovery of Gram-positive (Bacillus cereus) and Gram-negative (Burkholderia thailandensis and Escherichia coli) bacteria from directly inoculated wipes. This work showed that target organism had the largest impact on extraction efficiency and recovery precision, as measured by traditional colony counts. The physical dissociation method (PDM) had negligible impact, while the effect of the extraction solution was organism dependent. Overall, however, extraction of organisms from wipes using phosphate-buffered saline with 0.04% Tween 80 (PBST) resulted in the highest mean recovery across all three organisms. The results from this study contribute to a better understanding of the factors that influence sampling performance, which is critical to the development of efficient and reliable sampling methodologies relevant to public health and biodefense. PMID:22706055

  20. Testing of the Defense Waste Processing Facility Cold Chemical Dissolution Method in Sludge Batch 9 Qualification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T.; Pareizs, J.; Coleman, C.

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) tests the applicability of the digestion methods used by the DWPF Laboratory for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) Receipt samples and SRAT Product process control samples. DWPF SRAT samples are typically dissolved using a method referred to as the DWPF Cold Chemical or Cold Chem Method (CC) (see DWPF Procedure SW4-15.201). Testing indicates that the CC method produced mixed results. The CC method did not result in complete dissolution of either the SRAT Receipt or SRAT Product, with some fine, dark solids remaining. However, elemental analyses did not reveal extreme biases for the major elements in the sludge when compared with analyses obtained following dissolution by hot aqua regia (AR) or sodium peroxide fusion (PF) methods. The CC elemental analyses agreed with the AR and PF methods well enough that the CC method should be adequate for routine process control analyses in the DWPF, once more extensive side-by-side tests of the CC and PF methods are performed on the first 10 SRAT cycles of the Sludge Batch 9 (SB9) campaign. The DWPF Laboratory should continue with their plans for further tests of the CC method during these 10 SRAT cycles.

  1. Improved methods for signal processing in measurements of mercury by Tekran® 2537A and 2537B instruments

    NASA Astrophysics Data System (ADS)

    Ambrose, Jesse L.

    2017-12-01

    Atmospheric Hg measurements are commonly carried out using Tekran® Instruments Corporation's model 2537 Hg vapor analyzers, which employ gold amalgamation preconcentration sampling and detection by thermal desorption (TD) and atomic fluorescence spectrometry (AFS). A generally overlooked and poorly characterized source of analytical uncertainty in those measurements is the method by which the raw Hg atomic fluorescence (AF) signal is processed. Here I describe new software-based methods for processing the raw signal from the Tekran® 2537 instruments, and I evaluate the performances of those methods together with the standard Tekran® internal signal processing method. For test datasets from two Tekran® instruments (one 2537A and one 2537B), I estimate that signal processing uncertainties in Hg loadings determined with the Tekran® method are within ±[1 % + 1.2 pg] and ±[6 % + 0.21 pg], respectively. I demonstrate that the Tekran® method can produce significant low biases (≥ 5 %) not only at low Hg sample loadings (< 5 pg) but also at tropospheric background concentrations of gaseous elemental mercury (GEM) and total mercury (THg) (~1 to 2 ng m-3) under typical operating conditions (sample loadings of 5-10 pg). Signal processing uncertainties associated with the Tekran® method can therefore represent a significant unaccounted-for addition to the overall ~10 to 15 % uncertainty previously estimated for Tekran®-based GEM and THg measurements. Signal processing bias can also add significantly to uncertainties in Tekran®-based gaseous oxidized mercury (GOM) and particle-bound mercury (PBM) measurements, which often derive from Hg sample loadings < 5 pg. In comparison, estimated signal processing uncertainties associated with the new methods described herein are low, ranging from within ±0.053 pg, when the Hg thermal desorption peaks are defined manually, to within ±[2 % + 0.080 pg] when peak definition is automated. 
Mercury limits of detection (LODs) decrease by 31 to 88 % when the new methods are used in place of the Tekran® method. I recommend that signal processing uncertainties be quantified in future applications of the Tekran® 2537 instruments.
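The signal-processing step this abstract discusses amounts to defining each thermal desorption peak and integrating it above a baseline. As a hedged illustration only (not the author's software, and not Tekran®'s internal algorithm), a minimal baseline-corrected trapezoidal peak integration on a synthetic detector trace might look like this; the peak shape, window, and linear baseline are assumptions:

```python
import numpy as np

# Synthetic detector trace: a Gaussian desorption peak on a linear baseline.
# Amplitude, width, and baseline values are illustrative assumptions.
t = np.linspace(0.0, 60.0, 601)                # time, s
true_area = 100.0 * 2.0 * np.sqrt(2 * np.pi)   # amplitude * sigma * sqrt(2*pi)
signal = 100.0 * np.exp(-0.5 * ((t - 30.0) / 2.0) ** 2) + 5.0 + 0.05 * t

peak = (t > 20.0) & (t < 40.0)                 # manually defined peak window
base = ~peak                                   # off-peak points fit the baseline

# Fit a straight-line baseline to the off-peak regions and subtract it.
slope, intercept = np.polyfit(t[base], signal[base], 1)
corrected = signal - (slope * t + intercept)

area = np.trapz(corrected[peak], t[peak])      # peak area by trapezoidal rule
print(round(area, 1))                          # close to the true area ~501.3
```

Automated peak definition, as the abstract notes, replaces the manual window above with a detection rule, which is where the extra ±[2 % + 0.080 pg] uncertainty enters.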

  2. A chemodynamic approach for estimating losses of target organic chemicals from water during sample holding time

    USGS Publications Warehouse

    Capel, P.D.; Larson, S.J.

    1995-01-01

    Minimizing the loss of target organic chemicals from environmental water samples between the time of sample collection and isolation is important to the integrity of an investigation. During this sample holding time, there is a potential for analyte loss through volatilization from the water to the headspace; sorption to the walls and cap of the sample bottle; and transformation through biotic and/or abiotic reactions. This paper presents a chemodynamic-based, generalized approach to estimating the most probable loss processes for individual target organic chemicals. The basic premise is that the investigator must know which loss process(es) are important for a particular analyte, based on its chemodynamic properties, when choosing the appropriate method(s) to prevent loss.
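The volatilization pathway above can be screened with a simple equilibrium mass balance: the fraction of analyte that partitions into the bottle headspace depends on the dimensionless air-water partition (Henry's law) coefficient and the two volumes. This is a hedged sketch of that kind of chemodynamic screening calculation, not the paper's procedure; the coefficient and volumes are invented for illustration:

```python
def headspace_fraction(k_aw, v_air_ml, v_water_ml):
    """Equilibrium fraction of analyte mass in the bottle headspace.

    k_aw: dimensionless air-water partition (Henry's law) coefficient.
    Derived from the mass balance m_air / (m_air + m_water) with
    C_air = k_aw * C_water at equilibrium.
    """
    return k_aw * v_air_ml / (v_water_ml + k_aw * v_air_ml)

# Illustrative values (assumptions): a moderately volatile analyte in a
# 40 mL vial filled with 35 mL of water, leaving 5 mL of headspace.
f = headspace_fraction(k_aw=0.2, v_air_ml=5.0, v_water_ml=35.0)
print(f"{f:.3f}")  # about 2.8% of the analyte partitions to the headspace
```

Filling the bottle with no headspace (v_air = 0) drives this loss term to zero, which is the kind of method choice the chemodynamic screening is meant to inform.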

  3. Device and method for automated separation of a sample of whole blood into aliquots

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.

  4. EPA Tools and Resources Webinar: EPA’s Environmental Sampling and Analytical Methods for Environmental Remediation and Recovery

    EPA Pesticide Factsheets

    EPA’s Environmental Sampling and Analytical Methods (ESAM) is a website tool that supports the entire environmental characterization process from collection of samples all the way to their analyses.

  5. Assessing NIR & MIR Spectral Analysis as a Method for Soil C Estimation Across a Network of Sampling Sites

    NASA Astrophysics Data System (ADS)

    Spencer, S.; Ogle, S.; Borch, T.; Rock, B.

    2008-12-01

    Monitoring soil C stocks is critical to assess the impact of future climate and land use change on carbon sinks and sources in agricultural lands. A benchmark network for soil carbon monitoring of stock changes is being designed for US agricultural lands, with 3000-5000 sites anticipated and re-sampling on a 5- to 10-year basis. Approximately 1000 sites would be sampled per year, producing around 15,000 soil samples to be processed for total, organic, and inorganic carbon, as well as bulk density and nitrogen. Laboratory processing of soil samples is cost and time intensive; therefore we are testing the efficacy of using near-infrared (NIR) and mid-infrared (MIR) spectral methods for estimating soil carbon. As part of an initial implementation of national soil carbon monitoring, we collected over 1800 soil samples from 45 cropland sites in the mid-continental region of the U.S. Samples were processed using standard laboratory methods to determine the variables above. Carbon and nitrogen were determined by dry combustion, and inorganic carbon was estimated with an acid-pressure test. 600 samples are being scanned using a bench-top NIR reflectance spectrometer (30 g of 2 mm oven-dried soil and 30 g of 8 mm air-dried soil) and 500 samples using a MIR Fourier-Transform Infrared Spectrometer (FTIR) with a DRIFT reflectance accessory (0.2 g oven-dried ground soil). Lab-measured carbon will be compared to spectrally-estimated carbon contents using a Partial Least Squares (PLS) multivariate statistical approach. PLS attempts to develop a soil C predictive model that can then be used to estimate C in soil samples that are not lab-processed. Spectral analysis of whole or partially processed soil samples can potentially save both funding resources and the time needed to process samples. This is particularly relevant for the implementation of a national monitoring network for soil carbon. 
This poster will discuss our methods, initial results and potential for using NIR and MIR spectral approaches to either replace or augment traditional lab-based carbon analyses of soils.
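The PLS calibration step described above can be sketched with the classical NIPALS PLS1 algorithm. This is a hedged, generic implementation on invented data, not the authors' soil-carbon model; a real calibration would use measured spectra, cross-validation, and a chosen number of components:

```python
import numpy as np

def pls1_fit(X, y, n_comp):
    """PLS1 regression fit via the NIPALS algorithm (minimal sketch)."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc                    # weight: X-y covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                       # scores
        tt = t @ t
        p = Xc.T @ t / tt                # X loadings
        qk = (yc @ t) / tt               # y loading
        Xc = Xc - np.outer(t, p)         # deflate X and y
        yc = yc - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    beta = W @ np.linalg.solve(P.T @ W, q)   # regression vector
    return x_mean, y_mean, beta

def pls1_predict(model, X_new):
    x_mean, y_mean, beta = model
    return y_mean + (X_new - x_mean) @ beta

# Synthetic check (data invented): a noiseless linear relation in three
# predictors is recovered exactly once all three components are extracted.
rng = np.random.default_rng(1)
X = rng.random((30, 3))
y = X @ np.array([1.0, -2.0, 0.5])
model = pls1_fit(X, y, n_comp=3)
resid = np.max(np.abs(pls1_predict(model, X) - y))
print(resid < 1e-8)  # True
```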

  6. The effect of B{sub 2}O{sub 3} flux on growth NLBCO superconductor by solid state reaction and wet-mixing methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suharta, W. G., E-mail: wgsuharta@gmail.com; Wendri, N.; Ratini, N.

    The synthesis of the B{sub 2}O{sub 3} flux-substituted NLBCO superconductor NdBa{sub 1.75}La{sub 0.25}Cu{sub 3}O{sub 7-∂} has been carried out using solid state reaction and wet-mixing methods in order to obtain homogeneous crystals and a single phase. DTA/TGA characterizations showed that the wet-mixing synthesis requires a lower temperature than the solid state reaction for growing the superconductor NdBa{sub 1.75}La{sub 0.25}Cu{sub 3}O{sub 7-∂}. Therefore, in this research the NdBa{sub 1.75}La{sub 0.25}Cu{sub 3}O{sub 7-∂} samples were calcined at 650°C for the wet-mixing method and at 820°C for the solid state reaction method. All samples were sintered at 950°C for ten hours. Crystallinity of the samples was confirmed using X-ray techniques; generally sharp peaks were obtained, indicating that the samples were well crystallized. Search-match analyses of the diffraction data gave higher weight fractions of impurity phases for the solid state reaction method than for the wet-mixing method. The lattice parameters decreased by about 1% with the addition of B{sub 2}O{sub 3} flux for both synthesis processes, and by 2% for the wet-mixing process, for all samples. Characterization using scanning electron microscopy (SEM) showed that the crystal size distribution for the wet-mixing method was more homogeneous than for the solid state reaction method, with grain sizes of around 150–250 nm. Vibrating sample magnetometer (VSM) results showed paramagnetic properties for all samples.

  7. Frequency-time coherence for all-optical sampling without optical pulse source

    PubMed Central

    Preußler, Stefan; Raoof Mehrpoor, Gilda; Schneider, Thomas

    2016-01-01

    Sampling is the first step to convert an analogue optical signal into a digital electrical signal. The latter can be further processed and analysed by well-known electrical signal processing methods. Optical pulse sources like mode-locked lasers are commonly incorporated for all-optical sampling, but have several drawbacks. A novel approach for simple all-optical sampling is to utilise the frequency-time coherence of each signal. The method is based on using only two coupled modulators driven with an electrical sine wave. Since no optical pulse source is required, simple integration in appropriate platforms, such as Silicon Photonics, might be possible. The presented method enables all-optical sampling with electrically tunable bandwidth, repetition rate and time shift. PMID:27687495

  8. Direct detection of Mycobacterium tuberculosis rifampin resistance in bio-safe stained sputum smears.

    PubMed

    Lavania, Surabhi; Anthwal, Divya; Bhalla, Manpreet; Singh, Nagendra; Haldar, Sagarika; Tyagi, Jaya Sivaswami

    2017-01-01

    Direct smear microscopy of sputum forms the mainstay of TB diagnosis in resource-limited settings. Stained sputum smear slides can serve as a ready-made resource to transport sputum for molecular drug susceptibility testing. However, bio-safety is a major concern during transport of sputum/stained slides and for laboratory workers engaged in processing Mycobacterium tuberculosis infected sputum specimens. In this study, a bio-safe USP (Universal Sample Processing) concentration-based sputum processing method (Bio-safe method) was assessed on 87 M. tuberculosis culture positive sputum samples. Samples were processed for Ziehl-Neelsen (ZN) smear, liquid culture and DNA isolation. DNA isolated directly from sputum was subjected to an IS6110 PCR assay. Both sputum DNA and DNA extracted from bio-safe ZN concentrated smear slides were subjected to rpoB PCR and simultaneously assessed by DNA sequencing for determining rifampin (RIF) resistance. All sputum samples were rendered sterile by the Bio-safe method. Bio-safe smears exhibited a 5% increment in positivity over direct smear with a 14% increment in smear grade status. All samples were positive for IS6110 and rpoB PCR. Thirty-four percent of samples were RIF resistant by rpoB PCR product sequencing. A 100% concordance (κ value = 1) was obtained between sequencing results derived from bio-safe smear slides and bio-safe sputum. This study demonstrates that the Bio-safe method can address safety issues associated with sputum processing, provide an efficient alternative to sample transport in the form of bio-safe stained concentrated smear slides and can also provide information on drug (RIF) resistance by direct DNA sequencing.

  9. Direct detection of Mycobacterium tuberculosis rifampin resistance in bio-safe stained sputum smears

    PubMed Central

    Lavania, Surabhi; Anthwal, Divya; Bhalla, Manpreet; Singh, Nagendra; Haldar, Sagarika; Tyagi, Jaya Sivaswami

    2017-01-01

    Direct smear microscopy of sputum forms the mainstay of TB diagnosis in resource-limited settings. Stained sputum smear slides can serve as a ready-made resource to transport sputum for molecular drug susceptibility testing. However, bio-safety is a major concern during transport of sputum/stained slides and for laboratory workers engaged in processing Mycobacterium tuberculosis infected sputum specimens. In this study, a bio-safe USP (Universal Sample Processing) concentration-based sputum processing method (Bio-safe method) was assessed on 87 M. tuberculosis culture positive sputum samples. Samples were processed for Ziehl-Neelsen (ZN) smear, liquid culture and DNA isolation. DNA isolated directly from sputum was subjected to an IS6110 PCR assay. Both sputum DNA and DNA extracted from bio-safe ZN concentrated smear slides were subjected to rpoB PCR and simultaneously assessed by DNA sequencing for determining rifampin (RIF) resistance. All sputum samples were rendered sterile by the Bio-safe method. Bio-safe smears exhibited a 5% increment in positivity over direct smear with a 14% increment in smear grade status. All samples were positive for IS6110 and rpoB PCR. Thirty-four percent of samples were RIF resistant by rpoB PCR product sequencing. A 100% concordance (κ value = 1) was obtained between sequencing results derived from bio-safe smear slides and bio-safe sputum. This study demonstrates that the Bio-safe method can address safety issues associated with sputum processing, provide an efficient alternative to sample transport in the form of bio-safe stained concentrated smear slides and can also provide information on drug (RIF) resistance by direct DNA sequencing. PMID:29216262

  10. Evaluation of two platelet-rich plasma processing methods and two platelet-activation techniques for use in llamas and alpacas.

    PubMed

    Semevolos, Stacy A; Youngblood, Cori D; Grissom, Stephanie K; Gorman, M Elena; Larson, Maureen K

    2016-11-01

    OBJECTIVE To evaluate 2 processing methods (commercial kit vs conical tube centrifugation) for preparing platelet-rich plasma (PRP) for use in llamas and alpacas. SAMPLES Blood samples (30 mL each) aseptically collected from 6 healthy llamas and 6 healthy alpacas. PROCEDURES PRP was prepared from blood samples by use of a commercial kit and by double-step conical tube centrifugation. A CBC was performed for blood and PRP samples. Platelets in PRP samples were activated by means of a freeze-thaw method with or without 23 mM CaCl2, and concentrations of platelet-derived growth factor-BB and transforming growth factor-β1 were measured. Values were compared between processing methods and camelid species. RESULTS Blood CBC values for llamas and alpacas were similar. The commercial kit yielded a significantly greater degree of platelet enrichment (mean increase, 8.5 fold vs 2.8 fold) and WBC enrichment (mean increase, 3.7 fold vs 1.9 fold) than did conical tube centrifugation. Llamas had a significantly greater degree of platelet enrichment than alpacas by either processing method. No difference in WBC enrichment was identified between species. Concentrations of both growth factors were significantly greater in PRP samples obtained by use of the commercial kit versus those obtained by conical tube centrifugation. CONCLUSIONS AND CLINICAL RELEVANCE For blood samples from camelids, the commercial kit yielded a PRP product with higher platelet and WBC concentrations than achieved by conical tube centrifugation. Optimal PRP platelet and WBC concentrations for various applications need to be determined for llamas and alpacas.

  11. Prevalence, types, and geographical distribution of Listeria monocytogenes from a survey of retail Queso Fresco and associated cheese processing plants and dairy farms in Sonora, Mexico.

    PubMed

    Moreno-Enriquez, R I; Garcia-Galaz, A; Acedo-Felix, E; Gonzalez-Rios, I H; Call, J E; Luchansky, J B; Diaz-Cinco, M E

    2007-11-01

    In the first part of this study, samples were collected from farms, cheese processing plants (CPPs), and retail markets located in various geographical areas of Sonora, Mexico, over a 12-month period during the summer of 2004 and winter of 2005. Four (all Queso Fresco [QF] from retail markets) of 349 total samples tested positive for Listeria monocytogenes (Lm). Of these four positive samples, three were collected in the northern region and one in the southern region of Sonora. Additionally, two were collected during the winter months, and two were collected during the summer months. For the second part of the study, a total of 39 samples from a farm, a CPP, and retail markets were collected and processed according to a combination of the Norma Oficial Mexicana NOM-143-SSA1-1995.10 method (NOM) and the U.S. Food and Drug Administration (FDA) Bacteriological Analytical Manual method, and 27 samples from these same locations were collected and processed according to the U.S. Department of Agriculture Food Safety and Inspection Service method (USDA-FSIS). The NOM-FDA method recovered the pathogen from 6 (15%) of 39 samples (one cheese and five product contact surfaces), while the USDA-FSIS method recovered the pathogen from 5 (18.5%) of 27 samples (all product contact surfaces). In addition, the 40 isolates recovered from the 15 total samples that tested positive for Lm grouped into five distinct pulsotypes that were ca. 60% related, as determined by pulsed-field gel electrophoresis analysis. The results of this study confirmed a 3.4% prevalence of Lm in QF collected from retail markets located in Sonora and no appreciable difference in the effectiveness of either the NOM-FDA or USDA-FSIS method to recover the pathogen from cheese or environmental samples.

  12. A flexible importance sampling method for integrating subgrid processes

    DOE PAGES

    Raut, E. K.; Larson, V. E.

    2016-01-29

    Numerical models of weather and climate need to compute grid-box-averaged rates of physical processes such as microphysics. These averages are computed by integrating subgrid variability over a grid box. For this reason, an important aspect of atmospheric modeling is spatial integration over subgrid scales. The needed integrals can be estimated by Monte Carlo integration. Monte Carlo integration is simple and general but requires many evaluations of the physical process rate. To reduce the number of function evaluations, this paper describes a new, flexible method of importance sampling. It divides the domain of integration into eight categories, such as the portion that contains both precipitation and cloud, or the portion that contains precipitation but no cloud. It then allows the modeler to prescribe the density of sample points within each of the eight categories. The new method is incorporated into the Subgrid Importance Latin Hypercube Sampler (SILHS). Here, the resulting method is tested on drizzling cumulus and stratocumulus cases. In the cumulus case, the sampling error can be considerably reduced by drawing more sample points from the region of rain evaporation.
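Importance-sampled Monte Carlo integration, the core idea behind SILHS, draws sample points from a prescribed density and reweights the integrand by the ratio of densities. A minimal one-dimensional sketch (illustrative only, far simpler than the eight-category subgrid application in the paper):

```python
import random

def importance_estimate(n, rng):
    """Estimate I = ∫_0^1 x^2 dx by sampling from q(x) = 2x.

    The sampling density q concentrates points where the integrand is
    large; each draw is reweighted by f(x)/q(x). Inverse-transform
    sampling from q: x = sqrt(u) with u uniform on [0, 1).
    """
    total = 0.0
    for _ in range(n):
        x = rng.random() ** 0.5         # x ~ q(x) = 2x on [0, 1]
        total += x ** 2 / (2.0 * x)     # f(x) / q(x)
    return total / n

rng = random.Random(0)
est = importance_estimate(100_000, rng)
print(round(est, 3))  # close to the exact value 1/3
```

Because the weight f/q is nearly constant here, the estimator's variance is much lower than for plain uniform sampling, which is the same rationale as drawing more points from the rain-evaporation region in the cumulus case.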

  13. Method for sampling and analysis of volatile biomarkers in process gas from aerobic digestion of poultry carcasses using time-weighted average SPME and GC-MS.

    PubMed

    Koziel, Jacek A; Nguyen, Lam T; Glanville, Thomas D; Ahn, Heekwon; Frana, Timothy S; Hans van Leeuwen, J

    2017-10-01

    A passive sampling method, using retracted solid-phase microextraction (SPME) with gas chromatography-mass spectrometry and time-weighted averaging, was developed and validated for tracking marker volatile organic compounds (VOCs) emitted during aerobic digestion of biohazardous animal tissue. The retracted SPME configuration protects the fragile fiber from buffeting by the process gas stream, and it requires less equipment and is potentially more biosecure than conventional active sampling methods. VOC concentrations predicted via a model based on Fick's first law of diffusion were within 6.6-12.3% of experimentally controlled values after accounting for VOC adsorption to the SPME fiber housing. Method detection limits for five marker VOCs ranged from 0.70 to 8.44 ppbv and were statistically equivalent (p > 0.05) to those for active sorbent-tube-based sampling. A sampling time of 30 min and a fiber retraction of 5 mm were found to be optimal for the tissue digestion process.
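The Fick's-first-law model for retracted-SPME time-weighted average (TWA) sampling relates the mass collected on the fiber, the retraction (diffusion path) length, the analyte's diffusion coefficient in air, the needle opening area, and the sampling time. A hedged sketch of that calculation; all parameter values below are invented for illustration, not taken from the paper:

```python
def twa_concentration(n_ng, z_cm, d_cm2_s, a_cm2, t_s):
    """Time-weighted average gas concentration from a retracted SPME fiber.

    Fick's first law model: C = n * Z / (D * A * t), where n is the mass
    adsorbed on the fiber, Z the retraction (diffusion path) length, D the
    analyte's diffusion coefficient in air, A the needle opening area, and
    t the sampling time. Returns concentration in ng/cm^3.
    """
    return n_ng * z_cm / (d_cm2_s * a_cm2 * t_s)

# Illustrative numbers (assumptions): 2 ng collected over a 30 min sampling
# period with a 0.5 cm retraction, D = 0.08 cm^2/s, and a needle opening
# area of 6.6e-4 cm^2.
c = twa_concentration(n_ng=2.0, z_cm=0.5, d_cm2_s=0.08, a_cm2=6.6e-4, t_s=1800)
print(f"{c:.2f} ng/cm^3")
```

The paper's correction for adsorption to the fiber housing would adjust n before applying this formula.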

  14. Supervision of Ethylene Propylene Diene M-Class (EPDM) Rubber Vulcanization and Recovery Processes Using Attenuated Total Reflection Fourier Transform Infrared (ATR FT-IR) Spectroscopy and Multivariate Analysis.

    PubMed

    Riba Ruiz, Jordi-Roger; Canals, Trini; Cantero, Rosa

    2017-01-01

    Ethylene propylene diene monomer (EPDM) rubber is widely used in diverse applications, such as the automotive, industrial, and construction sectors, among others. Due to its appealing features, the consumption of vulcanized EPDM rubber is growing significantly. However, environmental issues are forcing the application of devulcanization processes to facilitate recovery, which has led rubber manufacturers to implement strict quality controls. Consequently, it is important to develop methods for supervising the vulcanizing and recovery processes of such products. This paper deals with the supervision of EPDM compounds by means of Fourier transform mid-infrared (FT-IR) spectroscopy and suitable multivariate statistical methods. An expedited and nondestructive classification approach was applied to a sufficient number of EPDM samples with different applied processes, that is, samples with and without application of vulcanizing agents, vulcanized samples, and microwave-treated samples. First, the FT-IR spectra of the samples are acquired; they are then processed by applying suitable feature extraction methods, i.e., principal component analysis and canonical variate analysis, to obtain the latent variables used for classifying test EPDM samples. Finally, the k nearest neighbor algorithm is used in the classification stage. Experimental results prove the accuracy of the proposed method and the potential of FT-IR spectroscopy in this area, since the classification accuracy can be as high as 100%.
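The pipeline described (feature extraction to latent variables, then a k-nearest-neighbour decision) can be sketched on synthetic "spectra". This is a hedged illustration with invented data, not the authors' EPDM dataset, and it uses plain PCA for feature extraction rather than canonical variate analysis:

```python
import numpy as np

rng = np.random.default_rng(0)
m = 200                                            # spectral channels
x = np.arange(m)
band_a = np.exp(-0.5 * ((x - 80) / 10.0) ** 2)     # band shared by both classes
band_b = np.exp(-0.5 * ((x - 140) / 8.0) ** 2)     # band present in class 1 only

# 20 synthetic spectra per class with small noise (all values are assumptions).
X0 = band_a + 0.05 * rng.standard_normal((20, m))
X1 = band_a + band_b + 0.05 * rng.standard_normal((20, m))
X = np.vstack([X0, X1])
labels = np.array([0] * 20 + [1] * 20)

train = np.r_[np.arange(0, 18), np.arange(20, 38)]   # 18 per class for training
test = np.r_[np.arange(18, 20), np.arange(38, 40)]   # 2 per class held out

# PCA feature extraction: center on the training mean, project on top-2 PCs.
mu = X[train].mean(axis=0)
_, _, Vt = np.linalg.svd(X[train] - mu, full_matrices=False)
scores = (X - mu) @ Vt[:2].T

def knn_predict(q, k=3):
    """Classify one score vector by majority vote of its k nearest neighbours."""
    d = np.linalg.norm(scores[train] - q, axis=1)
    votes = labels[train][np.argsort(d)[:k]]
    return np.bincount(votes).argmax()

preds = np.array([knn_predict(scores[i]) for i in test])
print(preds, labels[test])   # the held-out spectra should classify correctly
```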

  15. Numerical simulations of regolith sampling processes

    NASA Astrophysics Data System (ADS)

    Schäfer, Christoph M.; Scherrer, Samuel; Buchwald, Robert; Maindl, Thomas I.; Speith, Roland; Kley, Wilhelm

    2017-07-01

    We present recent improvements in the simulation of regolith sampling processes in microgravity using the numerical particle method smoothed particle hydrodynamics (SPH). We use an elastic-plastic soil constitutive model for large deformation and failure flows to capture the dynamical behaviour of regolith. In the context of projected small body (asteroid or small moon) sample return missions, we investigate the efficiency and feasibility of a particular material sampling method: brushes sweep material from the asteroid's surface into a collecting tray. We analyze the influence of different material parameters of regolith, such as cohesion and angle of internal friction, on the sampling rate. Furthermore, we study the sampling process in two environments by varying the surface gravity (Earth's and Phobos') and we apply different rotation rates for the brushes. We find good agreement of our sampling simulations on Earth with experiments and provide estimates of the influence of the material properties on the collecting rate.
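SPH smooths field quantities with a compactly supported kernel, and the cubic spline kernel is a common choice. As a hedged sketch (the abstract does not state which kernel the authors' code uses, so this particular kernel is an assumption), here is the standard 3D cubic spline kernel together with a numerical check that it integrates to one over its support:

```python
import numpy as np

def cubic_spline_w(r, h):
    """Standard 3D cubic spline SPH kernel with support radius h.

    Normalization sigma = 8 / (pi h^3) so that the kernel integrates
    to 1 over the sphere of radius h.
    """
    q = r / h
    sigma = 8.0 / (np.pi * h ** 3)
    w = np.where(q <= 0.5,
                 1.0 - 6.0 * q ** 2 + 6.0 * q ** 3,
                 2.0 * (1.0 - np.clip(q, 0.0, 1.0)) ** 3)
    return sigma * np.where(q <= 1.0, w, 0.0)

# Numerical normalization check: ∫ W(r) 4 pi r^2 dr over [0, h] ≈ 1.
h = 0.1
r = np.linspace(0.0, h, 20001)
integral = np.trapz(cubic_spline_w(r, h) * 4.0 * np.pi * r ** 2, r)
print(round(integral, 4))  # ≈ 1.0
```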

  16. Atmospheric vs. anaerobic processing of metabolome samples for the metabolite profiling of a strict anaerobic bacterium, Clostridium acetobutylicum.

    PubMed

    Lee, Sang-Hyun; Kim, Sooah; Kwon, Min-A; Jung, Young Hoon; Shin, Yong-An; Kim, Kyoung Heon

    2014-12-01

    Well-established metabolome sample preparation is a prerequisite for reliable metabolomic data. For metabolome sampling of a Gram-positive strict anaerobe, Clostridium acetobutylicum, fast filtration and metabolite extraction with acetonitrile/methanol/water (2:2:1, v/v/v) at -20°C under anaerobic conditions has been commonly used. This anaerobic metabolite processing method is laborious and time-consuming since it is conducted in an anaerobic chamber. Also, there has not been any systematic evaluation and development of metabolome sample preparation methods for strict anaerobes and Gram-positive bacteria. In this study, metabolome sampling and extraction methods were rigorously evaluated and optimized for C. acetobutylicum by using gas chromatography/time-of-flight mass spectrometry-based metabolomics, in which a total of 116 metabolites were identified. When comparing the atmospheric (i.e., in air) and anaerobic (i.e., in an anaerobic chamber) processing of metabolome sample preparation, there was no significant difference in the quality and quantity of the metabolomic data. For metabolite extraction, pure methanol at -20°C was a better solvent than acetonitrile/methanol/water (2:2:1, v/v/v) at -20°C that is frequently used for C. acetobutylicum, and metabolite profiles were significantly different depending on extraction solvents. This is the first evaluation of metabolite sample preparation under aerobic processing conditions for an anaerobe. This method could be applied conveniently, efficiently, and reliably to metabolome analysis for strict anaerobes in air.

  17. An improved sampling method of complex network

    NASA Astrophysics Data System (ADS)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Sampling subnets is an important topic of complex network research. Sampling methods influence the structure and characteristics of the subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method can preserve the similarity between the sampled subnet and the original network in degree distribution, connectivity rate, and average shortest path. This method is applicable to situations where prior knowledge about the degree distribution of the original network is not sufficient.
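Snowball expansion, one ingredient of the RMSC process, can be sketched as a breadth-first growth from seed nodes. This is a generic, hedged illustration, not the authors' RMSC algorithm (which additionally mixes in random sampling and Cohen's procedure):

```python
import random
from collections import deque

def snowball_sample(graph, seeds, max_nodes, rng):
    """Breadth-first snowball sampling.

    Grows the sample by repeatedly adding (a random ordering of) each
    frontier node's unseen neighbours until max_nodes nodes are collected.
    graph: dict mapping node -> list of neighbours.
    """
    sampled = set(seeds)
    frontier = deque(seeds)
    while frontier and len(sampled) < max_nodes:
        node = frontier.popleft()
        unseen = [n for n in graph[node] if n not in sampled]
        rng.shuffle(unseen)
        for n in unseen:
            if len(sampled) >= max_nodes:
                break
            sampled.add(n)
            frontier.append(n)
    return sampled

# Toy example: a 10-node ring graph sampled from a single seed node.
ring = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
sub = snowball_sample(ring, seeds=[0], max_nodes=5, rng=random.Random(1))
print(sorted(sub))  # 5 connected nodes around the seed
```

Starting from several random seeds instead of one is what lets the method explore global structure while the snowball step captures local structure.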

  18. Comparison of a specific HPLC determination of toxic aconite alkaloids in processed Radix aconiti with a titration method of total alkaloids.

    PubMed

    Csupor, Dezso; Borcsa, Botond; Heydel, Barbara; Hohmann, Judit; Zupkó, István; Ma, Yan; Widowitz, Ute; Bauer, Rudolf

    2011-10-01

    In traditional Chinese medicine, Aconitum (Ranunculaceae) roots are only applied after processing. Nevertheless, several cases of poisoning by improperly processed aconite roots have been reported. The aim of this study was to develop a reliable analytical method to assess the amount of toxic aconite alkaloids in commercial aconite roots, and to compare this method with the commonly used total alkaloid content determination by titration. The content of mesaconitine, aconitine, and hypaconitine in 16 commercial samples of processed aconite roots was determined by an HPLC method, and the total alkaloid content by indirect titration. Five samples were selected for in vivo toxicological investigation. In most of the commercial samples, toxic alkaloids were not detectable, or only traces were found. In four samples, we could detect >0.04% toxic aconite alkaloids, the highest with a content of 0.16%. The results of HPLC analysis were compared with the results obtained by titration, and no correlation was found between the two methods. The in vivo results supported the validity of the HPLC determination. Samples with mesaconitine, aconitine, and hypaconitine content below the HPLC detection limit still contained up to 0.2% alkaloids as determined by titration. Since titration gives no selective information on the aconitine-type alkaloid content and toxicity of aconite roots, this method is not appropriate for safety assessment. The HPLC method developed by us provides a quick and reliable assessment of toxicity and should be considered as a purity test in pharmacopoeia monographs.
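The comparison between the HPLC and titration results comes down to a correlation check between paired measurements. A hedged sketch with invented numbers (not the paper's data) using the sample Pearson correlation coefficient:

```python
def pearson_r(x, y):
    """Sample Pearson correlation coefficient between paired measurements."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical paired results (values invented): toxic-alkaloid content by
# HPLC (%) vs total alkaloids by titration (%) for six samples.
hplc = [0.00, 0.00, 0.04, 0.16, 0.05, 0.00]
titr = [0.12, 0.20, 0.08, 0.15, 0.18, 0.10]
print(round(pearson_r(hplc, titr), 2))  # |r| well below 1: the assays don't track
```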

  19. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR

    PubMed Central

    Mobli, Mehdi; Hoch, Jeffrey C.

    2017-01-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. PMID:25456315
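The Nyquist condition invoked above is easy to demonstrate numerically: a frequency component above half the sampling rate folds back to an alias within the spectral window of a uniformly sampled record. A hedged, generic illustration (not NMR-specific software; the rates are invented):

```python
import numpy as np

fs = 10.0                      # sampling rate, Hz (uniform interval 1/fs)
n = 100                        # number of samples
t = np.arange(n) / fs
f_true = 7.0                   # tone above the Nyquist limit fs/2 = 5 Hz
x = np.cos(2 * np.pi * f_true * t)

# DFT of the uniformly sampled record: the 7 Hz tone aliases to |7 - 10| = 3 Hz.
spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
print(np.argmax(spectrum))     # bin 30, i.e. 3.0 Hz: the alias of the 7 Hz tone
```

Nonuniform sampling schemes deliberately break this uniform-interval assumption, which is why they require the non-Fourier spectrum estimators the review surveys.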

  20. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    PubMed

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
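The two sampling schemes compared above are easy to state in code: fixed sampling keeps every s-th k-mer start position, while minimizer sampling keeps the smallest k-mer in each window of w consecutive positions. A hedged sketch (lexicographic minimizer order is an assumption for clarity; real tools often use a hash order instead):

```python
def fixed_sampling(seq, k, s):
    """Keep k-mer start positions at a fixed stride s."""
    return sorted(range(0, len(seq) - k + 1, s))

def minimizer_sampling(seq, k, w):
    """Keep, for each window of w consecutive k-mers, the position of the
    lexicographically smallest k-mer (ties broken by leftmost position)."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    chosen = set()
    for start in range(len(kmers) - w + 1):
        window = range(start, start + w)
        chosen.add(min(window, key=lambda i: (kmers[i], i)))
    return sorted(chosen)

seq = "ACGTACGT"
print(fixed_sampling(seq, k=3, s=2))      # [0, 2, 4]
print(minimizer_sampling(seq, k=3, w=3))  # [0, 1, 4]
```

Adjacent windows often share the same minimizer, which is why minimizer sampling tends to keep more positions than fixed sampling at comparable parameters, matching the space comparison in the abstract.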

  1. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    PubMed Central

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  2. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    NASA Astrophysics Data System (ADS)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products entering the output process stream. Therefore, continuous sampling control of product completeness, based on the use of statistical methods, remains an important element in managing the assembly quality of multi-element mass-produced products on automatic rotor lines. A distinctive feature of continuous sampling control of multi-element product completeness during assembly is its destructive nature, which excludes the possibility of returning component parts to the process stream after sampling control and leads to a decrease in the actual productivity of the assembly equipment. Therefore, the use of statistical procedures for continuous sampling control of multi-element product completeness when assembling on automatic rotor lines requires sampling plans that ensure a minimum size of control samples. Comparison of the values of the limit of the average output defect level for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 provides lower limit values for the average output defect level. Also, the average sample size when using the ACSP-1 plan is smaller than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, involving the use of the proposed plans and methods for continuous sampling control, will make it possible to automate sampling control procedures and ensure the required quality level of assembled products while minimizing sample size.
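The mechanics of a classic CSP-1 plan (100% inspection until i consecutive conforming items are seen, then inspection of a random fraction f, reverting to 100% inspection whenever a defect is found and removed) can be illustrated with a small simulation. The parameters, defect rate, and stream below are illustrative only, not values from the paper:

```python
import random

def csp1_outgoing(stream, i, f, rng):
    """Apply a CSP-1 plan to a stream of items (True = defective).
    Screen 100% until i consecutive good items pass, then inspect a
    random fraction f; any defect found is removed and 100% screening
    resumes. Returns the list of outgoing items."""
    out, run, screening = [], 0, True
    for defective in stream:
        inspect = screening or rng.random() < f
        if inspect and defective:     # defect found: remove, restart screening
            screening, run = True, 0
            continue
        out.append(defective)
        if screening:
            run += 1
            if run >= i:              # qualified: switch to sampling mode
                screening, run = False, 0
    return out

rng = random.Random(1)
stream = [rng.random() < 0.05 for _ in range(10_000)]   # ~5% incoming defects
out = csp1_outgoing(stream, i=20, f=0.1, rng=rng)
print(sum(stream), sum(out))   # outgoing defectives never exceed incoming
```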

  3. HPAEC-PAD quantification of Haemophilus influenzae type b polysaccharide in upstream and downstream samples.

    PubMed

    van der Put, Robert M F; de Haan, Alex; van den IJssel, Jan G M; Hamidi, Ahd; Beurret, Michel

    2015-11-27

Due to the rapidly increasing introduction of Haemophilus influenzae type b (Hib) and other conjugate vaccines worldwide during the last decade, reliable and robust analytical methods are needed for the quantitative monitoring of intermediate samples generated during fermentation (upstream processing, USP) and purification (downstream processing, DSP) of polysaccharide vaccine components. This study describes the quantitative characterization of in-process control (IPC) samples generated during the fermentation and purification of the capsular polysaccharide (CPS), polyribosyl-ribitol-phosphate (PRP), derived from Hib. Reliable quantitative methods are necessary for all stages of production; otherwise accurate process monitoring and validation are not possible. Prior to the availability of high performance anion exchange chromatography methods, this polysaccharide was predominantly quantified either with immunochemical methods or with the colorimetric orcinol method, which shows interference from fermentation medium components and reagents used during purification. In addition to an improved high performance anion exchange chromatography-pulsed amperometric detection (HPAEC-PAD) method using a modified gradient elution, both the orcinol assay and high performance size exclusion chromatography (HPSEC) analyses were evaluated. For DSP samples, it was found that the correlation between the results obtained by HPAEC-PAD specific quantification of the PRP monomeric repeat unit released by alkaline hydrolysis, and those from the orcinol method was high (R(2)=0.8762), and that it was lower between HPAEC-PAD and HPSEC results. Additionally, HPSEC analysis of USP samples yielded surprisingly comparable results to those obtained by HPAEC-PAD. In the early part of the fermentation, medium components interfered with the different types of analysis, but quantitative HPSEC data could still be obtained, although lacking the specificity of the HPAEC-PAD method.
Thus, the HPAEC-PAD method has the advantage of giving a specific response compared to the orcinol assay and HPSEC, and does not show interference from various components that can be present in intermediate and purified PRP samples. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. The UK Biobank sample handling and storage protocol for the collection, processing and archiving of human blood and urine.

    PubMed

    Elliott, Paul; Peakman, Tim C

    2008-04-01

    UK Biobank is a large prospective study in the UK to investigate the role of genetic factors, environmental exposures and lifestyle in the causes of major diseases of late and middle age. Extensive data and biological samples are being collected from 500,000 participants aged between 40 and 69 years. The biological samples that are collected and how they are processed and stored will have a major impact on the future scientific usefulness of the UK Biobank resource. The aim of the UK Biobank sample handling and storage protocol is to specify methods for the collection and storage of participant samples that give maximum scientific return within the available budget. Processing or storage methods that, as far as can be predicted, will preclude current or future assays have been avoided. The protocol was developed through a review of the literature on sample handling and processing, wide consultation within the academic community and peer review. Protocol development addressed which samples should be collected, how and when they should be processed and how the processed samples should be stored to ensure their long-term integrity. The recommended protocol was extensively tested in a series of validation studies. UK Biobank collects about 45 ml blood and 9 ml of urine with minimal local processing from each participant using the vacutainer system. A variety of preservatives, anti-coagulants and clot accelerators is used appropriate to the expected end use of the samples. Collection of other material (hair, nails, saliva and faeces) was also considered but rejected for the full cohort. Blood and urine samples from participants are transported overnight by commercial courier to a central laboratory where they are processed and aliquots of urine, plasma, serum, white cells and red cells stored in ultra-low temperature archives. Aliquots of whole blood are also stored for potential future production of immortalized cell lines. 
A standard panel of haematology assays is completed on whole blood from all participants, since such assays need to be conducted on fresh samples (whereas other assays can be done on stored samples). By the end of the recruitment phase, 15 million sample aliquots will be stored in two geographically separate archives: 9.5 million in a -80 degrees C automated archive and 5.5 million in a manual liquid nitrogen archive at -180 degrees C. Because of the size of the study and the numbers of samples obtained from participants, the protocol stipulates a highly automated approach for the processing and storage of samples. Implementation of the processes, technology, systems and facilities has followed best practices used in manufacturing industry to reduce project risk and to build in quality and robustness. The data produced from sample collection, processing and storage are highly complex and are managed by a commercially available LIMS system fully integrated with the entire process. The sample handling and storage protocol adopted by UK Biobank provides quality assured and validated methods that are feasible within the available funding and reflect the size and aims of the project. Experience from recruiting and processing the first 40,000 participants to the study demonstrates that the adopted methods and technologies are fit-for-purpose and robust.

  5. Evaluation of standard methods for collecting and processing fuel moisture samples

    Treesearch

    Sally M. Haase; José Sánchez; David R. Weise

    2016-01-01

A variety of techniques for collecting and processing samples to determine moisture content of wildland fuels in support of fire management activities were evaluated. The effects of using a chainsaw or handsaw to collect samples of large-diameter wood, containers for storing and transporting collected samples, and quick-response ovens for estimating moisture content...
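The moisture content these methods estimate is conventionally expressed on an oven-dry-weight basis; a minimal sketch of that standard calculation (the formula is general fuel-moisture practice, not taken from this abstract):

```python
def fuel_moisture_percent(wet_mass_g, dry_mass_g):
    """Fuel moisture content on a dry-weight basis:
    100 * (wet - oven-dry) / oven-dry."""
    if dry_mass_g <= 0:
        raise ValueError("dry mass must be positive")
    return 100.0 * (wet_mass_g - dry_mass_g) / dry_mass_g

# A 12 g field sample that weighs 10 g after oven drying:
print(fuel_moisture_percent(12.0, 10.0))  # → 20.0 (percent)
```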

  6. Properties of zirconia-toughened-alumina prepared via powder processing and colloidal processing routes.

    PubMed

    Rafferty, A; Alsebaie, A M; Olabi, A G; Prescott, T

    2009-01-15

Alumina-zirconia composites were prepared by two routes: powder processing and colloidal processing. Unstabilised zirconia powder was added to alumina in 5 wt%, 10 wt% and 20 wt% quantities. For the colloidal method, zirconium(IV) propoxide solution was added to alumina powder, also in 5 wt%, 10 wt% and 20 wt% quantities. Additions of glacial acetic acid were needed to form stable suspensions. Suspension stability was verified by pH measurements and sedimentation testing. For the powder-processed samples, Vickers hardness decreased monotonically with increasing ZrO(2) additions, whereas for the colloidal samples the hardness at first decreased but then increased again above 10 wt% ZrO(2). Elastic modulus (E) values decreased with ZrO(2) additions. However, samples containing 20 wt% zirconia prepared via the colloidal method exhibited a much higher modulus than the powder-processed equivalent. This was due to the homogeneous dispersion of zirconia yielding a sample which was less prone to microcracking.

  7. [Influence of different original processing methods on quality of Salvia Miltiorrhizae Radix et Rhizoma from Shandong].

    PubMed

    Zhao, Zhi-Gang; Gao, Shu-Rui; Hou, Jun-Ling; Wang, Wen-Quan; Xu, Zhen-Guang; Song, Yan; Zhang, Xian-Ming; Li, Jun

    2014-04-01

In this paper, the contents of rosmarinic acid, salvianolic acid B, cryptotanshinone and tanshinone II(A) in samples of Salvia Miltiorrhizae Radix et Rhizoma prepared by different original processing methods were determined by HPLC. Different processing methods had varied influences on the four active ingredients. Sun-drying reduced the content of cryptotanshinone, tanshinone II(A) and rosmarinic acid; integral samples were better than those cut into segments. The oven-dry method had a great influence on water-soluble ingredients: high temperature (80-100 degrees C) could easily cause a large loss of rosmarinic acid and salvianolic acid B. The role of the traditional processing method "fahan" was complicated: the content of rosmarinic acid decreased, cryptotanshinone and tanshinone II(A) increased, and salvianolic acid B showed no difference after "fahan". Drying in the shade and oven drying at low temperature (40-60 degrees C) were both effective in preserving the active ingredients of Salvia Miltiorrhizae Radix et Rhizoma, and there was no difference between integral samples and samples cut into segments. Therefore, considering comprehensively the content of active ingredients in Salvia Miltiorrhizae Radix et Rhizoma, processing costs, etc., shade-drying or oven drying at low temperature (40-60 degrees C) should be the most suitable original processing method.

  8. Evaluation of counting methods for oceanic radium-228

    NASA Astrophysics Data System (ADS)

    Orr, James C.

    1988-07-01

Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting method(s), conventional 228Ra counting methods have been compared with some promising techniques which are currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of the 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (~$30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (~$12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation.
After allowing for ingrowth and then counting the 224Ra great-granddaughter, 228Ra could be back-calculated, yielding a high-efficiency method that requires no sample processing. The efficiency and background of each of the three new methods have been estimated and are compared with those of the three methods currently employed to measure oceanic 228Ra. From efficiency and background, the relative figure of merit and the detection limit have been determined for each of the six counters. These data suggest that the new counting methods have the potential to measure most 228Ra samples, to better than 5% precision, with just 30 L of seawater. Not only would this reduce the time, effort, and expense involved in sample collection, but 228Ra could then be measured on many small-volume samples (20-30 L) previously collected with only 226Ra in mind. By measuring 228Ra quantitatively on such small-volume samples, three analyses (large-volume 228Ra, large-volume 226Ra, and small-volume 226Ra) could be reduced to one, thereby dramatically improving analytical precision.
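A counter's relative figure of merit is commonly taken as efficiency squared over background (E²/B); assuming that standard definition (the abstract does not state the formula it used), the efficiencies and backgrounds quoted above can be compared directly:

```python
def figure_of_merit(efficiency_pct, background_cpm):
    """Common counting figure of merit: E^2 / B.
    Higher is better (shorter count time for a given precision)."""
    return efficiency_pct ** 2 / background_cpm

# Efficiency (%) and background (cpm) values quoted in the abstract;
# alpha spectrometry uses the upper end of its 3-9% efficiency range.
counters = {
    "alpha spectrometry (228Th ingrowth)": (9.0, 0.0015),
    "high-resolution gamma spectrometry": (4.8, 0.16),
    "beta-gamma coincidence spectrometry": (5.3, 0.0054),
}
for name, (eff, bkg) in counters.items():
    print(f"{name}: FOM = {figure_of_merit(eff, bkg):.0f}")
```

The comparison reproduces the abstract's ranking: β-γ coincidence far outperforms γ spectrometry despite similar efficiency, because of its much lower background.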

  9. Method of plasma etching Ga-based compound semiconductors

    DOEpatents

    Qiu, Weibin; Goddard, Lynford L.

    2012-12-25

A method of plasma etching Ga-based compound semiconductors includes providing a process chamber and a source electrode adjacent to the process chamber. The process chamber contains a sample comprising a Ga-based compound semiconductor. The sample is in contact with a platen which is electrically connected to a first power supply, and the source electrode is electrically connected to a second power supply. The method includes flowing SiCl4 gas into the chamber, flowing Ar gas into the chamber, and flowing H2 gas into the chamber. RF power is supplied independently to the source electrode and the platen. A plasma is generated based on the gases in the process chamber, and regions of a surface of the sample adjacent to one or more masked portions of the surface are etched to create a substantially smooth etched surface including features having substantially vertical walls beneath the masked portions.

  10. A novel heterogeneous training sample selection method on space-time adaptive processing

    NASA Astrophysics Data System (ADS)

    Wang, Qiang; Zhang, Yongshun; Guo, Yiduo

    2018-04-01

The performance of ground target detection with space-time adaptive processing (STAP) degrades when training samples contaminated by target-like signals make the clutter power non-homogeneous. In order to solve this problem, a novel non-homogeneous training sample selection method based on sample similarity is proposed, which converts training sample selection into a convex optimization problem. Firstly, the existing deficiencies of sample selection using the generalized inner product (GIP) are analyzed. Secondly, the similarities of different training samples are obtained by calculating the mean-Hausdorff distance, so as to reject contaminated training samples. Thirdly, the cell under test (CUT) and the remaining training samples are projected into the orthogonal subspace of the target in the CUT, and mean-Hausdorff distances between the projected CUT and training samples are calculated. Fourthly, the distances are sorted by value, and the training samples with the largest values are preferentially selected to realize the dimension reduction. Finally, simulation results with Mountain-Top data verify the effectiveness of the proposed method.
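The mean-Hausdorff distance used in the similarity step can be sketched as follows. This is a generic scalar-sequence version; the paper's exact sample representation and norm are not specified in the abstract:

```python
def mean_hausdorff(a, b):
    """Symmetric mean-Hausdorff distance between two sequences of
    scalars: average, over each set, of each point's distance to its
    nearest neighbour in the other set, then the mean of the two
    directed distances."""
    def directed(xs, ys):
        return sum(min(abs(x - y) for y in ys) for x in xs) / len(xs)
    return 0.5 * (directed(a, b) + directed(b, a))

print(mean_hausdorff([0.0, 1.0, 2.0], [0.0, 1.0, 2.0]))  # → 0.0
print(mean_hausdorff([0.0, 1.0], [0.5, 1.5]))            # → 0.5
```

Unlike the maximum-based Hausdorff distance, the mean variant is less sensitive to a single outlying point, which is why it is attractive for comparing clutter training samples.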

  11. Image re-sampling detection through a novel interpolation kernel.

    PubMed

    Hilal, Alaa

    2018-06-01

Image re-sampling, involved in re-size and rotation transformations, is an essential building block of typical digital image alterations. Fortunately, traces left by such processes are detectable, proving that the image has undergone a re-sampling transformation. Within this context, we present in this paper two original contributions. First, we propose a new re-sampling interpolation kernel. It depends on five independent parameters that control its amplitude, angular frequency, standard deviation, and duration. We then demonstrate its capacity to imitate the behavior of the interpolation kernels most frequently used in digital image re-sampling applications. Second, the proposed model is used to characterize and detect the correlation coefficients involved in re-sampling transformations. The process involves minimizing an error function using the gradient method. The proposed method is assessed over a large database of 11,000 re-sampled images. Additionally, it is implemented within an algorithm in order to assess images that have undergone complex transformations. The results obtained demonstrate better performance and reduced processing time compared to a reference method, validating the suitability of the proposed approaches. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Rotor for processing liquids using movable capillary tubes

    DOEpatents

    Johnson, Wayne F.; Burtis, Carl A.; Walker, William A.

    1989-01-01

    A rotor assembly for processing liquids, especially whole blood samples, is disclosed. The assembly includes apparatus for separating non-liquid components of whole blood samples from liquid components, apparatus for diluting the separated liquid component with a diluent and apparatus for transferring the diluted sample to an external apparatus for analysis. The rotor assembly employs several movable capillary tubes to handle the sample and diluents. A method for using the rotor assembly to process liquids is also described.

  13. Rotor for processing liquids using movable capillary tubes

    DOEpatents

    Johnson, Wayne F [Loudon, TN; Burtis, Carl A [Oak Ridge, TN; Walker, William A [Knoxville, TN

    1989-05-30

    A rotor assembly for processing liquids, especially whole blood samples, is disclosed. The assembly includes apparatus for separating non-liquid components of whole blood samples from liquid components, apparatus for diluting the separated liquid component with a diluent and apparatus for transferring the diluted sample to an external apparatus for analysis. The rotor assembly employs several movable capillary tubes to handle the sample and diluents. A method for using the rotor assembly to process liquids is also described.

  14. Continuous-flow free acid monitoring method and system

    DOEpatents

    Strain, J.E.; Ross, H.H.

    1980-01-11

    A free acid monitoring method and apparatus is provided for continuously measuring the excess acid present in a process stream. The disclosed monitoring system and method is based on the relationship of the partial pressure ratio of water and acid in equilibrium with an acid solution at constant temperature. A portion of the process stream is pumped into and flows through the monitor under the influence of gravity and back to the process stream. A continuous flowing sample is vaporized at a constant temperature and the vapor is subsequently condensed. Conductivity measurements of the condensate produces a nonlinear response function from which the free acid molarity of the sample process stream is determined.

  15. Continuous-flow free acid monitoring method and system

    DOEpatents

    Strain, James E.; Ross, Harley H.

    1981-01-01

    A free acid monitoring method and apparatus is provided for continuously measuring the excess acid present in a process stream. The disclosed monitoring system and method is based on the relationship of the partial pressure ratio of water and acid in equilibrium with an acid solution at constant temperature. A portion of the process stream is pumped into and flows through the monitor under the influence of gravity and back to the process stream. A continuous flowing sample is vaporized at a constant temperature and the vapor is subsequently condensed. Conductivity measurements of the condensate produces a nonlinear response function from which the free acid molarity of the sample process stream is determined.

  16. Detection of genetically modified organisms in foreign-made processed foods containing corn and potato.

    PubMed

    Monma, Kimio; Araki, Rie; Sagi, Naoki; Satoh, Masaki; Ichikawa, Hisatsugu; Satoh, Kazue; Tobe, Takashi; Kamata, Kunihiro; Hino, Akihiro; Saito, Kazuo

    2005-06-01

    Investigations of the validity of labeling regarding genetically modified (GM) products were conducted using polymerase chain reaction (PCR) methods for foreign-made processed foods made from corn and potato purchased in the Tokyo area and in the USA. Several kinds of GM crops were detected in 12 of 32 samples of processed corn samples. More than two GM events for which safety reviews have been completed in Japan were simultaneously detected in 10 samples. GM events MON810 and Bt11 were most frequently detected in the samples by qualitative PCR methods. MON810 was detected in 11 of the 12 samples, and Bt11 was detected in 6 of the 12 samples. In addition, Roundup Ready soy was detected in one of the 12 samples. On the other hand, CBH351, for which the safety assessment was withdrawn in Japan, was not detected in any of the 12 samples. A trial quantitative analysis was performed on six of the GM maize qualitatively positive samples. The estimated amounts of GM maize in these samples ranged from 0.2 to 2.8%, except for one sample, which contained 24.1%. For this sample, the total amount found by event-specific quantitative analysis was 23.8%. Additionally, Roundup Ready soy was detected in one sample of 21 potato-processed foods, although GM potatoes were not detected in any sample.

  17. Capillary zone electrophoresis method for a highly glycosylated and sialylated recombinant protein: development, characterization and application for process development.

    PubMed

    Zhang, Le; Lawson, Ken; Yeung, Bernice; Wypych, Jette

    2015-01-06

    A purity method based on capillary zone electrophoresis (CZE) has been developed for the separation of isoforms of a highly glycosylated protein. The separation was found to be driven by the number of sialic acids attached to each isoform. The method has been characterized using orthogonal assays and shown to have excellent specificity, precision and accuracy. We have demonstrated the CZE method is a useful in-process assay to support cell culture and purification development of this glycoprotein. Compared to isoelectric focusing (IEF), the CZE method provides more quantitative results and higher sample throughput with excellent accuracy, qualities that are required for process development. In addition, the CZE method has been applied in the stability testing of purified glycoprotein samples.

  18. Development of an automated data processing method for sample to sample comparison of seized methamphetamines.

    PubMed

    Choe, Sanggil; Lee, Jaesin; Choi, Hyeyoung; Park, Yujin; Lee, Heesang; Pyo, Jaesung; Jo, Jiyeong; Park, Yonghoon; Choi, Hwakyung; Kim, Suncheun

    2012-11-30

Information about sources of supply, trafficking routes, distribution patterns and conspiracy links can be obtained from methamphetamine profiling. The precursor and synthetic method used in clandestine manufacture can be estimated from the analysis of minor impurities contained in methamphetamine. Also, the similarity between samples can be evaluated using the peaks that appear in chromatograms. In South Korea, methamphetamine was the most popular drug, but the total amount seized throughout the country was very small; finding links between samples is therefore more important there than the other uses of methamphetamine profiling. Many Asian countries, including Japan and South Korea, have been using the method developed by the National Research Institute of Police Science of Japan. The method uses gas chromatography with flame ionization detection (GC-FID), a DB-5 column and four internal standards. It was developed to increase the amount of impurities and minimize the amount of methamphetamine. After GC-FID analysis, the raw data have to be processed. The data processing steps are complex and require a lot of time and effort. In this study, Microsoft Visual Basic for Applications (VBA) modules were developed to handle these data processing steps. The modules collect the results into an Excel file and then correct the retention time shift and response deviation generated during sample preparation and instrumental analysis. The developed modules were tested for their performance using 10 samples from 5 different cases. The processed results were analyzed with the Pearson correlation coefficient for similarity assessment, and the correlation coefficient of two samples from the same case was more than 0.99.
When the modules were applied to 131 seized methamphetamine samples, four samples from two different cases were found to have a common origin, and the chromatograms of the four samples appeared visually identical. The developed VBA modules could process raw GC-FID data very quickly and easily. Also, they could assess the similarity between samples by peak pattern recognition using whole peaks, without spectral identification of each peak appearing in the chromatogram. The results collectively suggest that the modules would be useful tools to augment similarity assessment between seized methamphetamine samples. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
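The similarity step reduces to a Pearson correlation between two aligned peak-response vectors; a minimal sketch with illustrative data (the 0.99 same-case threshold comes from the abstract, the vectors do not):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length vectors."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative impurity peak-area vectors for two seized samples, after
# retention-time alignment and response correction.
sample_a = [1.0, 0.2, 3.4, 0.0, 0.8, 2.1]
sample_b = [1.1, 0.2, 3.3, 0.1, 0.8, 2.0]
r = pearson(sample_a, sample_b)
print(round(r, 4))
print("likely common origin" if r > 0.99 else "distinct origin")
```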

  19. Time to stabilization in single leg drop jump landings: an examination of calculation methods and assessment of differences in sample rate, filter settings and trial length on outcome values.

    PubMed

    Fransz, Duncan P; Huurnink, Arnold; de Boode, Vosse A; Kingma, Idsart; van Dieën, Jaap H

    2015-01-01

Time to stabilization (TTS) is the time it takes for an individual to return to a baseline or stable state following a jump or hop landing. A large variety of methods exists to calculate TTS. These methods can be described based on four aspects: (1) the input signal used (vertical, anteroposterior, or mediolateral ground reaction force); (2) signal processing (smoothing by sequential averaging, a moving root-mean-square window, or fitting an unbounded third-order polynomial); (3) the stable state (threshold); and (4) the definition of when the (processed) signal is considered stable. Furthermore, differences exist with regard to sample rate, filter settings and trial length. Twenty-five healthy volunteers performed ten 'single leg drop jump landing' trials. For each trial, TTS was calculated according to 18 previously reported methods. Additionally, the effects of sample rate (1000, 500, 200 and 100 samples/s), filter settings (no filter, 40, 15 and 10 Hz), and trial length (20, 14, 10, 7, 5 and 3 s) were assessed. The TTS values varied considerably across the calculation methods. The maximum effects of alterations in the processing settings, averaged over calculation methods, were 2.8% (SD 3.3%) for sample rate, 8.8% (SD 7.7%) for filter settings, and 100.5% (SD 100.9%) for trial length. Different TTS calculation methods are affected differently by sample rate, filter settings and trial length. The effects of differences in sample rate and filter settings are generally small, while trial length has a large effect on TTS values. Copyright © 2014 Elsevier B.V. All rights reserved.
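One common variant among such methods, a moving root-mean-square window compared against a threshold, can be sketched as follows. The window length, threshold, baseline definition, and stability criterion are all method choices and purely illustrative here, not the paper's specific settings:

```python
import math

def time_to_stabilization(force, fs, window_s=0.25, threshold=0.05):
    """Return the first time (s) after which the moving RMS of the
    baseline-corrected signal stays below `threshold` for the rest of
    the trial. `force` is a ground-reaction-force trace sampled at
    `fs` Hz; baseline is taken as the mean of the final window."""
    n = int(window_s * fs)
    baseline = sum(force[-n:]) / n
    x = [f - baseline for f in force]
    rms = [math.sqrt(sum(v * v for v in x[i:i + n]) / n)
           for i in range(len(x) - n + 1)]
    for i in range(len(rms)):
        if all(v < threshold for v in rms[i:]):   # stable from here onward
            return i / fs
    return None  # never stabilized within the trial

# Synthetic landing: decaying oscillation around a body-weight plateau.
fs = 100
trace = [1.0 + math.exp(-t / fs * 4) * math.sin(t / fs * 30) for t in range(300)]
print(time_to_stabilization(trace, fs))
```

The sketch makes the paper's point concrete: trial length enters directly through both the baseline estimate and the "for the rest of the trial" criterion, which is why it dominates the outcome.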

  20. Using stable isotopes to monitor forms of sulfur during desulfurization processes: A quick screening method

    USGS Publications Warehouse

    Liu, Chao-Li; Hackley, Keith C.; Coleman, D.D.; Kruse, C.W.

    1987-01-01

    A method using stable isotope ratio analysis to monitor the reactivity of sulfur forms in coal during thermal and chemical desulfurization processes has been developed at the Illinois State Geological Survey. The method is based upon the fact that a significant difference exists in some coals between the 34S/32S ratios of the pyritic and organic sulfur. A screening method for determining the suitability of coal samples for use in isotope ratio analysis is described. Making these special coals available from coal sample programs would assist research groups in sorting out the complex sulfur chemistry which accompanies thermal and chemical processing of high sulfur coals. ?? 1987.

  1. Effective Porosity Measurements by Wet- and Dry-type Vacuum Saturations using Process-Programmable Vacuum Saturation System

    NASA Astrophysics Data System (ADS)

    Lee, T. J.; Lee, K. S., , Dr; Lee, S. K.

    2017-12-01

One of the most important factors in measuring effective porosity by the vacuum saturation method is that the air in the pore space be fully replaced by water during the vacuum saturation process. The International Society for Rock Mechanics (ISRM) suggests evacuating a rock sample submerged in water, while the American Society for Testing and Materials (ASTM) suggests evacuating the sample and the water separately and then pouring the water over the sample. In this study, we call the former the wet-type vacuum saturation (WVS) method and the latter the dry-type vacuum saturation (DVS) method, and compare the effective porosities measured by the two different vacuum saturation processes. For that purpose, a vacuum saturation system has been developed which can support both WVS and DVS simply by reprogramming the process. Effective porosities were compared for a cement mortar and rock samples. As a result, DVS substitutes more of the void volume with water than WVS, which suggests that DVS provides a more accurate value of effective porosity than WVS.
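Whichever saturation route is used, effective porosity is then typically computed from the saturated, oven-dry, and submerged sample weights via Archimedes' principle; a sketch of that standard calculation (the abstract does not spell out the formula):

```python
def effective_porosity(sat_g, dry_g, sub_g):
    """Effective porosity from buoyancy weighing (weights in grams):
    pore volume ∝ (saturated - dry), bulk volume ∝ (saturated - submerged),
    so porosity = (sat - dry) / (sat - sub); water density cancels."""
    return (sat_g - dry_g) / (sat_g - sub_g)

# Illustrative mortar sample: 262 g saturated, 250 g dry, 150 g submerged.
phi = effective_porosity(262.0, 250.0, 150.0)
print(f"effective porosity = {phi:.1%}")  # → 10.7%
```

An incompletely saturated sample understates `sat_g`, shrinking the numerator, which is exactly why the WVS/DVS difference in air removal shows up directly in the measured porosity.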

  2. Nonuniform sampling and non-Fourier signal processing methods in multidimensional NMR.

    PubMed

    Mobli, Mehdi; Hoch, Jeffrey C

    2014-11-01

    Beginning with the introduction of Fourier Transform NMR by Ernst and Anderson in 1966, time domain measurement of the impulse response (the free induction decay, FID) consisted of sampling the signal at a series of discrete intervals. For compatibility with the discrete Fourier transform (DFT), the intervals are kept uniform, and the Nyquist theorem dictates the largest value of the interval sufficient to avoid aliasing. With the proposal by Jeener of parametric sampling along an indirect time dimension, extension to multidimensional experiments employed the same sampling techniques used in one dimension, similarly subject to the Nyquist condition and suitable for processing via the discrete Fourier transform. The challenges of obtaining high-resolution spectral estimates from short data records using the DFT were already well understood, however. Despite techniques such as linear prediction extrapolation, the achievable resolution in the indirect dimensions is limited by practical constraints on measuring time. The advent of non-Fourier methods of spectrum analysis capable of processing nonuniformly sampled data has led to an explosion in the development of novel sampling strategies that avoid the limits on resolution and measurement time imposed by uniform sampling. The first part of this review discusses the many approaches to data sampling in multidimensional NMR, the second part highlights commonly used methods for signal processing of such data, and the review concludes with a discussion of other approaches to speeding up data acquisition in NMR. Copyright © 2014 Elsevier B.V. All rights reserved.
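A typical nonuniform sampling schedule biases the sampled increments toward early evolution times, where the decaying signal is strongest. A minimal exponentially weighted sketch follows; the weighting and parameters are illustrative, not a specific published scheme:

```python
import math
import random

def exponential_nus_schedule(n_total, n_sampled, decay=2.0, seed=7):
    """Pick n_sampled of n_total indirect-dimension increments,
    weighting early points more heavily (weight ~ exp(-decay * i / n)).
    Returns a sorted list of unique increment indices."""
    rng = random.Random(seed)
    weights = [math.exp(-decay * i / n_total) for i in range(n_total)]
    chosen = set()
    while len(chosen) < n_sampled:
        chosen.add(rng.choices(range(n_total), weights=weights, k=1)[0])
    return sorted(chosen)

# Sample 64 of 256 increments (25% coverage), as in a typical NUS setup.
schedule = exponential_nus_schedule(256, 64)
print(len(schedule), schedule[:8])
```

Because the retained points no longer lie on a uniform grid, such data cannot be processed with the DFT directly, which is what motivates the non-Fourier spectrum-analysis methods the review surveys.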

  3. Recovery and Determination of Adsorbed Technetium on Savannah River Site Charcoal Stack Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lahoda, Kristy G.; Engelmann, Mark D.; Farmer, Orville T.

    2008-03-01

    Experimental results are provided for the sample analyses for technetium (Tc) in charcoal samples placed in-line with a Savannah River Site (SRS) processing stack effluent stream as a part of an environmental surveillance program. The method for Tc removal from charcoal was based on that originally developed with high purity charcoal. Presented is the process that allowed for the quantitative analysis of 99Tc in SRS charcoal stack samples with and without 97Tc as a tracer. The results obtained with the method using the 97Tc tracer quantitatively confirm the results obtained with no tracer added. All samples contain 99Tc at the pg g-1 level.

  4. A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan

    2018-04-01

    This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.
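    Blend-uniformity assessments of this kind typically rest on splitting assay variability into within-location and between-location components. The sketch below shows a generic version of that decomposition; it is not the published GAVE procedure, and the data are hypothetical:

```python
from statistics import mean, pvariance

def variance_components(location_results):
    """Split potency results (% label claim), replicated at several
    blender/sampling locations, into within-location and between-location
    variance components -- a generic sketch of the kind of decomposition
    used in blend-uniformity assessment, not the published GAVE method.
    """
    within = mean(pvariance(g) for g in location_results)
    between = pvariance([mean(g) for g in location_results])
    return within, between
```

    A large between-location component relative to the within-location component flags poor blend uniformity rather than mere assay noise.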

  5. Rotor for processing liquids using movable capillary tubes

    DOEpatents

    Johnson, W.F.; Burtis, C.A.; Walker, W.A.

    1987-07-17

    A rotor assembly for processing liquids, especially whole blood samples, is disclosed. The assembly includes apparatus for separating non-liquid components of whole blood samples from liquid components, apparatus for diluting the separated liquid component with a diluent and apparatus for transferring the diluted sample to an external apparatus for analysis. The rotor assembly employs several movable capillary tubes to handle the sample and diluents. A method for using the rotor assembly to process liquids is also described. 5 figs.

  6. A Fast Multiple Sampling Method for Low-Noise CMOS Image Sensors With Column-Parallel 12-bit SAR ADCs.

    PubMed

    Kim, Min-Kyu; Hong, Seong-Kwan; Kwon, Oh-Kyong

    2015-12-26

    This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4-bit resolution after the first 12-bit A/D conversion, reducing the noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform the complex calculations required for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated using a 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB.
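    The one-over-square-root-of-N noise reduction that multiple sampling exploits can be checked numerically. A purely illustrative Monte Carlo sketch (parameters are arbitrary, not the sensor's actual figures):

```python
import math
import random

def noise_after_averaging(sigma, n_samples, trials=20000, seed=7):
    """Empirical standard deviation of the mean of n_samples noisy
    readings; approaches sigma / sqrt(n_samples), the relation the
    multiple-sampling readout exploits."""
    rng = random.Random(seed)
    means = [sum(rng.gauss(0.0, sigma) for _ in range(n_samples)) / n_samples
             for _ in range(trials)]
    mu = sum(means) / trials
    return math.sqrt(sum((m - mu) ** 2 for m in means) / trials)

# Averaging 16 samples should cut a unit-sigma noise to about 0.25.
```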

  7. Multi-laboratory survey of qPCR enterococci analysis method performance

    EPA Pesticide Factsheets

    Quantitative polymerase chain reaction (qPCR) has become a frequently used technique for quantifying enterococci in recreational surface waters, but there are several methodological options. Here we evaluated how three method permutations (type of mastermix, sample extract dilution, and use of controls in results calculation) affect method reliability among multiple laboratories with respect to sample interference. Multiple samples from each of 22 sites representing an array of habitat types were analyzed using EPA Method 1611 and 1609 reagents with full-strength and five-fold diluted extracts. The presence of interference was assessed three ways: using sample processing and PCR amplification controls; consistency of results across extract dilutions; and relative recovery of target genes from spiked enterococci in water samples compared to control matrices, with acceptable recovery defined as 50 to 200%. Method 1609, which is based on an environmental mastermix, was found to be superior to Method 1611, which is based on a universal mastermix. Method 1611 had over a 40% control assay failure rate with undiluted extracts and a 6% failure rate with diluted extracts. Method 1609 failed in only 11% and 3% of undiluted and diluted extract analyses, respectively. Use of sample processing control assay results in the delta-delta Ct method for calculating relative target gene recoveries increased the number of acceptable recovery results. Delta-delta tended to bias recoveries fr
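    The delta-delta Ct recovery calculation the survey refers to can be sketched as below. The function name is illustrative, and the calculation assumes roughly 100% PCR amplification efficiency (a factor of 2 per cycle):

```python
def relative_recovery_pct(ct_target_sample, ct_spc_sample,
                          ct_target_control, ct_spc_control):
    """Target-gene recovery (%) of a water sample relative to a control
    matrix via the delta-delta Ct method, normalizing each matrix by its
    sample-processing-control (SPC) assay Ct."""
    ddct = ((ct_target_sample - ct_spc_sample)
            - (ct_target_control - ct_spc_control))
    return 100.0 * 2.0 ** (-ddct)
```

    Under the study's criterion, a result is acceptable when it falls between 50% and 200%; one extra cycle of target delay relative to the control (ddCt = 1) corresponds to 50% recovery, right at that boundary.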

  8. Assessment of microbiological contamination of fresh, minimally processed, and ready-to-eat lettuces (Lactuca sativa), Rio de Janeiro State, Brazil.

    PubMed

    Brandão, Marcelo L L; Almeida, Davi O; Bispo, Fernanda C P; Bricio, Silvia M L; Marin, Victor A; Miagostovich, Marize P

    2014-05-01

    This study aimed to assess the microbiological contamination of lettuces commercialized in Rio de Janeiro, Brazil, by investigating the detection of norovirus genogroup II (NoV GII), Salmonella spp., and total and fecal coliforms, including Escherichia coli. For NoV detection, samples were processed using the adsorption-elution concentration method associated with real-time quantitative polymerase chain reaction (qPCR). A total of 90 samples of lettuce, including 30 whole fresh lettuces, 30 minimally processed (MP) lettuces, and 30 raw ready-to-eat (RTE) lettuce salads, were randomly collected from different supermarkets (fresh and MP lettuce samples), food services, and self-service restaurants (RTE lettuce salads), all located in Rio de Janeiro, Brazil, from October 2010 to December 2011. NoV GII was not detected, and the PP7 bacteriophage used as an internal control process (ICP) was recovered in 40.0%, 86.7%, and 76.7% of those samples, respectively. Salmonella spp. was not detected, although fecal contamination was observed, with fecal coliform concentrations higher than 10(2) most probable number/g. E. coli was detected in 70.0%, 6.7%, and 30.0% of fresh, MP, and RTE samples, respectively. This study highlights the need to improve hygiene procedures at all stages of vegetable production and demonstrates the value of the PP7 bacteriophage as an ICP in methods for recovering RNA viruses from MP and RTE lettuce samples, encouraging the evaluation of new protocols that facilitate the establishment of methodologies for NoV detection in a greater number of food microbiology laboratories. The PP7 bacteriophage can be used as an internal control process in methods for recovering RNA viruses from minimally processed and ready-to-eat lettuce samples. © 2014 Institute of Food Technologists®

  9. Sampling naturally contaminated broiler carcasses for Salmonella by three different methods

    USDA-ARS?s Scientific Manuscript database

    Postchill neck skin (NS) maceration and whole carcass rinsing (WCR) are frequently used methods to detect salmonellae from commercially processed broilers. These are practical, nondestructive methods, but they are insensitive and may result in frequent false negatives (20 to 40%). NS samples only ...

  10. Determination of protein carbonyls in plasma, cell extracts, tissue homogenates, isolated proteins: Focus on sample preparation and derivatization conditions

    PubMed Central

    Weber, Daniela; Davies, Michael J.; Grune, Tilman

    2015-01-01

    Protein oxidation is involved in regulatory physiological events as well as in damage to tissues and is thought to play a key role in the pathophysiology of diseases and in the aging process. Protein-bound carbonyls represent a marker of global protein oxidation, as they are generated by multiple different reactive oxygen species in blood, tissues and cells. Sample preparation and stabilization are key steps in the accurate quantification of oxidation-related products and examination of physiological/pathological processes. This review therefore focuses on the sample preparation processes used in the most relevant methods to detect protein carbonyls after derivatization with 2,4-dinitrophenylhydrazine with an emphasis on measurement in plasma, cells, organ homogenates, isolated proteins and organelles. Sample preparation, derivatization conditions and protein handling are presented for the spectrophotometric and HPLC method as well as for immunoblotting and ELISA. An extensive overview covering these methods in previously published articles is given for researchers who plan to measure protein carbonyls in different samples. PMID:26141921

  11. Determination of protein carbonyls in plasma, cell extracts, tissue homogenates, isolated proteins: Focus on sample preparation and derivatization conditions.

    PubMed

    Weber, Daniela; Davies, Michael J; Grune, Tilman

    2015-08-01

    Protein oxidation is involved in regulatory physiological events as well as in damage to tissues and is thought to play a key role in the pathophysiology of diseases and in the aging process. Protein-bound carbonyls represent a marker of global protein oxidation, as they are generated by multiple different reactive oxygen species in blood, tissues and cells. Sample preparation and stabilization are key steps in the accurate quantification of oxidation-related products and examination of physiological/pathological processes. This review therefore focuses on the sample preparation processes used in the most relevant methods to detect protein carbonyls after derivatization with 2,4-dinitrophenylhydrazine with an emphasis on measurement in plasma, cells, organ homogenates, isolated proteins and organelles. Sample preparation, derivatization conditions and protein handling are presented for the spectrophotometric and HPLC method as well as for immunoblotting and ELISA. An extensive overview covering these methods in previously published articles is given for researchers who plan to measure protein carbonyls in different samples. © 2015 Published by Elsevier Ltd.
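    For the spectrophotometric DNPH method the review discusses, carbonyl content is conventionally obtained from Beer-Lambert's law using the widely cited molar absorptivity of protein-bound hydrazones (~22,000 M⁻¹ cm⁻¹ near 370 nm). A sketch of that standard calculation, with illustrative numbers, not values from the review:

```python
EPSILON_DNPH = 22000.0  # M^-1 cm^-1, commonly cited for DNP-hydrazones (~370 nm)

def carbonyl_nmol_per_mg(absorbance, path_cm, protein_mg_per_ml):
    """Protein carbonyl content (nmol per mg protein) from the DNPH
    assay via Beer-Lambert; a textbook calculation, not a procedure
    specific to this review."""
    conc_mol_per_l = absorbance / (EPSILON_DNPH * path_cm)
    # mol/L -> nmol per mg protein: x1e9 nmol/mol, /1000 mL/L, /(mg/mL)
    return conc_mol_per_l * 1e6 / protein_mg_per_ml
```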

  12. lop-DWI: A Novel Scheme for Pre-Processing of Diffusion-Weighted Images in the Gradient Direction Domain.

    PubMed

    Sepehrband, Farshid; Choupan, Jeiran; Caruyer, Emmanuel; Kurniawan, Nyoman D; Gal, Yaniv; Tieng, Quang M; McMahon, Katie L; Vegh, Viktor; Reutens, David C; Yang, Zhengyi

    2014-01-01

    We describe and evaluate a pre-processing method based on a periodic spiral sampling of diffusion-gradient directions for high angular resolution diffusion magnetic resonance imaging. Our pre-processing method incorporates prior knowledge about the acquired diffusion-weighted signal, facilitating noise reduction. Periodic spiral sampling of gradient direction encodings results in an acquired signal in each voxel that is pseudo-periodic with characteristics that allow separation of low-frequency signal from high frequency noise. Consequently, it enhances local reconstruction of the orientation distribution function used to define fiber tracks in the brain. Denoising with periodic spiral sampling was tested using synthetic data and in vivo human brain images. The level of improvement in signal-to-noise ratio and in the accuracy of local reconstruction of fiber tracks was significantly improved using our method.
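    The core idea, retaining the low-frequency content of a pseudo-periodic series while discarding high-frequency noise, can be illustrated with a toy DFT filter. This pure-Python sketch stands in for, and is much simpler than, the paper's actual reconstruction pipeline:

```python
import cmath

def lowpass_dft(signal, keep):
    """Zero all but the `keep` lowest-frequency DFT coefficients (and
    their conjugate partners), then invert. A toy stand-in for separating
    slowly varying signal from high-frequency noise in a pseudo-periodic
    series of diffusion-weighted measurements."""
    n = len(signal)
    coeffs = [sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                  for t in range(n)) for k in range(n)]
    for k in range(n):
        if min(k, n - k) > keep:
            coeffs[k] = 0.0
    return [sum(coeffs[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]
```

    A constant (zero-frequency) signal passes through unchanged, while a maximally alternating (Nyquist-frequency) component is removed entirely.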

  13. Method Development in Forensic Toxicology.

    PubMed

    Peters, Frank T; Wissenbach, Dirk K; Busardo, Francesco Paolo; Marchei, Emilia; Pichini, Simona

    2017-01-01

    In the field of forensic toxicology, the quality of analytical methods is of great importance to ensure the reliability of results and to avoid unjustified legal consequences. A key to high-quality analytical methods is thorough method development. This article provides an overview of the process of developing methods for forensic applications. This includes the definition of the method's purpose (e.g. qualitative vs quantitative) and the analytes to be included, choosing an appropriate sample matrix, and setting up separation and detection systems as well as a versatile sample preparation procedure. Method development is concluded by an optimization process, after which the new method is subject to method validation. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  14. Determination of epichlorohydrin and 1,3-dichloro-2-propanol in synthesis of cationic etherifying reagent by headspace gas chromatography.

    PubMed

    Tao, Zheng-Yi; Chai, Xin-Sheng; Wu, Shu-Bin

    2011-09-16

    This study demonstrates a headspace gas chromatographic (HS-GC) technique for the determination of residual epichlorohydrin (ECH) and generated 1,3-dichloro-2-propanol (DCP) in the synthesis process of 3-chloro-2-hydroxypropyltrimethylammonium chloride (CHTAC). The sample for HS-GC analysis is prepared by a weight-based sampling method coupled with rapid, substantial dilution in a mixed solution of 15.8% sodium sulfate and 0.1% silver nitrate. Based on the reaction stoichiometry, the conversion (R) of CHTAC during the synthesis process can be calculated from the sampling weight and GC peak area. The results showed that the method has good measurement precision (RSD < 2.5%) and accuracy (recovery = 101-104%) for the quantification of both ECH and DCP in the process samples. The present method is simple and accurate and can be used for the efficient determination of CHTAC conversion in synthesis research. Copyright © 2011 Elsevier B.V. All rights reserved.
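    The stoichiometric bookkeeping behind the conversion can be sketched as follows. This is a hypothetical mole-balance illustration; the paper derives R directly from sampling weight and GC peak areas via calibration:

```python
def conversion_pct(ech_initial_mmol, ech_residual_mmol, dcp_mmol):
    """Fraction (%) of epichlorohydrin converted to CHTAC, assuming every
    mole of ECH either remains unreacted, hydrolyzes to DCP, or is
    etherified to CHTAC (hypothetical mole balance, not the paper's
    calibration-based formula)."""
    chtac_mmol = ech_initial_mmol - ech_residual_mmol - dcp_mmol
    return 100.0 * chtac_mmol / ech_initial_mmol
```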

  15. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to build quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  16. Rare behavior of growth processes via umbrella sampling of trajectories

    NASA Astrophysics Data System (ADS)

    Klymko, Katherine; Geissler, Phillip L.; Garrahan, Juan P.; Whitelam, Stephen

    2018-03-01

    We compute probability distributions of trajectory observables for reversible and irreversible growth processes. These results reveal a correspondence between reversible and irreversible processes, at particular points in parameter space, in terms of their typical and atypical trajectories. Thus key features of growth processes can be insensitive to the precise form of the rate constants used to generate them, recalling the insensitivity to microscopic details of certain equilibrium behavior. We obtained these results using a sampling method, inspired by the "s -ensemble" large-deviation formalism, that amounts to umbrella sampling in trajectory space. The method is a simple variant of existing approaches, and applies to ensembles of trajectories controlled by the total number of events. It can be used to determine large-deviation rate functions for trajectory observables in or out of equilibrium.

  17. Estimating removal rates of bacteria from poultry carcasses using two whole-carcass rinse volumes

    USDA-ARS?s Scientific Manuscript database

    Rinse sampling is a common method for determining the level of microbial contamination on poultry carcasses. One of the advantages of rinse sampling, over other carcass sampling methods, is that the results can be used for both process control applications and to estimate the total microbial level o...

  18. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consists of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC technique allows a large, adjustable, and well-controlled sampling volume, which keeps the concentration of the gas target in the headspace phase constant during the entire sampling process and makes the sampling result more representative. Moreover, absorption and derivatization of the gas target during LVCC sampling are efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which makes the LVCC sampling technique readily compatible with subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be < 4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS.

  19. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis.

    PubMed

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-05

    In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consists of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The LVCC technique allows a large, adjustable, and well-controlled sampling volume, which keeps the concentration of the gas target in the headspace phase constant during the entire sampling process and makes the sampling result more representative. Moreover, absorption and derivatization of the gas target during LVCC sampling are efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which makes the LVCC sampling technique readily compatible with subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits. Trace ethylene and SO2 from real fruit samples could be accurately quantified by this method. The minor concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were shown to be < 4.3% and 2.1% respectively. Good recoveries for ethylene and sulfur dioxide from fruit samples were achieved, in the range of 95.0-101% and 97.0-104% respectively. It is expected that the portable LVCC sampling technique will pave the way for rapid on-site analysis of accurate concentrations of trace gas targets from real samples by SERS. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Pre-analytical effects of blood sampling and handling in quantitative immunoassays for rheumatoid arthritis.

    PubMed

    Zhao, Xiaoyan; Qureshi, Ferhan; Eastman, P Scott; Manning, William C; Alexander, Claire; Robinson, William H; Hesterberg, Lyndal K

    2012-04-30

    Variability in pre-analytical blood sampling and handling can significantly impact results obtained in quantitative immunoassays. Understanding the impact of these variables is critical for accurate quantification and validation of biomarker measurements. Particularly, in the design and execution of large clinical trials, even small differences in sample processing and handling can have dramatic effects in analytical reliability, results interpretation, trial management and outcome. The effects of two common blood sampling methods (serum vs. plasma) and two widely-used serum handling methods (on the clot with ambient temperature shipping, "traditional", vs. centrifuged with cold chain shipping, "protocol") on protein and autoantibody concentrations were examined. Matched serum and plasma samples were collected from 32 rheumatoid arthritis (RA) patients representing a wide range of disease activity status. Additionally, a set of matched serum samples with two sample handling methods was collected. One tube was processed per manufacturer's instructions and shipped overnight on cold packs (protocol). The matched tube, without prior centrifugation, was simultaneously shipped overnight at ambient temperatures (traditional). Upon delivery, the traditional tube was centrifuged. All samples were subsequently aliquoted and frozen prior to analysis of protein and autoantibody biomarkers. Median correlation between paired serum and plasma across all autoantibody assays was 0.99 (0.98-1.00) with a median % difference of -3.3 (-7.5 to 6.0). In contrast, observed protein biomarker concentrations were significantly affected by sample types, with median correlation of 0.99 (0.33-1.00) and a median % difference of -10 (-55 to 23). When the two serum collection/handling methods were compared, the median correlation between paired samples for autoantibodies was 0.99 (0.91-1.00) with a median difference of 4%. 
In contrast, significant increases were observed in protein biomarker concentrations among certain biomarkers in samples processed with the 'traditional' method. Autoantibody quantification appears robust to both sample type (plasma vs. serum) and pre-analytical sample collection/handling methods (protocol vs. traditional). In contrast, for non-antibody protein biomarker concentrations, sample type had a significant impact; plasma samples generally exhibit decreased protein biomarker concentrations relative to serum. Similarly, sample handling significantly impacted the variability of protein biomarker concentrations. When biomarker concentrations are combined algorithmically into a single test score such as a multi-biomarker disease activity test for rheumatoid arthritis (MBDA), changes in protein biomarker concentrations may result in a bias of the score. These results illustrate the importance of characterizing pre-analytical methodology, sample type, sample processing and handling procedures for clinical testing in order to ensure test accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
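    The paired-comparison statistic reported throughout this study (median percent difference between matched sample types) is straightforward to compute. A sketch with hypothetical biomarker values, not the study's data:

```python
from statistics import median

def median_pct_difference(reference, comparison):
    """Median paired percent difference of `comparison` relative to
    `reference` (e.g. plasma vs. matched serum biomarker levels)."""
    return median(100.0 * (c - r) / r for r, c in zip(reference, comparison))
```

    Values near zero (as the study found for autoantibodies) indicate robustness to sample type; large negative medians (as found for several protein biomarkers in plasma) indicate a systematic pre-analytical bias.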

  1. Automated PCR setup for forensic casework samples using the Normalization Wizard and PCR Setup robotic methods.

    PubMed

    Greenspoon, S A; Sykes, K L V; Ban, J D; Pollard, A; Baisden, M; Farr, M; Graham, N; Collins, B L; Green, M M; Christenson, C C

    2006-12-20

    Human genome, pharmaceutical and research laboratories have long enjoyed the application of robotics to performing repetitive laboratory tasks. However, the utilization of robotics in forensic laboratories for processing casework samples is relatively new and poses particular challenges. Since the quantity and quality (a mixture versus a single source sample, the level of degradation, the presence of PCR inhibitors) of the DNA contained within a casework sample is unknown, particular attention must be paid to procedural susceptibility to contamination, as well as DNA yield, especially as it pertains to samples with little biological material. The Virginia Department of Forensic Science (VDFS) has successfully automated forensic casework DNA extraction utilizing the DNA IQ™ System in conjunction with the Biomek 2000 Automation Workstation. Human DNA quantitation is also performed in a near complete automated fashion utilizing the AluQuant Human DNA Quantitation System and the Biomek 2000 Automation Workstation. Recently, the PCR setup for casework samples has been automated, employing the Biomek 2000 Automation Workstation and Normalization Wizard, Genetic Identity version, which utilizes the quantitation data, imported into the software, to create a customized automated method for DNA dilution, unique to that plate of DNA samples. The PCR Setup software method, used in conjunction with the Normalization Wizard method and written for the Biomek 2000, functions to mix the diluted DNA samples, transfer the PCR master mix, and transfer the diluted DNA samples to PCR amplification tubes. Once the process is complete, the DNA extracts, still on the deck of the robot in PCR amplification strip tubes, are transferred to pre-labeled 1.5 mL tubes for long-term storage using an automated method. 
The automation of these steps in the process of forensic DNA casework analysis has been accomplished by performing extensive optimization, validation and testing of the software methods.
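    The per-sample arithmetic that quant-driven normalization software performs can be sketched as below. This is a simplified, hypothetical version of the volume calculation, not the Normalization Wizard's actual algorithm:

```python
def normalization_volumes(conc_ng_per_ul, target_ng, final_volume_ul):
    """Volumes (uL) of DNA extract and diluent needed to deliver
    `target_ng` of template in `final_volume_ul` for PCR setup; a
    simplified sketch of what quant-driven normalization computes
    per sample from imported quantitation data."""
    dna_ul = target_ng / conc_ng_per_ul
    if dna_ul > final_volume_ul:
        # Sample too dilute to reach the target: use it neat and
        # accept less template (low-level casework samples).
        return final_volume_ul, 0.0
    return dna_ul, final_volume_ul - dna_ul
```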

  2. Application of binomial and multinomial probability statistics to the sampling design process of a global grain tracing and recall system

    USDA-ARS?s Scientific Manuscript database

    Small, coded, pill-sized tracers embedded in grain are proposed as a method for grain traceability. A sampling process for a grain traceability system was designed and investigated by applying probability statistics using a science-based sampling approach to collect an adequate number of tracers fo...
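    The basic binomial calculation underlying such a sampling design can be sketched as follows; the well-mixed-lot model and the numbers are illustrative, not the paper's design values:

```python
def p_at_least_one(n_tracers, lot_mass_kg, sample_mass_kg):
    """Probability that a sample drawn from a well-mixed lot containing
    n_tracers captures at least one tracer, modeling each tracer as
    landing in the sample independently with probability equal to the
    sampled mass fraction (binomial model)."""
    p_in = sample_mass_kg / lot_mass_kg
    return 1.0 - (1.0 - p_in) ** n_tracers
```

    Solving this relation for n_tracers gives the minimum number of tracers to embed per lot to reach a desired detection confidence at a given sample size.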

  3. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods. 5 figs.

  4. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  5. Direct quantitation of the preservatives benzoic and sorbic acid in processed foods using derivative spectrophotometry combined with micro dialysis.

    PubMed

    Fujiyoshi, Tomoharu; Ikami, Takahito; Kikukawa, Koji; Kobayashi, Masato; Takai, Rina; Kozaki, Daisuke; Yamamoto, Atsushi

    2018-02-01

    The preservatives benzoic acid and sorbic acid are generally quantified with separation techniques, such as HPLC or GC. Here we describe a new method for determining these compounds in processed food samples based on the narrowing of UV-visible spectral bands achieved by derivative processing, which permits more selective identification and determination of target analytes in complex matrices. After a sample is purified by micro dialysis, UV spectra of the sample solutions are measured and fourth-order derivatives of the spectrum are calculated. The amplitude between the maximum and minimum values in a high-order derivative spectrum is used for the determination of benzoic acid and sorbic acid. Benzoic acid and sorbic acid levels in several commercially available processed foods were measured by HPLC and the proposed spectrometry method. The levels obtained by the two methods were highly correlated (r² > 0.97) for both preservatives. Copyright © 2017 Elsevier Ltd. All rights reserved.
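    A fourth-order derivative of a sampled spectrum can be approximated by the standard central finite-difference stencil. A minimal sketch (real instruments typically apply Savitzky-Golay smoothing first, which this omits):

```python
def fourth_derivative(spectrum, step=1.0):
    """Fourth-order derivative of an absorbance spectrum by the central
    finite-difference stencil [1, -4, 6, -4, 1]. Derivative processing
    narrows overlapping bands so a peak-to-trough amplitude can be read
    off for quantitation; two points are lost at each end."""
    return [(spectrum[i - 2] - 4 * spectrum[i - 1] + 6 * spectrum[i]
             - 4 * spectrum[i + 1] + spectrum[i + 2]) / step ** 4
            for i in range(2, len(spectrum) - 2)]
```

    As a sanity check, the fourth derivative of a cubic baseline vanishes, which is exactly why derivative processing suppresses broad, slowly varying matrix background.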

  6. Defect recognition in CFRP components using various NDT methods within a smart manufacturing process

    NASA Astrophysics Data System (ADS)

    Schumacher, David; Meyendorf, Norbert; Hakim, Issa; Ewert, Uwe

    2018-04-01

    The manufacturing process of carbon fiber reinforced polymer (CFRP) components is gaining an increasingly significant role given the growing amount of CFRPs used in industry today. Monitoring the manufacturing process, and hence the reliability of the manufactured products, is one of the major challenges we need to face in the near future. Common defects which arise during the manufacturing process are, e.g., porosity and voids, which may lead to delaminations during operation and under load. Finding irregularities and classifying them as possible defects at an early stage of the manufacturing process is of high importance for the safety and reliability of the finished products, as well as of significant impact from an economic point of view. In this study we compare various NDT methods which were applied to similar CFRP laminate samples in order to detect and characterize regions of defective volume. Besides ultrasound, thermography and eddy current, different X-ray methods like radiography, laminography and computed tomography are used to investigate the samples. These methods are compared with the intention to evaluate their capability to reliably detect and characterize defective volume. Beyond the detection and evaluation of defects, we also investigate possibilities to combine various NDT methods within a smart manufacturing process in which the decision of which method shall be applied is inherent within the process. Is it possible to design an in-line or at-line testing process which can recognize defects reliably while reducing testing time and costs? This study aims to highlight opportunities for designing a smart NDT process synchronized to production, based on the concepts of smart production (Industry 4.0). A set of defective CFRP laminate samples and different NDT methods were used to demonstrate how effectively defects are recognized and how communication between interconnected NDT sensors and the manufacturing process could be organized.

  7. Improvements to sample processing and measurement to enable more widespread environmental application of tritium.

    PubMed

    Moran, James; Alexander, Thomas; Aalseth, Craig; Back, Henning; Mace, Emily; Overman, Cory; Seifert, Allen; Freeburg, Wilcox

    2017-08-01

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. We present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. We identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, corresponds to as little as 0.00133 Bq of total T activity. This enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment. Copyright © 2017. Published by Elsevier Ltd.
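
As a quick plausibility check, the quoted limit can be reproduced from the sample size and the definition of the tritium unit. The sketch below assumes standard constants not stated in the abstract (tritium half-life 12.32 y; 1 TU = one tritium atom per 10^18 hydrogen atoms); the helper name is illustrative, not code from the paper.

```python
import math

# Back-of-envelope check of the quoted quantification limit. Assumed
# constants (not from the abstract): tritium half-life 12.32 y and the
# definition 1 TU = one tritium atom per 1e18 hydrogen atoms.
N_A = 6.022e23                          # Avogadro's number, 1/mol
M_WATER = 18.015                        # molar mass of water, g/mol
T_HALF_S = 12.32 * 365.25 * 24 * 3600   # tritium half-life, s

def tritium_activity_bq(mass_g, tu):
    """Activity (Bq) of a water sample of given mass at a given TU level."""
    h_atoms = 2 * (mass_g / M_WATER) * N_A   # two H atoms per H2O molecule
    t_atoms = tu * 1e-18 * h_atoms           # TU definition
    decay_const = math.log(2) / T_HALF_S     # decay constant, 1/s
    return t_atoms * decay_const

# 120 mg of water at the 92.2 TU quantification limit
print(tritium_activity_bq(0.120, 92.2))  # ~0.0013 Bq, consistent with the abstract
```

With 0.120 g of water at 92.2 TU this yields about 1.3 mBq, in line with the 0.00133 Bq quoted above.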

  8. Improvements to sample processing and measurement to enable more widespread environmental application of tritium

    DOE PAGES

    Moran, James; Alexander, Thomas; Aalseth, Craig; ...

    2017-01-26

    Previous measurements have demonstrated the wealth of information that tritium (T) can provide on environmentally relevant processes. Here, we present modifications to sample preparation approaches that enable T measurement by proportional counting on small sample sizes equivalent to 120 mg of water and demonstrate the accuracy of these methods on a suite of standardized water samples. We also identify a current quantification limit of 92.2 TU which, combined with our small sample sizes, correlates to as little as 0.00133 Bq of total T activity. Furthermore, this enhanced method should provide the analytical flexibility needed to address persistent knowledge gaps in our understanding of both natural and artificial T behavior in the environment.

  9. Comparison of two sampling and culture systems for detection of Salmonella enterica in the environment of a large animal hospital.

    PubMed

    Ruple-Czerniak, A; Bolte, D S; Burgess, B A; Morley, P S

    2014-07-01

    Nosocomial salmonellosis is an important problem in veterinary hospitals that treat horses and other large animals. Detection and mitigation of outbreaks and prevention of healthcare-associated infections often require detection of Salmonella enterica in the hospital environment. The objective was to compare 2 previously published methods for detecting environmental contamination with S. enterica in a large animal veterinary teaching hospital, in a hospital-based comparison of environmental sampling techniques. A total of 100 pairs of environmental samples were collected from stalls used to house large animal cases (horses, cows or New World camelids) that were confirmed to be shedding S. enterica by faecal culture. Stalls were cleaned and disinfected prior to sampling, and the same areas within each stall were sampled for the paired samples. One method of detection used sterile, premoistened sponges that were cultured using thioglycolate enrichment before plating on XLT-4 agar. The other method used electrostatic wipes that were cultured using buffered peptone water, tetrathionate and Rappaport-Vassiliadis R10 broths before plating on XLT-4 agar. Salmonella enterica was recovered from 14% of samples processed using the electrostatic wipe sampling and culture procedure, whereas S. enterica was recovered from only 4% of samples processed using the sponge sampling and culture procedure. There was test agreement for 85 pairs of culture-negative samples and 3 pairs of culture-positive samples. However, the remaining 12 pairs of samples with discordant results created significant disagreement between the 2 detection methods (P<0.01). Persistence of Salmonella in the environment of veterinary hospitals can occur even with rigorous cleaning and disinfection. Use of sensitive methods for detection of environmental contamination is critical when detecting and mitigating this problem in veterinary hospitals.
These results suggest that the electrostatic wipe sampling and culture method was more sensitive than the sponge sampling and culture method. © 2013 EVJ Ltd.

  10. High-frequency, long-duration water sampling in acid mine drainage studies: a short review of current methods and recent advances in automated water samplers

    USGS Publications Warehouse

    Chapin, Thomas

    2015-01-01

    Hand-collected grab samples are the most common water sampling method but using grab sampling to monitor temporally variable aquatic processes such as diel metal cycling or episodic events is rarely feasible or cost-effective. Currently available automated samplers are a proven, widely used technology and typically collect up to 24 samples during a deployment. However, these automated samplers are not well suited for long-term sampling in remote areas or in freezing conditions. There is a critical need for low-cost, long-duration, high-frequency water sampling technology to improve our understanding of the geochemical response to temporally variable processes. This review article will examine recent developments in automated water sampler technology and utilize selected field data from acid mine drainage studies to illustrate the utility of high-frequency, long-duration water sampling.

  11. Data preprocessing methods of FT-NIR spectral data for the classification of cooking oil

    NASA Astrophysics Data System (ADS)

    Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli

    2014-12-01

    This work describes data pre-processing methods for FT-NIR spectroscopy datasets of cooking oil and its quality parameters using chemometric methods. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometrics modelling. Hence, this work investigates the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling and single scaling with Standard Normal Variate (SNV). The combinations of these scaling methods affect exploratory analysis and classification via Principal Component Analysis (PCA) plots. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectra in absorbance mode over the range 4000 cm-1 to 14000 cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method. The number of samples in each class was kept equal to 2/3 of the class with the minimum number of samples. A t-statistic was then employed as the variable selection method in order to select which variables are significant for the classification models. The data pre-processing was evaluated using the modified silhouette width (mSW), PCA and the percentage correctly classified (%CC). The results show that different pre-processing strategies lead to substantial differences in model performance. The effects of the data pre-processing methods, i.e. row scaling, column standardisation and single scaling with Standard Normal Variate, are indicated by mSW and %CC. For the two-PC model, all five classifiers gave high %CC except Quadratic Distance Analysis.
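
As an illustration of one of the pre-processing steps compared above, SNV row scaling standardizes each spectrum by its own mean and standard deviation, removing multiplicative scatter differences between samples before PCA. This is a minimal sketch with hypothetical absorbance values, not the authors' code.

```python
from statistics import mean, stdev

def snv(spectrum):
    """Standard Normal Variate: centre and scale a single spectrum (row-wise)."""
    m = mean(spectrum)
    s = stdev(spectrum)
    return [(x - m) / s for x in spectrum]

# hypothetical absorbance values for one spectrum
raw = [0.12, 0.15, 0.11, 0.40, 0.22]
corrected = snv(raw)
print([round(v, 3) for v in corrected])
```

After SNV every spectrum has zero mean and unit standard deviation, so spectra differing only by a multiplicative scatter factor and an offset become directly comparable.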

  12. Robust Design of Sheet Metal Forming Process Based on Kriging Metamodel

    NASA Astrophysics Data System (ADS)

    Xie, Yanmin

    2011-08-01

    Sheet metal forming process design is not a trivial task, owing to the complex issues that must be taken into account (conflicting design goals, forming of complex shapes, and so on). Optimization methods have been widely applied in sheet metal forming, so proper design methods to reduce time and costs have to be developed, mostly based on computer-aided procedures. At the same time, variations during manufacturing processes may significantly influence final product quality, rendering optimal solutions non-robust. In this paper, a small design of experiments is conducted to investigate how the stochastic behavior of noise factors affects drawing quality. The finite element software LS-DYNA is used to simulate the complex sheet metal stamping processes. A Kriging metamodel is adopted to map the relation between input process parameters and part quality. The robust design model for the sheet metal forming process integrates adaptive importance sampling with the Kriging model, in order to minimize the impact of the variations and achieve reliable process parameters. In the adaptive sampling, an improved criterion is used to indicate where additional training samples can be added to improve the Kriging model. Nonlinear test functions and a square stamping example (NUMISHEET'93) are employed to verify the proposed method. The final results indicate the feasibility of applying the proposed method to multi-response robust design.

  13. A Comprehensive Software and Database Management System for Glomerular Filtration Rate Estimation by Radionuclide Plasma Sampling and Serum Creatinine Methods.

    PubMed

    Jha, Ashish Kumar

    2015-01-01

    Glomerular filtration rate (GFR) estimation by the plasma sampling method is considered the gold standard. However, this method is not widely used because of the complex technique and cumbersome calculations, coupled with the lack of user-friendly software. The routinely used serum creatinine method (SrCrM) of GFR estimation also requires online calculators, which cannot be used without internet access. We have developed user-friendly software, "GFR estimation software", which gives the option to estimate GFR by the plasma sampling method as well as SrCrM. We used Microsoft Windows(®) as the operating system, Visual Basic 6.0 as the front end, and Microsoft Access(®) as the database tool to develop this software. Russell's formula is used for GFR calculation by the plasma sampling method. GFR calculations using serum creatinine are done using the MIRD, Cockcroft-Gault, Schwartz, and Counahan-Barratt methods. The developed software performs the mathematical calculations correctly and is user-friendly. It also enables storage and easy retrieval of the raw data, patient information, and calculated GFR for further processing and comparison. This user-friendly software calculates GFR by various plasma sampling methods and blood parameters, and is also a good system for storing raw and processed data for future analysis.
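
Of the serum-creatinine formulas listed, Cockcroft-Gault is the most compact. A minimal sketch of the standard published formula (not the software's source code):

```python
def cockcroft_gault(age_years, weight_kg, scr_mg_dl, female=False):
    """Estimated creatinine clearance (mL/min) by the Cockcroft-Gault formula.

    CrCl = ((140 - age) * weight) / (72 * serum creatinine), times 0.85 for women.
    """
    crcl = ((140 - age_years) * weight_kg) / (72.0 * scr_mg_dl)
    return 0.85 * crcl if female else crcl

# 60-year-old, 72 kg male with serum creatinine 1.0 mg/dL
print(round(cockcroft_gault(60, 72, 1.0), 1))  # 80.0 mL/min
```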

  14. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    PubMed

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites; however, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for human plasma, and it entailed a single sample preparation method, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% at the lower limit of quantification and <14.3% at the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single, simple sample preparation step followed by an LC-MS method with a short run time. This analytical method is therefore useful for both clinical and research purposes.

  15. Rapid-viability PCR method for detection of live, virulent Bacillus anthracis in environmental samples.

    PubMed

    Létant, Sonia E; Murphy, Gloria A; Alfaro, Teneile M; Avila, Julie R; Kane, Staci R; Raber, Ellen; Bunt, Thomas M; Shah, Sanjiv R

    2011-09-01

    In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples.
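
The core decision logic described above, using the change in cycle threshold (Ct) across incubation to flag live organisms, can be sketched as follows; the ΔCt cutoff used here is illustrative, not the criterion validated in the study.

```python
def rv_pcr_viable(ct_before, ct_after, min_delta_ct=6.0):
    """Flag a sample as containing live organisms when Ct drops enough after incubation.

    Growth of viable cells during incubation multiplies the target DNA, which
    lowers the cycle threshold; dead cells leave Ct essentially unchanged.
    The min_delta_ct cutoff is a hypothetical value for illustration.
    """
    return (ct_before - ct_after) >= min_delta_ct

print(rv_pcr_viable(35.2, 24.1))  # True: large Ct drop indicates growth
print(rv_pcr_viable(35.2, 34.8))  # False: no meaningful change, no viable cells
```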

  16. Direct PCR amplification of forensic touch and other challenging DNA samples: A review.

    PubMed

    Cavanaugh, Sarah E; Bathrick, Abigail S

    2018-01-01

    DNA evidence sample processing typically involves DNA extraction, quantification, and STR amplification; however, DNA loss can occur at both the DNA extraction and quantification steps, which is not ideal for forensic evidence containing low levels of DNA. Direct PCR amplification of forensic unknown samples has been suggested as a means to circumvent extraction and quantification, thereby retaining the DNA typically lost during those procedures. Direct PCR amplification is a method in which a sample is added directly to an amplification reaction without being subjected to prior DNA extraction, purification, or quantification. It allows for maximum quantities of DNA to be targeted, minimizes opportunities for error and contamination, and reduces the time and monetary resources required to process samples, although data analysis may take longer as the increased DNA detection sensitivity of direct PCR may lead to more instances of complex mixtures. ISO 17025 accredited laboratories have successfully implemented direct PCR for limited purposes (e.g., high-throughput databanking analysis), and recent studies indicate that direct PCR can be an effective method for processing low-yield evidence samples. Despite its benefits, direct PCR has yet to be widely implemented across laboratories for the processing of evidentiary items. While forensic DNA laboratories are always interested in new methods that will maximize the quantity and quality of genetic information obtained from evidentiary items, there is often a lag between the advent of useful methodologies and their integration into laboratories. Delayed implementation of direct PCR of evidentiary items can be attributed to a variety of factors, including regulatory guidelines that prevent laboratories from omitting the quantification step when processing forensic unknown samples, as is the case in the United States, and, more broadly, a reluctance to validate a technique that is not widely used for evidence samples. 
The advantages of direct PCR of forensic evidentiary samples justify a re-examination of the factors that have delayed widespread implementation of this method and of the evidence supporting its use. In this review, the current and potential future uses of direct PCR in forensic DNA laboratories are summarized. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Efficient free energy calculations by combining two complementary tempering sampling methods.

    PubMed

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun

    2017-01-14

    Although energy barriers can be efficiently crossed in the reaction coordinate (RC) guided sampling, this type of method suffers from identification of the correct RCs or requirements of high dimensionality of the defined RCs for a given system. If only the approximate RCs with significant barriers are used in the simulations, hidden energy barriers with small to medium height would exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause the problem of insufficient sampling. To address the sampling in this so-called hidden barrier situation, here we propose an effective approach to combine temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with the integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD and the sampling of the rest of the DOFs with lower but not negligible barriers is enhanced by ITS. The performance of ITS-TAMD to three systems in the processes with hidden barriers has been examined. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least five times even if in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of necessary RCs can be reduced. Our work shows more potential applications of the ITS-TAMD method as the efficient and powerful tool for the investigation of a broad range of interesting cases.

  18. Efficient free energy calculations by combining two complementary tempering sampling methods

    NASA Astrophysics Data System (ADS)

    Xie, Liangxu; Shen, Lin; Chen, Zhe-Ning; Yang, Mingjun

    2017-01-01

    Although energy barriers can be efficiently crossed in the reaction coordinate (RC) guided sampling, this type of method suffers from identification of the correct RCs or requirements of high dimensionality of the defined RCs for a given system. If only the approximate RCs with significant barriers are used in the simulations, hidden energy barriers with small to medium height would exist in other degrees of freedom (DOFs) relevant to the target process and consequently cause the problem of insufficient sampling. To address the sampling in this so-called hidden barrier situation, here we propose an effective approach to combine temperature accelerated molecular dynamics (TAMD), an efficient RC-guided sampling method, with the integrated tempering sampling (ITS), a generalized ensemble sampling method. In this combined ITS-TAMD method, the sampling along the major RCs with high energy barriers is guided by TAMD and the sampling of the rest of the DOFs with lower but not negligible barriers is enhanced by ITS. The performance of ITS-TAMD to three systems in the processes with hidden barriers has been examined. In comparison to the standalone TAMD or ITS approach, the present hybrid method shows three main improvements. (1) Sampling efficiency can be improved at least five times even if in the presence of hidden energy barriers. (2) The canonical distribution can be more accurately recovered, from which the thermodynamic properties along other collective variables can be computed correctly. (3) The robustness of the selection of major RCs suggests that the dimensionality of necessary RCs can be reduced. Our work shows more potential applications of the ITS-TAMD method as the efficient and powerful tool for the investigation of a broad range of interesting cases.

  19. TECHNICAL MANUAL: A SURVEY OF EQUIPMENT AND METHODS FOR PARTICULATE SAMPLING IN INDUSTRIAL PROCESS STREAMS

    EPA Science Inventory

    The manual lists and describes the instruments and techniques that are available for measuring the concentration or size distribution of particles suspended in process streams. The standard, official, well established methods are described as well as some experimental methods and...

  20. Preservation And Processing Methods For Molecular Genetic Detection And Quantification Of Nosema Ceranae

    USDA-ARS?s Scientific Manuscript database

    The prevalence of Nosema ceranae in managed honey bee colonies has increased dramatically in the past 10 – 20 years worldwide. A variety of genetic testing methods for species identification and prevalence are now available. However sample size and preservation method of samples prior to testing hav...

  1. [Validation of measurement methods and estimation of uncertainty of measurement of chemical agents in the air at workstations].

    PubMed

    Dobecki, Marek

    2012-01-01

    This paper reviews the requirements for measurement methods of chemical agents in the air at workstations. European standards, which have the status of Polish standards, comprise requirements and information on sampling strategy, measuring techniques, types of samplers, sampling pumps and methods of occupational exposure evaluation for a given technological process. Measurement methods, including air sampling and the analytical procedure in the laboratory, should be appropriately validated before intended use. In the validation process, selected methods are tested and a budget of uncertainty is set up. The validation procedure to be implemented in the laboratory, together with suitable statistical tools and the major components of uncertainty to be taken into consideration, is presented in this paper. Methods of quality control, including sampling and laboratory analyses, are discussed. The relative expanded uncertainty for each measurement, expressed as a percentage, should not exceed the limit values set depending on the type of occupational exposure (short-term or long-term) and the magnitude of exposure to chemical agents in the work environment.
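
The relative expanded uncertainty mentioned in the closing sentence is conventionally obtained by combining the standard uncertainty components in quadrature, applying a coverage factor k (k = 2 for roughly 95% coverage), and expressing the result relative to the measured value. A sketch with hypothetical component values:

```python
import math

def relative_expanded_uncertainty(value, components, k=2.0):
    """Relative expanded uncertainty in percent.

    components: standard uncertainties (same units as value), assumed independent,
    so they combine in quadrature; k is the coverage factor.
    """
    u_combined = math.sqrt(sum(u ** 2 for u in components))  # quadrature sum
    return 100.0 * k * u_combined / value

# hypothetical sampling, analytical and calibration components for a 10 mg/m3 result
print(round(relative_expanded_uncertainty(10.0, [0.3, 0.4, 0.2]), 1))  # 10.8 %
```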

  2. A Fast Multiple Sampling Method for Low-Noise CMOS Image Sensors With Column-Parallel 12-bit SAR ADCs

    PubMed Central

    Kim, Min-Kyu; Hong, Seong-Kwan; Kwon, Oh-Kyong

    2015-01-01

    This paper presents a fast multiple sampling method for low-noise CMOS image sensor (CIS) applications with column-parallel successive approximation register analog-to-digital converters (SAR ADCs). The 12-bit SAR ADC using the proposed multiple sampling method decreases the A/D conversion time by repeatedly converting a pixel output to 4-bit after the first 12-bit A/D conversion, reducing noise of the CIS by one over the square root of the number of samplings. The area of the 12-bit SAR ADC is reduced by using a 10-bit capacitor digital-to-analog converter (DAC) with four scaled reference voltages. In addition, a simple up/down counter-based digital processing logic is proposed to perform complex calculations for multiple sampling and digital correlated double sampling. To verify the proposed multiple sampling method, a 256 × 128 pixel array CIS with 12-bit SAR ADCs was fabricated using 0.18 μm CMOS process. The measurement results show that the proposed multiple sampling method reduces each A/D conversion time from 1.2 μs to 0.45 μs and random noise from 848.3 μV to 270.4 μV, achieving a dynamic range of 68.1 dB and an SNR of 39.2 dB. PMID:26712765
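
The noise figures above follow the averaging model stated in the abstract: uncorrelated random noise falls as one over the square root of the number of samplings. A sketch of that ideal model (the measured 270.4 μV differs somewhat from the prediction, since real noise is never perfectly uncorrelated and the number of samplings is not given here):

```python
import math

def averaged_noise(single_sampling_noise_uv, n_samplings):
    """Ideal random-noise level (same units) after averaging n uncorrelated samplings."""
    return single_sampling_noise_uv / math.sqrt(n_samplings)

# starting from the paper's single-sampling figure of 848.3 uV;
# 9 samplings is a hypothetical count for illustration
print(round(averaged_noise(848.3, 9), 1))  # 282.8 uV predicted
```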

  3. Aqueous Cleaning and Validation for Space Shuttle Propulsion Hardware at the White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy

    1999-01-01

    The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for removal of organic compounds and particulate. The cleanliness is verified by determining the total organic carbon (TOC) content and filtration with particulate counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR) was performed. Particulate sampling on cleaned batches of hardware that were randomly separated and sampled by the two methods was performed. This paper presents the approach and results, and discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, while describing the approach for implementing aqueous techniques on Space Shuttle Propulsion hardware.

  4. Twenty-first century brain banking. Processing brains for research: the Columbia University methods

    PubMed Central

    del Amaya, Maria Pilar; Keller, Christian E.

    2007-01-01

    Carefully categorized postmortem human brains are crucial for research. The lack of generally accepted methods for processing human postmortem brains for research persists. Thus, brain banking is essential; however, it cannot be achieved at the cost of the teaching mission of the academic institution by routing brains away from residency programs, particularly when the autopsy rate is steadily decreasing. A consensus must be reached whereby a brain can be utilizable for diagnosis, research, and teaching. The best diagnostic categorization possible must be secured and the yield of samples for basic investigation maximized. This report focuses on integrated, novel methods currently applied at the New York Brain Bank, Columbia University, New York, which are designed to reach accurate neuropathological diagnosis, optimize the yield of samples, and process fresh-frozen samples suitable for a wide range of modern investigations. The brains donated for research are processed as soon as possible after death. The prosector must have a good command of the neuroanatomy, neuropathology, and the protocol. One half of each brain is immersed in formalin for performing the thorough neuropathologic evaluation, which is combined with the teaching task. The contralateral half is extensively dissected at the fresh state. The anatomical origin of each sample is recorded using the map of Brodmann for the cortical samples. The samples are frozen at −160°C, barcode labeled, and ready for immediate disbursement once categorized diagnostically. A rigorous organization of freezer space, coupled to an electronic tracking system with its attached software, fosters efficient access for retrieval within minutes of any specific frozen samples in storage. This report describes how this achievement is feasible with emphasis on the actual processing of brains donated for research. PMID:17985145

  5. Nonlinear inversion of electrical resistivity imaging using pruning Bayesian neural networks

    NASA Astrophysics Data System (ADS)

    Jiang, Fei-Bo; Dai, Qian-Wei; Dong, Li

    2016-06-01

    Conventional artificial neural networks used to solve the electrical resistivity imaging (ERI) inversion problem suffer from overfitting and local minima. To solve these problems, we propose a pruning Bayesian neural network (PBNN) nonlinear inversion method and a sample design method based on the K-medoids clustering algorithm. In the sample design method, the training samples of the neural network are designed according to the prior information provided by the K-medoids clustering results; thus, the training process of the neural network is well guided. The proposed PBNN, based on Bayesian regularization, is used to select the hidden layer structure by assessing the effect of each hidden neuron on the inversion results. Then, the hyperparameter αk, which is based on the generalized mean, is chosen to guide the pruning process according to the prior distribution of the training samples under the small-sample condition. The proposed algorithm is more efficient than other common adaptive regularization methods in geophysics. The inversion of synthetic and field data suggests that the proposed method suppresses the noise in the neural network training stage and enhances the generalization. The inversion results with the proposed method are better than those of the BPNN, RBFNN, and RRBFNN inversion methods as well as the conventional least squares inversion.

  6. Instrumental methods of analysis of sulfur compounds in synfuel process streams. Quarterly technical progress report, July-September 1984

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jordan, J.; Talbott, J.

    1984-01-01

    Task 1. Methods development for the speciation of the polysulfides. Work on this task was completed in December 1983 and reported accordingly in DOE/PC/40783-T13. Task 2. Methods development for the speciation of dithionite and polythionates. Work on Task 2 was completed in June 1984 and has been reported accordingly in DOE/PC/40783-T15. Task 3. Total accounting of the sulfur balance in representative samples of synfuel process streams. A systematic and critical comparison of results, obtained in the analysis of sulfur moieties in representative samples of coal conversion process streams, revealed the following general trends. (a) In specimens of high pH (9-10) and low redox potential (-0.3 to -0.4 volt versus NHE), sulfidic and polysulfidic sulfur moieties predominate. (b) In process streams of lower pH and more positive redox potential, higher oxidation states of sulfur (notably sulfate) account for most of the total sulfur present. (c) Oxidative wastewater treatment procedures by the PETC stripping process convert lower oxidation states of sulfur into thiosulfate and sulfate. In this context, remarkable similarities were observed between liquefaction and gasification process streams. However, the thiocyanate present in samples from the Grand Forks gasifier was impervious to the PETC stripping process. (d) Total sulfur contaminant levels in coal conversion process stream wastewater samples are determined primarily by the abundance of sulfur in the coal used as starting material rather than by the nature of the conversion process (liquefaction or gasification). 13 references.

  7. A method for determining the weak statistical stationarity of a random process

    NASA Technical Reports Server (NTRS)

    Sadeh, W. Z.; Koper, C. A., Jr.

    1978-01-01

    A method for determining the weak statistical stationarity of a random process is presented. The core of this testing procedure consists of generating an equivalent ensemble which approximates a true ensemble. Formation of an equivalent ensemble is accomplished through segmenting a sufficiently long time history of a random process into equal, finite, and statistically independent sample records. The weak statistical stationarity is ascertained based on the time invariance of the equivalent-ensemble averages. Comparison of these averages with their corresponding time averages over a single sample record leads to a heuristic estimate of the ergodicity of a random process. Specific variance tests are introduced for evaluating the statistical independence of the sample records, the time invariance of the equivalent-ensemble autocorrelations, and the ergodicity. Examination and substantiation of these procedures were conducted utilizing turbulent velocity signals.

  8. Ecology of Arcobacter species in chicken rearing and processing.

    PubMed

    Gude, A; Hillman, T J; Helps, C R; Allen, V M; Corry, J E L

    2005-01-01

    To investigate whether Arcobacter spp. colonize the poultry-rearing environment or whether they are contaminants acquired during transportation and/or from the processing plant. Samples were collected on poultry farms and in the processing plant during slaughter and dressing. Two cultural methods of detection were used. Isolates were identified to species level using a multiplex-polymerase chain reaction (m-PCR) method, either on the initial suspensions, or after enrichment, or on pure cultures of isolates. Of the 62 samples examined from poultry farms, arcobacters were found only outside the rearing sheds (in effluent sludge and stagnant water). Thirty-four samples were examined from the processing plant and 26 were positive for arcobacters. All the isolates were Arcobacter butzleri. Arcobacters were not found in any sample either by direct plating or by m-PCR on the initial suspensions; it was therefore concluded that numbers were very low. Arcobacter spp. were not found in samples from the live birds and their immediate environment, but A. butzleri was found in effluent sludge and stagnant water outside the rearing sheds. However, A. butzleri is common in poultry abattoirs, and it appears that poultry carcasses are contaminated during processing. Arcobacters are not found inside poultry-rearing sheds, but are contaminants in the processing environment.

  9. 40 CFR 412.37 - Additional measures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... STANDARDS CONCENTRATED ANIMAL FEEDING OPERATIONS (CAFO) POINT SOURCE CATEGORY Dairy Cows and Cattle Other... application; (4) Test methods used to sample and analyze manure, litter, process waste water, and soil; (5) Results from manure, litter, process waste water, and soil sampling; (6) Explanation of the basis for...

  10. 40 CFR 412.37 - Additional measures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... STANDARDS CONCENTRATED ANIMAL FEEDING OPERATIONS (CAFO) POINT SOURCE CATEGORY Dairy Cows and Cattle Other... application; (4) Test methods used to sample and analyze manure, litter, process waste water, and soil; (5) Results from manure, litter, process waste water, and soil sampling; (6) Explanation of the basis for...

  11. 40 CFR 412.37 - Additional measures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... STANDARDS CONCENTRATED ANIMAL FEEDING OPERATIONS (CAFO) POINT SOURCE CATEGORY Dairy Cows and Cattle Other... application; (4) Test methods used to sample and analyze manure, litter, process waste water, and soil; (5) Results from manure, litter, process waste water, and soil sampling; (6) Explanation of the basis for...

  12. Effect of Dephytinization by Fermentation and Hydrothermal Autoclaving Treatments on the Antioxidant Activity, Dietary Fiber, and Phenolic Content of Oat Bran.

    PubMed

    Özkaya, H; Özkaya, B; Duman, B; Turksoy, S

    2017-07-19

    Fermentation and hydrothermal methods were tested to reduce the phytic acid (PA) content of oat bran, and the effects of these methods on the dietary fiber (DF) and total phenolic (TP) contents as well as the antioxidant activity (AA) were also investigated. Fermentation with 6% yeast for 6 h resulted in an 88.2% reduction in PA content, while incubation for 6 h without yeast addition resulted in only a 32.5% reduction. The PA loss in the autoclaved oat bran sample (1.5 h, pH 4.0) was 95.2%, while it was at most 41.8% in the sample autoclaved without pH adjustment. With both methods, the soluble, insoluble, and total DF contents of the samples were remarkably higher than those of the control samples. The two processes also increased the TP content of the oat bran samples, by 17% and 39%, respectively, and the AA values by 8% and 15%. Among all samples, the autoclaving process resulted in the lowest PA content and the greatest amount of bioactive compounds.

  13. Automated grain extraction and classification by combining improved region growing segmentation and shape descriptors in electromagnetic mill classification system

    NASA Astrophysics Data System (ADS)

    Budzan, Sebastian

    2018-04-01

    In this paper, an automatic method of grain detection and classification is presented. As input, it uses a single digital image of copper ore from the milling process, obtained with a high-quality digital camera. Grinding is an extremely energy- and cost-consuming process, so granularity evaluation should be performed efficiently and quickly. The method proposed in this paper is based on three-stage image processing. First, all grains are detected using Seeded Region Growing (SRG) segmentation with a proposed adaptive thresholding based on the calculation of the Relative Standard Deviation (RSD). In the next step, the detection results are improved using information about the shape of the detected grains obtained from a distance map. Finally, each grain in the sample is classified into one of the predefined granularity classes. The quality of the proposed method was evaluated using samples of nominal granularity, with a comparison to other methods.
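    The first stage, region growing with an RSD-driven adaptive threshold, can be sketched on a toy image as follows (a generic illustration; the threshold scale and floor are assumptions, not the paper's exact formula):

```python
from collections import deque

def rsd(values):
    """Relative standard deviation (std/mean) of the region's pixel values."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return (var ** 0.5) / m if m else 0.0

def seeded_region_growing(image, seed, scale=2.0, floor=0.05):
    """Grow a region from `seed`: a 4-connected neighbour joins while it lies
    within an adaptive tolerance that tracks the region's current RSD."""
    rows, cols = len(image), len(image[0])
    region = {seed}
    values = [image[seed[0]][seed[1]]]
    frontier = deque([seed])
    while frontier:
        r, c = frontier.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in region:
                mean = sum(values) / len(values)
                # Adaptive tolerance: scales with the region RSD, with a
                # floor (both scale and floor are illustrative choices).
                tol = max(scale * rsd(values) * mean, floor * mean)
                if abs(image[nr][nc] - mean) <= tol:
                    region.add((nr, nc))
                    values.append(image[nr][nc])
                    frontier.append((nr, nc))
    return region
```

    A later classification stage would then bin each extracted region, for example by its pixel count, into one of the predefined granularity classes.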

  14. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    PubMed Central

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Summary Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328

  15. Network Model-Assisted Inference from Respondent-Driven Sampling Data.

    PubMed

    Gile, Krista J; Handcock, Mark S

    2015-06-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.
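    The role of design weights in RDS inference can be illustrated with the simpler inverse-degree (Volz-Heckathorn style) estimator; the model-assisted estimator in the paper generalizes this idea with a working network model. The toy data below are assumptions:

```python
def inverse_degree_mean(values, degrees):
    """Weight each respondent by 1/degree: well-connected people are recruited
    more often over the network, so they are down-weighted."""
    numerator = sum(v / d for v, d in zip(values, degrees))
    denominator = sum(1.0 / d for d in degrees)
    return numerator / denominator
```

    If a trait is concentrated among high-degree respondents, the naive sample mean overstates its prevalence, and the weighted estimate pulls it back down.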

  16. The Use of the Visualisation of Multidimensional Data Using PCA to Evaluate Possibilities of the Division of Coal Samples Space Due to their Suitability for Fluidised Gasification

    NASA Astrophysics Data System (ADS)

    Jamróz, Dariusz; Niedoba, Tomasz; Surowiak, Agnieszka; Tumidajski, Tadeusz

    2016-09-01

    Methods that visualise multidimensional data by transforming multidimensional space into two-dimensional space make it possible to present multidimensional data on a computer screen. Thanks to this, qualitative analysis of the data can be performed in the most natural way for humans, through the sense of sight. An example of such a multidimensional data visualisation method is principal component analysis (PCA). This method was used in this work to present and analyse a set of seven-dimensional data (seven selected properties) describing coal samples obtained from the Janina and Wieczorek coal mines. Coal from these mines was previously subjected to separation by means of a laboratory ring jig consisting of ten rings; five layers of each type of coal (of two rings each) were obtained in this way. It was decided to check whether this method of multidimensional data visualisation makes it possible to divide the space of the samples into areas of different suitability for the fluidised gasification process. To that end, the card of technological suitability of coal was used (Sobolewski et al., 2012; 2013), in which key, relevant and additional parameters affecting the gasification process are described. As a result of the analyses, it was stated that effective determination of the suitability of coal samples for the on-surface gasification process in a fluidised reactor is possible. The PCA method enables visualisation of the optimal subspace satisfying the set requirements concerning the properties of coals intended for this process.
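    The PCA projection used here can be sketched in a few lines: center the data, find the two leading principal directions by power iteration, and map each seven-dimensional sample to a 2-D point (a generic sketch; the synthetic data in the usage are an assumption):

```python
import random

def center(data):
    """Subtract the per-dimension mean from every row."""
    n, d = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(d)]
    return [[row[j] - means[j] for j in range(d)] for row in data]

def leading_direction(data, n_iter=200, seed=0):
    """Leading eigenvector of X'X via power iteration (no covariance formed)."""
    rng = random.Random(seed)
    d = len(data[0])
    v = [rng.gauss(0, 1) for _ in range(d)]
    for _ in range(n_iter):
        scores = [sum(x * vi for x, vi in zip(row, v)) for row in data]
        w = [sum(row[j] * s for row, s in zip(data, scores)) for j in range(d)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

def pca_2d(data):
    """Project rows of `data` onto the first two principal components."""
    x = center(data)
    v1 = leading_direction(x)
    deflated = []
    for row in x:
        s = sum(a * b for a, b in zip(row, v1))
        deflated.append([a - s * b for a, b in zip(row, v1)])
    v2 = leading_direction(deflated, seed=1)
    return [(sum(a * b for a, b in zip(row, v1)),
             sum(a * b for a, b in zip(row, v2))) for row in x]
```

    Plotting the resulting 2-D points, coloured by jig layer, is then enough to inspect visually whether the sample space separates into areas of different gasification suitability.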

  17. Development, validation, and application of a novel LC-MS/MS trace analysis method for the simultaneous quantification of seven iodinated X-ray contrast media and three artificial sweeteners in surface, ground, and drinking water.

    PubMed

    Ens, Waldemar; Senner, Frank; Gygax, Benjamin; Schlotterbeck, Götz

    2014-05-01

    A new method for the simultaneous determination of iodinated X-ray contrast media (ICM) and artificial sweeteners (AS) by liquid chromatography-tandem mass spectrometry (LC-MS/MS) operated in positive and negative ionization switching mode was developed. The method was validated for surface, ground, and drinking water samples. In order to gain higher sensitivities, a 10-fold sample enrichment step using a Genevac EZ-2 plus centrifugal vacuum evaporator that provided excellent recoveries (90 ± 6 %) was selected for sample preparation. Limits of quantification below 10 ng/L were obtained for all compounds. Furthermore, sample preparation recoveries and matrix effects were investigated thoroughly for all matrix types. Considerable matrix effects were observed in surface water and could be compensated by the use of four stable isotope-labeled internal standards. Due to their persistence, fractions of diatrizoic acid, iopamidol, and acesulfame could pass the whole drinking water production process and were also observed in drinking water. To monitor the fate and occurrence of these compounds, the validated method was applied to samples from different stages of the drinking water production process of the Industrial Works of Basel (IWB). Diatrizoic acid was found to be the most persistent compound, eliminated by just 40 % during the whole drinking water treatment process, followed by iopamidol (80 % elimination) and acesulfame (85 % elimination). All other compounds were completely retained and/or degraded in the soil and thus were not detected in groundwater. Additionally, a direct injection method without sample preparation achieving 3-20 ng/L limits of quantification was compared to the developed method.

  18. Development and validation of a simplified titration method for monitoring volatile fatty acids in anaerobic digestion.

    PubMed

    Sun, Hao; Guo, Jianbin; Wu, Shubiao; Liu, Fang; Dong, Renjie

    2017-09-01

    The volatile fatty acid (VFA) concentration has been considered one of the most sensitive process performance indicators in the anaerobic digestion (AD) process. However, accurate determination of VFA concentrations in AD processes normally requires advanced equipment and complex pretreatment procedures. A simplified method with fewer sample pretreatment procedures and improved accuracy is greatly needed, particularly for on-site application. This report outlines improvements to the Nordmann method, one of the most popular titrations used for VFA monitoring. The influence of ion and solid interfering subsystems in titrated samples on the accuracy of results is discussed. The total solid content in titrated samples was the main factor affecting accuracy in VFA monitoring. Moreover, a high linear correlation was established between the total solids content and the difference in VFA measurements between the traditional Nordmann equation and gas chromatography (GC). Accordingly, a simplified titration method was developed and validated using a semi-continuous experiment of chicken manure anaerobic digestion at various organic loading rates. The good fit of the results obtained by this method to the GC results strongly supports the potential application of this method to VFA monitoring. Copyright © 2017. Published by Elsevier Ltd.
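    The correction implied by such a linear correlation can be sketched as a calibration step: fit the titration-minus-GC difference against total solids by least squares, then subtract the fitted bias from new titration readings. This is an illustrative sketch with made-up numbers, not the paper's actual calibration:

```python
def linfit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

def calibrate_titration(ts, vfa_titr, vfa_gc):
    """Fit the titration-vs-GC difference against total solids (TS), then
    return a function correcting new titration readings at a given TS."""
    bias = [t - g for t, g in zip(vfa_titr, vfa_gc)]
    slope, intercept = linfit(ts, bias)
    return lambda vfa, ts_val: vfa - (slope * ts_val + intercept)
```

    With paired titration/GC measurements over a range of total solids contents, this kind of correction lets the cheap on-site titration track the reference GC values.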

  19. The effect of processing on the surface physical stability of amorphous solid dispersions.

    PubMed

    Yang, Ziyi; Nollenberger, Kathrin; Albers, Jessica; Moffat, Jonathan; Craig, Duncan; Qi, Sheng

    2014-11-01

    The focus of this study was to investigate the effect of processing on the surface crystallization of amorphous molecular dispersions and gain insight into the mechanisms underpinning this effect. The model systems, amorphous molecular dispersions of felodipine-EUDRAGIT® E PO, were processed both using spin coating (an ultra-fast solvent evaporation based method) and hot melt extrusion (HME) (a melting based method). Amorphous solid dispersions with drug loadings of 10-90% (w/w) were obtained by both processing methods. Samples were stored under 75% RH/room temperatures for up to 10months. Surface crystallization was observed shortly after preparation for the HME samples with high drug loadings (50-90%). Surface crystallization was characterized by powder X-ray diffraction (PXRD), ATR-FTIR spectroscopy and imaging techniques (SEM, AFM and localized thermal analysis). Spin coated molecular dispersions showed significantly higher surface physical stability than hot melt extruded samples. For both systems, the progress of the surface crystal growth followed zero order kinetics on aging. Drug enrichment at the surfaces of HME samples on aging was observed, which may contribute to surface crystallization of amorphous molecular dispersions. In conclusion it was found the amorphous molecular dispersions prepared by spin coating had a significantly higher surface physical stability than the corresponding HME samples, which may be attributed to the increased process-related apparent drug-polymer solubility and reduced molecular mobility due to the quenching effect caused by the rapid solvent evaporation in spin coating. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Pesticide-sampling equipment, sample-collection and processing procedures, and water-quality data at Chicod Creek, North Carolina, 1992

    USGS Publications Warehouse

    Manning, T.K.; Smith, K.E.; Wood, C.D.; Williams, J.B.

    1994-01-01

    Water-quality samples were collected from Chicod Creek in the Coastal Plain Province of North Carolina during the summer of 1992 as part of the U.S. Geological Survey's National Water-Quality Assessment Program. Chicod Creek is in the Albemarle-Pamlico drainage area, one of four study units designated to test equipment and procedures for collecting and processing samples for the solid-phase extraction of selected pesticides. The equipment and procedures were used to isolate 47 pesticides, including organonitrogen, carbamate, organochlorine, organophosphate, and other compounds, targeted to be analyzed by gas chromatography/mass spectrometry. Sample-collection and processing equipment, equipment cleaning and set-up procedures, methods pertaining to collecting, splitting, and solid-phase extraction of samples, and water-quality data resulting from the field test are presented in this report. Most problems encountered during this intensive sampling exercise were operational difficulties relating to equipment used to process samples.

  1. Selected problems with boron determination in water treatment processes. Part I: comparison of the reference methods for ICP-MS and ICP-OES determinations.

    PubMed

    Kmiecik, Ewa; Tomaszewska, Barbara; Wątor, Katarzyna; Bodzek, Michał

    2016-06-01

    The aim of the study was to compare two reference methods for the determination of boron in water samples and to assess the impact of the method of sample preparation on the results obtained. Samples were collected during different desalination processes: ultrafiltration and a double reverse osmosis system connected in series. From each point, samples were prepared in four different ways: the first was filtered (through a 0.45 μm membrane filter) and acidified (using 1 mL of ultrapure nitric acid per 100 mL of sample) (FA); the second was unfiltered and not acidified (UFNA); the third was filtered but not acidified (FNA); and the fourth was unfiltered but acidified (UFA). All samples were analysed using two analytical methods: inductively coupled plasma mass spectrometry (ICP-MS) and inductively coupled plasma optical emission spectrometry (ICP-OES). The results obtained were compared and correlated, and the differences between them were studied. The results show that there are statistically significant differences between the concentrations obtained using the ICP-MS and ICP-OES techniques regardless of the method of sample preparation (filtration and preservation). Both the ICP-MS and ICP-OES methods can be used for determination of the boron concentration in water. The differences in the boron concentrations obtained using these two methods can be caused by high-level concentrations in selected whole-water digestates and by matrix effects. Higher concentrations of iron (1-20 mg/L) than of chromium (0.02-1 mg/L) in the analysed samples can influence boron determination; when iron concentrations are high, the emission spectrum appears as a doubled, overlapping peak.

  2. Developmental validation of a Nextera XT mitogenome Illumina MiSeq sequencing method for high-quality samples.

    PubMed

    Peck, Michelle A; Sturk-Andreaggi, Kimberly; Thomas, Jacqueline T; Oliver, Robert S; Barritt-Ross, Suzanne; Marshall, Charla

    2018-05-01

    Generating mitochondrial genome (mitogenome) data from reference samples in a rapid and efficient manner is critical to harnessing the greater power of discrimination of the entire mitochondrial DNA (mtDNA) marker. The method of long-range target enrichment, Nextera XT library preparation, and Illumina sequencing on the MiSeq is a well-established technique for generating mitogenome data from high-quality samples. To this end, a validation was conducted for this mitogenome method processing up to 24 samples simultaneously along with analysis in the CLC Genomics Workbench and utilizing the AQME (AFDIL-QIAGEN mtDNA Expert) tool to generate forensic profiles. This validation followed the Federal Bureau of Investigation's Quality Assurance Standards (QAS) for forensic DNA testing laboratories and the Scientific Working Group on DNA Analysis Methods (SWGDAM) validation guidelines. The evaluation of control DNA, non-probative samples, blank controls, mixtures, and nonhuman samples demonstrated the validity of this method. Specifically, the sensitivity was established at ≥25 pg of nuclear DNA input for accurate mitogenome profile generation. Unreproducible low-level variants were observed in samples with low amplicon yields. Further, variant quality was shown to be a useful metric for identifying sequencing error and crosstalk. Success of this method was demonstrated with a variety of reference sample substrates and extract types. These studies further demonstrate the advantages of using NGS techniques by highlighting the quantitative nature of heteroplasmy detection. The results presented herein from more than 175 samples processed in ten sequencing runs, show this mitogenome sequencing method and analysis strategy to be valid for the generation of reference data. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Detection of Acetone Processing of Castor Bean Mash for Forensic Investigation of Ricin Preparation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreuzer-Martin, Helen W.; Wahl, Jon H.; Metoyer, Candace N.

    The toxic protein ricin is of concern as a potential biological threat agent (BTA). Recently, several samples of ricin have been seized in connection with biocriminal activity. Analytical methods are needed that enable federal investigators to determine how the samples were prepared, to match seized samples to potential source materials, and to identify samples that may have been prepared by the same method using the same source materials. One commonly described crude ricin preparation method is acetone extraction of crushed castor beans. Here we describe the use of solid-phase microextraction and headspace analysis of crude ricin preparation samples to determine whether they were processed by acetone extraction. In all cases, acetone-extracted bean mash could be distinguished from un-extracted mash or mash extracted with other organic solvents. Statistical analysis showed that storage in closed containers for up to 109 days had no effect on acetone signal intensity. Signal intensity in acetone-extracted mash decreased during storage in open containers, but extracted mash could still be distinguished from un-extracted mash after 94 days.

  4. DNA Everywhere. A Guide for Simplified Environmental Genomic DNA Extraction Suitable for Use in Remote Areas

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabrielle N. Pecora; Francine C. Reid; Lauren M. Tom

    2016-05-01

    Collecting field samples from remote or geographically distant areas can be financially and logistically challenging. With the participation of a local organization where the samples originate, gDNA can be extracted in the field and shipped to a research institution for further processing and analysis. The ability to set up gDNA extraction capabilities in the field can drastically reduce cost and time when running long-term microbial studies with a large sample set. The work outlined here developed a compact and affordable method for setting up a "laboratory" and extracting and shipping gDNA samples from anywhere in the world. This white paper explains the process of setting up the "laboratory", choosing and training individuals with no prior scientific experience to perform gDNA extractions, and safe methods for shipping extracts to any research institution. All methods have been validated by the Andersen group at Lawrence Berkeley National Laboratory using the Berkeley Lab PhyloChip.

  5. On the enhanced sampling over energy barriers in molecular dynamics simulations.

    PubMed

    Gao, Yi Qin; Yang, Lijiang

    2006-09-21

    We present here calculations of free energies of multidimensional systems using an efficient sampling method. The method uses a transformed potential energy surface, which allows an efficient sampling of both low and high energy spaces and accelerates transitions over barriers. It allows efficient sampling of the configuration space over and only over the desired energy range(s). It does not require predetermined or selected reaction coordinate(s). We apply this method to study the dynamics of slow barrier crossing processes in a disaccharide and a dipeptide system.
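    The idea of sampling on a transformed surface can be illustrated with a one-dimensional Metropolis sketch: cap the barrier of a double-well potential so crossings accelerate, then (in a real application) reweight each sample by exp(-β(V − V_transformed)) to recover averages on the original surface. The potential, cap, and parameters below are illustrative assumptions, not the authors' transformation:

```python
import math
import random

def double_well(x):
    """Original surface: minima at x = ±1 separated by a barrier of height 5."""
    return 5.0 * (x * x - 1.0) ** 2

def flattened(x, cap=2.0):
    """Transformed surface: the barrier region is capped so transitions over
    it are accelerated; the outer quartic walls are kept."""
    v = double_well(x)
    return min(v, cap) if abs(x) < 1.0 else v

def metropolis(potential, n_steps, beta=1.0, step=0.4, seed=3):
    """Plain Metropolis sampling of exp(-beta * potential)."""
    rng = random.Random(seed)
    x, traj = -1.0, []
    for _ in range(n_steps):
        y = x + rng.uniform(-step, step)
        dv = potential(y) - potential(x)
        if dv <= 0 or rng.random() < math.exp(-beta * dv):
            x = y
        traj.append(x)
    return traj
```

    A walk on `flattened` visits both wells in a modest number of steps, whereas on `double_well` barrier crossings are rare at the same temperature; reweighting the flattened trajectory restores Boltzmann averages for the original surface.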

  6. Gaussian process based intelligent sampling for measuring nano-structure surfaces

    NASA Astrophysics Data System (ADS)

    Sun, L. J.; Ren, M. J.; Yin, Y. H.

    2016-09-01

    Nanotechnology is the science and engineering of manipulating matter at the nano scale, which can be used to create many new materials and devices with a vast range of applications. As nanotechnology products increasingly enter the commercial marketplace, nanometrology becomes a stringent and enabling technology for the manipulation and quality control of nanotechnology. However, many measuring instruments, for instance scanning probe microscopes, are limited to relatively small areas of hundreds of micrometers with very low efficiency. Therefore, intelligent sampling strategies are required to improve the scanning efficiency for measuring large areas. This paper presents a Gaussian process based intelligent sampling method to address this problem. The method uses Gaussian process based Bayesian regression as a mathematical foundation to represent the surface geometry, and the posterior estimate of the Gaussian process is computed by combining the prior probability distribution with the maximum likelihood function. Each sampling point is then adaptively selected by determining the position that is most likely to lie outside the required tolerance zone among the candidates, and is inserted to update the model iteratively. Simulations on both nominal and manufactured nano-structure surfaces have been conducted to verify the validity of the proposed method. The results imply that the proposed method significantly improves the measurement efficiency for large structured surfaces.
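    A minimal version of the adaptive selection loop: fit a GP to the points measured so far, then pick the candidate position where the posterior is most uncertain as the next measurement. This substitutes maximum posterior variance for the paper's tolerance-zone criterion; the RBF kernel, length scale, and toy data are assumptions:

```python
import math

def rbf(a, b, length=0.3):
    """Squared-exponential kernel for 1-D positions."""
    return math.exp(-(a - b) ** 2 / (2 * length ** 2))

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting (small n)."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c] for c in range(i + 1, n))) / M[i][i]
    return x

def gp_posterior(xs, ys, x_star, noise=1e-6):
    """GP posterior mean and variance at x_star given samples (xs, ys)."""
    n = len(xs)
    K = [[rbf(xs[i], xs[j]) + (noise if i == j else 0.0) for j in range(n)]
         for i in range(n)]
    k_star = [rbf(x, x_star) for x in xs]
    alpha = solve(K, ys)
    mean = sum(ks * a for ks, a in zip(k_star, alpha))
    v = solve(K, k_star)
    var = rbf(x_star, x_star) - sum(ks * vi for ks, vi in zip(k_star, v))
    return mean, var

def next_sample_point(xs, ys, candidates):
    """Pick the candidate with the largest posterior variance (most uncertain)."""
    return max(candidates, key=lambda c: gp_posterior(xs, ys, c)[1])
```

    Each newly measured point is appended to `xs`/`ys` and the loop repeats, so the probe spends its time where the surface model is least certain instead of on a dense uniform grid.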

  7. Acetobacter strains isolated during the acetification of blueberry (Vaccinium corymbosum L.) wine.

    PubMed

    Hidalgo, C; García, D; Romero, J; Mas, A; Torija, M J; Mateo, E

    2013-09-01

    Highbush blueberries (Vaccinium corymbosum L.) are known to have positive health benefits. The production of blueberry vinegar is one method to preserve this seasonal fruit and allow extended consumption. In this study, blueberry wine acetification was performed with naturally occurring micro-organisms and with an inoculated Acetobacter cerevisiae strain. Acetifications were carried out in triplicate using the Schützenbach method. The successful spontaneous processes took up to 66% more time than the processes involving inoculation. The isolation of acetic acid bacteria (AAB) and the analysis of these AAB using molecular methods allowed the identification of the main genotypes responsible for blueberry acetification. Although the Acet. cerevisiae strain was the predominant strain isolated from the inoculated process samples, Acetobacter pasteurianus was isolated from samples for both processes and was the only species present in the spontaneous acetification samples. To the best of our knowledge, this is the first report describing the identification and variability of AAB isolated during blueberry acetification. The isolated Acet. pasteurianus strains could be used for large-scale blueberry vinegar production or as a starter culture in studies of other vinegar production methods. © 2013 The Society for Applied Microbiology.

  8. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site-Working towards a toolbox for better assessment.

    PubMed

    Wroble, Julie; Frederick, Timothy; Frame, Alicia; Vallero, Daniel

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)'s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete ("grab") samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples.
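    The variability comparison behind these conclusions rests on the relative standard deviation (coefficient of variation); a minimal sketch with illustrative replicate values:

```python
def rsd_percent(values):
    """Relative standard deviation (coefficient of variation) in percent,
    using the sample (n-1) variance of replicate analyses."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return 100.0 * (var ** 0.5) / mean

def more_variable(method_a, method_b):
    """Compare replicate results from two methods by their RSD."""
    return "A" if rsd_percent(method_a) > rsd_percent(method_b) else "B"
```

    Applied to replicate asbestos (or metals) results, a markedly higher RSD for one sampling/analysis combination, as reported for the FBAS/TEM route, flags it as the less repeatable method even when its detections are more sensitive.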

  9. Comparison of soil sampling and analytical methods for asbestos at the Sumas Mountain Asbestos Site—Working towards a toolbox for better assessment

    PubMed Central

    2017-01-01

    Established soil sampling methods for asbestos are inadequate to support risk assessment and risk-based decision making at Superfund sites due to difficulties in detecting asbestos at low concentrations and difficulty in extrapolating soil concentrations to air concentrations. Environmental Protection Agency (EPA)’s Office of Land and Emergency Management (OLEM) currently recommends the rigorous process of Activity Based Sampling (ABS) to characterize site exposures. The purpose of this study was to compare three soil analytical methods and two soil sampling methods to determine whether one method, or combination of methods, would yield more reliable soil asbestos data than other methods. Samples were collected using both traditional discrete (“grab”) samples and incremental sampling methodology (ISM). Analyses were conducted using polarized light microscopy (PLM), transmission electron microscopy (TEM) methods or a combination of these two methods. Data show that the fluidized bed asbestos segregator (FBAS) followed by TEM analysis could detect asbestos at locations that were not detected using other analytical methods; however, this method exhibited high relative standard deviations, indicating the results may be more variable than other soil asbestos methods. The comparison of samples collected using ISM versus discrete techniques for asbestos resulted in no clear conclusions regarding preferred sampling method. However, analytical results for metals clearly showed that measured concentrations in ISM samples were less variable than discrete samples. PMID:28759607

  10. Robowell: An automated process for monitoring ground water quality using established sampling protocols

    USGS Publications Warehouse

    Granato, G.E.; Smith, K.P.

    1999-01-01

    Robowell is an automated process for monitoring selected ground water quality properties and constituents by pumping a well or multilevel sampler. Robowell was developed and tested to provide a cost-effective monitoring system that meets protocols expected for manual sampling. The process uses commercially available electronics, instrumentation, and hardware, so it can be configured to monitor ground water quality using the equipment, purge protocol, and monitoring well design most appropriate for the monitoring site and the contaminants of interest. A Robowell prototype was installed on a sewage treatment plant infiltration bed that overlies a well-studied unconfined sand and gravel aquifer at the Massachusetts Military Reservation, Cape Cod, Massachusetts, during a time when two distinct plumes of constituents were released. The prototype was operated from May 10 to November 13, 1996, and quality-assurance/quality-control measurements demonstrated that the data obtained by the automated method were equivalent to data obtained by manual sampling methods using the same sampling protocols. Water level, specific conductance, pH, water temperature, dissolved oxygen, and dissolved ammonium were monitored by the prototype as the wells were purged according to U.S. Geological Survey (USGS) ground water sampling protocols. Remote access to the data record, via phone modem communications, indicated the arrival of each plume over a few days and the subsequent geochemical reactions over the following weeks. Real-time availability of the monitoring record provided the information needed to initiate manual sampling efforts in response to changes in measured ground water quality, which validated the method and characterized the screened portion of the plume in detail through time. The methods and the case study described are presented to document the process for future use.

  11. Integral-equation based methods for parameter estimation in output pulses of radiation detectors: Application in nuclear medicine and spectroscopy

    NASA Astrophysics Data System (ADS)

    Mohammadian-Behbahani, Mohammad-Reza; Saramad, Shahyar

    2018-04-01

    Model based analysis methods are relatively new approaches for processing the output data of radiation detectors in nuclear medicine imaging and spectroscopy. A class of such methods requires fast algorithms for fitting pulse models to experimental data. In order to apply integral-equation based methods for processing the preamplifier output pulses, this article proposes a fast and simple method for estimating the parameters of the well-known bi-exponential pulse model by solving an integral equation. The proposed method needs samples from only three points of the recorded pulse as well as its first and second order integrals. After optimizing the sampling points, the estimation results were calculated and compared with two traditional integration-based methods. Different noise levels (signal-to-noise ratios from 10 to 3000) were simulated for testing the functionality of the proposed method, then it was applied to a set of experimental pulses. Finally, the effect of quantization noise was assessed by studying different sampling rates. Promising results by the proposed method endorse it for future real-time applications.
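
The integral-equation idea can be illustrated numerically: a bi-exponential pulse v(t) = A(e^(-l1*t) - e^(-l2*t)) satisfies a linear relation among v, t, and its first and second running integrals, so the decay rates fall out of an ordinary least-squares fit. This is a generic sketch of integration-based exponential fitting, not the authors' optimized three-point algorithm:

```python
import numpy as np

def cumtrapz(y, t):
    """Running trapezoidal integral of y(t), starting at zero."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(t))))

# Synthetic bi-exponential pulse with decay rates 0.2 and 1.0 (arbitrary time units).
t = np.linspace(0.0, 20.0, 2001)
v = np.exp(-0.2 * t) - np.exp(-1.0 * t)

# v'' + (l1 + l2) v' + l1*l2 v = 0; integrating twice from 0 to t gives
# v(t) = p0 + p1*t - (l1 + l2)*I1(t) - l1*l2*I2(t), which is linear in the unknowns.
I1 = cumtrapz(v, t)
I2 = cumtrapz(I1, t)
A = np.column_stack([np.ones_like(t), t, I1, I2])
p, *_ = np.linalg.lstsq(A, v, rcond=None)

s, q = -p[2], -p[3]                      # s = l1 + l2, q = l1 * l2
disc = np.sqrt(s * s - 4.0 * q)
rates = sorted([(s - disc) / 2.0, (s + disc) / 2.0])
print(rates)  # close to [0.2, 1.0]
```

Because the unknown rates enter only through the sum and product recovered from the linear fit, no iterative nonlinear optimization is needed, which is what makes such methods attractive for real-time pulse processing.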

  12. Syringe filtration methods for examining dissolved and colloidal trace element distributions in remote field locations

    NASA Technical Reports Server (NTRS)

    Shiller, Alan M.

    2003-01-01

    It is well-established that sampling and sample processing can easily introduce contamination into dissolved trace element samples if precautions are not taken. However, work in remote locations sometimes precludes bringing bulky clean lab equipment into the field and likewise may make timely transport of samples to the lab for processing impossible. Straightforward syringe filtration methods are described here for collecting small quantities (15 mL) of 0.45- and 0.02-microm filtered river water in an uncontaminated manner. These filtration methods take advantage of recent advances in analytical capabilities that require only small amounts of water for analysis of a suite of dissolved trace elements. Filter clogging and solute rejection artifacts appear to be minimal, although some adsorption of metals and organics does affect the first approximately 10 mL of water passing through the filters. Overall the methods are clean, easy to use, and provide reproducible representations of the dissolved and colloidal fractions of trace elements in river waters. Furthermore, sample processing materials can be prepared well in advance in a clean lab and transported cleanly and compactly to the field. Application of these methods is illustrated with data from remote locations in the Rocky Mountains and along the Yukon River. Evidence from field flow fractionation suggests that the 0.02-microm filters may provide a practical cutoff to distinguish metals associated with small inorganic and organic complexes from those associated with silicate and oxide colloids.

  13. Authentication of Closely Related Fish and Derived Fish Products Using Tandem Mass Spectrometry and Spectral Library Matching.

    PubMed

    Nessen, Merel A; van der Zwaan, Dennis J; Grevers, Sander; Dalebout, Hans; Staats, Martijn; Kok, Esther; Palmblad, Magnus

    2016-05-11

    Proteomics methodology has seen increased application in food authentication, including tandem mass spectrometry of targeted species-specific peptides in raw, processed, or mixed food products. We have previously described an alternative principle that uses untargeted data acquisition and spectral library matching, essentially spectral counting, to compare and identify samples without the need for genomic sequence information in food species populations. Here, we present an interlaboratory comparison demonstrating how a method based on this principle performs in a realistic context. We also increasingly challenge the method by using data from different types of mass spectrometers, by trying to distinguish closely related and commercially important flatfish, and by analyzing heavily contaminated samples. The method was found to be robust in different laboratories, and 94-97% of the analyzed samples were correctly identified, including all processed and contaminated samples.

  14. Rapid-Viability PCR Method for Detection of Live, Virulent Bacillus anthracis in Environmental Samples ▿

    PubMed Central

    Létant, Sonia E.; Murphy, Gloria A.; Alfaro, Teneile M.; Avila, Julie R.; Kane, Staci R.; Raber, Ellen; Bunt, Thomas M.; Shah, Sanjiv R.

    2011-01-01

    In the event of a biothreat agent release, hundreds of samples would need to be rapidly processed to characterize the extent of contamination and determine the efficacy of remediation activities. Current biological agent identification and viability determination methods are both labor- and time-intensive such that turnaround time for confirmed results is typically several days. In order to alleviate this issue, automated, high-throughput sample processing methods were developed in which real-time PCR analysis is conducted on samples before and after incubation. The method, referred to as rapid-viability (RV)-PCR, uses the change in cycle threshold after incubation to detect the presence of live organisms. In this article, we report a novel RV-PCR method for detection of live, virulent Bacillus anthracis, in which the incubation time was reduced from 14 h to 9 h, bringing the total turnaround time for results below 15 h. The method incorporates a magnetic bead-based DNA extraction and purification step prior to PCR analysis, as well as specific real-time PCR assays for the B. anthracis chromosome and pXO1 and pXO2 plasmids. A single laboratory verification of the optimized method applied to the detection of virulent B. anthracis in environmental samples was conducted and showed a detection level of 10 to 99 CFU/sample with both manual and automated RV-PCR methods in the presence of various challenges. Experiments exploring the relationship between the incubation time and the limit of detection suggest that the method could be further shortened by an additional 2 to 3 h for relatively clean samples. PMID:21764960
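
The RV-PCR decision rule compares cycle-threshold (Ct) values before and after incubation: growth of live organisms increases the target DNA and therefore lowers the Ct. A minimal sketch of such a rule, where the 6-cycle threshold and the Ct = 45 stand-in for non-detects are illustrative assumptions, not values from the validated method:

```python
def rv_pcr_viable(ct_before, ct_after, delta_ct_threshold=6.0, non_detect_ct=45.0):
    """Call 'live organisms present' if the Ct drops enough after incubation.

    Growth during incubation amplifies the target, lowering the Ct.
    The threshold and the non-detect substitution are illustrative assumptions.
    """
    ct0 = non_detect_ct if ct_before is None else ct_before
    ct1 = non_detect_ct if ct_after is None else ct_after
    return (ct0 - ct1) >= delta_ct_threshold

print(rv_pcr_viable(None, 28.0))   # non-detect before, strong signal after growth
print(rv_pcr_viable(30.0, 29.5))   # essentially unchanged signal: not viable
```

The appeal of the approach is that dead organisms contribute DNA equally to both measurements, so only replication during incubation moves the difference.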

  15. Development of a testing method for asbestos fibers in treated materials of asbestos containing wastes by transmission electron microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamamoto, Takashi, E-mail: tyama@nies.go.jp; Kida, Akiko; Noma, Yukio

    Highlights: • A highly sensitive and selective testing method for asbestos in treated materials of asbestos-containing wastes was developed. • Asbestos can be determined at quantitation limits of a few million fibers per gram and a few μg/g. • Slag samples from high-temperature melting treatment were analyzed by this method; asbestos fiber concentrations were below the quantitation limit in all samples, and total fiber concentrations were determined as 47–170 × 10⁶ fibers/g. - Abstract: Appropriate treatment of asbestos-containing wastes is a significant problem. In Japan, the inertization of asbestos-containing wastes based on new treatment processes approved by the Minister of the Environment is promoted. A highly sensitive method for testing asbestos fibers in inertized materials is required so that these processes can be approved. We developed a method in which fibers from milled treated materials are extracted in water by shaking, and are counted and identified by transmission electron microscopy. Evaluation of this method by using asbestos standards and simulated slag samples confirmed that the quantitation limits are a few million fibers per gram and a few μg/g in a sample of 50 mg per filter. We used this method to assay asbestos fibers in slag samples produced by high-temperature melting of asbestos-containing wastes. Fiber concentrations were below the quantitation limit in all samples, and total fiber concentrations were determined as 47–170 × 10⁶ fibers/g. Because the evaluation of treated materials by TEM is difficult owing to the limited amount of sample observable, this testing method should be used in conjunction with bulk analytical methods for reliable evaluation of treated materials.
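
TEM fiber counting of this kind examines only a tiny area of the filter, so the count is scaled up to the full filter and normalized by the deposited mass. A sketch of that standard scale-up with purely illustrative numbers (filter and grid areas, mass, and count are not taken from the paper):

```python
def fibers_per_gram(n_fibers, filter_area_mm2, analyzed_area_mm2, sample_mass_g):
    """Scale fibers counted in the analyzed grid openings up to the whole filter,
    then normalize by the sample mass deposited on that filter."""
    return n_fibers * (filter_area_mm2 / analyzed_area_mm2) / sample_mass_g

# Illustrative values: 5 fibers counted over 0.05 mm^2 of a 490 mm^2 filter
# loaded with 50 mg (0.05 g) of milled sample.
print(fibers_per_gram(5, 490.0, 0.05, 0.05))  # ~1e6 fibers/g, i.e. near the stated quantitation limit
```

The enormous area scale-up factor is why counting even a handful of fibers already corresponds to roughly a million fibers per gram, matching the quantitation limit quoted in the abstract.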

  16. Gross alpha and beta activity analyses in urine-a routine laboratory method for internal human radioactivity detection.

    PubMed

    Chen, Xiaowen; Zhao, Luqian; Qin, Hongran; Zhao, Meijia; Zhou, Yirui; Yang, Shuqiang; Su, Xu; Xu, Xiaohua

    2014-05-01

    The aim of this work was to develop a method to provide rapid results for humans with internal radioactive contamination. The authors hypothesized that valuable information could be obtained from gas proportional counter techniques by rapidly screening urine samples from potentially exposed individuals. Recommended gross alpha and beta activity screening methods generally employ gas proportional counting techniques. Based on International Standards Organization (ISO) methods, improvements were made in the evaporation process to develop a method that provides rapid results, adequate sensitivity, and minimal sample preparation and operator intervention for humans with internal radioactive contamination. The method described in an American National Standards Institute publication was used to calibrate the gas proportional counter, and urine samples from patients with or without radionuclide treatment were measured to validate the method. By improving the evaporation process, the time required to perform the assay was reduced dramatically. Compared with the reference data, the results of the validation samples were very satisfactory with respect to gross-alpha and gross-beta activities. The gas flow proportional counting method described here has the potential for radioactivity monitoring in the body. This method was easy, efficient, and fast, and its application is of great utility in determining whether a sample should be analyzed by a more complicated method, for example radiochemical and/or γ-spectroscopy. In the future, it may be used commonly in medical examination and nuclear emergency treatment. Health Phys. 106(5):000-000; 2014.
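
Gross alpha/beta screening by gas proportional counting ultimately reduces to a net-count-rate calculation. A minimal sketch with hypothetical efficiency and aliquot values (the method's actual calibration factors are not given in the abstract):

```python
def activity_concentration(gross_cps, background_cps, efficiency, aliquot_l):
    """Activity concentration in Bq/L from gas-proportional-counter count rates.

    efficiency: counting efficiency (counts per decay), from calibration standards.
    aliquot_l:  volume of urine evaporated onto the planchet, in litres.
    Values used below are illustrative assumptions, not the paper's calibration.
    """
    net_cps = gross_cps - background_cps
    return net_cps / (efficiency * aliquot_l)

# Hypothetical screening measurement: 0.5 cps gross, 0.1 cps background,
# 25% counting efficiency, 50 mL evaporated aliquot.
print(activity_concentration(0.5, 0.1, 0.25, 0.05))  # 32.0 Bq/L
```

A result above a pre-set screening level would then flag the sample for the slower radiochemical or γ-spectroscopy workup described in the abstract.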

  17. Method and apparatus for differential spectroscopic atomic-imaging using scanning tunneling microscopy

    DOEpatents

    Kazmerski, Lawrence L.

    1990-01-01

    A method and apparatus for differential spectroscopic atomic-imaging is disclosed that spatially resolves and displays not only individual atoms on a sample surface, but also their bonding and the specific atomic species in such bonds. The apparatus includes a scanning tunneling microscope (STM) that is modified to include photon biasing, preferably a tuneable laser, modulating electronic surface biasing for the sample, and temperature biasing, preferably a vibration-free refrigerated sample mounting stage. Computer control, data processing, and visual display components are also included. The method includes modulating the electronic bias voltage with and without selected photon wavelengths and frequency biasing under a stabilizing (usually cold) bias temperature to detect bonding and specific atomic species in the bonds as the STM rasters the sample. These data are processed along with atomic spatial topography data obtained from the STM raster scan to create a real-time visual image of the atoms on the sample surface.

  18. Porous calcium polyphosphate bone substitutes: additive manufacturing versus conventional gravity sinter processing-effect on structure and mechanical properties.

    PubMed

    Hu, Youxin; Shanjani, Yaser; Toyserkani, Ehsan; Grynpas, Marc; Wang, Rizhi; Pilliar, Robert

    2014-02-01

    Porous calcium polyphosphate (CPP) structures have been proposed as bone-substitute implants. Bending-test samples of approximately 35 vol % porosity were machined from preformed blocks made by sintering CPP powders using either additive manufacturing (AM) or conventional gravity sintering (CS) methods, and the structure and mechanical characteristics of the samples so made were compared. AM-made samples displayed higher bending strengths (≈1.2-1.4 times greater than CS-made samples), whereas the elastic constant (i.e., the effective elastic modulus of the porous structure), which is determined by the material elastic modulus and the structural geometry of the samples, was ≈1.9-2.3 times greater for AM-made samples. X-ray diffraction analysis showed that samples made by either method displayed the same crystal structure, forming β-CPP after sinter annealing. The material elastic modulus, E, determined using nanoindentation tests, also showed the same value for both sample types (i.e., E ≈ 64 GPa). Examination of the porous structures indicated significantly larger sinter necks in the AM-made samples, which presumably produced the higher mechanical properties. The development of the mechanical properties was attributed to the different sinter-anneal procedures required to make 35 vol % porous samples by the two methods. A primary objective of the present study, in addition to reporting on bending strength and sample stiffness (elastic constant) characteristics, was to determine why the two processes resulted in the observed mechanical property differences for samples of equivalent volume percentage of porosity. An understanding of the fundamental reason(s) for the observed effect is considered important for developing improved processes for preparing porous CPP implants as bone substitutes for use in high-load-bearing skeletal sites. Copyright © 2013 Wiley Periodicals, Inc.

  19. Defining And Characterizing Sample Representativeness For DWPF Melter Feed Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shine, E. P.; Poirier, M. R.

    2013-10-29

    Representative sampling is important throughout the Defense Waste Processing Facility (DWPF) process, and the demonstrated success of the DWPF process to achieve glass product quality over the past two decades is a direct result of the quality of information obtained from the process. The objective of this report was to present sampling methods that the Savannah River Site (SRS) used to qualify waste being dispositioned at the DWPF. The goal was to emphasize the methodology, not a list of outcomes from those studies. This methodology includes proven methods for taking representative samples, the use of controlled analytical methods, and data interpretation and reporting that considers the uncertainty of all error sources. Numerous sampling studies were conducted during the development of the DWPF process and still continue to be performed in order to evaluate options for process improvement. Study designs were based on use of statistical tools applicable to the determination of uncertainties associated with the data needs. Successful designs are apt to be repeated, so this report chose only to include prototypic case studies that typify the characteristics of frequently used designs. Case studies have been presented for studying in-tank homogeneity, evaluating the suitability of sampler systems, determining factors that affect mixing and sampling, comparing the final waste glass product chemical composition and durability to that of the glass pour stream sample and other samples from process vessels, and assessing the uniformity of the chemical composition in the waste glass product. Many of these studies efficiently addressed more than one of these areas of concern associated with demonstrating sample representativeness and provide examples of statistical tools in use for DWPF. The time when many of these designs were implemented was in an age when the sampling ideas of Pierre Gy were not as widespread as they are today.
Nonetheless, the engineers and statisticians used carefully thought out designs that systematically and economically provided plans for data collection from the DWPF process. Key shared features of the sampling designs used at DWPF and the Gy sampling methodology were the specification of a standard for sample representativeness, an investigation that produced data from the process to study the sampling function, and a decision framework used to assess whether the specification was met based on the data. Without going into detail with regard to the seven errors identified by Pierre Gy, as excellent summaries are readily available such as Pitard [1989] and Smith [2001], SRS engineers understood, for example, that samplers can be biased (Gy's extraction error), and developed plans to mitigate those biases. Experiments that compared installed samplers with more representative samples obtained directly from the tank may not have resulted in systematically partitioning sampling errors into the now well-known error categories of Gy, but did provide overall information on the suitability of sampling systems. Most of the designs in this report are related to the DWPF vessels, not the large SRS Tank Farm tanks. Samples from the DWPF Slurry Mix Evaporator (SME), which contains the feed to the DWPF melter, are characterized using standardized analytical methods with known uncertainty. The analytical error is combined with the established error from sampling and processing in DWPF to determine the melter feed composition. This composition is used with the known uncertainty of the models in the Product Composition Control System (PCCS) to ensure that the wasteform that is produced is comfortably within the acceptable processing and product performance region. 
    Having the advantage of many years of processing that meets the waste glass product acceptance criteria, the DWPF process has provided a considerable amount of data about itself in addition to the data from many special studies. Demonstrating representative sampling directly from the large Tank Farm tanks is a difficult, if not unsolvable enterprise due to limited accessibility. However, the consistency and the adequacy of sampling and mixing at SRS could at least be studied under the controlled process conditions based on samples discussed by Ray and others [2012a] in Waste Form Qualification Report (WQR) Volume 2 and the transfers from Tanks 40H and 51H to the Sludge Receipt and Adjustment Tank (SRAT) within DWPF. It is important to realize that the need for sample representativeness becomes more stringent as the material gets closer to the melter, and the tanks within DWPF have been studied extensively to meet those needs.

  20. Monte Carlo sampling in diffusive dynamical systems

    NASA Astrophysics Data System (ADS)

    Tapias, Diego; Sanders, David P.; Altmann, Eduardo G.

    2018-05-01

    We introduce a Monte Carlo algorithm to efficiently compute transport properties of chaotic dynamical systems. Our method exploits the importance sampling technique that favors trajectories in the tail of the distribution of displacements, where deviations from a diffusive process are most prominent. We search for initial conditions using a proposal that correlates states in the Markov chain constructed via a Metropolis-Hastings algorithm. We show that our method outperforms the direct sampling method and also Metropolis-Hastings methods with alternative proposals. We test our general method through numerical simulations in 1D (box-map) and 2D (Lorentz gas) systems.
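
The core trick, biasing a Metropolis-Hastings chain toward rare large displacements and unweighting afterwards, can be sketched in a toy setting. Here the "displacement" is a standard normal stand-in rather than a chaotic trajectory, and an exponentially tilted density plays the role of the biased target; only the importance-sampling principle is shown, not the authors' algorithm:

```python
import numpy as np

def mh_tail_probability(threshold=3.0, beta=3.0, n_steps=50_000, seed=0):
    """Estimate P(X > threshold) for X ~ N(0, 1) by sampling the tilted
    density q(x) ∝ exp(-x^2/2 + beta*x), i.e. N(beta, 1), with a
    Metropolis-Hastings chain and reweighting by w(x) = exp(-beta*x + beta^2/2)."""
    rng = np.random.default_rng(seed)
    x = beta                       # start the chain near the tilted mode
    total = 0.0
    for _ in range(n_steps):
        prop = x + rng.normal(scale=1.0)
        # log acceptance ratio for the tilted target N(beta, 1)
        log_alpha = 0.5 * ((x - beta) ** 2 - (prop - beta) ** 2)
        if np.log(rng.random()) < log_alpha:
            x = prop
        if x > threshold:
            total += np.exp(-beta * x + 0.5 * beta * beta)
    return total / n_steps

est = mh_tail_probability()
print(est)  # near the exact tail value P(X > 3) ≈ 1.35e-3
```

Direct sampling would need hundreds of thousands of draws to see this tail at all; the tilted chain spends roughly half its time there and the weights remove the bias, which is the same reason the authors' method outperforms direct sampling for tail displacements.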

  1. Robust high-throughput batch screening method in 384-well format with optical in-line resin quantification.

    PubMed

    Kittelmann, Jörg; Ottens, Marcel; Hubbuch, Jürgen

    2015-04-15

    High-throughput batch screening technologies have become an important tool in downstream process development. Although continued miniaturization saves time and sample consumption, no screening process has yet been described in the 384-well microplate format. Several processes are established in the 96-well dimension to investigate protein-adsorbent interactions, utilizing between 6.8 and 50 μL of resin per well. However, as sample consumption scales with resin volume and throughput scales with experiments per microplate, they are limited in cost and time savings. In this work, a new method for in-well resin quantification by optical means, applicable in the 384-well format and to resin volumes as small as 0.1 μL, is introduced. A HTS batch isotherm process is described, utilizing this new method in combination with optical sample volume quantification for screening of isotherm parameters in 384-well microplates. Results are qualified by confidence bounds determined by bootstrap analysis and a comprehensive Monte Carlo study of error propagation. This new approach opens the door to a variety of screening processes in the 384-well format on HTS stations, higher quality screening data and an increase in throughput. Copyright © 2015 Elsevier B.V. All rights reserved.
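
Batch isotherm screening of this kind typically reduces each well to one (c_eq, q) point and fits an adsorption model to the set of points. A minimal sketch fitting a Langmuir isotherm q = q_max·c/(K_d + c) by the classic double-reciprocal linearization; the data are synthetic and noise-free, and the parameter values are illustrative, not the authors' system:

```python
import numpy as np

# Synthetic equilibrium data from a known Langmuir isotherm:
# q = q_max * c / (K_d + c), with q_max = 80 mg/mL resin and K_d = 0.5 mg/mL.
q_max_true, kd_true = 80.0, 0.5
c_eq = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 5.0])   # liquid-phase conc., mg/mL
q = q_max_true * c_eq / (kd_true + c_eq)            # resin loading, mg/mL resin

# Double-reciprocal form: 1/q = (K_d/q_max) * (1/c) + 1/q_max, a line in 1/c.
slope, intercept = np.polyfit(1.0 / c_eq, 1.0 / q, 1)
q_max_fit = 1.0 / intercept
kd_fit = slope * q_max_fit

print(q_max_fit, kd_fit)  # recovers 80.0 and 0.5 (the data are noise-free)
```

With real, noisy plate data one would fit the nonlinear form directly and, as in the paper, bootstrap the wells to obtain confidence bounds on q_max and K_d.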

  2. The effect of processing on the mechanical properties of self-reinforced composites

    NASA Astrophysics Data System (ADS)

    Hassani, Farzaneh; Martin, Peter J.; Falzon, Brian G.

    2018-05-01

    Hot-compaction is one of the most common manufacturing methods for creating recyclable all-thermoplastic composites. The current work investigates the compaction of highly oriented self-reinforced fabrics with three processing methods to study the effect of pressure and temperature on the tensile mechanical properties of the consolidated laminates. Hot-press, calender-roller and vacuum-bag techniques were adopted to consolidate bi-component polypropylene woven fabrics over a range of pressures and compaction temperatures. Hot-pressed samples exhibited the highest quality of compaction. The modulus of the hot-pressed samples initially increased with compaction temperature owing to improved interlayer bonding, and decreased after a maximum at 150°C because of partial melting of the reinforcement phase. The calender-roller technique was found to have a smaller processing-temperature window, as the pressure is applied only for a short time and the fabrics start to shrink as the processing temperature increases. Constraining the fabrics throughout the process is therefore paramount. The vacuum-bag results showed this technique to be the least efficient method because of its low compaction pressure. Microscopic images and void-content measurements of the consolidated samples further validate the results from tensile testing.

  3. The clustering-based case-based reasoning for imbalanced business failure prediction: a hybrid approach through integrating unsupervised process with supervised process

    NASA Astrophysics Data System (ADS)

    Li, Hui; Yu, Jun-Ling; Yu, Le-An; Sun, Jie

    2014-05-01

    Case-based reasoning (CBR) is one of the main methods in business forecasting; it predicts well and can give explanations for its results. In business failure prediction (BFP), the number of failed enterprises is relatively small compared with the number of non-failed ones, yet the loss is huge when an enterprise fails. Therefore, it is necessary to develop methods, trained on imbalanced samples, that forecast well for this small proportion of failed enterprises while maintaining high total accuracy. Commonly used methods constructed on the assumption of balanced samples do not perform well in predicting minority samples on imbalanced data consisting of the minority/failed enterprises and the majority/non-failed ones. This article develops a new method called clustering-based CBR (CBCBR), which integrates clustering analysis, an unsupervised process, with CBR, a supervised process, to enhance the efficiency of retrieving information from both the minority and the majority in CBR. In CBCBR, case classes are first generated through hierarchical clustering of the stored experienced cases, and class centres are calculated by integrating the case information within each clustered class. When predicting the label of a target case, its nearest clustered case class is first retrieved by ranking the similarities between the target case and each clustered case class centre. Then, the nearest neighbours of the target case in the determined clustered case class are retrieved. Finally, the labels of the nearest experienced cases are used in prediction. In an empirical experiment with two imbalanced samples from China, the performance of CBCBR was compared with classical CBR, a support vector machine, a logistic regression and a multivariate discriminant analysis. The results show that, compared with the other four methods, CBCBR performed significantly better in terms of sensitivity for identifying the minority samples while maintaining high total accuracy. The proposed approach makes CBR useful in imbalanced forecasting.
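
The retrieval scheme described above (cluster the case base, match the target to the nearest cluster centre, then run k-NN inside that cluster) can be sketched on a toy imbalanced case base. A simple two-centre k-means on synthetic 2-D cases stands in for the paper's hierarchical clustering; all data and parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy case base: 40 non-failed (label 0) and 8 failed (label 1) enterprises.
majority = rng.normal(loc=(0.0, 0.0), scale=0.5, size=(40, 2))
minority = rng.normal(loc=(5.0, 5.0), scale=0.5, size=(8, 2))
cases = np.vstack([majority, minority])
labels = np.array([0] * 40 + [1] * 8)

# Simple two-centre k-means, seeded from one case of each well-separated blob.
centres = np.array([cases[0], cases[-1]])
for _ in range(10):
    assign = np.argmin(((cases[:, None, :] - centres[None, :, :]) ** 2).sum(-1), axis=1)
    centres = np.array([cases[assign == k].mean(axis=0) for k in (0, 1)])

def predict(target, k=3):
    """Nearest-cluster retrieval followed by k-NN majority vote inside that cluster."""
    cluster = np.argmin(((centres - target) ** 2).sum(-1))
    members = np.where(assign == cluster)[0]
    dists = ((cases[members] - target) ** 2).sum(-1)
    nearest = members[np.argsort(dists)[:k]]
    return int(round(labels[nearest].mean()))

print(predict(np.array([4.8, 5.1])))  # retrieved from the minority cluster
```

Restricting the k-NN search to the matched cluster is what keeps minority cases from being outvoted by the far more numerous majority cases, which is the imbalance problem the paper targets.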

  4. WEIGHTED LIKELIHOOD ESTIMATION UNDER TWO-PHASE SAMPLING

    PubMed Central

    Saegusa, Takumi; Wellner, Jon A.

    2013-01-01

    We develop asymptotic theory for weighted likelihood estimators (WLE) under two-phase stratified sampling without replacement. We also consider several variants of WLEs involving estimated weights and calibration. A set of empirical process tools is developed, including a Glivenko–Cantelli theorem, a theorem for rates of convergence of M-estimators, and a Donsker theorem for the inverse probability weighted empirical processes under two-phase sampling and sampling without replacement at the second phase. Using these general results, we derive asymptotic distributions of the WLE of a finite-dimensional parameter in a general semiparametric model where an estimator of a nuisance parameter is estimable either at regular or nonregular rates. We illustrate these results and methods in the Cox model with right censoring and interval censoring. We compare the methods via their asymptotic variances under both sampling without replacement and the more usual (and easier to analyze) assumption of Bernoulli sampling at the second phase. PMID:24563559
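
The estimator at the heart of this construction can be written compactly. The display below is a generic sketch of the inverse-probability-weighted (Horvitz-Thompson type) log-likelihood under two-phase sampling; the notation is chosen here for illustration and is not necessarily the authors' exact notation:

```latex
% N phase-1 items X_1,\dots,X_N; \xi_i indicates selection into phase 2;
% \pi_i is item i's (design-based) phase-2 inclusion probability.
\ell^w_N(\theta)
  \;=\; \frac{1}{N} \sum_{i=1}^{N} \frac{\xi_i}{\pi_i}\,
        \log \operatorname{lik}(\theta;\, X_i),
\qquad
\hat{\theta}_N \;=\; \operatorname*{arg\,max}_{\theta}\; \ell^w_N(\theta).
```

Under Bernoulli sampling the indicators \(\xi_i\) are independent, which is what makes that case easier to analyze than stratified sampling without replacement, where the \(\xi_i\) are dependent within strata.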

  5. Is it really theoretical? A review of sampling in grounded theory studies in nursing journals.

    PubMed

    McCrae, Niall; Purssell, Edward

    2016-10-01

    Grounded theory is a distinct method of qualitative research, where core features are theoretical sampling and constant comparative analysis. However, inconsistent application of these activities has been observed in published studies. This review assessed the use of theoretical sampling in grounded theory studies in nursing journals. An adapted systematic review was conducted. Three leading nursing journals (2010-2014) were searched for studies stating grounded theory as the method. Sampling was assessed using a concise rating tool. A high proportion (86%) of the 134 articles described an iterative process of data collection and analysis. However, half of the studies did not demonstrate theoretical sampling, with many studies declaring or indicating a purposive sampling approach throughout. Specific reporting guidelines for grounded theory studies should be developed to ensure that study reports describe an iterative process of fieldwork and theoretical development. © 2016 John Wiley & Sons Ltd.

  6. Automation of sample processing for ICP-MS determination of 90Sr radionuclide at ppq level for nuclear technology and environmental purposes.

    PubMed

    Kołacińska, Kamila; Chajduk, Ewelina; Dudek, Jakub; Samczyński, Zbigniew; Łokas, Edyta; Bojanowska-Czajka, Anna; Trojanowicz, Marek

    2017-07-01

    90Sr is a widely determined radionuclide for environmental purposes and nuclear waste control, and it can also be monitored in coolants in nuclear reactor plants. In the developed method, ICP-MS detection was employed together with sample processing in a sequential injection analysis (SIA) setup equipped with a lab-on-valve with mechanized renewal of the sorbent bed for solid-phase extraction. The optimized determination conditions included preconcentration of 90Sr on a cation-exchange column and removal of different types of interferences using extraction Sr-resin. The limit of detection of the developed procedure depends essentially on the configuration of the employed ICP-MS spectrometer and on the available volume of the sample to be analyzed. For a 1 L initial sample volume, the method detection limit (MDL) was evaluated as 2.9 ppq (14.5 Bq L⁻¹). The developed method was applied to analyze spiked river water samples, water reference materials, and also simulated and real samples of nuclear reactor coolant. Copyright © 2016 Elsevier B.V. All rights reserved.
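
The quoted detection limit can be cross-checked from the specific activity of 90Sr (half-life ≈ 28.8 y): 2.9 ppq by mass in a 1 L water sample corresponds to roughly 15 Bq, consistent with the value in the abstract. A sketch of that arithmetic:

```python
import math

HALF_LIFE_S = 28.79 * 365.25 * 24 * 3600      # 90Sr half-life in seconds
AVOGADRO = 6.022e23
MOLAR_MASS = 89.91                            # g/mol for 90Sr

# Specific activity = decay constant * atoms per gram (~5.1e12 Bq/g for 90Sr).
lam = math.log(2) / HALF_LIFE_S
specific_activity = lam * AVOGADRO / MOLAR_MASS   # Bq per gram

mass_g = 2.9e-15 * 1000.0    # 2.9 ppq (g per g) in 1 L (~1000 g) of water
activity_bq = mass_g * specific_activity
print(activity_bq)  # ~15 Bq, consistent with the quoted 14.5 Bq/L
```

This illustrates why mass-spectrometric detection at ppq levels is competitive with radiometric counting for a relatively short-lived, high-specific-activity nuclide such as 90Sr.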

  7. Study on Electro-Polishing Process by Niobium-Plate Sample With Artificial Pits

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    T. Saeki, H. Hayano, S. Kato, M. Nishiwaki, M. Sawabe, W.A. Clemens, R.L. Geng, R. Manus, P.V. Tyagi

    2011-07-01

    The electro-polishing (EP) process is the best candidate for the final surface treatment in the production of ILC cavities. Nevertheless, the development of defects on the inner surface of the superconducting RF cavity during the EP process has not been studied experimentally. We made artificial pits on the surface of a Nb-plate sample and observed the development of the pit shapes after each step of a 30 μm EP process, in which 120 μm was removed by EP in total. This article describes the results of this EP test of a Nb sample with artificial pits.

  8. Method for charging a hydrogen getter

    DOEpatents

    Tracy, C. Edwin; Keyser, Matthew A.; Benson, David K.

    1998-01-01

    A method for charging a sample of either a permanent or reversible getter material with a high concentration of hydrogen while maintaining a base pressure below 10⁻⁴ torr at room temperature involves placing the sample of hydrogen getter material in a chamber, activating the sample of hydrogen getter material, overcharging the sample of getter material through conventional charging techniques to a high concentration of hydrogen, and then subjecting the sample of getter material to a low temperature vacuum bake-out process. Application of the method results in a reversible hydrogen getter which is highly charged to maximum capacities of hydrogen and which concurrently exhibits minimum hydrogen vapor pressures at room temperatures.

  9. Rapid Radiochemical Methods for Asphalt Paving Material ...

    EPA Pesticide Factsheets

    Technical Brief: Validated rapid radiochemical methods for alpha and beta emitters in solid matrices that are commonly encountered in urban environments were previously unavailable for public use by responding laboratories. A lack of tested rapid methods would delay the quick determination of contamination levels and the assessment of acceptable site-specific exposure levels. Of special concern are matrices with rough and porous surfaces, which allow the movement of radioactive material deep into the building material, making it difficult to detect. This research focuses on methods that address preparation, radiochemical separation, and analysis of asphalt paving materials and asphalt roofing shingles. These matrices, common to outdoor environments, challenge the capability and capacity of very experienced radiochemistry laboratories. Generally, routine sample preparation and dissolution techniques produce liquid samples (representative of the original sample material) that can be processed using available radiochemical methods. The asphalt materials are especially difficult because they do not readily lend themselves to these routine sample preparation and dissolution techniques. The HSRP and ORIA coordinate radiological reference laboratory priorities and activities in conjunction with HSRP's Partner Process. As part of the collaboration, the HSRP worked with ORIA to publish rapid radioanalytical methods for selected radionuclides in building material matrices.

  10. Phytoforensics—Using trees to find contamination

    USGS Publications Warehouse

    Wilson, Jordan L.

    2017-09-28

    The water we drink, air we breathe, and soil we come into contact with have the potential to adversely affect our health because of contaminants in the environment. Environmental samples can characterize the extent of potential contamination, but traditional methods for collecting water, air, and soil samples below the ground (for example, well drilling or direct-push soil sampling) are expensive and time consuming. Trees are closely connected to the subsurface and sampling tree trunks can indicate subsurface pollutants, a process called phytoforensics. Scientists at the Missouri Water Science Center were among the first to use phytoforensics to screen sites for contamination before using traditional sampling methods, to guide additional sampling, and to show the large cost savings associated with tree sampling compared to traditional methods. 

  11. Use of immuno assays during the development of a Hemophilus influenzae type b vaccine for technology transfer to emerging vaccine manufacturers.

    PubMed

    Hamidi, Ahd; Kreeftenberg, Hans

    2014-01-01

    Quality control of Hemophilus influenzae type b (Hib) conjugate vaccines depends mainly on physicochemical methods. Overcoming sample matrix interference when using physicochemical tests is very challenging; these tests are therefore used only on purified samples of polysaccharide, protein, bulk conjugate, and final product. For successful development of a Hib conjugate vaccine, several ELISA (enzyme-linked immunosorbent assay) methods were needed as an additional tool to enable testing of in-process (IP) samples. In this paper, three of the ELISAs that proved very valuable during process development, implementation, and scale-up are highlighted. The PRP ELISA was a very efficient tool for testing IP samples generated during development of the cultivation and purification process for the Hib polysaccharide. The antigenicity ELISA was used to confirm the covalent linkage of PRP and TTd in the conjugate. The anti-PRP IgG ELISA was developed as part of the immunogenicity test, used to demonstrate the ability of the Hib conjugate vaccine to elicit a T-cell-dependent immune response in mice. ELISA methods are relatively cheap and easy to implement and are therefore very useful during the development of polysaccharide conjugate vaccines.

  12. The Direct Lighting Computation in Global Illumination Methods

    NASA Astrophysics Data System (ADS)

    Wang, Changyaw Allen

    1994-01-01

    Creating realistic images is a computationally expensive process, but it is very important for applications such as interior design, product design, education, virtual reality, and movie special effects. To generate realistic images, state-of-art rendering techniques are employed to simulate global illumination, which accounts for the interreflection of light among objects. In this document, we formalize the global illumination problem into a eight -dimensional integral and discuss various methods that can accelerate the process of approximating this integral. We focus on the direct lighting computation, which accounts for the light reaching the viewer from the emitting sources after exactly one reflection, Monte Carlo sampling methods, and light source simplification. Results include a new sample generation method, a framework for the prediction of the total number of samples used in a solution, and a generalized Monte Carlo approach for computing the direct lighting from an environment which for the first time makes ray tracing feasible for highly complex environments.
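
    The direct-lighting term is an integral over the emitting source's area, which Monte Carlo approximates as the source area times the mean of the integrand at uniformly sampled points. A minimal sketch with a toy integrand (not the dissertation's estimator):

```python
import random

def mc_area_integral(f, n_samples: int, seed: int = 42) -> float:
    """Monte Carlo estimate of the integral of f over the unit square
    (a stand-in for an area light source): area * mean of f at uniform
    sample points."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        total += f(x, y)
    area = 1.0
    return area * total / n_samples

# Toy integrand standing in for the radiance/geometry term; the exact
# integral of x*y over the unit square is 0.25.
estimate = mc_area_integral(lambda x, y: x * y, 100_000)
print(abs(estimate - 0.25) < 0.01)  # True: error shrinks like 1/sqrt(N)
```

    Smarter sample generation (stratification, importance sampling) reduces the constant in that 1/sqrt(N) error, which is the motivation for the sampling methods the abstract describes.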

  13. A New Sampling Strategy for the Detection of Fecal Bacteria Integrated with USEPA Method 1622/1623

    EPA Science Inventory

    USEPA Method 1622/1623 requires the concentration of Cryptosporidium and Giardia from 10 liters of water samples prior to detection. During this process the supernatant is discarded because it is assumed that most protozoa are retained in the filtration and centrifugation steps....

  14. Off-line real-time FTIR analysis of a process step in imipenem production

    NASA Astrophysics Data System (ADS)

    Boaz, Jhansi R.; Thomas, Scott M.; Meyerhoffer, Steven M.; Staskiewicz, Steven J.; Lynch, Joseph E.; Egan, Richard S.; Ellison, Dean K.

    1992-08-01

    We have developed an FT-IR method, using a Spectra-Tech Monit-IR 400 system, to monitor off-line the completion of a reaction in real time. The reaction is moisture sensitive, and analysis by more conventional methods (normal-phase HPLC) is difficult to reproduce. The FT-IR method is based on the shift of a diazo band when a conjugated beta-diketone is transformed into a silyl enol ether during the reaction. The reaction mixture is examined directly by IR and does not require sample workup. Data acquisition time is less than one minute. The method has been validated for specificity, precision, and accuracy. The results obtained by the FT-IR method for known mixtures and in-process samples compare favorably with those from a normal-phase HPLC method.

  15. [Effects of post-harvest processing and extraction methods on polysaccharides content of Dendrobium officinale].

    PubMed

    Li, Cong; Ning, Li-Dan; Si, Jin-Ping; Wu, Ling-Shang; Liu, Jing-Jing; Song, Xian-Shui; Yu, Qiao-Xian

    2013-02-01

    To reveal the variation in polysaccharide quality of Dendrobium officinale caused by post-harvest processing and extraction methods, and to provide a basis for post-harvest processing and for clinical and hygienic applications of Tiepifengdou (Dendrobii Officinalis Caulis), the content of polysaccharides was studied for four post-harvest processing methods: drying in a drying closet, drying after scalding in boiling water, drying while twisting, and drying while twisting after scalding in boiling water. A series of temperatures was set for each processing procedure. An L9(3⁴) orthogonal test, with degree of crushing, solid-liquid ratio, extraction time, and number of extractions as factors, was designed to analyze the dissolution rate of polysaccharides in Tiepifengdou processed by drying while twisting at 80 °C. The content of polysaccharides ranged from 26.59% to 32.70% among samples processed by the different methods, of which drying while twisting at 80 °C and at 100 °C were the best. Degree of crushing had the greatest influence on the dissolution rate of polysaccharides; the dissolution rate was extremely low when the sample was boiled directly without crushing and sieving. Drying while twisting at 80 °C was the best post-harvest processing method, helping to dry the fresh herbs and improve the accumulation of polysaccharides. Boiling uncrushed Tiepifengdou for a long time, as in the traditional method, could not fully extract the polysaccharides, whereas boiling crushed Tiepifengdou extracted them efficiently.
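
    An L9(3⁴) design like the one used here screens four three-level factors in nine runs instead of the 3⁴ = 81 of a full factorial. A sketch of constructing and checking such an array (generic level labels 0-2, not the study's actual factor settings):

```python
from itertools import product

def l9_orthogonal_array():
    """Generate an L9(3^4) orthogonal array: 9 runs, 4 three-level
    factors. The columns a, b, (a+b) mod 3, (a+2b) mod 3 are pairwise
    orthogonal: every pair of levels appears exactly once per column pair."""
    return [(a, b, (a + b) % 3, (a + 2 * b) % 3)
            for a, b in product(range(3), repeat=2)]

runs = l9_orthogonal_array()
# Verify pairwise orthogonality: each column pair covers all 9 level pairs.
for i in range(4):
    for j in range(i + 1, 4):
        assert len({(row[i], row[j]) for row in runs}) == 9
print(len(runs))  # 9
```
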

  16. The experimental research on response characteristics of coal samples under the uniaxial loading process

    NASA Astrophysics Data System (ADS)

    Jia, Bing; Wei, Jian-Ping; Wen, Zhi-Hui; Wang, Yun-Gang; Jia, Lin-Xing

    2017-11-01

    In order to study the response characteristics of infrasound in coal samples under uniaxial loading, coal samples were collected from the GengCun mine. A coal-rock stress loading device, an acoustic emission testing system, and an infrasound testing system were used to record the infrasonic and acoustic emission signals during uniaxial loading. The test results were analyzed by wavelet filtering, threshold denoising, time-frequency analysis, and related methods. The results showed that during loading the infrasonic signal changed in distinct stages: an initial stage with a certain number of infrasound events, a middle stage with few infrasound events, and a late stage with a gradual decrease. This was in good agreement with the acoustic emission behavior. At the same time, the frequency of the infrasound was very low, so it can propagate over very long distances with little attenuation, and the infrasound characteristics before failure of the coal samples were distinct. A method of using these infrasound characteristics to predict the failure of coal samples was proposed, which is of great significance for guiding the prediction of geological hazards in coal mines.

  17. Extraction and reliable determination of acrylamide from thermally processed foods using ionic liquid-based ultrasound-assisted selective microextraction combined with spectrophotometry.

    PubMed

    Altunay, Nail; Elik, Adil; Gürkan, Ramazan

    2018-02-01

    Acrylamide (AAm) is a carcinogenic chemical that can form in thermally processed foods by the Maillard reaction of glucose with asparagine. AAm can easily form, depending on processing conditions, especially in frequently consumed chips and cereal-based foods. Considering these properties of AAm, a new, simple, and green method is proposed for the extraction of AAm from thermally processed food samples. In this study, an ionic liquid (1-butyl-3-methylimidazolium tetrafluoroborate, [Bmim][BF₄]) was used as extractant, in the presence of a cationic phenazine dye, 3,7-diamino-5-phenylphenazinium chloride (PSH⁺, phenosafranine), at pH 7.5 to extract AAm as an ion-pair complex from the selected samples. Under optimum conditions, the analytical features of the proposed method were as follows: linear working range, 2.2-350 µg kg⁻¹; limit of detection (LOD, 3S_b/m), 0.7 µg kg⁻¹; limit of quantification (LOQ, 10S_b/m), 2.3 µg kg⁻¹; preconcentration factor, 120; sensitivity enhancement factor, 95; sample volume, 60 mL; and recovery, 94.1-102.7%. The validity of the method was tested by analysis of two certified reference materials (CRMs) and by intra-day and inter-day precision studies. Finally, the method was successfully applied to the determination of AAm levels in thermally processed foods using the standard addition method.
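
    The quoted detection and quantification limits follow the common 3S_b/m and 10S_b/m formulas (blank standard deviation over calibration slope). A minimal sketch; the blank SD and slope below are hypothetical values chosen only to reproduce the reported 0.7 and 2.3 µg kg⁻¹:

```python
def detection_limits(s_blank: float, slope: float):
    """IUPAC-style limits from the blank standard deviation (s_blank)
    and calibration slope m: LOD = 3*s_blank/m, LOQ = 10*s_blank/m."""
    return 3 * s_blank / slope, 10 * s_blank / slope

# Hypothetical blank SD and slope; the paper does not report them directly.
lod, loq = detection_limits(s_blank=0.021, slope=0.09)
print(round(lod, 1), round(loq, 1))  # 0.7 2.3
```
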

  18. Plasma heating for containerless and microgravity materials processing

    NASA Technical Reports Server (NTRS)

    Leung, Emily W. (Inventor); Man, Kin F. (Inventor)

    1994-01-01

    A method for plasma heating of levitated samples to be used in containerless microgravity processing is disclosed. A sample is levitated by electrostatic, electromagnetic, aerodynamic, or acoustic systems, as is appropriate for the physical properties of the particular sample. The sample is heated by a plasma torch at atmospheric pressure. A ground plate is provided to help direct the plasma towards the sample. In addition, Helmholtz coils are provided to produce a magnetic field that can be used to spiral the plasma around the sample. The plasma heating system is oriented such that it does not interfere with the levitation system.

  19. Development and validation of a solid phase extraction sample cleanup procedure for the recovery of trace levels of nitro-organic explosives in soil.

    PubMed

    Thomas, Jennifer L; Donnelly, Christopher C; Lloyd, Erin W; Mothershead, Robert F; Miller, Mark L

    2018-03-01

    An improved cleanup method has been developed for the recovery of trace levels of 12 nitro-organic explosives in soil, which is important not only for the forensic community but also has environmental implications. A wide variety of explosives and explosive-related compounds were evaluated, including nitramines, nitrate esters, nitroaromatics, and a nitroalkane. Fortified soil samples were extracted with acetone, processed via solid phase extraction (SPE), and then analyzed by gas chromatography with electron capture detection. Three SPE sorbents in cartridge format were compared: Empore™ SDB-XC, Oasis® HLB, and Bond Elut NEXUS. The NEXUS cartridges provided the best overall recoveries for the 12 explosives in potting soil (average 48%) and the fastest processing times (<30 min); they also rejected matrix components from spent motor oil on potting soil. The SPE method was validated by assessing limit of detection (LOD), processed-sample stability, and interferences. All 12 compounds were detectable at 0.02 µg explosive per gram of soil or lower in the three matrices tested (potting soil, sand, and loam) over three days. Seven explosives were stable up to seven days at 2 µg/g and three were stable at 0.2 µg/g, both in processed loam, which was the most challenging matrix. In the interference study, five interferences above the determined LOD for soil were detected in matrices collected across the United States and in purchased all-purpose sand, potting soil, and loam. This represented a 3.2% false positive rate for the 13 matrices processed by the screening method for interferences. The reported SPE cleanup method provides a fast and simple extraction process for separating organic explosives from matrix components, facilitating sample throughput and reducing instrument maintenance. 
In addition, a comparison study of the validated SPE method versus conventional syringe filtration was completed and highlighted the benefits of sample cleanup for removing matrix interferences, while also providing lower supply cost, order of magnitude lower LODs for most explosives, higher percent recoveries for complex matrices, and fewer instrument maintenance issues. Published by Elsevier B.V.

  20. Microgravity Disturbance Characterization of the Quench Module Insert (QMI) Phase Change Device (PCD)

    NASA Technical Reports Server (NTRS)

    Gattis, Christy; Rodriguez, Pete (Technical Monitor)

    2000-01-01

    The Materials Science Research Facility (MSRF) is a multi-user, multi-purpose facility for materials science research. One experiment within the MSRF will be the Quench Module Insert (QMI), a high-temperature furnace with unique capabilities for processing different classes of materials. The primary functions of the QMI furnace are to melt, directionally solidify, and quench metallic samples, providing data to aid in understanding the effects of the microgravity environment on the characteristics of these processed metals. The QMI houses sealed individual sample ampoules containing material to be processed. Quenching of the samples in the QMI furnace is accomplished by releasing low-melting-point metallic shoes into contact with the outside of the sample ampoule, dissipating heat and cooling the sample inside. The impact from this method of quench will induce sample vibrations which could be large enough to adversely affect sample quality. Utilizing breadboard hardware, the sample quench sequence, releasing the shoes, was conducted. Data was collected from accelerometers located on the breadboard sample cartridge, indicating the maximum acceleration achieved by the sample. The primary objective of the test described in this presentation was to determine the acceleration imparted on the sample by the shoe contact. From this information, the science community can better assess whether this method of quench will allow them to obtain the data they need.

  1. After site selection and before data analysis: sampling, sorting, and laboratory procedures used in stream benthic macroinvertebrate monitoring programs by USA state agencies

    USGS Publications Warehouse

    Carter, James L.; Resh, Vincent H.

    2001-01-01

    A survey of methods used by US state agencies for collecting and processing benthic macroinvertebrate samples from streams was conducted by questionnaire; 90 responses were received and used to describe trends in methods. The responses represented an estimated 13,000-15,000 samples collected and processed per year. Kicknet devices were used in 64.5% of the methods; other sampling devices included fixed-area samplers (Surber and Hess), artificial substrates (Hester-Dendy and rock baskets), grabs, and dipnets. Regional differences existed, e.g., the 1-m kicknet was used more often in the eastern US than in the western US. Mesh sizes varied among programs, but 80.2% of the methods used a mesh size between 500 and 600 µm. Mesh size variations within US Environmental Protection Agency regions were large, with size differences ranging from 100 to 700 µm. Most samples collected were composites; the mean area sampled was 1.7 m². Samples rarely were collected using a random method (4.7%); most samples (70.6%) were collected using "expert opinion", which may make data obtained operator-specific. Only 26.3% of the methods sorted all the organisms from a sample; the remainder subsampled in the laboratory. The most common method of subsampling was to remove 100 organisms (range = 100-550). The magnification used for sorting ranged from 1× (sorting by eye) to 30×, which results in inconsistent separation of macroinvertebrates from detritus. In addition to subsampling, 53% of the methods sorted large/rare organisms from a sample. The taxonomic level used for identifying organisms varied among taxa; Ephemeroptera, Plecoptera, and Trichoptera were generally identified to a finer taxonomic resolution (genus and species) than other taxa. Because there currently exists a large range of field and laboratory methods used by state programs, calibration among all programs to increase data comparability would be exceptionally challenging. 
However, because many techniques are shared among methods, limited testing could be designed to evaluate whether procedural differences affect the ability to determine levels of environmental impairment using benthic macroinvertebrate communities.
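
    The fixed-count subsampling described above (most commonly, removing 100 organisms) can be made operator-independent with a random draw rather than expert picking. A minimal sketch using a hypothetical composite sample:

```python
import random

def fixed_count_subsample(organisms, n=100, seed=1):
    """Random fixed-count subsample (n=100 is the most common target in
    the survey), replacing operator-dependent 'expert opinion' picking."""
    rng = random.Random(seed)
    return rng.sample(organisms, n)

# Hypothetical composite sample: 300 EPT individuals, 700 chironomids.
composite = ['EPT'] * 300 + ['chironomid'] * 700
sub = fixed_count_subsample(composite)
print(len(sub))  # 100
# Subsample proportions track the composite (0.7 chironomid) on average.
assert abs(sub.count('chironomid') / 100 - 0.7) < 0.2
```
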

  2. Quality Evaluation and Chemical Markers Screening of Salvia miltiorrhiza Bge. (Danshen) Based on HPLC Fingerprints and HPLC-MSn Coupled with Chemometrics.

    PubMed

    Liang, Wenyi; Chen, Wenjing; Wu, Lingfang; Li, Shi; Qi, Qi; Cui, Yaping; Liang, Linjin; Ye, Ting; Zhang, Lanzhen

    2017-03-17

    Danshen, the dried root of Salvia miltiorrhiza Bge., is a widely used, commercially available herbal drug, and the unstable quality of different samples is a current issue. This study focused on a comprehensive and systematic method combining fingerprints and chemical identification with chemometrics for discrimination and quality assessment of Danshen samples. Twenty-five samples were analyzed by HPLC-PAD and HPLC-MSⁿ. Forty-nine components were identified, and characteristic fragmentation regularities were summarized for further interpretation of bioactive components. Chemometric analysis, including hierarchical cluster analysis, principal component analysis, and partial least squares discriminant analysis, was employed to differentiate the samples and clarify their quality differences. The consistent result was that the samples divided into three categories reflecting the differences in quality of the Danshen samples. Analysis of the reasons for this classification revealed that the processing method had a more obvious impact on sample classification than the geographical origin: it induced different contents of bioactive compounds and ultimately led to different qualities. Cryptotanshinone, trijuganone B, and 15,16-dihydrotanshinone I were screened out as markers to distinguish samples prepared by different processing methods. The developed strategy could provide a reference for the evaluation and discrimination of other traditional herbal medicines.
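
    The chemometric step, projecting samples into a low-dimensional space where processing-method groups separate, can be sketched with a plain-SVD PCA. The fingerprint data below are synthetic stand-ins, not the paper's measurements:

```python
import numpy as np

def pca_scores(X: np.ndarray) -> np.ndarray:
    """Principal component scores via SVD of the mean-centered matrix.
    Rows of the result are samples, columns are principal components."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U * s

# Synthetic 'chromatographic fingerprints': two processing methods give
# systematically different peak patterns across 50 retention-time bins.
rng = np.random.default_rng(0)
method_a = np.tile(np.r_[np.ones(25), np.zeros(25)], (10, 1)) + rng.normal(0, 0.05, (10, 50))
method_b = np.tile(np.r_[np.zeros(25), np.ones(25)], (10, 1)) + rng.normal(0, 0.05, (10, 50))

scores = pca_scores(np.vstack([method_a, method_b]))
pc1 = scores[:, 0]
# PC1 cleanly separates the two processing-method groups.
print(max(pc1[:10]) < min(pc1[10:]) or min(pc1[:10]) > max(pc1[10:]))  # True
```
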

  3. New Trends in Pesticide Residue Analysis in Cereals, Nutraceuticals, Baby Foods, and Related Processed Consumer Products.

    PubMed

    Raina-Fulton, Renata

    2015-01-01

    Pesticide residue methods have been developed for a wide variety of food products including cereal-based foods, nutraceuticals and related plant products, and baby foods. These cereal, fruit, vegetable, and plant-based products provide the basis for many processed consumer products. For cereal and nutraceuticals, which are dry sample products, a modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) method has been used with additional steps to allow wetting of the dry sample matrix and subsequent cleanup using dispersive or cartridge format SPE to reduce matrix effects. More processed foods may have lower pesticide concentrations but higher co-extracts that can lead to signal suppression or enhancement with MS detection. For complex matrixes, GC/MS/MS or LC/electrospray ionization (positive or negative ion)-MS/MS is more frequently used. The extraction and cleanup methods vary with different sample types particularly for cereal-based products, and these different approaches are discussed in this review. General instrument considerations are also discussed.

  4. Forensic discrimination of copper wire using trace element concentrations.

    PubMed

    Dettman, Joshua R; Cassabaum, Alyssa A; Saunders, Christopher P; Snyder, Deanna L; Buscaglia, JoAnn

    2014-08-19

    Copper may be recovered as evidence in high-profile cases such as thefts and improvised explosive device incidents; comparison of copper samples from the crime scene and those associated with the subject of an investigation can provide probative associative evidence and investigative support. A solution-based inductively coupled plasma mass spectrometry method for measuring trace element concentrations in high-purity copper was developed using standard reference materials. The method was evaluated for its ability to use trace element profiles to statistically discriminate between copper samples considering the precision of the measurement and manufacturing processes. The discriminating power was estimated by comparing samples chosen on the basis of the copper refining and production process to represent the within-source (samples expected to be similar) and between-source (samples expected to be different) variability using multivariate parametric- and empirical-based data simulation models with bootstrap resampling. If the false exclusion rate is set to 5%, >90% of the copper samples can be correctly determined to originate from different sources using a parametric-based model and >87% with an empirical-based approach. These results demonstrate the potential utility of the developed method for the comparison of copper samples encountered as forensic evidence.
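
    The within-/between-source comparison with bootstrap resampling can be sketched as follows: fix a threshold giving a ~5% false-exclusion rate on within-source comparison scores, then estimate the discrimination rate on between-source scores. The score distributions here are illustrative stand-ins, not the paper's trace-element data:

```python
import numpy as np

rng = np.random.default_rng(0)
within = rng.normal(0.0, 1.0, 2000)    # comparison scores, same source
between = rng.normal(5.0, 1.0, 2000)   # comparison scores, different sources

# Threshold set so ~5% of within-source pairs are falsely excluded.
threshold = np.quantile(within, 0.95)
# Fraction of between-source pairs correctly determined to differ.
discrimination = np.mean(between > threshold)

# Bootstrap resampling gives an interval on the discrimination rate.
boot = [np.mean(rng.choice(between, between.size) > threshold)
        for _ in range(200)]
lo, hi = np.quantile(boot, [0.025, 0.975])
print(discrimination > 0.9)  # True for these well-separated distributions
```
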

  5. Model Identification of Integrated ARMA Processes

    ERIC Educational Resources Information Center

    Stadnytska, Tetiana; Braun, Simone; Werner, Joachim

    2008-01-01

    This article evaluates the Smallest Canonical Correlation Method (SCAN) and the Extended Sample Autocorrelation Function (ESACF), automated methods for the Autoregressive Integrated Moving-Average (ARIMA) model selection commonly available in current versions of SAS for Windows, as identification tools for integrated processes. SCAN and ESACF can…
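
    SCAN and ESACF are SAS-specific procedures, but the core issue they address, recognizing an integrated process that must be differenced before ARMA identification, can be illustrated with a minimal numpy sketch (illustrative simulation, not the article's method):

```python
import numpy as np

def lag1_autocorr(x: np.ndarray) -> float:
    """Lag-1 sample autocorrelation of a series."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

# An integrated I(1) process: a random walk. Its lag-1 autocorrelation is
# near 1; after first differencing it behaves like white noise.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=2000))

print(lag1_autocorr(walk) > 0.8)                  # True: strong persistence
print(abs(lag1_autocorr(np.diff(walk))) < 0.15)   # True: differenced series
```
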

  6. A Novel Calibration-Minimum Method for Prediction of Mole Fraction in Non-Ideal Mixture.

    PubMed

    Shibayama, Shojiro; Kaneko, Hiromasa; Funatsu, Kimito

    2017-04-01

    This article proposes a novel concentration prediction model that requires little training data and is useful for rapid process understanding. Process analytical technology is currently popular, especially in the pharmaceutical industry, for enhancement of process understanding and process control. A calibration-free method, iterative optimization technology (IOT), was proposed to predict pure-component concentrations, because calibration methods such as partial least squares require a large number of training samples, leading to high costs. However, IOT cannot be applied to concentration prediction in non-ideal mixtures, because its basic equation is derived from the Beer-Lambert law, which does not hold for non-ideal mixtures. We propose a novel method that predicts pure-component concentrations in mixtures from a small number of training samples, assuming that spectral changes arising from molecular interactions can be expressed as a function of concentration. The proposed method is named IOT with virtual molecular interaction spectra (IOT-VIS) because it takes the spectral change into account as a virtual spectrum x_nonlin,i. Two case studies confirmed that the predictive accuracy of IOT-VIS was the highest among the existing IOT methods.
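
    A minimal sketch of the Beer-Lambert foundation that IOT builds on: an ideal mixture spectrum is a concentration-weighted sum of pure-component spectra, so mole fractions can be recovered by least squares. The nonideal correction term x_nonlin,i of IOT-VIS is omitted here, and the spectra are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)
n_wavelengths, n_components = 100, 3
S = rng.uniform(0.1, 1.0, (n_wavelengths, n_components))  # pure-component spectra
c_true = np.array([0.2, 0.5, 0.3])                        # true mole fractions

# Beer-Lambert (ideal mixture): measured spectrum is a linear combination.
mixture = S @ c_true

# Recover mole fractions by linear least squares.
c_est, *_ = np.linalg.lstsq(S, mixture, rcond=None)
print(np.allclose(c_est, c_true))  # True for an ideal, noise-free mixture
```

    For a non-ideal mixture this linear model breaks down, which is exactly the gap the virtual-spectrum term in IOT-VIS is meant to fill.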

  7. Compressive sensing method for recognizing cat-eye effect targets.

    PubMed

    Li, Li; Li, Hui; Dang, Ersheng; Liu, Bo

    2013-10-01

    This paper proposes a cat-eye effect target recognition method based on compressive sensing (CS), termed sample processing before reconstruction based on compressed sensing (SPCS), for image processing. In this method, linear projections of the original image sequences are used to remove dynamic background distractions and extract cat-eye effect targets. The corresponding imaging mechanism for acquiring active and passive image sequences is also put forward. The method uses fewer images to recognize cat-eye effect targets, reduces data storage, and translates traditional target identification based on original image processing into the processing of measurement vectors. The experimental results show that the SPCS method is feasible and superior to the shape-frequency dual-criteria method.
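
    The key property SPCS relies on is the linearity of CS measurements: because y = Φx, a shared background cancels when measurement vectors are subtracted, so target extraction can happen before any image reconstruction. A hedged sketch with illustrative dimensions and a hypothetical random measurement matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 1024, 64                      # scene pixels, compressive measurements
Phi = rng.normal(size=(m, n))        # random measurement matrix

background = rng.uniform(size=n)     # passive frame (no laser return)
target = np.zeros(n)
target[[100, 500, 900]] = 5.0        # sparse cat-eye glints in the active frame

y_passive = Phi @ background
y_active = Phi @ (background + target)

# The background cancels exactly in the measurement domain, leaving only
# the projection of the target: y_active - y_passive = Phi @ target.
print(np.allclose(y_active - y_passive, Phi @ target))  # True
```
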

  8. Textural Analysis and Substrate Classification in the Nearshore Region of Lake Superior Using High-Resolution Multibeam Bathymetry

    NASA Astrophysics Data System (ADS)

    Dennison, Andrew G.

    Classification of the seafloor substrate can be done with a variety of methods, including visual (dives, drop cameras), mechanical (cores, grab samples), and acoustic (statistical analysis of echosounder returns) approaches. Acoustic methods offer a more powerful and efficient means of collecting useful information about the bottom type: because of the nature of an acoustic survey, larger areas can be sampled, and combining the collected data with visual and mechanical survey methods provides greater confidence in the classification of a mapped region. During a multibeam sonar survey, both bathymetric and backscatter data are collected. It is well documented that the statistical character of a sonar backscatter mosaic depends on bottom type. While classification on the basis of backscatter alone can accurately predict and map bottom type, i.e., distinguish a muddy area from a rocky area, it lacks the ability to resolve and capture fine textural details, an important factor in many habitat-mapping studies. Statistical processing of high-resolution multibeam data can capture the pertinent details about the bottom type that are rich in textural information, and further multivariate statistical processing can then isolate characteristic features and provide the basis for an accurate classification scheme. The development of a new classification method, based on the analysis of textural features in conjunction with ground-truth sampling, is described here. The processing and classification results for two geologically distinct nearshore regions of Lake Superior, off the Lester River, MN, and the Amnicon River, WI, are presented here, using the Minnesota Supercomputer Institute's Mesabi computing cluster for initial processing. Processed data are then calibrated using ground-truth samples to conduct an accuracy assessment of the surveyed areas. 
From analysis of the high-resolution bathymetry data collected at both survey sites, it was possible to calculate a series of measures that describe textural information about the lake floor. Further processing suggests that the calculated features also capture a significant amount of statistical information about the lake-floor terrain. Two sources of error, an anomalous heave and a refraction error, significantly deteriorated the quality of the processed data and the resulting validation results. Ground-truth samples used to validate the classification methods at both survey sites nevertheless yielded accuracy values ranging from 5-30 percent at the Amnicon River and 60-70 percent at the Lester River. The final results suggest that this new processing methodology adequately captures textural information about the lake floor and provides an acceptable classification in the absence of significant data-quality issues.

  9. Advances in spectroscopic methods for quantifying soil carbon

    USGS Publications Warehouse

    Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco; Hively, W. Dean

    2012-01-01

    The current gold standard for soil carbon (C) determination is elemental C analysis using dry combustion. However, this method requires expensive consumables, is limited by the number of samples that can be processed (~100/d), and is restricted to the determination of total carbon. With increased interest in soil C sequestration, faster methods of analysis are needed, and there is growing interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared or mid-infrared spectral ranges. These spectral methods can decrease analytical requirements and speed sample processing, be applied to large landscape areas using remote sensing imagery, and be used to predict multiple analytes simultaneously. However, the methods require localized calibrations to establish the relationship between spectral data and reference analytical data, and also have additional, specific problems. For example, remote sensing is capable of scanning entire watersheds for soil carbon content but is limited to the surface layer of tilled soils and may require difficult and extensive field sampling to obtain proper localized calibration reference values. The objective of this chapter is to discuss the present state of spectroscopic methods for determination of soil carbon.

  10. Microfluidic-Based Robotic Sampling System for Radioactive Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack D. Law; Julia L. Tripp; Tara E. Smith

    A novel microfluidic-based robotic sampling system has been developed for sampling and analysis of liquid solutions in nuclear processes. This system couples a microfluidic sample chip with a robotic system designed to allow remote, automated sampling of process solutions in-cell, and it facilitates direct coupling of the microfluidic sample chip with analytical instrumentation. The system provides the capability for near-real-time analysis, reduces analytical waste, and minimizes the potential for personnel exposure associated with traditional sampling methods. A prototype sampling system was designed, built, and tested. System testing demonstrated operability of the microfluidic-based sample system and identified system modifications to optimize performance.

  11. The distribution of minor constituents in the stratosphere and lower mesosphere

    NASA Technical Reports Server (NTRS)

    Martell, E. A.

    1973-01-01

    The complex circulation processes within the stratosphere and mesosphere have been clarified by recent studies. The distribution of minor constituents in the middle atmosphere is significantly influenced by these transport processes. Rocket sampling results are discussed, giving attention to the sampling method, noble gases, methane, water vapor, molecular hydrogen, and carbon dioxide.

  12. 40 CFR 98.264 - Monitoring and QA/QC requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...-process phosphoric acid process line. You can use existing plant procedures that are used for accounting... the process line. Conduct the representative bulk sampling using the applicable standard method in the...

  13. Microwave Processing for Sample Preparation to Evaluate Mitochondrial Ultrastructural Damage in Hemorrhagic Shock

    NASA Astrophysics Data System (ADS)

    Josephsen, Gary D.; Josephsen, Kelly A.; Beilman, Greg J.; Taylor, Jodie H.; Muiler, Kristine E.

    2005-12-01

    This is a report of the adaptation of microwave processing in the preparation of liver biopsies for transmission electron microscopy (TEM) to examine ultrastructural damage of mitochondria in the setting of metabolic stress. Hemorrhagic shock was induced in pigs via a 35% total blood volume bleed and a 90-min period of shock followed by resuscitation. Hepatic biopsies were collected before shock and after resuscitation. Following collection, biopsies were processed for TEM by a rapid method involving microwave irradiation (Giberson, 2001). Pre- and postshock samples from each of two animals were viewed and scored using the mitochondrial ultrastructure scoring system (Crouser et al., 2002), a system used to quantify the severity of ultrastructural damage during shock. Results showed evidence of increased ultrastructural damage in the postshock samples, which scored 4.00 and 3.42, versus their preshock controls, which scored 1.18 and 1.27. The results of this analysis were similar to those obtained in another model of shock (Crouser et al., 2002). However, the amount of time used to process the samples was significantly shortened with methods involving microwave irradiation.

  14. Adaptive Swarm Balancing Algorithms for rare-event prediction in imbalanced healthcare data

    PubMed Central

    Wong, Raymond K.; Mohammed, Sabah; Fiaidhi, Jinan; Sung, Yunsick

    2017-01-01

    Clinical data analysis and forecasting have made substantial contributions to disease control, prevention and detection. However, such data usually suffer from highly imbalanced class distributions. In this paper, we aim to formulate effective methods to rebalance binary imbalanced datasets, where the positive samples make up only the minority. We investigate two meta-heuristic algorithms, particle swarm optimization and the bat algorithm, and apply them to empower the effects of the synthetic minority over-sampling technique (SMOTE) for pre-processing the datasets. One approach is to process the full dataset as a whole. The other is to split up the dataset and adaptively process it one segment at a time. The experimental results reported in this paper reveal that the performance improvements obtained by the former approach do not scale to larger datasets, whereas the latter methods, which we call Adaptive Swarm Balancing Algorithms, yield significant efficiency and effectiveness improvements on large datasets where the whole-dataset approach fails. We also find the segmented approach more consistent with the practice of handling typical large imbalanced medical datasets. We further use the meta-heuristic algorithms to optimize two key parameters of SMOTE. The proposed methods lead to more credible classifier performance and shorter run times than the brute-force method. PMID:28753613
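    The core interpolation step of SMOTE, which the swarm algorithms above are used to tune, can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the parameters the paper optimizes (such as the amount of over-sampling and the neighborhood size k) are simply fixed here.

```python
import numpy as np

def smote(minority, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating
    between a randomly chosen minority sample and one of its k nearest
    minority neighbors (the core SMOTE step)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(minority))
        x = minority[i]
        # distances from x to every minority sample
        d = np.linalg.norm(minority - x, axis=1)
        neighbors = np.argsort(d)[1:k + 1]   # skip the sample itself
        x_nn = minority[rng.choice(neighbors)]
        gap = rng.random()                   # interpolation factor in [0, 1)
        synthetic.append(x + gap * (x_nn - x))
    return np.array(synthetic)

minority = np.random.default_rng(1).normal(size=(20, 3))
new_samples = smote(minority, n_new=40)
print(new_samples.shape)  # (40, 3)
```

Each synthetic point lies on the segment between two real minority samples, so the new class region never extends beyond the existing minority examples.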

  15. VERIFICATION OF THE DEFENSE WASTE PROCESSING FACILITY PROCESS DIGESTION METHOD FOR THE SLUDGE BATCH 6 QUALIFICATION SAMPLE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Click, D.; Jones, M.; Edwards, T.

    2010-06-09

    For each sludge batch that is processed in the Defense Waste Processing Facility (DWPF), the Savannah River National Laboratory (SRNL) confirms applicability of the digestion method to be used by the DWPF lab for elemental analysis of Sludge Receipt and Adjustment Tank (SRAT) receipt samples and SRAT product process control samples. DWPF SRAT samples are typically dissolved using a room-temperature HF-HNO3 acid dissolution (i.e., the DWPF Cold Chem (CC) Method, see DWPF Procedure SW4-15.201) and then analyzed by inductively coupled plasma - atomic emission spectroscopy (ICP-AES). In addition to the CC method confirmation, the DWPF lab's mercury (Hg) digestion method was also evaluated for applicability to SB6 (see DWPF procedure 'Mercury System Operating Manual', Manual: SW4-15.204, Section 6.1, Revision 5, Effective date: 12-04-03). This report contains the results and comparison of data generated from performing the Aqua Regia (AR), Sodium Peroxide/Hydroxide Fusion (PF) and DWPF Cold Chem (CC) method digestion of Sludge Batch 6 (SB6) SRAT Receipt and SB6 SRAT Product samples. For validation of the DWPF lab's Hg method, only SRAT receipt material was used and compared to AR digestion results. The SB6 SRAT Receipt and SB6 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB6 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 5 (SB5), to form the SB6 Blend composition. In addition to the 16 elements currently measured by the DWPF, this report includes Hg and thorium (Th) data (Th comprising ~2.5 and ~3 wt % of the total solids in SRAT Receipt and SRAT Product, respectively) and provides specific details of the ICP-AES analysis of Th.
Thorium was found to interfere with the U 367.007 nm emission line, and an inter-element correction (IEC) had to be applied to the U data, which is also discussed. The results for any one particular element should not be used in any way to identify the form or speciation of a particular element without support from XRD analysis, or used to estimate ratios of compounds in the sludge.

  16. Light assisted drying (LAD) for protein stabilization: optimization of laser processing parameters

    NASA Astrophysics Data System (ADS)

    Young, Madison A.; Antczak, Andrew T.; Elliott, Gloria D.; Trammell, Susan R.

    2017-02-01

    In this study, a novel light-based processing method to create an amorphous trehalose matrix for the stabilization of proteins is discussed. Near-IR radiation is used to remove water from samples, leaving behind an amorphous solid with embedded protein. This method has potential applications in the stabilization of protein-based therapeutics and diagnostics that are becoming widely used in the treatment and diagnosis of a variety of diseases. Freeze-drying or freezing are currently the standard for the preservation of proteins, but these methods are expensive and can be challenging in some environments due to a lack of available infrastructure. Light-assisted drying offers a relatively inexpensive method for drying samples. Proteins suspended in a trehalose solution are dehydrated using near-infrared laser light. The laser radiation speeds drying and, as water is removed, the sugar forms a protective matrix. The goal of this study is to determine processing parameters that result in fast processing times and low end moisture contents (EMC), while maintaining the functionality of embedded proteins. We compare the effect of changing processing wavelength, power and resulting sample temperature, and substrate material on the EMC for two NIR laser sources (1064 nm and 1850 nm). The 1850 nm laser resulted in the lowest EMC (0.1836 ± 0.09 g H2O/g dry weight) after 10 minutes of processing on borosilicate glass microfiber paper. This suggests a storage temperature of 3°C.

  17. A modified method for COD determination of solid waste, using a commercial COD kit and an adapted disposable weighing support.

    PubMed

    André, L; Pauss, A; Ribeiro, T

    2017-03-01

    The chemical oxygen demand (COD) is an essential parameter in waste management, particularly when monitoring wet anaerobic digestion processes. An adapted method to determine COD was developed for solid waste (total solids >15%). This method used commercial COD tubes and did not require sample dilution. A homemade plastic weighing support was used to transfer the solid sample into the COD tubes. Potassium hydrogen phthalate and glucose used as standards showed excellent repeatability. A small underestimation of the theoretical COD value (standard values around 5% lower than theoretical values) was also observed, mainly due to the intrinsic COD of the weighing support and to measurement uncertainties. The adapted COD method was tested using various solid wastes in the range of 1-8 mg COD, determining the COD of dried and ground cellulose, cattle manure, straw and a mixed-substrate sample. This new adapted method could be used to monitor and design dry anaerobic digestion processes.
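    The slight underestimation reported for the standards can be judged against their theoretical oxygen demand, which follows directly from oxidation stoichiometry. A sketch of that calculation (standard textbook reactions, not taken from the paper):

```python
# Theoretical oxygen demand (ThOD) of the two COD standards used in the
# paper, from oxidation stoichiometry.  These values assume complete
# oxidation; measured COD runs slightly lower, as the authors observed.

M_O2 = 32.00  # g/mol

def thod(molar_mass, moles_o2):
    """g of O2 required per g of substrate for complete oxidation."""
    return moles_o2 * M_O2 / molar_mass

# Glucose:  C6H12O6 + 6 O2 -> 6 CO2 + 6 H2O
glucose_thod = thod(180.16, 6)          # ~1.07 g O2 / g

# KHP:  2 KC8H5O4 + 15 O2 -> 16 CO2 + 4 H2O + 2 KOH
khp_thod = thod(204.22, 7.5)            # ~1.18 g O2 / g

print(f"glucose ThOD: {glucose_thod:.3f} g O2/g")
print(f"KHP ThOD:     {khp_thod:.3f} g O2/g")
```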

  18. Varying face occlusion detection and iterative recovery for face recognition

    NASA Astrophysics Data System (ADS)

    Wang, Meng; Hu, Zhengping; Sun, Zhe; Zhao, Shuhuan; Sun, Mei

    2017-05-01

    In most sparse representation methods for face recognition (FR), occlusion is handled by removing the occluded parts of both query samples and training samples before performing the recognition process. This practice ignores the global features of the facial image and may lead to unsatisfactory results due to the limitations of local features. Considering this drawback, we propose a method called varying occlusion detection and iterative recovery for FR. The main contributions of our method are as follows: (1) to detect an accurate occlusion area of facial images, a combination of image processing and intersection-based clustering is used for occlusion FR; (2) according to an accurate occlusion map, the new integrated facial images are recovered iteratively and put into a recognition process; and (3) the effectiveness of our method on recognition accuracy is verified by comparing it with three typical occlusion map detection methods. Experiments show that the proposed method has highly accurate detection and recovery performance and that it outperforms several similar state-of-the-art methods under partial contiguous occlusion.

  19. 14C sample preparation for AMS microdosing studies at Lund University using online combustion and septa-sealed vials

    NASA Astrophysics Data System (ADS)

    Sydoff, Marie; Stenström, Kristina

    2010-04-01

    The Department of Physics at Lund University is participating in a European Union project called EUMAPP (European Union Microdose AMS Partnership Programme), in which sample preparation and accelerator mass spectrometry (AMS) measurements of biological samples from microdosing studies have been made. This paper describes a simplified method of converting biological samples to solid graphite for 14C analysis with AMS. The method is based on online combustion of the samples, and reduction of CO 2 in septa-sealed vials. The septa-sealed vials and disposable materials are used to eliminate sample cross-contamination. Measurements of ANU and Ox I standards show deviations of 2% and 3%, respectively, relative to reference values. This level of accuracy is sufficient for biological samples from microdosing studies. Since the method has very few handling steps from sample to graphite, the risk of failure during the sample preparation process is minimized, making the method easy to use in routine preparation of samples.

  20. [Influence of different processing methods and mature stages on 3,29-dibenzoyl rarounitriol of Trichosanthes kirilowii seeds].

    PubMed

    Liu, Jin-Na; Xie, Xiao-Liang; Yang, Tai-Xin; Zhang, Cun-Li; Jia, Dong-Sheng; Liu, Ming; Wen, Chun-Xiu

    2014-04-01

    To study the effect of mature stage and processing method on the quality of Trichosanthes kirilowii seeds, the content of 3,29-dibenzoyl rarounitriol in Trichosanthes kirilowii seeds was determined by HPLC. Samples at different mature stages (immature, near mature, and fully mature) and processed by different methods were studied. Fully mature Trichosanthes kirilowii seeds were better than immature ones, and the best processing method was drying at 60 °C, at which the content of 3,29-dibenzoyl rarounitriol reached 131.63 μg/mL. Both processing method and mature stage had a significant influence on the quality of Trichosanthes kirilowii seeds.

  1. Macroinvertebrate community sample collection methods and data collected from Sand Creek and Medano Creek, Great Sand Dunes National Park and Preserve, Colorado, 2005–07

    USGS Publications Warehouse

    Ford, Morgan A.; Zuellig, Robert E.; Walters, David M.; Bruce, James F.

    2016-08-11

    This report provides a table of site descriptions, sample information, and semiquantitative aquatic macroinvertebrate data from 105 samples collected between 2005 and 2007 from 7 stream sites within the Sand Creek and Medano Creek watersheds in Great Sand Dunes National Park and Preserve, Saguache County, Colorado. Additionally, a short description of sample collection methods and laboratory sample processing procedures is presented. These data were collected in anticipation of assessing the potential effects of fish toxicants on macroinvertebrates.

  2. Irregular analytical errors in diagnostic testing - a novel concept.

    PubMed

    Vogeser, Michael; Seger, Christoph

    2018-02-23

    In laboratory medicine, routine periodic analyses for internal and external quality control measurements interpreted by statistical methods are mandatory for batch clearance. Data analysis of these process-oriented measurements allows for insight into random analytical variation and systematic calibration bias over time. However, in such a setting, any individual sample is not under individual quality control; the quality control measurements act only at the batch level. Many effects and interferences associated with an individual diagnostic sample can compromise any analyte, and a quality-control-sample-based approach to quality assurance is inherently insensitive to such errors. To address the potential causes and nature of such analytical interference in individual samples more systematically, we suggest the introduction of a new term, the irregular (individual) analytical error. Practically, this term can be applied in any analytical assay that is traceable to a reference measurement system. For an individual sample, an irregular analytical error is defined as an inaccuracy (i.e., a deviation from a reference measurement procedure result) of a test result that is so large it cannot be explained by the measurement uncertainty of the routine assay operating within the accepted limitations of the associated process quality control measurements. The deviation can be defined as the linear combination of the process measurement uncertainty and the method bias for the reference measurement system. Such errors should be termed irregular analytical errors of the individual sample. The measurement result is compromised either by an irregular effect associated with the individual composition (matrix) of the sample or by an individual, single-sample-associated processing error in the analytical process.
Currently, the availability of reference measurement procedures is still highly limited, but LC-isotope-dilution mass spectrometry methods are increasingly used for pre-market validation of routine diagnostic assays (these tests also involve substantial sets of clinical validation samples). Based on this definition/terminology, we list recognized causes of irregular analytical error as a risk catalog for clinical chemistry in this article. These issues include reproducible individual analytical errors (e.g. caused by anti-reagent antibodies) and non-reproducible, sporadic errors (e.g. errors due to incorrect pipetting volume due to air bubbles in a sample), which can both lead to inaccurate results and risks for patients.

  3. Finite-key analysis for quantum key distribution with weak coherent pulses based on Bernoulli sampling

    NASA Astrophysics Data System (ADS)

    Kawakami, Shun; Sasaki, Toshihiko; Koashi, Masato

    2017-07-01

    An essential step in quantum key distribution is the estimation of parameters related to the leaked amount of information, which is usually done by sampling of the communication data. When the data size is finite, the final key rate depends on how the estimation process handles statistical fluctuations. Many of the present security analyses are based on the method with simple random sampling, where hypergeometric distribution or its known bounds are used for the estimation. Here we propose a concise method based on Bernoulli sampling, which is related to binomial distribution. Our method is suitable for the Bennett-Brassard 1984 (BB84) protocol with weak coherent pulses [C. H. Bennett and G. Brassard, Proceedings of the IEEE Conference on Computers, Systems and Signal Processing (IEEE, New York, 1984), Vol. 175], reducing the number of estimated parameters to achieve a higher key generation rate compared to the method with simple random sampling. We also apply the method to prove the security of the differential-quadrature-phase-shift (DQPS) protocol in the finite-key regime. The result indicates that the advantage of the DQPS protocol over the phase-encoding BB84 protocol in terms of the key rate, which was previously confirmed in the asymptotic regime, persists in the finite-key regime.
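    The practical difference between the sampling models comes down to which tail bound is inverted when converting observed error counts into a worst-case parameter estimate. As a hedged illustration of that general step (not the paper's security analysis), the following computes an exact binomial upper confidence bound on an error rate from a sampled fraction of the data; the counts and failure probability are arbitrary:

```python
from math import comb

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def upper_bound(k, n, eps=1e-10, tol=1e-6):
    """Smallest p (to within tol) with P(X <= k) <= eps: a
    Clopper-Pearson-style upper confidence bound on the true error
    rate, given k errors observed in n Bernoulli-sampled bits."""
    lo, hi = k / n, 1.0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if binom_cdf(k, n, mid) > eps:
            lo = mid
        else:
            hi = mid
    return hi

# 50 errors in 5000 sampled bits: the observed rate is 1%, but the
# rate that must be assumed in a finite-statistics analysis is larger.
ub = upper_bound(50, 5000)
print(f"observed 1.00%, worst-case bound {100 * ub:.2f}%")
```

The gap between the observed rate and the bound is exactly the finite-size penalty that shrinks the final key rate, which is why tighter estimation methods matter.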

  4. UHPLC-TQ-MS Coupled with Multivariate Statistical Analysis to Characterize Nucleosides, Nucleobases and Amino Acids in Angelicae Sinensis Radix Obtained by Different Drying Methods.

    PubMed

    Zhu, Shaoqing; Guo, Sheng; Duan, Jin-Ao; Qian, Dawei; Yan, Hui; Sha, Xiuxiu; Zhu, Zhenhua

    2017-06-01

    To explore the nutrients in roots of Angelica sinensis (Angelicae Sinensis Radix, ASR), a medicinal and edible plant, and evaluate its nutritional value, a rapid and reliable UHPLC-TQ-MS method was established and used to determine the potential nutritional compounds, including nucleosides, nucleobases and amino acids, in 50 batches of ASR samples obtained using two drying methods. The results showed that ASR is a healthy food rich in nucleosides, nucleobases and amino acids, especially arginine. The total average content of nucleosides and nucleobases in all ASR samples was 3.94 mg/g, while that of amino acids reached as high as 61.79 mg/g. Principal component analysis showed that chemical profile differences exist between the two groups of ASR samples prepared using different drying methods, and the contents of nutritional compounds in samples dried with the tempering-intermittent drying processing method (TIDM) were generally higher than those dried using the traditional solar processing method. The above results suggest that ASR should be considered an ideal healthy food and TIDM could be a suitable drying method for ASR when taking nucleosides, nucleobases and amino acids as the major consideration for their known human health benefits.

  5. Method development in high-performance liquid chromatography for high-throughput profiling and metabonomic studies of biofluid samples.

    PubMed

    Pham-Tuan, Hai; Kaskavelis, Lefteris; Daykin, Clare A; Janssen, Hans-Gerd

    2003-06-15

    "Metabonomics" has in the past decade demonstrated enormous potential in furthering the understanding of, for example, disease processes, toxicological mechanisms, and biomarker discovery. The same principles can also provide a systematic and comprehensive approach to the study of food ingredient impact on consumer health. However, "metabonomic" methodology requires the development of rapid, advanced analytical tools to comprehensively profile biofluid metabolites within consumers. Until now, NMR spectroscopy has been used for this purpose almost exclusively. Chromatographic techniques and in particular HPLC, have not been exploited accordingly. The main drawbacks of chromatography are the long analysis time, instabilities in the sample fingerprint and the rigorous sample preparation required. This contribution addresses these problems in the quest to develop generic methods for high-throughput profiling using HPLC. After a careful optimization process, stable fingerprints of biofluid samples can be obtained using standard HPLC equipment. A method using a short monolithic column and a rapid gradient with a high flow-rate has been developed that allowed rapid and detailed profiling of larger numbers of urine samples. The method can be easily translated into a slow, shallow-gradient high-resolution method for identification of interesting peaks by LC-MS/NMR. A similar approach has been applied for cell culture media samples. Due to the much higher protein content of such samples non-porous polymer-based small particle columns yielded the best results. The study clearly shows that HPLC can be used in metabonomic fingerprinting studies.

  6. Frequency Domain Analysis of Sensor Data for Event Classification in Real-Time Robot Assisted Deburring

    PubMed Central

    Pappachan, Bobby K; Caesarendra, Wahyu; Tjahjowidodo, Tegoeh; Wijaya, Tomi

    2017-01-01

    Process monitoring using indirect methods relies on the usage of sensors. Using sensors to acquire vital process-related information also presents the problem of big-data management and analysis. Due to uncertainty in the frequency of events occurring, a higher sampling rate is often used in real-time monitoring applications to increase the chances of capturing and understanding all possible events related to the process. Advanced signal processing methods are used to further decipher meaningful information from the acquired data. In this research work, the power spectral density (PSD) of sensor data acquired at sampling rates between 40-51.2 kHz was calculated, and the correlation between PSD and the completed number of cycles/passes is presented. Here, the progress in the number of cycles/passes is the event this research work intends to classify, and the algorithm used to compute the PSD is Welch's estimate method. A comparison between Welch's estimate method and statistical methods is also discussed. A clear correlation was observed using Welch's estimate to classify the number of cycles/passes. The paper also succeeds in distinguishing the vibration signal generated by the spindle from the vibration signal acquired during the finishing process. PMID:28556809
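    Welch's estimate, the PSD algorithm named above, averages periodograms of overlapping windowed segments, trading frequency resolution for variance reduction. A self-contained numpy sketch (the signal, sampling rate, and segment length are illustrative, not the paper's):

```python
import numpy as np

def welch_psd(x, fs, nperseg=256):
    """Welch's PSD estimate: split the signal into 50%-overlapping,
    Hann-windowed segments, take the periodogram of each segment, and
    average them."""
    step = nperseg // 2
    window = np.hanning(nperseg)
    scale = fs * (window**2).sum()
    psds = []
    for start in range(0, len(x) - nperseg + 1, step):
        seg = x[start:start + nperseg] * window
        spec = np.fft.rfft(seg)
        psds.append(np.abs(spec)**2 / scale)
    freqs = np.fft.rfftfreq(nperseg, d=1 / fs)
    return freqs, np.mean(psds, axis=0)

# Hypothetical sensor trace: a 5 kHz tone in noise, sampled at 40 kHz
# (the low end of the sampling-rate range quoted in the abstract).
fs = 40_000
t = np.arange(2 * fs) / fs
x = np.sin(2 * np.pi * 5_000 * t) \
    + 0.5 * np.random.default_rng(0).normal(size=t.size)
freqs, psd = welch_psd(x, fs)
print(f"dominant frequency: {freqs[np.argmax(psd)]:.0f} Hz")
```

The averaging across segments is what makes the spectral peaks stable enough to correlate with process events.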

  7. Sampling Strategies and Processing of Biobank Tissue Samples from Porcine Biomedical Models.

    PubMed

    Blutke, Andreas; Wanke, Rüdiger

    2018-03-06

    In translational medical research, porcine models have steadily become more popular. Considering the high value of individual animals, particularly of genetically modified pig models, and the often-limited number of available animals of these models, establishment of (biobank) collections of adequately processed tissue samples suited for a broad spectrum of subsequent analysis methods, including analyses not specified at the time point of sampling, represents a meaningful approach to take full advantage of the translational value of the model. With respect to the peculiarities of porcine anatomy, comprehensive guidelines have recently been established for the standardized generation of representative, high-quality samples from different porcine organs and tissues. These guidelines are essential prerequisites for the reproducibility of results and their comparability between different studies and investigators. The recording of basic data, such as organ weights and volumes, the determination of sampling locations and of the numbers of tissue samples to be generated, as well as their orientation, size, processing and trimming directions, are relevant factors determining the generalizability and usability of the specimen for molecular, qualitative, and quantitative morphological analyses. Here, an illustrative, practical, step-by-step demonstration of the most important techniques for generation of representative, multi-purpose biobank specimen from porcine tissues is presented.
The methods described here include determination of organ/tissue volumes and densities, the application of a volume-weighted systematic random sampling procedure for parenchymal organs by point-counting, determination of the extent of tissue shrinkage related to histological embedding of samples, and generation of randomly oriented samples for quantitative stereological analyses, such as isotropic uniform random (IUR) sections generated by the "Orientator" and "Isector" methods, and vertical uniform random (VUR) sections.
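    The point-counting step rests on a basic stereological identity: the expected fraction of systematically placed test points hitting a structure equals the structure's volume fraction. A toy sketch on a synthetic volume (the geometry and grid spacing are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "organ": a 100^3 voxel volume containing a centered sphere
# (the tissue compartment of interest).
n = 100
z, y, x = np.mgrid[:n, :n, :n]
sphere = (x - 50)**2 + (y - 50)**2 + (z - 50)**2 < 30**2

# Systematic point grid: one test point every 10 voxels along each
# axis, with a random offset (systematic uniform random sampling).
off = rng.integers(0, 10, size=3)
grid = sphere[off[0]::10, off[1]::10, off[2]::10]

# Point-count estimate of the volume fraction vs. the exact voxel count.
estimate = grid.mean()
truth = sphere.mean()
print(f"point-count estimate: {estimate:.3f}, true fraction: {truth:.3f}")
```

Only 1,000 test points are counted, yet the estimate tracks the exhaustive voxel count closely, which is why point counting is practical on histological sections.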

  8. Development of a new protocol for rapid bacterial identification and susceptibility testing directly from urine samples.

    PubMed

    Zboromyrska, Y; Rubio, E; Alejo, I; Vergara, A; Mons, A; Campo, I; Bosch, J; Marco, F; Vila, J

    2016-06-01

    The current gold standard method for the diagnosis of urinary tract infections (UTI) is urine culture, which requires 18-48 h for the identification of the causative microorganisms and an additional 24 h until the results of antimicrobial susceptibility testing (AST) are available. The aim of this study was to shorten the time of urine sample processing by a combination of flow cytometry for screening and matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) for bacterial identification, followed by AST directly from urine. The study was divided into two parts. During the first part, 675 urine samples were processed by a flow cytometry device and a cut-off value of bacterial count for selecting samples for direct identification by MALDI-TOF-MS was set at ≥5 × 10^6 bacteria/mL. During the second part, 163 of 1029 processed samples reached the cut-off value. The sample preparation protocol for direct identification included two centrifugation and two washing steps. Direct AST was performed by the disc diffusion method if a reliable direct identification was obtained. Direct MALDI-TOF-MS identification was performed on 140 urine samples; 125 of the samples were positive by urine culture, 12 were contaminated and 3 were negative. Reliable direct identification was obtained in 108 (86.4%) of the 125 positive samples. AST was performed on 102 identified samples, and the results were fully concordant with the routine method among the 83 monomicrobial infections. In conclusion, the turnaround time of the protocol described to diagnose UTI was about 1 h for microbial identification and 18-24 h for AST.

  9. Accuracy or precision: Implications of sample design and methodology on abundance estimation

    USGS Publications Warehouse

    Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.

    2015-01-01

    Sampling by spatially replicated counts (point counts) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either a few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample-unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than scenarios with few sample units of large area. However, scenarios with few sample units of large area provided more precise abundance estimates than those with many sample units of small area. It is important to consider the accuracy and precision of abundance estimates during the sample design process, with study goals and objectives fully recognized; in practice, however, and with consequence, they are often an afterthought addressed only during data analysis.
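    The simulation machinery behind such comparisons is straightforward to sketch: scatter organisms, place sample units, count, and scale up. The toy version below uses uniformly distributed organisms and perfect detection, under which both designs are unbiased and roughly equally precise; the accuracy/precision trade-off reported above emerges only once organisms are clustered and detection is imperfect (the N-mixture setting), which this sketch deliberately does not model.

```python
import numpy as np

rng = np.random.default_rng(42)

AREA = 100.0          # side length of the square study area
N_ORGANISMS = 2000    # true abundance

def abundance_estimates(n_units, unit_side, n_reps=500):
    """Estimated total abundance from quadrat counts, repeated n_reps
    times: place n_units square sample units at random, count the
    organisms inside each, and scale the mean count by the area ratio."""
    estimates = []
    for _ in range(n_reps):
        pts = rng.uniform(0, AREA, size=(N_ORGANISMS, 2))
        corners = rng.uniform(0, AREA - unit_side, size=(n_units, 2))
        counts = [
            np.sum(np.all((pts >= c) & (pts < c + unit_side), axis=1))
            for c in corners
        ]
        estimates.append(np.mean(counts) * AREA**2 / unit_side**2)
    return np.array(estimates)

# Same total sampled area (100 area units): many small vs. few large.
many_small = abundance_estimates(n_units=25, unit_side=2.0)
few_large = abundance_estimates(n_units=1, unit_side=10.0)
print(f"many small: mean {many_small.mean():.0f}, sd {many_small.std():.0f}")
print(f"few large:  mean {few_large.mean():.0f}, sd {few_large.std():.0f}")
```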

  10. Laser excited confocal microscope fluorescence scanner and method

    DOEpatents

    Mathies, Richard A.; Peck, Konan

    1992-01-01

    A fluorescent scanner for scanning the fluorescence from a fluorescence labeled separated sample on a sample carrier including a confocal microscope for illuminating a predetermined volume of the sample carrier and/or receiving and processing fluorescence emissions from said volume to provide a display of the separated sample.

  11. EVALUATION OF ARG-1 SAMPLES PREPARED BY CESIUM CARBONATE DISSOLUTION DURING THE ISOLOK SME ACCEPTABILITY TESTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, T.; Hera, K.; Coleman, C.

    2011-12-05

    Evaluation of Defense Waste Processing Facility (DWPF) Chemical Process Cell (CPC) cycle time identified several opportunities to improve the CPC processing time. The Mechanical Systems & Custom Equipment Development (MS&CED) Section of the Savannah River National Laboratory (SRNL) recently completed the evaluation of one of these opportunities - the possibility of using an Isolok sampling valve as an alternative to the Hydragard valve for taking DWPF process samples at the Slurry Mix Evaporator (SME). The use of an Isolok for SME sampling has the potential to improve operability, reduce maintenance time, and decrease CPC cycle time. The SME acceptability testing for the Isolok was requested in Task Technical Request (TTR) HLW-DWPF-TTR-2010-0036 and was conducted as outlined in Task Technical and Quality Assurance Plan (TTQAP) SRNLRP-2011-00145. RW-0333P QA requirements applied to the task, and the results from the investigation were documented in SRNL-STI-2011-00693. Measurement of the chemical composition of study samples was a critical component of the SME acceptability testing of the Isolok. A sampling and analytical plan supported the investigation, with the analytical plan directing that the study samples be prepared by a cesium carbonate (Cs2CO3) fusion dissolution method and analyzed by Inductively Coupled Plasma - Optical Emission Spectroscopy (ICP-OES). The use of the cesium carbonate preparation method for the Isolok testing provided an opportunity for an additional assessment of this dissolution method, which is being investigated as a potential replacement for the two methods (i.e., sodium peroxide fusion and mixed acid dissolution) that have been used at the DWPF for the analysis of SME samples. Earlier testing of the Cs2CO3 method yielded promising results, which led to a TTR from Savannah River Remediation, LLC (SRR) to SRNL for additional support and an associated TTQAP to direct the SRNL efforts.
A technical report resulting from this work was issued that recommended the mixed acid method be replaced by the Cs2CO3 method for the measurement of magnesium (Mg), sodium (Na), and zirconium (Zr), with additional testing of the method by the DWPF Laboratory being needed before further implementation of the Cs2CO3 method at that laboratory. While the SME acceptability testing of the Isolok does not address any of the open issues remaining after the publication of the recommendation for the replacement of the mixed acid method by the Cs2CO3 method (since those issues are to be addressed by the DWPF Laboratory), the Cs2CO3 testing associated with the Isolok testing does provide additional insight into the performance of the method as conducted by SRNL. The performance is investigated by examining the composition measurement data generated by samples of a standard glass, Analytical Reference Glass - 1 (ARG-1), that were prepared by the Cs2CO3 method and included in the SME acceptability testing of the Isolok. The measurements of these samples were presented as part of the study results, but no statistical analysis of these measurements was conducted as part of those results. It is the purpose of this report to provide that analysis, which was supported using JMP Version 7.0.2.

  12. Method for using polarization gating to measure a scattering sample

    DOEpatents

    Baba, Justin S.

    2015-08-04

    Described herein are systems, devices, and methods facilitating optical characterization of scattering samples. A polarized optical beam can be directed to pass through a sample to be tested. The optical beam exiting the sample can then be analyzed to determine its degree of polarization, from which other properties of the sample can be determined. In some cases, an apparatus can include a source of an optical beam, an input polarizer, a sample, an output polarizer, and a photodetector. In some cases, a signal from a photodetector can be processed through attenuation, variable offset, and variable gain.

  13. Quick counting method for estimating the number of viable microbes on food and food processing equipment.

    PubMed

    Winter, F H; York, G K; el-Nakhal, H

    1971-07-01

    A rapid method for estimating the extent of microbial contamination on food and on food processing equipment is described. Microbial cells are rinsed from food or swab samples with sterile diluent and concentrated on the surface of membrane filters. The filters are incubated on a suitable bacteriological medium for 4 hr at 30 C, heated at 105 C for 5 min, and stained. The membranes are then dried at 60 C for 15 min, rendered transparent with immersion oil, and examined microscopically. Data obtained by the rapid method were compared with counts of the same samples determined by the standard plate count method. Over 60 comparisons resulted in a correlation coefficient of 0.906. Because the rapid technique can provide reliable microbiological count information in extremely short times, it can be a most useful tool in the routine evaluation of microbial contamination of food processing facilities and for some foods.
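The validation step above summarizes agreement between the rapid membrane-filter counts and the standard plate counts as a correlation coefficient (0.906 over more than 60 paired samples). A minimal sketch of how such a validation statistic is computed from paired counts; the data values below are hypothetical illustrations, not the study's measurements.

```python
def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical paired log10 counts: rapid method vs. standard plate count
rapid = [2.1, 3.4, 4.0, 2.8, 5.1, 3.9]
plate = [2.3, 3.2, 4.2, 2.9, 4.8, 4.1]
r = pearson_r(rapid, plate)
```

A high r (near 1) indicates the rapid counts track the reference plate counts closely enough for routine screening use.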

  14. [Determination of Alternaria mycotoxins in selected raw and processed fruit and vegetable products].

    PubMed

    Giryn, H; Szteke, B

    1995-01-01

    The purpose of the study was the assessment of Alternaria mycotoxin contamination in some raw and processed plant products. The analytical method for detection of alternariol (AOH) and alternariol methyl ether (AME) is described. After extraction and purification of sample crude extracts by column chromatography on silica gel, qualitative and quantitative analyses were carried out by two-dimensional TLC. In total, 110 samples were analyzed: 44 raw plant samples (including 32 moulded) and 56 processed plant products. Levels of Alternaria mycotoxins found in visibly decayed fruits and tomatoes ranged from 3 to 420 micrograms/kg for AOH and from 10 to 100 micrograms/kg for AME. Trace amounts of AOH were detected in 3 samples of processed products.

  15. A review of empirical research related to the use of small quantitative samples in clinical outcome scale development.

    PubMed

    Houts, Carrie R; Edwards, Michael C; Wirth, R J; Deal, Linda S

    2016-11-01

    There has been a notable increase in the advocacy of using small-sample designs as an initial quantitative assessment of item and scale performance during the scale development process. This is particularly true in the development of clinical outcome assessments (COAs), where Rasch analysis has been advanced as an appropriate statistical tool for evaluating the developing COAs using a small sample. We review the benefits such methods are purported to offer from both a practical and statistical standpoint and detail several problematic areas, including both practical and statistical theory concerns, with respect to the use of quantitative methods, including Rasch-consistent methods, with small samples. The feasibility of obtaining accurate information and the potential negative impacts of misusing large-sample statistical methods with small samples during COA development are discussed.

  16. Materials and Methods for Streamlined Laboratory Analysis of Environmental Samples, FY 2016 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Addleman, Raymond S.; Naes, Benjamin E.; McNamara, Bruce K.

    The International Atomic Energy Agency (IAEA) relies upon laboratory analysis of environmental samples (typically referred to as “swipes”) collected during on-site inspections of safeguarded facilities to support the detection and deterrence of undeclared activities. Unfortunately, chemical processing and assay of the samples is slow and expensive. A rapid, effective, and simple extraction process and analysis method is needed to provide certified results with improved timeliness at reduced costs (principally in the form of reduced labor), while maintaining or improving sensitivity and efficacy. To address these safeguards needs, the Pacific Northwest National Laboratory (PNNL) explored and demonstrated improved methods for environmental sample (ES) analysis. Improvements for both bulk and particle analysis were explored. To facilitate continuity and adoption, the new sampling materials and processing methods will be compatible with existing IAEA protocols for ES analysis. PNNL collaborated with Oak Ridge National Laboratory (ORNL), which performed independent validation of the new bulk analysis methods and compared performance to the traditional IAEA Network of Analytical Laboratories (NWAL) protocol. ORNL efforts are reported separately. This report describes PNNL’s FY 2016 progress, which was focused on analytical applications supporting environmental monitoring of uranium enrichment plants and nuclear fuel processing. In the future, the technology could be applied to other safeguards applications and analytes related to fuel manufacturing, reprocessing, etc. PNNL’s FY 2016 efforts were broken into two tasks; a summary of progress, accomplishments, and highlights is provided below. Principal progress and accomplishments on Task 1, Optimize Materials and Methods for ICP-MS Environmental Sample Analysis, are listed below.
• Completed an initial procedure for rapid uranium extraction from ES swipes based upon carbonate-peroxide chemistry (delivered to ORNL for evaluation).
• Explored improvements to the carbonate-peroxide rapid uranium extraction chemistry.
• Evaluated new sampling materials and methods (in collaboration with ORNL).
• Demonstrated successful ES extractions from standard and novel swipes for a wide range of uranium compounds of interest, including UO2F2, UO2(NO3)2, U3O8, and uranium ore concentrate.
• Completed initial discussions with commercial suppliers of PTFE swipe materials.
• Submitted one manuscript for publication. Two additional drafts are being prepared.
Principal progress and accomplishments on Task 2, Optimize Materials and Methods for Direct SIMS Environmental Sample Analysis, are listed below.
• Designed a SIMS swipe sample holder that retrofits into existing equipment and provides simple, effective, and rapid mounting of ES samples for direct assay while enabling automation and laboratory integration.
• Identified preferred conductive sampling materials with better performance characteristics.
• Ran samples on the new PNNL NWAL-equivalent Cameca 1280 SIMS system.
• Obtained excellent agreement between isotopic ratios for certified materials and direct SIMS assay of very low levels of LEU and HEU UO2F2 particles on carbon fiber sampling material. Sample activities ranged from 1 to 500 CPM (the uranium mass on a sample depends upon the specific isotope ratio but is frequently in the subnanogram range).
• Found that the presence of UF molecular ions, as measured by SIMS, provides chemical information about the particle that is separate from the uranium isotopics and strongly suggests that those particles originated from a UF6 enrichment activity.
• Submitted one manuscript for publication. Another manuscript is in preparation.

  17. Protocol for Detection of Yersinia pestis in Environmental ...

    EPA Pesticide Factsheets

    Methods Report. This is the first open-access, detailed protocol available to all government departments and agencies, and their contractors, for detecting Yersinia pestis, the pathogen that causes plague, in multiple environmental sample types including water. Each analytical method includes a step-by-step sample processing procedure for each sample type. The protocol includes real-time PCR, traditional microbiological culture, and Rapid Viability PCR (RV-PCR) analytical methods. For large-volume water samples it also includes an ultrafiltration-based sample concentration procedure. Because the protocol is available without restriction to all government departments and agencies, and their contractors, the nation will now have increased laboratory capacity to analyze a large number of samples during a wide-area plague incident.

  18. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    PubMed Central

    Cao, Youfang; Liang, Jie

    2013-01-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape. PMID:23862966

  19. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method

    NASA Astrophysics Data System (ADS)

    Cao, Youfang; Liang, Jie

    2013-07-01

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  20. Adaptively biased sequential importance sampling for rare events in reaction networks with comparison to exact solutions from finite buffer dCME method.

    PubMed

    Cao, Youfang; Liang, Jie

    2013-07-14

    Critical events that occur rarely in biological processes are of great importance, but are challenging to study using Monte Carlo simulation. By introducing biases to reaction selection and reaction rates, weighted stochastic simulation algorithms based on importance sampling allow rare events to be sampled more effectively. However, existing methods do not address the important issue of barrier crossing, which often arises from multistable networks and systems with complex probability landscape. In addition, the proliferation of parameters and the associated computing cost pose significant problems. Here we introduce a general theoretical framework for obtaining optimized biases in sampling individual reactions for estimating probabilities of rare events. We further describe a practical algorithm called adaptively biased sequential importance sampling (ABSIS) method for efficient probability estimation. By adopting a look-ahead strategy and by enumerating short paths from the current state, we estimate the reaction-specific and state-specific forward and backward moving probabilities of the system, which are then used to bias reaction selections. The ABSIS algorithm can automatically detect barrier-crossing regions, and can adjust bias adaptively at different steps of the sampling process, with bias determined by the outcome of exhaustively generated short paths. In addition, there are only two bias parameters to be determined, regardless of the number of the reactions and the complexity of the network. We have applied the ABSIS method to four biochemical networks: the birth-death process, the reversible isomerization, the bistable Schlögl model, and the enzymatic futile cycle model. For comparison, we have also applied the finite buffer discrete chemical master equation (dCME) method recently developed to obtain exact numerical solutions of the underlying discrete chemical master equations of these problems. 
This allows us to assess sampling results objectively by comparing simulation results with true answers. Overall, ABSIS can accurately and efficiently estimate rare event probabilities for all examples, often with smaller variance than other importance sampling algorithms. The ABSIS method is general and can be applied to study rare events of other stochastic networks with complex probability landscape.

  1. Optimization of the solvent-based dissolution method to sample volatile organic compound vapors for compound-specific isotope analysis.

    PubMed

    Bouchard, Daniel; Wanner, Philipp; Luo, Hong; McLoughlin, Patrick W; Henderson, James K; Pirkle, Robert J; Hunkeler, Daniel

    2017-10-20

    The methodology of the solvent-based dissolution method used to sample gas-phase volatile organic compounds (VOC) for compound-specific isotope analysis (CSIA) was optimized to lower the method detection limits for TCE and benzene. The sampling methodology previously evaluated by [1] consists of pulling air through a solvent to dissolve and accumulate the gaseous VOC. After the sampling process, the solvent can be treated in the same way as groundwater samples for routine CSIA, by diluting an aliquot of the solvent into water to reach the required concentration of the targeted contaminant. Among the solvents tested, tetraethylene glycol dimethyl ether (TGDE) showed the best aptitude for the method. TGDE has a high affinity for TCE and benzene, and hence efficiently dissolves the compounds as they pass through the solvent. The method detection limit for TCE (5±1 μg/m3) and benzene (1.7±0.5 μg/m3) is lower when using TGDE than with methanol, which was used previously (385 μg/m3 for TCE and 130 μg/m3 for benzene) [2]. The method detection limit refers to the minimal gas-phase concentration in ambient air required to load sufficient VOC mass into TGDE to perform δ13C analysis. Due to a different analytical procedure, the method detection limit associated with δ37Cl analysis was found to be 156±6 μg/m3 for TCE. Furthermore, the experimental results validated the relationship between the gas-phase TCE concentration and the progressive accumulation of dissolved TCE in the solvent during the sampling process. Accordingly, based on the air-solvent partitioning coefficient, the sampling methodology (e.g. sampling rate, sampling duration, amount of solvent) and the final TCE concentration in the solvent, the concentration of TCE in the gas phase prevailing during the sampling event can be determined.
    Moreover, the possibility of analysing the TCE concentration (or that of other targeted VOCs) in the solvent after sampling allows field deployment of the sampling method without the need to determine the initial gas-phase TCE concentration. The simplified field deployment approach of the solvent-based dissolution method, combined with the conventional analytical procedure used for groundwater samples, substantially facilitates the application of CSIA to gas-phase studies. Copyright © 2017 Elsevier B.V. All rights reserved.
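The back-calculation described above reduces to a mass balance under the simplifying assumption that trapping is quantitative (the solvent stays far from air-solvent equilibrium, so essentially all VOC pulled through is retained). The function name and numbers below are illustrative, not taken from the paper:

```python
def gas_phase_concentration(c_solvent_ug_per_l, v_solvent_l,
                            flow_l_per_min, duration_min):
    """Mean gas-phase VOC concentration (ug/m3) back-calculated from
    the measured solvent concentration, assuming quantitative trapping:
    mass trapped = C_solvent * V_solvent; air volume = flow * time."""
    mass_ug = c_solvent_ug_per_l * v_solvent_l
    air_volume_m3 = flow_l_per_min * duration_min / 1000.0  # L -> m3
    return mass_ug / air_volume_m3

# Hypothetical run: 10 ug/L of TCE in 50 mL of solvent after sampling
# for 500 min at 0.2 L/min
c_gas = gas_phase_concentration(10.0, 0.05, 0.2, 500.0)
```

In this hypothetical run, 0.5 μg of TCE collected from 0.1 m3 of air gives a mean gas-phase concentration of 5 μg/m3.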

  2. Straightforward rapid spectrophotometric quantification of total cyanogenic glycosides in fresh and processed cassava products.

    PubMed

    Tivana, Lucas Daniel; Da Cruz Francisco, Jose; Zelder, Felix; Bergenståhl, Bjorn; Dejmek, Petr

    2014-09-01

    In this study, we extend pioneering studies and demonstrate the straightforward applicability of the corrin-based chemosensor aquacyanocobyrinic acid (ACCA) for the instantaneous detection and rapid quantification of endogenous cyanide in fresh and processed cassava roots. Endogenous cyanide hydrolytically liberated from cyanogenic glycosides (CNp) reacts with ACCA to form dicyanocobyrinic acid (DCCA), accompanied by a change of colour from orange to violet. The method was successfully tested on various cassava samples containing between 6 and 200 mg equiv. HCN/kg, as verified with the independent isonicotinate/1,3-dimethylbarbiturate method. The affinity of the ACCA sensor for cyanide is high and coordination occurs rapidly, so the colorimetric response can be monitored instantaneously with spectrophotometric methods. Direct applications of the sensor, without the need for extensive and laborious extraction processes, are demonstrated in water-extracted samples, in acid-extracted samples, and directly on juice drops. ACCA showed high precision, with a standard deviation (STDV) between 0.03 and 0.06, and high accuracy (93-96%). Overall, the ACCA procedure is straightforward, safe, and easily performed. In a proof-of-concept study, rapid screening of ten samples within 20 min was demonstrated. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Sampling Methods in Cardiovascular Nursing Research: An Overview.

    PubMed

    Kandola, Damanpreet; Banner, Davina; O'Keefe-McCarthy, Sheila; Jassal, Debbie

    2014-01-01

    Cardiovascular nursing research covers a wide array of topics, from health services to psychosocial patient experiences. The selection of a specific participant sample is an important part of the research design and process, and the sampling strategy employed is of utmost importance to ensure that a representative sample of participants is chosen. There are two main categories of sampling methods: probability and non-probability. Probability sampling is the random selection of elements from the population, where each element of the population has an equal and independent chance of being included in the sample. There are five main types of probability sampling: simple random sampling, systematic sampling, stratified sampling, cluster sampling, and multi-stage sampling. Non-probability sampling methods are those in which elements are chosen through non-random methods for inclusion in the research study, and include convenience sampling, purposive sampling, and snowball sampling. Each approach offers distinct advantages and disadvantages and must be considered critically. In this research column, we provide an introduction to these key sampling techniques and draw on examples from cardiovascular research. Understanding the differences between sampling techniques may aid nurses in the effective appraisal of research literature and provide a reference point for nurses who engage in cardiovascular research.
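Three of the probability designs named above can be sketched in code. This is a minimal illustration with a hypothetical patient roster (not tied to any particular study) of simple random, systematic, and proportionally allocated stratified sampling:

```python
import random

def simple_random_sample(population, n, seed=0):
    """Every element has an equal, independent chance of selection."""
    return random.Random(seed).sample(population, n)

def systematic_sample(population, n, seed=0):
    """Random start, then every k-th element (k = N // n)."""
    k = len(population) // n
    start = random.Random(seed).randrange(k)
    return population[start::k][:n]

def stratified_sample(population, strata_key, n, seed=0):
    """Proportional allocation: sample each stratum in proportion to
    its share of the population (at least one element per stratum)."""
    rng = random.Random(seed)
    strata = {}
    for item in population:
        strata.setdefault(strata_key(item), []).append(item)
    chosen = []
    for members in strata.values():
        take = max(1, round(n * len(members) / len(population)))
        chosen.extend(rng.sample(members, min(take, len(members))))
    return chosen

# Hypothetical cardiac-unit roster: 25 ICU and 75 ward patients
patients = [{"id": i, "unit": "ICU" if i % 4 == 0 else "ward"}
            for i in range(100)]
srs = simple_random_sample(patients, 10)
strat = stratified_sample(patients, lambda p: p["unit"], 8)
```

Stratified sampling guarantees both units appear in the sample, which simple random sampling does not; cluster and multi-stage designs add further levels of grouping on top of these primitives.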

  4. Image portion identification methods, image parsing methods, image parsing systems, and articles of manufacture

    DOEpatents

    Lassahn, Gordon D.; Lancaster, Gregory D.; Apel, William A.; Thompson, Vicki S.

    2013-01-08

    Image portion identification methods, image parsing methods, image parsing systems, and articles of manufacture are described. According to one embodiment, an image portion identification method includes accessing data regarding an image depicting a plurality of biological substrates corresponding to at least one biological sample and indicating presence of at least one biological indicator within the biological sample and, using processing circuitry, automatically identifying a portion of the image depicting one of the biological substrates but not others of the biological substrates.

  5. Method for charging a hydrogen getter

    DOEpatents

    Tracy, C.E.; Keyser, M.A.; Benson, D.K.

    1998-09-15

    A method for charging a sample of either a permanent or reversible getter material with a high concentration of hydrogen while maintaining a base pressure below 10{sup {minus}4} torr at room temperature involves placing the sample of hydrogen getter material in a chamber, activating the sample of hydrogen getter material, overcharging the sample of getter material through conventional charging techniques to a high concentration of hydrogen, and then subjecting the sample of getter material to a low temperature vacuum bake-out process. Application of the method results in a reversible hydrogen getter which is highly charged to maximum capacities of hydrogen and which concurrently exhibits minimum hydrogen vapor pressures at room temperatures. 9 figs.

  6. Flow cytometric analysis of microbial contamination in food industry technological lines--initial study.

    PubMed

    Józwa, Wojciech; Czaczyk, Katarzyna

    2012-04-02

    Flow cytometry constitutes an alternative to traditional methods of microorganism identification and analysis, including methods requiring a cultivation step. It enables the detection of pathogens and other microbial contaminants without the need to culture microbial cells, meaning that the sample (water, waste, or food, e.g. milk, wine, beer) may be analysed directly. This leads to a significant reduction of the time required for analysis, allowing monitoring of production processes and immediate reaction should contamination or any disruption occur. Apart from the analysis of raw materials or products at different stages of the manufacturing process, flow cytometry seems to constitute an ideal tool for the assessment of microbial contamination on the surfaces of technological lines. In the present work, samples comprising smears from 3 different surfaces of technological lines from a fruit and vegetable processing company in Greater Poland were analysed directly with a flow cytometer. The measured parameters were forward and side scatter of laser light signals, allowing estimation of the microbial cell content of each sample. Flow cytometric analysis of the surfaces of food industry production lines enables preliminary evaluation of microbial contamination within a few minutes of sample arrival, without the need for sample pretreatment. The presented method of flow cytometric initial evaluation of the microbial state of food industry technological lines demonstrated its potential for development into a robust, routine method for the rapid and labor-saving detection of microbial contamination in the food industry.

  7. Rapid fusion method for the determination of Pu, Np, and Am in large soil samples

    DOE PAGES

    Maxwell, Sherrod L.; Culligan, Brian; Hutchison, Jay B.; ...

    2015-02-14

    A new rapid sodium hydroxide fusion method for the preparation of 10-20 g soil samples has been developed by the Savannah River National Laboratory (SRNL). The method enables lower detection limits for plutonium, neptunium, and americium in environmental soil samples. The method also significantly reduces sample processing time and acid fume generation compared to traditional soil digestion techniques using hydrofluoric acid. Ten gram soil aliquots can be ashed and fused using the new method in 1-2 hours, completely dissolving samples, including refractory particles. Pu, Np, and Am are separated using stacked 2 mL cartridges of TEVA and DGA Resin and measured using alpha spectrometry. The method can be adapted for measurement by inductively-coupled plasma mass spectrometry (ICP-MS). Two 10 g aliquots of fused soil may be combined prior to chromatographic separations to further improve detection limits. Total sample preparation time, including chromatographic separations and alpha spectrometry source preparation, is less than 8 hours.

  8. Rotor assembly and method for automatically processing liquids

    DOEpatents

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1992-01-01

    A rotor assembly for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water, includes a rotor body for rotation about an axis and including a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses.

  9. Development of automation software for neutron activation analysis process in Malaysian nuclear agency

    NASA Astrophysics Data System (ADS)

    Yussup, N.; Rahman, N. A. A.; Ibrahim, M. M.; Mokhtar, M.; Salim, N. A. A.; Soh@Shaari, S. C.; Azman, A.

    2017-01-01

    Neutron Activation Analysis (NAA) has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, especially from sample registration to sample analysis, are performed manually. These manual procedures, carried out by the NAA laboratory personnel, are time consuming and inefficient. Hence, software to support system automation was developed to provide an effective method that replaces redundant manual data entries and produces a faster sample analysis and calculation process. This paper describes the design and development of automation software for the NAA process, which consists of three sub-programs: sample registration; hardware control and data acquisition; and sample analysis. The data flow and connections between the sub-programs are explained. The software is developed using the National Instruments LabVIEW development package.

  10. Containerless processing of single crystals in low-G environment

    NASA Technical Reports Server (NTRS)

    Walter, H. U.

    1974-01-01

    Experiments on containerless crystal growth from the melt were conducted during Skylab missions SL3 and SL4 (Skylab Experiment M-560). Six samples of InSb were processed, one of them heavily doped with selenium. The concept of the experiment is discussed and related to general crystal growth methods and their merits as techniques for containerless processing in space. The morphology of the crystals obtained is explained in terms of volume changes associated with solidification and wetting conditions during solidification. All samples exhibit extremely well developed growth facets. Analysis by X-ray topographical methods and chemical etching shows that the crystals are of high structural perfection. Average dislocation density as revealed by etching is of the order of 100 per sq cm; no dislocation clusters could be observed in the space-grown samples. A sequence of striations that is observed in the first half of the selenium-doped sample is explained as being caused by periodic surface breakdown.

  11. Multivariate survivorship analysis using two cross-sectional samples.

    PubMed

    Hill, M E

    1999-11-01

    As an alternative to survival analysis with longitudinal data, I introduce a method that can be applied when one observes the same cohort in two cross-sectional samples collected at different points in time. The method allows for the estimation of log-probability survivorship models that estimate the influence of multiple time-invariant factors on survival over a time interval separating two samples. This approach can be used whenever the survival process can be adequately conceptualized as an irreversible single-decrement process (e.g., mortality, the transition to first marriage among a cohort of never-married individuals). Using data from the Integrated Public Use Microdata Series (Ruggles and Sobek 1997), I illustrate the multivariate method through an investigation of the effects of race, parity, and educational attainment on the survival of older women in the United States.
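At the group level, the two-sample survivorship idea reduces to comparing cohort counts across the two cross-sections. A minimal sketch with hypothetical counts, under the paper's assumptions of a closed cohort and an irreversible single-decrement process; differences in log-survival between groups play the role of covariate effects in a log-probability model:

```python
import math

def survival_by_group(counts_t1, counts_t2):
    """Group-specific survival proportions and log-survival over the
    interval between the two cross-sectional samples, assuming a
    closed cohort (no migration) and an irreversible decrement."""
    surv = {g: counts_t2[g] / counts_t1[g] for g in counts_t1}
    log_surv = {g: math.log(s) for g, s in surv.items()}
    return surv, log_surv

# Hypothetical cohort of older women counted in two censuses,
# classified by educational attainment
t1 = {"less_than_HS": 1000, "HS_or_more": 1000}
t2 = {"less_than_HS": 700, "HS_or_more": 800}
surv, log_surv = survival_by_group(t1, t2)
# Log-survival difference as a crude "education effect"
education_effect = log_surv["HS_or_more"] - log_surv["less_than_HS"]
```

A multivariate log-probability model generalizes this by regressing log-survival on several time-invariant covariates simultaneously rather than taking pairwise differences.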

  12. Micro-CT scouting for transmission electron microscopy of human tissue specimens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morales, A. G.; Stempinski, E. S.; XIAO, X.

    Transmission electron microscopy (TEM) provides sub-nanometre-scale details in volumetric samples. Samples such as pathology tissue specimens are often stained with a metal element to enhance contrast, which makes them opaque to optical microscopes. As a result, it can be a lengthy procedure to find the region of interest inside a sample through sectioning. Here, we describe micro-CT scouting for TEM that allows noninvasive identification of regions of interest within a block sample to guide the sectioning step. In a tissue pathology study, a bench-top micro-CT scanner with 10 μm resolution was used to determine the location of patches of the mucous membrane in osmium-stained human nasal scraping samples. Once the regions of interest were located, the sample block was sectioned to expose that location, followed by ultra-thin sectioning and TEM to inspect the internal structure of the cilia of the membrane epithelial cells with nanometre resolution. This method substantially reduced the time and labour of the search process, from typically 20 sections for light microscopy to three sections, with no added sample preparation.
    Lay description: Electron microscopy provides very high levels of detail in a small area, and thus the question of where to look in an opaque sample, such as a stained tissue specimen, needs to be answered by sectioning the sample in small steps and examining the sections under a light microscope until the region of interest is found. The search process can be lengthy and labor intensive, especially for a study involving a large number of samples. Small areas of interest can be missed in the process if not enough regions are examined. We describe a method to directly locate the region of interest within a whole sample using micro-CT imaging, bypassing the need for blind sectioning.
Micro-CT enables locating the region within 3D space; this information provides a guide for sectioning the sample to expose that precise location for high resolution electron microscopy imaging. In a human tissue specimen study, this method considerably reduced the time and labor of the search process.« less

  13. Micro-CT scouting for transmission electron microscopy of human tissue specimens

    DOE PAGES

    Morales, A. G.; Stempinski, E. S.; XIAO, X.; ...

    2016-02-08

Transmission electron microscopy (TEM) provides sub-nanometre-scale details in volumetric samples. Samples such as pathology tissue specimens are often stained with a metal element to enhance contrast, which makes them opaque to optical microscopes. As a result, it can be a lengthy procedure to find the region of interest inside a sample through sectioning. Here, we describe micro-CT scouting for TEM that allows noninvasive identification of regions of interest within a block sample to guide the sectioning step. In a tissue pathology study, a bench-top micro-CT scanner with 10 µm resolution was used to determine the location of patches of the mucous membrane in osmium-stained human nasal scraping samples. Furthermore, once the regions of interest were located, the sample block was sectioned to expose that location, followed by ultra-thin sectioning and TEM to inspect the internal structure of the cilia of the membrane epithelial cells with nanometre resolution. This method substantially reduced the time and labour of the search process from typically 20 sections for light microscopy to three sections, with no added sample preparation. Lay description: Electron microscopy provides very high levels of detail in a small area, and thus the question of where to look in an opaque sample, such as a stained tissue specimen, needs to be answered by sectioning the sample in small steps and examining the sections under a light microscope until the region of interest is found. The search process can be lengthy and labor intensive, especially for a study involving a large number of samples, and small areas of interest can be missed if not enough regions are examined. Here we describe a method to directly locate the region of interest within a whole sample using micro-CT imaging, bypassing the need for blind sectioning. Micro-CT locates the region in 3D space; this information guides sectioning of the sample to expose that precise location for high-resolution electron microscopy imaging. In a human tissue specimen study, this method considerably reduced the time and labor of the search process.

  14. An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid.

    PubMed

    van Staden, J F; Mashamba, Mulalo G; Stefan, Raluca I

    2002-09-01

An on-line potentiometric sequential injection titration process analyser for the determination of acetic acid is proposed. A solution of 0.1 mol L(-1) sodium chloride is used as the carrier. Titration is achieved by aspirating acetic acid samples between two strong-base zone volumes into a holding coil and by channelling the stack of well-defined zones, with flow reversal, through a reaction coil to a potentiometric sensor, where the peak widths are measured. A linear relationship between peak width and the logarithm of the acid concentration was obtained in the range 1-9 g/100 mL. Vinegar samples were analysed without any sample pre-treatment. The method has a relative standard deviation of 0.4% with a sample frequency of 28 samples per hour. The results revealed good agreement between the proposed sequential injection method and an automated batch titration method.
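The calibration described above, peak width varying linearly with the logarithm of the acid concentration, can be sketched as a least-squares fit that is then inverted to read concentration from a measured peak width. The slope, intercept, and peak-width values below are invented for illustration, not taken from the paper:

```python
import math

def fit_line(xs, ys):
    """Ordinary least-squares fit y = a*x + b."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    return a, my - a * mx

# Hypothetical calibration standards: acetic acid in g/100 mL (the reported
# linear range is 1-9 g/100 mL) and synthetic peak widths in seconds.
concs = [1.0, 3.0, 5.0, 7.0, 9.0]
widths = [12.0 + 20.0 * math.log10(c) for c in concs]

slope, intercept = fit_line([math.log10(c) for c in concs], widths)

def conc_from_width(w):
    """Invert the calibration: measured peak width -> acid concentration."""
    return 10 ** ((w - intercept) / slope)
```

The inversion works because the fitted relation is monotonic over the calibrated range; outside 1-9 g/100 mL the linearity claim no longer holds.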

  15. Current Protocols in Pharmacology

    PubMed Central

    2016-01-01

    Determination of drug or drug metabolite concentrations in biological samples, particularly in serum or plasma, is fundamental to describing the relationships between administered dose, route of administration, and time after dose to the drug concentrations achieved and to the observed effects of the drug. A well-characterized, accurate analytical method is needed, but it must also be established that the analyte concentration in the sample at the time of analysis is the same as the concentration at sample acquisition. Drugs and metabolites may be susceptible to degradation in samples due to metabolism or to physical and chemical processes, resulting in a lower measured concentration than was in the original sample. Careful examination of analyte stability during processing and storage and adjustment of procedures and conditions to maximize that stability are a critical part of method validation for the analysis, and can ensure the accuracy of the measured concentrations. PMID:27960029

  16. Comparison of particle sizes between 238PuO 2 before aqueous processing, after aqueous processing, and after ball milling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mulford, Roberta Nancy

    Particle sizes determined for a single lot of incoming Russian fuel and for a lot of fuel after aqueous processing are compared with particle sizes measured on fuel after ball-milling. The single samples of each type are believed to have particle size distributions typical of oxide from similar lots, as the processing of fuel lots is fairly uniform. Variation between lots is, as yet, uncharacterized. Sampling and particle size measurement methods are discussed elsewhere.

  17. The Problem of Sample Contamination in a Fluvial Geochemistry Research Experience for Undergraduates.

    ERIC Educational Resources Information Center

    Andersen, Charles B.

    2001-01-01

    Introduces the analysis of a river as an excellent way to teach geochemical techniques because of the relative ease of sample collection and speed of sample analysis. Focuses on the potential sources of sample contamination during sampling, filtering, and bottle cleaning processes, and reviews methods to reduce and detect contamination. Includes…

  18. Method and apparatus for maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling

    DOEpatents

    Farthing, William Earl [Pinson, AL; Felix, Larry Gordon [Pelham, AL; Snyder, Todd Robert [Birmingham, AL

    2008-02-12

An apparatus and method for diluting and cooling sample gas that is extracted from high temperature and/or high pressure industrial processes. Through a feedback process, a specialized, CFD-modeled dilution cooler is employed, along with real-time estimation of the point at which condensation will occur within the dilution cooler, to define a level of dilution and a diluted gas temperature that yield a gas, free of condensed hydrocarbon compounds and condensed moisture, that can be conveyed to standard gas analyzers.

  19. Method and apparatus maintaining multi-component sample gas constituents in vapor phase during sample extraction and cooling

    DOEpatents

    Farthing, William Earl; Felix, Larry Gordon; Snyder, Todd Robert

    2009-12-15

An apparatus and method for diluting and cooling sample gas that is extracted from high temperature and/or high pressure industrial processes. Through a feedback process, a specialized, CFD-modeled dilution cooler is employed, along with real-time estimation of the point at which condensation will occur within the dilution cooler, to define a level of dilution and a diluted gas temperature that yield a gas, free of condensed hydrocarbon compounds and condensed moisture, that can be conveyed to standard gas analyzers.

  20. A method for the determination of acrylamide in a broad variety of processed foods by GC-MS using xanthydrol derivatization.

    PubMed

    Yamazaki, Kumiko; Isagawa, Satoshi; Kibune, Nobuyuki; Urushiyama, Tetsuo

    2012-01-01

A novel GC-MS method was developed for the determination of acrylamide, which is applicable to a variety of processed foods, including potato snacks, corn snacks, biscuits, instant noodles, coffee, soy sauces and miso (fermented soy bean paste). The method involves the derivatization of acrylamide with xanthydrol instead of a bromine compound. Isotopically labelled acrylamide (d₃-acrylamide) was used as the internal standard. The aqueous extract from samples was purified using Sep-Pak™ C₁₈ and Sep-Pak™ AC-2 columns. For amino acid-rich samples, such as miso or soy sauce, an Extrelut™ column was used for purification or extraction. After reaction with xanthydrol, the resultant N-xanthyl acrylamide was determined by GC-MS. The method was validated for various food matrices and showed good linearity, precision and trueness. The limit of detection and limit of quantification ranged from 0.5 to 5 µg kg⁻¹ and from 5 to 20 µg kg⁻¹, respectively. The developed method was applied in an exploratory survey of acrylamide in Japanese foods and was shown to be applicable to all samples tested.
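Quantification against the d₃-acrylamide internal standard presumably follows the usual peak-area-ratio calculation; the abstract does not give the formula, so the sketch below is a generic internal-standard computation with hypothetical peak areas, spike amount, and a response factor assumed to be 1:

```python
def quantify_with_internal_standard(area_analyte, area_is, added_is_ug,
                                    sample_kg, response_factor=1.0):
    """Internal-standard quantification: the analyte/IS peak-area ratio,
    scaled by the known amount of internal standard added, estimates the
    analyte mass; dividing by the sample mass gives ug/kg."""
    analyte_ug = (area_analyte / area_is) * added_is_ug / response_factor
    return analyte_ug / sample_kg

# Hypothetical GC-MS peak areas and a 2 ug d3-acrylamide spike in 10 g of food
conc_ug_per_kg = quantify_with_internal_standard(5000.0, 10000.0, 2.0, 0.01)
```

Because the internal standard co-elutes and co-derivatizes with the analyte, extraction losses largely cancel out of the ratio, which is the point of using the labelled isotopologue.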

  1. Determination of the Degree of Substitution of Cationic Guar Gum by Headspace-Based Gas Chromatography during Its Synthesis.

    PubMed

    Wan, Xiaofang; Guo, Congbao; Feng, Jiarui; Yu, Teng; Chai, Xin-Sheng; Chen, Guangxue; Xie, Wei-Qi

    2017-08-16

This study reports on a headspace-based gas chromatography (HS-GC) technique for determining the degree of substitution (DS) of cationic guar gum during the synthesis process. The method is based on the determination of 2,3-epoxypropyltrimethylammonium chloride in the process medium. After a modest pretreatment procedure, the sample was added to a headspace vial containing bicarbonate solution for measurement of evolved CO2 by HS-GC. The results showed that the method had good precision (relative standard deviation <3.60%) and accuracy for the 2,3-epoxypropyltrimethylammonium chloride measurement, with recoveries in the range of 96-102%, matching the data obtained by a reference method; the calculated DS of cationic guar gum was within 12% of the values obtained by the more arduous Kjeldahl method. The HS-GC method requires only a small volume of sample and, thus, is suitable for determining the DS of cationic guar gum in laboratory-scale process-related applications.
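The figures of merit quoted above (recoveries of 96-102%, relative standard deviation below 3.60%) follow from standard analytical definitions, sketched here with hypothetical measurement values:

```python
import statistics

def recovery_percent(measured_spiked, measured_blank, amount_spiked):
    """Spike recovery: fraction of the added analyte actually found."""
    return 100.0 * (measured_spiked - measured_blank) / amount_spiked

def relative_std_dev(values):
    """RSD (%) = sample standard deviation / mean * 100."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical replicate measurements of the epoxide analyte (mg/L)
replicates = [98.0, 100.0, 102.0]
rsd = relative_std_dev(replicates)
rec = recovery_percent(19.8, 0.2, 20.0)
```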

  2. Capillary absorption spectrometer and process for isotopic analysis of small samples

    DOEpatents

    Alexander, M. Lizabeth; Kelly, James F.; Sams, Robert L.; Moran, James J.; Newburn, Matthew K.; Blake, Thomas A.

    2016-03-29

A capillary absorption spectrometer and process are described that provide highly sensitive, accurate and stable absorption measurements of analytes in a sample gas, which may include isotopologues of carbon and oxygen obtained from gas and biological samples. It further provides isotopic images of microbial communities that allow tracking of nutrients at the single-cell level. It targets naturally occurring variations in carbon and oxygen isotopes, avoiding the need for expensive isotopically labeled mixtures and thereby allowing study of samples taken from the field without modification. The method also permits in vivo sampling, enabling real-time ambient studies of microbial communities.

  3. Water as foaming agent for open cell polyurethane structures.

    PubMed

    Haugen, H; Ried, V; Brunner, M; Will, J; Wintermantel, E

    2004-04-01

The problem of moisture in polymer processing is known to any polymer engineer, as air bubbles may be formed; hence granulates are generally dried prior to manufacturing. This study aimed to develop a novel processing method for scaffolds with controlled moisture content in thermoplastic polyurethane. The common foaming agents for polyurethane are organic solvents, whose residues remaining in the scaffold may be harmful to adherent cells, protein growth factors or nearby tissues. Water was used as the foaming agent and NaCl was used as a porogen to achieve an open-cell structure. The polyether-polyurethane samples were processed in a heated press and achieved a porosity of 64%. The pore size ranged between 50 and 500 microm. Human fibroblasts adhered and proliferated in the scaffold. A non-toxic production process was thus developed to manufacture a porous structure from a thermoplastic polyether-polyurethane. The process enables mass production of samples with adjustable pore size and porosity. In contrast to an existing method (solvent casting), the processing of the samples was not limited by their thickness. The process parameters contributing most to pore formation were filling volume, temperature, NaCl concentration and water-uptake rate.
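The 64% porosity figure is consistent with the usual gravimetric definition of scaffold porosity; a minimal sketch, assuming porosity is computed from the apparent scaffold density and the solid polymer density (both density values below are hypothetical, chosen to reproduce 64%):

```python
def porosity_percent(apparent_density, solid_density):
    """Gravimetric scaffold porosity:
    P = (1 - rho_apparent / rho_solid) * 100."""
    return 100.0 * (1.0 - apparent_density / solid_density)

# Hypothetical densities in g/cm^3: foamed scaffold vs. bulk polyurethane
porosity = porosity_percent(0.432, 1.2)
```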

  4. 76 FR 65953 - CBP Audit Procedures; Use of Sampling Methods and Offsetting of Overpayments and Over-Declarations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-25

    ... FURTHER INFORMATION CONTACT: For Legal Aspects: Alan C. Cohen, Penalties Branch, Regulations and Rulings... claimant to resolve defects. It is recognized that in some cases the sampling will be so flawed it cannot... companies' internal processes and systems during the application process. ISA members are companies with...

  5. Fingerprint analysis of Radix Aconiti using ultra-performance liquid chromatography-electrospray ionization/ tandem mass spectrometry (UPLC-ESI/MS n) combined with stoichiometry.

    PubMed

    Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2013-01-15

A fingerprinting approach was developed by means of UPLC-ESI/MS(n) (ultra-performance liquid chromatography-electrospray ionization/mass spectrometry) for the quality control of processed Radix Aconiti, a widely used toxic traditional herbal medicine. The fingerprinting approach was based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing toxicity and ensuring clinical therapeutic efficacy. Similarity evaluation, hierarchical cluster analysis and principal component analysis were performed to evaluate the similarity and variation of the samples. The results showed that the well-processed, unqualified-processed and raw Radix Aconiti samples could be clustered reasonably according to the contents of their constituents. The loading plot showed that the chemical markers having the most influence on the discrimination between qualified and unqualified samples were mainly monoester diterpenoid aconitines and diester diterpenoid aconitines. Finally, the UPLC-UV and UPLC-ESI/MS(n) characteristic fingerprints were established from the well-processed and purchased qualified samples. At the same time, a complementary quantification method for six aconitine-type alkaloids was developed using UPLC-UV and UPLC-ESI/MS. The average recovery of the monoester diterpenoid aconitines was 95.4-99.1% and that of the diester diterpenoid aconitines was 103-112%. The proposed combined UPLC-UV and UPLC-ESI/MS quantification method allows samples to be analyzed over a wide concentration range. Therefore, the established fingerprinting approach, in combination with chemometric analysis, provides a flexible and reliable method for quality assessment of toxic herbal medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
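The similarity-evaluation step can be sketched with the congruence (cosine) coefficient commonly used to compare chromatographic fingerprints against a reference profile; the peak-intensity vectors below are invented for illustration, not the paper's data:

```python
import math

def cosine_similarity(a, b):
    """Congruence coefficient between two chromatographic fingerprints:
    1.0 means identical peak-intensity profiles (up to scale)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Hypothetical peak intensities at five retention times
reference      = [0.0, 1.0, 0.2, 3.0, 0.5]   # established qualified fingerprint
well_processed = [0.1, 1.1, 0.2, 2.9, 0.4]   # should score close to 1
raw_sample     = [2.5, 0.3, 1.8, 0.2, 0.1]   # raw/unqualified, scores low
```

A threshold on this coefficient is one simple way to flag unqualified batches before running the heavier cluster and principal component analyses.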

  6. Evaluation of growth performance, serum biochemistry and haematological parameters on broiler birds fed with raw and processed samples of Entada scandens, Canavalia gladiata and Canavalia ensiformis seed meal as an alternative protein source.

    PubMed

    Sasipriya, Gopalakrishnan; Siddhuraju, Perumal

    2013-03-01

The experiment was carried out to investigate the inclusion of the underutilised legumes Entada scandens, Canavalia gladiata and Canavalia ensiformis as seed meal in a soybean-based broiler diet. The utilisation of these wild legumes is limited by the presence of antinutrient compounds. Processing methods, soaking followed by autoclaving in sodium bicarbonate solution for E. scandens and C. gladiata, and soaking followed by autoclaving in ash solution for C. ensiformis, were adopted. The proximate compositions of raw and processed samples of E. scandens, C. gladiata and C. ensiformis were determined. The protein content was enhanced in the processed samples of E. scandens (46%) and C. ensiformis (16%). The processing reduced most of the antinutrients, such as tannins (10-100%), trypsin inhibitor activity (99%), chymotrypsin inhibitor activity (72-100%), canavanine (60-62%), amylase inhibitor activity (73-100%), saponins (78-92%), phytic acid (19-40%) and lectins. Hence, soybean protein in the commercial broiler diet was replaced with the raw samples at 15% and the processed samples at 15 and 30%, respectively. Birds fed with 30% processed samples of E. scandens, C. gladiata and C. ensiformis showed comparable growth performance, carcass characteristics, organ weights, haematological parameters and serum biochemical parameters (cholesterol, protein, bilirubin, albumin, globulin, and liver and kidney function parameters) without any adverse effects after 42 days of supplementation. Proper utilisation of these underutilised legumes may provide an alternative protein ingredient in poultry diets.

  7. Method for producing a thin sample band in a microchannel device

    DOEpatents

    Griffiths, Stewart K [Livermore, CA; Nilson, Robert H [Cardiff, CA

    2004-08-03

The present invention improves the performance of microchannel systems for chemical and biological synthesis and analysis by providing a method and apparatus for producing a thin band of a species sample. Thin sample bands improve the resolution of microchannel separation processes, as well as many other processes requiring precise control of sample size and volume. The new method comprises a series of steps in which a species sample is manipulated by controlled transport through a junction formed at the intersection of four or more channels. A sample is first inserted into the end of one of these channels in the vicinity of the junction. Next, this sample is thinned by transport across the junction one or more times. During these thinning steps, flow enters the junction through one of the channels and exits through the remaining channels, providing a divergent flow field that progressively stretches and thins the band with each traverse of the junction. The thickness of the resulting sample band may be smaller than the channel width. Moreover, the thickness of the band may be varied and controlled by altering the method alone, without modification to the channel or junction geometries. The invention is applicable to both electroosmotic and electrophoretic transport, to combined electrokinetic transport, and to some special cases in which bulk fluid transport is driven by pressure gradients. It is further applicable to channels that are open, filled with a gel, or filled with a porous or granular material.
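A toy model of the thinning step, assuming each traverse of the junction reduces the band thickness by a constant stretch factor (sample volume conserved in the divergent flow field); the factor and pass count are illustrative, not values from the patent:

```python
def band_thickness(initial_thickness, stretch_factor, passes):
    """Toy model: each traverse of the cross-junction stretches the band,
    dividing its thickness by a constant factor, so the final thickness
    falls geometrically with the number of passes."""
    t = initial_thickness
    for _ in range(passes):
        t /= stretch_factor
    return t

# Hypothetical: a 100 um band, halved in thickness on each of 3 traverses
final = band_thickness(100.0, 2.0, 3)
```

This geometric decay is why the method can produce bands thinner than the channel width without changing the channel geometry: only the number of traverses is varied.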

  8. Evaluation of soil water stable isotope analysis by H2O(liquid)-H2O(vapor) equilibration method

    NASA Astrophysics Data System (ADS)

    Gralher, Benjamin; Stumpp, Christine

    2014-05-01

Environmental tracers like stable isotopes of water (δ18O, δ2H) have proven to be valuable tools to study water flow and transport processes in soils. Recently, a new technique for soil water isotope analysis has been developed that employs a vapor phase in isothermal equilibrium with the liquid phase of interest. This has increased the potential application of water stable isotopes in unsaturated zone studies, as it supersedes laborious extraction of soil water. However, uncertainties of analysis and influencing factors need to be considered. Therefore, the objective of this study was to evaluate different methodologies of analysing stable isotopes in soil water in order to reduce measurement uncertainty. The methodologies included different preparation procedures of soil cores for equilibration of vapor and soil water as well as raw data correction. Two different inflatable sample containers (freezer bags, bags containing a metal layer) and equilibration atmospheres (N2, dry air) were tested. The results showed that uncertainties for δ18O were higher than for δ2H, which could not be attributed to any specific detail of the processing routine. In particular, soil samples with high contents of organic matter showed an apparent isotope enrichment indicative of fractionation due to evaporation. However, comparison of water samples obtained from suction cups with the local meteoric water line indicated negligible fractionation processes in the investigated soils. Therefore, a method was developed to correct the raw data, reducing the uncertainties of the analysis. We conclude that the evaluated method is advantageous over traditional methods regarding simplicity, resource requirements and sample throughput, but careful consideration needs to be given to sample handling and data processing. Thus, stable isotopes of water remain a good tool to determine water flow and transport processes in the unsaturated zone.
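The abstract does not specify the raw-data correction; a common approach in water stable-isotope work, shown here purely as an assumption, is a two-point linear normalization of measured δ values against two laboratory standards of known isotopic composition (all δ values below are hypothetical):

```python
def two_point_correction(delta_measured, std_measured, std_true):
    """Map a raw instrument delta value onto the reference scale using a
    straight line through two laboratory standards: (measured, true)
    pairs define the slope and offset of the correction."""
    (m1, m2), (t1, t2) = std_measured, std_true
    slope = (t2 - t1) / (m2 - m1)
    return t1 + slope * (delta_measured - m1)

# Hypothetical standards: instrument reads -8.0 and -20.0 permil for
# standards whose accepted values are -7.5 and -19.5 permil.
corrected = two_point_correction(-14.0, (-8.0, -20.0), (-7.5, -19.5))
```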

  9. On-Line Ion Exchange Liquid Chromatography as a Process Analytical Technology for Monoclonal Antibody Characterization in Continuous Bioprocessing.

    PubMed

    Patel, Bhumit A; Pinto, Nuno D S; Gospodarek, Adrian; Kilgore, Bruce; Goswami, Kudrat; Napoli, William N; Desai, Jayesh; Heo, Jun H; Panzera, Dominick; Pollard, David; Richardson, Daisy; Brower, Mark; Richardson, Douglas D

    2017-11-07

Combining process analytical technology (PAT) with continuous production provides a powerful tool to observe and control monoclonal antibody (mAb) fermentation and purification processes. This work demonstrates on-line liquid chromatography (on-line LC) as a PAT tool for monitoring a continuous biologics process and forced degradation studies. Specifically, this work focused on ion exchange chromatography (IEX), which is a critical separation technique to detect charge variants. Product-related impurities, including charge variants, that impact function are classified as critical quality attributes (CQAs). First, we confirmed that no significant differences were observed in the charge heterogeneity profile of a mAb through both at-line and on-line sampling and that the on-line method has the ability to rapidly detect changes in protein quality over time. The robustness and versatility of the PAT methods were tested by sampling from two purification locations in a continuous mAb process. The PAT IEX methods used with on-line LC were a weak cation exchange (WCX) separation and a newly developed shorter strong cation exchange (SCX) assay. Both methods provided similar results, with the distribution of percent acidic, main, and basic species remaining unchanged over a 2 week period. Second, a forced degradation study showed an increase in acidic species and a decrease in basic species when sampled on-line over 7 days. These applications further strengthen the use of on-line LC to monitor CQAs of a mAb continuously with various PAT IEX analytical methods. Implementation of on-line IEX will enable faster decision making during process development and could potentially be applied to process control in biomanufacturing.
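The percent acidic/main/basic distribution reported by the IEX methods reduces to normalized integrated peak areas; a minimal sketch with hypothetical chromatogram areas:

```python
def charge_variant_distribution(acidic_area, main_area, basic_area):
    """Percent acidic / main / basic species from integrated IEX peak areas."""
    total = acidic_area + main_area + basic_area
    return tuple(100.0 * a / total for a in (acidic_area, main_area, basic_area))

# Hypothetical integrated peak areas from a cation-exchange chromatogram
acidic, main, basic = charge_variant_distribution(25.0, 60.0, 15.0)
```

Tracking these three percentages over time is what lets the on-line method flag a drift in charge heterogeneity, such as the rise in acidic species seen in the forced degradation study.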

  10. Evaluation of the effect of processing methods on the nutritional and anti-nutritional compositions of two under-utilized Nigerian grain legumes.

    PubMed

    Oke, M O; Sobowale, S S; Ogunlakin, G O

    2013-12-15

The nutritional and anti-nutritional compositions of African Yam Bean (AYB) and Lima bean flours under different processing methods were determined. Properties studied include moisture content, crude protein, crude fibre, ash content, ether extract, carbohydrate, tannin, protease inhibitor and phytate. The moisture content of AYB flours ranged from 9.31 to 9.61%, while that of Lima beans ranged from 9.32 to 9.56%. There was a significant difference among the samples when the unprocessed AYB (control) and the processed AYB were compared, and the same trend was observed with Lima bean flours. However, some nutrients did not show significant variation with processing. Samples of soaked/de-hulled AYB had the lowest protease inhibitor content (0.73 mg/100 g), significantly different from the unprocessed samples. Soaked/de-hulled flours of both AYB and Lima beans showed the greatest percentage decrease in anti-nutritional content. Lima bean flours were observed to have higher anti-nutritional content than AYB. The percentage decrease of anti-nutritional factors in the samples was proportionally higher than that of the nutrients. The nutritional and anti-nutritional compositions of the samples suggest that processed African Yam Bean (AYB) and Lima bean flours would have useful application in fabricated foods.

  11. Feller processes: the next generation in modeling. Brownian motion, Lévy processes and beyond.

    PubMed

    Böttcher, Björn

    2010-12-03

We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years Lévy processes, of which Brownian motion is a special case, have also become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes, and in particular Brownian motion, as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique makes it possible to apply Monte Carlo methods to Feller processes.
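The construction idea, drawing each increment from a Lévy process whose parameters are frozen at the current state, can be illustrated with a minimal Euler-type scheme. State-dependent Brownian motion is used here as the simplest special case, and the volatility function is invented for illustration; this is a sketch of the mixing idea, not the paper's full algorithm:

```python
import math
import random

def simulate_state_dependent_path(x0, n_steps, dt, sigma, seed=0):
    """Euler-type scheme illustrating state-space dependent mixing: each
    increment is drawn from a Levy process (here a Brownian motion, the
    simplest case) whose coefficient is frozen at the current state x."""
    rng = random.Random(seed)
    path = [x0]
    x = x0
    for _ in range(n_steps):
        x += sigma(x) * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

# Spatially inhomogeneous volatility: fluctuations grow with |x|
path = simulate_state_dependent_path(0.0, 1000, 0.01, lambda x: 0.5 + abs(x))
```

Generating many such paths and averaging a payoff over them is exactly the Monte Carlo usage the abstract has in mind.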

  12. Feller Processes: The Next Generation in Modeling. Brownian Motion, Lévy Processes and Beyond

    PubMed Central

    Böttcher, Björn

    2010-01-01

We present a simple construction method for Feller processes and a framework for the generation of sample paths of Feller processes. The construction is based on state space dependent mixing of Lévy processes. Brownian motion is one of the most frequently used continuous time Markov processes in applications. In recent years Lévy processes, of which Brownian motion is a special case, have also become increasingly popular. Lévy processes are spatially homogeneous, but empirical data often suggest the use of spatially inhomogeneous processes. Thus it seems necessary to go to the next level of generalization: Feller processes. These include Lévy processes, and in particular Brownian motion, as special cases but allow spatial inhomogeneities. Many properties of Feller processes are known, but proving their very existence is, in general, very technical. Moreover, an applicable framework for the generation of sample paths of a Feller process was missing. We explain, with practitioners in mind, how to overcome both of these obstacles. In particular, our simulation technique makes it possible to apply Monte Carlo methods to Feller processes. PMID:21151931

  13. [Research status and prospects of DNA test on difficult specimens].

    PubMed

    Dang, Hua-Wei; Mao, Jiong; Wang, Hui; Huang, Jiang-Ping; Bai, Xiao-Gang

    2012-02-01

This paper reviews advances in DNA detection for three types of difficult biological specimens: degraded samples, trace evidence and mixed samples. The sources of the different sample types, processing methods and relevant considerations are analyzed. New methods introduced in recent years, such as mitochondrial DNA test systems, modification of the original experimental conditions and low-volume PCR amplification, and new technologies, such as whole genome amplification, laser capture microdissection and mini-STR typing, are described.

  14. Method for determining the octane rating of gasoline samples by observing corresponding acoustic resonances therein

    DOEpatents

    Sinha, Dipen N.; Anthony, Brian W.

    1997-01-01

    A method for determining the octane rating of gasoline samples by observing corresponding acoustic resonances therein. A direct correlation between the octane rating of gasoline and the frequency of corresponding acoustic resonances therein has been experimentally observed. Therefore, the octane rating of a gasoline sample can be directly determined through speed of sound measurements instead of by the cumbersome process of quantifying the knocking quality of the gasoline. Various receptacle geometries and construction materials may be employed. Moreover, it is anticipated that the measurements can be performed on flowing samples in pipes, thereby rendering the present method useful in refineries and distilleries.
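The patent abstract ties octane rating to the frequency of acoustic resonances, which in turn depends on the speed of sound in the fuel. A minimal sketch, assuming a rigid cavity closed at both ends (standing-wave modes at f_n = n·v/2L) and a hypothetical linear calibration against reference fuels; both the geometry and the calibration coefficients are assumptions, not values from the patent:

```python
def resonance_frequency(sound_speed, cavity_length, mode=1):
    """Standing-wave resonance of a fluid column in a rigid cavity closed
    at both ends: f_n = n * v / (2 * L)."""
    return mode * sound_speed / (2.0 * cavity_length)

def octane_from_frequency(freq_hz, slope, intercept):
    """Hypothetical linear calibration mapping resonance frequency to
    octane rating; slope and intercept would come from reference fuels."""
    return slope * freq_hz + intercept

# Example: sound speed 1200 m/s in a 0.1 m receptacle gives the fundamental
f0 = resonance_frequency(1200.0, 0.1)
octane = octane_from_frequency(f0, 0.01, 30.0)
```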

  15. Preview of the NASA NNWG NDE Sample Preparation Handbook

    NASA Technical Reports Server (NTRS)

    2010-01-01

This viewgraph presentation provides step-by-step fabrication documentation, with photos and illustrations, for every kind of sample fabricated for MSFC by UA Huntsville. It includes a tabulation of which kinds of samples are being fabricated for which NDE method, detailed instructions and documentation for the inclusion or creation of defects, detailed specifications for materials, processes and equipment, case histories and experiences with the different fabrication methods and defect-inclusion techniques, a discussion of pitfalls and difficulties associated with sample fabrication and defect inclusion, and a discussion of why certain fabrication techniques are needed for specific NDE methods.

  16. Rapid extraction and assay of uranium from environmental surface samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barrett, Christopher A.; Chouyyok, Wilaiwan; Speakman, Robert J.

Extraction methods enabling faster removal and concentration of uranium compounds for improved trace and low-level assay are demonstrated for standard surface sampling material in support of nuclear safeguards efforts, health monitoring, and other nuclear analysis applications. A key problem with the existing surface sampling swipes is the requirement for complete digestion of the sample and sampling matrix. This is a time-consuming and labour-intensive process that limits laboratory throughput, elevates costs, and increases background levels. Various extraction methods are explored for their potential to quickly and efficiently remove different chemical forms of uranium from standard surface sampling material. A combination of carbonate and peroxide solutions is shown to give the most rapid and complete form of uranyl compound extraction and dissolution. This rapid extraction process is demonstrated to be compatible with standard inductively coupled plasma mass spectrometry methods for uranium isotopic assay as well as screening techniques such as x-ray fluorescence. The general approach described has application beyond uranium to other analytes of nuclear forensic interest (e.g., rare earth elements and plutonium) as well as heavy metals for environmental and industrial hygiene monitoring.

  17. Compressive Sensing of Roller Bearing Faults via Harmonic Detection from Under-Sampled Vibration Signals

    PubMed Central

    Tang, Gang; Hou, Wei; Wang, Huaqing; Luo, Ganggang; Ma, Jianwei

    2015-01-01

The Shannon sampling principle requires substantial amounts of data to ensure the accuracy of on-line monitoring of roller bearing fault signals. Challenges are often encountered as a result of the cumbersome data monitoring; thus, a novel method focused on compressed vibration signals for detecting roller bearing faults is developed in this study. Considering that harmonics often represent the fault characteristic frequencies in vibration signals, a compressive sensing frame of characteristic harmonics is proposed to detect bearing faults. A compressed vibration signal is first acquired from a sensing matrix with information preserved through a well-designed sampling strategy. A reconstruction process of the under-sampled vibration signal is then pursued as attempts are conducted to detect the characteristic harmonics from sparse measurements through a compressive matching pursuit strategy. In the proposed method, bearing fault features depend on the existence of characteristic harmonics, which are typically detected directly from the compressed data well before reconstruction is complete. The processes of sampling and detection may then be performed simultaneously without complete recovery of the under-sampled signals. The effectiveness of the proposed method is validated by simulations and experiments. PMID:26473858
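The detection idea, correlating the compressed measurements against compressed harmonic atoms without reconstructing the full signal, can be sketched as a one-step matching pursuit. The sensing matrix (random ±1 projections), signal length, and candidate fault frequencies below are all invented for illustration and are not the paper's design:

```python
import math
import random

def make_harmonic(n, freq, fs):
    """Unit-amplitude sinusoid of length n sampled at rate fs."""
    return [math.sin(2.0 * math.pi * freq * t / fs) for t in range(n)]

def compress(signal, phi):
    """y = Phi s: random projection of the full-length signal."""
    return [sum(r * s for r, s in zip(row, signal)) for row in phi]

def detect_harmonic(y, phi, candidates, n, fs):
    """One-step matching pursuit on compressed data: pick the candidate
    frequency whose compressed atom correlates most strongly with the
    measurements y, never reconstructing the full-length signal."""
    best, best_score = None, -1.0
    for f in candidates:
        atom = compress(make_harmonic(n, f, fs), phi)
        norm = math.sqrt(sum(a * a for a in atom)) or 1.0
        score = abs(sum(a * b for a, b in zip(atom, y))) / norm
        if score > best_score:
            best, best_score = f, score
    return best

rng = random.Random(42)
n, m, fs = 256, 64, 1000.0          # signal length, measurements, sample rate
phi = [[rng.choice((-1.0, 1.0)) for _ in range(n)] for _ in range(m)]
fault_signal = make_harmonic(n, 125.0, fs)   # hypothetical fault frequency
y = compress(fault_signal, phi)              # 4x compression: 256 -> 64
found = detect_harmonic(y, phi, [50.0, 80.0, 125.0, 200.0], n, fs)
```

Because random projections approximately preserve inner products, the true harmonic's atom scores far above the others, which is why detection can succeed well before (or without) full reconstruction.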

  18. Effect of food processing on degradation of hexachlorocyclohexane and its isomers in milk

    PubMed Central

    Singh, Sujatha; Nelapati, Krishnaiah

    2017-01-01

    Aim: To study the effect of different food processing techniques on the degradation of organochlorine compound residues (the α, β, γ and δ isomers of hexachlorocyclohexane, HCH) in both natural and fortified samples of milk. Materials and Methods: Raw milk samples were collected from local areas of Hyderabad, India. Natural and HCH-fortified milk samples were subjected to various food processing techniques, pasteurization (63°C for ½ h), sterilization (121°C for 15 min) and boiling for 5 min, and analyzed by gas chromatography with electron capture detection using the quick, easy, cheap, effective, rugged and safe (QuEChERS) method for multiresidue analysis of pesticides in milk, with slight modification. Results: The final mean residual concentration of pesticide in milk after heat processing and the percentage of degradation were calculated for the respective treatments. Conclusion: Heat treatments are highly effective in reducing the mean residual concentration of HCH in milk; sterilization and boiling proved to be the most effective in degrading HCH isomers. PMID:28435187
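The reported percentage of degradation follows from the residual concentrations before and after heat treatment; a minimal sketch with hypothetical values:

```python
# Percent degradation of a pesticide residue after heat treatment.
# The concentrations below (mg/kg) are hypothetical, not from the study.
def percent_degradation(initial, final):
    """Fractional loss of residue relative to the untreated sample, in %."""
    return 100.0 * (initial - final) / initial

before, after_boiling = 0.050, 0.018
loss = percent_degradation(before, after_boiling)   # 64% degradation
```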

  19. A validated stability-indicating RP-HPLC method for levofloxacin in the presence of degradation products, its process related impurities and identification of oxidative degradant.

    PubMed

    Lalitha Devi, M; Chandrasekhar, K B

    2009-12-05

    The objective of the current study was to develop a validated, specific, stability-indicating reversed-phase liquid chromatographic method for the quantitative determination of levofloxacin and its related substances in bulk samples and pharmaceutical dosage forms in the presence of degradation products and process-related impurities. Forced degradation studies were performed on a bulk sample of levofloxacin under ICH-prescribed stress conditions (acid, base, oxidative, water hydrolysis, thermal stress and photolytic degradation) to show the stability-indicating power of the method. Significant degradation was observed under oxidative stress, and the degradation product formed was identified by LC-MS/MS; slight degradation occurred under acidic stress, and no degradation was observed under the other stress conditions. The chromatographic method was optimized using the samples generated from the forced degradation studies and the impurity-spiked solution. Good resolution between the analyte peak and the peaks corresponding to process-related impurities and degradation products was achieved on an ACE C18 column using a mobile phase consisting of a mixture of 0.5% (v/v) triethylamine in sodium dihydrogen orthophosphate dihydrate buffer (25 mM; pH 6.0) and methanol, with a simple linear gradient. Detection was carried out at 294 nm. The limits of detection and quantitation for levofloxacin and its process-related impurities were established. The stressed test solutions were assayed against the qualified working standard of levofloxacin, and the mass balance in each case was between 99.4 and 99.8%, indicating that the developed LC method was stability indicating. Validation of the developed LC method was carried out as per ICH requirements. The developed LC method was found to be suitable for checking the quality of bulk samples of levofloxacin at the time of batch release and also during stability studies (long term and accelerated).

  20. Experimental assessment of the purity of α-cellulose produced by variations of the Brendel method: Implications for stable isotope (δ13C, δ18O) dendroclimatology

    NASA Astrophysics Data System (ADS)

    Brookman, Tom; Whittaker, Thomas

    2012-09-01

    Stable isotope dendroclimatology using α-cellulose has unique potential to deliver multimillennial-scale, sub-annually resolved, terrestrial climate records. However, lengthy processing and analytical methods often preclude such reconstructions. Variants of the Brendel extraction method have reduced these limitations, providing fast, easy methods of isolating α-cellulose in some species. Here, we investigate application of Standard Brendel (SBrendel) variants to resinous softwoods by treating samples of kauri (Agathis australis), ponderosa pine (Pinus ponderosa) and Huon pine (Lagarostrobos franklinii), varying reaction vessel, temperature, boiling time and reagent volume. Numerous samples were visibly `under-processed' and Fourier transform infrared spectroscopic (FTIR) investigation showed absorption peaks at 1520 cm-1 and ~1600 cm-1 in those fibers, suggesting residual lignin and retained resin, respectively. Replicate analyses of all samples processed at high temperature yielded consistent δ13C and δ18O despite color and spectral variations. Spectra and isotopic data revealed that α-cellulose δ13C can be altered during processing, most likely due to chemical contamination from insufficient acetone removal, but is not systematically affected by methodological variation. Reagent amount, temperature and extraction time all influence δ18O, however, and our results demonstrate that different species may require different processing methods. FTIR prior to isotopic analysis is a fast and cost-effective way to determine α-cellulose extract purity. Furthermore, a systematic isotopic test such as we present here can also determine the sensitivity of isotopic values to methodological variables. Without these tests, isotopic variability introduced by the method could obscure or `create' climatic signals within a data set.

  1. Learning process mapping heuristics under stochastic sampling overheads

    NASA Technical Reports Server (NTRS)

    Ieumwananonthachai, Arthur; Wah, Benjamin W.

    1991-01-01

    A statistical method was developed previously for improving process mapping heuristics. The method systematically explores the space of possible heuristics under a specified time constraint. Its goal is to find the best possible heuristics while trading off the solution quality of the process mapping heuristics against their execution time. Here, the statistical selection method is extended to take into consideration the variations in the amount of time used to evaluate heuristics on a problem instance. The improvement in performance under this more realistic assumption is presented, along with some methods that alleviate the additional complexity.
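A much-simplified sketch of choosing among heuristics when each evaluation consumes a stochastic amount of a fixed time budget (the heuristics, qualities, and timing distributions below are invented, and round-robin sampling stands in for the authors' statistical selection method):

```python
import random

random.seed(4)

# Hypothetical heuristics: (mean solution quality, mean evaluation time).
heuristics = {"h1": (0.70, 1.0), "h2": (0.85, 2.5), "h3": (0.80, 1.5)}

def evaluate(name):
    """One noisy evaluation: returns (quality, elapsed time), both stochastic."""
    q, t = heuristics[name]
    return random.gauss(q, 0.05), random.expovariate(1.0 / t)

# Round-robin evaluation under a total time budget; the stochastic
# evaluation cost itself consumes the budget.
budget, spent = 200.0, 0.0
scores = {name: [] for name in heuristics}
order = list(heuristics)
i = 0
while spent < budget:
    name = order[i % len(order)]
    quality, elapsed = evaluate(name)
    scores[name].append(quality)
    spent += elapsed
    i += 1

# Pick the heuristic with the best estimated mean quality.
best = max(scores, key=lambda n: sum(scores[n]) / len(scores[n]))
```

The sketch shows only the budget-aware sampling loop; a statistical selection method would additionally allocate evaluations adaptively rather than round-robin.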

  2. Laser excited confocal microscope fluorescence scanner and method

    DOEpatents

    Mathies, R.A.; Peck, K.

    1992-02-25

    A fluorescent scanner is designed for scanning the fluorescence from a fluorescence labeled separated sample on a sample carrier. The scanner includes a confocal microscope for illuminating a predetermined volume of the sample carrier and/or receiving and processing fluorescence emissions from the volume to provide a display of the separated sample. 8 figs.

  3. Comparability of river suspended-sediment sampling and laboratory analysis methods

    USGS Publications Warehouse

    Groten, Joel T.; Johnson, Gregory D.

    2018-03-06

    Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS results to be biased substantially low; the difference attributable to laboratory analysis methods was slightly greater than that attributable to field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods. These results indicated that grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream, and that there is less of a difference between grab samples analyzed for TSS and the fine fraction of SSC. Even though differences are present, the strong correlations between SSC and TSS concentrations provide the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.
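A site-specific relation of the kind described can be sketched as an ordinary least-squares fit of SSC on TSS (the paired values below are hypothetical, not data from the study):

```python
import numpy as np

# Hypothetical paired measurements at one site (mg/L). TSS from grab samples
# tends to under-represent SSC from EWDI sampling, especially when sand is present.
tss = np.array([12.0, 25.0, 40.0, 61.0, 88.0, 120.0])
ssc = np.array([18.0, 36.0, 55.0, 90.0, 130.0, 171.0])

# Ordinary least squares: SSC ~ a * TSS + b, a site-specific correction relation.
a, b = np.polyfit(tss, ssc, 1)

# Estimated SSC for a new TSS reading of 100 mg/L at this site.
predicted = a * 100.0 + b
```

A slope greater than 1 reflects the low bias of the TSS method; such a relation is site specific and, as the record notes, not transferable between sites.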

  4. A Synopsis of Technical Issues for Monitoring Sediment in Highway and Urban Runoff

    USGS Publications Warehouse

    Bent, Gardner C.; Gray, John R.; Smith, Kirk P.; Glysson, G. Douglas

    2000-01-01

    Accurate and representative sediment data are critical for assessing the potential effects of highway and urban runoff on receiving waters. The U.S. Environmental Protection Agency identified sediment as the most widespread pollutant in the Nation's rivers and streams, affecting aquatic habitat, drinking water treatment processes, and recreational uses of rivers, lakes, and estuaries. Representative sediment data are also necessary for quantifying and interpreting concentrations, loads, and effects of trace elements and organic constituents associated with highway and urban runoff. Many technical issues associated with the collecting, processing, and analyzing of samples must be addressed to produce valid (useful for intended purposes), current, complete, and technically defensible data for local, regional, and national information needs. All aspects of sediment data-collection programs need to be evaluated, and adequate quality-control data must be collected and documented so that the comparability and representativeness of data obtained for highway- and urban-runoff studies may be assessed. Collection of representative samples for the measurement of sediment in highway and urban runoff involves a number of interrelated issues. Temporal and spatial variability in runoff result from a combination of factors, including volume and intensity of precipitation, rate of snowmelt, and features of the drainage basin such as area, slope, infiltration capacity, channel roughness, and storage characteristics. In small drainage basins such as those found in many highway and urban settings, automatic samplers are often the most suitable method for collecting samples of runoff for a variety of reasons. Indirect sediment-measurement methods are also useful as supplementary and(or) surrogate means for monitoring sediment in runoff. All of these methods have limitations in addition to benefits, which must be identified and quantified to produce representative data. 
Methods for processing raw sediment samples (including homogenization and subsampling) for subsequent analysis for total suspended solids or suspended-sediment concentration often increase variance and may introduce bias. Processing artifacts can be substantial if the methods used are not appropriate for the concentrations and particle-size distributions present in the samples collected. Analytical methods for determining sediment concentrations include the suspended-sediment concentration and the total suspended solids methods. Although the terms suspended-sediment concentration and total suspended solids are often used interchangeably to describe the total concentration of suspended solid-phase material, the analytical methods differ and can produce substantially different results. The total suspended solids method, which commonly is used to produce highway- and urban-runoff sediment data, may not be valid for studies of runoff water quality. Studies of fluvial and highway-runoff sediment data indicate that analyses of samples by the total suspended solids method tend to underrepresent the true sediment concentration, and that relations between total suspended solids and suspended-sediment concentration are not transferable from site to site even when grain-size distribution information is available. Total suspended solids data used to calculate suspended-sediment loads in highway and urban runoff may be fundamentally unreliable. Consequently, use of total suspended solids data may have adverse consequences for the assessment, design, and maintenance of sediment-removal best management practices. Therefore, it may be necessary to analyze water samples using the suspended-sediment concentration method. Data quality, comparability, and utility are important considerations in collection, processing, and analysis of sediment samples and interpretation of sediment data for highway- and urban-runoff studies. 
Results from sediment studies must be comparable and readily transferable.

  5. Multi-parameters monitoring during traditional Chinese medicine concentration process with near infrared spectroscopy and chemometrics

    NASA Astrophysics Data System (ADS)

    Liu, Ronghua; Sun, Qiaofeng; Hu, Tian; Li, Lian; Nie, Lei; Wang, Jiayue; Zhou, Wanhui; Zang, Hengchang

    2018-03-01

    As a powerful process analytical technology (PAT) tool, near infrared (NIR) spectroscopy has been widely used in real-time monitoring. In this study, NIR spectroscopy was applied to monitor multiple parameters of the traditional Chinese medicine (TCM) Shenzhiling oral liquid during the concentration process to guarantee the quality of products. Five lab-scale batches were employed to construct quantitative models to determine five chemical ingredients and one physical parameter (sample density) during the concentration process. Paeoniflorin, albiflorin, liquiritin and sample density were modeled by partial least squares regression (PLSR), while the contents of glycyrrhizic acid and cinnamic acid were modeled by support vector machine regression (SVMR). Standard normal variate (SNV) and/or Savitzky-Golay (SG) smoothing with derivative methods were adopted for spectral pretreatment. Variable selection methods including correlation coefficient (CC), competitive adaptive reweighted sampling (CARS) and interval partial least squares regression (iPLS) were performed to optimize the models. The results indicated that NIR spectroscopy was an effective tool for monitoring the concentration process of Shenzhiling oral liquid.

  6. Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters.

    PubMed

    Hensman, James; Lawrence, Neil D; Rattray, Magnus

    2013-08-20

    Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering.The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
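The core property exploited here, that a Gaussian process needs no evenly spaced samples, can be sketched with a plain (non-hierarchical) GP posterior mean on an irregular time grid (times, values, and kernel settings below are invented):

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

# Irregularly sampled "expression" time course (hypothetical values).
t_obs = np.array([0.0, 0.7, 1.1, 2.9, 4.0, 6.5])
y_obs = np.sin(t_obs)

# GP posterior mean at arbitrary times -- no even spacing or shared
# sampling schema across replicates is required.
t_new = np.array([1.1, 2.0])
noise = 1e-4
k_oo = rbf(t_obs, t_obs) + noise * np.eye(len(t_obs))
k_no = rbf(t_new, t_obs)
mean = k_no @ np.linalg.solve(k_oo, y_obs)
```

The hierarchical model of the record couples several such GPs (gene, replicate, cluster levels); the sketch shows only the single-level posterior-mean computation they share.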

  7. [Optimization of Cryptosporidium and Giardia detection in water environment using automatic elution station Filta-Max xpress].

    PubMed

    Matuszewska, Renata; Szczotko, Maciej; Krogulska, Bozena

    2012-01-01

    The presence of parasitic protozoa in drinking water is mostly a result of an improperly maintained water treatment process. Currently, testing for Cryptosporidium and Giardia in water is not performed in Poland as a part of routine water monitoring. The aim of this study was to optimize the method for Cryptosporidium and Giardia detection in water according to the main principles of standard ISO 15553:2006, using the Filta-Max xpress automatic elution station. Preliminary tests were performed on samples contaminated with oocysts and cysts of reference strains of both parasitic protozoa. Further studies were carried out on environmental samples of surface water taken directly from water intakes (21 samples from the Vistula River and 8 samples from Zegrzynski Lake). Filtration and sample volume reduction were performed using the Filta-Max xpress automatic elution system. Next, samples were purified by immunomagnetic separation (IMS). Isolated cysts and oocysts were stained with FITC and DAPI, and microscopic observation was carried out using an epifluorescence microscope. Mean recovery of parasitic protozoa in all contaminated water samples after the applied 9-cycle elution process was 60.6% for Cryptosporidium oocysts and 36.1% for Giardia cysts. Studies on the environmental surface water samples showed the presence of both parasitic protozoa. The number of detected Giardia cysts ranged from 1.0/10 L up to 4.5/10 L in samples from Zegrzynski Lake and from 1.0/10 L up to 38.9/10 L in samples from the Vistula River. Cryptosporidium oocysts were present in 50% of samples from Zegrzynski Lake and in 47.6% of samples from the Vistula River; their numbers were similar in both cases, ranging from 0.5 up to 2.5 oocysts/10 L. 
The results show that the applied procedure is appropriate for detecting parasitic protozoa in water, but when water contains large amounts of inorganic matter and suspended solids, the test method has to be modified, for example by preparing subsamples and reducing the filtration speed. With this modification, the method using the Filta-Max xpress system can be useful for routine water monitoring. Detection of Cryptosporidium and Giardia in all samples of water taken from surface water intakes shows the possibility of transfer of protozoan cysts into water intended for consumption; therefore, testing for Cryptosporidium and Giardia should be included in water monitoring.

  8. Floating Ultrasonic Transducer Inspection System and Method for Nondestructive Evaluation

    NASA Technical Reports Server (NTRS)

    Johnston, Patrick H. (Inventor); Zalameda, Joseph N. (Inventor)

    2016-01-01

    A method for inspecting a structural sample using ultrasonic energy includes positioning an ultrasonic transducer adjacent to a surface of the sample, and then transmitting ultrasonic energy into the sample. Force pulses are applied to the transducer concurrently with transmission of the ultrasonic energy. A host machine processes ultrasonic return pulses from an ultrasonic pulser/receiver to quantify attenuation of the ultrasonic energy within the sample. The host machine detects a defect in the sample using the quantified level of attenuation. The method may include positioning a dry couplant between an ultrasonic transducer and the surface. A system includes an actuator, an ultrasonic transducer, a dry couplant between the transducer and the sample, a scanning device that moves the actuator and transducer, and a measurement system having a pulsed actuator power supply, an ultrasonic pulser/receiver, and a host machine that executes the above method.

  9. Digital Curation of Earth Science Samples Starts in the Field

    NASA Astrophysics Data System (ADS)

    Lehnert, K. A.; Hsu, L.; Song, L.; Carter, M. R.

    2014-12-01

    Collection of physical samples in the field is an essential part of research in the Earth Sciences. Samples provide a basis for progress across many disciplines, from the study of global climate change now and over the Earth's history, to present and past biogeochemical cycles, to magmatic processes and mantle dynamics. The types of samples, methods of collection, and scope and scale of sampling campaigns are highly diverse, ranging from large-scale programs to drill rock and sediment cores on land, in lakes, and in the ocean, to environmental observation networks with continuous sampling, to single investigator or small team expeditions to remote areas around the globe or trips to local outcrops. Cyberinfrastructure for sample-related fieldwork needs to cater to the different needs of these diverse sampling activities, aligning with specific workflows, regional constraints such as connectivity or climate, and processing of samples. In general, digital tools should assist with capture and management of metadata about the sampling process (location, time, method) and the sample itself (type, dimension, context, images, etc.), management of the physical objects (e.g., sample labels with QR codes), and the seamless transfer of sample metadata to data systems and software relevant to the post-sampling data acquisition, data processing, and sample curation. In order to optimize CI capabilities for samples, tools and workflows need to adopt community-based standards and best practices for sample metadata, classification, identification and registration. 
This presentation will provide an overview and updates of several ongoing efforts that are relevant to the development of standards for digital sample management: the ODM2 project that has generated an information model for spatially-discrete, feature-based earth observations resulting from in-situ sensors and environmental samples, aligned with OGC's Observation & Measurements model (Horsburgh et al, AGU FM 2014); implementation of the IGSN (International Geo Sample Number) as a globally unique sample identifier via a distributed system of allocating agents and a central registry; and the EarthCube Research Coordination Network iSamplES (Internet of Samples in the Earth Sciences) that aims to improve sharing and curation of samples through the use of CI.
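A minimal sketch of field-side metadata capture with a QR-ready payload (the record layout, field names, and sample identifier below are invented; this is not the ODM2 information model or the IGSN registration format):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class FieldSample:
    """Hypothetical field-captured sample metadata: location, time, type.

    The `igsn` field is left empty until the sample is registered with an
    allocating agent; the other fields are captured at the sampling site.
    """
    name: str
    sample_type: str
    latitude: float
    longitude: float
    collected_utc: str
    igsn: str = ""

    def qr_payload(self) -> str:
        """Serialize the record deterministically for printing as a QR label."""
        return json.dumps(asdict(self), sort_keys=True)

# Example: a rock core logged in the field before registration.
s = FieldSample("KL-2014-007", "rock core", 40.7, -74.0, "2014-08-01T14:30:00Z")
payload = s.qr_payload()
```

Capturing sampling metadata digitally at collection time, as the record argues, lets the same structure flow unchanged into post-sampling data systems and registries.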

  10. Comparison of human umbilical cord blood processing with or without hydroxyethyl starch.

    PubMed

    Souri, Milad; Nikougoftar Zarif, Mahin; Rasouli, Mahboobeh; Golzadeh, Khadijeh; Nakhlestani Hagh, Mozhdeh; Ezzati, Nasim; Atarodi, Kamran

    2017-11-01

    Umbilical cord blood (UCB) processing with hydroxyethyl starch (HES) is the most common protocol in cord blood banks. The quality of UCB volume reduction is guaranteed by minimal manipulation of cord blood samples in a closed system. This study aimed to analyze and compare cell recovery and viability of UCB processed using the Sepax automated system in the presence and absence of HES. Thirty UCB bags with a total nucleated cell (TNC) count of more than 2.5 × 10⁹ were each divided into two bags of equal volume. HES solution was added to one bag while the other was left intact. Both bags were processed with the Sepax. To determine cell recovery, viability, and colony-forming cell (CFC) potential, preprocessing, postprocessing, and postthaw samples were analyzed. The mean TNC recovery after processing and after thaw was significantly better with the HES method (p < 0.01 for the postprocessing step and p < 0.05 for the postthaw step). There were no significant differences in mononuclear cell (MNC) and CD34+ cell recovery between the two methods after processing and after thaw. TNC and MNC viability was significantly higher without HES after processing and after thaw (p < 0.01). The results of the CFC assay were similar for both methods after processing and after thaw. These results showed that processing of UCB using the Sepax system without HES, owing to the lower manipulation of samples, could be used as an eligible protocol to reduce the volume of UCB. © 2017 AABB.

  11. 40 CFR 60.285 - Test methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... cleanup solvent instead of acetone in the sample recovery procedure. The particulate concentration shall... dscm (31.8 dscf). Water shall be used instead of acetone in the sample recovery. (3) Process data shall...

  12. Environmental Sampling & Analytical Methods (ESAM) Program - Home

    EPA Pesticide Factsheets

    ESAM is a comprehensive program to facilitate a coordinated response to a chemical, radiochemical, biotoxin or pathogen contamination incident focusing on sample collection, processing, and analysis to provide quality results to the field.

  13. Rapid quality assessment of Radix Aconiti Preparata using direct analysis in real time mass spectrometry.

    PubMed

    Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2012-11-08

    This study presents a novel and rapid method to identify chemical markers for the quality control of Radix Aconiti Preparata, a traditional herbal medicine used worldwide. In the method, samples prepared with a fast extraction procedure were analyzed using direct analysis in real time mass spectrometry (DART MS) combined with multivariate data analysis. At present, the quality assessment approach for Radix Aconiti Preparata is based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing the toxicity of Radix Aconiti and ensuring its clinical therapeutic efficacy. To ensure safety and efficacy in clinical use, the degree of processing of Radix Aconiti should be well controlled and assessed. In this paper, hierarchical cluster analysis and principal component analysis were performed to evaluate the DART MS data of Radix Aconiti Preparata samples at different processing times. The results showed that well-processed Radix Aconiti Preparata, insufficiently processed samples and raw Radix Aconiti could be clustered reasonably according to their constituents. The loading plot shows that the chemical markers with the most influence on the discrimination between qualified and unqualified samples were mainly monoester diterpenoid aconitines and diester diterpenoid aconitines, i.e. benzoylmesaconine, hypaconitine, mesaconitine, neoline, benzoylhypaconine, benzoylaconine, fuziline, aconitine and 10-OH-mesaconitine. The established DART MS approach in combination with multivariate data analysis provides a flexible and reliable method for quality assessment of toxic herbal medicines. Copyright © 2012 Elsevier B.V. All rights reserved.
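The PCA step of such multivariate screening can be sketched on a toy intensity matrix (channel intensities and group means below are invented, not DART MS data):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)

# Toy "spectral intensity" matrix over 4 hypothetical marker channels:
# raw material is rich in the first two channels, well-processed material
# is depleted in them. Eight samples per group.
raw = rng.normal([10.0, 8.0, 1.0, 1.0], 0.5, (8, 4))
processed = rng.normal([2.0, 1.5, 6.0, 7.0], 0.5, (8, 4))
X = np.vstack([raw, processed])

# Project onto two principal components; the first axis captures the
# between-group difference in marker intensities.
scores = PCA(n_components=2).fit_transform(X)
pc1 = scores[:, 0]
separated = (pc1[:8] > pc1[8:].max()).all() or (pc1[:8] < pc1[8:].min()).all()
```

Hierarchical clustering on the same score matrix would then group samples by processing state, mirroring the clustering reported in the record.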

  14. Consensus Classification Using Non-Optimized Classifiers.

    PubMed

    Brownfield, Brett; Lemos, Tony; Kalivas, John H

    2018-04-03

    Classifying samples into categories is a common problem in analytical chemistry and other fields. Classification is usually based on only one method, but numerous classifiers are available, some complex, such as neural networks, and others simple, such as k nearest neighbors. Regardless, most classification schemes require optimization of one or more tuning parameters for best classification accuracy, sensitivity, and specificity. A process not requiring exact selection of tuning parameter values would be useful. To improve classification, several ensemble approaches have been used in past work to combine classification results from multiple optimized single classifiers. The collection of classifications for a particular sample is then combined by a fusion process such as majority vote to form the final classification. Presented in this Article is a method to classify a sample by combining multiple classification methods without specifically classifying the sample by each method; that is, the classification methods are not optimized. The approach is demonstrated on three analytical data sets. The first is a beer authentication set with samples measured on five instruments, allowing fusion of multiple instruments in three ways. The second data set is composed of textile samples from three classes based on Raman spectra. This data set is used to demonstrate the ability to classify simultaneously with different data preprocessing strategies, thereby reducing the need to determine the ideal preprocessing method, a common prerequisite for accurate classification. The third data set contains three wine cultivars, forming three classes, measured on 13 unique chemical and physical variables. In all cases, fusion of nonoptimized classifiers improves classification. Also presented are atypical uses of Procrustes analysis and extended inverted signal correction (EISC) for distinguishing sample similarities to respective classes.
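The majority-vote fusion step used by conventional ensembles can be sketched as follows (the classifier outputs below are invented):

```python
import numpy as np
from collections import Counter

# Hypothetical class labels assigned to ten samples by three classifiers.
votes = np.array([
    ["A", "A", "B", "C", "A", "B", "B", "C", "C", "A"],   # classifier 1
    ["A", "B", "B", "C", "A", "B", "C", "C", "C", "A"],   # classifier 2
    ["A", "A", "B", "B", "A", "B", "B", "C", "A", "A"],   # classifier 3
])

def majority_vote(column):
    """Fuse one sample's votes; ties resolve to the first-encountered label."""
    return Counter(column).most_common(1)[0][0]

fused = [majority_vote(votes[:, j]) for j in range(votes.shape[1])]
```

The Article's contribution is to skip the per-classifier optimization that normally precedes this fusion; the vote itself is unchanged.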

  15. Rapid prediction of ochratoxin A-producing strains of Penicillium on dry-cured meat by MOS-based electronic nose.

    PubMed

    Lippolis, Vincenzo; Ferrara, Massimo; Cervellieri, Salvatore; Damascelli, Anna; Epifani, Filomena; Pascale, Michelangelo; Perrone, Giancarlo

    2016-02-02

    The availability of rapid diagnostic methods for monitoring ochratoxigenic species during the seasoning processes for dry-cured meats is crucial and constitutes a key stage in preventing the risk of ochratoxin A (OTA) contamination. A rapid, easy-to-perform and non-invasive method using an electronic nose (e-nose) based on metal oxide semiconductors (MOS) was developed to discriminate dry-cured meat samples into two classes based on fungal contamination: class P (samples contaminated by OTA-producing Penicillium strains) and class NP (samples contaminated by OTA non-producing Penicillium strains). Two OTA-producing strains of Penicillium nordicum and two OTA non-producing strains of Penicillium nalgiovense and Penicillium salamii were tested. The feasibility of this approach was initially evaluated by e-nose analysis of 480 samples of both yeast extract sucrose (YES) and meat-based agar media inoculated with the tested Penicillium strains and incubated for up to 14 days. The high recognition percentages (higher than 82%) obtained by Discriminant Function Analysis (DFA), in both calibration and cross-validation (leave-more-out approach), for both YES and meat-based samples demonstrated the validity of the approach. The e-nose method was subsequently developed and validated for the analysis of dry-cured meat samples. A total of 240 e-nose analyses were carried out using inoculated sausages, seasoned by a laboratory-scale process and sampled at 5, 7, 10 and 14 days. DFA provided calibration models that permitted discrimination of dry-cured meat samples after only 5 days of seasoning, with mean recognition percentages in calibration and cross-validation of 98 and 88%, respectively. A further validation of the developed e-nose method was performed using 60 dry-cured meat samples produced by an industrial-scale seasoning process, showing a total recognition percentage of 73%. 
The pattern of volatile compounds of dry-cured meat samples was identified and characterized by a developed HS-SPME/GC-MS method. Seven volatile compounds (2-methyl-1-butanol, octane, 1R-α-pinene, d-limonene, undecane, tetradecanal, 9-(Z)-octadecenoic acid methyl ester) allowed discrimination between dry-cured meat samples of classes P and NP. These results demonstrate that MOS-based electronic nose can be a useful tool for a rapid screening in preventing OTA contamination in the cured meat supply chain. Copyright © 2015 Elsevier B.V. All rights reserved.
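Discriminant function analysis of e-nose data can be sketched with linear discriminant analysis and cross-validation on toy sensor readings (the sensor values and class structure below are invented, not the study's measurements):

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Toy e-nose readings: 6 MOS sensor channels, 30 samples per class.
# Class P (OTA-producing contamination) shifts two hypothetical channels.
class_p = rng.normal(1.0, 0.3, (30, 6)) + np.array([0, 0, 1.0, 1.0, 0, 0])
class_np = rng.normal(1.0, 0.3, (30, 6))
X = np.vstack([class_p, class_np])
y = np.array([1] * 30 + [0] * 30)

# Cross-validated recognition rate, analogous to the study's leave-more-out
# check of the DFA models (LDA used here as the standard DFA implementation).
acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
```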

  16. Processing of solid solution, mixed uranium/refractory metal carbides for advanced space nuclear power and propulsion systems

    NASA Astrophysics Data System (ADS)

    Knight, Travis Warren

    Nuclear thermal propulsion (NTP) and space nuclear power are two enabling technologies for the manned exploration of space and the development of research outposts in space and on other planets such as Mars. Advanced carbide nuclear fuels have been proposed for application in space nuclear power and propulsion systems. This study examined the processing technologies and optimal parameters necessary to fabricate samples of single phase, solid solution, mixed uranium/refractory metal carbides. In particular, the pseudo-ternary carbide, UC-ZrC-NbC, system was examined with uranium metal mole fractions of 5% and 10% and corresponding uranium densities of 0.8 to 1.8 gU/cc. Efforts were directed to those methods that could produce simple geometry fuel elements or wafers such as those used to fabricate a Square Lattice Honeycomb (SLHC) fuel element and reactor core. Methods of cold uniaxial pressing, sintering by induction heating, and hot pressing by self-resistance heating were investigated. Solid solution, high density (low porosity) samples greater than 95% TD were processed by cold pressing at 150 MPa and sintering above 2600 K for times longer than 90 min. Impurity oxide phases noted in some samples were attributed to residual gases in the furnace during processing. Some samples also showed secondary phases of carbon and UC2, attributable to hyperstoichiometric powder mixtures with carbon-to-metal ratios greater than one. In all, 33 mixed carbide samples were processed and analyzed, half of them bearing uranium as ternary carbides of UC-ZrC-NbC. Scanning electron microscopy, x-ray diffraction, and density measurements were used to characterize samples. Samples were processed from powders of the refractory mono-carbides and UC/UC2 or from powders of uranium hydride (UH3), graphite, and refractory metal carbides to produce hypostoichiometric mixed carbides.
Samples processed from the constituent carbide powders and sintered at temperatures above the melting point of UC showed signs of liquid phase sintering and were shown to be largely solid solutions. Pre-compaction of mixed carbide powders prior to sintering was shown to be necessary to achieve high densities. Hypostoichiometric samples processed at 2500 K exhibited only the initial stage of sintering and solid solution formation. Based on these findings, a suggested processing methodology is proposed for producing high density, solid solution, mixed carbide fuels. Pseudo-binary, refractory carbide samples hot pressed at 3100 K and 6 MPa showed densities (approximately 85% of the theoretical value) comparable to samples processed by cold pressing and sintering at temperatures of 2800 K.

  17. Rotor assembly and method for automatically processing liquids

    DOEpatents

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1992-12-22

    A rotor assembly is described for performing a relatively large number of processing steps upon a sample, such as a whole blood sample, and a diluent, such as water. It includes a rotor body for rotation about an axis and includes a network of chambers within which various processing steps are performed upon the sample and diluent and passageways through which the sample and diluent are transferred. A transfer mechanism is movable through the rotor body by the influence of a magnetic field generated adjacent the transfer mechanism and movable along the rotor body, and the assembly utilizes centrifugal force, a transfer of momentum and capillary action to perform any of a number of processing steps such as separation, aliquoting, transference, washing, reagent addition and mixing of the sample and diluent within the rotor body. The rotor body is particularly suitable for automatic immunoassay analyses. 34 figs.

  18. Application of Fourier Transform Infrared Spectra (FTIR) Fingerprint in the Quality Control of Mineral Chinese Medicine Limonitum.

    PubMed

    Liu, Sheng-jin; Yang, Huan; Wu, De-kang; Xu, Chun-xiang; Lin, Rui-chao; Tian, Jin-gai; Fang, Fang

    2015-04-01

    In the present paper, an FTIR fingerprint of Limonitum (a mineral Chinese medicine) was established, and the spectra of crude, processed, and adulterant samples were compared. Eighteen batches of Limonitum samples from different production areas were analyzed, and the angle cosine of the transmittance (%) at the common peaks was calculated to assess the similarity of the FTIR fingerprints. The results showed that the similarity coefficients of the samples were all greater than 0.90. The processed samples revealed significant differences compared with the crude ones. This study characterized the compositional features of Limonitum in its FTIR fingerprint, providing a simple and fast way to distinguish crude, processed and counterfeit samples. The FTIR fingerprints provide a new method for evaluating the quality of Limonitum.
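The angle-cosine similarity used above is simply the cosine of the angle between two spectra treated as vectors of transmittance values sampled at the common peaks. A minimal sketch (the peak values below are invented for illustration, not taken from the study):

```python
import numpy as np

def angle_cosine(t1, t2):
    """Angle-cosine similarity between two transmittance vectors
    sampled at the common FTIR absorption peaks."""
    t1, t2 = np.asarray(t1, float), np.asarray(t2, float)
    return float(t1 @ t2 / (np.linalg.norm(t1) * np.linalg.norm(t2)))

# Hypothetical transmittance (%) values at five shared peaks
crude = [78.2, 65.1, 82.4, 55.0, 61.3]
batch = [77.9, 66.0, 81.7, 56.2, 60.8]
similarity = angle_cosine(crude, batch)  # near 1 for matching fingerprints
```

A value above a chosen cutoff (0.90 in the abstract) is then read as a matching fingerprint.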

  19. Quantification of Aconitum alkaloids in aconite roots by a modified RP-HPLC method.

    PubMed

    Jiang, Zhi-Hong; Xie, Ying; Zhou, Hua; Wang, Jing-Rong; Liu, Zhong-Qiu; Wong, Yuen-Fan; Cai, Xiong; Xu, Hong-Xi; Liu, Liang

    2005-01-01

    The three Aconitum alkaloids, aconitine (1), mesaconitine (2) and hypaconitine (3), are pharmacologically active but also highly toxic. A standardised method is needed for assessing the levels of these alkaloids in aconite roots in order to ensure the safe use of these plant materials as medicinal herbs. By optimising extraction, separation and measurement conditions, a reliable, reproducible and accurate method for the quantitative determination of all three Aconitum alkaloids in unprocessed and processed aconite roots has been developed. This method should be appropriate for use in the quality control of Aconitum products. The three Aconitum alkaloids were separated by a modified HPLC method employing a C18 column gradient eluted with acetonitrile and ammonium bicarbonate buffer. Quantification of Aconitum alkaloids, detected at 240 nm, in different batches of samples showed that the content of 1, 2 and 3 varied significantly. In general, the alkaloid content of unprocessed roots was higher than that of processed roots. These variations were considered to be the result of differences in species, processing methods and places of origin of the samples.
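HPLC quantification of this kind typically runs unknowns against an external calibration curve of standards at the detection wavelength. The abstract does not give the curve, so the standard concentrations and peak areas below are hypothetical; this sketch only illustrates the calibration step:

```python
import numpy as np

# Hypothetical calibration standards: concentration (ug/mL) vs. peak area at 240 nm
conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
area = np.array([12.1, 24.3, 48.0, 121.5, 240.9])

# Linear least-squares calibration fit: area = slope * conc + intercept
slope, intercept = np.polyfit(conc, area, 1)

def quantify(peak_area):
    """Convert a sample's peak area to a concentration via the fitted line."""
    return (peak_area - intercept) / slope

aconitine_ug_per_ml = quantify(60.2)  # concentration of an unknown sample
```

In practice each alkaloid gets its own calibration curve, and linearity, recovery and precision are validated separately.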

  20. Evaluation of cotton-fabric bleaching using hydrogen peroxide and Blue LED

    NASA Astrophysics Data System (ADS)

    de Oliveira, Bruno P.; Moriyama, Lilian T.; Bagnato, Vanderlei S.

    2015-06-01

    Raw cotton production requires multiple steps, one of which is the removal of impurities acquired during previous processes. This procedure, widely used by textile industries around the world, is called bleaching. Raw cotton is composed of cellulosic and non-cellulosic materials such as waxes, pectins and oils, which are responsible for its characteristic yellowish color. The bleaching process aims to reduce the concentration of non-cellulosic materials in the fabric, increasing its whiteness degree. The most widely used bleaching method employs a bath in an alkaline solution of hydrogen peroxide, stabilizers and buffer solutions at high temperature. In the present study we evaluated the possibility of using blue illumination for the bleaching process. We used blue LEDs (450 nm) to illuminate an acidic hydrogen peroxide solution at room temperature. The samples treated by this method were compared with the conventional bleaching process through colorimetric analysis and by a multiple-comparison visual inspection by volunteers. The samples were also subjected to a tensile test in order to verify the integrity of the cloth after bleaching. The results of fabric visual inspection and colorimetric analysis showed a small advantage for the sample treated by the standard method. The tensile test showed an increase in the yield strength of the cloth after blue-light bleaching. The presented method has great application potential because it gives results similar to the standard method at relatively low cost and with reduced production of chemical waste.
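The abstract does not name the colorimetric metric used, but a common figure of merit for bleaching studies is the CIE whiteness index computed from the luminance Y and the chromaticity coordinates (x, y); the constants below are for the D65 illuminant / 10-degree observer, assumed here for illustration:

```python
# CIE whiteness index: W = Y + 800*(xn - x) + 1700*(yn - y),
# where (xn, yn) is the chromaticity of the perfect diffuser.
def cie_whiteness(Y, x, y, xn=0.3138, yn=0.3310):
    """Whiteness of a sample from its Y tristimulus value and (x, y)
    chromaticity; higher W means whiter. D65/10-deg constants assumed."""
    return Y + 800.0 * (xn - x) + 1700.0 * (yn - y)

# A perfectly white (non-fluorescent) diffuser scores W = 100
w_reference = cie_whiteness(100.0, 0.3138, 0.3310)
# A yellowish sample (chromaticity shifted toward yellow) scores lower
w_yellowish = cie_whiteness(85.0, 0.3300, 0.3450)
```

Comparing W before and after treatment quantifies the whiteness gain of each bleaching route.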

  1. Surface Flashover of Semiconductors: A Fundamental Study

    DTIC Science & Technology

    1993-06-16

    surface electric fields for a number of samples with aluminum and gold contacts. Effects of processing variations such as anneal method (rapid thermal...more uniform pre-breakdown surface fields. 3. Various contact materials and processing methods were used to determine effects on flashover...diffusion depths determined by this method were generally consistent with the estimated depths. 2-4 In order to characterize better the diffused layers

  2. SEIPS-based process modeling in primary care.

    PubMed

    Wooldridge, Abigail R; Carayon, Pascale; Hundt, Ann Schoofs; Hoonakker, Peter L T

    2017-04-01

    Process mapping, often used as part of the human factors and systems engineering approach to improve care delivery and outcomes, should be expanded to represent the complex, interconnected sociotechnical aspects of health care. Here, we propose a new sociotechnical process modeling method to describe and evaluate processes, using the SEIPS model as the conceptual framework. The method produces a process map and supplementary table, which identify work system barriers and facilitators. In this paper, we present a case study applying this method to three primary care processes. We used purposeful sampling to select staff (care managers, providers, nurses, administrators and patient access representatives) from two clinics to observe and interview. We show the proposed method can be used to understand and analyze healthcare processes systematically and identify specific areas of improvement. Future work is needed to assess usability and usefulness of the SEIPS-based process modeling method and further refine it. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Application of Advanced Nondestructive Evaluation Techniques for Cylindrical Composite Test Samples

    NASA Technical Reports Server (NTRS)

    Martin, Richard E.; Roth, Donald J.; Salem, Jonathan A.

    2013-01-01

    Two nondestructive methods were applied to composite cylinder samples pressurized to failure in order to determine manufacturing quality and monitor damage progression under load. A unique computed tomography (CT) image processing methodology developed at NASA Glenn Research was used to assess the condition of the as-received samples while acoustic emission (AE) monitoring was used to identify both the extent and location of damage within the samples up to failure. Results show the effectiveness of both of these methods in identifying potentially critical fabrication issues and their resulting impact on performance.

  5. A new method to process testicular sperm: combining enzymatic digestion, accumulation of spermatozoa, and stimulation of motility.

    PubMed

    Wöber, Martina; Ebner, Thomas; Steiner, Sarah L; Strohmer, Heinz; Oppelt, Peter; Plas, Eugen; Obruca, Andreas

    2015-03-01

    In azoospermia, processing of the TESE material often results in a sample of reduced purity. This prospective study was set up to clarify whether a combination of enzymatic digestion, density gradient centrifugation and stimulation of motility (where indicated) is a feasible option in TESE patients. A total of 63 samples (showing spermatozoa) were processed by the present tripartite processing method. The resulting sperm sample of high purity was used directly for ICSI, with subsequent cryopreservation when the quality of the accumulated sperm sample allowed for it (n = 39 cycles). Compared to the control group, the blastocyst formation rate with the present tripartite processing technique was significantly (P < 0.01) higher (55.2 vs. 43.7%). Fertilization rates differed significantly (P < 0.001) between cases in which motile sperm could be used (58.4%) and ICSI with immotile sperm (45.0%). Clinical pregnancy rate per transfer was 40.0% (24/60) using fresh and 21.6% (8/37) using cryopreserved TESE material. The calculated live birth rates were 31.7 and 21.6%, respectively. Thirty-five healthy children were born. A comparison with a control group suggests that the present approach using standardized ready-to-use products is efficient and reliable. The presumably healthy live births further indicate the safety of the procedure.

  6. Job Performance as Multivariate Dynamic Criteria: Experience Sampling and Multiway Component Analysis.

    PubMed

    Spain, Seth M; Miner, Andrew G; Kroonenberg, Pieter M; Drasgow, Fritz

    2010-08-06

    Questions about the dynamic processes that drive behavior at work have been the focus of increasing attention in recent years. Models describing behavior at work and research on momentary behavior indicate that substantial variation exists within individuals. This article examines the rationale behind this body of work and explores a method of analyzing momentary work behavior using experience sampling methods. The article also examines a previously unused set of methods for analyzing data produced by experience sampling. These methods are known collectively as multiway component analysis. Two archetypal techniques of multiway component analysis, the parallel factor analysis (PARAFAC) and Tucker3 models, are used to analyze data from Miner, Glomb, and Hulin's (2010) experience sampling study of work behavior. The efficacy of these techniques for analyzing experience sampling data is discussed, as are the substantive multiway component models obtained.
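As a concrete sketch of what a PARAFAC decomposition does to a three-way array — e.g. persons × behaviors × occasions in an experience-sampling study — a minimal alternating-least-squares implementation is shown below. This is an illustrative numpy version, not the analysis code used in the article:

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Khatri-Rao product of A (I x R) and B (J x R) -> (I*J x R)."""
    return np.vstack([np.kron(A[:, r], B[:, r]) for r in range(A.shape[1])]).T

def parafac(X, rank, n_iter=300, seed=0):
    """Minimal alternating-least-squares PARAFAC for a 3-way array X (I x J x K).
    Returns factor matrices A (I x R), B (J x R), C (K x R)."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
    # Mode-n unfoldings; element ordering matches the Khatri-Rao products below
    X1 = X.transpose(0, 2, 1).reshape(I, K * J)  # (i, k*J + j)
    X2 = X.transpose(1, 2, 0).reshape(J, K * I)  # (j, k*I + i)
    X3 = X.transpose(2, 1, 0).reshape(K, J * I)  # (k, j*I + i)
    for _ in range(n_iter):
        A = X1 @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = X2 @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = X3 @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

def reconstruct(A, B, C):
    """Rebuild the 3-way array from its PARAFAC factors."""
    return np.einsum('ir,jr,kr->ijk', A, B, C)
```

Each column r of (A, B, C) is one trilinear component: a person profile, a behavior profile, and an occasion profile that multiply together, which is what makes PARAFAC attractive for within-person dynamics.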

  7. Reducing acquisition times in multidimensional NMR with a time-optimized Fourier encoding algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Zhiyong; Department of Electronic Science, Fujian Provincial Key Laboratory of Plasma and Magnetic Resonance, Xiamen University, Xiamen, Fujian 361005; Smith, Pieter E. S.

    Speeding up the acquisition of multidimensional nuclear magnetic resonance (NMR) spectra is an important topic in contemporary NMR, with central roles in high-throughput investigations and analyses of marginally stable samples. A variety of fast NMR techniques have been developed, including methods based on non-uniform sampling and Hadamard encoding, that overcome the long sampling times inherent to schemes based on fast-Fourier-transform (FFT) methods. Here, we explore the potential of an alternative fast acquisition method that leverages a priori knowledge to tailor polychromatic pulses and customized time delays for an efficient Fourier encoding of the indirect domain of an NMR experiment. By porting the encoding of the indirect domain to the excitation process, this strategy avoids potential artifacts associated with non-uniform sampling schemes and uses a minimum number of scans equal to the number of resonances present in the indirect dimension. An added convenience is afforded by the fact that a usual 2D FFT can be used to process the generated data. Acquisitions of 2D heteronuclear correlation NMR spectra on quinine and on the anti-inflammatory drug isobutyl propionic phenolic acid illustrate the new method's performance. This method can be readily automated to deal with complex samples such as those occurring in metabolomics, in in-cell as well as in vivo NMR applications, where speed and temporal stability are often primary concerns.
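The claim that the minimum number of scans equals the number of resonances can be illustrated with a toy linear-algebra model: if the indirect-dimension frequencies are known a priori, one tailored evolution delay per resonance yields an invertible encoding system. The frequencies, delays and amplitudes below are invented, and this sketch ignores relaxation and pulse imperfections:

```python
import numpy as np

# A priori known indirect-dimension frequencies (Hz) -- assumed for illustration
freqs = np.array([120.0, 345.0, 610.0])
amps_true = np.array([1.0, 0.5, 2.0])        # unknown amplitudes to recover

# One scan per resonance: scan m uses a customized evolution delay t_m,
# so its measured point is s_m = sum_r a_r * exp(2i*pi*f_r*t_m).
t = np.array([0.0, 0.7e-3, 1.3e-3])          # tailored delays (s)
E = np.exp(2j * np.pi * np.outer(t, freqs))  # encoding matrix (scans x peaks)
s = E @ amps_true                            # simulated measurements

# With as many scans as resonances, the system is square and solvable
amps = np.linalg.solve(E, s).real
```

A conventional indirect dimension would instead need enough uniformly spaced increments to resolve the closest pair of frequencies, which is where the time saving comes from.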

  8. Minimal Polynomial Method for Estimating Parameters of Signals Received by an Antenna Array

    NASA Astrophysics Data System (ADS)

    Ermolaev, V. T.; Flaksman, A. G.; Elokhin, A. V.; Kuptsov, V. V.

    2018-01-01

    The effectiveness of the projection minimal polynomial method for solving the problem of determining the number of sources of signals acting on an antenna array (AA) with an arbitrary configuration and their angular directions has been studied. The method estimates the degree of the minimal polynomial of the correlation matrix (CM) of the input process in the AA on the basis of a statistically validated root-mean-square criterion. Special attention is paid to the case of an ultra-short sample of the input process, when the number of samples is considerably smaller than the number of AA elements, which is important for multielement AAs. It is shown that the proposed method is more effective in this case than methods based on the AIC (Akaike's Information Criterion) or minimum description length (MDL) criterion.
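For reference, the MDL-style approach against which the method is compared estimates the source count from the eigenvalue spectrum of the sample correlation matrix. A sketch of the classic MDL (minimum description length) estimator follows, with a synthetic high-SNR check; the array size, snapshot count and signal model are invented for illustration:

```python
import numpy as np

def mdl_num_sources(eigvals, n_snapshots):
    """MDL estimate of the number of sources from the eigenvalues of the
    sample correlation matrix (any order; sorted descending internally)."""
    lam = np.sort(np.asarray(eigvals, float))[::-1]
    p, N = len(lam), n_snapshots
    crit = []
    for k in range(p):
        tail = lam[k:]                       # presumed noise eigenvalues
        geo = np.exp(np.mean(np.log(tail)))  # geometric mean
        arith = np.mean(tail)                # arithmetic mean
        crit.append(-N * (p - k) * np.log(geo / arith)
                    + 0.5 * k * (2 * p - k) * np.log(N))
    return int(np.argmin(crit))

# Synthetic check: 2 strong sources on a 6-element array, 200 snapshots
rng = np.random.default_rng(0)
p, N, d = 6, 200, 2
steer = rng.standard_normal((p, d)) + 1j * rng.standard_normal((p, d))
sig = 10 * (rng.standard_normal((d, N)) + 1j * rng.standard_normal((d, N)))
noise = rng.standard_normal((p, N)) + 1j * rng.standard_normal((p, N))
X = steer @ sig + noise
R = X @ X.conj().T / N                       # sample correlation matrix
n_hat = mdl_num_sources(np.linalg.eigvalsh(R), N)
```

The abstract's point is that when N is much smaller than p this eigenvalue-based criterion degrades, which is where the minimal polynomial method is claimed to do better.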

  9. Comparison of diffusion- and pumped-sampling methods to monitor volatile organic compounds in ground water, Massachusetts Military Reservation, Cape Cod, Massachusetts, July 1999-December 2002

    USGS Publications Warehouse

    Archfield, Stacey A.; LeBlanc, Denis R.

    2005-01-01

    To evaluate diffusion sampling as an alternative method to monitor volatile organic compound (VOC) concentrations in ground water, concentrations in samples collected by traditional pumped-sampling methods were compared to concentrations in samples collected by diffusion-sampling methods for 89 monitoring wells at or near the Massachusetts Military Reservation, Cape Cod. Samples were analyzed for 36 VOCs. There was no substantial difference between the utility of diffusion and pumped samples to detect the presence or absence of a VOC. In wells where VOCs were detected, diffusion-sample concentrations of tetrachloroethene (PCE) and trichloroethene (TCE) were significantly lower than pumped-sample concentrations. Because PCE and TCE concentrations detected in the wells dominated the calculation of many of the total VOC concentrations, when VOC concentrations were summed and compared by sampling method, visual inspection also showed a downward concentration bias in the diffusion-sample concentration. The degree to which pumped- and diffusion-sample concentrations agreed was not a result of variability inherent within the sampling methods or the diffusion process itself. A comparison of the degree of agreement in the results from the two methods to 13 quantifiable characteristics external to the sampling methods offered only well-screen length as being related to the degree of agreement between the methods; however, there is also evidence to indicate that the flushing rate of water through the well screen affected the agreement between the sampling methods. Despite poor agreement between the concentrations obtained by the two methods at some wells, the degree to which the concentrations agree at a given well is repeatable. A one-time, well-by-well comparison between diffusion- and pumped-sampling methods could determine which wells are good candidates for the use of diffusion samplers.
For wells with good method agreement, the diffusion-sampling method is a time-saving and cost-effective alternative to pumped-sampling methods in a long-term monitoring program, such as at the Massachusetts Military Reservation.
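A common screening statistic for this kind of paired, well-by-well method comparison is the relative percent difference (RPD) between paired concentrations, with wells exceeding a cutoff flagged for follow-up. This is an illustrative convention, not necessarily the statistic used in the report, and the concentrations below are invented:

```python
import numpy as np

def rpd(pumped, diffusion):
    """Relative percent difference between paired concentrations:
    200 * (a - b) / (a + b), signed so positive means pumped > diffusion."""
    pumped = np.asarray(pumped, float)
    diffusion = np.asarray(diffusion, float)
    return 200.0 * (pumped - diffusion) / (pumped + diffusion)

# Hypothetical paired TCE concentrations (ug/L) for five wells
pumped    = np.array([12.0, 48.0, 5.0, 30.0, 100.0])
diffusion = np.array([11.5, 46.0, 2.5, 29.0,  60.0])

# Wells within a +/-25% RPD cutoff are treated as showing good agreement
good_agreement = np.abs(rpd(pumped, diffusion)) <= 25.0
```

A consistently large positive RPD at a well reproduces the downward diffusion-sample bias described for PCE and TCE.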

  10. Verification Of The Defense Waste Processing Facility's (DWPF) Process Digestion Methods For The Sludge Batch 8 Qualification Sample

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Click, D. R.; Edwards, T. B.; Wiedenman, B. J.

    2013-03-18

    This report contains the results and comparison of data generated from inductively coupled plasma – atomic emission spectroscopy (ICP-AES) analysis of Aqua Regia (AR), Sodium Peroxide/Sodium Hydroxide Fusion Dissolution (PF) and Cold Chem (CC) method digestions and Cold Vapor Atomic Absorption analysis of Hg digestions from the DWPF Hg digestion method of Sludge Batch 8 (SB8) Sludge Receipt and Adjustment Tank (SRAT) Receipt and SB8 SRAT Product samples. The SB8 SRAT Receipt and SB8 SRAT Product samples were prepared in the SRNL Shielded Cells, and the SRAT Receipt material is representative of the sludge that constitutes the SB8 Batch or qualification composition. This is the sludge in Tank 51 that is to be transferred into Tank 40, which will contain the heel of Sludge Batch 7b (SB7b), to form the SB8 Blend composition.

  11. Pretreatment of whole blood using hydrogen peroxide and UV irradiation. Design of the advanced oxidation process.

    PubMed

    Bragg, Stefanie A; Armstrong, Kristie C; Xue, Zi-Ling

    2012-08-15

    A new process to pretreat blood samples has been developed. This process combines Advanced Oxidation Process (AOP) treatment (using H2O2 and UV irradiation) with acid deactivation of the enzyme catalase in blood. A four-cell reactor has been designed and built in house. The effect of pH on the AOP process has been investigated. The kinetics of the pretreatment process show that at high initial H2O2 concentration, the reaction is zeroth order with respect to the H2O2 concentration and first order with respect to the blood concentration. The rate-limiting process is photon flux from the UV lamp. Degradation of whole blood has been compared with that of pure hemoglobin samples. The AOP pretreatment of the blood samples has enabled the subsequent determination of chromium and zinc concentrations in the samples using electrochemical methods. Copyright © 2012 Elsevier B.V. All rights reserved.
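First-order kinetics in the blood concentration under a constant (rate-limiting) photon flux implies exponential decay, so the rate constant can be read off a log-linear fit of concentration versus irradiation time. A sketch with invented data (the study's actual rate constants are not given in the abstract):

```python
import numpy as np

# Hypothetical blood-matrix concentration versus irradiation time (min),
# following first-order decay C(t) = C0 * exp(-k * t)
t = np.array([0.0, 5.0, 10.0, 15.0, 20.0])
c = 0.80 * np.exp(-0.12 * t)

# Log-linear least squares recovers the first-order rate constant
slope, log_c0 = np.polyfit(t, np.log(c), 1)
k = -slope           # first-order rate constant (per minute)
c0 = np.exp(log_c0)  # fitted initial concentration
```

With real data, curvature in the log-linear plot would signal a departure from simple first-order behavior, e.g. when H2O2 is no longer in excess.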

  12. Most Probable Number Rapid Viability PCR Method to Detect Viable Spores of Bacillus anthracis in Swab Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Letant, S E; Kane, S R; Murphy, G A

    2008-05-30

    This note presents a comparison of Most-Probable-Number Rapid Viability (MPN-RV) PCR and traditional culture methods for the quantification of Bacillus anthracis Sterne spores in macrofoam swabs generated by the Centers for Disease Control and Prevention (CDC) for a multi-center validation study aimed at testing environmental swab processing methods for recovery, detection, and quantification of viable B. anthracis spores from surfaces. Results show that spore numbers provided by the MPN-RV PCR method were in statistical agreement with the CDC conventional culture method for all three spore levels tested (10^4, 10^2, and 10 spores), even in the presence of dirt. In addition to detecting low levels of spores under environmental conditions, the MPN-RV PCR method is specific and compatible with automated high-throughput sample processing and analysis protocols.
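The most-probable-number estimate underlying such methods is a maximum-likelihood calculation over a dilution series, assuming each aliquot of volume v is positive with probability 1 − exp(−λv) for concentration λ. A numpy sketch (the tube counts below are illustrative, not the study's data):

```python
import numpy as np

def mpn_estimate(volumes, n_tubes, n_positive, lo=1e-6, hi=1e6):
    """Maximum-likelihood most-probable-number estimate (organisms per
    unit volume) from a dilution series, via bisection on the score."""
    v = np.asarray(volumes, float)
    n = np.asarray(n_tubes, float)
    x = np.asarray(n_positive, float)

    def score(lam):
        # Derivative of the log-likelihood for P(positive) = 1 - exp(-lam*v);
        # it is monotonically decreasing in lam, so bisection applies.
        return np.sum(x * v * np.exp(-lam * v) / (1.0 - np.exp(-lam * v))
                      - (n - x) * v)

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: one dilution, 10 tubes of 1 mL each, 5 positive
mpn_per_ml = mpn_estimate([1.0], [10], [5])
```

For the single-dilution case with half the tubes positive, the closed form is λ = −ln(1 − x/n)/v = ln 2 per mL, which the numeric solver reproduces.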

  13. 40 CFR 63.645 - Test methods and procedures for miscellaneous process vents.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TOC emission rate, as specified under paragraph (f) of this section, the sampling site shall be after... process vent TOC mass flow rate is less than 33 kilograms per day for an existing source or less than 6.8... shall determine the TOC mass flow rate by the following procedures: (1) The sampling site shall be...

  14. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, Terry D.; Beller, Laurence S.; Clark, Michael L.; Klingler, Kerry M.

    1997-01-01

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus are also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container.

  15. Method and apparatus for processing a test sample to concentrate an analyte in the sample from a solvent in the sample

    DOEpatents

    Turner, T.D.; Beller, L.S.; Clark, M.L.; Klingler, K.M.

    1997-10-14

    A method of processing a test sample to concentrate an analyte in the sample from a solvent in the sample includes: (a) boiling the test sample containing the analyte and solvent in a boiling chamber to a temperature greater than or equal to the solvent boiling temperature and less than the analyte boiling temperature to form a rising sample vapor mixture; (b) passing the sample vapor mixture from the boiling chamber to an elongated primary separation tube, the separation tube having internal sidewalls and a longitudinal axis, the longitudinal axis being angled between vertical and horizontal and thus having an upper region and a lower region; (c) collecting the physically transported liquid analyte on the internal sidewalls of the separation tube; and (d) flowing the collected analyte along the angled internal sidewalls of the separation tube to and past the separation tube lower region. The invention also includes passing a turbulence inducing wave through a vapor mixture to separate physically transported liquid second material from vaporized first material. Apparatus is also disclosed for effecting separations. Further disclosed is a fluidically powered liquid test sample withdrawal apparatus for withdrawing a liquid test sample from a test sample container and for cleaning the test sample container. 8 figs.

  16. Literature Review on Processing and Analytical Methods for ...

    EPA Pesticide Factsheets

    Report The purpose of this report was to survey the open literature to determine the current state of the science regarding the processing and analytical methods currently available for recovery of F. tularensis from water and soil matrices, and to determine what gaps remain in the collective knowledge concerning F. tularensis identification from environmental samples.

  17. Assessment of Counselors' Supervision Processes

    ERIC Educational Resources Information Center

    Ünal, Ali; Sürücü, Abdullah; Yavuz, Mustafa

    2013-01-01

    The aim of this study is to investigate elementary and high school counselors' supervision processes and efficiency of their supervision. The interview method was used as it was thought to be better for realizing the aim of the study. The study group was composed of ten counselors who were chosen through purposeful sampling method. Data were…

  18. How Pre-Service Teachers' Understand and Perform Science Process Skills

    ERIC Educational Resources Information Center

    Chabalengula, Vivien Mweene; Mumba, Frackson; Mbewe, Simeon

    2012-01-01

    This study explored pre-service teachers' conceptual understanding and performance on science process skills. A sample comprised 91 elementary pre-service teachers at a university in the Midwest of the USA. Participants were enrolled in two science education courses; introductory science teaching methods course and advanced science methods course.…

  19. Comparative analysis of whole mount processing and systematic sampling of radical prostatectomy specimens: pathological outcomes and risk of biochemical recurrence.

    PubMed

    Salem, Shady; Chang, Sam S; Clark, Peter E; Davis, Rodney; Herrell, S Duke; Kordan, Yakup; Wills, Marcia L; Shappell, Scott B; Baumgartner, Roxelyn; Phillips, Sharon; Smith, Joseph A; Cookson, Michael S; Barocas, Daniel A

    2010-10-01

    Whole mount processing is more resource intensive than routine systematic sampling of radical retropubic prostatectomy specimens. We compared whole mount and systematic sampling for detecting pathological outcomes, and compared the prognostic value of pathological findings across pathological methods. We included men (608 whole mount and 525 systematic sampling samples) with no prior treatment who underwent radical retropubic prostatectomy at Vanderbilt University Medical Center between January 2000 and June 2008. We used univariate and multivariate analysis to compare the pathological outcome detection rate between pathological methods. Kaplan-Meier curves and the log rank test were used to compare the prognostic value of pathological findings across pathological methods. There were no significant differences between the whole mount and the systematic sampling groups in detecting extraprostatic extension (25% vs 30%), positive surgical margins (31% vs 31%), pathological Gleason score less than 7 (49% vs 43%), 7 (39% vs 43%) or greater than 7 (12% vs 13%), seminal vesicle invasion (8% vs 10%) or lymph node involvement (3% vs 5%). Tumor volume was higher in the systematic sampling group and whole mount detected more multiple surgical margins (each p <0.01). There were no significant differences in the likelihood of biochemical recurrence between the pathological methods when patients were stratified by pathological outcome. Except for estimated tumor volume and multiple margins whole mount and systematic sampling yield similar pathological information. Each method stratifies patients into comparable risk groups for biochemical recurrence. Thus, while whole mount is more resource intensive, it does not appear to result in improved detection of clinically important pathological outcomes or prognostication. Copyright © 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
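Whether detection rates such as 25% of 608 versus 30% of 525 differ significantly can be checked with a two-proportion z-test. The counts below are reconstructed from the rounded percentages in the abstract and are therefore approximate; this is an illustrative check, not the paper's multivariate analysis:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z statistic for comparing two detection rates."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under the null
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# Extraprostatic extension: ~25% of 608 whole-mount vs ~30% of 525 systematic
z = two_proportion_z(152, 608, 158, 525)
significant = abs(z) > 1.96  # two-sided test at the 0.05 level
```

With these reconstructed counts the difference falls just short of the conventional 0.05 threshold, consistent with the paper's report of no significant difference.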

  20. Tuning the reactivity of Al/Fe2O3 nanoenergetic materials via an approach combining soft template self-assembly with sol–gel process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Tianfu; Wang, Zhen; Li, Guoping

    2015-10-15

    A bottom-up approach combining soft template self-assembly with a sol–gel process was adopted to prepare the assembled Al/Fe2O3 nanoenergetic material (the assembly-Al/Fe2O3 sample). Two unassembled Al/Fe2O3 nanoenergetic materials, the sol–gel–Al/Fe2O3 sample and the mixing-Al/Fe2O3 sample, were prepared by the sol–gel method and the physical mixing method, respectively. The assembly process during preparation of the assembly-Al/Fe2O3 sample was analyzed through changes in the average hydrodynamic diameters of the particles and the micelles in solution. SEM, EDS and TEM tests demonstrated a significant improvement in the dispersity and arrangement of the Al and Fe2O3 particles in the assembled sample compared to the unassembled Al/Fe2O3 samples. A DSC test was employed to characterize the reactivity of the samples. The heat release of the assembled Al/Fe2O3 sample was 2088 J/g, about 400 and 990 J/g more than that of the sol–gel–Al/Fe2O3 sample and the mixing-Al/Fe2O3 sample, respectively. - Graphical abstract: Modified aluminum (Al) nanoparticles with hydrophobic surfaces assembled into Brij S10 micelles in Fe(III) sol; the well-dispersed system was then transformed into Al/Fe2O3 nanoenergetic materials with high reactivity. - Highlights: • An approach combining soft template self-assembly with a sol–gel process was adopted. • The aggregation of Al nanoparticles in the final product was reduced significantly. • The reactivity of the Al/Fe2O3 nanoenergetic materials was improved to a large extent.

  1. Minority carrier diffusion lengths and absorption coefficients in silicon sheet material

    NASA Technical Reports Server (NTRS)

    Dumas, K. A.; Swimm, R. T.

    1980-01-01

    Most of the methods which have been developed for the measurement of the minority carrier diffusion length of silicon wafers require that the material have either a Schottky or an ohmic contact. The surface photovoltage (SPV) technique is an exception. The SPV technique could, therefore, become a valuable diagnostic tool in connection with current efforts to develop low-cost processes for the production of solar cells. The technique depends on a knowledge of the optical absorption coefficient. The present investigation is concerned with a reevaluation of the absorption coefficient as a function of silicon processing. A comparison of absorption coefficient values showed them to be relatively consistent from sample to sample and independent of the sample growth method.

  2. Crystallization characteristics of iron-rich glass ceramics prepared from nickel slag and blast furnace slag

    NASA Astrophysics Data System (ADS)

    Wang, Zhong-Jie; Ni, Wen; Li, Ke-Qing; Huang, Xiao-Yan; Zhu, Li-Ping

    2011-08-01

    The crystallization process of iron-rich glass-ceramics prepared from a mixture of nickel slag (NS) and blast furnace slag (BFS) with a small amount of quartz sand was investigated. A modified melting method, more energy-saving than traditional methods, was used to control the crystallization process. The results show that the iron-rich system has a much lower melting temperature, glass transition temperature (Tg), and glass crystallization temperature (Tc), which can result in a further energy-saving process. The results also show that the system has a quick but controllable crystallization process, with its peak crystallization temperature at 918°C. The crystallization of augite crystals begins at the edge of the sample and spreads into the whole sample, and the crystallization process can be completed in a few minutes. A distinct boundary between the crystallized part and the non-crystallized part exists during the process. In the non-crystallized part, showing a black colour, some sphere-shaped augite crystals already exist in the glass matrix before samples are heated to Tc. In the crystallized part, showing a khaki colour, a compact structure is formed by augite crystals.

  3. New Methods of Sample Preparation for Atom Probe Specimens

    NASA Technical Reports Server (NTRS)

    Kuhlman, Kimberly, R.; Kowalczyk, Robert S.; Ward, Jennifer R.; Wishard, James L.; Martens, Richard L.; Kelly, Thomas F.

    2003-01-01

    Magnetite is a common conductive mineral found on Earth and Mars. Disk-shaped precipitates approximately 40 nm in diameter have been shown to contain manganese and aluminum. Atom-probe field-ion microscopy (APFIM) is the only technique that can potentially quantify the composition of these precipitates. APFIM will be used to characterize geological and planetary materials, to analyze samples of interest for geomicrobiology, and for the metrology of nanoscale instrumentation. Previously, APFIM sample preparation was conducted by electropolishing, the method of sharp shards (MSS), or the Bosch process (deep reactive ion etching), with focused ion beam (FIB) milling as a final step. However, new methods are required for difficult samples: many materials are not easily fabricated using electropolishing, MSS, or the Bosch process; FIB milling is slow and expensive; and wet chemistry and reactive ion etching are typically limited to Si and other semiconductors. The dicing saw, commonly used to section semiconductor wafers into individual devices after manufacture, is a time-effective method for preparing high-aspect-ratio posts of poorly conducting materials. Femtosecond laser micromachining is also suitable for preparing posts. The FIB time required is reduced by about a factor of 10, and multi-tip specimens can easily be fabricated using the dicing saw.

  4. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    NASA Astrophysics Data System (ADS)

    Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.

    2009-07-01

    The marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique in detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually; the difference between the two counts thus increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
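The screening rule above reduces to simple arithmetic on log-transformed counts. A minimal sketch, with hypothetical CFU/g values; the 2.0 log threshold is the criterion suggested in the abstract:

```python
# DEFT/APC screening sketch: a sample is flagged as likely irradiated when
# the DEFT count exceeds the APC count by more than 2.0 log units.
import math

def deft_apc_difference(deft_cfu_per_g: float, apc_cfu_per_g: float) -> float:
    """Return log10(DEFT) - log10(APC) for one sample."""
    return math.log10(deft_cfu_per_g) - math.log10(apc_cfu_per_g)

def likely_irradiated(deft_cfu_per_g: float, apc_cfu_per_g: float,
                      threshold: float = 2.0) -> bool:
    return deft_apc_difference(deft_cfu_per_g, apc_cfu_per_g) > threshold

# Unirradiated sample: both counts similar -> not flagged.
print(likely_irradiated(1e6, 5e5))   # False
# Irradiated sample: APC drops ~3 logs while DEFT stays high -> flagged.
print(likely_irradiated(1e6, 1e3))   # True
```

Because DEFT counts both viable and non-viable cells while APC counts only survivors, the gap between the two grows with dose, which is what the threshold exploits.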

  5. Quantitative measurement of intact alpha-synuclein proteoforms from post-mortem control and Parkinson's disease brain tissue by intact protein mass spectrometry.

    PubMed

    Kellie, John F; Higgs, Richard E; Ryder, John W; Major, Anthony; Beach, Thomas G; Adler, Charles H; Merchant, Kalpana; Knierman, Michael D

    2014-07-23

    A robust top-down proteomics method is presented for profiling alpha-synuclein species from autopsied human frontal cortex brain tissue from Parkinson's disease cases and controls. The method was used to test the hypothesis that pathology-associated brain tissue has a different profile of post-translationally modified alpha-synuclein than control samples. The sample processing steps, mass spectrometry-based measurements, and data processing steps were validated. The intact protein quantitation method features extraction and integration of m/z data from each charge state of a detected alpha-synuclein species and fitting of the data to a simple linear model that accounts for concentration and charge-state variability. The quantitation method was validated with serial dilutions of intact protein standards. Using the method on the human brain samples, several previously unreported modifications of alpha-synuclein were identified. Low levels of phosphorylated alpha-synuclein were detected in brain tissue fractions enriched for Lewy body pathology and were marginally significant between PD cases and controls (p = 0.03).
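The charge-state fitting step can be illustrated with a toy model. The sketch below is not the authors' implementation; it assumes a simple multiplicative (log-additive) model in which each integrated intensity is the product of a concentration effect and a charge-state response, fitted by least squares, and all intensity values are invented:

```python
import numpy as np

# Hypothetical integrated intensities for one protein across 4 charge states
# in 3 serial dilutions (rows = dilutions, cols = charge states).
intensities = np.array([
    [400., 900., 700., 200.],
    [200., 450., 350., 100.],
    [100., 225., 175., 50.],
])

# Additive two-way model in log space:
# log(I_ij) = c_i + z_j  (c_i ~ log concentration, z_j ~ charge-state response)
n_d, n_z = intensities.shape
y = np.log(intensities).ravel()
X = np.zeros((n_d * n_z, n_d + n_z))
for i in range(n_d):
    for j in range(n_z):
        row = i * n_z + j
        X[row, i] = 1.0          # dilution (concentration) effect
        X[row, n_d + j] = 1.0    # charge-state effect
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Differences c_i - c_0 are identifiable even though the model has one
# confounded degree of freedom (a constant can shift between c and z).
rel_conc = np.exp(coef[:n_d] - coef[0])
print(rel_conc)  # ~ [1.0, 0.5, 0.25] for this 2x serial dilution
```

Pooling every charge state into one fit, rather than picking a single charge state, is what lets a model like this absorb run-to-run charge-state variability.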

  6. A Metallurgical Study of Nāga Bhasma

    PubMed Central

    Singh Gautam, Dev Nath

    2017-01-01

    Background: The metal Nāga (lead) has been used in India since ancient times. Its external and internal uses are described in the Caraka, Suśruta and other Ayurvedic Saṃhitās. According to most Rasa texts, Nāga Bhasma and its formulations are used in many diseases such as Prameha, Jvara, Gulma and Śukrameha. Objectives: In the present study, Nāga Bhasma was prepared by the traditional Puṭa method (TPM) and by the electric muffle furnace Puṭa method (EMFPM) and standardized using metallographic studies. This allows study of the microstructure of Nāga Bhasma and identification of the metal particles, along with the nature of the compounds formed during the Māraṇa (Bhasmīkaraṇa) process. Setting and Design: Different samples, from the initial raw material to the final product of Nāga Bhasma, were collected during the pharmaceutical process (1st, 30th and 60th Puṭa) from both methods, i.e. TPM and EMFPM, and studied by metallographic examination. Materials and Methods: The processing of the Nāga Bhasma (ṣaṣṭipuṭa) was done according to Ānanda Kanda [9]. Samples of the raw material, i.e. Aśodhita Nāga (raw lead), and material processed after the 1st, 30th and 60th Puṭa were taken from both methods, i.e. the traditional Puṭa method (using heat from burning cow dung cakes) and the electric muffle furnace Puṭa method. They were mounted on a self-hardening acrylic base and, after careful polishing to obtain a scratch-free surface, used for metallurgical study. Conclusion: This study shows that the traditional Puṭa method may be better than the electric muffle furnace Puṭa method because of the more homogeneous distribution of lead sulphide in the Nāga Bhasma prepared by the traditional method. PMID:29269968

  7. The EIPeptiDi tool: enhancing peptide discovery in ICAT-based LC MS/MS experiments.

    PubMed

    Cannataro, Mario; Cuda, Giovanni; Gaspari, Marco; Greco, Sergio; Tradigo, Giuseppe; Veltri, Pierangelo

    2007-07-15

    Isotope-coded affinity tagging (ICAT) is a method for quantitative proteomics based on differential isotopic labeling, sample digestion and mass spectrometry (MS). The method allows the identification and relative quantification of proteins present in two samples and consists of the following phases. First, cysteine residues are labeled using either the ICAT Light or the ICAT Heavy reagent (which have identical chemical properties but different masses). Then, after whole-sample digestion, the labeled peptides are captured selectively using the biotin tag contained in both ICAT reagents. Finally, the simplified peptide mixture is analyzed by nanoscale liquid chromatography-tandem mass spectrometry (LC-MS/MS). Nevertheless, the ICAT LC-MS/MS method still suffers from insufficient sample-to-sample reproducibility in peptide identification. In particular, the number and type of peptides identified in different experiments can vary considerably, and thus the statistical (comparative) analysis of sample sets is very challenging. Low information overlap at the peptide and, consequently, at the protein level is very detrimental in situations where the number of samples to be analyzed is high. We designed a method for improving the data processing and peptide identification in sample sets subjected to ICAT labeling and LC-MS/MS analysis, based on cross-validating MS/MS results. This method has been implemented in a tool, called EIPeptiDi, which boosts the ICAT data analysis software by improving peptide identification throughout the input data set. Heavy/Light (H/L) pairs quantified but not identified by the MS/MS routine are assigned to peptide sequences identified in other samples, using similarity criteria based on chromatographic retention time and Heavy/Light mass attributes.
EIPeptiDi significantly improves the number of identified peptides per sample, proving that the proposed method has a considerable impact on the protein identification process and, consequently, on the amount of potentially critical information in clinical studies. The EIPeptiDi tool is available at http://bioingegneria.unicz.it/~veltri/projects/eipeptidi/ with a demo data set. EIPeptiDi significantly increases the number of peptides identified and quantified in analyzed samples, thus reducing the number of unassigned H/L pairs and allowing a better comparative analysis of sample data sets.
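The cross-sample identification transfer can be sketched as a tolerance search: match each unidentified H/L pair to a peptide identified in another run whose mass and retention time both fall within tolerance. This is an illustration of the idea, not the EIPeptiDi implementation; the record shapes, tolerance values, and peptide data are hypothetical:

```python
# Assumed record shapes: identified peptides carry (sequence, mass, rt);
# unidentified H/L pairs carry (mass, rt). Tolerances are hypothetical.
MASS_TOL = 0.02   # Da
RT_TOL = 0.5      # min

def transfer_ids(identified, unidentified, mass_tol=MASS_TOL, rt_tol=RT_TOL):
    """Assign a sequence to each unidentified pair whose mass and retention
    time both fall within tolerance of an identified peptide; else None."""
    assigned = []
    for mass, rt in unidentified:
        for seq, m, t in identified:
            if abs(mass - m) <= mass_tol and abs(rt - t) <= rt_tol:
                assigned.append((mass, rt, seq))
                break
        else:
            assigned.append((mass, rt, None))
    return assigned

identified = [("PEPTIDEK", 927.45, 21.3), ("LVNEVTEFAK", 1148.61, 34.8)]
unidentified = [(927.46, 21.1), (1500.00, 10.0)]
print(transfer_ids(identified, unidentified))
# → [(927.46, 21.1, 'PEPTIDEK'), (1500.0, 10.0, None)]
```

Requiring both the mass and the retention-time match is what keeps a transfer like this from inflating false identifications.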

  8. Headspace solid-phase microextraction (HS-SPME) combined with GC-MS as a process analytical technology (PAT) tool for monitoring the cultivation of C. tetani.

    PubMed

    Ghader, Masoud; Shokoufi, Nader; Es-Haghi, Ali; Kargosha, Kazem

    2018-04-15

    Vaccine production is a biological process in which variation in time and output is inevitable; the application of Process Analytical Technologies (PAT) is therefore important. Headspace solid-phase microextraction (HS-SPME) coupled with GC-MS can be used as a PAT for process monitoring. The method is suited to chemical profiling of volatile organic compounds (VOCs) emitted by microorganisms. Tetanus is a lethal disease caused by the bacterium Clostridium tetani (C. tetani), and vaccination is the principal way to prevent it. In this paper, an SPME fiber was used to investigate VOCs emerging from C. tetani during cultivation. Different types of VOCs, such as sulfur-containing compounds, were identified, and some were selected as biomarkers for bioreactor monitoring during vaccine production. In the second step, a portable dynamic air sampling (PDAS) device was used as an interface for sampling VOCs with SPME fibers. The sampling procedure was optimized by a face-centered central composite design (FC-CCD); the optimized sampling time and inlet gas flow rate were 10 min and 2 mL s⁻¹, respectively. The PDAS was mounted in the exhaust gas line of the bioreactor, and 42 VOC samples were collected with SPME fibers over 7 days of incubation. Simultaneously, pH and optical density (OD) were monitored during cultivation and showed good correlations with the identified VOCs (>80%). This method could be used for VOC sampling from the off-gas of a bioreactor to monitor the cultivation process. Copyright © 2018. Published by Elsevier B.V.
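The FC-CCD optimization above can be illustrated by generating the design points themselves. A minimal sketch for two factors (sampling time, inlet gas flow rate); the factor ranges are hypothetical, and setting alpha = 1 places the axial points on the cube faces, which is what makes the design face-centered:

```python
import itertools

def fc_ccd(n_center=3):
    """Return coded design points (levels -1, 0, +1) for a two-factor
    face-centered central composite design."""
    factorial = list(itertools.product([-1, 1], repeat=2))   # corner runs
    axial = [(-1, 0), (1, 0), (0, -1), (0, 1)]               # face-centered axial runs
    center = [(0, 0)] * n_center                             # replicated center runs
    return factorial + axial + center

def decode(levels, lo, hi):
    """Map coded levels for one factor to real units."""
    mid, half = (lo + hi) / 2, (hi - lo) / 2
    return [mid + c * half for c in levels]

design = fc_ccd()
times = decode([p[0] for p in design], lo=2.0, hi=10.0)   # sampling time, min
flows = decode([p[1] for p in design], lo=0.5, hi=2.0)    # gas flow, mL/s
print(len(design))  # 11 runs: 4 factorial + 4 axial + 3 center
```

Fitting a quadratic response surface to the measured responses at these 11 runs is what yields optima such as the 10 min / 2 mL s⁻¹ settings reported above.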

  9. High-throughput diagnosis of potato cyst nematodes in soil samples.

    PubMed

    Reid, Alex; Evans, Fiona; Mulholland, Vincent; Cole, Yvonne; Pickup, Jon

    2015-01-01

    Potato cyst nematode (PCN) is a damaging soilborne pest of potatoes which can cause major crop losses. In 2010, a new European Union directive (2007/33/EC) on the control of PCN came into force. Under the new directive, seed potatoes can only be planted on land which has been found to be free from PCN infestation following an official soil test. A major consequence of the new directive was the introduction of a new harmonized soil sampling rate resulting in a threefold increase in the number of samples requiring testing. To manage this increase with the same staffing resources, we have replaced the traditional diagnostic methods. A system has been developed for the processing of soil samples, extraction of DNA from float material, and detection of PCN by high-throughput real-time PCR. Approximately 17,000 samples are analyzed each year using this method. This chapter describes the high-throughput processes for the production of float material from soil samples, DNA extraction from the entire float, and subsequent detection and identification of PCN within these samples.

  10. Comprehensive comparative analysis of 5'-end RNA-sequencing methods.

    PubMed

    Adiconis, Xian; Haber, Adam L; Simmons, Sean K; Levy Moonshine, Ami; Ji, Zhe; Busby, Michele A; Shi, Xi; Jacques, Justin; Lancaster, Madeline A; Pan, Jen Q; Regev, Aviv; Levin, Joshua Z

    2018-06-04

    Specialized RNA-seq methods are required to identify the 5' ends of transcripts, which are critical for studies of gene regulation, but these methods have not been systematically benchmarked. We directly compared six such methods, including the performance of five methods on a single human cellular RNA sample and a new spike-in RNA assay that helps circumvent challenges resulting from uncertainties in annotation and RNA processing. We found that the 'cap analysis of gene expression' (CAGE) method performed best for mRNA and that most of its unannotated peaks were supported by evidence from other genomic methods. We applied CAGE to eight brain-related samples and determined sample-specific transcription start site (TSS) usage, as well as a transcriptome-wide shift in TSS usage between fetal and adult brain.

  11. Occurrence of invertebrates at 38 stream sites in the Mississippi Embayment study unit, 1996-99

    USGS Publications Warehouse

    Caskey, Brian J.; Justus, B.G.; Zappia, Humbert

    2002-01-01

    A total of 88 invertebrate species and 178 genera representing 59 families, 8 orders, 6 classes, and 3 phyla were identified at 38 stream sites in the Mississippi Embayment Study Unit from 1996 through 1999 as part of the National Water-Quality Assessment Program. Sites were selected based on land use within the drainage basins and the availability of long-term streamflow data. Invertebrates were sampled as part of an overall sampling design to provide information on the status of and trends in water quality in the Mississippi Embayment Study Unit, which includes parts of Arkansas, Kentucky, Louisiana, Mississippi, Missouri, and Tennessee. Invertebrate sampling and processing were conducted using nationally standardized techniques developed for the National Water-Quality Assessment Program. These techniques included both a semi-quantitative method, which targeted habitats where invertebrate diversity is expected to be highest, and a qualitative multihabitat method, which samples all available habitat types within a sampling reach. All invertebrate samples were shipped to the USGS National Water-Quality Laboratory (NWQL) for processing. Of the 365 taxa identified, 156 were identified with the semi-quantitative method, which involved sampling a known quantity of what was expected to be the richest habitat, woody debris. The qualitative method, which involved sampling all available habitats, identified 345 taxa. The number of organisms identified in the semi-quantitative samples ranged from 74 to 3,295, whereas the number of taxa identified ranged from 9 to 54. The number of organisms identified in the qualitative samples ranged from 42 to 29,634, whereas the number of taxa ranged from 18 to 81. Of all the organisms identified, chironomid taxa were the most frequently identified, and plecopteran taxa were among the least frequently identified.

  12. Detecting waste-combustion emissions: several advanced methods are useful for sampling air contaminants from hazardous-waste-incinerator stacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, L.D.

    1986-01-01

    This paper is an overview of sampling methods being recommended to EPA regulatory programs, to EPA engineering research and development projects, and to interested parties in the industrial community. The methods discussed are generally applicable both to incineration and to processes closely related to incineration (e.g., co-firing of waste in industrial boilers, and burning of contaminated heating oil). Although methods for inorganic hazardous compounds are very briefly outlined, the primary emphasis of the paper is on organic compounds that are likely to be chosen as principal organic hazardous constituents (POHCs) for a trial burn. Methods receiving major attention include the Modified Method 5 Train (MM5), which includes an XAD-2 sorbent module; the Source Assessment Sampling System (SASS); the recently developed Volatile Organic Sampling Train (VOST); and assorted containers such as glass bulbs and plastic bags.

  13. ANALYSIS OF RICIN TOXIN PREPARATIONS FOR CARBOHYDRATE AND FATTY ACID ABUNDANCE AND ISOTOPE RATIO INFORMATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wunschel, David S.; Kreuzer-Martin, Helen W.; Antolick, Kathryn C.

    2009-12-01

    This report describes method development and preliminary evaluation for analyzing castor samples for signatures of ricin purification. Ricin purification from the source castor seeds is essentially a problem of protein purification using common biochemical methods. Indications of protein purification will likely manifest themselves as removal of the non-protein fractions of the seed. The two major non-protein types of biochemical constituents in the seed are the castor oil and various carbohydrates. The oil comprises roughly half the seed weight, while the carbohydrate component comprises roughly half of the remaining "mash" left after oil and hull removal. Different castor oil and carbohydrate components can serve as indicators of specific toxin processing steps. Ricinoleic acid is a relatively unique fatty acid in nature and is the most abundant component of castor oil; its loss indicates a step to remove oil from the seeds. The relative amounts of carbohydrates and carbohydrate-like compounds detected in the sample, including arabinose, xylose, myo-inositol, fucose, rhamnose, glucosamine and mannose, can also indicate specific processing steps. For instance, the differential loss of arabinose relative to mannose and N-acetyl glucosamine indicates enrichment for the protein fraction of the seed using protein precipitation. The methods developed in this project center on fatty acid and carbohydrate extraction from castor samples followed by derivatization to permit analysis by gas chromatography-mass spectrometry (GC-MS). Method descriptions herein include the source and preparation of the castor materials used for method evaluation, the equipment and procedures required for chemical derivatization, and the instrument parameters used in the analysis. Two derivatization methods are described for the analysis of carbohydrates and one for the analysis of fatty acids.
Two types of GC-MS analysis are included in the method development: one employing a quadrupole MS system for compound identification, and one an isotope ratio MS for measuring the stable isotope ratio of deuterium to hydrogen (D/H) in fatty acids. Finally, the method for analyzing the compound abundance data is included. This study indicates that removal of ricinoleic acid is a conserved consequence of each processing step we tested. Furthermore, the stable isotope D/H ratio of ricinoleic acid distinguished between two of the three castor seed sources. Concentrations of arabinose, xylose, mannose, glucosamine and myo-inositol differentiated between crude or acetone-extracted samples and samples produced by protein precipitation. Taken together, these data illustrate the ability to distinguish between processes used to purify a ricin sample, as well as, potentially, the source seeds.

  14. The effects of sample preparation on measured concentrations of eight elements in edible tissues of fish from streams contaminated by lead mining

    USGS Publications Warehouse

    Schmitt, Christopher J.; Finger, Susan E.

    1987-01-01

    The influence of sample preparation on measured concentrations of eight elements in the edible tissues of two black basses (Centrarchidae), two catfishes (Ictaluridae), and the black redhorse, Moxostoma duquesnei (Catostomidae), from two rivers in southeastern Missouri contaminated by mining and related activities was investigated. Concentrations of Pb, Cd, Cu, Zn, Fe, Mn, Ba, and Ca were measured in two skinless, boneless samples of axial muscle from individual fish prepared in a clean room. One sample (normally-processed) was removed from each fish with a knife in a manner typically used by investigators to process fish for elemental analysis and presumably representative of methods employed by anglers when preparing fish for home consumption. A second sample (clean-processed) was then prepared from each normally-processed sample by cutting away all surface material with acid-cleaned instruments under ultraclean conditions. The samples were analyzed as a single group by atomic absorption spectrophotometry. Of the elements studied, only Pb regularly exceeded current guidelines for elemental contaminants in foods. Concentrations were high in black redhorse from contaminated sites, regardless of preparation method; for the other fishes, whether Pb guidelines were exceeded depended on preparation technique. Except for Mn and Ca, concentrations of all elements measured were significantly lower in clean- than in normally-processed tissue samples. Absolute differences in measured concentrations between clean- and normally-processed samples were most evident for Pb and Ba in bass and catfish, and for Cd and Zn in redhorse. Regardless of preparation method, concentrations of Pb, Ca, Mn, and Ba in individual fish were closely correlated; samples that were high or low in one of these four elements were correspondingly high or low in the other three. 
In contrast, correlations between Zn, Fe, and Cd occurred only in normally-processed samples, suggesting that these correlations resulted from high concentrations on the surfaces of some samples. Concentrations of Pb and Ba in edible tissues of fish from contaminated sites were highly correlated with Ca content, which was probably determined largely by the amount of tissue other than muscle in the sample, because fish muscle contains relatively little Ca. Accordingly, variation within a group of similar samples can be reduced by normalizing Pb and Ba concentrations to a standard Ca concentration. When sample size (N) is large, this can be accomplished statistically by analysis of covariance; when N is small, molar ratios of [Pb]/[Ca] and [Ba]/[Ca] can be computed. Without such adjustments, unrealistically large Ns are required to yield statistically reliable estimates of Pb concentrations in edible tissues. Investigators should acknowledge that reported concentrations of certain elements are only estimates and that, regardless of the care exercised during the collection, preparation, and analysis of samples, results should be interpreted with the awareness that contamination from external sources may have occurred.
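The small-N adjustment suggested above is a direct molar-ratio computation. A minimal sketch, with hypothetical tissue concentrations; only the atomic weights are standard values:

```python
# Express Pb and Ba as molar ratios to Ca so that between-sample variation
# driven by non-muscle tissue (which carries most of the Ca) is factored out.
# Concentrations are hypothetical, in micrograms per gram wet weight.
ATOMIC_WEIGHT = {"Pb": 207.2, "Ba": 137.33, "Ca": 40.08}

def molar_ratio(conc_x: float, conc_ca: float, element: str) -> float:
    """Return the molar ratio [X]/[Ca] from mass concentrations (ug/g)."""
    return (conc_x / ATOMIC_WEIGHT[element]) / (conc_ca / ATOMIC_WEIGHT["Ca"])

sample = {"Pb": 0.62, "Ba": 0.41, "Ca": 310.0}   # hypothetical fillet sample
print(molar_ratio(sample["Pb"], sample["Ca"], "Pb"))
print(molar_ratio(sample["Ba"], sample["Ca"], "Ba"))
```

Comparing [Pb]/[Ca] rather than raw Pb between fillets removes much of the scatter caused by incidental bone or skin in a sample.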

  15. A nonparametric method to generate synthetic populations to adjust for complex sampling design features.

    PubMed

    Dong, Qi; Elliott, Michael R; Raghunathan, Trivellore E

    2014-06-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods that analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered, unequal-probability-of-selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered, unequal-probability-of-selection sample designs.
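One step of the weighted-resampling idea can be sketched as follows. This is a simplified, hypothetical illustration of a weighted finite population Bayesian bootstrap draw, not the authors' full algorithm; Dirichlet-perturbed probabilities proportional to the design weights are used to resample observed units into a synthetic population that behaves like a simple random sample:

```python
import numpy as np

rng = np.random.default_rng(42)

def synthetic_population(values, design_weights, pop_size):
    """Draw one synthetic population via a weighted Bayesian bootstrap.

    Dirichlet-perturbed selection probabilities proportional to the design
    weights approximate posterior uncertainty about population composition;
    resampling with them 'undoes' the unequal-probability design.
    """
    w = np.asarray(design_weights, dtype=float)
    probs = rng.dirichlet(w / w.mean())      # one posterior draw of the weights
    idx = rng.choice(len(values), size=pop_size, p=probs)
    return np.asarray(values)[idx]

# Units sampled with unequal probabilities: value 1 was undersampled
# (large design weight), value 0 oversampled (small design weight).
values = np.array([0, 0, 0, 1])
weights = np.array([1.0, 1.0, 1.0, 3.0])
pop = synthetic_population(values, weights, pop_size=10_000)
print(pop.mean())  # varies by draw; centered near the weighted mean 0.5
```

Each Dirichlet draw yields a different synthetic population; repeating the draw propagates the design-based uncertainty into whatever IID analysis is run downstream.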

  16. A nonparametric method to generate synthetic populations to adjust for complex sampling design features

    PubMed Central

    Dong, Qi; Elliott, Michael R.; Raghunathan, Trivellore E.

    2017-01-01

    Outside of the survey sampling literature, samples are often assumed to be generated by a simple random sampling process that produces independent and identically distributed (IID) samples. Many statistical methods are developed largely in this IID world. Application of these methods to data from complex sample surveys without making allowance for the survey design features can lead to erroneous inferences. Hence, much time and effort have been devoted to developing statistical methods that analyze complex survey data and account for the sample design. This issue is particularly important when generating synthetic populations using finite population Bayesian inference, as is often done in missing data or disclosure risk settings, or when combining data from multiple surveys. By extending previous work in the finite population Bayesian bootstrap literature, we propose a method to generate synthetic populations from a posterior predictive distribution in a fashion that inverts the complex sampling design features and generates simple random samples from a superpopulation point of view, adjusting the complex data so that they can be analyzed as simple random samples. We consider a simulation study with a stratified, clustered, unequal-probability-of-selection sample design, and use the proposed nonparametric method to generate synthetic populations for the 2006 National Health Interview Survey (NHIS) and the Medical Expenditure Panel Survey (MEPS), which are stratified, clustered, unequal-probability-of-selection sample designs. PMID:29200608

  17. Electrodeposition as an alternate method for preparation of environmental samples for iodide by AMS

    DOE PAGES

    Adamic, M. L.; Lister, T. E.; Dufek, E. J.; ...

    2015-03-25

    This paper presents an evaluation of an alternate method for preparing environmental samples for 129I analysis by accelerator mass spectrometry (AMS) at Idaho National Laboratory. The optimal sample preparation method is characterized by ease of preparation, the capability of processing very small quantities of iodide, and ease of loading into a cathode. Electrodeposition of iodide on a silver wire was evaluated using these criteria. This study indicates that the electrochemically formed silver iodide deposits produce ion currents similar to those from precipitated silver iodide for the same sample mass. Furthermore, precipitated silver iodide samples are usually mixed with niobium or silver powder prior to loading in a cathode. With electrodeposition, the silver is already mixed with the sample and can simply be picked up with tweezers, placed in the sample die, and pressed into a cathode. The major advantage of this method is that the silver wire with electrodeposited silver iodide is much easier to load into a cathode.

  18. Determination of free sulfites (SO₃²⁻) in dried fruits processed with sulfur dioxide by ion chromatography with an anion exchange column and conductivity detection.

    PubMed

    Liao, Benjamin S; Sram, Jacqueline C; Files, Darin J

    2013-01-01

    A simple and effective anion ion chromatography (IC) method with anion exchange column and conductivity detector has been developed to determine free sulfites (SO3-2) in dried fruits processed with sulfur dioxide. No oxidation agent, such as hydrogen peroxide, is used to convert sulfites to sulfates for IC analysis. In addition, no stabilizing agent, such as formaldehyde, fructose or EDTA, is required during the sample extraction. This method uses aqueous 0.2 N NaOH as the solvent for standard preparation and sample extraction. The sulfites, either prepared from standard sodium sulfite powder or extracted from food samples, are presumed to be unbound SO3-2 in aqueous 0.2 N NaOH (pH > 13), because the bound sulfites in the sample matrix are released at pH > 10. In this study, sulfites in the standard solutions were stable at room temperature (i.e., 15-25 degrees C) for up to 12 days. The lowest standard of the linear calibration curve is set at 1.59 microg/mL SO3-2 (equivalent to 6.36 microg/g sample with no dilution) for analysis of processed dried fruits that would contain high levels (>1000 microg/g) of sulfites. As a consequence, this method typically requires significant dilution of the sample extract. Samples are prepared with a simple procedure of sample compositing, extraction with aqueous 0.2 N NaOH, centrifugation, dilution as needed, and filtration prior to IC. The sulfites in these sample extracts are stable at room temperature for up to 20 h. Using anion IC, the sulfites are eluted under isocratic conditions with 10 mM aqueous sodium carbonate solution as the mobile phase passing through an anion exchange column. The sulfites are easily separated, with an analysis run time of 18 min, regardless of the dried fruit matrix. Recoveries from samples spiked with sodium sulfites were demonstrated to be between 81 and 105% for five different fruit matrixes (apricot, golden grape, white peach, fig, and mango). 
Overall, this method is simple to perform and effective for the determination of high levels of sulfites in dried fruits.
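The back-calculation from IC peak area to sulfite content in the original dried fruit follows a standard external-calibration pattern: fit a linear curve to the standards, convert the diluted-extract concentration back through the dilution factor and extract volume. A minimal sketch, with entirely hypothetical numbers (the study does not report these standard areas, sample masses, or volumes):

```python
import numpy as np

# Hypothetical calibration standards: SO3(2-) concentration (ug/mL) vs. IC peak area.
# The 1.59 ug/mL point matches the lowest standard mentioned in the abstract.
conc = np.array([1.59, 5.0, 10.0, 25.0, 50.0])
area = np.array([3.2, 10.1, 20.3, 50.8, 101.5])

# Linear least-squares calibration curve: area = slope * conc + intercept.
slope, intercept = np.polyfit(conc, area, 1)

def sulfite_ug_per_g(peak_area, dilution_factor, sample_g, extract_mL):
    """Back-calculate SO3(2-) content (ug/g) of the original sample."""
    c = (peak_area - intercept) / slope          # ug/mL in the diluted extract
    return c * dilution_factor * extract_mL / sample_g

# Example: 10 g sample extracted into 250 mL 0.2 N NaOH, diluted 20-fold
# before injection (all illustrative values).
result = sulfite_ug_per_g(peak_area=40.0, dilution_factor=20, sample_g=10.0,
                          extract_mL=250.0)
```

The significant dilution noted in the abstract shows up here as the `dilution_factor`, which scales the low on-column concentration back up to the >1000 ug/g range expected for processed dried fruit.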

  19. Light assisted drying (LAD) for protein stabilization: optical characterization of samples

    NASA Astrophysics Data System (ADS)

    Young, Madison A.; McKinnon, Madison E.; Elliott, Gloria D.; Trammell, Susan R.

    2018-02-01

    Light-Assisted Drying (LAD) is a novel biopreservation technique that allows proteins to be immobilized in a dry, amorphous solid at room temperature. Indicator proteins are used in a variety of diagnostic assays ranging from high-throughput 96-well plates to new microfluidic devices. A challenge in the development of protein-based assays is preserving the structure of the protein during production and storage of the assay, as the structure of the protein is responsible for its functional activity. Freeze-drying or freezing are currently the standard for the preservation of proteins, but these methods are expensive and can be challenging in some environments due to a lack of available infrastructure. An inexpensive, simple processing method that enables supra-zero temperature storage of proteins used in assays is needed. Light-assisted drying offers a relatively inexpensive method for drying samples. Proteins suspended in a trehalose solution are dehydrated using near-infrared laser light. The laser radiation speeds drying, and as water is removed the sugar forms a protective matrix. The goal of this study is to optically characterize samples processed with LAD. We use polarized light imaging (PLI) to examine the crystallization kinetics of samples and determine the optimal humidity. PLI shows a 62.5% chance of crystallization during LAD processing and negligible crystallization during low-RH storage.

  20. Single-cell transcriptome conservation in cryopreserved cells and tissues.

    PubMed

    Guillaumet-Adkins, Amy; Rodríguez-Esteban, Gustavo; Mereu, Elisabetta; Mendez-Lago, Maria; Jaitin, Diego A; Villanueva, Alberto; Vidal, August; Martinez-Marti, Alex; Felip, Enriqueta; Vivancos, Ana; Keren-Shaul, Hadas; Heath, Simon; Gut, Marta; Amit, Ido; Gut, Ivo; Heyn, Holger

    2017-03-01

    A variety of single-cell RNA preparation procedures have been described. So far, protocols require fresh material, which hinders complex study designs. We describe a sample preservation method that maintains transcripts in viable single cells, allowing one to disconnect time and place of sampling from subsequent processing steps. We sequence single-cell transcriptomes from >1000 fresh and cryopreserved cells using 3'-end and full-length RNA preparation methods. Our results confirm that the conservation process did not alter transcriptional profiles. This substantially broadens the scope of applications in single-cell transcriptomics and could lead to a paradigm shift in future study designs.

  1. Temperament, Parenting, and Depressive Symptoms in a Population Sample of Preadolescents

    ERIC Educational Resources Information Center

    Oldehinkel, Albertine J.; Veenstra, Rene; Ormel, Johan; De Winter, Andrea F.; Verhulst, Frank C.

    2006-01-01

    Background: Depressive symptoms can be triggered by negative social experiences and individuals' processing of these experiences. This study focuses on the interaction between temperament, perceived parenting, and gender in relation to depressive problems in a Dutch population sample of preadolescents. Methods: The sample consisted of 2230…

  2. Characterization of fission gas bubbles in irradiated U-10Mo fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Casella, Andrew M.; Burkes, Douglas E.; MacFarlan, Paul J.

    2017-09-01

    Irradiated U-10Mo fuel samples were prepared with traditional mechanical potting and polishing methods within a hot cell. They were then removed and imaged with an SEM located outside of the hot cell. The images were then processed with basic imaging techniques from three separate software packages. The results were compared, and a baseline method for characterization of fission gas bubbles in the samples is proposed. It is hoped that, through adoption of or comparison to this baseline method, sample characterization can be standardized across the field of post-irradiation examination of metal fuels.
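The basic image processing described above amounts to thresholding followed by connected-component analysis of the bubble features. A minimal sketch of that pipeline, using a synthetic stand-in for an SEM micrograph (the threshold, image size, and bubble geometry are all invented, not from the study):

```python
import numpy as np
from scipy import ndimage

# Synthetic stand-in for an SEM image: dark matrix with a few bright "bubbles".
img = np.zeros((64, 64))
for cy, cx, r in [(10, 12, 3), (30, 40, 5), (50, 20, 4)]:
    yy, xx = np.ogrid[:64, :64]
    img[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 1.0
img += 0.05 * np.random.default_rng(0).random(img.shape)   # mild noise

# Baseline pipeline: global threshold, connected-component labeling,
# then per-bubble size statistics.
mask = img > 0.5
labels, n_bubbles = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, index=range(1, n_bubbles + 1))  # pixel areas
```

A real baseline would add magnification-dependent area calibration and porosity fractions, but the count-and-measure core is the same.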

  3. Method for determining the octane rating of gasoline samples by observing corresponding acoustic resonances therein

    DOEpatents

    Sinha, D.N.; Anthony, B.W.

    1997-02-25

    A method is described for determining the octane rating of gasoline samples by observing corresponding acoustic resonances therein. A direct correlation between the octane rating of gasoline and the frequency of corresponding acoustic resonances therein has been experimentally observed. Therefore, the octane rating of a gasoline sample can be directly determined through speed of sound measurements instead of by the cumbersome process of quantifying the knocking quality of the gasoline. Various receptacle geometries and construction materials may be employed. Moreover, it is anticipated that the measurements can be performed on flowing samples in pipes, thereby rendering the present method useful in refineries and distilleries. 3 figs.
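Because the patent reports a direct correlation between resonance frequency and octane rating, the rating of an unknown sample can be read off a calibration line fitted to gasolines of known octane. A sketch with hypothetical frequency data (the patent does not publish these numbers, and the real relationship need not be exactly linear):

```python
import numpy as np

# Hypothetical calibration set: acoustic resonance frequency (kHz) measured
# for gasolines of known octane rating.
octane = np.array([87.0, 89.0, 91.0, 93.0])
freq_khz = np.array([24.10, 24.32, 24.55, 24.77])

# Fit octane as a linear function of resonance frequency.
b, a = np.polyfit(freq_khz, octane, 1)

def octane_from_freq(f_khz):
    """Estimate octane rating from a measured resonance frequency."""
    return b * f_khz + a

estimate = octane_from_freq(24.43)  # unknown sample, illustrative reading
```

In the flowing-pipe scenario the abstract anticipates, the same calibration would simply be applied to continuously measured resonance frequencies.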

  4. [Comparison of sulfur fumigation processing and direct hot air heating technology on puerarin contents and efficacy of Puerariae Thomsonii Radix].

    PubMed

    Yu, Hong-Li; Zhang, Qian; Jin, Yang-Ping; Wang, Kui-Long; Lu, Tu-Lin; Li, Lin

    2016-07-01

    In order to compare the effects of sulfur fumigation processing and direct hot air heating technology on the puerarin content and efficacy of Puerariae Thomsonii Radix, the fresh roots of Pueraria thomsonii were cut into small pieces and prepared as direct sunshine-dried samples, direct hot air-dried samples, and sulfur fumigation-hot air-dried samples. Moisture contents of the samples were then determined. The puerarin contents of the different samples were compared by an HPLC method. Moreover, drunken mouse models were established, and with superoxide dismutase (SOD) content as the index, aqueous decoction extracts of sulfur-fumigated and non-fumigated Puerariae Thomsonii Radix samples were administered intragastrically (ig); the effects of sulfur fumigation on SOD contents in mouse liver and serum were determined, and the sulfur-fumigated and non-fumigated samples were inspected for moths and mildew under different packaging and storage conditions. Results showed that sulfur fumigation significantly changed the puerarin content of Puerariae Thomsonii Radix: the content of puerarin decreased gradually with increasing number of fumigations and amount of sulfur. SOD content in drunken mouse liver and serum also decreased significantly with increasing number of fumigations, showing a significant difference from both the direct sunshine drying group and the direct hot air drying group. Moths and mildew were not found in the sulfur-fumigated samples or the direct hot air-dried samples whose moisture contents were below the Pharmacopoeia limit. The research showed that sulfur fumigation can significantly reduce the content of the main active ingredients and reduce the efficacy of Puerariae Thomsonii Radix, indicating that the quality of Puerariae Thomsonii Radix was significantly decreased after sulfur fumigation.
However, the contents of the main active ingredients, the efficacy, and the storage results of the direct hot air-dried samples were similar to those of the direct sunshine-dried samples, so hot air drying is a suitable drying technology that could be promoted for use. Copyright© by the Chinese Pharmaceutical Association.

  5. A fast learning method for large scale and multi-class samples of SVM

    NASA Astrophysics Data System (ADS)

    Fan, Yu; Guo, Huiming

    2017-06-01

    A fast learning method for multi-class classification SVMs (Support Vector Machines), based on a binary tree, is presented to address the low learning efficiency of SVMs when processing large-scale multi-class samples. A bottom-up method is adopted to build the binary tree hierarchy; according to the achieved hierarchy, a sub-classifier learns from the corresponding samples of each node. During learning, several class clusters are generated by a first clustering of the training samples. Central points are first extracted from those clusters that contain only one type of sample. For clusters that contain two types of samples, the cluster numbers of their positive and negative samples are set according to their degree of mixture, a secondary clustering is undertaken, and central points are then extracted from the resulting sub-class clusters. Sub-classifiers are obtained by learning from the reduced sample sets formed by integrating the extracted central points. Simulation experiments show that this fast learning method, based on multi-level clustering, maintains high classification accuracy while greatly reducing the number of samples and effectively improving learning efficiency.
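The core speed-up is the "extract central points" step: each class's samples are replaced by a much smaller set of cluster centers before the SVM at a tree node is trained. A sketch of that reduction under invented data (this is an illustration of the idea, not the authors' exact algorithm, and the cluster counts here are arbitrary):

```python
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)

# Two hypothetical classes of training samples (large-scale in the paper;
# tiny 2-D Gaussians here for illustration).
pos = rng.normal(loc=[2.0, 2.0], scale=0.3, size=(200, 2))
neg = rng.normal(loc=[-2.0, -2.0], scale=0.3, size=(200, 2))

def reduce_by_clustering(samples, k):
    """Replace a class's samples with k cluster centers, mimicking the
    paper's central-point extraction step."""
    centers, _ = kmeans2(samples, k, seed=2, minit='++')
    return centers

# The reduced sets (k centers per class) are what a node's sub-classifier
# would train on, in place of the original 200 samples each.
reduced_pos = reduce_by_clustering(pos, 5)
reduced_neg = reduce_by_clustering(neg, 5)
```

Training a standard SVM on the 10 centers instead of the 400 originals is what yields the claimed efficiency gain, at the cost of some boundary detail, which the secondary clustering of mixed clusters is designed to recover.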

  6. Vibration and acoustic frequency spectra for industrial process modeling using selective fusion multi-condition samples and multi-source features

    NASA Astrophysics Data System (ADS)

    Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen

    2018-01-01

    Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by selectively fusing valued information from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven industrial process parameter model from mechanical vibration and acoustic frequency spectra, based on selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert's cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model for each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing selective information fusion based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.

  7. Ultrasonic-based membrane aided sample preparation of urine proteomes.

    PubMed

    Jesus, Jemmyson Romário; Santos, Hugo M; López-Fernández, H; Lodeiro, Carlos; Arruda, Marco Aurélio Zezzi; Capelo, J L

    2018-02-01

    A new ultrafast ultrasonic-based method for shotgun proteomics, as well as label-free protein quantification, in urine samples is developed. The method first separates the urine proteins using nitrocellulose-based membranes, and the proteins are then digested in-membrane using trypsin. The enzymatic digestion process is accelerated from overnight to four minutes using a sonoreactor ultrasonic device. Overall, the sample treatment pipeline comprising protein separation, digestion and identification is done in just 3 h. The process is assessed using urine from healthy volunteers. The method shows that males can be differentiated from females using the protein content of urine in a fast, easy and straightforward way. 232 and 226 proteins are identified in male and female urine, respectively. Of these, 162 are common to both genders, while 70 are unique to males and 64 to females. Of the 162 common proteins, 13 are present at levels that differ statistically (p < 0.05). The method matches the analytical minimalism concept as outlined by Halls, as each stage of the analysis is evaluated to minimize time, cost, sample requirement, reagent consumption, energy requirements and production of waste. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Probabilistic Round Trip Contamination Analysis of a Mars Sample Acquisition and Handling Process Using Markovian Decompositions

    NASA Technical Reports Server (NTRS)

    Hudson, Nicolas; Lin, Ying; Barengoltz, Jack

    2010-01-01

    A method for evaluating the probability of a Viable Earth Microorganism (VEM) contaminating a sample during the sample acquisition and handling (SAH) process of a potential future Mars Sample Return mission is developed. A scenario in which multiple core samples would be acquired using a rotary percussive coring tool, deployed from an arm on a MER-class rover, is analyzed. The analysis is conducted in a structured way by decomposing the sample acquisition and handling process into a series of discrete time steps, and breaking the physical system into a set of relevant components. At each discrete time step, two key functions are defined: the probability of a VEM being released from each component, and the transport matrix, which represents the probability of VEM transport from one component to another. By defining the expected number of VEMs on each component at the start of the sampling process, these decompositions allow the expected number of VEMs on each component at each sampling step to be represented as a Markov chain. This formalism provides a rigorous mathematical framework in which to analyze the probability of a VEM entering the sample chain, and makes the analysis tractable by breaking the process down into small analyzable steps.
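The Markov-chain propagation the abstract describes reduces to repeated application of the transport matrix to the vector of expected VEM counts. A toy sketch with a hypothetical three-component system (the component list, transport probabilities, and release terms are invented for illustration, not taken from the mission analysis):

```python
import numpy as np

# Hypothetical minimal component set for the SAH chain: tool bit, sample, container.
# n[i] = expected number of VEMs on component i at the current step.
n = np.array([50.0, 0.0, 0.0])            # start: contamination only on the tool

# T[i, j] = probability that a VEM on component i is transported to component j
# during one discrete step (rows sum to <= 1; the remainder stays or is lost).
T = np.array([[0.90, 0.05, 0.01],
              [0.00, 0.95, 0.02],
              [0.00, 0.00, 0.99]])

release = np.array([0.1, 0.0, 0.0])       # expected VEMs newly released per step

for _ in range(3):                        # three discrete sampling steps
    n = T.T @ n + release                 # propagate expectations one step

expected_in_sample = n[1]                 # expected VEM count on the sample
```

Because only expectations are tracked, the whole multi-step contamination analysis stays linear algebra, which is what makes the decomposition tractable.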

  9. Performance of Traditional and Molecular Methods for Detecting Biological Agents in Drinking Water

    USGS Publications Warehouse

    Francy, Donna S.; Bushon, Rebecca N.; Brady, Amie M.G.; Bertke, Erin E.; Kephart, Christopher M.; Likirdopulos, Christina A.; Mailot, Brian E.; Schaefer, Frank W.; Lindquist, H.D. Alan

    2009-01-01

    To reduce the impact from a possible bioterrorist attack on drinking-water supplies, analytical methods are needed to rapidly detect the presence of biological agents in water. To this end, 13 drinking-water samples were collected at 9 water-treatment plants in Ohio to assess the performance of a molecular method in comparison to traditional analytical methods that take longer to perform. Two 100-liter samples were collected at each site during each sampling event; one was seeded in the laboratory with six biological agents - Bacillus anthracis (B. anthracis), Burkholderia cepacia (as a surrogate for Bu. pseudomallei), Francisella tularensis (F. tularensis), Salmonella Typhi (S. Typhi), Vibrio cholerae (V. cholerae), and Cryptosporidium parvum (C. parvum). The seeded and unseeded samples were processed by ultrafiltration and analyzed by use of quantitative polymerase chain reaction (qPCR), a molecular method, and culture methods for bacterial agents or the immunomagnetic separation/fluorescent antibody (IMS/FA) method for C. parvum as traditional methods. Six replicate seeded samples were also processed and analyzed. For traditional methods, recoveries were highly variable between samples and even between some replicate samples, ranging from below detection to greater than 100 percent. Recoveries were significantly related to water pH, specific conductance, and dissolved organic carbon (DOC) for all bacteria combined by culture methods, but none of the water-quality characteristics tested were related to recoveries of C. parvum by IMS/FA. Recoveries were not determined by qPCR because of problems in quantifying organisms by qPCR in the composite seed. Instead, qPCR results were reported as detected, not detected (no qPCR signal), or +/- detected (Cycle Threshold or 'Ct' values were greater than 40). Several sample results by qPCR were omitted from the dataset because of possible problems with qPCR reagents, primers, and probes. 
For the remaining 14 qPCR results (including some replicate samples), F. tularensis and V. cholerae were detected in all samples after ultrafiltration, B. anthracis was detected in 13 and +/- detected in 1 sample, and C. parvum was detected in 9 and +/- detected in 4 samples. Bu. cepacia was detected in nine samples, +/- detected in two samples, and not detected in three samples (for two out of three samples not detected, a different strain was used). The qPCR assay for V. cholerae provided two false positive - but late - signals in one unseeded sample. Numbers found by qPCR after ultrafiltration were significantly or nearly significantly related to those found by traditional methods for B. anthracis, F. tularensis, and V. cholerae but not for Bu. cepacia and C. parvum. A qPCR assay for S. Typhi was not available. The qPCR method can be used to rapidly detect B. anthracis, F. tularensis, and V. cholerae with some certainty in drinking-water samples, but additional work would be needed to optimize and test qPCR for Bu. cepacia and C. parvum and establish relations to traditional methods. The specificity for the V. cholerae assay needs to be further investigated. Evidence is provided that ultrafiltration and qPCR are promising methods to rapidly detect biological agents in the Nation's drinking-water supplies and thus reduce the impact and consequences from intentional bioterrorist events. To our knowledge, this is the first study to compare the use of traditional and qPCR methods to detect biological agents in large-volume drinking-water samples.
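The three-way qPCR reporting rule used in this study is simple to state in code. A sketch (the Ct = 40 cutoff comes from the study itself, but it should be treated as assay-specific rather than universal):

```python
def classify_ct(ct):
    """Map a qPCR cycle-threshold value to the study's report categories:
    'detected', '+/- detected' (Ct > 40), or 'not detected' (no signal,
    represented here as ct=None)."""
    if ct is None:
        return "not detected"
    return "+/- detected" if ct > 40 else "detected"

# Illustrative Ct readings for three wells.
results = [classify_ct(ct) for ct in (31.2, 42.5, None)]
```

Treating late amplifications (Ct > 40) as a separate "+/- detected" class, rather than folding them into either definite category, is what lets the study flag marginal signals like the late V. cholerae false positives.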

  10. A Metallurgical Study of Nāga Bhasma.

    PubMed

    Singh Gautam, Dev Nath

    2017-01-01

    The metal Nāga (Lead) has been used by Indians since ancient times. Its external and internal uses are described in the Caraka, Suśruta and other Ayurvedic Saṃhitās. According to most of the Rasa texts, Nāga Bhasma and its formulations are used in many diseases such as Prameha, Jvara, Gulma, Śukrameha etc. In the present study, Nāga Bhasma was prepared by the traditional Puṭa method (TPM) and by the electric muffle furnace Puṭa method (EMFPM) and standardized using metallographic studies. Doing so helps in the study of the microstructure of Nāga Bhasma and in the identification of the metal particles, along with the nature of the compound formed during the Māraṇa (Bhasmīkaraṇa) process. Different samples, from initial raw material to final product of Nāga Bhasma, were collected during the pharmaceutical process (1st, 30th and 60th Puṭa) from both methods, i.e. TPM and EMFPM, and studied by metallographic examination. The processing of the Nāga Bhasma (ṣaṣṭipuṭa) was done according to Ānanda Kanda [9]. Samples of the raw material, i.e. Aśodhita Nāga (raw Lead), and of the material processed after the 1st, 30th and 60th Puṭa from both methods, i.e. the traditional Puṭa method (using heat from burning cow dung cakes) and the electric muffle furnace Puṭa method, were taken. They were mounted on a self-hardening acrylic base. After careful polishing to obtain a scratch-free surface, they were used for the metallurgical study. This study shows that the traditional Puṭa method may be better than the electric muffle furnace Puṭa method because of the more homogeneous distribution of Lead sulphide in the Nāga Bhasma prepared by the traditional method.

  11. Free energy reconstruction from steered dynamics without post-processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athenes, Manuel, E-mail: Manuel.Athenes@cea.f; Condensed Matter and Materials Division, Physics and Life Sciences Directorate, LLNL, Livermore, CA 94551; Marinica, Mihai-Cosmin

    2010-09-20

    Various methods achieving importance sampling in ensembles of nonequilibrium trajectories enable one to estimate free energy differences and, by maximum-likelihood post-processing, to reconstruct free energy landscapes. Here, based on Bayes' theorem, we propose a more direct method in which a posterior likelihood function is used both to construct the steered dynamics and to infer the contribution to equilibrium of all the sampled states. The method is implemented with two steering schedules. First, using non-autonomous steering, we calculate the migration barrier of the vacancy in α-Fe. Second, using an autonomous scheduling related to metadynamics and equivalent to temperature-accelerated molecular dynamics, we accurately reconstruct the two-dimensional free energy landscape of the 38-atom Lennard-Jones cluster as a function of an orientational bond-order parameter and energy, down to the solid-solid structural transition temperature of the cluster and without maximum-likelihood post-processing.

  12. A "three-in-one" sample preparation method for simultaneous determination of B-group water-soluble vitamins in infant formula using VitaFast(®) kits.

    PubMed

    Zhang, Heng; Lan, Fang; Shi, Yupeng; Wan, Zhi-Gang; Yue, Zhen-Feng; Fan, Fang; Lin, Yan-Kui; Tang, Mu-Jin; Lv, Jing-Zhang; Xiao, Tan; Yi, Changqing

    2014-06-15

    VitaFast(®) test kits, designed for microbiological assays in microtiter plate format, can be applied to the quantitative determination of B-group water-soluble vitamins such as vitamin B12, folic acid and biotin. Compared to traditional microbiological methods, VitaFast(®) kits significantly reduce sample processing time and provide greater reliability, higher productivity and better accuracy. Recently, simultaneous determination of vitamin B12, folic acid and biotin in one sample has become an urgent requirement when evaluating the quality of infant formulae in our practical work. However, the present sample preparation protocols, which were developed for the individual test systems, are incompatible with simultaneous determination of several analytes. To solve this problem, a novel "three-in-one" sample preparation method is herein developed for the simultaneous determination of B-group water-soluble vitamins using VitaFast(®) kits. The performance of this novel "three-in-one" sample preparation method was systematically evaluated by comparison with the individual sample preparation protocols. The experimental results of assays employing the "three-in-one" sample preparation method were in good agreement with those obtained from the conventional VitaFast(®) extraction methods, indicating that the proposed method is applicable to the three present VitaFast(®) vitamin test systems and thus offers a promising alternative to the three independent sample preparation methods. The proposed new sample preparation method will significantly improve the efficiency of infant formula inspection. Copyright © 2013 Elsevier Ltd. All rights reserved.

  13. Investigation of the Effluents Produced during the Functioning of Black and White Colored Smoke Devices.

    DTIC Science & Technology

    1986-01-31

    and 4% diatomaceous earth (binder). Modified EPA Method 5 Sampling Train F The modified EPA Method 5 sampling train used was similar to the one...the fiber glass filter paper were taken by the Amberlite XAD-2. The XAD-2 is a porous polymer adsorbent used to sample organic vapors in effluents...from different kinds of combustion processes. Although a careful clean-up procedure was taken to wash the adsorbents before using, the polymer may still

  14. Analysis of defect structure in silicon. Silicon sheet growth development for the large area silicon sheet task of the Low-Cost Solar array Project

    NASA Technical Reports Server (NTRS)

    Natesh, R.; Mena, M.; Plichta, M.; Smith, J. M.; Sellani, M. A.

    1982-01-01

    One hundred ninety-three silicon sheet samples, approximately 880 square centimeters, were analyzed for twin boundary density, dislocation pit density, and grain boundary length. One hundred fifteen of these samples were manufactured by a heat exchanger method, thirty-eight by edge-defined film-fed growth, twenty-three by the silicon-on-ceramics process, and ten by the dendritic web process. Seven solar cells were also step-etched to determine the internal defect distribution in these samples. Procedures were developed for the quantitative characterization of structural defects such as dislocation pits, precipitates, and twin and grain boundaries using a QTM 720 quantitative image analyzing system interfaced with a PDP 11/03 minicomputer. Characterization of the grain boundary length per unit area for polycrystalline samples was done using the intercept method on an Olympus HBM microscope.

  15. Amino Acid Enantiomeric Ratios in Biogeochemistry: Complications and Opportunities

    NASA Astrophysics Data System (ADS)

    McDonald, G. D.; Sun, H. J.; Tsapin, A. I.

    2003-12-01

    Amino acid enantiomeric ratios have been used for many years as an indicator of the process of racemization, and thus as a method to determine the age of biological samples such as bones, shells, and teeth. Dating biological samples by this method relies on an accurate knowledge of the environmental temperatures the sample has experienced, and the racemization kinetic parameters in the sample matrix. In some environments, where an independent dating method such as radiocarbon is available, the observed amino acid D/L ratios are found to be either higher or lower than those expected due to racemization alone. The observed D/L ratios in these cases can be clues to biogeochemical processes operating in addition to, or in place of, chemical racemization. In Siberian permafrost (Brinton et al. 2002, Astrobiology 2, 77) we have found D/L ratios lower than expected, which we have interpreted as evidence for low-level D-amino acid metabolism and recycling in microorganisms previously thought to be metabolically dormant. In microbially-colonized Antarctic Dry Valley sandstones (McDonald and Sun 2002, Eos Trans. AGU 83, Fall Meet. Suppl., Abstract B11A-0720) we have found D/L ratios higher than can be accounted for by racemization alone, most likely due to the accumulation of D-amino-acid-containing peptidoglycan material from multiple bacterial generations. D/L profiles in polar ices and in ice-covered lakes (Tsapin et al. 2002, Astrobiology 2, 632) can be used to indicate the sources and histories of water or ice samples. Multiple biological and biogeochemical processes may complicate the interpretation of amino acid enantiomeric excesses in both terrestrial and extraterrestrial samples; however, amino acid racemization remains a useful tool in biogeochemistry and astrobiology. 
With a good knowledge of the environmental history of samples, amino acid D/L profiles can be used as a window into processes such as molecular repair and biomass turnover that are difficult to detect by other means, particularly over geological time scales.
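Amino acid racemization dating, which the record above takes as its baseline, conventionally uses the first-order reversible kinetics relation ln[(1+D/L)/(1-D/L)] = 2kt (for amino acids with an equilibrium D/L of 1). A sketch of the age calculation, where the rate constant and the D/L values are purely illustrative (in practice k is strongly temperature- and matrix-dependent, which is exactly the complication the abstract discusses):

```python
import math

def racemization_age(dl, dl0, k):
    """First-order reversible racemization age estimate (years):
    ln[(1+D/L)/(1-D/L)] - ln[(1+D/L0)/(1-D/L0)] = 2 k t,
    where dl0 is the D/L ratio at time zero and k is the rate
    constant (1/yr). All numeric values here are illustrative."""
    f = lambda r: math.log((1.0 + r) / (1.0 - r))
    return (f(dl) - f(dl0)) / (2.0 * k)

age = racemization_age(dl=0.10, dl0=0.02, k=1.0e-5)
```

Deviations from the age predicted by this relation, in either direction, are the biogeochemical signals (D-amino acid metabolism, peptidoglycan accumulation) that the abstract interprets.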

  16. Non-Contact Conductivity Measurement for Automated Sample Processing Systems

    NASA Technical Reports Server (NTRS)

    Beegle, Luther W.; Kirby, James P.

    2012-01-01

    A new method has been developed for monitoring and control of automated sample processing and preparation, focusing especially on desalting of samples before analysis (described in more detail in Automated Desalting Apparatus, (NPO-45428), NASA Tech Briefs, Vol. 34, No. 8 (August 2010), page 44). The use of non-contact conductivity probes, one at the inlet and one at the outlet of the solid-phase sample preparation media, allows monitoring of the process and acts as a trigger for the start of the next step in the sequence (see figure). At each step of the multi-step process, the system is flushed with low-conductivity water, which sets the system back to an overall low-conductivity state. This measurement then triggers the next stage of the sample processing protocol, and greatly minimizes the use of consumables. In the case of amino acid sample preparation for desalting, the conductivity measurement defines three key conditions for the sample preparation process: first, when the system is neutralized (low conductivity, by washing with excess de-ionized water); second, when the system is acidified by washing with a strong acid (high conductivity); and third, when the system is at a basic condition of high pH (high conductivity). Taken together, this non-contact conductivity measurement will not only facilitate automation of sample preparation and processing, but will also act as a way to optimize operational time and the use of consumables.
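The trigger logic described above is essentially a state machine advanced by conductivity readings. A minimal sketch, assuming illustrative thresholds and stage names (the NASA system's actual setpoints and sequencing are not given in the record):

```python
# Hedged sketch of the conductivity-triggered sequencing: the protocol advances
# only when the outlet probe confirms the expected state. Thresholds (uS/cm)
# and stage names are invented for illustration.
LOW, HIGH = 5.0, 200.0

def next_stage(stage, conductivity):
    """Advance the three-condition protocol: neutralized (low conductivity)
    -> acidified (high) -> water flush (low) -> basified (high) -> done."""
    if stage == "flush" and conductivity < LOW:
        return "acidify"
    if stage == "acidify" and conductivity > HIGH:
        return "flush2"
    if stage == "flush2" and conductivity < LOW:
        return "basify"
    if stage == "basify" and conductivity > HIGH:
        return "done"
    return stage            # condition not yet met: stay and keep washing

# Simulated probe readings drive the sequence forward.
stage = "flush"
for c in (80.0, 3.0, 250.0, 2.5, 300.0):
    stage = next_stage(stage, c)
```

Because each transition fires only when the measured state is confirmed, wash water and reagents are used just long enough to reach the setpoint, which is how the design minimizes consumables.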

  17. Detection of Respiratory Viruses in Sputum from Adults by Use of Automated Multiplex PCR

    PubMed Central

    Walsh, Edward E.; Formica, Maria A.; Falsey, Ann R.

    2014-01-01

    Respiratory tract infections (RTI) frequently cause hospital admissions among adults. Diagnostic viral reverse transcriptase PCR (RT-PCR) of nose and throat swabs (NTS) is useful for patient care by informing antiviral use and appropriate isolation. However, automated RT-PCR systems are not amenable to utilizing sputum due to its viscosity. We evaluated a simple method of processing sputum samples in a fully automated respiratory viral panel RT-PCR assay (FilmArray). Archived sputum and NTS samples collected in 2008-2012 from hospitalized adults with RTI were evaluated. A subset of sputum samples positive for 10 common viruses by a uniplex RT-PCR was selected. A sterile cotton-tip swab was dunked in sputum, swirled in 700 μL of sterile water (dunk and swirl method) and tested by the FilmArray assay. Quantitative RT-PCR was performed on “dunked” sputum and NTS samples for influenza A (Flu A), respiratory syncytial virus (RSV), coronavirus OC43 (OC43), and human metapneumovirus (HMPV). Viruses were identified in 31% of 965 illnesses using a uniplex RT-PCR. The sputum sample was the only sample positive for 105 subjects, including 35% (22/64) of influenza cases and significantly increased the diagnostic yield of NTS alone (302/965 [31%] versus 197/965 [20%]; P = 0.0001). Of 108 sputum samples evaluated by the FilmArray assay using the dunk and swirl method, 99 (92%) were positive. Quantitative RT-PCR revealed higher mean viral loads in dunked sputum samples compared to NTS samples for Flu A, RSV, and HMPV (P = 0.0001, P = 0.006, and P = 0.011, respectively). The dunk and swirl method is a simple and practical method for reliably processing sputum samples in a fully automated PCR system. The higher viral loads in sputa may increase detection over NTS testing alone. PMID:25056335

  18. Investigating the soil removal characteristics of flexible tube coring method for lunar exploration

    NASA Astrophysics Data System (ADS)

    Tang, Junyue; Quan, Qiquan; Jiang, Shengyuan; Liang, Jieneng; Lu, Xiangyong; Yuan, Fengpei

    2018-02-01

    Compared with other technical solutions, sampling the planetary soil and returning it to Earth may be the most direct method of seeking evidence of extraterrestrial life. To preserve the sample's stratification for further analysis, a novel sampling method called flexible tube coring has been adopted for China's future lunar explorations. Given the uncertain physical properties of lunar regolith, proper drilling parameters should be adjusted immediately during the piercing process; otherwise, only a small amount of core may be sampled and overload drilling faults may occur. Because the removed soil is inevitably connected with the cored soil, soil removal characteristics may have a great influence on both drilling loads and coring results. To understand the soil removal characteristics, a non-contact measurement was proposed and verified to acquire the coring and removal results accurately. Further experiments in a homogeneous lunar regolith simulant were conducted, revealing that there is a sudden core failure during the sampling process and that the final coring results are determined by the penetration-per-revolution index. Due to the core failure, both the drilling loads and the soil's removal state are also affected.

  19. Ultrasonic imaging of textured alumina

    NASA Technical Reports Server (NTRS)

    Stang, David B.; Salem, Jonathan A.; Generazio, Edward R.

    1989-01-01

    Ultrasonic images representing the bulk attenuation and velocity of a set of alumina samples were obtained by a pulse-echo contact scanning technique. The samples were taken from larger bodies that were chemically similar but were processed by extrusion or isostatic processing. The crack growth resistance and fracture toughness of the larger bodies were found to vary with processing method and test orientation. The results presented here demonstrate that differences in texture that contribute to variations in structural performance can be revealed by analytic ultrasonic techniques.

  20. Resource Input, Service Process and Resident Activity Indicators in a Welsh National Random Sample of Staffed Housing Services for People with Intellectual Disabilities

    ERIC Educational Resources Information Center

    Felce, David; Perry, Jonathan

    2004-01-01

    Background: The aims were to: (i) explore the association between age and size of setting and staffing per resident; and (ii) report resident and setting characteristics, and indicators of service process and resident activity for a national random sample of staffed housing provision. Methods: Sixty settings were selected randomly from those…

  1. Monitoring of an antigen manufacturing process.

    PubMed

    Zavatti, Vanessa; Budman, Hector; Legge, Raymond; Tamer, Melih

    2016-06-01

    Fluorescence spectroscopy in combination with multivariate statistical methods was employed as a tool for monitoring the manufacturing process of pertactin (PRN), one of the virulence factors of Bordetella pertussis utilized in whooping cough vaccines. Fluorophores such as amino acids and co-enzymes were detected throughout the process. The fluorescence data collected at different stages of the fermentation and purification process were treated employing principal component analysis (PCA). Through PCA, it was feasible to identify sources of variability in PRN production. Then, partial least squares (PLS) regression was employed to correlate the fluorescence spectra obtained from pure PRN samples with the final protein content measured by a Kjeldahl test on these samples. Given that a statistically significant correlation was found between fluorescence and PRN levels, this approach could be further used as a method to predict the final protein content.
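    The PCA-then-PLS workflow described above can be sketched with NumPy alone. The real fluorescence spectra and Kjeldahl measurements are not available here, so the data below are simulated, and the one-component NIPALS PLS1 is a minimal illustration of the approach rather than the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 30 "fluorescence spectra" (200 wavelengths) whose
# intensities scale with a hidden protein level, plus measurement noise.
protein = rng.uniform(1.0, 5.0, size=30)          # stand-in "Kjeldahl" protein content
basis = rng.normal(size=200)                      # stand-in spectral signature
X = np.outer(protein, basis) + 0.1 * rng.normal(size=(30, 200))

# PCA via SVD on mean-centred spectra: scores expose the dominant variability
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = U * s                                    # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)                   # variance fraction per component

# One-component PLS1 (NIPALS) relating spectra to protein content
y = protein - protein.mean()
w = Xc.T @ y
w /= np.linalg.norm(w)                            # weight vector maximizing covariance
t = Xc @ w                                        # latent scores
b = (t @ y) / (t @ t)                             # inner regression coefficient
y_hat = t * b + protein.mean()
r2 = 1 - np.sum((protein - y_hat) ** 2) / np.sum((protein - protein.mean()) ** 2)
```

    On this rank-one synthetic data a single latent component suffices; real spectra would typically need several components and cross-validation to choose their number.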

  2. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    NASA Astrophysics Data System (ADS)

    Yang, Kuojun; Tian, Shulin; Zeng, Hao; Qiu, Lei; Guo, Lianping

    2014-04-01

    In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and the oscilloscope is blind to the input signal; this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes loss of the measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to the displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with varying brightness, so a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM, which processes several sampled points simultaneously, and a pipelining technique based on dual-port random access memory, which can process one sampled point per clock period, are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) devices are used to store sampled data alternately, so that acquisition can continue during data processing. Therefore, the dead time of the DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6,250,000 wfms/s (waveforms per second), the highest value among existing oscilloscopes. The test results also prove that there is no dead time in our oscilloscope, thus realizing seamless acquisition.
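    The core of the TWM idea, mapping repeated acquisitions into a (time, voltage) hit-count grid whose counts become display brightness, can be illustrated with a short NumPy sketch. The grid size, test signal, and noise level are arbitrary choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
H, W = 64, 256                      # display grid: 64 voltage levels x 256 time columns
persistence = np.zeros((H, W), dtype=np.uint32)

def map_waveform(samples, grid):
    """Accumulate one acquisition into the (time, voltage) hit-count grid.

    The per-pixel hit count is the 'third dimension' of the displayed
    waveform: pixels hit more often are later drawn brighter.
    """
    cols = np.arange(grid.shape[1])
    rows = np.clip(((samples + 1.0) / 2.0 * (grid.shape[0] - 1)).astype(int),
                   0, grid.shape[0] - 1)
    grid[rows, cols] += 1

t = np.linspace(0, 4 * np.pi, W)
for _ in range(1000):               # 1000 noisy acquisitions of a sine wave
    map_waveform(np.sin(t) + 0.05 * rng.normal(size=W), persistence)

# normalize hit counts to display brightness (0..255)
brightness = (255 * persistence / persistence.max()).astype(np.uint8)
```

    A hardware TWM engine performs the same accumulation in parallel across sampled points; the point of the sketch is only the mapping itself.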

  3. A seamless acquisition digital storage oscilloscope with three-dimensional waveform display

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Kuojun, E-mail: kuojunyang@gmail.com; Guo, Lianping; School of Electrical and Electronic Engineering, Nanyang Technological University

    In a traditional digital storage oscilloscope (DSO), sampled data need to be processed after each acquisition. During data processing, the acquisition is stopped and the oscilloscope is blind to the input signal; this duration is called dead time. With the rapid development of modern electronic systems, the effect of infrequent events becomes significant. To capture these occasional events in a shorter time, the dead time in a traditional DSO, which causes loss of the measured signal, needs to be reduced or even eliminated. In this paper, a seamless acquisition oscilloscope without dead time is proposed. In this oscilloscope, a three-dimensional waveform mapping (TWM) technique, which converts sampled data to the displayed waveform, is proposed. With this technique, not only is the processing speed improved, but the probability information of the waveform is also displayed with varying brightness, so a three-dimensional waveform is shown to the user. To reduce processing time further, parallel TWM, which processes several sampled points simultaneously, and a pipelining technique based on dual-port random access memory, which can process one sampled point per clock period, are proposed. Furthermore, two DDR3 (Double-Data-Rate Three Synchronous Dynamic Random Access Memory) devices are used to store sampled data alternately, so that acquisition can continue during data processing. Therefore, the dead time of the DSO is eliminated. In addition, a double-pulse test method is adopted to test the waveform capturing rate (WCR) of the oscilloscope, and a combined pulse test method is employed to evaluate the oscilloscope's capture ability comprehensively. The experimental results show that the WCR of the designed oscilloscope is 6,250,000 wfms/s (waveforms per second), the highest value among existing oscilloscopes. The test results also prove that there is no dead time in our oscilloscope, thus realizing seamless acquisition.

  4. Sparse sampling and reconstruction for electron and scanning probe microscope imaging

    DOEpatents

    Anderson, Hyrum; Helms, Jovana; Wheeler, Jason W.; Larson, Kurt W.; Rohrer, Brandon R.

    2015-07-28

    Systems and methods for conducting electron or scanning probe microscopy are provided herein. In a general embodiment, the systems and methods for conducting electron or scanning probe microscopy with an undersampled data set include: driving an electron beam or probe to scan across a sample and visit a subset of pixel locations of the sample that are randomly or pseudo-randomly designated; determining actual pixel locations on the sample that are visited by the electron beam or probe; and processing data collected by detectors from the visits of the electron beam or probe at the actual pixel locations and recovering a reconstructed image of the sample.
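    The patented workflow, visiting a pseudo-random subset of pixel locations and recovering a full image from the undersampled data, can be illustrated in miniature. The sketch below uses simple iterative neighbour averaging as a stand-in for the compressed-sensing style reconstruction a real system would use; the test image, sampling fraction, and iteration count are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ground-truth "micrograph": a smooth synthetic image on a 64 x 64 grid
yy, xx = np.mgrid[0:64, 0:64]
truth = np.sin(xx / 10.0) * np.cos(yy / 12.0)

# Visit ~20% of pixel locations, chosen pseudo-randomly (the undersampled scan)
mask = rng.random(truth.shape) < 0.2
measured = np.where(mask, truth, 0.0)

# Recover an image by iterative neighbour averaging: unknown pixels relax to
# the mean of their four neighbours while visited pixels stay fixed.
img = measured.copy()
for _ in range(500):
    nb = (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
          np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
    img = np.where(mask, measured, nb)

rmse = np.sqrt(np.mean((img - truth) ** 2))
```

    For dose-sensitive samples the payoff is that only ~20% of the pixels are exposed to the beam, while the reconstruction recovers a usable image of the whole field.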

  5. System and method for measuring fluorescence of a sample

    DOEpatents

    Riot, Vincent J

    2015-03-24

    The present disclosure provides a system and a method for measuring the fluorescence of a sample. The sample may be a polymerase-chain-reaction (PCR) array, a loop-mediated-isothermal-amplification array, etc. LEDs are used to excite the sample, and a photodiode is used to collect the sample's fluorescence. An electronic offset signal is used to reduce the effects of background fluorescence and noise from the measurement system. An integrator integrates the difference between the output of the photodiode and the electronic offset signal over a given period of time. The resulting integral is then converted into the digital domain for further processing and storage.
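    The integrate-then-digitize scheme, integrating the difference between the photodiode output and the electronic offset and converting the result to a digital code, can be sketched numerically. All signal levels, the window length, and the ADC full scale below are invented for illustration and are not values from the patent:

```python
import numpy as np

fs = 10_000.0                     # simulated photodiode sampling rate, Hz (assumed)
t = np.arange(0, 0.1, 1 / fs)     # 100 ms measurement window

background = 0.20                            # V: background fluorescence level (assumed)
signal = background + 0.05 * (t > 0.02)      # sample fluorescence appears at t = 20 ms
offset = background                          # electronic offset cancels the background

# Integrate (photodiode - offset) over the window (rectangle rule),
# then quantize the volt-second result to a 12-bit ADC code.
integral = np.sum(signal - offset) / fs      # volt-seconds
full_scale = 0.01                            # assumed ADC full scale, volt-seconds
code = int(round(integral / full_scale * 4095))
```

    Subtracting the offset before integrating is what keeps the large constant background from consuming the integrator's (and ADC's) dynamic range.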

  6. Determination of polychlorinated biphenyls in milk samples by saponification-solid-phase microextraction.

    PubMed

    Llompart, M; Pazos, M; Landin, P; Cela, R

    2001-12-15

    A saponification-HSSPME procedure has been developed for the extraction of PCBs from milk samples. Saponification of the samples improves PCB extraction efficiency and lowers the background. A mixed-level fractional factorial design has been used to optimize the sample preparation process. Five variables were considered: extraction time, agitation, kind of microextraction fiber, and the concentration and volume of the NaOH aqueous solution. The kinetics of the process have also been studied with the two fibers (100-μm PDMS and 65-μm PDMS-DVB) included in this study. Analyses were performed on a gas chromatograph equipped with an electron capture detector and a gas chromatograph coupled to a mass selective detector working in MS-MS mode. The proposed method is simple and rapid, and yields high sensitivity, with detection limits below 1 ng/mL, good linearity, and good reproducibility. The method has been applied to liquid milk samples with different fat contents covering the whole commercial range, and it has been validated with a powdered-milk certified reference material.

  7. Tackling sampling challenges in biomolecular simulations.

    PubMed

    Barducci, Alessandro; Pfaendtner, Jim; Bonomi, Massimiliano

    2015-01-01

    Molecular dynamics (MD) simulations are a powerful tool to give an atomistic insight into the structure and dynamics of proteins. However, the time scales accessible in standard simulations, which often do not match those in which interesting biological processes occur, limit their predictive capabilities. Many advanced sampling techniques have been proposed over the years to overcome this limitation. This chapter focuses on metadynamics, a method based on the introduction of a time-dependent bias potential to accelerate sampling and recover equilibrium properties of a few descriptors that are able to capture the complexity of a process at a coarse-grained level. The theory of metadynamics and its combination with other popular sampling techniques such as the replica exchange method is briefly presented. Practical applications of these techniques to the study of the Trp-Cage miniprotein folding are also illustrated. The examples contain a guide for performing these calculations with PLUMED, a plugin to perform enhanced sampling simulations in combination with many popular MD codes.
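    A minimal 1-D sketch conveys the core idea of metadynamics described above: a history-dependent bias built from deposited Gaussian hills discourages revisiting sampled regions, and the accumulated bias approximates the negative free energy. The double-well potential, hill parameters, and Langevin integrator below are toy choices, not PLUMED settings:

```python
import numpy as np

rng = np.random.default_rng(3)

def potential(x):
    """Toy double-well free-energy surface along one collective variable."""
    return x**4 - 2 * x**2

centres = []                 # centres of the Gaussian hills deposited so far
h, w = 0.1, 0.2              # hill height and width (toy values)

def bias(x):
    if not centres:
        return 0.0
    c = np.asarray(centres)
    return float(np.sum(h * np.exp(-((x - c) ** 2) / (2 * w**2))))

# Overdamped Langevin dynamics on potential + history-dependent bias
x, beta, dt, eps = -1.0, 4.0, 1e-3, 1e-5
for step in range(20000):
    # numerical gradient of the total (biased) energy
    grad = (potential(x + eps) + bias(x + eps)
            - potential(x - eps) - bias(x - eps)) / (2 * eps)
    x += -grad * dt + np.sqrt(2 * dt / beta) * rng.normal()
    if step % 200 == 0:
        centres.append(x)    # deposit a new hill every 200 steps

# The accumulated bias approximates the negative free energy (up to a constant)
xs = np.linspace(-1.5, 1.5, 101)
filled = np.array([bias(v) for v in xs])
```

    In production codes the bias acts on collective variables of a full MD system and is often tempered; the sketch only shows the hill-deposition mechanism.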

  8. Fluidics platform and method for sample preparation and analysis

    DOEpatents

    Benner, W. Henry; Dzenitis, John M.; Bennet, William J.; Baker, Brian R.

    2014-08-19

    Provided herein are a fluidics platform and method for sample preparation and analysis. The fluidics platform is capable of analyzing DNA from blood samples using amplification assays such as polymerase-chain-reaction assays and loop-mediated-isothermal-amplification assays. The fluidics platform can also be used for other types of assays and analyses. In some embodiments, a sample in a sealed tube can be inserted directly, and the subsequent isolation, detection, and analyses can be performed without user intervention. The disclosed platform may also comprise a sample preparation system with a magnetic actuator, a heater, and an air-drying mechanism, and fluid manipulation processes for extraction, washing, elution, assay assembly, assay detection, and cleaning after reactions and between samples.

  9. An Exploration of Strategic Planning Perspectives and Processes within Community Colleges Identified as Being Distinctive in Their Strategic Planning Practices

    ERIC Educational Resources Information Center

    Augustyniak, Lisa J.

    2015-01-01

    Community college leaders face unprecedented change, and some have begun reexamining their institutional strategic planning processes. Yet, studies in higher education strategic planning spend little time examining how community colleges formulate their strategic plans. This mixed-method qualitative study used an expert sampling method to identify…

  10. Secondary Schools Principals and Their Job Satisfaction: A Test of Process Theories

    ERIC Educational Resources Information Center

    Maforah, Tsholofelo Paulinah

    2015-01-01

    The study aims to test the validity of process theories on the job satisfaction of previously disadvantaged Secondary School principals in the North West province. A mixed-method approach consisting of both quantitative and qualitative methods was used for the study. A questionnaire was administered during the quantitative phase with a sample that…

  11. Optimization of HPV DNA detection in urine by improving collection, storage, and extraction.

    PubMed

    Vorsters, A; Van den Bergh, J; Micalessi, I; Biesmans, S; Bogers, J; Hens, A; De Coster, I; Ieven, M; Van Damme, P

    2014-11-01

    The benefits of using urine for the detection of human papillomavirus (HPV) DNA have been evaluated in disease surveillance, epidemiological studies, and screening for cervical cancers in specific subgroups. HPV DNA testing in urine is being considered for important purposes, notably the monitoring of HPV vaccination in adolescent girls and young women who do not wish to have a vaginal examination. The need to optimize and standardize sampling, storage, and processing has been reported. In this paper, we examined the impact of a DNA-conservation buffer, the extraction method, and urine sampling on the detection of HPV DNA and human DNA in urine provided by 44 women with a cytologically normal but HPV DNA-positive cervical sample. Ten women provided first-void and midstream urine samples. DNA analysis was performed using real-time PCR to allow quantification of HPV and human DNA. The results showed that an optimized method for HPV DNA detection in urine should (a) prevent DNA degradation during extraction and storage, (b) recover cell-free HPV DNA in addition to cell-associated DNA, (c) process a sufficient volume of urine, and (d) use a first-void sample. In addition, we found that detectable human DNA in urine may not be a good internal control for sample validity. HPV prevalence data that are based on urine samples collected, stored, and/or processed under suboptimal conditions may underestimate infection rates.

  12. Hamiltonian Monte Carlo acceleration using surrogate functions with random bases.

    PubMed

    Zhang, Cheng; Shahbaba, Babak; Zhao, Hongkai

    2017-11-01

    For big data analysis, the high computational cost of Bayesian methods often limits their application in practice. In recent years, there have been many attempts to improve the computational efficiency of Bayesian inference. Here we propose an efficient and scalable computational technique for a state-of-the-art Markov chain Monte Carlo method, namely Hamiltonian Monte Carlo. The key idea is to explore and exploit the structure and regularity in the parameter space of the underlying probabilistic model to construct an effective approximation of its geometric properties. To this end, we build a surrogate function to approximate the target distribution using properly chosen random bases and an efficient optimization process. The resulting method provides a flexible, scalable, and efficient sampling algorithm which converges to the correct target distribution. We show that by choosing the basis functions and optimization process differently, our method can be related to other approaches for the construction of surrogate functions, such as generalized additive models or Gaussian process models. Experiments based on simulated and real data show that our approach leads to substantially more efficient sampling algorithms than existing state-of-the-art methods.

  13. Geotechnical characterization of mined clay from Appalachian Ohio: challenges and implications for the clay mining industry.

    PubMed

    Moran, Anthony R; Hettiarachchi, Hiroshan

    2011-07-01

    Clayey soil found in coal mines in Appalachian Ohio is often sold to landfills for constructing Recompacted Soil Liners (RSL). Since clayey soils possess low hydraulic conductivity, the suitability of mined clay for RSL in Ohio is first assessed by determining its clay content. When soil samples are tested in a laboratory, the same engineering properties are typically expected for soils originating from the same source, provided that the testing techniques applied are standard; however, mined clay from Appalachian Ohio has shown drastic differences in particle size distribution depending on the sampling and/or laboratory processing methods. Sometimes more than a 10 percent decrease in clay content is observed in samples collected at the stockpiles compared to those collected through reverse circulation drilling. This discrepancy poses a challenge to geotechnical engineers who work on the prequalification process of RSL material, as it can result in misleading estimates of the hydraulic conductivity of the samples. This paper describes a laboratory investigation conducted on mined clay from Appalachian Ohio to determine how and why the standard sampling and/or processing methods can affect the grain-size distributions. The variation in clay content was determined to be due to heavy concentrations of shale fragments in the clayey soils. It was also concluded that, in order to obtain reliable grain-size distributions from samples collected at a stockpile of mined clay, the material needs to be processed using a soil grinder. Otherwise, the samples should be collected through drilling.

  14. Geotechnical Characterization of Mined Clay from Appalachian Ohio: Challenges and Implications for the Clay Mining Industry

    PubMed Central

    Moran, Anthony R.; Hettiarachchi, Hiroshan

    2011-01-01

    Clayey soil found in coal mines in Appalachian Ohio is often sold to landfills for constructing Recompacted Soil Liners (RSL). Since clayey soils possess low hydraulic conductivity, the suitability of mined clay for RSL in Ohio is first assessed by determining its clay content. When soil samples are tested in a laboratory, the same engineering properties are typically expected for soils originating from the same source, provided that the testing techniques applied are standard; however, mined clay from Appalachian Ohio has shown drastic differences in particle size distribution depending on the sampling and/or laboratory processing methods. Sometimes more than a 10 percent decrease in clay content is observed in samples collected at the stockpiles compared to those collected through reverse circulation drilling. This discrepancy poses a challenge to geotechnical engineers who work on the prequalification process of RSL material, as it can result in misleading estimates of the hydraulic conductivity of the samples. This paper describes a laboratory investigation conducted on mined clay from Appalachian Ohio to determine how and why the standard sampling and/or processing methods can affect the grain-size distributions. The variation in clay content was determined to be due to heavy concentrations of shale fragments in the clayey soils. It was also concluded that, in order to obtain reliable grain-size distributions from samples collected at a stockpile of mined clay, the material needs to be processed using a soil grinder. Otherwise, the samples should be collected through drilling. PMID:21845150

  15. DNA profiles from clothing fibers using direct PCR.

    PubMed

    Blackie, Renée; Taylor, Duncan; Linacre, Adrian

    2016-09-01

    We report on the successful use of direct PCR amplification of single fibers from items of worn clothing. Items of clothing were worn throughout the course of a day, with the individual commencing regular activities. Single fibers were taken from the cuff of the clothing at regular intervals and amplified directly. The same areas were subjected to tape-lifting, and also amplified directly for comparison. The NGM™ kit that amplifies 15 STR loci plus amelogenin was used. A total of 35 single fiber samples were processed and analyzed from five items of clothing, with 81 % of samples returning a profile of 14 alleles or more. All tape-lift samples amplified directly produced DNA profiles of 15 alleles or more. The aim was to develop a simple, operational method that could be used routinely in forensic science casework and that has the potential to generate more complete profiles, which would not be detected using standard extraction methods on this type of sample. For ease of implementation, the process also adheres to standard methods with no increase in the cycle number.

  16. Transient Method for Determining Indoor Chemical Concentrations Based on SPME: Model Development and Calibration.

    PubMed

    Cao, Jianping; Xiong, Jianyin; Wang, Lixin; Xu, Ying; Zhang, Yinping

    2016-09-06

    Solid-phase microextraction (SPME) is regarded as a nonexhaustive sampling technique with a smaller extraction volume and a shorter extraction time than traditional sampling techniques and is hence widely used. The SPME sampling process is affected by the convection or diffusion effect along the coating surface, but this factor has seldom been studied. This paper derives an analytical model to characterize SPME sampling for semivolatile organic compounds (SVOCs) as well as for volatile organic compounds (VOCs) by considering the surface mass transfer process. Using this model, the chemical concentrations in a sample matrix can be conveniently calculated. In addition, the model can be used to determine the characteristic parameters (partition coefficient and diffusion coefficient) for typical SPME chemical samplings (SPME calibration). Experiments using SPME samplings of two typical SVOCs, dibutyl phthalate (DBP) in sealed chamber and di(2-ethylhexyl) phthalate (DEHP) in ventilated chamber, were performed to measure the two characteristic parameters. The experimental results demonstrated the effectiveness of the model and calibration method. Experimental data from the literature (VOCs sampled by SPME) were used to further validate the model. This study should prove useful for relatively rapid quantification of concentrations of different chemicals in various circumstances with SPME.
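    The way such a model lets concentrations "be conveniently calculated" can be shown with a one-compartment, first-order uptake sketch: the extracted mass approaches the equilibrium partitioning amount with a time constant set by surface mass transfer, and inverting the expression recovers the matrix concentration from a timed, non-equilibrium extraction. All parameter values below are invented for illustration and are not the paper's calibrated coefficients:

```python
import numpy as np

K = 1000.0      # assumed fiber/matrix partition coefficient (dimensionless)
Vf = 1e-4       # assumed fiber coating volume, mL
C_air = 2.0     # assumed chemical concentration in the sample matrix, ng/mL
tau = 300.0     # assumed uptake time constant, s (set by surface mass transfer)

def mass_on_fiber(t):
    """First-order uptake toward the equilibrium partitioning amount."""
    m_eq = K * Vf * C_air                  # ng extracted at equilibrium
    return m_eq * (1 - np.exp(-t / tau))

# Inverting the model: estimate the matrix concentration from a timed,
# non-equilibrium extraction (t much shorter than full equilibration).
t_sample = 120.0
m = mass_on_fiber(t_sample)
C_est = m / (K * Vf * (1 - np.exp(-t_sample / tau)))
```

    The calibration step in the paper amounts to determining the analogues of K and tau for each chemical and geometry, after which a single timed extraction yields the concentration.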

  17. In vitro starch digestibility, pasting and textural properties of mung bean: effect of different processing methods.

    PubMed

    Kaur, Maninder; Sandhu, Kawaljit Singh; Ahlawat, RavinderPal; Sharma, Somesh

    2015-03-01

    Mung bean was subjected to different processing conditions (soaking, germination, cooking, and autoclaving), and the textural, pasting, and in vitro starch digestibility characteristics of the processed beans were studied. A significant reduction in textural properties (hardness, cohesiveness, gumminess, and chewiness) was observed after the cooking and autoclaving treatments. Flours made from differently processed mung bean showed significant differences (P < 0.05) in their pasting characteristics. Peak and final viscosity were highest for flour from germinated mung bean, whereas flour made from autoclaved mung bean showed the lowest values. In vitro starch digestibility of the mung bean flours was assessed enzymatically using a modified Englyst method, and the parameters studied were readily digestible starch (RDS), slowly digestible starch (SDS), resistant starch (RS), and total starch (TS) content. The various processing treatments increased the RDS content of mung bean, while the SDS content was highest for the soaked and lowest for the autoclaved sample. The germinated sample showed a higher amount of digestible starch (RDS + SDS) than the raw and soaked samples. Flours from raw and soaked samples showed significantly low starch hydrolysis rates at all temperatures, with total hydrolysis of 29.9 and 31.2 %, respectively, at 180 min, whereas cooked and autoclaved samples showed high hydrolysis rates, with 50.2 and 53.8 % hydrolyzing within 30 min.

  18. Study on the Application of the Combination of TMD Simulation and Umbrella Sampling in PMF Calculation for Molecular Conformational Transitions

    PubMed Central

    Wang, Qing; Xue, Tuo; Song, Chunnian; Wang, Yan; Chen, Guangju

    2016-01-01

    Free energy calculations of the potential of mean force (PMF), based on the combination of targeted molecular dynamics (TMD) simulations and umbrella sampling as a function of physical coordinates, have been applied to explore the detailed pathways and the corresponding free energy profiles for the conformational transition processes of the butane molecule and the 35-residue villin headpiece subdomain (HP35). Accurate PMF profiles describing the dihedral rotation of butane under both the dihedral-rotation coordinate and the root mean square deviation (RMSD) coordinate were obtained from different umbrella samplings based on the same TMD simulations. The initial structures for the umbrella samplings can be conveniently selected from the TMD trajectories. When this computational method is applied to the unfolding process of the HP35 protein, the PMF calculated along the radius-of-gyration (Rg) coordinate shows a gradual increase in free energy of about 1 kcal/mol, with energy fluctuations. The conformational transition during the unfolding of HP35 shows that the spherical structure extends and the middle α-helix unfolds first, followed by the unfolding of the other α-helices. This computational approach to PMF calculation, based on the combination of TMD simulations and umbrella sampling, provides a valuable strategy for investigating detailed conformational transition pathways for other allosteric processes. PMID:27171075

  19. Influence of heat processing on the volatile organic compounds and microbial diversity of salted and vacuum-packaged silver carp (Hypophthalmichthys molitrix) fillets during storage.

    PubMed

    Li, Dongping; Zhang, Jingbin; Song, Sijia; Feng, Ligeng; Luo, Yongkang

    2018-06-01

    Ready-to-eat products have become popular among busy urban consumers. Heat processing combined with vacuum-packaging is one of the most common methods of making ready-to-eat products with an extended shelf-life. In this study, the influence of heat processing [80 °C (LT) and 98 °C (HT) in a water bath] on the quality of salted and vacuum-packaged silver carp (Hypophthalmichthys molitrix) fillets, stored at 20 ± 1 °C, was investigated by sensory analysis, biochemical analysis, and microbial diversity. SPME-GC/MS indicated the presence of 27 volatile organic compounds (VOCs) in the fillets; the major VOCs were aldehydes and alcohols. Acids tended to increase during storage and caused a fetid odor at the end of storage. A culture-dependent method indicated that Bacillus dominated the spoiled LT and HT samples. In addition, Bacillus was identified by high-throughput sequencing as the main spoiler of deteriorated heated fillets. Sphingomonas and Brevibacillus dominated the indigenous bacteria of fresh raw fillets. After heat processing, LT samples exhibited higher organoleptic quality than HT samples on day 0, while HT samples showed an extended shelf-life during storage at 20 °C compared to LT samples. Copyright © 2017. Published by Elsevier Ltd.

  20. Simultaneous inference of phylogenetic and transmission trees in infectious disease outbreaks

    PubMed Central

    2017-01-01

    Whole-genome sequencing of pathogens from host samples becomes more and more routine during infectious disease outbreaks. These data provide information on possible transmission events which can be used for further epidemiologic analyses, such as identification of risk factors for infectivity and transmission. However, the relationship between transmission events and sequence data is obscured by uncertainty arising from four largely unobserved processes: transmission, case observation, within-host pathogen dynamics and mutation. To properly resolve transmission events, these processes need to be taken into account. Recent years have seen much progress in theory and method development, but existing applications make simplifying assumptions that often break up the dependency between the four processes, or are tailored to specific datasets with matching model assumptions and code. To obtain a method with wider applicability, we have developed a novel approach to reconstruct transmission trees with sequence data. Our approach combines elementary models for transmission, case observation, within-host pathogen dynamics, and mutation, under the assumption that the outbreak is over and all cases have been observed. We use Bayesian inference with MCMC for which we have designed novel proposal steps to efficiently traverse the posterior distribution, taking account of all unobserved processes at once. This allows for efficient sampling of transmission trees from the posterior distribution, and robust estimation of consensus transmission trees. We implemented the proposed method in a new R package phybreak. The method performs well in tests of both new and published simulated data. We apply the model to five datasets on densely sampled infectious disease outbreaks, covering a wide range of epidemiological settings. Using only sampling times and sequences as data, our analyses confirmed the original results or improved on them: the more realistic infection times place more confidence in the inferred transmission trees. PMID:28545083
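    The Bayesian MCMC machinery referred to above can be illustrated, far more simply than in the phybreak model, with a random-walk Metropolis sampler on a toy Poisson-rate posterior. The data, proposal width, and chain length are arbitrary, and phybreak's actual proposal steps operate on trees rather than scalars:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy data: observed case counts, assumed Poisson with unknown rate lam
data = np.array([3, 4, 2, 5, 3, 4, 6, 2])

def log_post(lam):
    """Log-posterior for the Poisson rate with a flat prior on lam > 0."""
    if lam <= 0:
        return -np.inf
    return float(np.sum(data * np.log(lam) - lam))   # log-likelihood up to a constant

# Random-walk Metropolis: the simplest MCMC proposal scheme
lam, chain = 1.0, []
for _ in range(20000):
    prop = lam + 0.3 * rng.normal()                  # symmetric proposal
    if np.log(rng.random()) < log_post(prop) - log_post(lam):
        lam = prop                                   # accept; otherwise keep lam
    chain.append(lam)

posterior_mean = np.mean(chain[5000:])               # discard burn-in
```

    Designing proposal steps that move efficiently through the space of transmission trees, rather than a single rate, is precisely the nontrivial contribution the abstract describes.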

  1. Simultaneous inference of phylogenetic and transmission trees in infectious disease outbreaks.

    PubMed

    Klinkenberg, Don; Backer, Jantien A; Didelot, Xavier; Colijn, Caroline; Wallinga, Jacco

    2017-05-01

    Whole-genome sequencing of pathogens from host samples becomes more and more routine during infectious disease outbreaks. These data provide information on possible transmission events which can be used for further epidemiologic analyses, such as identification of risk factors for infectivity and transmission. However, the relationship between transmission events and sequence data is obscured by uncertainty arising from four largely unobserved processes: transmission, case observation, within-host pathogen dynamics and mutation. To properly resolve transmission events, these processes need to be taken into account. Recent years have seen much progress in theory and method development, but existing applications make simplifying assumptions that often break up the dependency between the four processes, or are tailored to specific datasets with matching model assumptions and code. To obtain a method with wider applicability, we have developed a novel approach to reconstruct transmission trees with sequence data. Our approach combines elementary models for transmission, case observation, within-host pathogen dynamics, and mutation, under the assumption that the outbreak is over and all cases have been observed. We use Bayesian inference with MCMC for which we have designed novel proposal steps to efficiently traverse the posterior distribution, taking account of all unobserved processes at once. This allows for efficient sampling of transmission trees from the posterior distribution, and robust estimation of consensus transmission trees. We implemented the proposed method in a new R package phybreak. The method performs well in tests of both new and published simulated data. We apply the model to five datasets on densely sampled infectious disease outbreaks, covering a wide range of epidemiological settings. 
Using only sampling times and sequences as data, our analyses confirmed the original results or improved on them: the more realistic infection times place more confidence in the inferred transmission trees.
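The core idea, Bayesian MCMC over who-infected-whom where fewer mutations between a case and its infector mean higher likelihood, can be illustrated far more crudely than phybreak does it. The toy Metropolis sampler below is not the package's algorithm: the smaller-label ordering constraint (a stand-in for infection-time ordering) and the exponential mutation penalty are simplifying assumptions, and the sequences are made up.

```python
import numpy as np

def sample_transmission_trees(seqs, n_iter=5000, lam=1.0, seed=0):
    """Toy Metropolis sampler over infector assignments.

    Case 0 is the index case; case i may only be infected by a case with a
    smaller label, which keeps every assignment a valid tree. Fewer
    mutations between infector and infectee mean higher likelihood.
    """
    rng = np.random.default_rng(seed)
    n = len(seqs)
    # pairwise Hamming distances between the sequences
    d = np.array([[sum(a != b for a, b in zip(s, t)) for t in seqs]
                  for s in seqs])
    tree = [0] * n                       # start: everyone infected by case 0
    counts = np.zeros((n, n))            # posterior tally: counts[j, i]
    for _ in range(n_iter):
        i = rng.integers(1, n)           # pick a non-index case
        j = rng.integers(0, i)           # propose an earlier case as infector
        delta = -lam * (d[j, i] - d[tree[i], i])
        if np.log(rng.random()) < delta:  # Metropolis accept/reject
            tree[i] = j
        for k in range(1, n):
            counts[tree[k], k] += 1
    return counts / n_iter               # per-case infector probabilities

# consensus tree: most probable infector for each non-index case
probs = sample_transmission_trees(["AAAA", "AAAT", "AATT", "TTTT"])
print(probs.argmax(axis=0)[1:])
```

With these sequences the chain concentrates on the transmission chain 0 → 1 → 2 → 3, because each step adds the fewest mutations.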

  2. Determination of aflatoxins in by-products of industrial processing of cocoa beans.

    PubMed

    Copetti, Marina V; Iamanaka, Beatriz T; Pereira, José Luiz; Lemes, Daniel P; Nakano, Felipe; Taniwaki, Marta H

    2012-01-01

    This study has examined the occurrence of aflatoxins in 168 samples of different fractions obtained during the processing of cocoa in manufacturing plants (shell, nibs, mass, butter, cake and powder) using an optimised methodology for cocoa by-products. The method validation was based on selectivity, linearity, limit of detection and recovery. The method was shown to be adequate for use in quantifying the contamination of cocoa by aflatoxins B(1), B(2), G(1) and G(2). Furthermore, the method was easier to use than other methods available in the literature. For aflatoxin extraction from cocoa samples, a methanol-water solution was used, and then immunoaffinity columns were employed for clean-up before the determination by high-performance liquid chromatography. A survey demonstrated a widespread occurrence of aflatoxins in cocoa by-products, although in general the levels of aflatoxins present in the fractions from industrial processing of cocoa were low. A maximum aflatoxin contamination of 13.3 ng g(-1) was found in a nib sample. The lowest contamination levels were found in cocoa butter. Continued monitoring of aflatoxins in cocoa by-products is nevertheless necessary because these toxins have a high toxicity to humans and cocoa is widely consumed by children through cocoa-containing products, like candies.

  3. Remote Sensing Image Quality Assessment Experiment with Post-Processing

    NASA Astrophysics Data System (ADS)

    Jiang, W.; Chen, S.; Wang, X.; Huang, Q.; Shi, H.; Man, Y.

    2018-04-01

This paper describes a post-processing influence assessment experiment comprising three steps: physical simulation, image processing, and image quality assessment. The physical simulation models a sampled imaging system in the laboratory; the imaging system parameters are measured, and the digital images serving as input to the image processing are produced by this imaging system under those same parameters. The gathered optical sampled images are then processed by three digital image processes: calibration pre-processing, lossy compression at different compression ratios, and image post-processing with different kernels. Image quality is assessed by just-noticeable-difference (JND) subjective assessment based on ISO 20462; through subjective assessment of the gathered and processed images, the influence of the different imaging parameters and of post-processing on image quality can be determined. The six JND subjective assessment datasets validate each other. The main conclusions are: image post-processing can improve image quality; it can do so even with lossy compression, although image quality improves less at higher compression ratios than at lower ones; and with our post-processing method, image quality is better when the camera MTF lies within a small range.

  4. Period Estimation for Sparsely-sampled Quasi-periodic Light Curves Applied to Miras

    NASA Astrophysics Data System (ADS)

    He, Shiyuan; Yuan, Wenlong; Huang, Jianhua Z.; Long, James; Macri, Lucas M.

    2016-12-01

We develop a nonlinear semi-parametric Gaussian process model to estimate periods of Miras with sparsely sampled light curves. The model uses a sinusoidal basis for the periodic variation and a Gaussian process for the stochastic changes. We use maximum likelihood to estimate the period and the parameters of the Gaussian process, while integrating out the effects of other nuisance parameters in the model with respect to a suitable prior distribution obtained from earlier studies. Since the likelihood is highly multimodal in period, we implement a hybrid method that applies the quasi-Newton algorithm to the Gaussian process parameters and searches the period/frequency parameter space over a dense grid. A large-scale, high-fidelity simulation is conducted to mimic the sampling quality of Mira light curves obtained by the M33 Synoptic Stellar Survey. The simulated data set is publicly available and can serve as a testbed for future evaluation of different period estimation methods. The semi-parametric model outperforms an existing algorithm on this simulated test data set, as measured by period recovery rate and by the quality of the resulting period-luminosity relations.
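The dense grid search over period can be illustrated without the Gaussian-process term: at each trial period, fit a sinusoid by linear least squares and keep the period with the smallest residual. A minimal sketch with simulated sparse data (all numbers illustrative, not the paper's model):

```python
import numpy as np

def estimate_period(t, y, periods):
    """Least-squares sinusoid fit over a dense period grid.

    A stripped-down stand-in for the paper's hybrid search: at each trial
    period, fit y ~ a*sin + b*cos + c by linear least squares and keep the
    period with the smallest residual (the full model adds a Gaussian
    process for the stochastic variation).
    """
    best_p, best_rss = None, np.inf
    for p in periods:
        w = 2 * np.pi / p
        X = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ coef) ** 2)
        if rss < best_rss:
            best_p, best_rss = p, rss
    return best_p

# sparsely sampled light curve with a true 250-day period
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 2000, 40))
y = 2.0 * np.sin(2 * np.pi * t / 250.0) + rng.normal(0, 0.2, t.size)
grid = np.arange(100.0, 500.0, 0.5)
print(estimate_period(t, y, grid))  # close to 250
```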

  5. Homogenous Nucleation and Crystal Growth in a Model Liquid from Direct Energy Landscape Sampling Simulation

    NASA Astrophysics Data System (ADS)

    Walter, Nathan; Zhang, Yang

Nucleation and crystal growth are understood to be activated processes involving the crossing of free-energy barriers. Attempts to capture the entire crystallization process over long timescales with molecular dynamics simulations have met major obstacles because of molecular dynamics' temporal constraints. Herein, we circumvent this temporal limitation by using a brute-force, metadynamics-like, adaptive basin-climbing algorithm to directly sample the free-energy landscape of a model liquid argon. The algorithm biases the system to evolve from an amorphous, liquid-like structure towards an FCC crystal through inherent structures, and then traces back the energy barriers. Consequently, the sampled timescale is macroscopically long. We observe that the formation of a crystal involves two processes, each with a unique temperature-dependent energy barrier. One barrier corresponds to crystal nucleus formation; the other to crystal growth. We find the two processes dominate in different temperature regimes. Compared to other computational techniques, our method requires no assumptions about the shape or chemical potential of the critical crystal nucleus. The success of this method is encouraging for studying the crystallization of more complex

  6. Inverse analysis of water profile in starch by non-contact photopyroelectric method

    NASA Astrophysics Data System (ADS)

    Frandas, A.; Duvaut, T.; Paris, D.

    2000-07-01

The photopyroelectric (PPE) method in a non-contact configuration was proposed to study water migration in starch sheets used for biodegradable packaging. A 1-D theoretical model was developed, allowing the study of samples having a water profile characterized by an arbitrary continuous function. An experimental setup was designed for this purpose, which included the choice of excitation source, detection of signals, signal and data processing, and cells for conditioning the samples. We report here the development of an inversion procedure allowing for the determination of the parameters that influence the PPE signal. This procedure led to the optimization of experimental conditions in order to identify the parameters related to the water profile in the sample, and to monitor the dynamics of the process.

  7. Permeation-solid adsorbent sampling and GC analysis of formaldehyde.

    PubMed

    Muntuta-Kinyanta, C; Hardy, J K

    1991-12-01

A passive method with membrane permeation sampling for the determination of time-weighted-average (TWA) concentrations of formaldehyde in air is described. The sampling device was constructed by affixing an unbacked dimethyl silicone membrane to the base of a glass tube and by sealing the top with a rubber stopper. Formaldehyde permeates the membrane and reacts with 2-(hydroxymethyl)piperidine (2-HMP) coated on the surface of XAD-2. Sampling times from 15 min to 8 hr have been used. The formaldehyde-oxazolidine produced is thermally desorbed and determined by a packed-column gas chromatograph equipped with a flame ionization detector (FID). The response of the monitor is directly proportional to the external concentration of formaldehyde over the range 0.050-100 ppm. The permeation constant (the slope of the permeation curve) of the membrane is 0.333 μg ppm(-1) hr(-1), and the detection limit of the method is 0.03 ppm for an 8-hr sampling period. Relative humidity (35-94%), temperature (0-82 degrees) and storage period (0-25 days) do not affect the permeation process for sample collection. Moreover, potential chemical interferents (10 ppm of acetone or acrolein) have no detectable effect on the process. The method gives the TWA concentration directly from the measurements, and the equipment is economical and convenient for personal or multi-location sample collection.

  8. Measurements of airborne methylene diphenyl diisocyanate (MDI) concentration in the U.S. workplace.

    PubMed

    Booth, Karroll; Cummings, Barbara; Karoly, William J; Mullins, Sharon; Robert, William P; Spence, Mark; Lichtenberg, Fran W; Banta, J

    2009-04-01

    This article summarizes a large body of industry air sampling data (8134 samples) in which airborne MDI concentrations were measured in a wide variety of manufacturing processes that use either polymeric MDI (PMDI) or monomeric (pure) MDI. Data were collected during the period 1984 through 1999. A total of 606 surveys were conducted for 251 companies at 317 facilities. The database includes 3583 personal (breathing zone) samples and 4551 area samples. Data demonstrate that workplace airborne MDI concentrations are extremely low in a majority of the manufacturing operations. Most (74.6%) of the airborne MDI concentrations measured in the personal samples were nondetectable, i.e., below the limits of quantification (LOQs). A variety of validated industrial hygiene sampling/analytical methods were used for data collection; most are modifications of OSHA Method 47. The LOQs for these methods ranged from 0.1-0.5 microg/sample. The very low vapor pressures of both monomeric MDI and PMDI largely explain the low airborne concentrations found in most operations. However, processes or applications in which the chemical is sprayed or heated may result in higher airborne concentrations and higher exposure potentials if appropriate control measures are not implemented. Data presented in this article will be a useful reference for employers in helping them to manage their health and safety program as it relates to respiratory protection during MDI/PMDI applications.

  9. Unsupervised Learning —A Novel Clustering Method for Rolling Bearing Faults Identification

    NASA Astrophysics Data System (ADS)

    Kai, Li; Bo, Luo; Tao, Ma; Xuefeng, Yang; Guangming, Wang

    2017-12-01

To process massive fault data promptly and provide accurate diagnosis results automatically, numerous studies have been conducted on intelligent fault diagnosis of rolling bearings. Among these studies, supervised learning methods such as artificial neural networks, support vector machines, and decision trees are commonly used. These methods can detect rolling bearing failures effectively, but achieving good detection results often requires a large number of training samples. On this basis, a novel clustering method is proposed in this paper. The method is able to find the correct number of clusters automatically, and its effectiveness is validated using datasets from rolling element bearings. The diagnosis results show that the proposed method can accurately detect the fault types of small samples, while the diagnosis accuracy remains relatively high even for massive samples.
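The abstract does not spell out the clustering algorithm, so the following is only one standard way to find the number of clusters automatically: k-means scored by the Calinski-Harabasz index, applied to synthetic two-dimensional fault features. A sketch, not the authors' method; all data are made up.

```python
import numpy as np

def kmeans(X, k, rng, n_iter=50):
    # plain k-means; empty clusters are reseeded from random data points
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else X[rng.integers(len(X))] for j in range(k)])
    labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
    return labels, centers

def auto_clusters(X, k_max=6, restarts=5, seed=0):
    """Pick the cluster count maximizing the Calinski-Harabasz index."""
    rng = np.random.default_rng(seed)
    best_k, best_ch = 2, -np.inf
    overall, n = X.mean(0), len(X)
    for k in range(2, k_max + 1):
        for _ in range(restarts):        # restarts guard against bad inits
            labels, centers = kmeans(X, k, rng)
            between = sum((labels == j).sum() * ((centers[j] - overall) ** 2).sum()
                          for j in range(k))
            within = sum(((X[labels == j] - centers[j]) ** 2).sum()
                         for j in range(k))
            ch = (between / (k - 1)) / (within / (n - k))
            if ch > best_ch:
                best_k, best_ch = k, ch
    return best_k

# three well-separated fault "signatures" in a 2-D feature space
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(c, 0.3, (40, 2)) for c in ([0, 0], [5, 0], [0, 5])])
print(auto_clusters(X))
```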

  10. The biospeckle method for the investigation of agricultural crops: A review

    NASA Astrophysics Data System (ADS)

    Zdunek, Artur; Adamiak, Anna; Pieczywek, Piotr M.; Kurenda, Andrzej

    2014-01-01

Biospeckle is a nondestructive method for the evaluation of living objects. It has been applied in medicine, agriculture, and microbiology for monitoring processes related to the movement of material particles. Recently, this method has been used extensively for evaluating the quality of agricultural crops. In the case of botanical materials, the sources of apparent biospeckle activity are Brownian motion and biological processes such as cyclosis, growth, and transport. Several applications have been demonstrated for monitoring the aging and maturation of samples, organ development, and the detection and development of defects and diseases. This review focuses on three aspects: the image analysis and mathematical methods for evaluating biospeckle activity; published applications to botanical samples, with special attention to agricultural crops; and interpretation of the phenomena from a biological point of view.

  11. Effective PCR detection of animal species in highly processed animal byproducts and compound feeds.

    PubMed

    Fumière, Olivier; Dubois, Marc; Baeten, Vincent; von Holst, Christoph; Berben, Gilbert

    2006-07-01

    In this paper we present a polymerase chain reaction (PCR)-based method for detecting meat and bone meal (MBM) in compound feedingstuffs. By choosing adequate DNA targets from an appropriate localisation in the genome, the real-time PCR method developed here proved to be robust to severe heat treatment of the MBM, showing high sensitivity in the detection of MBM. The method developed here permits the specific detection of processed pig and cattle materials treated at 134 degrees C in various feed matrices down to a limit of detection of about 0.1%. This technique has also been successfully applied to well-characterised MBM samples heated to as high as 141 degrees C, as well as to various blind feed samples with very low MBM contents. Finally, the method also passed several official European ring trials.

  12. Quantification of short chain amines in aqueous matrices using liquid chromatography electrospray ionization tandem mass spectrometry.

    PubMed

    Viidanoja, Jyrki

    2017-01-13

A new liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) method was developed for the determination of more than 20 C1-C6 alkyl and alkanolamines in aqueous matrices. The method employs hydrophilic interaction liquid chromatography with multiple reaction monitoring (HILIC-MRM) on a ZIC-pHILIC column, with four stable-isotope-labeled amines as internal standards for signal normalization and quantification of the amines. The method was validated using a refinery process water sample obtained from a cooling cycle of crude oil distillation. The averaged within-run precision, between-run precision, and accuracy were generally within 2-10%, 1-9%, and 80-120%, respectively, depending on the analyte and concentration level. Selected aqueous process samples were analyzed with the method. Copyright © 2016 Elsevier B.V. All rights reserved.
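Internal-standard quantification of this kind reduces to calibrating the analyte/internal-standard peak-area ratio against known concentrations. A minimal sketch with made-up peak areas and concentrations (none of these numbers are from the paper):

```python
import numpy as np

def quantify(area_analyte, area_is, conc_is, rrf):
    """Isotope-dilution style quantification: the analyte/internal-standard
    area ratio, scaled by the spiked IS concentration and a relative
    response factor (RRF) from calibration, gives the analyte concentration."""
    return (area_analyte / area_is) * conc_is / rrf

# hypothetical calibration standards give the relative response factor:
ratios = np.array([0.21, 0.52, 1.05, 2.08])   # analyte/IS area ratios
concs = np.array([1.0, 2.5, 5.0, 10.0])       # standard concs, mg/L (IS at 5 mg/L)
rrf = np.polyfit(concs / 5.0, ratios, 1)[0]   # slope of ratio vs conc ratio

# quantify an unknown sample from its measured area ratio
print(round(quantify(0.83, 1.0, 5.0, rrf), 2))
```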

  13. Sample Size for Tablet Compression and Capsule Filling Events During Process Validation.

    PubMed

    Charoo, Naseem Ahmad; Durivage, Mark; Rahman, Ziyaur; Ayad, Mohamad Haitham

    2017-12-01

    During solid dosage form manufacturing, the uniformity of dosage units (UDU) is ensured by testing samples at 2 stages, that is, blend stage and tablet compression or capsule/powder filling stage. The aim of this work is to propose a sample size selection approach based on quality risk management principles for process performance qualification (PPQ) and continued process verification (CPV) stages by linking UDU to potential formulation and process risk factors. Bayes success run theorem appeared to be the most appropriate approach among various methods considered in this work for computing sample size for PPQ. The sample sizes for high-risk (reliability level of 99%), medium-risk (reliability level of 95%), and low-risk factors (reliability level of 90%) were estimated to be 299, 59, and 29, respectively. Risk-based assignment of reliability levels was supported by the fact that at low defect rate, the confidence to detect out-of-specification units would decrease which must be supplemented with an increase in sample size to enhance the confidence in estimation. Based on level of knowledge acquired during PPQ and the level of knowledge further required to comprehend process, sample size for CPV was calculated using Bayesian statistics to accomplish reduced sampling design for CPV. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
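The quoted sample sizes follow directly from the Bayes success-run theorem, n ≥ ln(1 − C)/ln(R), with confidence C = 95% and the risk-based reliability levels R named in the abstract:

```python
import math

def success_run_sample_size(confidence, reliability):
    """Success-run theorem: the smallest n of consecutive passing units
    such that 1 - reliability**n >= confidence, i.e. a zero-failure
    sampling plan with n >= ln(1 - C) / ln(R)."""
    return math.ceil(math.log(1 - confidence) / math.log(reliability))

# risk-based reliability levels from the abstract, at 95% confidence
for risk, R in [("high", 0.99), ("medium", 0.95), ("low", 0.90)]:
    print(risk, success_run_sample_size(0.95, R))
# reproduces the abstract's PPQ sample sizes: 299, 59, and 29
```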

  14. Using Language Sample Analysis to Assess Spoken Language Production in Adolescents

    ERIC Educational Resources Information Center

    Miller, Jon F.; Andriacchi, Karen; Nockerts, Ann

    2016-01-01

    Purpose: This tutorial discusses the importance of language sample analysis and how Systematic Analysis of Language Transcripts (SALT) software can be used to simplify the process and effectively assess the spoken language production of adolescents. Method: Over the past 30 years, thousands of language samples have been collected from typical…

  15. Generalization bounds of ERM-based learning processes for continuous-time Markov chains.

    PubMed

    Zhang, Chao; Tao, Dacheng

    2012-12-01

    Many existing results on statistical learning theory are based on the assumption that samples are independently and identically distributed (i.i.d.). However, the assumption of i.i.d. samples is not suitable for practical application to problems in which samples are time dependent. In this paper, we are mainly concerned with the empirical risk minimization (ERM) based learning process for time-dependent samples drawn from a continuous-time Markov chain. This learning process covers many kinds of practical applications, e.g., the prediction for a time series and the estimation of channel state information. Thus, it is significant to study its theoretical properties including the generalization bound, the asymptotic convergence, and the rate of convergence. It is noteworthy that, since samples are time dependent in this learning process, the concerns of this paper cannot (at least straightforwardly) be addressed by existing methods developed under the sample i.i.d. assumption. We first develop a deviation inequality for a sequence of time-dependent samples drawn from a continuous-time Markov chain and present a symmetrization inequality for such a sequence. By using the resultant deviation inequality and symmetrization inequality, we then obtain the generalization bounds of the ERM-based learning process for time-dependent samples drawn from a continuous-time Markov chain. Finally, based on the resultant generalization bounds, we analyze the asymptotic convergence and the rate of convergence of the learning process.

  16. A multiplex PCR method for the identification of commercially important salmon and trout species (Oncorhynchus and Salmo) in North America.

    PubMed

    Rasmussen Hellberg, Rosalee S; Morrissey, Michael T; Hanner, Robert H

    2010-09-01

    The purpose of this study was to develop a species-specific multiplex polymerase chain reaction (PCR) method that allows for the detection of salmon species substitution on the commercial market. Species-specific primers and TaqMan® probes were developed based on a comprehensive collection of mitochondrial 5' cytochrome c oxidase subunit I (COI) deoxyribonucleic acid (DNA) "barcode" sequences. Primers and probes were combined into multiplex assays and tested for specificity against 112 reference samples representing 25 species. Sensitivity and linearity tests were conducted using 10-fold serial dilutions of target DNA (single-species samples) and DNA admixtures containing the target species at levels of 10%, 1.0%, and 0.1% mixed with a secondary species. The specificity tests showed positive signals for the target DNA in both real-time and conventional PCR systems. Nonspecific amplification in both systems was minimal; however, false positives were detected at low levels (1.2% to 8.3%) in conventional PCR. Detection levels were similar for admixtures and single-species samples based on a 30 PCR cycle cut-off, with limits of 0.25 to 2.5 ng (1% to 10%) in conventional PCR and 0.05 to 5.0 ng (0.1% to 10%) in real-time PCR. A small-scale test with food samples showed promising results, with species identification possible even in heavily processed food items. Overall, this study presents a rapid, specific, and sensitive method for salmon species identification that can be applied to mixed-species and heavily processed samples in either conventional or real-time PCR formats. This study provides a newly developed method for salmon and trout species identification that will assist both industry and regulatory agencies in the detection and prevention of species substitution. This multiplex PCR method allows for rapid, high-throughput species identification even in heavily processed and mixed-species samples. 
An inter-laboratory study is currently being carried out to assess the ability of this method to identify species in a variety of commercial salmon and trout products.

  17. Implementation of unsteady sampling procedures for the parallel direct simulation Monte Carlo method

    NASA Astrophysics Data System (ADS)

    Cave, H. M.; Tseng, K.-C.; Wu, J.-S.; Jermy, M. C.; Huang, J.-C.; Krumdieck, S. P.

    2008-06-01

An unsteady sampling routine for a general parallel direct simulation Monte Carlo method called PDSC is introduced, allowing the simulation of time-dependent flow problems in the near-continuum range. A post-processing procedure called the DSMC rapid ensemble averaging method (DREAM) is developed to improve the statistical scatter in the results while minimising both memory and simulation time. This method builds an ensemble average of repeated runs over a small number of sampling intervals prior to the sampling point of interest by restarting the flow using either a Maxwellian distribution based on macroscopic properties, for near-equilibrium flows (DREAM-I), or the instantaneous particle data output by the original unsteady sampling of PDSC, for strongly non-equilibrium flows (DREAM-II). The method is validated by simulating shock tube flow and the development of simple Couette flow. Unsteady PDSC is found to predict the flow field accurately in both cases, with significantly reduced run-times compared to single-processor code, and DREAM greatly reduces the statistical scatter in the results while maintaining accurate particle velocity distributions. Simulations are then conducted of two applications involving the interaction of shocks over wedges. The results of these simulations are compared to experimental data and to simulations from the literature where these are available. In general, it was found that 10 ensembled runs of DREAM processing could reduce the statistical uncertainty in the raw PDSC data by 2.5-3.3 times, based on the limited number of cases in the present study.
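The roughly √N scatter reduction from ensemble averaging N independent runs can be checked with a toy stand-in for the DSMC flow field (this is the generic statistics of ensemble averaging, not the DREAM restart machinery; all numbers are illustrative):

```python
import numpy as np

def scatter_reduction(n_runs=10, n_cells=200, noise=1.0, seed=3):
    """Ensemble averaging of repeated runs, as in DREAM: averaging N
    statistically independent runs cuts the scatter by about sqrt(N),
    ~3.2 for the 10 runs used in the paper, bracketing the reported
    2.5-3.3x reduction."""
    rng = np.random.default_rng(seed)
    truth = np.sin(np.linspace(0, 2 * np.pi, n_cells))  # stand-in flow field
    runs = truth + rng.normal(0, noise, (n_runs, n_cells))
    single_rms = np.sqrt(np.mean((runs[0] - truth) ** 2))
    ensemble_rms = np.sqrt(np.mean((runs.mean(0) - truth) ** 2))
    return single_rms / ensemble_rms

print(round(scatter_reduction(), 2))  # roughly sqrt(10)
```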

  18. Molecular detection of Toxoplasma gondii in water samples from Scotland and a comparison between the 529bp real-time PCR and ITS1 nested PCR.

    PubMed

    Wells, Beth; Shaw, Hannah; Innocent, Giles; Guido, Stefano; Hotchkiss, Emily; Parigi, Maria; Opsteegh, Marieke; Green, James; Gillespie, Simon; Innes, Elisabeth A; Katzer, Frank

    2015-12-15

Waterborne transmission of Toxoplasma gondii is a potential public health risk and there are currently no agreed optimised methods for the recovery, processing and detection of T. gondii oocysts in water samples. In this study modified methods of T. gondii oocyst recovery and DNA extraction were applied to 1427 samples collected from 147 public water supplies throughout Scotland. T. gondii DNA was detected, using real-time PCR (qPCR) targeting the 529bp repeat element, in 8.79% of interpretable samples (124 out of 1411 samples). The samples which were positive for T. gondii DNA originated from a third of the sampled water sources. The samples which were positive by qPCR, and some of the negative samples, were reanalysed using ITS1 nested PCR (nPCR) and the results compared. The 529bp qPCR was the more sensitive technique, and a full analysis of assay performance, by Bayesian analysis using a Markov chain Monte Carlo method, was completed, which demonstrated the efficacy of this method for the detection of T. gondii in water samples. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  19. Carcass enrichment detects Salmonella from broiler carcasses found to be negative by other sampling methods

    USDA-ARS?s Scientific Manuscript database

    The most frequently used methods to recover Salmonella from processed broiler chicken carcasses involve carcass rinsing or neck skin maceration. These methods are nondestructive and practical, but have limited sensitivity. The standard carcass rinse method uses only 7.5% of the residual rinsate an...

  20. Novel method for fast characterization of high-surface-area electrocatalytic materials using carbon fiber microelectrode.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strmcnik, D.; Hodnik, N.; Hocevar, S. B.

    2010-02-18

A carbon fiber microelectrode (CFME) was used for characterization of nanoparticle catalysts as an alternative to the well-established rotating disk electrode (RDE) method. We found that the novel CFME method yielded results comparable to the RDE method when investigating adsorption/desorption processes as well as specific activity for reactions such as the oxygen reduction reaction. Its major advantage over the RDE method is fast sample preparation and rapid measurement, significantly reducing the time for characterizing a single sample from 2-3 h to a favorable 5-10 min.

  1. Sampling methods for microbiological analysis of red meat and poultry carcasses.

    PubMed

    Capita, Rosa; Prieto, Miguel; Alonso-Calleja, Carlos

    2004-06-01

    Microbiological analysis of carcasses at slaughterhouses is required in the European Union for evaluating the hygienic performance of carcass production processes as required for effective hazard analysis critical control point implementation. The European Union microbial performance standards refer exclusively to the excision method, even though swabbing using the wet/dry technique is also permitted when correlation between both destructive and nondestructive methods can be established. For practical and economic reasons, the swab technique is the most extensively used carcass surface-sampling method. The main characteristics, advantages, and limitations of the common excision and swabbing methods are described here.

  2. Frequency division multiplexed multi-color fluorescence microscope system

    NASA Astrophysics Data System (ADS)

    Le, Vu Nam; Yang, Huai Dong; Zhang, Si Chun; Zhang, Xin Rong; Jin, Guo Fan

    2017-10-01

A grayscale camera can only record the intensity of an object, whereas multicolor imaging can use color information to distinguish sample structures that have the same shape but different colors. In fluorescence microscopy, current multicolor imaging methods are flawed: they reduce the efficiency of fluorescence imaging, lower the effective sampling rate of the CCD, and so on. In this paper, we propose a novel multicolor fluorescence microscopy imaging method based on frequency-division multiplexing (FDM), which modulates the excitation lights and demodulates the fluorescence signal in the frequency domain. The method amplitude-modulates each excitation light with a periodic function of a different frequency, then combines these beams for illumination in a fluorescence microscopy imaging system, which records a multicolor fluorescence image with a grayscale camera. During data processing, the signal obtained at each pixel of the camera is processed with a discrete Fourier transform, decomposed by color in the frequency domain, and then inverse-transformed. Applying this process to the signals from all pixels yields a monochrome image of each color on the image plane, and hence the multicolor image. Based on this method, we constructed a two-color fluorescence microscope system with excitation wavelengths of 488 nm and 639 nm. Using this system to observe the linear movement of two kinds of fluorescent microspheres, we obtained, after data processing, a two-color fluorescence video consistent with the original images. This experiment shows that dynamic multicolor fluorescent biological samples can be observed with this method. Compared with current methods, this method acquires the image signals of all colors at the same time, and the color video's frame rate is consistent with the frame rate of the camera. The optical system is simpler and needs no extra color-separation elements. In addition, the method effectively filters out ambient light and other signals that are unaffected by the modulation process.
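The modulate-then-demodulate scheme is easy to sketch for a single pixel: each excitation channel rides on its own carrier frequency, and the DFT magnitude at each carrier recovers that channel's fluorescence intensity. The frequencies and amplitudes below are illustrative, not the paper's values.

```python
import numpy as np

def demodulate_fdm(pixel_signal, fs, carrier_freqs):
    """Per-pixel frequency-division demultiplexing: each excitation light
    is amplitude-modulated at its own carrier frequency, so the magnitude
    of the pixel's DFT at each carrier recovers that color's intensity."""
    n = len(pixel_signal)
    spectrum = np.fft.rfft(pixel_signal)
    freqs = np.fft.rfftfreq(n, d=1 / fs)
    # amplitude of each carrier: 2/n * |DFT| at the nearest frequency bin
    return [2 / n * abs(spectrum[np.argmin(abs(freqs - f))])
            for f in carrier_freqs]

# two carriers stand in for the 488 nm and 639 nm excitation channels
fs = 1000.0                              # camera/pixel sampling rate, Hz
t = np.arange(0, 1.0, 1 / fs)
f1, f2 = 80.0, 130.0                     # carrier frequencies, Hz
a1, a2 = 3.0, 1.5                        # "fluorescence intensities" of the dyes
signal = (a1 * (1 + np.sin(2 * np.pi * f1 * t))
          + a2 * (1 + np.sin(2 * np.pi * f2 * t)))
print([round(a, 2) for a in demodulate_fdm(signal, fs, [f1, f2])])  # → [3.0, 1.5]
```

Because the two carriers occupy distinct frequency bins, the channels separate cleanly even though the camera records only their sum.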

  3. Holographic femtosecond laser processing and its application to biological materials (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Hayasaki, Yoshio

    2017-02-01

Femtosecond laser processing is a promising tool for fabricating novel and useful structures on the surfaces of and inside materials. An enormous number of pulse irradiation points is required to fabricate actual structures at millimeter scale, and therefore the throughput of femtosecond laser processing must be improved for practical adoption of this technique. One promising method to improve throughput is parallel pulse generation based on a computer-generated hologram (CGH) displayed on a spatial light modulator (SLM), a technique called holographic femtosecond laser processing. The holographic method has advantages such as high throughput; high light-use efficiency; and variable, instantaneous, 3D patterning. Furthermore, the use of an SLM makes it possible to correct unknown imperfections of the optical system and inhomogeneity in a sample through in-system optimization of the CGH. The CGH can also adaptively compensate for dynamic, unpredictable mechanical movements, air and liquid disturbances, and shape variation and deformation of the target sample, as well as providing adaptive wavefront control for environmental changes. It is therefore a powerful tool for processing biological cells and tissues, because they have free-form, variable, and deformable structures. In this paper, we present the principle and experimental setup of holographic femtosecond laser processing and an effective way of processing biological samples. We demonstrate femtosecond laser processing of biological materials and its processing properties.

  4. Influence of the processing route of porcelain/Ti-6Al-4V interfaces on shear bond strength.

    PubMed

    Toptan, Fatih; Alves, Alexandra C; Henriques, Bruno; Souza, Júlio C M; Coelho, Rui; Silva, Filipe S; Rocha, Luís A; Ariza, Edith

    2013-04-01

This study evaluates the two-fold effect of initial surface conditions and of the dental porcelain-to-Ti-6Al-4V alloy joining route on shear bond strength. Porcelain-to-Ti-6Al-4V samples were processed by conventional furnace firing (porcelain-fused-to-metal) and by hot pressing. Prior to processing, Ti-6Al-4V cylinders were prepared by three different surface treatments: polishing, alumina blasting, or silica blasting. Within the firing process, polished and alumina-blasted samples were subjected to two different cooling rates: air cooling and a slower cooling rate (65°C/min). Metal/porcelain bond strength was evaluated by a shear bond test. The data were analyzed using one-way ANOVA followed by Tukey's test (p<0.05). Before and after the shear bond tests, metallic surfaces and metal/ceramic interfaces were examined by a Field Emission Gun Scanning Electron Microscope (FEG-SEM) equipped with Energy Dispersive X-ray Spectroscopy (EDS). Shear bond strength values of the porcelain-to-Ti-6Al-4V interfaces ranged from 27.1±8.9 MPa, for porcelain fused to polished samples, up to 134.0±43.4 MPa, for porcelain fused to alumina-blasted samples. According to the statistical analysis, no significant differences in shear bond strength were found between the cooling rates. The processing method was statistically significant only for the polished samples, and airborne-particle abrasion was statistically significant only for the fired samples. The type of blasting material did not cause a statistically significant difference in shear bond strength. The shear bond strength of dental porcelain to Ti-6Al-4V alloys can thus be significantly improved through controlled surface treatments and processing methods. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    PubMed

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and to identify the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only, or to collect limited data on most and detailed data on a subset. © The Author(s) 2016.
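The preposterior logic can be illustrated with a deliberately simplified univariate normal-normal model (the paper's bivariate construction linking the two costing processes is richer); every number below is hypothetical:

```python
import numpy as np
from scipy.stats import norm

def engs(n, mu0, s0, s, cost_per_obs, pop=10_000):
    """Expected net gain of sampling under a normal-normal model.

    mu0, s0 : prior mean and sd of incremental net benefit (INB)
    s       : sampling sd of a single observation on INB
    """
    post_var = 1.0 / (1.0 / s0**2 + n / s**2)   # posterior variance of the INB mean
    v = s0**2 - post_var                         # preposterior variance of the posterior mean
    sd = np.sqrt(v)
    z = abs(mu0) / sd
    evsi = sd * (norm.pdf(z) - z * (1.0 - norm.cdf(z)))  # unit-normal loss integral
    return pop * evsi - n * cost_per_obs

# Hypothetical inputs: prior INB centred near zero, so sampling is valuable.
gains = {n: engs(n, mu0=50.0, s0=400.0, s=2000.0, cost_per_obs=100.0)
         for n in (50, 100, 200, 400)}
best_n = max(gains, key=gains.get)
```

The expected value of sample information (EVSI) grows with n while the cost term grows linearly, so `best_n` locates the efficient sample size under these assumed numbers.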

  6. Quality assessment of raw and processed Arctium lappa L. through multicomponent quantification, chromatographic fingerprint, and related chemometric analysis.

    PubMed

    Qin, Kunming; Wang, Bin; Li, Weidong; Cai, Hao; Chen, Danni; Liu, Xiao; Yin, Fangzhou; Cai, Baochang

    2015-05-01

In traditional Chinese medicine, raw and processed herbs are used to treat different diseases. Suitable quality assessment methods are crucial for the discrimination between raw and processed herbs. The dried fruit of Arctium lappa L. and its processed products are widely used in traditional Chinese medicine, yet their therapeutic effects are different. In this study, a novel strategy using high-performance liquid chromatography with diode array detection coupled with multivariate statistical analysis to rapidly discriminate between raw and processed Arctium lappa L. was proposed and validated. Four main components in a total of 30 batches of raw and processed Fructus Arctii samples were analyzed, and ten characteristic peaks were identified in the fingerprint common pattern. Furthermore, similarity evaluation, principal component analysis, and hierarchical cluster analysis were applied to demonstrate the distinction. The results suggested that the relative amounts of the chemical components of raw and processed Fructus Arctii samples are different. This new method has been successfully applied to detect raw and processed Fructus Arctii in marketed herbal medicinal products. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
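A minimal sketch of the chemometric step (PCA followed by hierarchical clustering) on a simulated peak-area table; the 30 "batches" below are synthetic, not the paper's measured fingerprints:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Hypothetical peak-area table: 30 batches x 10 fingerprint peaks, with
# raw and processed batches simulated around different mean profiles.
raw = rng.normal(1.0, 0.1, (15, 10))
processed = rng.normal(1.6, 0.1, (15, 10))
X = np.vstack([raw, processed])

# PCA via SVD on mean-centred data; keep the first two components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T

# Hierarchical (Ward) clustering on the PCA scores, cut into two clusters.
Z = linkage(scores, method="ward")
labels = fcluster(Z, t=2, criterion="maxclust")
```

With well-separated profiles the two-cluster cut recovers the raw/processed split, mirroring the discrimination the paper demonstrates on real chromatograms.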

  7. Ayurvedic hydro-alcoholic anti-asthmatic medicine Vasarishta built upon Mritasanjeevani Sura: Development and evaluation

    PubMed Central

    Wele, Asmita A.; Pendse, Nikhil U.; Takle, Shrikant P.; Burase, Raghunath V.; Patil, Sanjay V.; Bhalerao, Supriya; Harsulkar, Abhay M.

    2015-01-01

Introduction: Vasarishta built upon Mritasanjeevani Sura (MS) is a polyherbal hydro-alcoholic anti-asthmatic formulation which is administered in a dose of 1 ml instead of the standard dose of 40 ml generally advocated for any “Asava–Arishta” in Ayurveda. Aim: The present study was aimed at finding out the rationale for the peculiar distillation process used to manufacture MS, followed by the Sthapana process to make Vasarishta. It was further aimed at finding out the differences between Vasarishta samples manufactured by a purely fermentation-based process and by the peculiar method mentioned above. Materials and Methods: Three batches of MS and subsequently three batches of Vasarishta were prepared. Basic standardization and development of a standard operating procedure were achieved by measuring pH, percentage of alcohol, total reducing sugar, and specific gravity for both MS and Vasarishta, during and after completion of the process. Finally, MS and Vasarishta (built upon MS) made in the laboratory were compared with marketed samples of MS and Vasarishta using gas chromatography. Results: The types of alcohols and volatile acids in MS and Vasarishta prepared in the laboratory are similar, but the proportions differ, which is taken as an indicator of process standardization. Values of furfural, ethyl acetate, and 1-butanol in the laboratory samples are within permissible limits, as against the values of the market samples. Conclusions: The textual process for the production of Vasarishta proved to produce an organoleptically acceptable product which is virtually free of toxic compounds such as furfural. PMID:27313419

  8. Preliminary assessment for DNA extraction on microfluidic channel

    NASA Astrophysics Data System (ADS)

    Gopinath, Subash C. B.; Hashim, Uda; Uda, M. N. A.

    2017-03-01

The aim of this research is to extract, purify and yield DNA from a solid-state mushroom sample by using a fabricated continuous high-capacity sample-delivery microfluidic device with integrated solid-state extraction based on amino-coated silica beads. The device is made to extract DNA from the mushroom sample in a continuous inflow process with low energy and cost consumption. In this project, we present two methods of DNA extraction and purification: centrifugation (a complex, conventional method) and a microfluidic biosensor (a new, fast method). The extracted DNA can be quantified by using ultraviolet-visible (UV-Vis) spectroscopy. The absorbance peak obtained at a wavelength of 260 nm proves that DNA was successfully extracted from the mushroom.
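The UV-Vis quantification hinted at here is a one-line conversion: by convention, one A260 absorbance unit corresponds to roughly 50 ng/µL of double-stranded DNA, and the A260/A280 ratio indicates purity. The readings below are invented for illustration:

```python
# Hypothetical absorbance readings from a UV-Vis spectrophotometer
# (1 cm path length); values are invented, not from the paper.
a260, a280 = 0.55, 0.29

# Standard conversion: one A260 unit ~= 50 ng/uL for double-stranded DNA.
dsDNA_ng_per_uL = a260 * 50.0

# A260/A280 around 1.8 suggests DNA relatively free of protein.
purity = a260 / a280
```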

  9. Surface sampling techniques for 3D object inspection

    NASA Astrophysics Data System (ADS)

    Shih, Chihhsiong S.; Gerhardt, Lester A.

    1995-03-01

While the uniform sampling method is quite popular for pointwise measurement of manufactured parts, this paper proposes three novel sampling strategies which emphasize 3D non-uniform inspection capability: (a) adaptive sampling, (b) local adjustment sampling, and (c) finite element centroid sampling. The adaptive sampling strategy is based on a recursive surface subdivision process. Two different approaches are described for this strategy: one uses triangle patches while the other uses rectangle patches. Several real-world objects were tested using these two algorithms. Preliminary results show that sample points are distributed more closely around edges, corners, and vertices, as desired for many classes of objects. Adaptive sampling using triangle patches is shown to generally perform better than both uniform sampling and adaptive sampling using rectangle patches. The local adjustment sampling strategy uses a set of predefined starting points and then finds the local optimum position of each nodal point. This method approximates the object by moving the points toward object edges and corners. In a hybrid approach, uniform point sets and non-uniform point sets, first preprocessed by the adaptive sampling algorithm on a real-world object, were then tested using the local adjustment sampling method. The results show that the initial point sets, when preprocessed by adaptive sampling using triangle patches, are moved the least distance by the subsequently applied local adjustment method, again showing the superiority of this method. The finite element sampling technique samples the centroids of the surface triangle meshes produced by the finite element method. The performance of this algorithm was compared to that of adaptive sampling using triangular patches. Adaptive sampling with triangular patches was once again shown to be better on different classes of objects.
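A toy version of the recursive-subdivision idea behind the adaptive sampling strategy, using triangle patches over a made-up height field; the paper's actual subdivision criterion may differ from the simple plane-deviation test used here:

```python
import numpy as np

def surface(p):
    # Toy height field standing in for the inspected object surface.
    x, y = p
    return np.sin(3 * x) * np.cos(3 * y)

def midpoint(a, b):
    return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

def adaptive_sample(tri, tol, depth, out):
    """Recursively subdivide a triangle while the surface deviates from the
    plane through its vertices by more than tol; corners become samples."""
    a, b, c = tri
    centroid = ((a[0] + b[0] + c[0]) / 3, (a[1] + b[1] + c[1]) / 3)
    planar = (surface(a) + surface(b) + surface(c)) / 3
    if depth == 0 or abs(surface(centroid) - planar) < tol:
        out.extend(tri)   # duplicates at shared vertices kept for simplicity
        return
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    for sub in ((a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)):
        adaptive_sample(sub, tol, depth - 1, out)

points = []
adaptive_sample(((0, 0), (1, 0), (0, 1)), tol=0.05, depth=5, out=points)
```

Regions of high curvature get subdivided more deeply, so samples concentrate where the surface varies fastest, which is the behavior the paper reports around edges and corners.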

  10. Propagation-based x-ray phase contrast imaging using an iterative phase diversity technique

    NASA Astrophysics Data System (ADS)

    Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.

    2018-03-01

    Through the use of a phase diversity technique, we demonstrate a near-field in-line x-ray phase contrast algorithm that provides improved object reconstruction when compared to our previous iterative methods for a homogeneous sample. Like our previous methods, the new technique uses the sample refractive index distribution during the reconstruction process. The technique complements existing monochromatic and polychromatic methods and is useful in situations where experimental phase contrast data is affected by noise.

  11. Particle Morphology Analysis of Biomass Material Based on Improved Image Processing Method

    PubMed Central

    Lu, Zhaolin

    2017-01-01

    Particle morphology, including size and shape, is an important factor that significantly influences the physical and chemical properties of biomass material. Based on image processing technology, a method was developed to process sample images, measure particle dimensions, and analyse the particle size and shape distributions of knife-milled wheat straw, which had been preclassified into five nominal size groups using mechanical sieving approach. Considering the great variation of particle size from micrometer to millimeter, the powders greater than 250 μm were photographed by a flatbed scanner without zoom function, and the others were photographed using a scanning electron microscopy (SEM) with high-image resolution. Actual imaging tests confirmed the excellent effect of backscattered electron (BSE) imaging mode of SEM. Particle aggregation is an important factor that affects the recognition accuracy of the image processing method. In sample preparation, the singulated arrangement and ultrasonic dispersion methods were used to separate powders into particles that were larger and smaller than the nominal size of 250 μm. In addition, an image segmentation algorithm based on particle geometrical information was proposed to recognise the finer clustered powders. Experimental results demonstrated that the improved image processing method was suitable to analyse the particle size and shape distributions of ground biomass materials and solve the size inconsistencies in sieving analysis. PMID:28298925
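The threshold-and-label core of such particle measurement can be sketched with `scipy.ndimage` on a synthetic image; the paper's geometry-based algorithm for splitting clustered particles is not reproduced here:

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(2)
# Synthetic grayscale image: two dark "particles" on a bright background.
img = np.full((64, 64), 200.0)
img[10:20, 10:18] = 40.0
img[40:55, 30:38] = 35.0
img += rng.normal(0, 5, img.shape)   # mild sensor noise

# Threshold, label connected components, then measure particle areas.
mask = img < 120
labels, n = ndimage.label(mask)
sizes = ndimage.sum_labels(mask, labels, index=range(1, n + 1))
```

From the labeled regions one can go on to compute per-particle size and shape descriptors (e.g. bounding boxes or aspect ratios) to build the distributions discussed above.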

12. Method for preparing Pb-β″-alumina ceramic

    DOEpatents

    Hellstrom, E.E.

    1984-08-30

A process is disclosed for preparing impermeable, polycrystalline samples of Pb-β″-alumina ceramic from Na-β″-alumina ceramic by ion exchange. The process comprises two steps. The first step is a high-temperature vapor-phase exchange of Na by K, followed by substitution of Pb for K by immersing the sample in a molten Pb salt bath. The result is a polycrystalline Pb-β″-alumina ceramic that is substantially crack-free.

13. Method for preparing Pb-β″-alumina ceramic

    DOEpatents

    Hellstrom, Eric E.

    1986-01-01

A process is disclosed for preparing impermeable, polycrystalline samples of Pb-β″-alumina ceramic from Na-β″-alumina ceramic by ion exchange. The process comprises two steps. The first step is a high-temperature vapor-phase exchange of Na by K, followed by substitution of Pb for K by immersing the sample in a molten Pb salt bath. The result is a polycrystalline Pb-β″-alumina ceramic that is substantially crack-free.

  14. The effect of silica-coating by sol-gel process on resin-zirconia bonding.

    PubMed

    Lung, Christie Ying Kei; Kukk, Edwin; Matinlinna, Jukka Pekka

    2013-01-01

The effect of silica-coating by a sol-gel process on the bond strength of resin composite to zirconia was evaluated and compared against the sandblasting method. Four groups of zirconia samples were silica-coated by the sol-gel process under varied reagent ratios of ethanol, water, ammonia and tetraethyl orthosilicate and for different deposition times. One control group of zirconia samples was treated with sandblasting. Within each of these five groups, one subgroup of samples was kept in dry storage while another subgroup was aged by thermocycling for 6,000 cycles. Besides shear bond testing, the surface topography and surface elemental composition of silica-coated zirconia samples were also examined using scanning electron microscopy and X-ray photoelectron spectroscopy. Comparison of silica coating methods revealed significant differences in bond strength among the Dry groups (p<0.001) and Thermocycled groups (p<0.001). Comparison of sol-gel deposition times also revealed significant differences in bond strength among the Dry groups (p<0.01) and Thermocycled groups (p<0.001). The highest bond strengths were obtained after 141-h deposition: Dry (7.97±3.72 MPa); Thermocycled (2.33±0.79 MPa). It was concluded that silica-coating of zirconia by the sol-gel process resulted in weaker resin bonding than sandblasting did.

  15. A comparative study of Conroy and Monte Carlo methods applied to multiple quadratures and multiple scattering

    NASA Technical Reports Server (NTRS)

    Deepak, A.; Fluellen, A.

    1978-01-01

    An efficient numerical method of multiple quadratures, the Conroy method, is applied to the problem of computing multiple scattering contributions in the radiative transfer through realistic planetary atmospheres. A brief error analysis of the method is given and comparisons are drawn with the more familiar Monte Carlo method. Both methods are stochastic problem-solving models of a physical or mathematical process and utilize the sampling scheme for points distributed over a definite region. In the Monte Carlo scheme the sample points are distributed randomly over the integration region. In the Conroy method, the sample points are distributed systematically, such that the point distribution forms a unique, closed, symmetrical pattern which effectively fills the region of the multidimensional integration. The methods are illustrated by two simple examples: one, of multidimensional integration involving two independent variables, and the other, of computing the second order scattering contribution to the sky radiance.
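The contrast between random and systematic point placement can be demonstrated on a toy two-variable integral; the midpoint grid below merely stands in for Conroy's closed symmetric patterns, which are constructed differently:

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda x, y: np.exp(-(x**2 + y**2))   # toy 2-D integrand on [0, 1]^2
n = 64 * 64

# Monte Carlo: sample points distributed randomly over the region.
xr, yr = rng.random(n), rng.random(n)
mc = f(xr, yr).mean()

# Systematic placement (a simple midpoint grid; Conroy's point sets form
# a closed symmetric pattern, not shown here, but the contrast is similar).
g = (np.arange(64) + 0.5) / 64
xs, ys = np.meshgrid(g, g)
systematic = f(xs, ys).mean()
```

For the same number of points, the systematic estimate is typically orders of magnitude closer to the exact value (≈0.557746 here) than the Monte Carlo estimate, illustrating why structured quadrature points pay off for smooth integrands.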

  16. COMPARISON OF TWO METHODS FOR THE ISOLATION OF SALMONELLAE FROM IMPORTED FOODS.

    PubMed

    TAYLOR, W I; HOBBS, B C; SMITH, M E

    1964-01-01

    Two methods for the detection of salmonellae in foods were compared in 179 imported meat and egg samples. The number of positive samples and replications, and the number of strains and kinds of serotypes were statistically comparable by both the direct enrichment method of the Food Hygiene Laboratory in England, and the pre-enrichment method devised for processed foods in the United States. Boneless frozen beef, veal, and horsemeat imported from five countries for consumption in England were found to have salmonellae present in 48 of 116 (41%) samples. Dried egg products imported from three countries were observed to have salmonellae in 10 of 63 (16%) samples. The high incidence of salmonellae isolated from imported foods illustrated the existence of an international health hazard resulting from the continuous introduction of exogenous strains of pathogenic microorganisms on a large scale.

  17. Field-based random sampling without a sampling frame: control selection for a case-control study in rural Africa.

    PubMed

    Crampin, A C; Mwinuka, V; Malema, S S; Glynn, J R; Fine, P E

    2001-01-01

    Selection bias, particularly of controls, is common in case-control studies and may materially affect the results. Methods of control selection should be tailored both for the risk factors and disease under investigation and for the population being studied. We present here a control selection method devised for a case-control study of tuberculosis in rural Africa (Karonga, northern Malawi) that selects an age/sex frequency-matched random sample of the population, with a geographical distribution in proportion to the population density. We also present an audit of the selection process, and discuss the potential of this method in other settings.

  18. Optical method for distance and displacement measurements of the probe-sample separation in a scanning near-field optical microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santamaria, L.; Siller, H. R.; Garcia-Ortiz, C. E., E-mail: cegarcia@cicese.mx

In this work, we present an alternative optical method to determine the probe-sample separation distance in a scanning near-field optical microscope. The experimental method is based on a Lloyd's mirror interferometer and offers a measurement precision deviation of ∼100 nm using digital image processing and numerical analysis. The technique can also be strategically combined with the characterization of piezoelectric actuators and stability evaluation of the optical system. It also opens the possibility for the development of an automatic approximation control system valid for probe-sample distances from 5 to 500 μm.

  19. A low-rank matrix recovery approach for energy efficient EEG acquisition for a wireless body area network.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2014-08-25

    We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy efficient fashion. In WBANs, the energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies only addressed the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that the reconstruction accuracy of our method is significantly better than state-of-the-art techniques; and we achieve this while saving sensing, processing and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
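A generic stand-in for the recovery step: alternating projection with a hard rank threshold on a synthetic low-rank "EEG" matrix. The paper derives its own matrix completion algorithm, which this sketch does not reproduce; it only illustrates recovering a matrix from randomly under-sampled entries:

```python
import numpy as np

rng = np.random.default_rng(4)
# Synthetic low-rank "multichannel EEG" matrix: 16 channels x 200 samples.
A = rng.normal(size=(16, 3)) @ rng.normal(size=(3, 200))   # rank 3
mask = rng.random(A.shape) < 0.5                           # observed entries

# Naive alternating projection: project onto rank-3 matrices, then
# re-impose the observed entries; assumes the rank is known.
X = np.where(mask, A, 0.0)
for _ in range(200):
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    S[3:] = 0.0
    X_low = (U * S) @ Vt
    X = np.where(mask, A, X_low)

rel_err = np.linalg.norm(X_low - A) / np.linalg.norm(A)
```

With only half the entries observed, the low-rank structure lets the iteration fill in the missing samples, which is the property that permits energy-saving random under-sampling at the sensor.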

  20. Adjacent slice prostate cancer prediction to inform MALDI imaging biomarker analysis

    NASA Astrophysics Data System (ADS)

    Chuang, Shao-Hui; Sun, Xiaoyan; Cazares, Lisa; Nyalwidhe, Julius; Troyer, Dean; Semmes, O. John; Li, Jiang; McKenzie, Frederic D.

    2010-03-01

Prostate cancer is the second most common type of cancer among men in the US [1]. Traditionally, prostate cancer diagnosis is made by the analysis of prostate-specific antigen (PSA) levels and histopathological images of biopsy samples under microscopes. Proteomic biomarkers can improve upon these methods. MALDI molecular spectra imaging is used to visualize protein/peptide concentrations across biopsy samples to search for biomarker candidates. Unfortunately, traditional processing methods require histopathological examination of one slice of a biopsy sample while the adjacent slice is subjected to the tissue-destroying desorption and ionization processes of MALDI. The highest-confidence tumor regions gained from the histopathological analysis are then mapped to the MALDI spectra data to estimate the regions for biomarker identification from the MALDI imaging. This paper describes a process that provides a significantly better estimate of the cancer tumor region to be mapped onto the MALDI imaging spectra coordinates, using the high-confidence region to predict the true area of the tumor on the adjacent MALDI-imaged slice.

  1. Effects of drying processes on starch-related physicochemical properties, bioactive components and antioxidant properties of yam flours.

    PubMed

    Chen, Xuetao; Li, Xia; Mao, Xinhui; Huang, Hanhan; Wang, Tingting; Qu, Zhuo; Miao, Jing; Gao, Wenyuan

    2017-06-01

The effects of five different drying processes for yams, air drying (AD), sulphur fumigation drying (SFD), hot air drying (HAD), freeze drying (FD) and microwave drying (MWD), on starch-related properties and antioxidant activity were studied. From the results of scanning electron microscopy (SEM), polarized optical microscopy (POM), X-ray diffraction (XRD), and Fourier transform infrared (FT-IR) spectroscopy, the MWD sample was found to contain gelatinized starch granules. The FD yam had more slowly digestible starch (SDS) and resistant starch (RS) compared with those processed with the other modern drying methods. The bioactive components and the reducing power of the dried yams were lower than those of fresh yam. When the five dried samples were compared by principal component analysis, the HAD and SFD samples were observed to have the highest comprehensive principal component values. Based on our results, HAD would be a better method for yam drying than the more traditional SFD. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. A Nonlinear Framework of Delayed Particle Smoothing Method for Vehicle Localization under Non-Gaussian Environment.

    PubMed

    Xiao, Zhu; Havyarimana, Vincent; Li, Tong; Wang, Dong

    2016-05-13

    In this paper, a novel nonlinear framework of smoothing method, non-Gaussian delayed particle smoother (nGDPS), is proposed, which enables vehicle state estimation (VSE) with high accuracy taking into account the non-Gaussianity of the measurement and process noises. Within the proposed method, the multivariate Student's t-distribution is adopted in order to compute the probability distribution function (PDF) related to the process and measurement noises, which are assumed to be non-Gaussian distributed. A computation approach based on Ensemble Kalman Filter (EnKF) is designed to cope with the mean and the covariance matrix of the proposal non-Gaussian distribution. A delayed Gibbs sampling algorithm, which incorporates smoothing of the sampled trajectories over a fixed-delay, is proposed to deal with the sample degeneracy of particles. The performance is investigated based on the real-world data, which is collected by low-cost on-board vehicle sensors. The comparison study based on the real-world experiments and the statistical analysis demonstrates that the proposed nGDPS has significant improvement on the vehicle state accuracy and outperforms the existing filtering and smoothing methods.
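Drawing the non-Gaussian noise assumed by the smoother is straightforward via the Gaussian scale-mixture representation of the multivariate Student's t distribution; the state dimension and parameters below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

def mvt_sample(mean, scale, df, n):
    """Draw n samples from a multivariate Student's t distribution,
    built as a Gaussian scale mixture: t = mean + z / sqrt(g),
    where g ~ Gamma(df/2, scale 2/df) is a chi^2_df / df variable."""
    d = len(mean)
    z = rng.multivariate_normal(np.zeros(d), scale, size=n)
    g = rng.gamma(df / 2.0, 2.0 / df, size=n)
    return mean + z / np.sqrt(g)[:, None]

# Heavy-tailed process noise for a 2-D vehicle state (e.g. position error
# in x and y); df = 4 gives noticeably heavier tails than a Gaussian.
noise = mvt_sample(np.zeros(2), np.eye(2) * 0.1, df=4, n=5000)
```

Low degrees of freedom produce the occasional large deviations that a Gaussian noise model would understate, which is exactly the robustness motivation for the t-distributed noise in the smoother.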

  3. A comparison of processed and fresh squeezed ‘Hamlin’ orange juice - nutrients and phytonutrients

    USDA-ARS?s Scientific Manuscript database

    ‘Hamlin’ orange juices were extracted using one of following methods: 1) freshly squeezed with a commercial food service squeezer (fresh), 2) freshly squeezed + pasteurized (fresh/pasteurized), and 3) processed with industrial extractor and pasteurized (processed). Samples were taken directly after ...

  4. A simple and reliable method reducing sulfate to sulfide for multiple sulfur isotope analysis.

    PubMed

    Geng, Lei; Savarino, Joel; Savarino, Clara A; Caillon, Nicolas; Cartigny, Pierre; Hattori, Shohei; Ishino, Sakiko; Yoshida, Naohiro

    2018-02-28

    Precise analysis of four sulfur isotopes of sulfate in geological and environmental samples provides the means to extract unique information in wide geological contexts. Reduction of sulfate to sulfide is the first step to access such information. The conventional reduction method suffers from a cumbersome distillation system, long reaction time and large volume of the reducing solution. We present a new and simple method enabling the process of multiple samples at one time with a much reduced volume of reducing solution. One mL of reducing solution made of HI and NaH 2 PO 2 was added to a septum glass tube with dry sulfate. The tube was heated at 124°C and the produced H 2 S was purged with inert gas (He or N 2 ) through gas-washing tubes and then collected by NaOH solution. The collected H 2 S was converted into Ag 2 S by adding AgNO 3 solution and the co-precipitated Ag 2 O was removed by adding a few drops of concentrated HNO 3 . Within 2-3 h, a 100% yield was observed for samples with 0.2-2.5 μmol Na 2 SO 4 . The reduction rate was much slower for BaSO 4 and a complete reduction was not observed. International sulfur reference materials, NBS-127, SO-5 and SO-6, were processed with this method, and the measured against accepted δ 34 S values yielded a linear regression line which had a slope of 0.99 ± 0.01 and a R 2 value of 0.998. The new methodology is easy to handle and allows us to process multiple samples at a time. It has also demonstrated good reproducibility in terms of H 2 S yield and for further isotope analysis. It is thus a good alternative to the conventional manual method, especially when processing samples with limited amount of sulfate available. © 2017 The Authors. Rapid Communications in Mass Spectrometry Pubished by John Wiley & Sons Ltd.

  5. Apparatus for producing a thin sample band in a microchannel system

    DOEpatents

    Griffiths, Stewart K [Livermore, CA; Nilson, Robert H [Cardiff, CA

    2008-05-13

The present invention improves the performance of microchannel systems for chemical and biological synthesis and analysis by providing a method and apparatus for producing a thin band of a species sample. Thin sample bands improve the resolution of microchannel separation processes, as well as many other processes requiring precise control of sample size and volume. The new method comprises a series of steps in which a species sample is manipulated by controlled transport through a junction formed at the intersection of four or more channels. A sample is first inserted into the end of one of these channels in the vicinity of the junction. Next, this sample is thinned by transport across the junction one or more times. During these thinning steps, flow enters the junction through one of the channels and exits through those remaining, providing a divergent flow field that progressively stretches and thins the band with each traverse of the junction. The thickness of the resulting sample band may be smaller than the channel width. Moreover, the thickness of the band may be varied and controlled by altering the method alone, without modification to the channel or junction geometries. The invention is applicable to both electroosmotic and electrophoretic transport, to combined electrokinetic transport, and to some special cases in which bulk fluid transport is driven by pressure gradients. It is further applicable to channels that are open, filled with a gel or filled with a porous or granular material.

  6. ANALYTICAL METHOD DEVELOPMENTS TO SUPPORT PARTITIONING INTERWELL TRACER TESTING

    EPA Science Inventory

    Partitioning Interwell Tracer Testing (PITT) uses alcohol tracer compounds in estimating subsurface contamination from non-polar pollutants. PITT uses the analysis of water samples for various alcohols as part of the overall measurement process. The water samples may contain many...

  7. New definitions for cotton fiber maturity ratio

    USDA-ARS?s Scientific Manuscript database

    Cotton fiber maturity affects fiber physical, mechanical, and chemical properties, as well as the processability and qualities of yarn and fabrics. New definitions of cotton fiber maturity ratio are introduced. The influences of sampling, sample preparation, measurement method, and correlations am...

  8. Two-Dimensional Mathematical Modeling of the Pack Carburizing Process

    NASA Astrophysics Data System (ADS)

    Sarkar, S.; Gupta, G. S.

    2008-10-01

    Pack carburization is the oldest method among the case-hardening treatments, and sufficient attempts have not been made to understand this process in terms of heat and mass transfer, effect of alloying elements, dimensions of the sample, etc. Thus, a two-dimensional mathematical model in cylindrical coordinate is developed for simulating the pack carburization process for chromium-bearing steel in this study. Heat and mass balance equations are solved simultaneously, where the surface temperature of the sample varies with time, but the carbon potential at the surface during the process remains constant. The fully implicit finite volume technique is used to solve the governing equations. Good agreement has been found between the predicted and published data. The effect of temperature, carburizing time, dimensions of the sample, etc. on the pack carburizing process shows some interesting results. It is found that the two-dimensional model gives better insight into understanding the carburizing process.
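The flavor of such a simulation can be conveyed with a stripped-down 1-D explicit scheme for carbon diffusion alone; the paper solves coupled 2-D heat and mass balances with a fully implicit finite volume method, and all property values here are illustrative:

```python
import numpy as np

# Toy 1-D carbon diffusion during carburizing: fixed surface carbon
# potential (Dirichlet), zero-flux core (Neumann), invented numbers.
D = 1.0e-11           # carbon diffusivity, m^2/s (illustrative)
L, n = 2.0e-3, 100    # 2 mm deep slab discretized into 100 nodes
dx = L / n
dt = 0.4 * dx**2 / D  # below the explicit stability limit 0.5*dx^2/D

c = np.full(n, 0.2)   # initial carbon content, wt%
c_surf = 1.0          # constant surface carbon potential, wt%

t = 0.0
while t < 3600.0:     # one hour of carburizing
    cb = np.concatenate(([c_surf], c, [c[-1]]))  # ghost cells for the BCs
    c = c + dt * D * (cb[:-2] - 2.0 * cb[1:-1] + cb[2:]) / dx**2
    t += dt
```

The resulting profile decays monotonically from the surface value toward the core, reproducing the qualitative case-depth behavior the 2-D model quantifies.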

  9. Method and apparatus for implementing material thermal property measurement by flash thermal imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sun, Jiangang

A method and apparatus are provided for implementing measurement of material thermal properties, including measurement of the thermal effusivity of a coating and/or film or of a bulk material of uniform property. The test apparatus includes an infrared camera; a data acquisition and processing computer coupled to the infrared camera for acquiring and processing thermal image data; and a flash lamp, covered by an enhanced optical filter that attenuates the entire infrared wavelength range, which provides an input of heat onto the surface of a two-layer sample while a series of thermal images is taken of the sample surface.

  10. Fast Physically Accurate Rendering of Multimodal Signatures of Distributed Fracture in Heterogeneous Materials.

    PubMed

    Visell, Yon

    2015-04-01

    This paper proposes a fast, physically accurate method for synthesizing multimodal, acoustic and haptic, signatures of distributed fracture in quasi-brittle heterogeneous materials, such as wood, granular media, or other fiber composites. Fracture processes in these materials are challenging to simulate with existing methods, due to the prevalence of large numbers of disordered, quasi-random spatial degrees of freedom, representing the complex physical state of a sample over the geometric volume of interest. Here, I develop an algorithm for simulating such processes, building on a class of statistical lattice models of fracture that have been widely investigated in the physics literature. This algorithm is enabled through a recently published mathematical construction based on the inverse transform method of random number sampling. It yields a purely time domain stochastic jump process representing stress fluctuations in the medium. The latter can be readily extended by a mean field approximation that captures the averaged constitutive (stress-strain) behavior of the material. Numerical simulations and interactive examples demonstrate the ability of these algorithms to generate physically plausible acoustic and haptic signatures of fracture in complex, natural materials interactively at audio sampling rates.
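The inverse transform method named above is simple to sketch; here it draws exponential waiting times, a common toy stand-in for gaps between events in a stochastic jump process (this is not the paper's actual lattice-model statistics):

```python
import numpy as np

rng = np.random.default_rng(6)

def sample_exponential(rate, n):
    """Inverse transform sampling: push uniform draws through the inverse
    CDF. For Exp(rate), F^{-1}(u) = -ln(1 - u) / rate."""
    u = rng.random(n)
    return -np.log1p(-u) / rate

# Hypothetical rate of stress-release ("crackle") events per unit time.
waits = sample_exponential(rate=2.0, n=100_000)
```

Because only uniform random numbers and an inverse CDF evaluation are needed per event, this kind of construction is cheap enough to run at audio sampling rates, which is the property the synthesis algorithm exploits.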

  11. Assessing risk of draft survey by AHP method

    NASA Astrophysics Data System (ADS)

    Xu, Guangcheng; Zhao, Kuimin; Zuo, Zhaoying; Liu, Gang; Jian, Binguo; Lin, Yan; Fan, Yukun; Wang, Fei

    2018-04-01

    The paper assesses the risks of a vessel floating in seawater during a draft survey by using the analytic hierarchy process. On this basis, the paper establishes a draft survey risk index from the viewpoint of draft reading, ballast water, fresh water, the calculation process and so on. The paper then demonstrates the risk assessment method on one concrete example.
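The core of the analytic hierarchy process is turning a pairwise-comparison matrix into a priority (weight) vector and checking its consistency. A minimal sketch using the row-geometric-mean approximation; the 3x3 comparison values for the risk factors are hypothetical, not the paper's survey data:

```python
import math

def ahp_weights(M):
    """AHP priority vector via the row geometric mean, normalized to sum to 1."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

def consistency_index(M, w):
    """CI = (lambda_max - n) / (n - 1), with lambda_max estimated as the
    mean ratio of (M @ w)_i to w_i; small CI means consistent judgments."""
    n = len(M)
    lam = sum(sum(M[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    return (lam - n) / (n - 1)

# Hypothetical pairwise comparison of three draft-survey risk factors,
# e.g. draft reading vs. ballast water vs. fresh water:
M = [[1.0, 3.0, 5.0],
     [1 / 3, 1.0, 3.0],
     [1 / 5, 1 / 3, 1.0]]
w = ahp_weights(M)
```

In a full AHP study the consistency index is further divided by a random index for the matrix size to give the consistency ratio, which should stay below 0.1.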

  12. GNSS software receiver sampling noise and clock jitter performance and impact analysis

    NASA Astrophysics Data System (ADS)

    Chen, Jian Yun; Feng, XuZhe; Li, XianBin; Wu, GuangYao

    2015-02-01

    The design of multi-frequency, multi-constellation GNSS software defined radio receivers is becoming more and more popular due to their simple architecture, flexible configuration and good coherence in multi-frequency signal processing. Such receivers play an important role in navigation signal processing and signal quality monitoring. In particular, driving the sampling clock of the analogue-to-digital converter (ADC) with an FPGA makes a more flexible radio transceiver design possible. According to the concept of software defined radio (SDR), the ideal is to digitize as close to the antenna as possible. However, the carrier frequency of a GNSS signal is on the order of GHz, and converting at this frequency is expensive and consumes more power. The bandpass sampling method is a cheaper, more effective alternative: it makes it possible to sample an RF signal at as little as twice the bandwidth of the signal. Unfortunately, as the other side of the coin, the SDR concept and the bandpass sampling method negatively influence the performance of GNSS receivers: the ADC suffers larger sampling clock jitter generated by the FPGA, and the low sampling frequency introduces more noise into the receiver, so the influence of sampling noise cannot be neglected. The paper analyzes the sampling noise, presents its influence on the carrier-to-noise ratio, and derives the ranging error by calculating the synchronization error of the delay locked loop. Simulations are performed for each impact factor of sampling-noise-induced ranging error. Simulation and experiment results show that if the target ranging accuracy is at the level of centimeters, the quantization length should be no less than 8 bits and the sampling clock jitter should not exceed 30 ps.
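The bandpass (under)sampling rule referenced here constrains the sampling rate fs for a band [f_L, f_H] of width B to one of the intervals 2*f_H/n <= fs <= 2*f_L/(n-1), for integer n up to floor(f_H/B). A small sketch that enumerates the valid intervals; the example band uses simple illustrative numbers rather than a real GNSS allocation:

```python
def bandpass_sampling_ranges(f_lo, f_hi):
    """Valid uniform sampling-rate intervals for a real bandpass signal
    occupying [f_lo, f_hi] Hz, bandwidth B = f_hi - f_lo:
        2*f_hi / n <= fs <= 2*f_lo / (n - 1),  n = 1 .. floor(f_hi / B)
    (n = 1 is ordinary Nyquist sampling with no upper limit)."""
    B = f_hi - f_lo
    ranges = []
    for n in range(1, int(f_hi // B) + 1):
        lo = 2.0 * f_hi / n
        hi = float("inf") if n == 1 else 2.0 * f_lo / (n - 1)
        if lo <= hi:
            ranges.append((n, lo, hi))
    return ranges
```

For a band from 20 to 24 Hz (B = 4 Hz) the largest n is 6, giving the single admissible rate fs = 8 Hz = 2B, which is exactly the "twice the bandwidth" lower bound the abstract mentions.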

  13. Direct PCR Offers a Fast and Reliable Alternative to Conventional DNA Isolation Methods for Gut Microbiomes.

    PubMed

    Videvall, Elin; Strandh, Maria; Engelbrecht, Anel; Cloete, Schalk; Cornwallis, Charlie K

    2017-01-01

    The gut microbiome of animals is emerging as an important factor influencing ecological and evolutionary processes. A major bottleneck in obtaining microbiome data from large numbers of samples is the time-consuming laboratory procedures required, specifically the isolation of DNA and generation of amplicon libraries. Recently, direct PCR kits have been developed that circumvent conventional DNA extraction steps, thereby streamlining the laboratory process by reducing preparation time and costs. However, the reliability and efficacy of direct PCR for measuring host microbiomes have not yet been investigated other than in humans with 454 sequencing. Here, we conduct a comprehensive evaluation of the microbial communities obtained with direct PCR and the widely used Mo Bio PowerSoil DNA extraction kit in five distinct gut sample types (ileum, cecum, colon, feces, and cloaca) from 20 juvenile ostriches, using 16S rRNA Illumina MiSeq sequencing. We found that direct PCR was highly comparable over a range of measures to the DNA extraction method in cecal, colon, and fecal samples. However, the two methods significantly differed in samples with comparably low bacterial biomass: cloacal and especially ileal samples. We also sequenced 100 replicate sample pairs to evaluate repeatability during both extraction and PCR stages and found that both methods were highly consistent for cecal, colon, and fecal samples (r_s > 0.7) but had low repeatability for cloacal (r_s = 0.39) and ileal (r_s = -0.24) samples. This study indicates that direct PCR provides a fast, cheap, and reliable alternative to conventional DNA extraction methods for retrieving 16S rRNA data, which can aid future gut microbiome studies. IMPORTANCE The microbial communities of animals can have large impacts on their hosts, and the number of studies using high-throughput sequencing to measure gut microbiomes is rapidly increasing. 
However, the library preparation procedure in microbiome research is both costly and time-consuming, especially for large numbers of samples. We investigated a cheaper and faster direct PCR method designed to bypass the DNA isolation steps during 16S rRNA library preparation and compared it with a standard DNA extraction method. We used both techniques on five different gut sample types collected from 20 juvenile ostriches and sequenced samples with Illumina MiSeq. The methods were highly comparable and highly repeatable in three sample types with high microbial biomass (cecum, colon, and feces), but larger differences and low repeatability were found in the microbiomes obtained from the ileum and cloaca. These results will help microbiome researchers assess library preparation procedures and plan their studies accordingly.
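The repeatability figures quoted above (r_s) are Spearman rank correlations, i.e. the Pearson correlation computed on the ranks of the data, which captures any monotone agreement between replicate measurements. A self-contained sketch with average ranks for ties:

```python
import math

def spearman(x, y):
    """Spearman rank correlation of two equal-length sequences
    (ties receive the average of the ranks they span)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(v):
            j = i
            while j + 1 < len(v) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2.0 + 1.0       # average 1-based rank of the tie group
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) *
                    sum((b - my) ** 2 for b in ry))
    return num / den
```

Because only ranks matter, a perfectly monotone but nonlinear relationship (e.g. y = x^2 on positive x) still yields r_s = 1, which is why rank correlation suits replicate abundance data.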

  14. Comparison of methods for determination of total oil sands-derived naphthenic acids in water samples.

    PubMed

    Hughes, Sarah A; Huang, Rongfu; Mahaffey, Ashley; Chelme-Ayala, Pamela; Klamerth, Nikolaus; Meshref, Mohamed N A; Ibrahim, Mohamed D; Brown, Christine; Peru, Kerry M; Headley, John V; Gamal El-Din, Mohamed

    2017-11-01

    There are several established methods for the determination of naphthenic acids (NAs) in waters associated with oil sands mining operations. Due to their highly complex nature, measured concentration and composition of NAs vary depending on the method used. This study compared different common sample preparation techniques, analytical instrument methods, and analytical standards to measure NAs in groundwater and process water samples collected from an active oil sands operation. In general, the high- and ultrahigh-resolution methods, namely ultra performance liquid chromatography time-of-flight mass spectrometry (UPLC-TOF-MS) and Orbitrap mass spectrometry (Orbitrap-MS), were within an order of magnitude of the Fourier transform infrared spectroscopy (FTIR) methods. The gas chromatography mass spectrometry (GC-MS) methods consistently had the highest NA concentrations and greatest standard error. Total NAs concentration was not statistically different between sample preparation by solid phase extraction and liquid-liquid extraction. Calibration standards influenced quantitation results. This work provided a comprehensive understanding of the inherent differences in the various techniques available to measure NAs and hence the potential differences in measured amounts of NAs in samples. Results from this study will contribute to the analytical method standardization for NA analysis in oil sands related water samples. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. An integrated bioanalytical method development and validation approach: case studies.

    PubMed

    Xue, Y-J; Melo, Brian; Vallejo, Martha; Zhao, Yuwen; Tang, Lina; Chen, Yuan-Shek; Keller, Karin M

    2012-10-01

    We proposed an integrated bioanalytical method development and validation approach: (1) method screening based on the analyte's physicochemical properties and metabolism information to determine the most appropriate extraction/analysis conditions; (2) preliminary stability evaluation using both quality control and incurred samples to establish sample collection, storage and processing conditions; (3) mock validation to examine method accuracy and precision and incurred sample reproducibility; and (4) method validation to confirm the results obtained during method development. This integrated approach was applied to the determination of compound I in rat plasma and compound II in rat and dog plasma. The effectiveness of the approach was demonstrated by the superior quality of three method validations: (1) a zero run failure rate; (2) >93% of quality control results within 10% of nominal values; and (3) 99% of incurred samples within 9.2% of the original values. In addition, rat and dog plasma methods for compound II were successfully applied to analyze more than 900 plasma samples obtained from Investigational New Drug (IND) toxicology studies in rats and dogs with near perfect results: (1) a zero run failure rate; (2) excellent accuracy and precision for standards and quality controls; and (3) 98% of incurred samples within 15% of the original values. Copyright © 2011 John Wiley & Sons, Ltd.
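Incurred sample reproducibility is conventionally judged by the percent difference of each repeat result against the mean of the pair; a widely used chemical-assay criterion from regulatory bioanalysis guidance requires at least two-thirds of pairs to fall within ±20%. A minimal sketch of that check; the 20% / two-thirds numbers are the generic criterion, not this paper's 9.2% and 15% results:

```python
def isr_pass(original, repeat, limit_pct=20.0, min_frac=2.0 / 3.0):
    """Incurred-sample-reanalysis check: the percent difference of each pair,
    |r - o| / mean(o, r) * 100, must be within `limit_pct` for at least
    `min_frac` of the pairs (generic chemical-assay acceptance criterion)."""
    ok = sum(1 for o, r in zip(original, repeat)
             if abs(r - o) / ((o + r) / 2.0) * 100.0 <= limit_pct)
    return ok / len(original) >= min_frac
```

Normalizing by the pair mean rather than the original value keeps the metric symmetric in the two measurements.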

  16. The combined positive impact of Lean methodology and Ventana Symphony autostainer on histology lab workflow

    PubMed Central

    2010-01-01

    Background Histologic samples all funnel through the H&E microtomy staining area. Here manual processes intersect with semi-automated processes creating a bottleneck. We compare alternate work processes in anatomic pathology primarily in the H&E staining work cell. Methods We established a baseline measure of H&E process impact on personnel, information management and sample flow from historical workload and production data and direct observation. We compared this to performance after implementing initial Lean process modifications, including workstation reorganization, equipment relocation and workflow levelling, and the Ventana Symphony stainer to assess the impact on productivity in the H&E staining work cell. Results Average time from gross station to assembled case decreased by 2.9 hours (12%). Total process turnaround time (TAT) exclusive of processor schedule changes decreased 48 minutes/case (4%). Mean quarterly productivity increased 8.5% with the new methods. Process redesign reduced the number of manual steps from 219 to 182, a 17% reduction. Specimen travel distance was reduced from 773 ft/case to 395 ft/case (49%) overall, and from 92 to 53 ft/case in the H&E cell (42% improvement). Conclusions Implementation of Lean methods in the H&E work cell of histology can result in improved productivity, improved through-put and case availability parameters including TAT. PMID:20181123

  17. [Extraction method suitable for detection of unheated crustaceans including cephalothorax by ELISA].

    PubMed

    Shibahara, Yusuke; Yamada, Itta; Uesaka, Yoshihiko; Uneo, Noriko; Abe, Akihisa; Ohashi, Eiji; Shiomi, Kazuo

    2009-08-01

    When unheated whole samples of crustaceans (shrimp, prawn and crab) were analyzed with our ELISA kit (FA test EIA-Crustacean 'Nissui') using anti-tropomyosin antibodies, a remarkable reduction in reactivity was recognized. This reduction in reactivity was found to be due to the digestion of tropomyosin during the extraction process by proteases contained in the cephalothorax. To avoid the digestion of tropomyosin by proteases, we developed an extraction method (heating method) suitable for the detection of tropomyosin in unheated crustaceans including cephalothorax. Experiments with unheated whole samples of various species of crustaceans confirmed that the heating method greatly improved the low reactivity seen with the standard method; the heating method gave extraction efficiencies as high as 93-107%. Various processed crustaceans with cephalothorax, such as dry products (unheated or weakly heated products) and pickles in soy sauce (unheated products), that showed low reactivity with the standard method were confirmed to give superior results with the heating method. These results indicated that the developed heating method is suitable for detecting unheated crustaceans with cephalothorax by means of the ELISA kit.

  18. Uniform batch processing using microwaves

    NASA Technical Reports Server (NTRS)

    Barmatz, Martin B. (Inventor); Jackson, Henry W. (Inventor)

    2000-01-01

    A microwave oven and microwave heating method generates microwaves within a cavity in a predetermined mode such that there is a known region of uniform microwave field. Samples placed in this region are then heated in a nearly identical manner. Where the perturbations induced by the samples are significant, the samples are arranged in a symmetrical distribution so that the cumulative perturbation at each sample location is the same.

  19. Effects of processing and in vitro proteolytic digestion on soybean and yambean hemagglutinins.

    PubMed

    Ojimelukwe, P C; Onuoha, C C; Obanu, Z A

    1995-06-01

    Some conventional processing methods were applied to yambean and soybean seeds and flour samples. These included soaking, fermentation, cooking whole seeds in the presence and absence of trona, autoclaving and dry heat treatment of flour samples. Hemagglutinating activity was assayed after the processing treatments. The hemagglutinating proteins from these seeds were classified based on their solubility properties. Effects of the presence of 0.01% concentration of trypsin, pepsin and proteases on agglutination of human red blood cells were also evaluated. Most processing methods, particularly cooking whole seeds for 1-2 h, soaking and fermentation, reduced hemagglutinating activity on cow red blood cells. Size reduction accompanied by heat treatment was effective in eliminating hemagglutination. Both the albumin and globulin fractions of the soybean showed hemagglutinating activity but only the albumin fraction of the yambean had agglutinating properties. Proteolytic action of proteases was more effective in reducing hemagglutinating activity than that of trypsin and pepsin.

  20. Study of the Wavelength Dependence in Laser Ablation of Advanced Ceramics and Glass-Ceramic Materials in the Nanosecond Range

    PubMed Central

    Sola, Daniel; Peña, Jose I.

    2013-01-01

    In this work, geometrical dimensions and ablation yields as a function of the machining method and reference position were studied when advanced ceramics and glass-ceramic materials were machined with pulsed lasers in the nanosecond range. Two laser systems, emitting at 1064 and 532 nm, were used. It was shown that the features obtained depend on whether the substrate is processed by means of pulse bursts or by grooves. In particular, when the samples were processed by grooves, machined depth, removed volume and ablation yields reached their maximum with the sample placed out of focus. It was shown that these characteristics do not depend on the processing conditions, the wavelength or the optical configuration, and that this is an intrinsic behavior of the processing method. Furthermore, the existence of a close relation between material hardness and ablation yields was demonstrated. PMID:28788391

  1. 40 CFR 60.316 - Test methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Section 60.316 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... for the measurement of VOC concentration. (3) Method 1 for sample and velocity traverses. (4) Method 2... smaller volumes, when necessitated by process variables or other factors, may be approved by the...

  2. 40 CFR 61.174 - Test methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Section 61.174 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... follows: (1) Method 5 for the measurement of particulate matter, (2) Method 1 for sample and velocity... when necessitated by process variables or other factors may be approved by the Administrator. (f) For...

  3. Sampling and analysis of hexavalent chromium during exposure to chromic acid mist and welding fumes.

    PubMed

    Blomquist, G; Nilsson, C A; Nygren, O

    1983-12-01

    Sampling and analysis of hexavalent chromium during exposure to chromic acid mist and welding fumes. Scand j work environ & health 9 (1983) 489-495. In view of the serious health effects of hexavalent chromium, the problems involved in its sampling and analysis in workroom air have been the subject of much concern. In this paper, the stability problems arising from the reduction of hexavalent to trivalent chromium during sampling, sample storage, and analysis are discussed. Replacement of sulfuric acid by a sodium acetate buffer (pH 4) as a leaching solution prior to analysis with the diphenylcarbazide (DPC) method is suggested and is demonstrated to be necessary in order to avoid reduction. Field samples were taken from two different industrial processes: manual metal arc welding on stainless steel without shield gas, and chromium plating. A comparison was made of the DPC method, acidic dissolution with atomic absorption spectrophotometric (AAS) analysis, and the carbonate method. For chromic acid mist, the DPC method and AAS analysis were shown to give the same results. In the analysis of welding fumes, the modified DPC method gave the same results as the laborious and less sensitive carbonate method.

  4. An adaptive sampling method for variable-fidelity surrogate models using improved hierarchical kriging

    NASA Astrophysics Data System (ADS)

    Hu, Jiexiang; Zhou, Qi; Jiang, Ping; Shao, Xinyu; Xie, Tingli

    2018-01-01

    Variable-fidelity (VF) modelling methods have been widely used in complex engineering system design to mitigate the computational burden. Building a VF model generally includes two parts: design of experiments and metamodel construction. In this article, an adaptive sampling method based on improved hierarchical kriging (ASM-IHK) is proposed to refine the VF model. First, an improved hierarchical kriging model is developed as the metamodel, in which the low-fidelity model is varied through a polynomial response surface function to capture the characteristics of the high-fidelity model. Secondly, to reduce local approximation errors, an active learning strategy based on a sequential sampling method is introduced to make full use of the information already acquired at the current sampling points and to guide the sampling process of the high-fidelity model. Finally, two numerical examples and the modelling of the aerodynamic coefficient for an aircraft are provided to demonstrate the approximation capability of the proposed approach in comparison with three other metamodelling methods and two sequential sampling methods. The results show that ASM-IHK provides a more accurate metamodel at the same simulation cost, which is very important in metamodel-based engineering design problems.
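The sequential-sampling idea can be illustrated with a much simpler infill criterion than the kriging-variance one used in ASM-IHK: repeatedly add the candidate point farthest from all points sampled so far (a maximin, space-filling rule). This is a hedged stand-in for intuition only, not the paper's algorithm:

```python
def next_sample(candidates, sampled):
    """Maximin infill: pick the candidate whose distance to the nearest
    existing sample is largest (a simple stand-in for a variance-based
    criterion such as the one in ASM-IHK)."""
    return max(candidates, key=lambda c: min(abs(c - s) for s in sampled))

def sequential_design(candidates, sampled, n_new):
    """Greedily add n_new points, re-evaluating the criterion each step."""
    pts = list(sampled)
    new = []
    for _ in range(n_new):
        p = next_sample(candidates, pts)
        pts.append(p)
        new.append(p)
    return new
```

A kriging-based strategy replaces the distance in `next_sample` with the metamodel's predicted variance, so new high-fidelity runs land where the surrogate is least certain rather than merely far from existing points.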

  5. Room temperature ferromagnetism of tin oxide nanocrystal based on synthesis methods

    NASA Astrophysics Data System (ADS)

    Sakthiraj, K.; Hema, M.; Balachandrakumar, K.

    2016-04-01

    The experimental conditions used in the preparation of nanocrystalline oxide materials play an important role in the room temperature ferromagnetism of the product. In the present work, a comparison was made between sol-gel, microwave assisted sol-gel and hydrothermal methods for preparing tin oxide nanocrystal. X-ray diffraction analysis indicates the formation of tetragonal rutile phase structure for all the samples. The crystallite size was estimated from the HRTEM images and it is around 6-12 nm. Using optical absorbance measurement, the band gap energy value of the samples has been calculated. It reveals the existence of the quantum confinement effect in all the prepared samples. Photoluminescence (PL) spectra confirm that the luminescence process originates from structural defects such as oxygen vacancies present in the samples. A room temperature hysteresis loop was clearly observed in the M-H curve of all the samples. But the sol-gel derived sample shows higher values of saturation magnetization (Ms) and remanence (Mr) than the other two samples. This study reveals that the sol-gel method is superior to the other two methods for producing room temperature ferromagnetism in tin oxide nanocrystal.

  6. Elongation measurement using 1-dimensional image correlation method

    NASA Astrophysics Data System (ADS)

    Phongwisit, Phachara; Kamoldilok, Surachart; Buranasiri, Prathan

    2016-11-01

    The aim of this paper was to study, set up, and calibrate an elongation measurement using the 1-dimensional image correlation (1-DIC) method. To confirm the correctness of our method and setup, we calibrated it against another method. In this paper, we used a small spring as a sample so that the result could be expressed as a spring constant. Following the fundamentals of the image correlation method, images of the undeformed and deformed sample were compared to characterize the deformation. By comparing the pixel locations of a reference point in both images, the spring's elongation was calculated. The results were then compared with the spring constant obtained from Hooke's law, and an error of about 5 percent was found. This DIC method would then be applied to measure the elongation of various kinds of small fiber samples.
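The subset-matching step at the heart of 1-DIC can be sketched as a search for the integer shift that maximizes the zero-normalized cross-correlation (ZNCC) between a reference subset and the deformed signal. Real DIC implementations add sub-pixel interpolation, which this illustrative sketch omits:

```python
import math

def zncc(a, b):
    """Zero-normalized cross-correlation of two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = math.sqrt(sum((x - ma) ** 2 for x in a) *
                    sum((y - mb) ** 2 for y in b))
    return num / den if den else 0.0

def displacement_1d(ref, cur, start, length, max_shift):
    """Integer displacement of the subset ref[start:start+length] found
    in cur by maximizing ZNCC over shifts in [-max_shift, max_shift]."""
    subset = ref[start:start + length]
    best_d, best_c = 0, -2.0
    for d in range(-max_shift, max_shift + 1):
        s = start + d
        if s < 0 or s + length > len(cur):
            continue
        c = zncc(subset, cur[s:s + length])
        if c > best_c:
            best_c, best_d = c, d
    return best_d
```

Tracking the displacement of two such reference subsets, one at each end of the spring image, directly yields the elongation in pixels, which a calibration factor converts to length.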

  7. Generation and coherent detection of QPSK signal using a novel method of digital signal processing

    NASA Astrophysics Data System (ADS)

    Zhao, Yuan; Hu, Bingliang; He, Zhen-An; Xie, Wenjia; Gao, Xiaohui

    2018-02-01

    We demonstrate an optical quadrature phase-shift keying (QPSK) signal transmitter and an optical receiver that demodulates the optical QPSK signal with homodyne detection and digital signal processing (DSP). DSP is employed on the homodyne detection scheme without locking the phase of the local oscillator (LO). In this paper, we present a down-sampling method that extracts a one-dimensional array in order to reduce unwanted samples in the constellation diagram measurement. Such a scheme has the following major advantages over other conventional optical QPSK signal detection methods. First, this homodyne detection scheme does not place strict requirements on the LO, in contrast to linear optical sampling, which requires a flat spectral density and phase over the spectral support of the source under test. Second, LabVIEW software is used directly to recover the QPSK signal constellation without employing a complex DSP circuit. Third, the scheme is applicable to multilevel modulation formats such as M-ary PSK and quadrature amplitude modulation (QAM), and to higher speed signals, with only minor changes.
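A Gray-mapped QPSK modulator/demodulator pair can be sketched in a few lines; this is a generic baseband model for illustration, not the authors' optical front end or LabVIEW processing chain:

```python
import math

# Gray-mapped QPSK: dibit -> unit-energy constellation point
# (adjacent constellation points differ in exactly one bit).
QPSK = {
    (0, 0): complex( 1,  1) / math.sqrt(2),
    (0, 1): complex(-1,  1) / math.sqrt(2),
    (1, 1): complex(-1, -1) / math.sqrt(2),
    (1, 0): complex( 1, -1) / math.sqrt(2),
}

def modulate(bits):
    """Map an even-length bit list to QPSK symbols, two bits per symbol."""
    return [QPSK[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

def demodulate(symbols):
    """Hard-decision demapping: the signs of Q and I recover the Gray dibit."""
    bits = []
    for s in symbols:
        bits.append(0 if s.imag > 0 else 1)   # first bit from the Q sign
        bits.append(0 if s.real > 0 else 1)   # second bit from the I sign
    return bits
```

Because decisions depend only on quadrant signs, small additive noise that leaves each received symbol in its original quadrant causes no bit errors, which is the property the constellation diagram makes visible.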

  8. Systems and methods for laser assisted sample transfer to solution for chemical analysis

    DOEpatents

    Van Berkel, Gary J.; Kertesz, Vilmos; Ovchinnikova, Olga S.

    2014-06-03

    Systems and methods are described for laser ablation of an analyte from a specimen and capturing of the analyte in a dispensed solvent to form a testing solution. A solvent dispensing and extraction system can form a liquid microjunction with the specimen. The solvent dispensing and extraction system can include a surface sampling probe. The laser beam can be directed through the surface sampling probe. The surface sampling probe can also serve as an atomic force microscopy probe. The surface sampling probe can form a seal with the specimen. The testing solution including the analyte can then be analyzed using an analytical instrument or undergo further processing.

  9. Systems and methods for laser assisted sample transfer to solution for chemical analysis

    DOEpatents

    Van Berkel, Gary J.; Kertesz, Vilmos; Ovchinnikova, Olga S.

    2015-09-29

    Systems and methods are described for laser ablation of an analyte from a specimen and capturing of the analyte in a dispensed solvent to form a testing solution. A solvent dispensing and extraction system can form a liquid microjunction with the specimen. The solvent dispensing and extraction system can include a surface sampling probe. The laser beam can be directed through the surface sampling probe. The surface sampling probe can also serve as an atomic force microscopy probe. The surface sampling probe can form a seal with the specimen. The testing solution including the analyte can then be analyzed using an analytical instrument or undergo further processing.

  10. Systems and methods for laser assisted sample transfer to solution for chemical analysis

    DOEpatents

    Van Berkel, Gary J; Kertesz, Vilmos; Ovchinnikova, Olga S

    2013-08-27

    Systems and methods are described for laser ablation of an analyte from a specimen and capturing of the analyte in a dispensed solvent to form a testing solution. A solvent dispensing and extraction system can form a liquid microjunction with the specimen. The solvent dispensing and extraction system can include a surface sampling probe. The laser beam can be directed through the surface sampling probe. The surface sampling probe can also serve as an atomic force microscopy probe. The surface sampling probe can form a seal with the specimen. The testing solution including the analyte can then be analyzed using an analytical instrument or undergo further processing.

  11. Role of aging time on the magnetic properties of Sm2Co17 permanent magnets processed through cold isostatic pressing

    NASA Astrophysics Data System (ADS)

    Ramudu, M.; Rajkumar, D. M.

    2018-04-01

    The effect of aging time on the magnetic properties of Sm2Co17 permanent magnets processed through a novel method of cold isostatic pressing was investigated. Sintered Sm2Co17 samples were subjected to different aging times in the range of 10-30 h and their respective microstructures were correlated with the magnetic properties obtained. The values of remanent magnetization (Br) were observed to be constant in samples aged from 10-20 h, beyond which a gradual decrease in Br values was observed. The values of coercivity (Hc) displayed a sharp increase in samples aged from 10 to 20 h, beyond which the coercivity values showed only marginal improvement. Hence a good combination of magnetic properties could be achieved in samples aged for 20 h. A maximum energy product of 27 MGOe was achieved in the 20 h aged sample processed through the novel route.

  12. Levels of interleukin-6 in tears before and after excimer laser treatment.

    PubMed

    Resan, Mirko; Stanojević, Ivan; Petković, Aleksandra; Pajić, Bojan; Vojvodić, Danilo

    2015-04-01

    Immune response and consequent inflammatory process which originate on ocular surface after a trauma are mediated by cytokines. Photoablation of corneal stroma performed by excimer laser causes surgically induced trauma. Interleukin-6 (IL-6) is mostly known as a proinflammatory cytokine. However, it also has regenerative and anti-inflammatory effects. It is supposed that this cytokine is likely to play a significant role in the process of corneal wound healing response after photoablation of stroma carried out by laser in situ keratomileusis (LASIK) or photorefractive keratectomy (PRK) methods. The aim of this study was to determine and compare the levels of IL-6 in tears before and after treatment with LASIK and PRK methods. The study included 68 shortsighted eyes up to -3.0 diopter sphere, i.e. 198 samples of tears (per three samples taken from each of the eyes), divided into two groups according to the kind of excimer laser intervention performed: the group 1--eyes treated by LASIK method (n=31), and the group 2--eyes treated by the PRK method (n=37). The samples of tears were taken from each eye at the following time points: before excimer laser treatment (0 h, the control group), 1 h after the treatment (1 h) and 24 h after the treatment (24 h). The patients did not use anti-inflammatory therapy 24 h after the intervention. Tear samples were collected using microsurgical sponge. Level of IL-6 in tear fluid was determined by the flow cytometry method, applying a commercial test kit which allowed cytokine detection from a small sample volume. Results. The values of IL-6 were detectable in 16% of samples before LASIK treatment and in 30% of samples before PRK treatment. One h after the treatment IL-6 was detectable in 29% of samples for the LASIK group and 43% of samples for the PRK group, and 24 h after the treatment it was detectable in 19% of samples for the LASIK group and in 57% of samples for the PRK group. 
When we analyzed the dynamics of IL-6 production in the particular groups, we noticed that in both the LASIK and PRK groups the number of samples with increased values of IL-6 after 1 h, and after 24 h, was considerably larger than the number of samples with decreased values of IL-6 after the intervention. Analyzing the dynamics of IL-6 concentration changes in the 1 h samples vs. the 24 h samples, there was a statistically significant increase in the number of samples with declining IL-6 concentration in the LASIK group, while no considerable changes occurred in the PRK group over the same period. Comparing average IL-6 values between the two treatment groups in all tear samples at 0 h, 1 h and 24 h after the intervention, a significantly higher level was detected in the PRK group 24 h after the procedure (p = 0.0031). IL-6 level in tears increases 1 h and 24 h after LASIK and PRK treatments. This increase is significantly larger 24 h after treatment with the PRK method than with the LASIK method. The changes of IL-6 production levels in tears after excimer laser treatment indicate that this cytokine takes part in the corneal recovery process after stromal photoablation.

  13. Correlation between processing conditions, microstructure and charge transport in half-Heusler alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Makongo, Julien P.A.; Zhou, Xiaoyuan; Misra, Dinesh K.

    2013-05-01

    Five bulk samples of n-type Zr₀.₂₅Hf₀.₇₅NiSn₀.₉₇₅Sb₀.₀₂₅ half-Heusler (HH) alloy were fabricated by reacting elemental powders via (1) high temperature solid state (SS) reaction and (2) mechanical alloying (MA), followed by densification using spark plasma sintering (SPS) and/or hot pressing (HP). A portion of the sample obtained by SS reaction was mechanically alloyed before consolidation by hot pressing (SS–MA–HP). X-ray powder diffraction and transmission electron microscopy studies revealed that all SS specimens (SS–SPS, SS–HP, SS–MA–HP) are single-phase HH alloys, whereas the MA sample (MA–SPS) contains metallic nanoprecipitates. Electronic and thermal transport measurements showed that the embedded nanoprecipitates induce a drastic increase in the carrier concentration (n), a large decrease in the Seebeck coefficient (S) and a marginal decrease in the lattice thermal conductivity (κ_l) of the MA–SPS sample, leading to lower ZT values when compared to the SS–HP samples. Constant values of S are observed for the SS series regardless of the processing method. However, a strong dependence of the carrier mobility (μ), electrical conductivity (σ) and κ_l on the processing and consolidation method is observed. For instance, mechanical alloying introduces additional structural defects which enhance electron and phonon scattering, leading to moderately low values of μ and a large reduction in κ_l. This results in a net 20% enhancement in the figure of merit (ZT = 0.6 at 775 K). An HH specimen of the same nominal composition with higher ZT is anticipated from a combination of SS reaction, MA and SPS (SS–MA–SPS). - Graphical abstract: In half-Heusler alloys, thermopower values are insensitive to the processing method, whereas the carrier mobility (μ), electrical conductivity (σ) and κ_l depend strongly on the microstructure, which in turn is altered by the synthesis, processing and consolidation method. 
Highlights: • Phase composition of HH alloy strongly depends on the synthesis technique. • Mechanical alloying of elements yields bulk HH alloy with metallic impurity phases. • Thermopower, carrier density, and effective mass of HHs are insensitive to processing conditions. • Mechanical alloying decreases the carrier mobility and lattice thermal conductivity of bulk HH.
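The figure of merit quoted above combines the three measured transport quantities as ZT = S²σT/κ. A quick sketch; the property values below are illustrative half-Heusler-like magnitudes chosen to be consistent with ZT ≈ 0.6 at 775 K, not the paper's measured data:

```python
def figure_of_merit(S, sigma, kappa, T):
    """Thermoelectric figure of merit ZT = S^2 * sigma * T / kappa.
    S: Seebeck coefficient [V/K], sigma: electrical conductivity [S/m],
    kappa: total thermal conductivity [W/(m*K)], T: temperature [K]."""
    return S * S * sigma * T / kappa

# Illustrative (assumed) half-Heusler-like transport values:
zt = figure_of_merit(S=225e-6, sigma=1.0e5, kappa=6.5, T=775.0)
```

The S² dependence is why the constant thermopower of the SS series matters: with S fixed, ZT gains must come from raising σ or suppressing κ_l, which is exactly the trade-off the mechanical-alloying step manipulates.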

  14. Follow-up of the fate of imazalil from post-harvest lemon surface treatment to a baking experiment.

    PubMed

    Vass, Andrea; Korpics, Evelin; Dernovics, Mihály

    2015-01-01

    Imazalil is one of the most widespread fungicides used for the post-harvest treatment of citrus species. The separate use of peel during food preparation and processing may thereby concentrate most of the imazalil into food products, for which specific maximum residue limits hardly exist. In order to monitor the path of imazalil comprehensively, our study covered the monitoring of the efficiency of several washing treatments; the comparison of operative and related sample preparation methods for the lemon samples; the validation of a sample preparation technique for a fatty cake matrix; the preparation of a model cake sample made separately either with imazalil-containing lemon peel or with imazalil spiking; the monitoring of imazalil degradation into α-(2,4-dichlorophenyl)-1H-imidazole-1-ethanol during the baking process; and finally the mass balance of imazalil throughout the washing experiments and the baking process. Quantification of imazalil was carried out with an LC-ESI-MS/MS set-up, while LC-QTOF was used for the monitoring of imazalil degradation. Concerning the washing, none of the five washing protocols addressed could remove more than 30% of the imazalil from the surface of the lemon samples. The study revealed a significant difference between the extraction efficiencies of imazalil by the EN 15662:2008 and AOAC 2007.1 methods, to the advantage of the former. The use of the model cake sample helped to validate a modified version of the EN 15662:2008 method that included a freeze-out step to efficiently recover imazalil (>90%) from the fatty cake matrix. The degradation of imazalil during the baking process was significantly higher when this analyte was spiked into the cake matrix than when the cake was prepared with imazalil-containing lemon peel (52% vs. 22%). This observation calls attention to the careful evaluation of pesticide stability data that are based on solution spiking experiments.

  15. VNIR hyperspectral background characterization methods in adverse weather conditions

    NASA Astrophysics Data System (ADS)

    Romano, João M.; Rosario, Dalton; Roth, Luz

    2009-05-01

    Hyperspectral technology is currently used by the military to detect regions of interest where potential targets may be located. Weather variability, however, may affect an algorithm's ability to discriminate possible targets from background clutter. Nonetheless, different background characterization approaches may facilitate target discrimination over a variety of weather conditions. In a previous paper, we introduced a new autonomous, target-size-invariant background characterization process, the Autonomous Background Characterization (ABC), also known as the Parallel Random Sampling (PRS) method, which features a random sampling stage; a parallel process to mitigate the inclusion by chance of target samples into clutter background classes during random sampling; and a fusion of results at the end. In this paper, we demonstrate how different background characterization approaches are able to improve algorithm performance over a variety of challenging weather conditions. Using the Mahalanobis distance as the standard algorithm for this study, we compare the performance of different characterization methods: global information, two-stage global information, and our proposed method, ABC, using data collected under a variety of adverse weather conditions. For this study, we used ARDEC's Hyperspectral VNIR Adverse Weather data collection, comprising heavy, light, and transitional fog, light and heavy rain, and low-light conditions.
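
    The Mahalanobis-distance detector used as the baseline above scores each pixel by its covariance-weighted distance to a background class. A minimal two-band sketch of that score (real VNIR data has many bands, and the mean/covariance values here are invented for illustration; this is not the ABC/PRS implementation):

```python
# Squared Mahalanobis distance of a pixel spectrum x to a background class
# with mean mu and covariance cov (2x2 here for brevity). Pixels with a
# large distance are flagged as potential targets.
def mahalanobis_sq(x, mu, cov):
    d = [x[0] - mu[0], x[1] - mu[1]]
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    inv = [[ cov[1][1] / det, -cov[0][1] / det],
           [-cov[1][0] / det,  cov[0][0] / det]]
    # d^T * inv(cov) * d
    return (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1]) +
            d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))

# hypothetical background statistics estimated from clutter samples
bg_mean = [0.0, 0.0]
bg_cov = [[2.0, 0.0], [0.0, 0.5]]
print(mahalanobis_sq([2.0, 1.0], bg_mean, bg_cov))  # -> 4.0
```

    The covariance weighting is what makes the background characterization matter: the same pixel can score very differently depending on which clutter class supplies mu and cov.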

  16. Further improvement of hydrostatic pressure sample injection for microchip electrophoresis.

    PubMed

    Luo, Yong; Zhang, Qingquan; Qin, Jianhua; Lin, Bingcheng

    2007-12-01

    The hydrostatic pressure sample injection method minimizes the number of electrodes needed for a microchip electrophoresis process; however, it can neither be applied to electrophoretic DNA sizing nor be implemented on the widely used single-cross microchip. This paper presents an injector design that makes the hydrostatic pressure sample injection method suitable for DNA sizing. By introducing an assistant channel into the normal double-cross injector, a rugged DNA sample plug suitable for sizing can be successfully formed within the cross area during sample loading. This paper also demonstrates that hydrostatic pressure sample injection can be performed in the single-cross microchip by controlling the radial position of the detection point in the separation channel. Rhodamine 123 and its derivative, used as model samples, were successfully separated.

  17. Estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean.

    PubMed

    Schillaci, Michael A; Schillaci, Mario E

    2009-02-01

    The use of small sample sizes in human and primate evolutionary research is commonplace. Estimating how well small samples represent the underlying population, however, is not commonplace. Because the accuracy of determinations of taxonomy, phylogeny, and evolutionary process are dependent upon how well the study sample represents the population of interest, characterizing the uncertainty, or potential error, associated with analyses of small sample sizes is essential. We present a method for estimating the probability that the sample mean is within a desired fraction of the standard deviation of the true mean using small (n < 10) or very small (n ≤ 5) sample sizes. This method can be used by researchers to determine post hoc the probability that their sample is a meaningful approximation of the population parameter. We tested the method using a large craniometric data set commonly used by researchers in the field. Given our results, we suggest that sample estimates of the population mean can be reasonable and meaningful even when based on small, and perhaps even very small, sample sizes.
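
    For normally distributed data the quantity above has a closed form: the sample mean is distributed N(μ, σ²/n), so the probability that it falls within k·σ of the true mean is 2Φ(k√n) − 1 = erf(k√(n/2)). A sketch of that calculation (the authors' treatment of very small n may differ in detail, e.g. by accounting for σ being estimated):

```python
import math

# P(|xbar - mu| < k * sigma) when xbar ~ N(mu, sigma^2 / n):
# standardizing gives P(|Z| < k * sqrt(n)) = 2*Phi(k*sqrt(n)) - 1,
# which equals erf(k * sqrt(n / 2)).
def prob_within(k, n):
    return math.erf(k * math.sqrt(n / 2.0))

for n in (2, 5, 10):
    print(n, round(prob_within(1.0, n), 4))
```

    Even n = 5 puts the sample mean within one population standard deviation of μ with probability above 0.97, which is consistent with the authors' conclusion that very small samples can still be meaningful.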

  18. Using the global positioning system to map disturbance patterns of forest harvesting machinery

    Treesearch

    T.P. McDonald; E.A. Carter; S.E. Taylor

    2002-01-01

    Abstract: A method was presented to transform sampled machine positional data obtained from a global positioning system (GPS) receiver into a two-dimensional raster map of number of passes as a function of location. The effects of three sources of error in the transformation process were investigated: path sampling rate (receiver sampling frequency);...

  19. Scalable approximate policies for Markov decision process models of hospital elective admissions.

    PubMed

    Zhu, George; Lizotte, Dan; Hoey, Jesse

    2014-05-01

    To demonstrate the feasibility of using stochastic simulation methods for the solution of a large-scale Markov decision process model of on-line patient admissions scheduling. The problem of admissions scheduling is modeled as a Markov decision process in which the states represent numbers of patients using each of a number of resources. We investigate current state-of-the-art real-time planning methods to compute solutions to this Markov decision process. Due to the complexity of the model, traditional model-based planners are limited in scalability, since they require an explicit enumeration of the model dynamics. To overcome this challenge, we apply sample-based planners along with efficient simulation techniques that, given an initial start state, generate an action on demand while avoiding portions of the model that are irrelevant to the start state. We also propose a novel variant of a popular sample-based planner that is particularly well suited to the elective admissions problem. Results show that the stochastic simulation methods allow the problem size to be scaled by a factor of almost 10 in the action space, and exponentially in the state space. We have demonstrated our approach on a problem with 81 actions, four specialities and four treatment patterns, and shown that we can generate near-optimal solutions in about 100 s. Sample-based planners are a viable alternative to state-based planners for large Markov decision process models of elective admissions scheduling. Copyright © 2014 Elsevier B.V. All rights reserved.
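
    The core idea of a sample-based planner is to estimate each action's value by simulating a few random trajectories from the current state, instead of enumerating the full transition model. The toy sketch below (a single ward with random discharges, an invented reward, and a plain rollout planner) illustrates that idea only; it is not the paper's hospital model or its planner variant:

```python
import random

CAPACITY, HORIZON, ROLLOUTS = 10, 5, 200

def step(occupied, admit, rng):
    # each occupying patient is discharged with probability 0.3 (toy dynamics);
    # reward = admissions minus a penalty for exceeding capacity
    discharges = sum(rng.random() < 0.3 for _ in range(occupied))
    occupied = occupied - discharges + admit
    overflow = max(0, occupied - CAPACITY)
    return min(occupied, CAPACITY), admit - 5 * overflow

def rollout_value(occupied, first_action, rng):
    total, action = 0, first_action
    for _ in range(HORIZON):
        occupied, reward = step(occupied, action, rng)
        total += reward
        action = rng.randint(0, 3)  # random default policy after the first step
    return total

def plan(occupied, rng):
    # pick the first action (0-3 admissions) with the best sampled return
    return max(range(4), key=lambda a: sum(
        rollout_value(occupied, a, rng) for _ in range(ROLLOUTS)))

rng = random.Random(0)
print("near-empty ward admits:", plan(2, rng))
print("full ward admits:", plan(10, rng))
```

    Note that `plan` never touches states unreachable from the current occupancy, which is exactly why sample-based planning scales where explicit enumeration does not.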

  20. Analysis of large soil samples for actinides

    DOEpatents

Maxwell, Sherrod L., III [Aiken, SC]

    2009-03-24

    A method of analyzing relatively large soil samples for actinides by employing a separation process that includes cerium fluoride precipitation, which removes the soil matrix and co-precipitates plutonium, americium, and curium with cerium and hydrofluoric acid, followed by separation of these actinides using chromatography cartridges.

  1. 40 CFR 63.1325 - Batch process vents-performance test methods and procedures to determine compliance.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    .... Cj=Average inlet or outlet concentration of TOC or sample organic HAP component j of the gas stream...), where standard temperature is 20 °C. Cj=Inlet or outlet concentration of TOC or sample organic HAP...

  2. 40 CFR 63.1325 - Batch process vents-performance test methods and procedures to determine compliance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Cj=Average inlet or outlet concentration of TOC or sample organic HAP component j of the gas stream...), where standard temperature is 20 °C. Cj=Inlet or outlet concentration of TOC or sample organic HAP...

  3. 40 CFR 63.1325 - Batch process vents-performance test methods and procedures to determine compliance.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Cj=Average inlet or outlet concentration of TOC or sample organic HAP component j of the gas stream...), where standard temperature is 20 °C. Cj=Inlet or outlet concentration of TOC or sample organic HAP...

  4. CONCEPTS AND APPROACHES FOR THE BIOASSESSMENT OF NON-WADEABLE STREAMS AND RIVERS

    EPA Science Inventory

    This document is intended to assist users in establishing or refining protocols, including the specific methods related to field sampling, laboratory sample processing, taxonomy, data entry, management and analysis, and final assessment and reporting. It also reviews and provide...

  5. Robust Abundance Estimation in Animal Abundance Surveys with Imperfect Detection

    EPA Science Inventory

    Surveys of animal abundance are central to the conservation and management of living natural resources. However, detection uncertainty complicates the sampling process of many species. One sampling method employed to deal with this problem is depletion (or removal) surveys in whi...

  6. Robust Abundance Estimation in Animal Surveys with Imperfect Detection

    EPA Science Inventory

    Surveys of animal abundance are central to the conservation and management of living natural resources. However, detection uncertainty complicates the sampling process of many species. One sampling method employed to deal with this problem is depletion (or removal) surveys in whi...

  7. Comparing concentration methods: parasitrap® versus Kato-Katz for studying the prevalence of Helminths in Bengo province, Angola.

    PubMed

    Mirante, Clara; Clemente, Isabel; Zambu, Graciette; Alexandre, Catarina; Ganga, Teresa; Mayer, Carlos; Brito, Miguel

    2016-09-01

    Helminth intestinal parasitoses are responsible for high levels of child mortality and morbidity. Hence, the capacity to diagnose these parasitoses and consequently ensure due treatment is of great importance. The main objective of this study was to compare two concentration methods, Parasitrap® and Kato-Katz, for the diagnosis of intestinal parasitoses in faecal samples. Sample processing used two different concentration methods: the commercial Parasitrap® method and the Kato-Katz method. We collected a total of 610 stool samples from pre-school and school-age children. The results show helminth parasites in 32.8% or 32.3% of the samples collected, depending on whether the concentration method applied was the Parasitrap method or the Kato-Katz method. We detected a relatively high percentage of samples testing positive for two or more species of helminth parasites. We would highlight that, in searching for larvae, the Kato-Katz method does not prove as appropriate as the Parasitrap method. Both techniques prove easily applicable even in field working conditions, returning mutually agreeing results. This study concludes in favour of the need for deworming programs and greater public awareness among the rural populations of Angola.

  8. Analysis of the influence of pasteurization, freezing/thawing, and offer processes on human milk's macronutrient concentrations.

    PubMed

    Vieira, Alan Araujo; Soares, Fernanda Valente Mendes; Pimenta, Hellen Porto; Abranches, Andrea Dunshee; Moreira, Maria Elisabeth Lopes

    2011-08-01

    The macronutrient concentrations of human milk can be influenced by the various processes used in a human milk bank. To determine the effect of various processes (Holder pasteurization, freezing/thawing, and feeding method) on the macronutrient concentrations of human milk. The samples of donated fresh human milk were studied before and after each process (Holder pasteurization, freezing and thawing, and feeding method) until their delivery to newborn infants. Fifty-seven raw human milk samples were analyzed in the first step (pasteurization) and 228 in the offer step. Repeated measurements of protein, fat and lactose content were made in samples of human milk using an infrared analyzer. The influence of the repeated processes on the mean concentration of macronutrients in donor human milk was analyzed by repeated-measures ANOVA, using the R statistical package. The most variable macronutrient concentration in the analyzed samples was fat (reduction of 59%). There was a significant reduction of mean fat and protein concentrations following pasteurization (5.5 and 3.9%, respectively). The speed at which the milk was thawed did not cause a significant variation in the macronutrient concentrations. However, continuous-infusion delivery significantly reduced the fat concentration. When the influence of the repeated processes was analyzed, the fat and protein concentrations varied significantly (reductions of 56.6% and 10.1%, respectively) (P<0.05). Lactose did not undergo significant reductions at any step. The repeated processes to which donor human milk is submitted before delivery to newborn infants cause a reduction in the fat and protein concentrations. The magnitude of this decrease is greater for the fat concentration and needs to be considered when this processed milk is used to feed preterm infants. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Advancing microwave technology for dehydration processing of biologics.

    PubMed

    Cellemme, Stephanie L; Van Vorst, Matthew; Paramore, Elisha; Elliott, Gloria D

    2013-10-01

    Our prior work has shown that microwave processing can be effective as a method for dehydrating cell-based suspensions in preparation for anhydrous storage, yielding homogenous samples with predictable and reproducible drying times. In the current work an optimized microwave-based drying process was developed that expands upon this previous proof-of-concept. Utilization of a commercial microwave (CEM SAM 255, Matthews, NC) enabled continuous drying at variable low power settings. A new turntable was manufactured from Ultra High Molecular Weight Polyethylene (UHMW-PE; Grainger, Lake Forest, IL) to provide for drying of up to 12 samples at a time. The new process enabled rapid and simultaneous drying of multiple samples in containment devices suitable for long-term storage and aseptic rehydration of the sample. To determine sample repeatability and consistency of drying within the microwave cavity, a concentration series of aqueous trehalose solutions was dried for specific intervals and water content assessed using Karl Fischer titration at the end of each processing period. Samples were dried on Whatman S-14 conjugate release filters (Whatman, Maidstone, UK), a glass fiber membrane currently used in clinical laboratories. The filters were cut to size for use in a 13 mm Swinnex(®) syringe filter holder (Millipore(™), Billerica, MA). Samples of 40 μL volume could be dehydrated to the equilibrium moisture content by continuous processing at 20% power with excellent sample-to-sample repeatability. The microwave-assisted procedure enabled high-throughput, repeatable drying of multiple samples, in a manner easily adaptable for drying a wide array of biological samples. Depending on the tolerance for sample heating, the drying time can be altered by changing the power level of the microwave unit.
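
    The trade-off noted in the last sentence can be illustrated with a standard first-order drying model, in which moisture approaches its equilibrium value exponentially and the rate constant scales with applied power. The model form and every number below are assumptions for illustration, not data from this study:

```python
import math

# First-order drying model (illustrative): m(t) = m_eq + (m0 - m_eq)*exp(-k*t),
# with m = moisture fraction, t in minutes, and rate constant k set by power.
def moisture(t_min, m0=0.60, m_eq=0.05, k=0.35):
    return m_eq + (m0 - m_eq) * math.exp(-k * t_min)

def time_to_reach(target, m0=0.60, m_eq=0.05, k=0.35):
    # invert the model: t = ln((m0 - m_eq) / (target - m_eq)) / k
    return math.log((m0 - m_eq) / (target - m_eq)) / k

t = time_to_reach(0.10)
print(round(t, 2), "min to reach 10% moisture at this (assumed) rate")
print(round(time_to_reach(0.10, k=0.70), 2), "min if doubling power doubles k")
```

    Under this assumed model, doubling the rate constant halves the drying time, matching the abstract's observation that drying time can be tuned via the power level.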

  10. Preliminary study of ultrasonic structural quality control of Swiss-type cheese.

    PubMed

    Eskelinen, J J; Alavuotunki, A P; Haeggström, E; Alatossava, T

    2007-09-01

    There is demand for a new nondestructive cheese-structure analysis method for Swiss-type cheese. Such a method would provide the cheese-making industry the means to enhance process control and quality assurance. This paper presents a feasibility study on ultrasonic monitoring of the structural quality of Swiss cheese by using a single-transducer 2-MHz longitudinal mode pulse-echo setup. A volumetric ultrasonic image of a cheese sample featuring gas holes (cheese-eyes) and defects (cracks) in the scan area is presented. The image is compared with an optical reference image constructed from dissection images of the same sample. The results show that the ultrasonic method is capable of monitoring the gas-solid structure of the cheese during the ripening process. Moreover, the method can be used to detect and to characterize cheese-eyes and cracks in ripened cheese. Industrial application demands were taken into account when conducting the measurements.

  11. Thermophysical Properties Measurements of Zr62Cu20Al10Ni8

    NASA Technical Reports Server (NTRS)

    Bradshaw, Richard C.; Waren, Mary; Rogers, Jan R.; Rathz, Thomas J.; Gangopadhyay, Anup K.; Kelton, Ken F.; Hyers, Robert W.

    2006-01-01

    Thermophysical property studies performed at high temperature can prove challenging because of reactivity problems brought on by the elevated temperatures. Contaminants from measuring devices and container walls can cause changes in properties. To prevent this, containerless processing techniques can be employed to isolate a sample during study. A common method used for this is levitation. Typical levitation methods used for containerless processing are aerodynamic, electromagnetic, and electrostatic. All levitation methods reduce heterogeneous nucleation sites, which in turn provide access to metastable undercooled phases. In particular, electrostatic levitation is appealing because sample motion and stirring are minimized, and by combining it with optically based non-contact measuring techniques, many thermophysical properties can be measured. Applying some of these techniques, surface tension, viscosity and density have been measured for the glass-forming alloy Zr62Cu20Al10Ni8 and will be presented with a brief overview of the non-contact measuring method used.

  12. The association between ruminative thinking and negative interpretation bias in social anxiety.

    PubMed

    Badra, Marcel; Schulze, Lars; Becker, Eni S; Vrijsen, Janna Nonja; Renneberg, Babette; Zetsche, Ulrike

    2017-09-01

    Cognitive models propose that both negative interpretations of ambiguous social situations and ruminative thoughts about social events contribute to the maintenance of social anxiety disorder. It has further been postulated that ruminative thoughts fuel biased negative interpretations; however, evidence is scarce. The present study used a multi-method approach to assess ruminative processing following a social interaction (post-event processing by self-report questionnaire and social rumination by experience sampling method) and negative interpretation bias (via two separate tasks) in a student sample (n = 51) screened for high (HSA) and low social anxiety (LSA). Results support the hypothesis that group differences in negative interpretations of ambiguous social situations in HSAs vs. LSAs are mediated by higher levels of post-event processing assessed in the questionnaire. Exploratory analyses highlight the potential role of comorbid depressive symptoms. The current findings help to advance the understanding of the association between two cognitive processes involved in social anxiety and stress the importance of ruminative post-event processing.
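
    A common way to quantify the mediation claimed above is the product-of-coefficients approach: a is the slope of the mediator M on the predictor X, b is the partial slope of the outcome Y on M controlling for X, and the indirect effect is a·b. The sketch below uses synthetic data with known effects; it does not reproduce the study's actual analysis (which would typically add significance tests or bootstrapped confidence intervals):

```python
import random

def slope(y, x):
    # univariate OLS slope
    n = len(x); mx = sum(x) / n; my = sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

def ols2(y, x1, x2):
    # two-predictor OLS (with intercept) via the normal equations; returns b1, b2
    n = len(y)
    m1, m2, my = sum(x1) / n, sum(x2) / n, sum(y) / n
    s11 = sum((a - m1) ** 2 for a in x1)
    s22 = sum((a - m2) ** 2 for a in x2)
    s12 = sum((a - m1) * (b - m2) for a, b in zip(x1, x2))
    s1y = sum((a - m1) * (b - my) for a, b in zip(x1, y))
    s2y = sum((a - m2) * (b - my) for a, b in zip(x2, y))
    det = s11 * s22 - s12 ** 2
    return (s1y * s22 - s2y * s12) / det, (s2y * s11 - s1y * s12) / det

rng = random.Random(42)
X = [rng.gauss(0, 1) for _ in range(500)]                        # e.g. social anxiety
M = [0.6 * x + rng.gauss(0, 1) for x in X]                       # e.g. post-event processing
Y = [0.5 * m + 0.1 * x + rng.gauss(0, 1) for x, m in zip(X, M)]  # e.g. interpretation bias
a = slope(M, X)
b, _ = ols2(Y, M, X)
print(round(a * b, 2))  # indirect effect; close to the true 0.6 * 0.5 = 0.3
```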

  13. Soft sensor development for Mooney viscosity prediction in rubber mixing process based on GMMDJITGPR algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Kai; Chen, Xiangguang; Wang, Li; Jin, Huaiping

    2017-01-01

    In the rubber mixing process, the key quality parameter (Mooney viscosity) can only be obtained offline with a 4-6 h delay. On-line estimation of this parameter would therefore be quite valuable to industry. Various data-driven soft sensors have been used for prediction in rubber mixing; however, they often perform poorly due to the multi-phase and nonlinear nature of the process. The purpose of this paper is to develop an efficient soft-sensing algorithm to solve this problem. Based on the proposed GMMD local sample selection criterion, phase information is extracted during local modeling. Using the Gaussian local modeling method within a just-in-time (JIT) learning framework, the nonlinearity of the process is well handled. The efficiency of the new method is verified by comparing its performance with various mainstream soft sensors, using samples from a real industrial rubber mixing process.
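
    The just-in-time idea is to defer modeling until a query arrives: select the historical samples most similar to the current process conditions and fit a local model on those alone. The sketch below uses a distance-weighted average as a stand-in for the paper's Gaussian process regression, on synthetic data; the GMMD selection criterion itself is not reproduced:

```python
import math

# JIT local modeling sketch: for each query, pick the k most similar
# historical samples and predict from them only.
def jit_predict(query, X_hist, y_hist, k=5, h=1.0):
    # similarity = Euclidean distance in input space
    nearest = sorted((math.dist(query, x), y)
                     for x, y in zip(X_hist, y_hist))[:k]
    weights = [math.exp(-(d / h) ** 2) for d, _ in nearest]
    return sum(w * y for w, (_, y) in zip(weights, nearest)) / sum(weights)

# toy "mixing" history: output is a nonlinear function of two inputs
X_hist = [(i / 10.0, j / 10.0) for i in range(11) for j in range(11)]
y_hist = [math.sin(3 * a) + b ** 2 for a, b in X_hist]

query = (0.52, 0.48)
pred = jit_predict(query, X_hist, y_hist)
true = math.sin(3 * 0.52) + 0.48 ** 2
print(round(pred, 3), "vs true", round(true, 3))
```

    Because each prediction uses only neighbors of the query, the global nonlinearity (and any phase structure) never has to be captured by a single model, which is the appeal of the JIT framework.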

  14. Discretization of Continuous Time Discrete Scale Invariant Processes: Estimation and Spectra

    NASA Astrophysics Data System (ADS)

    Rezakhah, Saeid; Maleki, Yasaman

    2016-07-01

    By imposing a flexible sampling scheme, we provide a discretization of continuous-time discrete scale invariant (DSI) processes, which is itself a subsidiary discrete-time DSI process. Then, by introducing a simple random measure, we provide a second continuous-time DSI process which properly approximates the first one. This enables us to establish a bilateral relation between the covariance functions of the subsidiary process and the new continuous-time process. The time-varying spectral representation of such a continuous-time DSI process is characterized, and its spectrum is estimated. Also, a new method for estimating the time-dependent Hurst parameter of such processes is provided, which gives a more accurate estimation. The performance of this estimation method is studied via simulation. Finally, the method is applied to real data of the S&P 500 and Dow Jones indices for some special periods.
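
    As background for the Hurst estimation mentioned above: for a self-similar process, the variance of increments scales with the lag as Var[X(t+τ) − X(t)] ∝ τ^(2H), so a log-log regression of lag variance on lag recovers H. The simple variance-scaling estimator below is a generic sketch of this idea, not the paper's (more elaborate, time-dependent) estimator:

```python
import math, random

def hurst(series, lags=(1, 2, 4, 8, 16)):
    # log(Var of increments) regressed on log(lag); slope = 2H
    xs, ys = [], []
    for lag in lags:
        diffs = [series[i + lag] - series[i] for i in range(len(series) - lag)]
        m = sum(diffs) / len(diffs)
        var = sum((d - m) ** 2 for d in diffs) / len(diffs)
        xs.append(math.log(lag)); ys.append(math.log(var))
    n = len(xs); mx = sum(xs) / n; my = sum(ys) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
             / sum((a - mx) ** 2 for a in xs))
    return slope / 2.0

# sanity check on a plain Gaussian random walk, for which H = 0.5
rng = random.Random(1)
walk = [0.0]
for _ in range(20000):
    walk.append(walk[-1] + rng.gauss(0, 1))
print(round(hurst(walk), 2))  # close to 0.5
```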

  15. Advances in spectroscopic methods for quantifying soil carbon

    USDA-ARS?s Scientific Manuscript database

    The gold standard for soil C determination is combustion. However, this method requires expensive consumables, is limited to determination of total carbon, and is limited in the number of samples which can be processed (~100/d). With increased interest in soil C sequestration, faster methods are needed....

  16. Automated high-throughput purification of genomic DNA from plant leaf or seed using MagneSil paramagnetic particles

    NASA Astrophysics Data System (ADS)

    Bitner, Rex M.; Koller, Susan C.

    2004-06-01

    Three different methods of automated high-throughput purification of genomic DNA from plant materials processed in 96-well plates are described. One method uses MagneSil paramagnetic particles to purify DNA present in single leaf punch samples or small seed samples, using 320 µl capacity 96-well plates, which minimizes reagent and plate costs. A second method uses 2.2 ml and 1.2 ml capacity plates and allows the purification of larger amounts of DNA from 5-6 punches of material or larger amounts of seeds. The third method uses the MagneSil ONE purification system to purify a fixed amount of DNA, thus simplifying downstream applications by normalizing the amounts of DNA so that they do not require quantitation. Protocols for the purification of a fixed yield of DNA, e.g. 1 µg, from plant leaf or seed samples using MagneSil paramagnetic particles and a Beckman-Coulter BioMek FX robot are described. DNA from all three methods is suitable for applications such as PCR, RAPD, STR, READIT SNP analysis, and multiplexed PCR systems. The MagneSil ONE system is also suitable for use with SNP detection systems such as Third Wave Technology's Invader methods.

  17. Statistical Inference on Memory Structure of Processes and Its Applications to Information Theory

    DTIC Science & Technology

    2016-05-12

    ...proved. Second, a statistical method is developed to estimate the memory depth of discrete-time, continuously-valued time series from a sample (a practical algorithm to compute the estimator is a work in progress). Third, finitely-valued spatial processes... Keywords: mathematical statistics; time series; Markov chains; random processes.

  18. Experimental investigation and optimization of welding process parameters for various steel grades using NN tool and Taguchi method

    NASA Astrophysics Data System (ADS)

    Soni, Sourabh Kumar; Thomas, Benedict

    2018-04-01

    The term "weldability" has been used to describe a wide variety of characteristics when a material is subjected to welding. In our analysis, we experimentally estimate the tensile strength of welded joints and then optimize the welding process parameters using the Taguchi method and an Artificial Neural Network (ANN) tool, in MINITAB and MATLAB software respectively. The study reveals, through mechanical characterization, the influence of steel composition on weldability. First, we prepared samples of different grades of steel (EN8, EN 19, EN 24). The samples were welded together by the metal inert gas welding process, and tensile testing was then conducted on a universal testing machine (UTM) to evaluate the tensile strength of the welded steel specimens. A further comparative study was performed to find the effects of the welding parameters on weld strength by employing the Taguchi method and the neural network tool. Finally, we conclude that the Taguchi method combined with the neural network tool is an efficient technique for optimization.
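
    In a Taguchi analysis of tensile strength, each factor level is scored with a "larger-the-better" signal-to-noise ratio, S/N = −10·log10(mean(1/yᵢ²)), computed from replicated measurements, and the level with the highest S/N is preferred. A minimal sketch; the factor (welding current), its levels, and all strength values are invented for illustration:

```python
import math

# Taguchi larger-the-better signal-to-noise ratio:
# S/N = -10 * log10( mean(1 / y_i^2) ) over replicate responses y_i.
def sn_larger_better(ys):
    return -10 * math.log10(sum(1 / y ** 2 for y in ys) / len(ys))

# hypothetical tensile strengths (MPa) at three levels of one factor
trials = {
    "100 A": [410, 430, 420],
    "120 A": [470, 455, 480],
    "140 A": [440, 425, 450],
}
sn = {level: sn_larger_better(ys) for level, ys in trials.items()}
best = max(sn, key=sn.get)
print(best)  # -> 120 A
```

    In a full Taguchi design the same per-level S/N averaging is repeated for every factor in the orthogonal array, and the best level of each factor is combined into the predicted optimum.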

  19. Spatially resolved imaging of opto-electrical property variations

    DOEpatents

    Nikiforov, Maxim; Darling, Seth B; Suzer, Ozgun; Guest, Jeffrey; Roelofs, Andreas

    2014-09-16

    Systems and methods for measuring opto-electrical properties are provided. A light source illuminates a sample. A reference detector senses light from the light source. A sample detector receives light from the sample. A positioning fixture allows for relative positioning of the sample and the light source with respect to each other. An electrical signal device measures the electrical properties of the sample. The reference detector, sample detector and electrical signal device provide information that may be processed to determine opto-electrical properties of the sample.

  20. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    PubMed

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near-infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As the variability of the sampling method and the reference method are by nature included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses. © 2013 Published by Elsevier B.V.
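
    At its core, the calibration step maps a spectral response to an analyte content over a validated range. The sketch below uses a single-variable least-squares fit as a stand-in for the multivariate PLS models in the paper; the absorbance values are invented, and only the 9.0-12.0% w/w API range is taken from the abstract:

```python
# Single-variable least-squares calibration sketch (real NIR work uses
# multivariate PLS over full spectra; the absorbance data here are made up).
def fit_line(x, y):
    n = len(x); mx = sum(x) / n; my = sum(y) / n
    b = (sum((a - mx) * (c - my) for a, c in zip(x, y))
         / sum((a - mx) ** 2 for a in x))
    return b, my - b * mx  # slope, intercept

# hypothetical (absorbance, % w/w API) calibration pairs spanning the range
absorb = [0.30, 0.34, 0.38, 0.42, 0.46]
api    = [9.0, 9.8, 10.5, 11.2, 12.0]
m, c = fit_line(absorb, api)
pred = m * 0.40 + c  # predict API content from a new mid-range absorbance
print(round(pred, 2))
```

    In the PAT setting this inverse model runs on-line: each new spectrum yields a predicted API and solvent content, and the crystallization is seeded once the prediction crosses the predefined set point.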

  1. Wastewater treatment plants as a pathway for microplastics: Development of a new approach to sample wastewater-based microplastics.

    PubMed

    Ziajahromi, Shima; Neale, Peta A; Rintoul, Llew; Leusch, Frederic D L

    2017-04-01

    Wastewater effluent is expected to be a pathway for microplastics to enter the aquatic environment, with microbeads from cosmetic products and polymer fibres from clothes likely to enter wastewater treatment plants (WWTP). To date, few studies have quantified microplastics in wastewater. Moreover, the lack of a standardized and applicable method to identify microplastics in complex samples, such as wastewater, has limited the accurate assessment of microplastics and may lead to an incorrect estimation. This study aimed to develop a validated method to sample and process microplastics from wastewater effluent and to apply the developed method to quantify and characterise wastewater-based microplastics in effluent from three WWTPs that use primary, secondary and tertiary treatment processes. We applied a high-volume sampling device that fractionated microplastics in situ and an efficient sample processing procedure to improve the sampling of microplastics in wastewater and to minimize the false detection of non-plastic particles. The sampling device captured between 92% and 99% of polystyrene microplastics using 25 μm-500 μm mesh screens in laboratory tests. Microplastic type, size and suspected origin in all studied WWTPs, along with the removal efficiency during the secondary and tertiary treatment stages, was investigated. Suspected microplastics were characterised using Fourier Transform Infrared spectroscopy, with between 22 and 90% of the suspected microplastics found to be non-plastic particles. An average of 0.28, 0.48 and 1.54 microplastics per litre of final effluent was found in tertiary, secondary and primary treated effluent, respectively. This study suggests that although low concentrations of microplastics are detected in wastewater effluent, WWTPs still have the potential to act as a pathway to release microplastics given the large volumes of effluent discharged to the aquatic environment. 
This study focused on a single sampling campaign, with long-term monitoring recommended to further characterise microplastics in wastewater. Copyright © 2017 Elsevier Ltd. All rights reserved.
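    The reported per-litre concentrations translate into large absolute particle loads when scaled by plant discharge. A minimal sketch of that arithmetic, assuming a hypothetical effluent flow of 50 ML/day (the study does not state plant flow rates):

    ```python
    # Hedged sketch: daily microplastic discharge from an effluent
    # concentration, for a hypothetical plant flow (not from the study).
    def daily_discharge(conc_per_litre, flow_megalitres_per_day):
        """Microplastic particles released per day."""
        litres_per_day = flow_megalitres_per_day * 1e6
        return conc_per_litre * litres_per_day

    # Concentrations reported in the study (particles per litre of effluent)
    for label, conc in [("tertiary", 0.28), ("secondary", 0.48), ("primary", 1.54)]:
        print(label, f"{daily_discharge(conc, 50):.2e} particles/day")
    ```

    Even the lowest concentration (0.28 particles/L after tertiary treatment) corresponds to over ten million particles per day at this assumed flow, which is the point the abstract makes about discharge volume.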

  2. Determination of trace nickel in hydrogenated cottonseed oil by electrothermal atomic absorption spectrometry after microwave-assisted digestion.

    PubMed

    Zhang, Gai

    2012-01-01

    Microwave digestion of hydrogenated cottonseed oil prior to trace nickel determination by electrothermal atomic absorption spectrometry (ETAAS) is proposed here for the first time. Currently, the methods outlined in U.S. Pharmacopeia 28 (USP28) or British Pharmacopeia 2003 (BP2003) are recommended as the official methods for analyzing nickel in hydrogenated cottonseed oil. In these methods, the samples are pre-treated in a silica or a platinum crucible. However, samples are easily tarnished during pretreatment in a silica crucible, whereas in a platinum crucible the hydrogenated cottonseed oil, acting as a reducing material, may react with the platinum and destroy the crucible. The proposed microwave-assisted digestion avoids tarnishing of the sample during pretreatment and also shortens the analysis cycle. The microwave digestion program and the ETAAS parameters were optimized. The accuracy of the proposed method was investigated by analyzing real samples, and the results were compared with those obtained by pressurized-PTFE-bomb acid digestion and by the USP28 method. The new method offers relatively rapid matrix destruction compared with other current methods for the quantification of metals in oil. © 2011 Institute of Food Technologists®

  3. myPresto/omegagene: a GPU-accelerated molecular dynamics simulator tailored for enhanced conformational sampling methods with a non-Ewald electrostatic scheme.

    PubMed

    Kasahara, Kota; Ma, Benson; Goto, Kota; Dasgupta, Bhaskar; Higo, Junichi; Fukuda, Ikuo; Mashimo, Tadaaki; Akiyama, Yutaka; Nakamura, Haruki

    2016-01-01

    Molecular dynamics (MD) is a promising computational approach to investigate the dynamical behavior of molecular systems at the atomic level. Here, we present a new MD simulation engine named "myPresto/omegagene" that is tailored for enhanced conformational sampling methods with a non-Ewald electrostatic potential scheme. Our enhanced conformational sampling methods, e.g., the virtual-system-coupled multi-canonical MD (V-McMD) method, replace a multi-process parallelized run with multiple independent runs to avoid inter-node communication overhead. In addition, adopting the non-Ewald-based zero-multipole summation method (ZMM) makes it possible to eliminate the Fourier-space calculations altogether. The combination of these state-of-the-art techniques enables efficient and accurate calculation of the conformational ensemble at an equilibrium state. Taking advantage of these features, myPresto/omegagene is specialized for single-process execution on a graphics processing unit (GPU). We performed benchmark simulations for the 20-mer peptide Trp-cage with explicit solvent. One of the most thermodynamically stable conformations generated by the V-McMD simulation is very similar to an experimentally solved native conformation. Furthermore, the computation speed is four times faster than that of our previous simulation engine, myPresto/psygene-G. The new simulator, myPresto/omegagene, is freely available at the following URLs: http://www.protein.osaka-u.ac.jp/rcsfp/pi/omegagene/ and http://presto.protein.osaka-u.ac.jp/myPresto4/.

  4. Doppler Processing with Ultra-Wideband (UWB) Radar Revisited

    DTIC Science & Technology

    2018-01-01

    grating lobes as compared to the conventional Doppler processing counterpart. Subject terms: Doppler radar, UWB radar, matched filter, ambiguity... maps by the matched filter method, illustrating the radar data support in (a) the frequency-slow time domain and (b) the ρ-u domain. The samples... example, obtained by the matched filter method, for a 1.2-s CPI centered at t = 1.5 s

  5. A sub-sampled approach to extremely low-dose STEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stevens, A.; Luzi, L.; Yang, H.

    The inpainting of randomly sub-sampled images acquired by scanning transmission electron microscopy (STEM) is an attractive method for imaging under low-dose conditions (≤ 1 e⁻ Å⁻²) without changing either the operation of the microscope or the physics of the imaging process. We show that 1) adaptive sub-sampling increases acquisition speed, resolution, and sensitivity; and 2) random (non-adaptive) sub-sampling is equivalent to, but faster than, traditional low-dose techniques. Adaptive sub-sampling opens numerous possibilities for the analysis of beam-sensitive materials and in-situ dynamic processes at the resolution limit of the aberration-corrected microscope and is demonstrated here for the analysis of the node distribution in metal-organic frameworks (MOFs).
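    The random sub-sampling step can be illustrated with a toy mask-and-fill sketch. Note this is not the study's reconstruction: real STEM inpainting uses sparse (dictionary-learning or compressed-sensing) reconstruction, so the global-mean fill below is only a placeholder for the inpainting stage.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def random_subsample(image, fraction):
        """Keep roughly `fraction` of pixels (simulating a low-dose scan);
        unmeasured pixels are marked NaN."""
        mask = rng.random(image.shape) < fraction
        sampled = np.where(mask, image, np.nan)
        return sampled, mask

    def inpaint_naive(sampled, mask):
        """Placeholder inpainting: fill missing pixels with the mean of the
        measured ones. Real reconstructions use sparse dictionary methods."""
        fill = np.nanmean(sampled)
        return np.where(mask, sampled, fill)

    image = rng.random((64, 64))                 # stand-in for a STEM frame
    sampled, mask = random_subsample(image, 0.2)  # ~20% of the full dose
    recon = inpaint_naive(sampled, mask)
    print("measured fraction:", mask.mean())
    ```

    The dose reduction comes directly from the mask: only the `True` pixels are ever exposed to the beam, which is why sub-sampling needs no change to the microscope optics.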

  6. Endoscopic ultrasound guided fine needle aspiration and useful ancillary methods

    PubMed Central

    Tadic, Mario; Stoos-Veic, Tajana; Kusec, Rajko

    2014-01-01

    The role of endoscopic ultrasound (EUS) in evaluating pancreatic pathology has been well documented from the beginning of its clinical use. High spatial resolution and the close proximity to the evaluated organs within the mediastinum and abdominal cavity allow detection of small focal lesions and precise tissue acquisition from suspected lesions within the reach of this method. Fine needle aspiration (FNA) is considered of additional value to EUS and is performed to obtain tissue diagnosis. Tissue acquisition from suspected lesions for cytological or histological analysis allows, not only the differentiation between malignant and non-malignant lesions, but, in most cases, also the accurate distinction between the various types of malignant lesions. It is well documented that the best results are achieved only if an adequate sample is obtained for further analysis, if the material is processed in an appropriate way, and if adequate ancillary methods are performed. This is a multi-step process and could be quite a challenge in some cases. In this article, we discuss the technical aspects of tissue acquisition by EUS-guided-FNA (EUS-FNA), as well as the role of an on-site cytopathologist, various means of specimen processing, and the selection of the appropriate ancillary method for providing an accurate tissue diagnosis and maximizing the yield of this method. The main goal of this review is to alert endosonographers, not only to the different possibilities of tissue acquisition, namely EUS-FNA, but also to bring to their attention the importance of proper sample processing in the evaluation of various lesions in the gastrointestinal tract and other accessible organs. All aspects of tissue acquisition (needles, suction, use of stylet, complications, etc.) have been well discussed lately. Adequate tissue samples enable comprehensive diagnoses, which answer the main clinical questions, thus enabling targeted therapy. PMID:25339816

  7. Finite element simulation of the T-shaped ECAP processing of round samples

    NASA Astrophysics Data System (ADS)

    Shaban Ghazani, Mehdi; Fardi-Ilkhchy, Ali; Binesh, Behzad

    2018-05-01

    Grain refinement is the only mechanism that increases the yield strength and toughness of materials simultaneously. Severe plastic deformation is one of the promising methods to refine the microstructure of materials. Among the different severe plastic deformation processes, T-shaped equal channel angular pressing (T-ECAP) is a relatively new technique. In the present study, finite element analysis was conducted to evaluate the deformation behavior of metals during the T-ECAP process. The study focused mainly on flow characteristics, plastic strain distribution and its homogeneity, damage development, and pressing force, which are among the most important factors governing the sound and successful processing of nanostructured materials by severe plastic deformation techniques. The results showed that plastic strain is localized on the bottom side of the sample and that uniform deformation is not possible with T-ECAP processing. The friction coefficient between the sample and the die channel wall has little effect on the strain distributions in the mirror plane and the transverse plane of the deformed sample. Damage analysis showed that superficial cracks may initiate from the bottom side of the sample, and their propagation is limited by the compressive state of stress. It was also demonstrated that a V-shaped deformation zone exists in the T-ECAP process and that the pressing load needed to execute the deformation process increases with friction.

  8. The Effect of Temperature and Rotational Speed on Structure and Mechanical Properties of Cast Cu Base Alloy (Cu-Al-Si-Fe) Welded by Semisolid Stir Joining Method

    NASA Astrophysics Data System (ADS)

    Ferasat, Keyvan; Aashuri, Hossein; Kokabi, Amir Hossein; Shafizadeh, Mahdi; Nikzad, Siamak

    2015-12-01

    Semisolid stir joining has been under consideration as a possible method for joining copper alloys. In this study, the effects of temperature and stirrer rotational speed on macrostructure evolution and mechanical properties of the samples were investigated. Optical microscopy and X-ray diffraction were performed for macro- and microstructural analysis. A uniform micro-hardness profile was attained by the semisolid stir joining method. The ultimate shear strength and bending strength of the welded samples were improved in comparison with the cast sample, and the welded samples also showed lower area porosity than the cast metal. The mechanical properties improved with increasing temperature and rotational speed during the joining process.

  9. New method for detection of gastric cancer by hyperspectral imaging: a pilot study

    NASA Astrophysics Data System (ADS)

    Kiyotoki, Shu; Nishikawa, Jun; Okamoto, Takeshi; Hamabe, Kouichi; Saito, Mari; Goto, Atsushi; Fujita, Yusuke; Hamamoto, Yoshihiko; Takeuchi, Yusuke; Satori, Shin; Sakaida, Isao

    2013-02-01

    We developed a new, easy, and objective method to detect gastric cancer using hyperspectral imaging (HSI) technology combining spectroscopy and imaging. A total of 16 gastroduodenal tumors removed by endoscopic resection or surgery from 14 patients at Yamaguchi University Hospital, Japan, were recorded using a hyperspectral camera (HSC) equipped with HSI technology. Corrected spectral reflectance was obtained from 10 samples of normal mucosa and 10 samples of tumors for each case. The 16 cases were divided into eight training cases (160 training samples) and eight test cases (160 test samples). We established a diagnostic algorithm with the training samples and evaluated it with the test samples. Diagnostic capability of the algorithm for each tumor was validated, and enhancement of tumors by image processing using the HSC was evaluated. The diagnostic algorithm used the 726-nm wavelength, with a cutoff point established from the training samples. The sensitivity, specificity, and accuracy rates of the algorithm's diagnostic capability in the test samples were 78.8% (63/80), 92.5% (74/80), and 85.6% (137/160), respectively. Tumors in HSC images of 13 (81.3%) cases were well enhanced by image processing. Differences in spectral reflectance between tumors and normal mucosa suggested that tumors can be clearly distinguished from background mucosa with HSI technology.
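    The reported sensitivity, specificity, and accuracy follow directly from the stated counts (63 of 80 tumour samples and 74 of 80 normal-mucosa samples classified correctly); a quick check of that arithmetic:

    ```python
    def diagnostic_metrics(tp, fn, tn, fp):
        """Standard binary classification metrics from confusion-matrix counts."""
        sensitivity = tp / (tp + fn)                  # true positive rate
        specificity = tn / (tn + fp)                  # true negative rate
        accuracy = (tp + tn) / (tp + fn + tn + fp)    # overall agreement
        return sensitivity, specificity, accuracy

    # Counts reported for the 160 test samples in the abstract
    sens, spec, acc = diagnostic_metrics(tp=63, fn=17, tn=74, fp=6)
    print(f"sensitivity {sens:.1%}, specificity {spec:.1%}, accuracy {acc:.1%}")
    ```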

  10. Comparison of Two Methods for the Isolation of Salmonellae From Imported Foods

    PubMed Central

    Taylor, Welton I.; Hobbs, Betty C.; Smith, Muriel E.

    1964-01-01

    Two methods for the detection of salmonellae in foods were compared in 179 imported meat and egg samples. The number of positive samples and replications, and the number of strains and kinds of serotypes were statistically comparable by both the direct enrichment method of the Food Hygiene Laboratory in England, and the pre-enrichment method devised for processed foods in the United States. Boneless frozen beef, veal, and horsemeat imported from five countries for consumption in England were found to have salmonellae present in 48 of 116 (41%) samples. Dried egg products imported from three countries were observed to have salmonellae in 10 of 63 (16%) samples. The high incidence of salmonellae isolated from imported foods illustrated the existence of an international health hazard resulting from the continuous introduction of exogenous strains of pathogenic microorganisms on a large scale. PMID:14106941

  11. Fish assemblages

    USGS Publications Warehouse

    McGarvey, Daniel J.; Falke, Jeffrey A.; Li, Hiram W.; Li, Judith; Hauer, F. Richard; Lamberti, G.A.

    2017-01-01

    Methods to sample fishes in stream ecosystems and to analyze the raw data, focusing primarily on assemblage-level (all fish species combined) analyses, are presented in this chapter. We begin with guidance on sample site selection, permitting for fish collection, and information-gathering steps to be completed prior to conducting fieldwork. Basic sampling methods (visual surveying, electrofishing, and seining) are presented with specific instructions for estimating population sizes via visual, capture-recapture, and depletion surveys, in addition to new guidance on environmental DNA (eDNA) methods. Steps to process fish specimens in the field including the use of anesthesia and preservation of whole specimens or tissue samples (for genetic or stable isotope analysis) are also presented. Data analysis methods include characterization of size-structure within populations, estimation of species richness and diversity, and application of fish functional traits. We conclude with three advanced topics in assemblage-level analysis: multidimensional scaling (MDS), ecological networks, and loop analysis.
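    Of the assemblage-level analyses mentioned above, species richness and Shannon diversity are the simplest to compute from catch data. A sketch with a hypothetical catch (species names and counts are illustrative, not from the chapter):

    ```python
    import math
    from collections import Counter

    def species_richness(counts):
        """Number of species with at least one individual."""
        return sum(1 for c in counts if c > 0)

    def shannon_diversity(counts):
        """Shannon index H' = -sum(p_i * ln(p_i)) over species proportions."""
        total = sum(counts)
        return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

    # Hypothetical counts per species from a single electrofishing pass
    catch = Counter(dace=12, sculpin=7, trout=3, sucker=1)
    print("richness:", species_richness(catch.values()))
    print("H':", round(shannon_diversity(catch.values()), 3))
    ```

    H' is maximized (at ln of the richness) when individuals are spread evenly across species, so the skewed counts here give a value well below ln(4).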

  12. Solid-phase extraction and GC-MS analysis of THC-COOH method optimized for a high-throughput forensic drug-testing laboratory.

    PubMed

    Stout, P R; Horn, C K; Klette, K L

    2001-10-01

    In order to facilitate the confirmation analysis of large numbers of urine samples previously screened positive for delta9-tetrahydrocannabinol (THC), an extraction, derivatization, and GC-MS analysis method was developed. This method utilized a positive-pressure manifold anion-exchange polymer-based solid-phase extraction followed by elution directly into the automated liquid sampling (ALS) vials. Rapid derivatization was accomplished using pentafluoropropionic anhydride/pentafluoropropanol (PFPA/PFPOH). Recoveries averaged 95%, with a limit of detection of 0.875 ng/mL for a 3-mL sample volume. The performance of the 11-nor-delta9-tetrahydrocannabinol-9-carboxylic acid (THC-COOH)-d3 and THC-COOH-d9 internal standards was evaluated. The method was linear to 900 ng/mL THC-COOH using THC-COOH-d9, with negligible contribution from the internal standard to very weak samples. Excellent agreement was seen with previous quantitations of human urine samples. More than 1000 human urine samples were analyzed using the method, with 300 samples analyzed using an alternate qualifier ion (m/z 622) after some interference was observed with a qualifier ion (m/z 489). The 622 ion did not exhibit any interference, even in samples with interfering peaks present in the 489 ion. The method resulted in dramatic reductions in processing time, waste production, and exposure hazards to laboratory personnel.

  13. Fatigue Behavior of Ultrafine-Grained 5052 Al Alloy Processed Through Different Rolling Methods

    NASA Astrophysics Data System (ADS)

    Yogesha, K. K.; Joshi, Amit; Jayaganthan, R.

    2017-05-01

    In the present study, 5052 Al alloy was processed through different rolling methods to obtain ultrafine grains, and its high-cycle fatigue behavior was investigated. The solution-treated Al-Mg alloy (AA 5052) was deformed by cryorolling (CR), cryo groove rolling (CGR), and cryo groove rolling followed by warm rolling (CGW), up to 75% thickness reduction. The deformed samples were subjected to mechanical testing such as hardness, tensile, and high-cycle fatigue (HCF) tests in stress-control mode. The CGW samples exhibit better HCF strength than the other conditions. The microstructure of the tested samples was characterized by optical microscopy, SEM fractography, and TEM to understand the deformation behavior of the deformed Al alloy. The improvement in fatigue life of the CR and CGR samples is due to effective grain refinement, subgrain formation, and the high dislocation density observed in the heavily deformed samples under cryogenic conditions, as observed from SEM and TEM analysis. In the case of the CGW samples, the formation of nanoshear bands accommodates the applied strain during cyclic loading, thereby facilitating dislocation accumulation along with subgrain formation and leading to the high fatigue life. The deformed or broken impurity-phase particles found in the deformed samples, along with the precipitates formed during warm rolling, also play a prominent role in enhancing the fatigue strength. These tiny particles hinder dislocation movement by effectively pinning it at grain boundaries, thereby improving resistance to crack propagation under cyclic load.

  14. Hurst Estimation of Scale Invariant Processes with Stationary Increments and Piecewise Linear Drift

    NASA Astrophysics Data System (ADS)

    Modarresi, N.; Rezakhah, S.

    The characteristic feature of discrete scale invariant (DSI) processes is the invariance of their finite-dimensional distributions under dilation by a certain scaling factor. A DSI process with piecewise linear drift and stationary increments inside prescribed scale intervals is introduced and studied. To identify the structure of the process, we first determine the scale intervals and their linear drifts and eliminate them. Then, a new method for the estimation of the Hurst parameter of such DSI processes is presented and applied to some period of the Dow Jones indices. This method is based on a fixed number of equally spaced samples inside successive scale intervals. We also present an efficient method for estimating the Hurst parameter of self-similar processes with stationary increments, and compare its performance with the celebrated FA, DFA and DMA methods on simulated data of fractional Brownian motion (fBm).
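    The general idea of estimating a Hurst parameter from the scaling of equally spaced samples can be illustrated with the classic aggregated-variance estimator, a simpler relative of FA/DFA/DMA and not the authors' method. For a stationary-increment process, the variance of block means of size m scales as m^(2H-2):

    ```python
    import numpy as np

    def hurst_aggvar(increments, block_sizes):
        """Aggregated-variance estimate of the Hurst exponent H.
        Fits log Var(block mean of size m) against log m; slope = 2H - 2."""
        logs_m, logs_v = [], []
        for m in block_sizes:
            n = len(increments) // m
            block_means = increments[: n * m].reshape(n, m).mean(axis=1)
            logs_m.append(np.log(m))
            logs_v.append(np.log(block_means.var()))
        slope = np.polyfit(logs_m, logs_v, 1)[0]
        return 1 + slope / 2

    rng = np.random.default_rng(1)
    # White noise = increments of ordinary Brownian motion, so H should be 0.5
    white = rng.standard_normal(100_000)
    H = hurst_aggvar(white, [4, 8, 16, 32, 64])
    print(f"estimated H ≈ {H:.2f}")
    ```

    Persistent (H > 0.5) or anti-persistent (H < 0.5) increments change the slope of the log-log fit, which is what all of these scaling-based estimators exploit.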

  15. Food adulteration analysis without laboratory prepared or determined reference food adulterant values.

    PubMed

    Kalivas, John H; Georgiou, Constantinos A; Moira, Marianna; Tsafaras, Ilias; Petrakis, Eleftherios A; Mousdis, George A

    2014-04-01

    Quantitative analysis of food adulterants is an important health and economic issue, and the analysis needs to be fast and simple. Spectroscopy has significantly reduced analysis time; however, preparing analyte calibration samples matrix-matched to the prediction samples is still required, which can be laborious and costly. Reported in this paper is the application of a newly developed pure component Tikhonov regularization (PCTR) process that does not require laboratory-prepared or reference analysis methods and, hence, is a greener calibration method. The PCTR method requires an analyte pure-component spectrum and non-analyte spectra. As a food analysis example, synchronous fluorescence spectra of extra virgin olive oil samples adulterated with sunflower oil are used. Results are shown to be better than those obtained using ridge regression with reference calibration samples. The flexibility of PCTR allows including reference samples, and the approach is generic for use with other instrumental methods and food products. Copyright © 2013 Elsevier Ltd. All rights reserved.
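    The Tikhonov machinery underlying PCTR can be sketched in its plain ridge form. This is not the PCTR algorithm itself (which replaces the calibration set with a pure-component spectrum and non-analyte spectra); it is only the underlying regularized solver, run on synthetic spectra with an illustrative analyte peak and interferent background:

    ```python
    import numpy as np

    def tikhonov_solve(X, y, lam):
        """Tikhonov (ridge) regression: b = (X'X + lam*I)^(-1) X'y."""
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    rng = np.random.default_rng(2)
    channels = np.arange(200)
    pure = np.exp(-0.5 * ((channels - 80) / 10) ** 2)        # analyte band (synthetic)
    background = np.exp(-0.5 * ((channels - 140) / 25) ** 2)  # interferent band (synthetic)

    conc = rng.uniform(0, 1, 30)                              # adulterant levels
    spectra = (np.outer(conc, pure) + 0.3 * background
               + 0.01 * rng.standard_normal((30, 200)))

    # Mean-centre (removes the constant background), solve, and predict
    Xc, yc = spectra - spectra.mean(axis=0), conc - conc.mean()
    b = tikhonov_solve(Xc, yc, lam=1e-3)
    pred = Xc @ b + conc.mean()
    print("max abs prediction error:", np.abs(pred - conc).max())
    ```

    The regularization parameter `lam` trades fidelity against noise amplification; PCTR's contribution is choosing the regularization targets so that no matrix-matched calibration set is needed.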

  16. Quality control for quantitative PCR based on amplification compatibility test.

    PubMed

    Tichopad, Ales; Bar, Tzachi; Pecen, Ladislav; Kitchen, Robert R; Kubista, Mikael; Pfaffl, Michael W

    2010-04-01

    Quantitative PCR (qPCR) is a routinely used method for the accurate quantification of nucleic acids. Yet it may generate erroneous results if the amplification process is obscured by inhibition or by generation of aberrant side-products such as primer dimers. Several methods have been established to control for pre-processing performance that rely on the introduction of a co-amplified reference sequence; however, there is currently no method that allows reliable control of the amplification process without directly modifying the sample mix. Herein we present a statistical approach based on multivariate analysis of the amplification response data generated in real time. The amplification trajectory in its most resolved and dynamic phase is fitted with a suitable model. Two parameters of this model, related to amplification efficiency, are then used to calculate Z-score statistics. Each studied sample is compared to a predefined reference set of reactions, typically calibration reactions. A probabilistic decision for each individual Z-score is then used to identify the majority of inhibited reactions in our experiments. We compare this approach to univariate methods that use only the sample-specific amplification efficiency as a reporter of compatibility, and demonstrate improved identification performance with the multivariate approach. Finally, we stress that the performance of the amplification compatibility test as a quality control procedure depends on the quality of the reference set. Copyright 2010 Elsevier Inc. All rights reserved.
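    The Z-score screening step can be sketched as follows. The paper's statistic is multivariate (two fitted model parameters); for brevity this sketch is univariate, and the efficiency values are hypothetical:

    ```python
    import statistics

    def zscore_flags(efficiencies, reference, threshold=3.0):
        """Flag reactions whose amplification efficiency deviates from a
        reference (calibration) set by more than `threshold` standard scores."""
        mu = statistics.mean(reference)
        sd = statistics.stdev(reference)
        return [abs(e - mu) / sd > threshold for e in efficiencies]

    reference = [1.92, 1.95, 1.90, 1.93, 1.94, 1.91, 1.93]  # calibration efficiencies
    samples = [1.93, 1.91, 1.60, 1.94]                       # third reaction inhibited
    print(zscore_flags(samples, reference))  # [False, False, True, False]
    ```

    As the abstract stresses, the screen is only as good as the reference set: a noisy or contaminated calibration set inflates `sd` and hides genuinely inhibited reactions.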

  17. Preparation of water and ice samples for 39Ar dating by atom trap trace analysis (ATTA)

    NASA Astrophysics Data System (ADS)

    Schwefel, R.; Reichel, T.; Aeschbach-Hertig, W.; Wagenbach, D.

    2012-04-01

    Atom trap trace analysis (ATTA) is a new and promising method to measure very rare noble gas radioisotopes in the environment. The applicability of this method for the dating of very old groundwater with 81Kr has already been demonstrated [1]. Recent developments now show its feasibility also for the analysis of 39Ar [2,3], which is an ideal dating tracer for the age range between 50 and 1000 years. This range is of interest in the fields of hydro(geo)logy, oceanography, and glaciology. We present preparation (gas extraction and Ar separation) methods for groundwater and ice samples for later analysis by the ATTA technique. For groundwater, the sample size is less of a limitation than for applications in oceanography or glaciology. Large samples are furthermore needed to enable a comparison with the classical method of 39Ar detection by low-level counting. Therefore, a system was built that enables gas extraction from several thousand liters of water using membrane contactors. This system provides degassing efficiencies greater than 80% and has successfully been tested in the field. Gas samples are further processed to separate a pure Ar fraction by a gas-chromatographic method based on Li-LSX zeolite as selective adsorber material at very low temperatures. The gas separation achieved by this system is controlled by a quadrupole mass spectrometer. It has successfully been tested and used on real samples. The separation efficiency was found to be strongly temperature dependent in the range of -118 to -130 °C. Since ATTA should enable the analysis of 39Ar on samples of less than 1 ccSTP of Ar (corresponding to about 100 ml of air, 2.5 l of water or 1 kg of ice), a method to separate Ar from small amounts of gas was developed. Titanium sponge was found to absorb 60 ccSTP of reactive gases per g of the getter material with reasonably high absorption rates at high operating temperatures (~800 °C). Good separation (higher than 92% Ar content in residual gas) was achieved by this gettering process. The other main remaining component is H2, which can be further reduced by operating the Ti getter at lower temperature. Furthermore, a system was designed to degas ice samples, followed by Ar separation by gettering. Ice from an alpine glacier was successfully processed on this system.

  18. Prevalence and counts of Salmonella spp. in minimally processed vegetables in São Paulo, Brazil.

    PubMed

    Sant'Ana, Anderson S; Landgraf, Mariza; Destro, Maria Teresa; Franco, Bernadette D G M

    2011-09-01

    Minimally processed vegetables (MPV) may be important vehicles of Salmonella spp. and cause disease. This study aimed at detecting and enumerating Salmonella spp. in MPV marketed in the city of São Paulo, Brazil. A total of 512 samples of MPV packages collected in retail stores were tested for Salmonella spp., with total coliforms and Escherichia coli as indicators of the hygienic status. Salmonella spp. was detected in four samples, two using the detection method and two using the counting method, where the results were 8.8 × 10² CFU/g and 2.4 × 10² CFU/g. The serovars were Salmonella Typhimurium (three samples) and Salmonella enterica subsp. enterica O:47:z4,z23:- (one sample). Fourteen samples (2.7%) presented counts of E. coli above the maximum limit established by the Brazilian regulation for MPV (10² CFU/g). Therefore, tightened surveillance and effective intervention strategies are necessary in order to address consumer and government concerns about the safety of MPV. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. Pore water sampling in acid sulfate soils: a new peeper method.

    PubMed

    Johnston, Scott G; Burton, Edward D; Keene, Annabelle F; Bush, Richard T; Sullivan, Leigh A; Isaacson, Lloyd

    2009-01-01

    This study describes the design, deployment, and application of a modified equilibration dialysis device (peeper) optimized for sampling pore waters in acid sulfate soils (ASS). The modified design overcomes the limitations of traditional-style peepers, when sampling firm ASS materials over relatively large depth intervals. The new peeper device uses removable, individual cells of 25 mL volume housed in a 1.5 m long rigid, high-density polyethylene rod. The rigid housing structure allows the device to be inserted directly into relatively firm soils without requiring a supporting frame. The use of removable cells eliminates the need for a large glove-box after peeper retrieval, thus simplifying physical handling. Removable cells are easily maintained in an inert atmosphere during sample processing and the 25-mL sample volume is sufficient for undertaking multiple analyses. A field evaluation of equilibration times indicates that 32 to 38 d of deployment was necessary. Overall, the modified method is simple and effective and well suited to acquisition and processing of redox-sensitive pore water profiles>1 m deep in acid sulfate soil or any other firm wetland soils.

  20. Rate Constant and Reaction Coordinate of Trp-Cage Folding in Explicit Water

    PubMed Central

    Juraszek, Jarek; Bolhuis, Peter G.

    2008-01-01

    We report rate constant calculations and a reaction coordinate analysis of the rate-limiting folding and unfolding process of the Trp-cage mini-protein in explicit solvent using transition interface sampling. Previous transition path sampling simulations revealed that in this (un)folding process the protein maintains its compact configuration while secondary structure is gained or lost. The calculated folding rate agrees reasonably with experiment, while the unfolding rate is 10 times higher; we discuss possible origins for this mismatch. We recomputed the rates with the forward flux sampling method and found a discrepancy of four orders of magnitude, probably caused by that method's higher sensitivity to the choice of order parameter with respect to transition interface sampling. Finally, we used the previously computed transition path sampling ensemble to screen combinations of many order parameters for the best model of the reaction coordinate by employing likelihood maximization. We found that a combination of the root mean-square deviation of the helix and of the entire protein was, of the set of order parameters tried, the one that best describes the reaction coordinate. PMID:18676648

  1. Insect pest management for raw commodities during storage

    USDA-ARS?s Scientific Manuscript database

    This book chapter provides an overview of the pest management decision-making process during grain storage. An in-depth discussion of sampling methods, cost-benefit analysis, expert systems, consultants and the use of computer simulation models is provided. Sampling is essential to determine if pest...

  2. Temporally flickering nanoparticles for compound cellular imaging and super resolution

    NASA Astrophysics Data System (ADS)

    Ilovitsh, Tali; Danan, Yossef; Meir, Rinat; Meiri, Amihai; Zalevsky, Zeev

    2016-03-01

    This work presents the use of flickering nanoparticles for imaging biological samples. The method has high noise immunity and enables the detection of overlapping types of gold nanoparticles (GNPs) at significantly sub-diffraction distances, making it attractive for super-resolving localization microscopy techniques. The method utilizes a lock-in technique in which the sample is imaged with laser beams time-modulated at as many frequencies as there are types of GNPs labeling the sample, exciting temporal flickering of the scattered light at known temporal frequencies. The final image, in which the GNPs are spatially separated, is obtained by post-processing that extracts the spectral components corresponding to the different modulation frequencies. This allows the simultaneous super-resolved imaging of multiple types of GNPs that label targets of interest within biological samples. Additionally, applying the K-factor image decomposition algorithm in post-processing can further improve the performance of the proposed approach.
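    Lock-in demodulation at known modulation frequencies can be sketched for a single pixel's time trace. The frame rate, modulation frequencies, amplitudes, and noise level below are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    fs = 1000.0                      # assumed frame rate (Hz)
    t = np.arange(0, 1.0, 1 / fs)    # 1 s of acquisition
    f1, f2 = 37.0, 90.0              # assumed modulation frequencies of two GNP types

    # Simulated pixel intensity: two flickering scatterers plus noise
    rng = np.random.default_rng(3)
    signal = 1.0 * np.sin(2 * np.pi * f1 * t) + 0.5 * np.sin(2 * np.pi * f2 * t)
    signal = signal + 0.2 * rng.standard_normal(t.size)

    def lockin_amplitude(x, f, fs):
        """Lock-in style demodulation: project onto sin/cos at frequency f."""
        tt = np.arange(x.size) / fs
        i = 2 * np.mean(x * np.sin(2 * np.pi * f * tt))   # in-phase component
        q = 2 * np.mean(x * np.cos(2 * np.pi * f * tt))   # quadrature component
        return np.hypot(i, q)

    a1 = lockin_amplitude(signal, f1, fs)
    a2 = lockin_amplitude(signal, f2, fs)
    print(a1, a2)   # each close to its modulation depth despite the noise
    ```

    Repeating this per pixel and per frequency yields one image per GNP type, which is how overlapping labels at sub-diffraction distances can be assigned to the correct channel.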

  3. Removal of Non-metallic Inclusions from Nickel Base Superalloys by Electromagnetic Levitation Melting in a Slag

    NASA Astrophysics Data System (ADS)

    Manjili, Mohsen Hajipour; Halali, Mohammad

    2018-02-01

    Samples of INCONEL 718 were levitated and melted in a slag by the application of an electromagnetic field. The effects of temperature, time, and slag composition on the inclusion content of the samples were studied thoroughly. Samples were compared with the original alloy to study the effect of the process on inclusions. The size, shape, and chemical composition of the remaining non-metallic inclusions were investigated. The samples were prepared following the Standard Guide for Preparing and Evaluating Specimens for Automatic Inclusion Assessment of Steel (ASTM E 768-99), and the results were reported by means of the Standard Test Methods for Determining the Inclusion Content of Steel (ASTM E 45-97). The results indicated that by increasing temperature and processing time, a greater level of cleanliness could be achieved, and the number and size of the remaining inclusions decreased significantly. It was also observed that increasing the calcium fluoride content of the slag helped reduce the inclusion content.

  4. A Combined Adaptive Neural Network and Nonlinear Model Predictive Control for Multirate Networked Industrial Process Control.

    PubMed

    Wang, Tong; Gao, Huijun; Qiu, Jianbin

    2016-02-01

    This paper investigates the multirate networked industrial process control problem in a double-layer architecture. First, the output tracking problem for the sampled-data nonlinear plant at the device layer with sampling period T_d is investigated using adaptive neural network (NN) control, and it is shown that the outputs of subsystems at the device layer can track the decomposed setpoints. Then, the outputs and inputs of the device-layer subsystems are sampled with sampling period T_u at the operation layer to form the index prediction, which is used to predict the overall performance index at lower frequency. A radial basis function NN is utilized as the prediction function due to its approximation ability. Then, considering the dynamics of the overall closed-loop system, a nonlinear model predictive control method is proposed to guarantee system stability and compensate for network-induced delays and packet dropouts. Finally, a continuous stirred tank reactor system is given in the simulation part to demonstrate the effectiveness of the proposed method.
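    The radial basis function approximation used for the index prediction can be sketched with a least-squares fit of Gaussian basis functions. The centers, width, and target function below are illustrative stand-ins, not the paper's performance index:

    ```python
    import numpy as np

    def rbf_design(x, centers, width):
        """Gaussian RBF features phi_j(x) = exp(-(x - c_j)^2 / (2*width^2))."""
        return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

    # Fit an RBF network to a nonlinear function by linear least squares
    x = np.linspace(0, 1, 50)
    y = np.sin(2 * np.pi * x)              # stand-in for the performance index
    centers = np.linspace(0, 1, 10)        # assumed basis-function centers
    Phi = rbf_design(x, centers, width=0.1)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    print("max fit error:", np.abs(Phi @ w - y).max())
    ```

    Because the output weights enter linearly, the fit reduces to a linear least-squares problem, which is the approximation ability the abstract refers to.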

  5. Numerical simulation and analysis for low-frequency rock physics measurements

    NASA Astrophysics Data System (ADS)

    Dong, Chunhui; Tang, Genyang; Wang, Shangxu; He, Yanxiao

    2017-10-01

    In recent years, several experimental methods have been introduced to measure the elastic parameters of rocks in the relatively low-frequency range, such as differential acoustic resonance spectroscopy (DARS) and stress-strain measurement. It is necessary to verify the validity and feasibility of the applied measurement method and to quantify the sources and levels of measurement error. Relying solely on the laboratory measurements, however, we cannot evaluate the complete wavefield variation in the apparatus. Numerical simulations of elastic wave propagation, on the other hand, are used to model the wavefield distribution and physical processes in the measurement systems, and to verify the measurement theory and analyze the measurement results. In this paper we provide a numerical simulation method to investigate the acoustic waveform response of the DARS system and the quasi-static responses of the stress-strain system, both of which use axisymmetric apparatus. We applied this method to parameterize the properties of the rock samples, the sample locations and the sensor (hydrophone and strain gauges) locations and simulate the measurement results, i.e. resonance frequencies and axial and radial strains on the sample surface, from the modeled wavefield following the physical experiments. Rock physical parameters were estimated by inversion or direct processing of these data, and showed a perfect match with the true values, thus verifying the validity of the experimental measurements. Error analysis was also conducted for the DARS system with 18 numerical samples, and the sources and levels of error are discussed. In particular, we propose an inversion method for estimating both density and compressibility of these samples. The modeled results also showed fairly good agreement with the real experiment results, justifying the effectiveness and feasibility of our modeling method.

  6. Free radical reaction characteristics of coal low-temperature oxidation and its inhibition method.

    PubMed

    Li, Zenghua; Kong, Biao; Wei, Aizhu; Yang, Yongliang; Zhou, Yinbo; Zhang, Lanzhun

    2016-12-01

    Study of the mechanism of coal spontaneous combustion is significant for controlling fire disasters caused by spontaneous combustion. Free radical reactions can explain the chemical process of coal low-temperature oxidation. Electron spin resonance (ESR) spectroscopy was used to directly measure how free radical concentrations change for coals of different types and particle sizes; ESR spectra of free radicals as a function of temperature were compared for coal samples exposed to air versus purged with nitrogen, and for original, dried, and demineralized coal samples. The fragmentation process was the key factor in producing and initiating free radical reactions, and oxygen, moisture, and minerals accelerated these reactions. The free radical reaction mechanism explains why mechanical fragmentation leads to elevated CO concentration, why fractured coal pillars are more prone to spontaneous combustion, and why spontaneous combustion in the goaf accounts for a large proportion of mine fires. The addition of diphenylamine effectively inhibited the self-oxidation of coal; its action mechanism was analyzed in terms of free radical chain reactions. This research can offer a new approach for the development of new flame retardants.

  7. Optical detection of Trypanosoma cruzi in blood samples for diagnosis purpose

    NASA Astrophysics Data System (ADS)

    Alanis, Elvio; Romero, Graciela; Alvarez, Liliana; Martinez, Carlos C.; Basombrio, Miguel A.

    2004-10-01

    An optical method for detection of Trypanosoma cruzi (T. cruzi) parasites in blood samples of mice infected with Chagas disease is presented. The method is intended for use in human blood, for diagnosis purposes. A thin layer of blood infected by T. cruzi parasites, in small concentrations, is examined in an interferometric microscope in which the images of the field of view are taken by a CCD camera and temporarily stored in the memory of a host computer. The whole sample is scanned by displacing the microscope plate by means of step motors driven by the computer. Several consecutive images of the same field are taken and digitally processed by means of temporal image differencing to detect whether a parasite is present in the field. Each field of view is processed in the same fashion until the full area of the sample is covered or a parasite is detected, in which case an acoustic warning is activated and the corresponding image is displayed, permitting the technician to corroborate the result visually. A discussion of the reliability of the method, as well as a comparison with other well-established techniques, is presented.
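    The temporal-differencing step (a static field produces near-zero frame differences, while a motile parasite produces a localized cluster of changed pixels) can be sketched as follows. The threshold values and function name are illustrative assumptions, not the authors' implementation.

    ```python
    import numpy as np

    def detect_motion(frames, threshold=30, min_pixels=20):
        """Flag a field of view if consecutive frames differ enough,
        indicating a moving object in an otherwise static scene."""
        for prev, curr in zip(frames, frames[1:]):
            # signed difference to avoid uint8 wraparound
            diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
            if np.count_nonzero(diff > threshold) >= min_pixels:
                return True
        return False

    # Static field: identical frames, no detection expected
    static = [np.full((64, 64), 100, dtype=np.uint8)] * 3
    # Field with a small bright object appearing between frames
    moving = [f.copy() for f in static]
    moving[1][10:16, 10:16] = 200
    ```

    A real system would also need illumination normalization and a noise-adapted threshold, but the per-field scan-and-difference loop above captures the decision logic described in the abstract.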

  8. An adaptive two-stage sequential design for sampling rare and clustered populations

    USGS Publications Warehouse

    Brown, J.A.; Salehi, M.M.; Moradi, M.; Bell, G.; Smith, D.R.

    2008-01-01

    How to design an efficient large-area survey continues to be an interesting question for ecologists. In sampling large areas, as is common in environmental studies, adaptive sampling can be efficient because it ensures survey effort is targeted to subareas of high interest. In two-stage sampling, higher density primary sample units are usually of more interest than lower density primary units when populations are rare and clustered. Two-stage sequential sampling has been suggested as a method for allocating second stage sample effort among primary units. Here, we suggest a modification: adaptive two-stage sequential sampling. In this method, the adaptive part of the allocation process means the design is more flexible in how much extra effort can be directed to higher-abundance primary units. We discuss how best to design an adaptive two-stage sequential sample. © 2008 The Society of Population Ecology and Springer.
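    The core idea of directing extra second-stage effort toward high-abundance primary units can be sketched as a simple proportional allocation. This is a simplified illustration of the general principle only, not the specific allocation rule of the proposed design; the function name and budget parameter are hypothetical.

    ```python
    import numpy as np

    def allocate_second_stage(first_stage_counts, extra_units):
        """Allocate extra second-stage effort in proportion to first-stage
        counts, so higher-abundance primary units receive more effort."""
        counts = np.asarray(first_stage_counts, dtype=float)
        if counts.sum() == 0:
            # nothing observed yet: spread effort evenly
            share = np.full(len(counts), 1.0 / len(counts))
        else:
            share = counts / counts.sum()
        alloc = np.floor(share * extra_units).astype(int)
        # hand any rounding remainder to the highest-count units
        remainder = extra_units - alloc.sum()
        for i in np.argsort(-counts)[:remainder]:
            alloc[i] += 1
        return alloc

    # e.g. 4 primary units with first-stage counts and a budget of 20 extra units
    alloc = allocate_second_stage([0, 12, 3, 5], 20)
    ```

    The adaptive element in the paper's design is precisely that this allocation is decided sequentially, after the first-stage counts are observed, rather than fixed in advance.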

  9. 40 CFR 60.456 - Test methods and procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Section 60.456 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS... § 60.453. (2) Method 25 for the measurement of the VOC concentration in the gas stream vent. (3) Method... sampling times or smaller volumes, when necessitated by process variables or other factors, may be approved...

  10. Variability in, variability out: best practice recommendations to standardize pre-analytical variables in the detection of circulating and tissue microRNAs.

    PubMed

    Khan, Jenna; Lieberman, Joshua A; Lockwood, Christina M

    2017-05-01

    microRNAs (miRNAs) hold promise as biomarkers for a variety of disease processes and for determining cell differentiation. These short RNA species are robust, survive harsh treatment and storage conditions and may be extracted from blood and tissue. Pre-analytical variables are critical confounders in the analysis of miRNAs: we elucidate these and identify best practices for minimizing sample variation in blood and tissue specimens. Pre-analytical variables addressed include patient-intrinsic variation, time and temperature from sample collection to storage or processing, processing methods, contamination by cells and blood components, RNA extraction method, normalization, and storage time/conditions. For circulating miRNAs, hemolysis and blood cell contamination significantly affect profiles; samples should be processed within 2 h of collection; ethylene diamine tetraacetic acid (EDTA) is preferred while heparin should be avoided; samples should be "double spun" or filtered; room temperature or 4 °C storage for up to 24 h is preferred; miRNAs are stable for at least 1 year at -20 °C or -80 °C. For tissue-based analysis, warm ischemic time should be <1 h; cold ischemic time (4 °C) <24 h; common fixative used for all specimens; formalin fix up to 72 h prior to processing; enrich for cells of interest; validate candidate biomarkers with in situ visualization. Most importantly, all specimen types should have standard and common workflows with careful documentation of relevant pre-analytical variables.

  11. Bacteria and Bioactivity in Holder Pasteurized and Shelf-Stable Human Milk Products

    PubMed Central

    2017-01-01

    Abstract Background: Historically, Holder pasteurization has been used to pasteurize donor human milk available in a hospital setting. There is extensive research that provides an overview of the impact of Holder pasteurization on bioactive components of human milk. A shelf-stable (SS) human milk product, created using retort processing, recently became available; however, to our knowledge, little has been published about the effect of retort processing on human milk. Objective: We aimed to assess the ability of retort processing to eliminate bacteria and to quantify the difference in lysozyme and secretory immunoglobulin A (sIgA) activity between Holder pasteurized (HP) and SS human milk. Methods: Milk samples from 60 mothers were pooled. From this pool, 36 samples were taken: 12 samples were kept raw, 12 samples were HP, and 12 samples were retort processed to create an SS product. All samples were analyzed for total aerobic bacteria, coliform bacteria, Bacillus cereus, sIgA activity, and lysozyme activity. Raw samples served as the control. Results: One raw sample and 3 HP samples contained B. cereus at the time of culture. There were no detectable bacteria in SS samples at the time of culture. Raw samples had significantly greater lysozyme and sIgA activity than HP and SS samples (P < 0.0001). HP samples retained significantly more lysozyme and sIgA activity (54% and 87%, respectively) than SS samples (0% and 11%, respectively). Conclusions: Human milk processed using Holder pasteurization should continue to be screened for the presence of B. cereus. Clinicians should be aware of the differences in the retention of lysozyme and sIgA activity in HP and SS products when making feeding decisions for medically fragile or immunocompromised infants to ensure that patients are receiving the maximum immune protection. PMID:29955718

  12. Headspace solid-phase microextraction (HS-SPME) and liquid-liquid extraction (LLE): comparison of the performance in classification of ecstasy tablets. Part 2.

    PubMed

    Bonadio, Federica; Margot, Pierre; Delémont, Olivier; Esseiva, Pierre

    2008-11-20

    Headspace solid-phase microextraction (HS-SPME) is assessed as an alternative to the liquid-liquid extraction (LLE) currently used for 3,4-methylenedioxymethamphetamine (MDMA) profiling. Both methods were compared by evaluating their performance in discriminating and classifying samples. For this purpose, 62 different seizures were analysed using both extraction techniques followed by gas chromatography-mass spectrometry (GC-MS). A previously validated method provided data for HS-SPME, whereas LLE data were collected applying a harmonized methodology developed and used in the European project CHAMP. After suitable pre-treatment, similarities between sample pairs were studied using the Pearson correlation. Both methods make it possible to distinguish samples coming from the same pre-tabletting batch from samples coming from different pre-tabletting batches. This finding supports the use of HS-SPME as an effective alternative to LLE, with additional advantages such as simpler sample preparation and a solvent-free process.
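    The pairwise Pearson comparison used for batch linkage can be sketched as a similarity matrix over pre-treated chemical profiles. The profiles below are made-up peak-area vectors purely for illustration; the function name is an assumption.

    ```python
    import numpy as np

    def pearson_similarity_matrix(profiles):
        """Pairwise Pearson correlation between chemical profiles
        (rows = samples, columns = target-compound peak areas)."""
        X = np.asarray(profiles, dtype=float)
        Xc = X - X.mean(axis=1, keepdims=True)        # center each profile
        Z = Xc / np.linalg.norm(Xc, axis=1, keepdims=True)
        return Z @ Z.T                                # correlation of all pairs

    # three hypothetical profiles: the first two from the same batch, the third not
    profiles = [
        [1.0, 2.0, 3.0, 4.0],
        [1.1, 2.1, 2.9, 4.2],
        [4.0, 1.0, 0.5, 0.2],
    ]
    S = pearson_similarity_matrix(profiles)
    ```

    A linkage decision then reduces to thresholding S: pairs with correlation above a cutoff (chosen from known same-batch and different-batch distributions) are treated as coming from the same pre-tabletting batch.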

  13. Experimental Null Method to Guide the Development of Technical Procedures and to Control False-Positive Discovery in Quantitative Proteomics.

    PubMed

    Shen, Xiaomeng; Hu, Qiang; Li, Jun; Wang, Jianmin; Qu, Jun

    2015-10-02

    Comprehensive and accurate evaluation of data quality and false-positive biomarker discovery is critical to direct the method development/optimization for quantitative proteomics, which nonetheless remains challenging largely due to the high complexity and unique features of proteomic data. Here we describe an experimental null (EN) method to address this need. Because the method experimentally measures the null distribution (either technical or biological replicates) using the same proteomic samples, the same procedures and the same batch as the case-vs-control experiment, it correctly reflects the collective effects of technical variability (e.g., variation/bias in sample preparation, LC-MS analysis, and data processing) and project-specific features (e.g., characteristics of the proteome and biological variation) on the performances of quantitative analysis. As a proof of concept, we employed the EN method to assess the quantitative accuracy and precision and the ability to quantify subtle ratio changes between groups using different experimental and data-processing approaches and in various cellular and tissue proteomes. It was found that choices of quantitative features, sample size, experimental design, data-processing strategies, and quality of chromatographic separation can profoundly affect quantitative precision and accuracy of label-free quantification. The EN method was also demonstrated as a practical tool to determine the optimal experimental parameters and rational ratio cutoff for reliable protein quantification in specific proteomic experiments, for example, to identify the necessary number of technical/biological replicates per group that affords sufficient power for discovery.
Furthermore, we assessed the ability of EN method to estimate levels of false-positives in the discovery of altered proteins, using two concocted sample sets mimicking proteomic profiling using technical and biological replicates, respectively, where the true-positives/negatives are known and span a wide concentration range. It was observed that the EN method correctly reflects the null distribution in a proteomic system and accurately measures false altered proteins discovery rate (FADR). In summary, the EN method provides a straightforward, practical, and accurate alternative to statistics-based approaches for the development and evaluation of proteomic experiments and can be universally adapted to various types of quantitative techniques.
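    The FADR estimate at a given ratio cutoff reduces to asking how often the measured null (replicate-vs-replicate) comparison exceeds that cutoff. The sketch below uses a simulated Gaussian null purely for illustration; in the EN method the null distribution is measured experimentally, and the function name is an assumption.

    ```python
    import numpy as np

    def fadr_at_cutoff(null_log_ratios, cutoff):
        """Fraction of proteins in the null (replicate-vs-replicate) comparison
        whose |log2 ratio| exceeds the cutoff: an empirical estimate of the
        false altered-protein discovery rate at that cutoff."""
        null_log_ratios = np.asarray(null_log_ratios)
        return np.mean(np.abs(null_log_ratios) > cutoff)

    rng = np.random.default_rng(1)
    # hypothetical technical-replicate null: no true change, only technical noise
    null = rng.normal(0.0, 0.25, size=5000)
    for cutoff in (0.3, 0.5, 1.0):
        print(f"cutoff {cutoff}: FADR ~ {fadr_at_cutoff(null, cutoff):.3f}")
    ```

    Scanning cutoffs this way is how a "rational ratio cutoff" can be chosen: the smallest fold-change threshold at which the experimentally measured null rarely produces false positives.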

  14. Matrix effects in pesticide multi-residue analysis by liquid chromatography-mass spectrometry.

    PubMed

    Kruve, Anneli; Künnapas, Allan; Herodes, Koit; Leito, Ivo

    2008-04-11

    Three sample preparation methods, the Luke method (AOAC 985.22), QuEChERS (quick, easy, cheap, effective, rugged and safe), and matrix solid-phase dispersion (MSPD), were applied to different fruits and vegetables for the analysis of 14 pesticide residues by high-performance liquid chromatography with electrospray ionization-mass spectrometry (HPLC/ESI/MS). The matrix effect, recovery, and process efficiency of the sample preparation methods applied to different fruits and vegetables were compared. The Luke method was found to produce the least matrix effect. On average, the best recoveries were obtained with the QuEChERS method. MSPD gave unsatisfactory recoveries for some basic pesticide residues. Comparison of matrix effects for different apple varieties showed high variability for some residues. It was demonstrated that the amount of co-extracting compounds that cause ionization suppression of aldicarb depends on the apple variety as well as on the sample preparation method employed.
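    Matrix effect, recovery, and process efficiency are conventionally computed from three peak areas in the post-extraction spike scheme of Matuszewski et al.; the abstract does not give its exact calculation, so the sketch below shows the standard formulas with hypothetical peak areas.

    ```python
    def matrix_effect_metrics(area_neat, area_postspike, area_prespike):
        """Standard post-extraction spike calculations:
        area_neat      (A): analyte peak area in a neat solvent standard
        area_postspike (B): area in blank extract spiked *after* extraction
        area_prespike  (C): area in blank matrix spiked *before* extraction
        """
        me = 100.0 * area_postspike / area_neat       # matrix effect, % (100 = none)
        re = 100.0 * area_prespike / area_postspike   # recovery of extraction, %
        pe = 100.0 * area_prespike / area_neat        # overall process efficiency, %
        return me, re, pe

    # hypothetical peak areas showing ~20% ionization suppression
    me, re, pe = matrix_effect_metrics(area_neat=1.00e6,
                                       area_postspike=0.80e6,
                                       area_prespike=0.72e6)
    ```

    Under this convention, ME below 100% indicates ionization suppression (as observed for aldicarb in some apple varieties) and ME above 100% indicates enhancement, while PE = ME x RE / 100 combines both losses.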

  15. Advances in spectroscopic methods for quantifying soil carbon

    USGS Publications Warehouse

    Liebig, Mark; Franzluebbers, Alan J.; Follett, Ronald F.; Hively, W. Dean; Reeves, James B.; McCarty, Gregory W.; Calderon, Francisco

    2012-01-01

    The gold standard for soil C determination is combustion. However, this method requires expensive consumables, determines only total carbon, and is limited in the number of samples that can be processed (~100/d). With increased interest in soil C sequestration, faster methods are needed; hence the interest in methods based on diffuse reflectance spectroscopy in the visible, near-infrared, or mid-infrared ranges using either proximal or remote sensing. These methods can analyze more samples (2 to 3 times as many per day) or cover huge areas (imagery) and determine multiple analytes simultaneously, but they require calibrations relating spectral and reference data and have specific limitations; for example, remote sensing is capable of scanning entire watersheds, thus reducing the sampling needed, but is limited to the surface layer of tilled soils and by the difficulty of obtaining proper calibration reference values. The objective of this discussion is to present the state of spectroscopic methods for soil C determination.
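    The calibration step relating spectra to combustion reference values is typically done with partial least squares on full spectra; the toy sketch below uses ordinary least squares on a handful of synthetic bands just to show what "calibration relating spectral and reference data" means in practice. All data, band counts, and coefficients here are fabricated for illustration.

    ```python
    import numpy as np

    # Hypothetical data: rows = soil samples, columns = absorbance at a few bands;
    # reference % C values come from combustion analysis of the same samples.
    rng = np.random.default_rng(42)
    n_samples, n_bands = 40, 5
    true_coefs = np.array([0.8, -0.3, 1.5, 0.1, -0.6])
    spectra = rng.uniform(0.1, 1.0, size=(n_samples, n_bands))
    ref_c = spectra @ true_coefs + 2.0 + rng.normal(0, 0.05, n_samples)

    # Calibration: fit intercept + band coefficients by least squares
    A = np.hstack([np.ones((n_samples, 1)), spectra])
    coefs, *_ = np.linalg.lstsq(A, ref_c, rcond=None)

    def predict_c(spectrum):
        """Predict % C of a new sample from its spectrum."""
        return coefs[0] + spectrum @ coefs[1:]
    ```

    Once calibrated, each new spectrum yields a carbon estimate in seconds, which is the throughput advantage over combustion that the abstract describes; the calibration, however, is only as good as the reference values behind it.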

  16. Evaluation of consolidation method on mechanical and structural properties of ODS RAF steel

    NASA Astrophysics Data System (ADS)

    Frelek-Kozak, M.; Kurpaska, L.; Wyszkowska, E.; Jagielski, J.; Jozwik, I.; Chmielewski, M.

    2018-07-01

    In the present work, the effects of the fabrication method on the mechanical and structural properties of 12%Cr, 2%W, 0.25%Ti, 0.25%Y2O3 steels were investigated. Materials obtained by the Spark Plasma Sintering (SPS), Hot Isostatic Pressing (HIP) and Hot Extrusion (HE) methods were studied. The microstructure was characterized using Scanning Electron Microscopy (SEM) and Electron Backscatter Diffraction (EBSD) analysis. Mechanical properties of the studied samples were evaluated using Vickers microhardness (HV0.1), Small Punch Test (SPT) and nanoindentation (NI) methods. The analysis revealed that samples manufactured via the HIP and SPS processes exhibit very similar properties, although the SPS method produces material with slightly lower hardness; specimens produced by the HE process exhibited significantly lower mechanical properties. The study described in this article also addresses the problems of mechanical parameters measured in micro- and nano-scale experiments and aims to identify possible pitfalls related to the use of various manufacturing technologies.

  17. Improved lossless intra coding for H.264/MPEG-4 AVC.

    PubMed

    Lee, Yung-Lyul; Han, Ki-Hun; Sullivan, Gary J

    2006-09-01

    A new lossless intra coding method based on sample-by-sample differential pulse code modulation (DPCM) is presented as an enhancement of the H.264/MPEG-4 AVC standard. The H.264/AVC design includes a multidirectional spatial prediction method to reduce spatial redundancy by using neighboring samples as a prediction for the samples in a block of data to be encoded. In the new lossless intra coding method, the spatial prediction is performed based on samplewise DPCM instead of in the block-based manner used in the current H.264/AVC standard, while the block structure is retained for the residual difference entropy coding process. We show that the new method, based on samplewise DPCM, does not have a major complexity penalty, despite its apparent pipeline dependencies. Experiments show that the new lossless intra coding method reduces the bit rate by approximately 12% in comparison with the lossless intra coding method previously included in the H.264/AVC standard. As a result, the new method is currently being adopted into the H.264/AVC standard in a new enhancement project.
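    The samplewise DPCM idea is simple to state in code: each sample is predicted by a neighbor and only the residual is stored, and decoding inverts this exactly. The sketch below uses the left neighbor along one row; the actual standard selects the prediction direction per intra mode, so this is a simplified illustration of the principle rather than the H.264/AVC algorithm.

    ```python
    import numpy as np

    def dpcm_encode(row):
        """Sample-by-sample DPCM along a row: each sample is predicted by its
        left neighbor and only the residual is kept (lossless)."""
        row = np.asarray(row, dtype=np.int32)
        residuals = np.empty_like(row)
        residuals[0] = row[0]            # first sample has no left neighbor
        residuals[1:] = row[1:] - row[:-1]
        return residuals

    def dpcm_decode(residuals):
        """Invert the prediction: running sum reconstructs the samples exactly."""
        return np.cumsum(residuals)

    row = np.array([100, 102, 101, 105, 105, 110], dtype=np.int32)
    res = dpcm_encode(row)
    ```

    The residuals (here 2, -1, 4, 0, 5 after the first sample) are much smaller than the samples themselves, which is why entropy coding them costs fewer bits than block-based prediction when the block interior is smooth.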

  18. PERIOD ESTIMATION FOR SPARSELY SAMPLED QUASI-PERIODIC LIGHT CURVES APPLIED TO MIRAS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    He, Shiyuan; Huang, Jianhua Z.; Long, James

    2016-12-01

    We develop a nonlinear semi-parametric Gaussian process model to estimate periods of Miras with sparsely sampled light curves. The model uses a sinusoidal basis for the periodic variation and a Gaussian process for the stochastic changes. We use maximum likelihood to estimate the period and the parameters of the Gaussian process, while integrating out the effects of other nuisance parameters in the model with respect to a suitable prior distribution obtained from earlier studies. Since the likelihood is highly multimodal for period, we implement a hybrid method that applies the quasi-Newton algorithm for Gaussian process parameters and searches the period/frequency parameter space over a dense grid. A large-scale, high-fidelity simulation is conducted to mimic the sampling quality of Mira light curves obtained by the M33 Synoptic Stellar Survey. The simulated data set is publicly available and can serve as a testbed for future evaluation of different period estimation methods. The semi-parametric model outperforms an existing algorithm on this simulated test data set as measured by period recovery rate and quality of the resulting period–luminosity relations.
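    The dense-grid part of the search can be illustrated with a stripped-down version: at each trial period, fit a sinusoid by linear least squares and keep the period with the smallest residual. This omits the Gaussian process term and the nuisance-parameter integration of the full model, and all data and names below are synthetic.

    ```python
    import numpy as np

    def estimate_period(t, y, periods):
        """Dense grid search over trial periods: at each one, fit
        mean + sin + cos by linear least squares; return the period
        with the smallest residual sum of squares."""
        best_period, best_rss = None, np.inf
        for p in periods:
            w = 2 * np.pi / p
            A = np.column_stack([np.ones_like(t), np.sin(w * t), np.cos(w * t)])
            coef, *_ = np.linalg.lstsq(A, y, rcond=None)
            rss = np.sum((y - A @ coef) ** 2)
            if rss < best_rss:
                best_period, best_rss = p, rss
        return best_period

    rng = np.random.default_rng(3)
    t = np.sort(rng.uniform(0, 1000, 60))            # sparse, irregular sampling
    true_p = 230.0
    y = 2.0 * np.sin(2 * np.pi * t / true_p) + rng.normal(0, 0.2, t.size)
    grid = np.arange(100.0, 400.0, 0.5)
    p_hat = estimate_period(t, y, grid)
    ```

    The grid handles the multimodality of the likelihood in period, which is why the paper combines it with quasi-Newton optimization for the remaining (well-behaved) Gaussian process parameters.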

  19. Fabrication of Ultra-thin Color Films with Highly Absorbing Media Using Oblique Angle Deposition.

    PubMed

    Yoo, Young Jin; Lee, Gil Ju; Jang, Kyung-In; Song, Young Min

    2017-08-29

    Ultra-thin film structures have been studied extensively for use as optical coatings, but performance and fabrication challenges remain.  We present an advanced method for fabricating ultra-thin color films with improved characteristics. The proposed process addresses several fabrication issues, including large area processing. Specifically, the protocol describes a process for fabricating ultra-thin color films using an electron beam evaporator for oblique angle deposition of germanium (Ge) and gold (Au) on silicon (Si) substrates.  Film porosity produced by the oblique angle deposition induces color changes in the ultra-thin film. The degree of color change depends on factors such as deposition angle and film thickness. Fabricated samples of the ultra-thin color films showed improved color tunability and color purity. In addition, the measured reflectance of the fabricated samples was converted into chromatic values and analyzed in terms of color. Our ultra-thin film fabricating method is expected to be used for various ultra-thin film applications such as flexible color electrodes, thin film solar cells, and optical filters. Also, the process developed here for analyzing the color of the fabricated samples is broadly useful for studying various color structures.

  20. Method for isolating nucleic acids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, Jr., Richard Ashley; Elias, Dwayne A.

    The current disclosure provides methods and kits for isolating nucleic acid from an environmental sample. The current methods and compositions further provide methods for isolating nucleic acids by reducing adsorption of nucleic acids by charged ions and particles within an environmental sample. The methods of the current disclosure provide methods for isolating nucleic acids by releasing adsorbed nucleic acids from charged particles during the nucleic acid isolation process. The current disclosure facilitates the isolation of nucleic acids of sufficient quality and quantity to enable one of ordinary skill in the art to utilize or analyze the isolated nucleic acids for a wide variety of applications, including sequencing or species population analysis.
