Sample records for sampling technique based

  1. Screen Space Ambient Occlusion Based Multiple Importance Sampling for Real-Time Rendering

    NASA Astrophysics Data System (ADS)

    Zerari, Abd El Mouméne; Babahenini, Mohamed Chaouki

    2018-03-01

    We propose a new approximation technique for accelerating the Global Illumination algorithm for real-time rendering. The proposed approach is based on the Screen-Space Ambient Occlusion (SSAO) method, which approximates the global illumination for large, fully dynamic scenes at interactive frame rates. Current algorithms that are based on the SSAO method suffer from difficulties due to the large number of samples that are required. In this paper, we propose an improvement to the SSAO technique by integrating it with a Multiple Importance Sampling technique that combines a stratified sampling method with an importance sampling method, with the objective of reducing the number of samples. Experimental evaluation demonstrates that our technique can produce high-quality images in real time and is significantly faster than traditional techniques.
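    The abstract does not spell out the estimator, but multiple importance sampling is commonly implemented with the balance heuristic. Below is a minimal illustrative Python sketch (not the authors' renderer code) combining two sampling strategies for a one-dimensional integral; the integrand and both strategies are placeholders.

```python
import random

def balance_weight(pdf_x, pdf_other, x):
    """Balance-heuristic weight for a sample x drawn from the strategy with pdf_x."""
    px, po = pdf_x(x), pdf_other(x)
    return px / (px + po)

def mis_estimate(f, sample_a, pdf_a, sample_b, pdf_b, n):
    """Estimate the integral of f over [0, 1] by combining two sampling
    strategies, weighting each sample with the balance heuristic."""
    total = 0.0
    for _ in range(n):
        xa = sample_a()
        total += balance_weight(pdf_a, pdf_b, xa) * f(xa) / pdf_a(xa)
        xb = sample_b()
        total += balance_weight(pdf_b, pdf_a, xb) * f(xb) / pdf_b(xb)
    return total / n
```

    With both strategies uniform, each pair of samples contributes half of each estimate, so the estimator stays unbiased while the weights suppress samples that one strategy draws poorly.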

  2. State of the art of environmentally friendly sample preparation approaches for determination of PBDEs and metabolites in environmental and biological samples: A critical review.

    PubMed

    Berton, Paula; Lana, Nerina B; Ríos, Juan M; García-Reyes, Juan F; Altamirano, Jorgelina C

    2016-01-28

    Green chemistry principles for developing methodologies have gained attention in analytical chemistry in recent decades. A growing number of analytical techniques have been proposed for determination of organic persistent pollutants in environmental and biological samples. In this light, the current review aims to present state-of-the-art sample preparation approaches based on green analytical principles proposed for the determination of polybrominated diphenyl ethers (PBDEs) and metabolites (OH-PBDEs and MeO-PBDEs) in environmental and biological samples. Approaches to lower the solvent consumption and accelerate the extraction, such as pressurized liquid extraction, microwave-assisted extraction, and ultrasound-assisted extraction, are discussed in this review. Special attention is paid to miniaturized sample preparation methodologies and strategies proposed to reduce organic solvent consumption. Additionally, extraction techniques based on alternative solvents (surfactants, supercritical fluids, or ionic liquids) are also discussed in this work, even though these are scarcely used for determination of PBDEs. In addition to liquid-based extraction techniques, solid-based analytical techniques are also addressed. The development of greener, faster and simpler sample preparation approaches has increased in recent years (2003-2013). Among green extraction techniques, those based on the liquid phase predominate over those based on the solid phase (71% vs. 29%, respectively). For solid samples, solvent-assisted extraction techniques are preferred for leaching of PBDEs, and liquid phase microextraction techniques are mostly used for liquid samples. Likewise, green characteristics of the instrumental analysis used after the extraction and clean-up steps are briefly discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Advances in paper-based sample pretreatment for point-of-care testing.

    PubMed

    Tang, Rui Hua; Yang, Hui; Choi, Jane Ru; Gong, Yan; Feng, Shang Sheng; Pingguan-Murphy, Belinda; Huang, Qing Sheng; Shi, Jun Ling; Mei, Qi Bing; Xu, Feng

    2017-06-01

    In recent years, paper-based point-of-care testing (POCT) has been widely used in medical diagnostics, food safety and environmental monitoring. However, a high-cost, time-consuming and equipment-dependent sample pretreatment technique is generally required for raw sample processing, which is impractical for low-resource and disease-endemic areas. Therefore, there is an escalating demand for a cost-effective, simple and portable pretreatment technique to be coupled with the commonly used paper-based assays (e.g. lateral flow assay) in POCT. In this review, we focus on the importance of using paper as a platform for sample pretreatment. We first discuss the beneficial use of paper for sample pretreatment, including sample collection and storage, separation, extraction, and concentration. We highlight the working principle and fabrication of each sample pretreatment device, the existing challenges and the future perspectives for developing paper-based sample pretreatment techniques.

  4. Examining Returned Samples in their Collection Tubes Using Synchrotron Radiation-Based Techniques

    NASA Astrophysics Data System (ADS)

    Schoonen, M. A.; Hurowitz, J. A.; Thieme, J.; Dooryhee, E.; Fogelqvist, E.; Gregerson, J.; Farley, K. A.; Sherman, S.; Hill, J.

    2018-04-01

    Synchrotron radiation-based techniques can be leveraged for triaging and analysis of returned samples before their collection tubes are unsealed. Proof-of-concept measurements were conducted at Brookhaven National Lab's National Synchrotron Light Source-II.

  5. Technique for fast and efficient hierarchical clustering

    DOEpatents

    Stork, Christopher

    2013-10-08

    A fast and efficient technique for hierarchical clustering of samples in a dataset includes compressing the dataset to reduce a number of variables within each of the samples of the dataset. A nearest neighbor matrix is generated to identify nearest neighbor pairs between the samples based on differences between the variables of the samples. The samples are arranged into a hierarchy that groups the samples based on the nearest neighbor matrix. The hierarchy is rendered to a display to graphically illustrate similarities or differences between the samples.
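    The patented pipeline (compress the variables, build a nearest-neighbour matrix, group into a hierarchy) can be sketched generically; the following Python/numpy code is an illustration of those three steps, not the patented implementation. Block-averaging compression and greedy single-linkage merging are stand-ins for whatever specific methods the patent claims.

```python
import numpy as np

def compress(data, k):
    """Crude variable reduction: average adjacent variables in blocks of k."""
    n, m = data.shape
    m2 = m - m % k
    return data[:, :m2].reshape(n, m2 // k, k).mean(axis=2)

def nn_pairs(data):
    """Pairwise distance matrix and each sample's nearest neighbour."""
    d = np.linalg.norm(data[:, None, :] - data[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    return d, d.argmin(axis=1)

def single_linkage(data):
    """Greedy single-linkage hierarchy: repeatedly merge the closest clusters.
    Returns a list of (cluster_a, cluster_b, merge_distance) tuples."""
    d, _ = nn_pairs(data)
    clusters = {i: [i] for i in range(len(data))}
    merges = []
    while len(clusters) > 1:
        keys = sorted(clusters)
        best = (np.inf, None, None)
        for ii, a in enumerate(keys):
            for b in keys[ii + 1:]:
                dist = min(d[i, j] for i in clusters[a] for j in clusters[b])
                if dist < best[0]:
                    best = (dist, a, b)
        dist, a, b = best
        clusters[a] = clusters[a] + clusters.pop(b)
        merges.append((a, b, dist))
    return merges
```

    The merge list is exactly the information a dendrogram renderer needs: small merge distances indicate similar samples, and the final, large merge separates the top-level groups.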

  6. Field Validity and Feasibility of Four Techniques for the Detection of Trichuris in Simians: A Model for Monitoring Drug Efficacy in Public Health?

    PubMed Central

    Levecke, Bruno; De Wilde, Nathalie; Vandenhoute, Els; Vercruysse, Jozef

    2009-01-01

    Background Soil-transmitted helminths, such as Trichuris trichiura, are of major concern in public health. Current efforts to control these helminth infections involve periodic mass treatment in endemic areas. Since these large-scale interventions are likely to intensify, monitoring the drug efficacy will become indispensable. However, studies comparing detection techniques based on sensitivity, fecal egg counts (FEC), feasibility for mass diagnosis and drug efficacy estimates are scarce. Methodology/Principal Findings In the present study, the ether-based concentration, the Parasep Solvent Free (SF), the McMaster and the FLOTAC techniques were compared based on both validity and feasibility for the detection of Trichuris eggs in 100 fecal samples of nonhuman primates. In addition, the drug efficacy estimates of the quantitative techniques were examined using a statistical simulation. Trichuris eggs were found in 47% of the samples. FLOTAC was the most sensitive technique (100%), followed by the Parasep SF (83.0% [95% confidence interval (CI): 82.4–83.6%]) and the ether-based concentration technique (76.6% [95% CI: 75.8–77.3%]). McMaster was the least sensitive (61.7% [95% CI: 60.7–62.6%]) and failed to detect low FEC. The quantitative comparison revealed a positive correlation between the four techniques (Rs = 0.85–0.93; p<0.0001). However, the ether-based concentration technique and the Parasep SF detected significantly fewer eggs than both the McMaster and the FLOTAC (p<0.0083). Overall, the McMaster was the most feasible technique (3.9 min/sample for preparing, reading and cleaning of the apparatus), followed by the ether-based concentration technique (7.7 min/sample) and the FLOTAC (9.8 min/sample). Parasep SF was the least feasible (17.7 min/sample). The simulation revealed that sensitivity is less important for monitoring drug efficacy and that both FLOTAC and McMaster were reliable estimators.
Conclusions/Significance The results of this study demonstrated that McMaster is a promising technique when making use of FEC to monitor drug efficacy in Trichuris. PMID:19172171

  7. Gearbox Tooth Cut Fault Diagnostics Using Acoustic Emission and Vibration Sensors — A Comparative Study

    PubMed Central

    Qu, Yongzhi; He, David; Yoon, Jae; Van Hecke, Brandon; Bechhoefer, Eric; Zhu, Junda

    2014-01-01

    In recent years, acoustic emission (AE) sensors and AE-based techniques have been developed and tested for gearbox fault diagnosis. In general, AE-based techniques require much higher sampling rates than vibration analysis-based techniques for gearbox fault diagnosis. Therefore, it is questionable whether an AE-based technique would give a better or at least the same performance as the vibration analysis-based techniques using the same sampling rate. To answer the question, this paper presents a comparative study for gearbox tooth damage level diagnostics using AE and vibration measurements, the first known attempt to compare the gearbox fault diagnostic performance of AE- and vibration analysis-based approaches using the same sampling rate. Partial tooth cut faults are seeded in a gearbox test rig and experimentally tested in a laboratory. Results have shown that the AE-based approach has the potential to differentiate gear tooth damage levels in comparison with the vibration-based approach. While vibration signals are easily affected by mechanical resonance, the AE signals show more stable performance. PMID:24424467

  8. An improved sample loading technique for cellular metabolic response monitoring under pressure

    NASA Astrophysics Data System (ADS)

    Gikunda, Millicent Nkirote

    To monitor cellular metabolism under pressure, a pressure chamber designed around a simple-to-construct capillary-based spectroscopic chamber coupled to a microliter-flow perfusion system is used in the laboratory. Although cyanide-induced metabolic responses from Saccharomyces cerevisiae (baker's yeast) could be controllably induced and monitored under pressure, the previously used sample-loading technique was not well controlled. An improved cell-loading technique has been developed, based on a secondary inner capillary into which the sample is loaded before being inserted into the capillary pressure chamber. As validation, we demonstrate the ability to measure chemically induced metabolic responses at pressures of up to 500 bars. This technique is shown to be less prone to sample loss due to perfusive flow than previously used techniques.

  9. Spatial interpolation techniques using R

    EPA Science Inventory

    Interpolation techniques are used to predict the cell values of a raster based on sample data points. For example, interpolation can be used to predict the distribution of sediment particle size throughout an estuary based on discrete sediment samples. We demonstrate some inter...
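    The EPA demonstration uses R, but the core idea of predicting cell values from discrete sample points is easy to illustrate. Below is a short Python/numpy sketch of one common interpolator, inverse distance weighting; the EPA materials may well demonstrate other methods (e.g. kriging) as well, so treat this as a generic example rather than a reproduction.

```python
import numpy as np

def idw(sample_xy, sample_vals, query_xy, power=2.0):
    """Inverse-distance-weighted interpolation of scattered samples
    (sample_xy, sample_vals) onto arbitrary query points."""
    d = np.linalg.norm(query_xy[:, None, :] - sample_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)          # avoid division by zero at sample sites
    w = 1.0 / d ** power
    return (w * sample_vals).sum(axis=1) / w.sum(axis=1)
```

    Querying on a regular grid of points yields the interpolated raster; a query point equidistant from two samples receives the average of their values, while a query at a sample site recovers that sample's value.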

  10. COST-EFFECTIVE SAMPLING FOR SPATIALLY DISTRIBUTED PHENOMENA

    EPA Science Inventory

    Various measures of sampling plan cost and loss are developed and analyzed as they relate to a variety of multidisciplinary sampling techniques. The sampling choices examined include methods from design-based sampling, model-based sampling, and geostatistics. Graphs and tables ar...

  11. Improved optical axis determination accuracy for fiber-based polarization-sensitive optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Lu, Zenghai; Matcher, Stephen J.

    2013-03-01

    We report on a new calibration technique that permits the accurate extraction of the sample Jones matrix, and hence the fast-axis orientation, using fiber-based polarization-sensitive optical coherence tomography (PS-OCT) built entirely from non-polarization-maintaining fiber such as SMF-28. In this technique, two quarter waveplates are used to completely specify the parameters of the system fibers in the sample arm so that the Jones matrix of the sample can be determined directly. The device was validated on measurements of a quarter waveplate and an equine tendon sample using a single-mode fiber-based swept-source PS-OCT system.
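    For context, the fast-axis orientation follows from the eigenvectors of a retarder's Jones matrix; the Python/numpy sketch below shows the textbook extraction for an ideal linear retarder. The paper's actual contribution, calibrating out the unknown system fibers with two quarter waveplates, is not reproduced here.

```python
import numpy as np

def retarder(theta, delta):
    """Jones matrix of an ideal linear retarder with fast axis at angle
    theta and retardance delta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])
    D = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
    return R.T @ D @ R

def fast_axis(J):
    """Recover the fast-axis orientation (mod pi) from a retarder Jones matrix."""
    vals, vecs = np.linalg.eig(J)
    v = vecs[:, np.argmin(np.angle(vals))]     # eigenvector with phase -delta/2
    k = np.argmax(np.abs(v))
    v = v * np.exp(-1j * np.angle(v[k]))       # strip the arbitrary global phase
    return np.arctan2(v[1].real, v[0].real) % np.pi
```

    The eigenvector associated with the retarded eigenvalue is the fast-axis direction, so the angle is recovered up to the usual pi ambiguity of a linear axis.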

  12. A two-step electrodialysis method for DNA purification from polluted metallic environmental samples.

    PubMed

    Rodríguez-Mejía, José Luis; Martínez-Anaya, Claudia; Folch-Mallol, Jorge Luis; Dantán-González, Edgar

    2008-08-01

    Extracting DNA from samples of polluted environments using standard methods often results in low yields of poor-quality material unsuited to subsequent manipulation and analysis by molecular biological techniques. Here, we report a novel two-step electrodialysis-based method for the extraction of DNA from environmental samples. This technique permits the rapid and efficient isolation of high-quality DNA based on its acidic nature, and without the requirement for phenol-chloroform-isoamyl alcohol cleanup and ethanol precipitation steps. Subsequent PCR, endonuclease restriction, and cloning reactions were successfully performed utilizing DNA obtained by electrodialysis, whereas some or all of these techniques failed using DNA extracted with two alternative methods. We also show that this technique is applicable to the purification of DNA from a range of polluted and nonpolluted samples.

  13. Quantitative Analysis of Tissue Samples by Combining iTRAQ Isobaric Labeling with Selected/Multiple Reaction Monitoring (SRM/MRM).

    PubMed

    Narumi, Ryohei; Tomonaga, Takeshi

    2016-01-01

    Mass spectrometry-based phosphoproteomics is an indispensable technique used in the discovery and quantification of phosphorylation events on proteins in biological samples. The application of this technique to tissue samples is especially useful for the discovery of biomarkers as well as biological studies. We herein describe the application of a large-scale phosphoproteome analysis and SRM/MRM-based quantitation to develop a strategy for the systematic discovery and validation of biomarkers using tissue samples.

  14. Direct Liquid Sampling for Corona Discharge Ion Mobility Spectrometry.

    PubMed

    Sabo, Martin; Malásková, Michaela; Harmathová, Olga; Hradski, Jasna; Masár, Marián; Radjenovic, Branislav; Matejčík, Štefan

    2015-07-21

    We present a new technique suitable for direct liquid sampling and analysis by ion mobility spectrometry (IMS). The technique is based on introduction of a droplet stream to the IMS reaction region. The technique was successfully used to detect explosives dissolved in methanol and oil as well as to analyze amino acids and dipeptides. One of the main advantages of this technique is its ability to analyze liquid samples without the requirement of any special solution.

  15. Application of inorganic element ratios to chemometrics for determination of the geographic origin of welsh onions.

    PubMed

    Ariyama, Kaoru; Horita, Hiroshi; Yasui, Akemi

    2004-09-22

    The composition of concentration ratios of 19 inorganic elements to Mg (hereinafter referred to as the 19-element/Mg composition) was applied to chemometric techniques to determine the geographic origin (Japan or China) of Welsh onions (Allium fistulosum L.). Using a composition of element ratios has the advantage of simplified sample preparation, making it possible to determine the geographic origin of a Welsh onion within 2 days. The classical technique based on 20 element concentrations was also used alongside the new, simpler one based on 19 elements/Mg in order to validate the new technique. Twenty elements, Na, P, K, Ca, Mg, Mn, Fe, Cu, Zn, Sr, Ba, Co, Ni, Rb, Mo, Cd, Cs, La, Ce, and Tl, in 244 Welsh onion samples were analyzed by flame atomic absorption spectroscopy, inductively coupled plasma atomic emission spectrometry, and inductively coupled plasma mass spectrometry. Linear discriminant analysis (LDA) on the 20-element concentrations and the 19-element/Mg composition, and soft independent modeling of class analogy (SIMCA) on the 19-element/Mg composition, were applied to these analytical data. The results showed that techniques based on the 19-element/Mg composition were effective. LDA based on the 19-element/Mg composition, for classification of samples from Japan and from Shandong, Shanghai, and Fujian in China, correctly classified 97% of the 101 samples used for modeling and correctly predicted 93% of another 119 samples (excluding 24 nonauthentic samples). In ten rounds of SIMCA discrimination based on the 19-element/Mg composition modeled using the 101 samples, 92% of 220 samples from known production areas (including the modeling samples and excluding 24 nonauthentic samples) were predicted correctly.
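    A minimal sketch of the two ingredients, forming element/Mg ratios and a two-class linear discriminant, is given below in Python/numpy. The synthetic data, dimensionality, and regularization constant are invented for illustration; the study used 244 real samples and standard LDA/SIMCA procedures.

```python
import numpy as np

def element_ratios(conc, mg_index):
    """Divide each element concentration by the Mg concentration, then drop Mg."""
    ratios = conc / conc[:, [mg_index]]
    return np.delete(ratios, mg_index, axis=1)

def fisher_lda(X0, X1):
    """Two-class Fisher discriminant: weight vector and midpoint threshold."""
    m0, m1 = X0.mean(0), X1.mean(0)
    Sw = np.cov(X0.T) * (len(X0) - 1) + np.cov(X1.T) * (len(X1) - 1)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(len(m0)), m1 - m0)
    t = w @ (m0 + m1) / 2
    return w, t

def classify(X, w, t):
    """Assign class 1 to samples whose projection exceeds the threshold."""
    return (X @ w > t).astype(int)
```

    In the study's setting, X0 and X1 would hold the 19-element/Mg ratio vectors of Japanese and Chinese training samples, and new onions would be classified by their projection onto w.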

  16. Comparison of the detection of periodontal pathogens in bacteraemia after tooth brushing by culture and molecular techniques.

    PubMed

    Marín, M-J; Figuero, E; González, I; O'Connor, A; Diz, P; Álvarez, M; Herrera, D; Sanz, M

    2016-05-01

    This study compared the prevalence and amounts of periodontal pathogens detected in bacteraemia samples after tooth brushing using four diagnostic techniques, three based on culture and one on a molecular method. Blood samples were collected from thirty-six subjects with different periodontal status (17 healthy, 10 with gingivitis and 9 with periodontitis) at baseline and 2 minutes after tooth brushing. Each sample was analyzed by three culture-based methods [direct anaerobic culturing (DAC), hemo-culture (BACTEC), and lysis-centrifugation (LC)] and one molecular-based technique [quantitative polymerase chain reaction (qPCR)]. With culture, any bacterial isolate could be detected and quantified, whereas qPCR detected and quantified only Porphyromonas gingivalis and Aggregatibacter actinomycetemcomitans. Descriptive analyses, ANOVA and Chi-squared tests were performed. Neither BACTEC nor qPCR detected any bacteria in the blood samples. Only LC (2.7%) and DAC (8.3%) detected bacteraemia, although not in the same patients. Fusobacterium nucleatum was the most frequently detected bacterial species. The disparity in results when the same samples were analyzed with four different microbiological detection methods highlights the need for proper validation of methodologies for detecting periodontal pathogens in bacteraemia samples, especially since periodontal pathogens were very seldom present in blood samples after tooth brushing.

  17. Developing Sediment Remediation Goals at Superfund Sites Based on Pore Water for the Protection of Benthic Organisms from Direct Toxicity to Non-ionic Organic Contaminants

    EPA Science Inventory

    A methodology for developing remediation goals for sites with contaminated sediments is provided. The remediation goals are based upon the concentrations of chemicals in the sediment interstitial water measured using the passive sampling technique. The passive sampling technique ...

  18. Accurate low-cost methods for performance evaluation of cache memory systems

    NASA Technical Reports Server (NTRS)

    Laha, Subhasis; Patel, Janak H.; Iyer, Ravishankar K.

    1988-01-01

    Simulation methods based on statistical techniques are proposed to decrease the need for large trace measurements and to predict true program behavior. Sampling techniques are applied while the address trace is collected from a workload. This drastically reduces the space and time needed to collect the trace. Simulation techniques are developed to use the sampled data not only to predict the mean miss rate of the cache, but also to provide an empirical estimate of its actual distribution. Finally, the concept of a primed cache is introduced to simulate large caches by the sampling-based method.
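    A toy version of the idea, simulating a direct-mapped cache on randomly sampled trace chunks and discarding a warm-up prefix of each chunk to mitigate cold-start bias (a crude stand-in for the paper's primed-cache notion), might look like this in Python; the cache geometry and traces are invented for illustration.

```python
import random

def miss_rate(trace, cache_lines, line_size=64, warmup=0):
    """Direct-mapped cache: fraction of misses among accesses after `warmup`."""
    tags = [None] * cache_lines
    misses = counted = 0
    for i, addr in enumerate(trace):
        block = addr // line_size
        idx = block % cache_lines
        hit = tags[idx] == block
        tags[idx] = block
        if i >= warmup:
            counted += 1
            misses += not hit
    return misses / counted

def sampled_miss_rate(trace, cache_lines, n_chunks=20, chunk=500, warmup=50, seed=0):
    """Estimate the mean miss rate from randomly placed contiguous trace chunks
    instead of simulating the full trace."""
    rng = random.Random(seed)
    rates = []
    for _ in range(n_chunks):
        start = rng.randrange(0, len(trace) - chunk)
        rates.append(miss_rate(trace[start:start + chunk], cache_lines, warmup=warmup))
    return sum(rates) / len(rates)
```

    Running many chunks also yields the empirical distribution of per-chunk miss rates that the paper exploits, not just the mean.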

  19. Evaluation of primary immunization coverage of infants under universal immunization programme in an urban area of bangalore city using cluster sampling and lot quality assurance sampling techniques.

    PubMed

    K, Punith; K, Lalitha; G, Suman; Bs, Pradeep; Kumar K, Jayanth

    2008-07-01

    Research question: Is the LQAS technique better than the cluster sampling technique, in terms of resources, for evaluating immunization coverage in an urban area? Objective: To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study design: Population-based cross-sectional study. Setting: Areas under Mathikere Urban Health Center. Participants: Children aged 12 months to 23 months. Sample size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical analysis: Percentages and proportions, Chi-square test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from the coverage values obtained by lot quality assurance sampling. Considering the time and resources required, lot quality assurance sampling was found to be the better technique for evaluating primary immunization coverage in an urban area.
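    The decision rules behind LQAS are binomial. The abstract does not give the thresholds this survey used, so the Python sketch below shows a generic rule with invented parameters (lot sample n = 19, "low coverage" defined as 50%); real surveys take decision values from published LQAS tables.

```python
from math import comb

def accept_prob(n, d, coverage):
    """P(at most d unimmunized children in a sample of n, given true coverage)."""
    q = 1 - coverage
    return sum(comb(n, k) * q**k * (1 - q)**(n - k) for k in range(d + 1))

def decision_value(n, low_coverage, alpha=0.10):
    """Largest decision value d such that a lot whose true coverage equals
    low_coverage is accepted with probability at most alpha (consumer's risk)."""
    d = -1
    while d + 1 <= n and accept_prob(n, d + 1, low_coverage) <= alpha:
        d += 1
    return d
```

    A lot is "accepted" as adequately covered when at most d unimmunized children appear in the sample; the small per-lot sample sizes are what make LQAS cheaper than a full cluster survey.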

  20. Actinide bioimaging in tissues: Comparison of emulsion and solid track autoradiography techniques with the iQID camera

    PubMed Central

    Miller, Brian W.; Van der Meeren, Anne; Tazrart, Anissa; Angulo, Jaime F.; Griffiths, Nina M.

    2017-01-01

    This work presents a comparison of three autoradiography techniques for imaging biological samples contaminated with actinides: emulsion-based and plastic-based autoradiography, and a quantitative digital technique, the iQID camera, based on the numerical analysis of light from a scintillator screen. In radiation toxicology it has been important to develop means of imaging actinide distribution in tissues, as these radionuclides may be heterogeneously distributed within and between tissues after internal contamination. Actinide distribution determines which cells are exposed to alpha radiation and is thus potentially critical for assessing absorbed dose. The comparison was carried out by generating autoradiographs of the same biological samples contaminated with actinides with the three autoradiography techniques. These samples were cell preparations or tissue sections collected from animals contaminated with different physico-chemical forms of actinides. The autoradiograph characteristics and the performance of the techniques were evaluated and discussed mainly in terms of acquisition process, activity distribution patterns, spatial resolution and feasibility of activity quantification. The resulting autoradiographs presented similar actinide distributions at low magnification. Of the three techniques, emulsion autoradiography is the only one to provide a highly resolved image of the actinide distribution inherently superimposed on the biological sample. Emulsion autoradiography is hence best interpreted at higher magnifications. However, this technique is destructive for the biological sample. Both emulsion- and plastic-based autoradiography record alpha tracks and thus enabled the differentiation between ionized forms of actinides and oxide particles. This feature can help in the evaluation of decorporation therapy efficacy.
The most recent technique, the iQID camera, presents several additional features: real-time imaging, separate imaging of alpha particles and gamma rays, and alpha activity quantification. The comparison of these three autoradiography techniques showed that they are complementary and the choice of the technique depends on the purpose of the imaging experiment. PMID:29023595

  1. Mass Spectrometric and Synchrotron Radiation based techniques for the identification and distribution of painting materials in samples from paints of Josep Maria Sert

    PubMed Central

    2012-01-01

    Background Establishing the distribution of materials in paintings and that of their degradation products by imaging techniques is fundamental to understand the painting technique and can improve our knowledge on the conservation status of the painting. The combined use of chromatographic-mass spectrometric techniques, such as GC/MS or Py/GC/MS, and the chemical mapping of functional groups by imaging SR FTIR in transmission mode on thin sections and SR XRD line scans is presented as a suitable approach for obtaining a detailed characterisation of the materials in a paint sample while establishing their localisation in the sample build-up. This analytical approach has been used to study samples from Catalan paintings by Josep Maria Sert y Badía (20th century), a muralist achieving international recognition whose canvases adorned international buildings. Results The pigments used by the painter as well as the organic materials used as binders and varnishes could be identified by means of conventional techniques. The distribution of these materials determined by means of Synchrotron Radiation based techniques made it possible to establish the mixtures the painter used for different purposes. Conclusions Results show the suitability of the combined use of SR μFTIR and SR μXRD mapping and conventional techniques to unequivocally identify all the materials present in the sample and their localization in the sample build-up. This kind of approach becomes indispensable to solve the challenge of micro-heterogeneous samples. The complementary interpretation of the data obtained with all the different techniques allowed the characterization of both organic and inorganic materials in the samples layer by layer, as well as establishing the painting techniques used by Sert in the works of art under study. PMID:22616949

  2. Explosive detection technology

    NASA Astrophysics Data System (ADS)

    Doremus, Steven; Crownover, Robin

    2017-05-01

    The continuing proliferation of improvised explosive devices is an omnipresent threat to civilians and members of military and law enforcement around the world. The ability to accurately and quickly detect explosive materials from a distance would be an extremely valuable tool for mitigating the risk posed by these devices. A variety of techniques exist that are capable of accurately identifying explosive compounds, but an effective standoff technique has yet to be realized. Most of the methods being investigated to fill this gap in capabilities are laser-based. Raman spectroscopy is one such technique that has been demonstrated to be effective at a distance. Spatially Offset Raman Spectroscopy (SORS) is a technique capable of identifying chemical compounds inside of containers, which could be used to detect hidden explosive devices. Coherent Anti-Stokes Raman Spectroscopy (CARS) utilizes a coherent pair of lasers to excite a sample, greatly increasing the response of the sample while decreasing the strength of the lasers being used, which significantly improves the eye-safety issue that typically hinders laser-based detection methods. Time-gating techniques are also being developed to improve the data collection from Raman techniques, which are often hindered by fluorescence of the test sample in addition to atmospheric, substrate, and contaminant responses. Ultraviolet-based techniques have also shown significant promise, owing to greatly improved signal strength from excitation of resonances in many explosive compounds. Raman spectroscopy, which identifies compounds based on their molecular response, can be coupled with Laser Induced Breakdown Spectroscopy (LIBS), capable of characterizing the sample's atomic composition using a single laser.

  3. Improving EEG-Based Motor Imagery Classification for Real-Time Applications Using the QSA Method.

    PubMed

    Batres-Mendoza, Patricia; Ibarra-Manzano, Mario A; Guerra-Hernandez, Erick I; Almanza-Ojeda, Dora L; Montoro-Sanjose, Carlos R; Romero-Troncoso, Rene J; Rostro-Gonzalez, Horacio

    2017-01-01

    We present an improvement to the quaternion-based signal analysis (QSA) technique to extract electroencephalography (EEG) signal features with a view to developing real-time applications, particularly in motor imagery (MI) cognitive processes. The proposed methodology (iQSA, improved QSA) extracts features such as the average, variance, homogeneity, and contrast of EEG signals related to motor imagery in a more efficient manner (i.e., by reducing the number of samples needed to classify the signal and improving the classification percentage) compared to the original QSA technique. Specifically, we can sample the signal in variable time periods (from 0.5 s to 3 s, in half-a-second intervals) to determine the relationship between the number of samples and their effectiveness in classifying signals. In addition, to strengthen the classification process, a number of boosting-based decision trees were implemented. The results show an 82.30% accuracy rate for 0.5 s samples and 73.16% for 3 s samples. This is a significant improvement compared to the original QSA technique, which offered results from 33.31% to 40.82% without a sampling window and from 33.44% to 41.07% with a sampling window, respectively. We can thus conclude that iQSA is better suited to developing real-time applications.
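    As a rough illustration (in plain Python/numpy, without the quaternion algebra that gives QSA its name), the four named features can be computed per window as below. The 8-level quantization and the co-occurrence definitions of homogeneity and contrast are common texture-analysis choices assumed here for the sketch, not taken from the paper.

```python
import numpy as np

def window_features(signal, fs, win_s):
    """Mean, variance, and co-occurrence homogeneity/contrast per window of
    length win_s seconds at sampling rate fs. Plain-real sketch only."""
    n = int(fs * win_s)
    feats = []
    for start in range(0, len(signal) - n + 1, n):
        w = signal[start:start + n]
        levels = np.linspace(w.min(), w.max(), 8)   # 8 quantization levels (assumed)
        q = np.digitize(w, levels)
        pairs = list(zip(q[:-1], q[1:]))
        contrast = sum((a - b) ** 2 for a, b in pairs) / len(pairs)
        homogeneity = sum(1.0 / (1 + abs(a - b)) for a, b in pairs) / len(pairs)
        feats.append((w.mean(), w.var(), homogeneity, contrast))
    return feats
```

    Varying win_s from 0.5 s to 3 s reproduces the windowing experiment described in the abstract; the resulting feature vectors would then feed the boosted decision trees.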

  5. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. This technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases the manipulation of the sample. In this review, we describe the main procedures used for synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  6. Evaluation of Primary Immunization Coverage of Infants Under Universal Immunization Programme in an Urban Area of Bangalore City Using Cluster Sampling and Lot Quality Assurance Sampling Techniques

    PubMed Central

    K, Punith; K, Lalitha; G, Suman; BS, Pradeep; Kumar K, Jayanth

    2008-01-01

    Research Question: Is the LQAS technique better than the cluster sampling technique, in terms of resources, for evaluating immunization coverage in an urban area? Objective: To assess and compare lot quality assurance sampling against cluster sampling in the evaluation of primary immunization coverage. Study Design: Population-based cross-sectional study. Study Setting: Areas under Mathikere Urban Health Center. Study Subjects: Children aged 12 months to 23 months. Sample Size: 220 in cluster sampling, 76 in lot quality assurance sampling. Statistical Analysis: Percentages and proportions, chi-square test. Results: (1) Using cluster sampling, the percentages of completely immunized, partially immunized and unimmunized children were 84.09%, 14.09% and 1.82%, respectively. With lot quality assurance sampling, they were 92.11%, 6.58% and 1.31%, respectively. (2) Immunization coverage levels as evaluated by the cluster sampling technique were not statistically different from those obtained by the lot quality assurance sampling technique. Considering the time and resources required, lot quality assurance sampling was found to be the better technique for evaluating primary immunization coverage in an urban area. PMID:19876474
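The chi-square comparison of the two coverage estimates can be sketched as below. The counts are reconstructed from the reported percentages (84.09% of 220 ≈ 185 completely immunized; 92.11% of 76 = 70), so this is an illustration, not the authors' analysis:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]
    (no continuity correction)."""
    n = a + b + c + d
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Completely immunized vs. not, under the two sampling schemes
# (counts reconstructed from the reported percentages).
cluster = (185, 35)  # complete, not complete (n=220)
lqas = (70, 6)       # complete, not complete (n=76)
chi2 = chi_square_2x2(cluster[0], cluster[1], lqas[0], lqas[1])
```

The statistic comes out near 3.04, below the 3.841 critical value for df=1 at alpha=0.05, consistent with the abstract's conclusion of no statistically significant difference between the two techniques.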

  7. Direct Analysis of Samples of Various Origin and Composition Using Specific Types of Mass Spectrometry.

    PubMed

    Byliński, Hubert; Gębicki, Jacek; Dymerski, Tomasz; Namieśnik, Jacek

    2017-07-04

    One of the major sources of error in chemical analysis with conventional, established analytical techniques is the possibility of losing part of the analytes during the sample preparation stage. Unfortunately, this stage is required to improve analytical sensitivity and precision. Direct techniques have helped to shorten or even bypass sample preparation; in this review, we comment on some of the new direct techniques that are based on mass spectrometry. The study presents information about measurement techniques using mass spectrometry that allow direct sample analysis, without sample preparation or with only limited pre-concentration steps. MALDI-MS, PTR-MS, SIFT-MS and DESI-MS techniques are discussed. These solutions have found numerous applications in different fields of human activity due to their favorable properties. The advantages and disadvantages of these techniques are presented, along with trends in the development of direct analysis using them.

  8. Increasing the speed of tumour diagnosis during surgery with selective scanning Raman microscopy

    NASA Astrophysics Data System (ADS)

    Kong, Kenny; Rowlands, Christopher J.; Varma, Sandeep; Perkins, William; Leach, Iain H.; Koloydenko, Alexey A.; Pitiot, Alain; Williams, Hywel C.; Notingher, Ioan

    2014-09-01

    One of the main challenges in cancer surgery is ensuring that all tumour cells are removed during surgery, while sparing as much healthy tissue as possible. Histopathology, the gold-standard technique for cancer diagnosis, is often impractical for intra-operative use because of the time-consuming tissue preparation procedures (sectioning and staining). Raman micro-spectroscopy is a powerful technique that can discriminate between tumours and healthy tissues with high accuracy, based entirely on intrinsic chemical differences. However, raster-scanning Raman micro-spectroscopy is a slow imaging technique that typically requires data acquisition times as long as several days for typical tissue samples obtained during surgery (1 × 1 cm2) - in particular when high signal-to-noise ratio spectra are required to ensure accurate diagnosis. In this paper we present two techniques based on selective sampling Raman micro-spectroscopy that can overcome these limitations. In selective sampling, information regarding the spatial features of the tissue, either measured by an alternative optical technique or estimated in real-time from the Raman spectra, can be used to drastically reduce the number of Raman spectra required for diagnosis. These sampling strategies allowed diagnosis of basal cell carcinoma in skin tissue samples excised during Mohs micrographic surgery faster than frozen section histopathology, and two orders of magnitude faster than previous techniques based on raster-scanning Raman microscopy. Further development of these techniques may help during cancer surgery by providing a fast and objective way for surgeons to ensure the complete removal of tumour cells while sparing as much healthy tissue as possible.

  9. Density-based empirical likelihood procedures for testing symmetry of data distributions and K-sample comparisons.

    PubMed

    Vexler, Albert; Tanajian, Hovig; Hutson, Alan D

    In practice, parametric likelihood-ratio techniques are powerful statistical tools. In this article, we propose and examine novel and simple distribution-free test statistics that efficiently approximate parametric likelihood ratios to analyze and compare distributions of K groups of observations. Using the density-based empirical likelihood methodology, we develop a Stata package that applies to a test for symmetry of data distributions and compares K-sample distributions. Recognizing that recent statistical software packages do not sufficiently address K-sample nonparametric comparisons of data distributions, we propose a new Stata command, vxdbel, to execute exact density-based empirical likelihood-ratio tests using K samples. To calculate p-values of the proposed tests, we use the following methods: 1) a classical technique based on Monte Carlo p-value evaluations; 2) an interpolation technique based on tabulated critical values; and 3) a new hybrid technique that combines methods 1 and 2. The third, cutting-edge method is shown to be very efficient in the context of exact-test p-value computations. This Bayesian-type method considers tabulated critical values as prior information and Monte Carlo generations of test statistic values as data used to depict the likelihood function. In this case, a nonparametric Bayesian method is proposed to compute critical values of exact tests.
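Method 1 above (Monte Carlo p-value evaluation) can be sketched generically as a permutation scheme. For illustration the density-based empirical likelihood statistic is replaced here by a simple absolute difference in means; the add-one correction keeping the p-value strictly positive is a standard convention, not necessarily the package's exact implementation:

```python
import random

def mc_pvalue(x, y, n_perm=2000, seed=7):
    """Monte Carlo p-value for a two-sample test.

    Stand-in statistic: |mean(x) - mean(y)|; the real vxdbel command
    uses a density-based empirical likelihood ratio instead.
    """
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = list(x) + list(y)
    k = len(x)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = abs(sum(pooled[:k]) / k - sum(pooled[k:]) / (len(pooled) - k))
        if stat >= observed:
            hits += 1
    # Add-one correction keeps the estimated p-value strictly positive.
    return (hits + 1) / (n_perm + 1)

p_close = mc_pvalue([1, 2, 3, 4, 5], [1.5, 2.5, 3.5, 4.5, 5.5])
p_far = mc_pvalue([1, 2, 3, 4, 5], [11, 12, 13, 14, 15])
```

Well-separated samples yield a small p-value while overlapping samples yield a large one; the hybrid method in the abstract would combine such simulated statistic values with tabulated critical values as prior information.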

  10. A New Method for Estimating Bacterial Abundances in Natural Samples using Sublimation

    NASA Technical Reports Server (NTRS)

    Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.

    2004-01-01

    We have developed a new method based on the sublimation of adenine from Escherichia coli to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert were heated to a temperature of 500 C for several seconds under reduced pressure. The sublimate was collected on a cold finger and the amount of adenine released from the samples then determined by high performance liquid chromatography (HPLC) with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10(exp 5) to 10(exp 9) E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI staining. The simplicity and robustness of the sublimation technique compared to the DAPI staining method makes this approach particularly attractive for use by spacecraft instrumentation. NASA is currently planning to send a lander to Mars in 2009 in order to assess whether or not organic compounds, especially those that might be associated with life, are present in Martian surface samples. Based on our analyses of the Atacama Desert soil samples, several million bacterial cells per gram of Martian soil should be detectable using this sublimation technique.
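The conversion from recovered adenine to cell equivalents is a simple proportionality. The per-cell adenine mass below is a placeholder chosen only for illustration; the actual calibration constant (total DNA + RNA adenine per E. coli cell) used by the authors may differ:

```python
# Assumed calibration constant: total adenine (DNA + RNA) per E. coli
# cell, in grams. Placeholder value for illustration only.
ADENINE_PER_CELL_G = 5.0e-15

def cell_equivalents(adenine_ng):
    """Convert recovered adenine (ng) to E. coli cell equivalents."""
    return adenine_ng * 1e-9 / ADENINE_PER_CELL_G

# e.g. 50 ng adenine recovered per gram of soil
count = cell_equivalents(50.0)
```

With the assumed constant, 50 ng of adenine per gram corresponds to about 10^7 cell equivalents per gram, inside the 10^5 to 10^9 range reported in the abstract.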

  11. Sexing adult black-legged kittiwakes by DNA, behavior, and morphology

    USGS Publications Warehouse

    Jodice, P.G.R.; Lanctot, Richard B.; Gill, V.A.; Roby, D.D.; Hatch, Shyla A.

    2000-01-01

    We sexed adult Black-legged Kittiwakes (Rissa tridactyla) using DNA-based genetic techniques, behavior and morphology and compared results from these techniques. Genetic and morphology data were collected on 605 breeding kittiwakes and sex-specific behaviors were recorded for a sub-sample of 285 of these individuals. We compared sex classification based on both genetic and behavioral techniques for this sub-sample to assess the accuracy of the genetic technique. DNA-based techniques correctly sexed 97.2% and sex-specific behaviors, 96.5% of this sub-sample. We used the corrected genetic classifications from this sub-sample and the genetic classifications for the remaining birds, under the assumption they were correct, to develop predictive morphometric discriminant function models for all 605 birds. These models accurately predicted the sex of 73-96% of individuals examined, depending on the sample of birds used and the characters included. The most accurate single measurement for determining sex was length of head plus bill, which correctly classified 88% of individuals tested. When both members of a pair were measured, classification levels improved and approached the accuracy of both behavioral observations and genetic analyses. Morphometric techniques were only slightly less accurate than genetic techniques but were easier to implement in the field and less costly. Behavioral observations, while highly accurate, required that birds be easily observable during the breeding season and that birds be identifiable. As such, sex-specific behaviors may best be applied as a confirmation of sex for previously marked birds. All three techniques thus have the potential to be highly accurate, and the selection of one or more will depend on the circumstances of any particular field study.
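A single-measurement discriminant like the head-plus-bill classifier described above can be sketched as a threshold at the midpoint between class means (the simplest one-variable discriminant function; the published model may use different coefficients). The measurements below are synthetic, not kittiwake data:

```python
import statistics

def fit_threshold(values_f, values_m):
    """One-variable discriminant: midpoint between the class means."""
    return (statistics.fmean(values_f) + statistics.fmean(values_m)) / 2

def classify(value, threshold):
    # Males assumed larger on average for this measurement.
    return "male" if value >= threshold else "female"

# Synthetic head-plus-bill lengths (mm), for illustration only.
females = [88.0, 89.5, 90.0, 91.0]
males = [93.0, 94.5, 95.0, 96.5]
t = fit_threshold(females, males)
correct = sum(classify(v, t) == "female" for v in females) + \
          sum(classify(v, t) == "male" for v in males)
accuracy = correct / (len(females) + len(males))
```

On real data the class distributions overlap, which is why the abstract reports 88% rather than 100% accuracy for this single measurement, and why measuring both members of a pair improves classification.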

  12. A microhistological technique for analysis of food habits of mycophagous rodents.

    Treesearch

    Patrick W. McIntire; Andrew B. Carey

    1989-01-01

    We present a technique, based on microhistological analysis of fecal pellets, for quantifying the diets of forest rodents. This technique provides for the simultaneous recording of fungal spores and vascular plant material. Fecal samples should be freeze dried, weighed, and rehydrated with distilled water. We recommend a minimum sampling intensity of 50 fields of view...

  13. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    PubMed Central

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions, and analyzing the presence and abundance of proteins in biological samples is a central focus of proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of proteins to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  14. Orthogonality-breaking sensing model based on the instantaneous Stokes vector and the Mueller calculus

    NASA Astrophysics Data System (ADS)

    Ortega-Quijano, Noé; Fade, Julien; Roche, Muriel; Parnet, François; Alouini, Mehdi

    2016-04-01

    Polarimetric sensing by orthogonality breaking has been recently proposed as an alternative technique for performing direct and fast polarimetric measurements using a specific dual-frequency dual-polarization (DFDP) source. Based on the instantaneous Stokes-Mueller formalism to describe the high-frequency evolution of the DFDP beam intensity, we thoroughly analyze the interaction of such a beam with birefringent, dichroic and depolarizing samples. This allows us to confirm that orthogonality breaking is produced by the sample diattenuation, whereas this technique is immune to both birefringence and diagonal depolarization. We further analyze the robustness of this technique when polarimetric sensing is performed through a birefringent waveguide, and the optimal DFDP source configuration for fiber-based endoscopic measurements is subsequently identified. Finally, we consider a stochastic depolarization model based on an ensemble of random linear diattenuators, which makes it possible to understand the progressive vanishing of the detected orthogonality breaking signal as the spatial heterogeneity of the sample increases, thus confirming the insensitivity of this method to diagonal depolarization. The fact that the orthogonality breaking signal is exclusively due to the sample dichroism is an advantageous feature for the precise decoupled characterization of such an anisotropic parameter in samples showing several simultaneous effects.

  15. High-throughput immunomagnetic scavenging technique for quantitative analysis of live VX nerve agent in water, hamburger, and soil matrixes.

    PubMed

    Knaack, Jennifer S; Zhou, Yingtao; Abney, Carter W; Prezioso, Samantha M; Magnuson, Matthew; Evans, Ronald; Jakubowski, Edward M; Hardy, Katelyn; Johnson, Rudolph C

    2012-11-20

    We have developed a novel immunomagnetic scavenging technique for extracting cholinesterase inhibitors from aqueous matrixes using biological targeting and antibody-based extraction. The technique was characterized using the organophosphorus nerve agent VX. The limit of detection for VX in high-performance liquid chromatography (HPLC)-grade water, defined as the lowest calibrator concentration, was 25 pg/mL in a small, 500 μL sample. The method was characterized over the course of 22 sample sets containing calibrators, blanks, and quality control samples. Method precision, expressed as the mean relative standard deviation, was less than 9.2% for all calibrators. Quality control sample accuracy was 102% and 100% of the mean for VX spiked into HPLC-grade water at concentrations of 2.0 and 0.25 ng/mL, respectively. This method was successfully applied to aqueous extracts from soil, hamburger, and finished tap water spiked with VX. Recovery was 65%, 81%, and 100% from these matrixes, respectively. Biologically based extractions of organophosphorus compounds represent a new technique for sample extraction that provides an increase in extraction specificity and sensitivity.
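The precision and accuracy figures quoted above follow standard definitions: relative standard deviation of replicate results, and mean measured concentration relative to the spiked amount. A minimal sketch, with hypothetical replicate values (not the study's data):

```python
import statistics

def rsd_percent(values):
    """Relative standard deviation (coefficient of variation), in percent."""
    return 100.0 * statistics.stdev(values) / statistics.fmean(values)

def recovery_percent(measured_mean, spiked):
    """Spike recovery: mean measured concentration vs. spiked amount."""
    return 100.0 * measured_mean / spiked

# Hypothetical replicate VX results (ng/mL) for a 2.0 ng/mL QC sample.
replicates = [2.1, 1.9, 2.0, 2.05, 1.95]
rsd = rsd_percent(replicates)                              # precision
rec = recovery_percent(statistics.fmean(replicates), 2.0)  # accuracy
```

Here the hypothetical replicates give a recovery of 100% and an RSD just under 4%, i.e. within the <9.2% precision criterion reported for the method.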

  16. Mercury in Environmental and Biological Samples Using Online Combustion with Sequential Atomic Absorption and Fluorescence Measurements: A Direct Comparison of Two Fundamental Techniques in Spectrometry

    ERIC Educational Resources Information Center

    Cizdziel, James V.

    2011-01-01

    In this laboratory experiment, students quantitatively determine the concentration of an element (mercury) in an environmental or biological sample while comparing and contrasting the fundamental techniques of atomic absorption spectrometry (AAS) and atomic fluorescence spectrometry (AFS). A mercury analyzer based on sample combustion,…

  17. Data in support of the detection of genetically modified organisms (GMOs) in food and feed samples.

    PubMed

    Alasaad, Noor; Alzubi, Hussein; Kader, Ahmad Abdul

    2016-06-01

    Food and feed samples were randomly collected from different sources, including local and imported materials from the Syrian local market. These included maize, barley, soybean, fresh food samples and raw material. GMO detection was conducted by PCR- and nested-PCR-based techniques using specific primers for the foreign DNA most commonly used in genetic transformation procedures, i.e., the 35S promoter, T-nos, and the epsps, cryIA(b) and nptII genes. The results revealed, for the first time in Syria, the presence of GM foods and feeds carrying the glyphosate-resistance trait, with the P35S promoter and NOS terminator detected in the imported soybean samples at high frequency (5 out of the 6 imported soybean samples), while tests on the local samples were negative. Tests also revealed GMOs in two imported maize samples, in which the 35S promoter and nos terminator were detected. Nested PCR results using two sets of primers confirmed these data. The methods applied in this brief data report are based on DNA analysis by the polymerase chain reaction (PCR). This technique is specific, practical, reproducible and sensitive enough to detect down to 0.1% GMO in food and/or feedstuffs. Furthermore, all of the techniques mentioned are economical and can be applied in Syria and other developing countries. For all these reasons, DNA-based analysis methods were chosen and preferred over protein-based analysis.

  18. The Effect of Learning Based on Technology Model and Assessment Technique toward Thermodynamic Learning Achievement

    NASA Astrophysics Data System (ADS)

    Makahinda, T.

    2018-02-01

    The purpose of this research is to determine the effect of a technology-based learning model and assessment technique on thermodynamics learning achievement, controlling for student intelligence. This is an experimental study; the sample was taken through cluster random sampling, with a total of 80 student respondents. The results show that, after controlling for student intelligence, the thermodynamics achievement of students taught with the environmental-utilization learning model is higher than that of students taught with animated simulation. There is also an interaction effect between the technology-based learning model and the assessment technique on students' thermodynamics achievement, after controlling for student intelligence. Based on these findings, lectures should use the environmental-utilization thermodynamics learning model together with the project assessment technique.

  19. Comparison of three methods of sampling for endometrial cytology in the mare. Preliminary study.

    PubMed

    Defontis, M; Vaillancourt, D; Grand, F X

    2011-01-01

    This prospective study aims to compare three different sampling techniques for the collection of endometrial cytological specimens in the mare: the guarded culture swab, the uterine cytobrush and the low volume uterine flush. The study population consisted of six healthy Standardbred mares in dioestrus. In each mare an acute endometritis was induced by performing a low-volume uterine flush 6 days after ovulation using a sterile isotonic solution (lactated Ringer's solution or ViGro™ Complete Flush Solution). Two days after initiating inflammation, samples were collected from each mare using the three compared techniques: the double guarded cotton swab, the uterine cytobrush and the low volume uterine flush. The cytological evaluation of the samples was based on the following criteria: the quality and cellularity of the samples and the number of neutrophils recovered. The uterine cytobrush yielded slides of significantly (p=0.02) better quality than the low volume uterine flush. There was no significant difference in quality between the cytobrush and the double guarded swab technique. There was no difference between techniques in the number of endometrial cells (p=0.55) or neutrophils (p=0.28) recovered. Endometrial cytology is a practical method for the diagnosis of acute endometrial inflammation in the mare. Since no difference in the number of neutrophils was found between the three techniques, the choice of sampling method should be based on other factors such as practicability, cost and the disadvantages of each technique.

  20. Phosphorus and nitrogen concentrations and loads at Illinois River south of Siloam Springs, Arkansas, 1997-1999

    USGS Publications Warehouse

    Green, W. Reed; Haggard, Brian E.

    2001-01-01

    Water-quality sampling consisting of routine bimonthly sampling and storm-event sampling (six storms annually) is used to estimate annual phosphorus and nitrogen loads at Illinois River south of Siloam Springs, Arkansas. Hydrograph separation allowed assessment of base-flow and surface-runoff nutrient relations and yields. Discharge and nutrient relations indicate that water quality at Illinois River south of Siloam Springs, Arkansas, is affected by both point and nonpoint sources of contamination. Base-flow phosphorus concentrations decreased with increasing base-flow discharge, indicating the dilution of phosphorus in water from point sources. Nitrogen concentrations increased with increasing base-flow discharge, indicating a predominant ground-water source. Nitrogen concentrations at higher base-flow discharges often were greater than median concentrations reported for ground water (from wells and springs) in the Springfield Plateau aquifer. Total estimated phosphorus and nitrogen annual loads for calendar years 1997-1999 using the regression techniques presented in this paper (35 samples) were similar to estimated loads derived from integration techniques (1,033 samples). Flow-weighted nutrient concentrations and nutrient yields at the Illinois River site were about 10 to 100 times greater than national averages for undeveloped basins and at North Sylamore Creek and Cossatot River (considered to be undeveloped basins in Arkansas). Total phosphorus and soluble reactive phosphorus were greater than 10 times, and total nitrogen and dissolved nitrite plus nitrate greater than 10 to 100 times, the national and regional averages for undeveloped basins. These results demonstrate the utility of a strategy whereby samples are collected every other month and during selected storm events annually, with use of regression models to estimate nutrient loads.
Annual loads of phosphorus and nitrogen estimated using regression techniques could provide similar results to estimates using integration techniques, with much less investment.
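The regression approach described above typically relates concentration to discharge through a log-log rating curve, ln(C) = a + b·ln(Q), with loads then integrated from the continuous discharge record. A minimal sketch with synthetic data (the USGS load models also include seasonal and bias-correction terms, omitted here):

```python
import math

def fit_loglog(discharge, concentration):
    """Least-squares fit of ln(C) = a + b * ln(Q)."""
    xs = [math.log(q) for q in discharge]
    ys = [math.log(c) for c in concentration]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

def predict(a, b, q):
    return math.exp(a + b * math.log(q))

# Synthetic samples where concentration rises with discharge,
# as for a nonpoint-dominated constituent (units illustrative).
q_obs = [4.0, 16.0, 64.0, 256.0]       # discharge
c_obs = [0.02, 0.04, 0.08, 0.16]       # concentration, mg/L
a, b = fit_loglog(q_obs, c_obs)
c_est = predict(a, b, 1024.0)          # extrapolate to a storm flow
```

Multiplying predicted concentrations by discharge over time gives the load estimate; this is how 35 samples plus a regression can approximate what 1,033 samples achieve by direct integration.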

  1. Ambient Mass Spectrometry Imaging Using Direct Liquid Extraction Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laskin, Julia; Lanekoff, Ingela

    2015-11-13

    Mass spectrometry imaging (MSI) is a powerful analytical technique that enables label-free spatial localization and identification of molecules in complex samples.1-4 MSI applications range from forensics5 to clinical research6 and from understanding microbial communication7-8 to imaging biomolecules in tissues.1, 9-10 Recently, MSI protocols have been reviewed.11 Ambient ionization techniques enable direct analysis of complex samples under atmospheric pressure without special sample pretreatment.3, 12-16 In fact, in ambient ionization mass spectrometry, sample processing (e.g., extraction, dilution, preconcentration, or desorption) occurs during the analysis.17 This substantially speeds up analysis and eliminates any possible effects of sample preparation on the localization of molecules in the sample.3, 8, 12-14, 18-20 Venter and co-workers have classified ambient ionization techniques into three major categories based on the sample processing steps involved: 1) liquid extraction techniques, in which analyte molecules are removed from the sample and extracted into a solvent prior to ionization; 2) desorption techniques capable of generating free ions directly from substrates; and 3) desorption techniques that produce larger particles subsequently captured by an electrospray plume and ionized.17 This review focuses on localized analysis and ambient imaging of complex samples using a subset of ambient ionization methods broadly defined as “liquid extraction techniques” based on the classification introduced by Venter and co-workers.17 Specifically, we include techniques where analyte molecules are desorbed from solid or liquid samples using charged droplet bombardment, liquid extraction, physisorption, chemisorption, mechanical force, laser ablation, or laser capture microdissection. Analyte extraction is followed by soft ionization that generates ions corresponding to intact species.
Some of the key advantages of liquid extraction techniques include the ease of operation, ability to analyze samples in their native environments, speed of analysis, and ability to tune the extraction solvent composition to the problem at hand. For example, solvent composition may be optimized for efficient extraction of different classes of analytes from the sample or for quantification or online derivatization through reactive analysis. In this review, we will: 1) introduce individual liquid extraction techniques capable of localized analysis and imaging, 2) describe approaches for quantitative MSI experiments free of matrix effects, 3) discuss advantages of reactive analysis for MSI experiments, and 4) highlight selected applications (published between 2012 and 2015) that focus on imaging and spatial profiling of molecules in complex biological and environmental samples.

  2. A contribution to reduce sampling variability in the evaluation of deoxynivalenol contamination of organic wheat grain.

    PubMed

    Hallier, Arnaud; Celette, Florian; Coutarel, Julie; David, Christophe

    2013-01-01

    Fusarium head blight, caused by different Fusarium species, is one of the most serious worldwide diseases in wheat production. It is therefore important to be able to quantify the deoxynivalenol concentration in wheat. Unfortunately, in mycotoxin quantification, due to the uneven distribution of mycotoxins within the initial lot, it is difficult, or even impossible, to obtain a truly representative analytical sample. In previous work we showed that the sampling step most responsible for variability was grain sampling. In this paper, it is more particularly the step of scaling down from a laboratory sample of some kilograms to an analytical sample of a few grams that is investigated. The naturally contaminated wheat lot was obtained from an organic field located in the southeast of France (Rhône-Alpes) from the 2008-2009 cropping season. The deoxynivalenol level was found to be 50.6 ± 2.3 ng g⁻¹. Deoxynivalenol was extracted with an acetonitrile-water mix and quantified by gas chromatography-electron capture detection (GC-ECD). Three different grain sampling techniques were tested to obtain analytical samples: a technique based on manual homogenisation and division, a second based on the use of a rotating shaker and a third on the use of compressed air. Both the rotating shaker and the compressed air techniques enabled a homogeneous laboratory sample to be obtained, from which representative analytical samples could be taken. Moreover, these techniques avoided the need for many repetitions and for grinding. This study therefore contributes to reducing sampling variability in the evaluation of deoxynivalenol contamination of organic wheat grain, at a reasonable cost.

  3. Magnetic separation techniques in sample preparation for biological analysis: a review.

    PubMed

    He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke

    2014-12-01

    Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetism, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis are reviewed. The strategy of magnetic separation techniques is summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles are reviewed in detail, and the characterization of magnetic materials is also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds and for the immobilization of enzymes are described. Finally, existing problems and possible future trends of magnetic separation techniques for biological analysis are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  4. Titration Techniques

    NASA Astrophysics Data System (ADS)

    Jacobsen, Jerrold J.; Houston Jetzer, Kelly; Patani, Néha; Zimmerman, John; Zweerink, Gerald

    1995-07-01

    Significant attention is paid to the proper technique for reading a meniscus. Video shows meniscus-viewing techniques for colorless and dark liquids and the consequences of not reading a meniscus at eye level. Lessons are provided on approaching the end point, focusing on end point colors produced via different commonly used indicators. The concept of a titration curve is illustrated by means of a pH meter. Carefully recorded images of the entire range of meniscus values in a buret, pipet, and graduated cylinder are included so that you can show your students, in lecture or pre-lab discussion, any meniscus and discuss how to read the buret properly. These buret meniscus values are very carefully recorded at the rate of one video frame per hundredth of a milliliter, so that an image showing any given meniscus value can be obtained. These images can be easily incorporated into a computer-based multimedia environment for testing or meniscus-reading exercises. Two of the authors have used this technique and found the exercise to be very well received by their students. Video on side two shows nearly 100 "bloopers", demonstrating both the right way and wrong ways to do tasks associated with titration. This material can be used in a variety of situations: to show students the correct way to do something; to test students by asking them "What is this person doing wrong?"; or to develop multimedia, computer-based lessons. The contents of Titration Techniques are listed below: Side 1. Titration: what it is. A simple titration; Acid-base titration animation; A brief redox titration; Redox titration animation; A complete acid-base titration. Titration techniques. Hand technique variations; Stopcock; Using a buret to measure liquid volumes; Wait before reading meniscus; Dirty and clean burets; Read meniscus at eye level (see Fig. 1); Meniscus viewing techniques--light colored liquids; Meniscus viewing techniques--dark liquids; Using a magnetic stirrer; Rough titration; Significant figures; Approaching the end point; End point colors; Titration with a pH meter; Titration curves; Colors of indicators. Meniscus values. Buret meniscus values; Pipet meniscus values; Graduated cylinder meniscus values. Side 2. "Bloopers". Introducing the people; Titration animation; Inspecting the buret; Rinsing the buret with water; Preparing a solid sample; Obtaining a liquid sample; Delivering a liquid sample with a Mohr pipet; Pipetting a liquid sample with a Mohr pipet; Rinsing the Mohr pipet with sample; Using the Mohr pipet to transfer sample; Delivering a liquid sample with a volumetric pipet; Pipetting a liquid sample with a volumetric pipet; Rinsing the volumetric pipet with sample; Using the volumetric pipet to transfer sample; Obtaining the titrant; Rinsing the buret with titrant; Filling the buret with titrant; Adding the indicator; The initial reading; Beginning the titration; Delivering titrant; The final reading. Figure 3 caption: Near the end point a single drop of titrant can cause a lasting color change.

  5. A PCR method based on 18S rRNA gene for detection of malaria parasite in Balochistan.

    PubMed

    Shahwani, Zubeda; Aleem, Abdul; Ahmed, Nazeer; Mushtaq, Muhammad; Afridi, Sarwat

    2016-12-01

To establish a polymerase chain reaction method based on the 18S ribosomal ribonucleic acid gene for the detection of Plasmodium deoxyribonucleic acid in patients suffering from malaria symptoms. This cross-sectional study was conducted from September 2013 to October 2014 in district Quetta of Pakistan's Balochistan province. Blood samples were collected from patients suffering from general symptoms of malaria. A polymerase chain reaction-based technique was applied for the diagnosis of malaria and detection of the responsible species in patients suspected of carrying the parasite. Performance of this polymerase chain reaction method was compared against the microscopy results. The parasite count was also calculated for microscopy-positive samples. All samples, after genomic deoxyribonucleic acid isolation, were subjected to polymerase chain reaction amplification and agarose gel electrophoresis. Of the 200 samples, 114(57%) were confirmed as positive and 86(43%) as negative for malaria by microscopy. Polymerase chain reaction identified 124(62%) samples as positive and 76(38%) as negative for malaria. The comparative analysis of both diagnostic methods confirmed 109(54.5%) samples as positive by both techniques. In addition, 5(6.58%) samples were identified as false positive and 15(12.1%) as false negative by polymerase chain reaction. Sensitivity, specificity and positive predictive values for polymerase chain reaction in comparison to microscopy were 87.98%, 93.42% and 96%, respectively. Polymerase chain reaction-based methods for malaria diagnosis and species identification were found to be more effective than the other techniques.
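The agreement figures quoted here follow directly from the 2×2 contingency counts in the abstract (109 concordant positives, 5 false positives, 15 false negatives, leaving 71 of the 200 samples as concordant negatives). A minimal sketch of the computation:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity and positive predictive value from a 2x2 table."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    ppv = tp / (tp + fp)
    return sensitivity, specificity, ppv

# Counts taken from the abstract: 109 concordant positives, 5 false positives,
# 15 false negatives; the remaining 71 of 200 samples are concordant negatives.
sens, spec, ppv = diagnostic_metrics(tp=109, fp=5, fn=15, tn=71)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} PPV={ppv:.1%}")
```

These counts reproduce the reported values to rounding (roughly 87.9%, 93.4%, and 96%).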

  6. Review of online coupling of sample preparation techniques with liquid chromatography.

    PubMed

    Pan, Jialiang; Zhang, Chengjiang; Zhang, Zhuomin; Li, Gongke

    2014-03-07

Sample preparation is still considered the bottleneck of the whole analytical procedure, and efforts have been directed towards automation, improvement of sensitivity and accuracy, and low consumption of organic solvents. Development of online sample preparation (SP) techniques coupled with liquid chromatography (LC) is a promising way to achieve these goals, and it has attracted great attention. This article reviews recent advances in online SP-LC techniques. Various online SP techniques are described and summarized, including solid-phase-based extraction, membrane-assisted liquid-phase extraction, microwave-assisted extraction, ultrasonic-assisted extraction, accelerated solvent extraction and supercritical fluid extraction. In particular, the coupling approaches of online SP-LC systems and the corresponding interfaces, such as online injectors, autosamplers combined with transport units, desorption chambers and column switching, are discussed and reviewed in detail. Typical applications of the online SP-LC techniques are summarized. Finally, the problems and expected trends in this field are discussed in order to encourage the further development of online SP-LC techniques. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Propagation-based x-ray phase contrast imaging using an iterative phase diversity technique

    NASA Astrophysics Data System (ADS)

    Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.

    2018-03-01

    Through the use of a phase diversity technique, we demonstrate a near-field in-line x-ray phase contrast algorithm that provides improved object reconstruction when compared to our previous iterative methods for a homogeneous sample. Like our previous methods, the new technique uses the sample refractive index distribution during the reconstruction process. The technique complements existing monochromatic and polychromatic methods and is useful in situations where experimental phase contrast data is affected by noise.
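Iterative near-field phase retrieval schemes of this kind alternate between the sample and detector planes with a numerical free-space propagator. The sketch below shows the Fresnel transfer-function propagator such algorithms typically rely on; the grid size, wavelength, propagation distance, and the Gaussian phase object are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

def fresnel_propagate(field, wavelength, z, dx):
    """Propagate a 2-D complex field a distance z using the Fresnel
    transfer function applied in the Fourier domain."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)          # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Illustrative parameters: 256x256 grid, ~1 A x-rays, 1 um pixels, 10 cm propagation.
n, wavelength, dx, z = 256, 1e-10, 1e-6, 0.1
x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
sample = np.exp(1j * 0.5 * np.exp(-(X**2 + Y**2) / (20e-6) ** 2))  # pure phase object

detector = fresnel_propagate(sample, wavelength, z, dx)      # forward model
recovered = fresnel_propagate(detector, wavelength, -z, dx)  # exact back-propagation
```

Because the transfer function is unitary, back-propagating the detector field with -z recovers the sample field exactly; real reconstructions iterate because only the detector intensity, not the complex field, is measured.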

  8. Least squares polynomial chaos expansion: A review of sampling strategies

    NASA Astrophysics Data System (ADS)

    Hadigol, Mohammad; Doostan, Alireza

    2018-04-01

As non-intrusive polynomial chaos expansion (PCE) techniques have gained growing popularity among researchers, we here provide a comprehensive review of major sampling strategies for least squares based PCE. Traditional sampling methods, such as Monte Carlo, Latin hypercube, quasi-Monte Carlo, optimal design of experiments (ODE) and Gaussian quadratures, as well as more recent techniques, such as coherence-optimal and randomized quadratures, are discussed. We also propose a hybrid sampling method, dubbed alphabetic-coherence-optimal, that employs the so-called alphabetic optimality criteria used in the context of ODE in conjunction with coherence-optimal samples. A comparison of the empirical performance of the selected sampling methods applied to three numerical examples, including high-order PCEs, high-dimensional problems, and low oversampling ratios, is presented to provide a road map for practitioners seeking the most suitable sampling technique for a problem at hand. We observed that the alphabetic-coherence-optimal technique outperforms the other sampling methods, especially when high-order PCEs are employed and/or the oversampling ratio is low.
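As a concrete illustration of the least squares approach in one dimension, the sketch below draws Latin hypercube samples on [-1, 1] and fits a Legendre PCE by ordinary least squares. The toy response model, expansion order, and sample size are illustrative assumptions, not examples from the review:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

def latin_hypercube(n):
    """n stratified samples on [-1, 1]: one uniform draw per equal-width stratum."""
    strata = (np.arange(n) + rng.random(n)) / n      # one point per [i/n, (i+1)/n)
    return rng.permutation(2.0 * strata - 1.0)

order, n_samples = 2, 20                  # oversampling ratio ~ 6.7
x = latin_hypercube(n_samples)
y = 1.0 + 2.0 * x + 3.0 * x**2            # toy response; polynomial, so PCE is exact

Psi = legendre.legvander(x, order)        # measurement matrix of Legendre polynomials
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

x_test = np.linspace(-1, 1, 101)
err = np.max(np.abs(legendre.legval(x_test, coeffs)
                    - (1.0 + 2.0 * x_test + 3.0 * x_test**2)))
```

Since the degree-2 Legendre basis spans the toy response exactly, the least squares fit recovers it to machine precision; the sampling strategies reviewed above differ in how well-conditioned the matrix `Psi` is for higher orders and dimensions.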

  9. Droplet-Based Segregation and Extraction of Concentrated Samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Buie, C R; Buckley, P; Hamilton, J

    2007-02-23

Microfluidic analysis often requires sample concentration and separation techniques to isolate and detect analytes of interest. Complex or scarce samples may also require an orthogonal separation and detection method or off-chip analysis to confirm results. To perform these additional steps, the concentrated sample plug must be extracted from the primary microfluidic channel with minimal sample loss and dilution. We investigated two extraction techniques: injection of immiscible fluid droplets into the sample stream ("capping") and injection of the sample into an immiscible fluid stream ("extraction"). From our results we conclude that capping is the more effective partitioning technique. Furthermore, this functionality enables additional off-chip post-processing procedures such as DNA/RNA microarray analysis, real-time polymerase chain reaction (RT-PCR), and culture growth to validate chip performance.

  10. Ring-oven based preconcentration technique for microanalysis: simultaneous determination of Na, Fe, and Cu in fuel ethanol by laser induced breakdown spectroscopy.

    PubMed

    Cortez, Juliana; Pasquini, Celio

    2013-02-05

The ring-oven technique, originally applied to classical qualitative analysis from the 1950s to the 1970s, is revisited for use in a simple though highly efficient and green procedure for analyte preconcentration prior to determination by the microanalytical techniques presently available. The proposed preconcentration technique is based on the dropwise delivery of a small volume of sample to a filter paper substrate, assisted by a flow-injection-like system. The filter paper is maintained in a small circular heated oven (the ring oven). Drops of the sample solution diffuse by capillarity from the center to a circular area of the paper substrate. After the total sample volume has been delivered, a ring with a sharp (ca. 350 μm) circular contour, about 2.0 cm in diameter, is formed on the paper and contains most of the analytes originally present in the sample volume. Preconcentration coefficients of the analyte can reach 250-fold (on a m/m basis) for a sample volume as small as 600 μL. The proposed system and procedure have been evaluated to concentrate Na, Fe, and Cu in fuel ethanol, followed by simultaneous direct determination of these species in the ring contour, employing the microanalytical technique of laser induced breakdown spectroscopy (LIBS). Detection limits of 0.7, 0.4, and 0.3 μg mL⁻¹ and mean recoveries of (109 ± 13)%, (92 ± 18)%, and (98 ± 12)%, for Na, Fe, and Cu, respectively, were obtained in fuel ethanol. It is possible to anticipate the application of the technique, coupled to modern microanalytical and multianalyte techniques, to several analytical problems requiring analyte preconcentration and/or sample stabilization.

  11. Office-based narrow band imaging-guided flexible laryngoscopy tissue sampling: A cost-effectiveness analysis evaluating its impact on Taiwanese health insurance program.

    PubMed

    Fang, Tuan-Jen; Li, Hsueh-Yu; Liao, Chun-Ta; Chiang, Hui-Chen; Chen, I-How

    2015-07-01

    Narrow band imaging (NBI)-guided flexible laryngoscopy tissue sampling for laryngopharyngeal lesions is a novel technique. Patients underwent the procedure in an office-based setting without being sedated, which is different from the conventional technique performed using direct laryngoscopy. Although the feasibility and effects of this procedure were established, its financial impact on the institution and Taiwanese National Health Insurance program was not determined. This is a retrospective case-control study. From May 2010 to April 2011, 20 consecutive patients who underwent NBI flexible laryngoscopy tissue sampling were recruited. During the same period, another 20 age-, sex-, and lesion-matched cases were enrolled in the control group. The courses for procedures and financial status were analyzed and compared between groups. Office-based NBI flexible laryngoscopy tissue sampling procedure took 27 minutes to be completed, while 191 minutes were required for the conventional technique. Average reimbursement for each case was New Taiwan Dollar (NT$)1264 for patients undergoing office-based NBI flexible laryngoscopy tissue sampling, while NT$10,913 for those undergoing conventional direct laryngoscopy in the operation room (p < 0.001). The institution suffered a loss of at least NT$690 when performing NBI flexible laryngoscopy tissue sampling. Office-based NBI flexible laryngoscopy tissue sampling is a cost-saving procedure for patients and the Taiwanese National Health Insurance program. It also saves the procedure time. However, the net financial loss for the institution and physician would limit its popularization unless reimbursement patterns are changed. Copyright © 2013. Published by Elsevier B.V.

  12. A technique based on droplet evaporation to recognize alcoholic drinks

    NASA Astrophysics Data System (ADS)

    González-Gutiérrez, Jorge; Pérez-Isidoro, Rosendo; Ruiz-Suárez, J. C.

    2017-07-01

Chromatography is, at present, the technique most used to determine the purity of alcoholic drinks; it involves careful separation of the components of the liquid. However, since this technique requires sophisticated instrumentation, alternatives such as conductivity measurements and UV-Vis and infrared spectrometries are also used. We report here a method based on salt-induced crystallization patterns formed during the evaporation of alcoholic drops. We found that droplets of different samples form different structures upon drying, which we characterize by their radial density profiles. We show that, using the dried deposit of a spirit as a control sample, our method allows us to differentiate between pure and adulterated drinks. As a proof of concept, we study tequila.
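A radial density profile of the kind used to characterize the dried deposits can be computed by azimuthally averaging image intensity in concentric annuli about the drop center. A minimal sketch on a synthetic ring-shaped deposit (the image and bin count are illustrative assumptions):

```python
import numpy as np

def radial_profile(img, n_bins=50):
    """Mean intensity versus distance from the image center (azimuthal average)."""
    h, w = img.shape
    yy, xx = np.indices((h, w))
    r = np.hypot(xx - (w - 1) / 2, yy - (h - 1) / 2)
    edges = np.linspace(0, r.max(), n_bins + 1)
    idx = np.digitize(r.ravel(), edges[1:-1])        # annulus index per pixel
    sums = np.bincount(idx, weights=img.ravel(), minlength=n_bins)
    counts = np.bincount(idx, minlength=n_bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, sums / np.maximum(counts, 1)

# Synthetic "coffee ring": a bright annulus at radius 60 px on a 201x201 image.
yy, xx = np.indices((201, 201))
r = np.hypot(xx - 100, yy - 100)
img = np.exp(-((r - 60.0) ** 2) / (2 * 4.0**2))

centers, profile = radial_profile(img)
ring_radius = centers[np.argmax(profile)]
```

The peak of the profile recovers the deposit's ring radius; comparing whole profiles against a control deposit is what supports the pure-versus-adulterated discrimination described above.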

  13. Water-based gas purge microsyringe extraction coupled with liquid chromatography for determination of alkylphenols from sea food Laminaria japonica Aresh.

    PubMed

    Yang, Cui; Zhao, Jinhua; Wang, Juan; Yu, Hongling; Piao, Xiangfan; Li, Donghao

    2013-07-26

A novel organic solvent-free mode of gas purge microsyringe extraction, termed water-based gas purge microsyringe extraction, was developed. This technique can directly extract target compounds from wet samples without any drying process. Parameters affecting the extraction efficiency were investigated. Under optimal extraction conditions, the recoveries of alkylphenols were between 87.6 and 105.8%, and reproducibility was between 5.2 and 12.1%. The technique was also used to determine six kinds of alkylphenols (APs) in samples of Laminaria japonica Aresh. OP and NP were detected in all the samples, at concentrations ranging from 26.0 to 54.5 ng g⁻¹ and 45.0 to 180.4 ng g⁻¹, respectively. 4-n-Butylphenol was detected in only one sample, at a very low concentration. Other APs were not detected in the L. japonica Aresh samples. The experimental results demonstrated that the technique is fast, simple, non-polluting, allows for quantitative extraction, and requires no drying process for wet samples. Since only aqueous solution and a conventional microsyringe were used, this technique proved affordable, efficient, and convenient for the extraction of volatile and semivolatile ionizable compounds. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. Quadrature demodulation based circuit implementation of pulse stream for ultrasonic signal FRI sparse sampling

    NASA Astrophysics Data System (ADS)

    Shoupeng, Song; Zhou, Jiang

    2017-03-01

Converting an ultrasonic signal to an ultrasonic pulse stream is the key step of finite rate of innovation (FRI) sparse sampling. At present, ultrasonic pulse-stream-forming techniques are mainly based on digital algorithms, and no hardware circuit that can achieve this has been reported. This paper proposes a new quadrature demodulation (QD) based circuit implementation method for forming an ultrasonic pulse stream. After an elaboration of FRI sparse sampling theory, the processing of the ultrasonic signal is explained, followed by a discussion and analysis of ultrasonic pulse-stream-forming methods. In contrast to ultrasonic signal envelope extracting techniques, a quadrature demodulation method (QDM) is proposed. Simulation experiments were performed to determine its performance at various signal-to-noise ratios (SNRs). The circuit was then designed, comprising a mixing module, an oscillator, a low-pass filter (LPF), and a root-of-square-sum module. Finally, application experiments were carried out on ultrasonic flaw testing of a pipeline sample. The experimental results indicate that the QDM can accurately convert an ultrasonic signal to an ultrasonic pulse stream and recover the original signal information, such as pulse width, amplitude, and time of arrival. This technique lays the foundation for ultrasonic signal FRI sparse sampling directly with hardware circuitry.
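The QD principle itself is straightforward to verify numerically: mix the signal with in-phase and quadrature copies of the carrier, low-pass filter both branches, and take the root of the sum of squares. A sketch with an illustrative Gaussian ultrasonic pulse (the carrier frequency, sampling rate, and moving-average LPF are assumptions for demonstration, not the authors' circuit values):

```python
import numpy as np

fs, fc = 1.0e6, 50.0e3                 # sampling rate and carrier frequency (Hz)
t = np.arange(0, 1e-3, 1 / fs)
envelope_true = np.exp(-((t - 0.5e-3) ** 2) / (2 * (50e-6) ** 2))
signal = envelope_true * np.cos(2 * np.pi * fc * t)   # simulated ultrasonic pulse

def lowpass(x, width):
    """Crude moving-average LPF; width is an integer number of samples."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

width = int(2 * fs / fc)               # average over two carrier periods
i_branch = lowpass(signal * np.cos(2 * np.pi * fc * t), width)   # in-phase mix
q_branch = lowpass(signal * np.sin(2 * np.pi * fc * t), width)   # quadrature mix
envelope = 2 * np.sqrt(i_branch**2 + q_branch**2)    # root of square sum

peak_time = t[np.argmax(envelope)]
```

The recovered envelope reproduces the pulse's amplitude and time of arrival, which is exactly the information the hardware pulse stream must preserve for FRI reconstruction.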

  15. On using sample selection methods in estimating the price elasticity of firms' demand for insurance.

    PubMed

    Marquis, M Susan; Louis, Thomas A

    2002-01-01

We evaluate a technique based on sample selection models that has been used by health economists to estimate the price elasticity of firms' demand for insurance. We demonstrate that this technique produces inflated estimates of the price elasticity and show that alternative methods lead to valid estimates.

  16. Non-contact evaluation of milk-based products using air-coupled ultrasound

    NASA Astrophysics Data System (ADS)

    Meyer, S.; Hindle, S. A.; Sandoz, J.-P.; Gan, T. H.; Hutchins, D. A.

    2006-07-01

An air-coupled ultrasonic technique has been developed and used to detect physicochemical changes of liquid beverages within a glass container. This made use of two wide-bandwidth capacitive transducers, combined with pulse-compression techniques. The use of a glass container to house samples enabled visual inspection, helping to verify the results of some of the ultrasonic measurements. The non-contact pulse-compression system was used to evaluate agglomeration processes in milk-based products. It is shown that the amplitude of the signal varied with time after the samples had been treated with lactic acid, thus promoting sample destabilization. Non-contact imaging was also performed to follow destabilization of samples by scanning in various directions across the container. The obtained ultrasonic images were also compared to those from a digital camera. Coagulation of skim milk with glucono-delta-lactone, in the same container, could be monitored to within 0.15 pH units. This rapid, non-contact and non-destructive technique has shown itself to be a feasible method for investigating the quality of milk-based beverages, and possibly other food products.

  17. LOCAD-PTS: Operation of a New System for Microbial Monitoring Aboard the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Maule, J.; Wainwright, N.; Steele, A.; Gunter, D.; Flores, G.; Effinger, M.; Danibm N,; Wells, M.; Williams, S.; Morris, H.

    2008-01-01

Microorganisms within the space stations Salyut and Mir and the International Space Station (ISS) have traditionally been monitored with culture-based techniques. These techniques involve growing environmental samples (cabin water, air or surfaces) on agar-type media for several days, followed by visualization of the resulting colonies, and return of samples to Earth for ground-based analysis. This approach has provided a wealth of useful data and enhanced our understanding of the microbial ecology within space stations. However, the approach is also limited by the following: i) more than 95% of microorganisms in the environment cannot grow on conventional growth media; ii) significant time lags occur between onboard sampling and colony visualization (3-5 days) and ground-based analysis (as long as several months); iii) colonies are often difficult to visualize due to condensation within contact slide media plates; and iv) the techniques involve growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and β-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. This technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device. This handheld device and sampling system is known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). A poster will be presented that describes a comparative study between LOCAD-PTS analysis and existing culture-based methods onboard the ISS, together with an exploratory survey of surface endotoxin throughout the ISS.
It is concluded that while a general correlation between LOCAD-PTS and traditional culture-based methods should not necessarily be expected, a combinatorial approach can be adopted where both sets of data are used together to generate a more complete story of the microbial ecology on the ISS.

  18. Advanced Navigation Strategies For Asteroid Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Getzandanner, K.; Bauman, J.; Williams, B.; Carpenter, J.

    2010-01-01

    Flyby and rendezvous missions to asteroids have been accomplished using navigation techniques derived from experience gained in planetary exploration. This paper presents analysis of advanced navigation techniques required to meet unique challenges for precision navigation to acquire a sample from an asteroid and return it to Earth. These techniques rely on tracking data types such as spacecraft-based laser ranging and optical landmark tracking in addition to the traditional Earth-based Deep Space Network radio metric tracking. A systematic study of navigation strategy, including the navigation event timeline and reduction in spacecraft-asteroid relative errors, has been performed using simulation and covariance analysis on a representative mission.

  19. Machine learning based sample extraction for automatic speech recognition using dialectal Assamese speech.

    PubMed

    Agarwalla, Swapna; Sarma, Kandarpa Kumar

    2016-06-01

Automatic Speech Recognition (ASR) and related issues are continuously evolving as inseparable elements of Human Computer Interaction (HCI). With the assimilation of emerging concepts like big data and the Internet of Things (IoT) as extended elements of HCI, ASR techniques are passing through a paradigm shift. Of late, learning-based techniques have received greater attention from ASR research communities, owing to their natural ability to mimic biological behavior, which aids ASR modeling and processing. Current learning-based ASR techniques are evolving further with the incorporation of big data and IoT-like concepts. In this paper, we report approaches based on machine learning (ML) for the extraction of relevant samples from a big data space, and apply them to ASR using soft computing techniques for Assamese speech with dialectal variations. A class of ML techniques comprising the basic Artificial Neural Network (ANN) in feedforward (FF) form and Deep Neural Network (DNN) forms, using raw speech, extracted features and frequency-domain forms, is considered. The Multi Layer Perceptron (MLP) is configured with inputs in several forms to learn class information obtained using clustering and manual labeling. DNNs are also used to extract specific sentence types. Initially, relevant samples are selected and assimilated from a large storage. Next, a few conventional methods are used for feature extraction of a few selected types. The features comprise both spectral and prosodic types. These are applied to Recurrent Neural Network (RNN) and Fully Focused Time Delay Neural Network (FFTDNN) structures to evaluate their performance in recognizing mood, dialect, speaker and gender variations in dialectal Assamese speech.
The system is tested under several background noise conditions by considering the recognition rates (obtained using confusion matrices and manually) and computation time. It is found that the proposed ML based sentence extraction techniques and the composite feature set used with RNN as classifier outperform all other approaches. By using ANN in FF form as feature extractor, the performance of the system is evaluated and a comparison is made. Experimental results show that the application of big data samples has enhanced the learning of the ASR system. Further, the ANN based sample and feature extraction techniques are found to be efficient enough to enable application of ML techniques in big data aspects as part of ASR systems. Copyright © 2015 Elsevier Ltd. All rights reserved.
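The FF-ANN building block used throughout such systems reduces to a small matrix-algebra core. Below is a minimal one-hidden-layer network trained by gradient descent on a toy two-class problem; the synthetic Gaussian clusters stand in for speech feature classes, and the layer sizes and learning rate are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-in for two feature classes (e.g. two dialect clusters in feature space).
n = 200
X = np.vstack([rng.normal(-2.0, 1.0, (n, 2)), rng.normal(2.0, 1.0, (n, 2))])
y = np.repeat([0.0, 1.0], n).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer of 8 units, trained with full-batch gradient descent.
W1, b1 = rng.normal(0, 0.5, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 1)), np.zeros(1)
lr = 0.5
for _ in range(500):
    h = sigmoid(X @ W1 + b1)                 # hidden activations
    p = sigmoid(h @ W2 + b2)                 # class probability
    d2 = (p - y) / len(X)                    # output delta (cross-entropy loss)
    d1 = (d2 @ W2.T) * h * (1 - h)           # backpropagated hidden delta
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(0)

p = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)  # final forward pass
accuracy = np.mean((p > 0.5) == (y > 0.5))
```

On well-separated clusters like these the network converges quickly; the RNN and FFTDNN classifiers described above extend the same forward/backward machinery with recurrence and time delays.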

  20. A comparison between polymerase chain reaction (PCR) and traditional techniques for the diagnosis of leptospirosis in bovines.

    PubMed

    Hernández-Rodríguez, Patricia; Díaz, César A; Dalmau, Ernesto A; Quintero, Gladys M

    2011-01-01

Leptospirosis is caused by Leptospira, Gram-negative spirochaetes whose microbiologic identification is difficult due to their low rates of growth and metabolic activity. In Colombia, leptospirosis diagnosis is achieved by serological techniques without unified criteria for what positive titers are. In this study we compared polymerase chain reaction (PCR) with microbiological culture and dark field microscopy for the diagnosis of leptospirosis. Microbiological and molecular techniques were performed on 83 samples of urine taken from bovines in the savannahs surrounding Bogotá in Colombia, with a presumptive diagnosis of leptospirosis. 117 samples of urine taken from healthy bovines were used as negative controls. 83 samples were positive by the microscopic agglutination test (MAT) with titers ≥ 1:50; 81 with titers ≥ 1:100; and 66 with titers ≥ 1:200. 36% of the total samples (73/200) were Leptospira-positive by microbiological culture, 32% (63/200) by dark field microscopy and 37% (74/200) by PCR. Amplicons obtained by PCR were 482 base pairs long, which is Leptospira-specific. An amplicon of 262 base pairs, typical of pathogenic Leptospira, was observed in 71 of the 74 PCR-positive samples. The remaining 3 samples showed a 240 base pair amplicon, which is typical of saprophytic Leptospira. PCR as a Leptospira diagnosis technique was 100% sensitive and 99% specific in comparison to microbiological culture. A kappa value of 0.99 indicated excellent concordance between these techniques. Sensitivity and specificity reported for MAT when compared to microbiological culture were 0.95 and 0.89 with a ≥ 1:50 cut off. PCR was a reliable method for the rapid and precise diagnosis of leptospirosis when compared to traditional techniques in our study. The research presented here will be helpful to improve diagnosis and control of leptospirosis in Colombia and other endemic countries. Copyright © 2010 Elsevier B.V. All rights reserved.
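The agreement statistics quoted here follow from the 2×2 PCR-versus-culture table. The sketch below uses cell counts reconstructed to be consistent with the reported figures (73 culture-positives all PCR-positive, one PCR-positive/culture-negative sample, 200 samples in total); the exact cells are an inference from the abstract, not taken from the paper:

```python
def kappa_2x2(a, b, c, d):
    """Cohen's kappa for a 2x2 agreement table:
    a = both positive, b = test+/ref-, c = test-/ref+, d = both negative."""
    n = a + b + c + d
    p_observed = (a + d) / n
    p_chance = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
    return (p_observed - p_chance) / (1 - p_chance)

# PCR vs. microbiological culture; cells inferred from the reported totals.
a, b, c, d = 73, 1, 0, 126
sensitivity = a / (a + c)          # 1.00 -> the reported 100%
specificity = d / (b + d)          # ~0.992 -> the reported 99%
kappa = kappa_2x2(a, b, c, d)      # ~0.99, matching the reported value
```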

  1. Use of a Smartphone as a Colorimetric Analyzer in Paper-based Devices for Sensitive and Selective Determination of Mercury in Water Samples.

    PubMed

    Jarujamrus, Purim; Meelapsom, Rattapol; Pencharee, Somkid; Obma, Apinya; Amatatongchai, Maliwan; Ditcharoen, Nadh; Chairam, Sanoe; Tamuang, Suparb

    2018-01-01

A smartphone application, called CAnal, was developed as a colorimetric analyzer in paper-based devices for sensitive and selective determination of mercury(II) in water samples. Measurement on the double layer of a microfluidic paper-based analytical device (μPAD), fabricated by an alkyl ketene dimer (AKD)-inkjet printing technique with a special design and doped with unmodified silver nanoparticles (AgNPs) on the detection zones, was performed by monitoring the gray intensity in the blue channel of the AgNPs, which disintegrate when exposed to mercury(II) on the μPAD. Under the optimized conditions, the developed approach showed high sensitivity, a low limit of detection (0.003 mg L⁻¹; 3 × SD of the blank / slope of the calibration curve), small sample volume uptake (two aliquots of 2 μL), and short analysis time. The linear range of this technique was 0.01 to 10 mg L⁻¹ (r² = 0.993). Furthermore, practical analysis of various water samples was also demonstrated to have acceptable performance in agreement with data from cold vapor atomic absorption spectrophotometry (CV-AAS), a conventional method. The proposed technique allows for rapid, simple (instant report of the final mercury(II) concentration via the smartphone display), sensitive, selective, and on-site analysis with high sample throughput (48 samples h⁻¹, n = 3) of trace mercury(II) in water samples, which is suitable for end users who are unskilled in analyzing mercury(II) in water samples.
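At its core, a colorimetric readout of this kind reduces to a linear calibration between detection-zone gray intensity and analyte concentration, inverted to report unknowns. A sketch with made-up numbers; the intensities and concentrations below are hypothetical, not the paper's calibration data:

```python
import numpy as np

# Hypothetical calibration: mean gray intensity of the blue channel at the
# detection zone, recorded for known Hg(II) standards (values are invented).
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])                # mg/L
intensity = np.array([200.0, 180.0, 161.0, 120.0, 41.0])  # fades as AgNPs disintegrate

slope, intercept = np.polyfit(conc, intensity, 1)  # linear calibration curve

def concentration_from_intensity(i):
    """Invert the calibration line to report an unknown sample's Hg(II) level."""
    return (i - intercept) / slope

unknown = concentration_from_intensity(140.0)
```

This inversion is the arithmetic a smartphone analyzer performs per image before displaying the final concentration.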

  2. Fully Integrated Microfluidic Device for Direct Sample-to-Answer Genetic Analysis

    NASA Astrophysics Data System (ADS)

    Liu, Robin H.; Grodzinski, Piotr

Integration of microfluidics technology with DNA microarrays enables building complete sample-to-answer systems that are useful in many applications such as clinical diagnostics. In this chapter, a fully integrated microfluidic device [1] that consists of microfluidic mixers, valves, pumps, channels, chambers, heaters, and a DNA microarray sensor to perform DNA analysis of complex biological sample solutions is presented. This device can perform on-chip sample preparation (including magnetic bead-based cell capture, cell preconcentration and purification, and cell lysis) of complex biological sample solutions (such as whole blood), polymerase chain reaction, DNA hybridization, and electrochemical detection. A few novel microfluidic techniques were developed and employed. A micromixing technique based on a cavitation microstreaming principle was implemented to enhance target cell capture from whole blood samples using immunomagnetic beads. This technique was also employed to accelerate the DNA hybridization reaction. Thermally actuated paraffin-based microvalves were developed to regulate flows. Electrochemical pumps and thermopneumatic pumps were integrated on the chip to provide pumping of liquid solutions. The device is completely self-contained: no external pressure sources, fluid storage, mechanical pumps, or valves are necessary for fluid manipulation, thus eliminating possible sample contamination and simplifying device operation. Pathogenic bacteria detection from ~mL whole blood samples and single-nucleotide polymorphism analysis directly from diluted blood were demonstrated. The device provides a cost-effective solution to direct sample-to-answer genetic analysis, and thus has a potential impact in the fields of point-of-care genetic analysis, environmental testing, and biological warfare agent detection.

  3. Application of the laser induced deflection (LID) technique for low absorption measurements in bulk materials and coatings

    NASA Astrophysics Data System (ADS)

    Triebel, W.; Mühlig, C.; Kufert, S.

    2005-10-01

Precise absorption measurements of bulk materials and coatings under pulsed ArF laser irradiation are presented, using a compact experimental setup based on the laser induced deflection (LID) technique. For absorption measurements of bulk materials, the influence of pure bulk and pure surface absorption on the temperature and refractive index profile, and thus on the probe beam deflection, is analyzed in detail. The separation of bulk and surface absorption via the commonly used variation of the sample thickness is carried out for fused silica and calcium fluoride. The experimental results show that, for the given surface polishing quality, the bulk absorption coefficient of fused silica can be obtained by investigating only one sample. To avoid the drawback of differing bulk and surface properties within a thickness series, we propose a strategy based on the LID technique to obtain surface and bulk absorption separately by investigating only one sample. Apart from measuring bulk absorption coefficients, the LID technique is applied to determine the absorption of highly reflecting (HR) coatings on CaF2 substrates. Besides the measuring strategy, experimental results for an AlF3/LaF3-based HR coating are presented. In order to investigate a larger variety of coatings, including highly transmitting coatings, a general measuring strategy based on the LID technique is proposed.

  4. Sample Preparation for Mass Spectrometry Imaging of Plant Tissues: A Review

    PubMed Central

    Dong, Yonghui; Li, Bin; Malitsky, Sergey; Rogachev, Ilana; Aharoni, Asaph; Kaftan, Filip; Svatoš, Aleš; Franceschi, Pietro

    2016-01-01

Mass spectrometry imaging (MSI) is a mass spectrometry-based molecular ion imaging technique. It provides the means for ascertaining the spatial distribution of a large variety of analytes directly on tissue sample surfaces without any labeling or staining agents. These advantages make it an attractive molecular histology tool in medical, pharmaceutical, and biological research. Likewise, MSI has started gaining popularity in plant sciences; yet, information regarding sample preparation methods for plant tissues is still limited. Sample preparation is a crucial step that is directly associated with the quality and authenticity of the imaging results; it therefore demands in-depth studies based on the characteristics of plant samples. In this review, a sample preparation pipeline is discussed in detail and illustrated through selected practical examples. In particular, special concerns regarding sample preparation for plant imaging are critically evaluated. Finally, the applications of MSI techniques in plants are reviewed according to different classes of plant metabolites. PMID:26904042

  5. A preclustering-based ensemble learning technique for acute appendicitis diagnoses.

    PubMed

    Lee, Yen-Hsien; Hu, Paul Jen-Hwa; Cheng, Tsang-Hsiang; Huang, Te-Chia; Chuang, Wei-Yao

    2013-06-01

    Acute appendicitis is a common medical condition, whose effective, timely diagnosis can be difficult. A missed diagnosis not only puts the patient in danger but also requires additional resources for corrective treatments. An acute appendicitis diagnosis constitutes a classification problem, for which a further fundamental challenge pertains to the skewed outcome class distribution of instances in the training sample. A preclustering-based ensemble learning (PEL) technique aims to address the associated imbalanced sample learning problems and thereby support the timely, accurate diagnosis of acute appendicitis. The proposed PEL technique employs undersampling to reduce the number of majority-class instances in a training sample, uses preclustering to group similar majority-class instances into multiple groups, and selects from each group representative instances to create more balanced samples. The PEL technique thereby reduces potential information loss from random undersampling. It also takes advantage of ensemble learning to improve performance. We empirically evaluate this proposed technique with 574 clinical cases obtained from a comprehensive tertiary hospital in southern Taiwan, using several prevalent techniques and a salient scoring system as benchmarks. The comparative results show that PEL is more effective and less biased than any benchmarks. The proposed PEL technique seems more sensitive to identifying positive acute appendicitis than the commonly used Alvarado scoring system and exhibits higher specificity in identifying negative acute appendicitis. In addition, the sensitivity and specificity values of PEL appear higher than those of the investigated benchmarks that follow the resampling approach. Our analysis suggests PEL benefits from the more representative majority-class instances in the training sample. According to our overall evaluation results, PEL records the best overall performance, and its area under the curve measure reaches 0.619. 
The PEL technique is capable of addressing imbalanced sample learning associated with acute appendicitis diagnosis. Our evaluation results suggest PEL is less biased toward a positive or negative class than the investigated benchmark techniques. In addition, our results indicate the overall effectiveness of the proposed technique, compared with prevalent scoring systems or salient classification techniques that follow the resampling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
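The preclustering-and-undersampling step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the tiny 1-D k-means and the one-feature representation are simplifying assumptions made for brevity.

```python
import random

def kmeans_1d(values, k, rng, iters=20):
    """Minimal 1-D k-means, standing in for the pre-clustering step."""
    centers = rng.sample(values, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda j: abs(v - centers[j]))].append(v)
        centers = [sum(c) / len(c) if c else centers[j]
                   for j, c in enumerate(clusters)]
    return [c for c in clusters if c]  # drop empty clusters

def pel_balanced_sets(majority, minority, k=3, n_sets=5, seed=1):
    """Build n_sets balanced training sets: majority instances are drawn
    evenly from the clusters so each reduced set stays representative."""
    rng = random.Random(seed)
    clusters = kmeans_1d(majority, k, rng)
    sets = []
    for _ in range(n_sets):
        picked = []
        per_cluster = max(1, len(minority) // len(clusters))
        for c in clusters:
            picked += rng.sample(c, min(per_cluster, len(c)))
        sets.append((picked[:len(minority)], list(minority)))
    return sets
```

Each balanced set would then train one member of the ensemble, whose votes are combined at prediction time.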

  6. Large-volume constant-concentration sampling technique coupling with surface-enhanced Raman spectroscopy for rapid on-site gas analysis

    NASA Astrophysics Data System (ADS)

    Zhang, Zhuomin; Zhan, Yisen; Huang, Yichun; Li, Gongke

    2017-08-01

In this work, a portable large-volume constant-concentration (LVCC) sampling technique coupled with surface-enhanced Raman spectroscopy (SERS) was developed for rapid on-site gas analysis based on suitable derivatization methods. The LVCC sampling technique mainly consists of a specially designed sampling cell, comprising a rigid sample container and a flexible sampling bag, and an absorption-derivatization module with a portable pump and a gas flowmeter. The technique allows a large, alterable and well-controlled sampling volume, which keeps the concentration of the gas target in the headspace phase constant during the entire sampling process and makes the sampling result more representative. Moreover, absorption and derivatization of the gas target during LVCC sampling are efficiently merged into one step, using bromine-thiourea and OPA-NH4+ strategies for ethylene and SO2 respectively, which makes the LVCC sampling technique readily compatible with subsequent SERS analysis. Finally, a new LVCC sampling-SERS method was developed and successfully applied to the rapid analysis of trace ethylene and SO2 from fruits, which could be accurately quantified in real fruit samples. The concentration fluctuations of ethylene and SO2 during the entire LVCC sampling process were below 4.3% and 2.1% respectively, and recoveries from fruit samples were in the ranges 95.0-101% and 97.0-104% respectively. The portable LVCC sampling technique is expected to pave the way for rapid on-site SERS analysis of accurate concentrations of trace gas targets in real samples.

  8. Flow-based analysis using microfluidics-chemiluminescence systems.

    PubMed

    Al Lawati, Haider A J

    2013-01-01

    This review will discuss various approaches and techniques in which analysis using microfluidics-chemiluminescence systems (MF-CL) has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. A hydrodynamic pumping system is predominately used for these applications. However, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidics channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as mirror reaction, liquid core waveguide, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.

  9. Incorporation of Multiwalled Carbon Nanotubes into High Temperature Resin Using Dry Mixing Techniques

    NASA Technical Reports Server (NTRS)

    Ghose, Sayata; Watson, Kent A.; Delozier, Donavon M.; Working, Dennis C.; Siochi, Emilie J.; Connell, John W.

    2006-01-01

As part of an ongoing effort to develop multifunctional advanced composites, blends of PETI330 and multiwalled carbon nanotubes (MWNTs) were prepared and characterized. Dry mixing techniques were employed and the maximum loading level of the MWNT chosen was based primarily on its effect on melt viscosity. The PETI330/MWNT mixtures were prepared at concentrations ranging from 3 to 25 wt %. The resulting powders were characterized for homogeneity, thermal and rheological properties and extrudability as continuous fibers. Based on the characterization results, samples containing 10, 15 and 20 wt % MWNTs were chosen for more comprehensive evaluation. Samples were also prepared using in situ polymerization and solution mixing techniques and their properties were compared with the ball-mill prepared samples. The preparation and characterization of PETI330/MWNT nanocomposites are discussed herein.

  10. Evaluation of a new serological technique for detecting rabies virus antibodies following vaccination.

    PubMed

    Ma, Xiaoyue; Niezgoda, Michael; Blanton, Jesse D; Recuenco, Sergio; Rupprecht, Charles E

    2012-08-03

Two major techniques are currently used to estimate rabies virus antibody values: neutralization assays, such as the rapid fluorescent focus inhibition test (RFFIT), and enzyme-linked immunosorbent assays (ELISAs). The RFFIT is considered the gold standard assay and has been used to assess the titer of rabies virus neutralizing antibodies for more than three decades. In the late 1970s, ELISA began to be used to estimate the level of rabies virus antibody and has recently been used by some laboratories as an alternate screening test for animal sera. Although the ELISA appears simpler, safer and more efficient, the assay is less sensitive in detecting low values of rabies virus neutralizing antibodies than neutralization tests. This study was designed to evaluate a new ELISA-based method for detecting rabies virus binding antibody. This new technique uses electrochemiluminescence labels and carbon electrode plates to detect binding events. In this comparative study, the RFFIT and the new ELISA-based technique were used to evaluate the level of rabies virus antibodies in human and animal serum samples. By using a conservative approximation of 0.15 IU/ml as a cutoff point, the new ELISA-based technique demonstrated a sensitivity of 100% and a specificity of 95% for human samples and for experimental animal samples. The sensitivity and specificity for field animal samples were 96% and 95%, respectively. The preliminary results from this study appear promising and demonstrate a higher sensitivity than traditional ELISA methods. Published by Elsevier Ltd.

  11. Computationally-efficient optical coherence elastography to assess degenerative osteoarthritis based on ultrasound-induced fringe washout (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Tong, Minh Q.; Hasan, M. Monirul; Gregory, Patrick D.; Shah, Jasmine; Park, B. Hyle; Hirota, Koji; Liu, Junze; Choi, Andy; Low, Karen; Nam, Jin

    2017-02-01

We demonstrate a computationally efficient optical coherence elastography (OCE) method based on fringe washout. By introducing ultrasound on alternating depth profiles, we can obtain information on the mechanical properties of a sample within the acquisition of a single image. This is achieved by simply comparing the intensity of adjacent depth profiles to quantify the degree of fringe washout. Agar phantom samples of various densities were measured and quantified by our OCE technique, and a correlation with Young's modulus measured by atomic force microscopy (AFM) was observed. Knee cartilage samples from monoiodoacetate-induced arthritis (MIA) rat models were used to replicate cartilage damage, and our proposed OCE technique was applied alongside intensity and birefringence analyses and AFM measurements. The results indicate promising correlations between our OCE technique and polarization-sensitive OCT, AFM Young's modulus measurements and histology. Our OCE method is applicable to any existing OCT system and is demonstrated to be computationally efficient.

  12. Identification of Species and Sources of Cryptosporidium Oocysts in Storm Waters with a Small-Subunit rRNA-Based Diagnostic and Genotyping Tool

    PubMed Central

    Xiao, Lihua; Alderisio, Kerri; Limor, Josef; Royer, Michael; Lal, Altaf A.

    2000-01-01

The identification of Cryptosporidium oocysts in environmental samples is largely made by use of an immunofluorescent assay. In this study, we have used a small-subunit rRNA-based PCR-restriction fragment length polymorphism technique to identify the species and sources of Cryptosporidium oocysts present in 29 storm water samples collected from a stream in New York. A total of 12 genotypes were found in 27 positive samples; for 4 of these, the species and probable origins were identified by sequence analysis, whereas the rest represent new genotypes from wildlife. Thus, this technique provides an alternative method for the detection and differentiation of Cryptosporidium parasites in environmental samples. PMID:11097935

  13. Comparison of two headspace sampling techniques for the analysis of off-flavour volatiles from oat based products.

    PubMed

    Cognat, Claudine; Shepherd, Tom; Verrall, Susan R; Stewart, Derek

    2012-10-01

    Two different headspace sampling techniques were compared for analysis of aroma volatiles from freshly produced and aged plain oatcakes. Solid phase microextraction (SPME) using a Carboxen-Polydimethylsiloxane (PDMS) fibre and entrainment on Tenax TA within an adsorbent tube were used for collection of volatiles. The effects of variation in the sampling method were also considered using SPME. The data obtained using both techniques were processed by multivariate statistical analysis (PCA). Both techniques showed similar capacities to discriminate between the samples at different ages. Discrimination between fresh and rancid samples could be made on the basis of changes in the relative abundances of 14-15 of the constituents in the volatile profiles. A significant effect on the detection level of volatile compounds was observed when samples were crushed and analysed by SPME-GC-MS, in comparison to undisturbed product. The applicability and cost effectiveness of both methods were considered. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Hyphenated analytical techniques for materials characterisation

    NASA Astrophysics Data System (ADS)

    Armstrong, Gordon; Kailas, Lekshmi

    2017-09-01

This topical review will provide a survey of the current state of the art in ‘hyphenated’ techniques for characterisation of bulk materials, surfaces, and interfaces, whereby two or more analytical methods investigating different properties are applied simultaneously to the same sample to better characterise the sample than can be achieved by conducting separate analyses in series using different instruments. It is intended for final year undergraduates and recent graduates, who may have some background knowledge of standard analytical techniques, but are not familiar with ‘hyphenated’ techniques or hybrid instrumentation. The review will begin by defining ‘complementary’, ‘hybrid’ and ‘hyphenated’ techniques, as there is not a broad consensus among analytical scientists as to what each term means. The motivating factors driving increased development of hyphenated analytical methods will also be discussed. This introduction will conclude with a brief discussion of gas chromatography-mass spectrometry and energy dispersive x-ray analysis in electron microscopy as two examples, in the context that combining complementary techniques for chemical analysis was among the earliest examples of hyphenated characterisation methods. The emphasis of the main review will be on techniques which are sufficiently well-established that the instrumentation is commercially available, to examine physical properties, including mechanical, electrical and thermal behaviour, in addition to variations in composition, rather than methods solely to identify and quantify chemical species. Therefore, the proposed topical review will address three broad categories of techniques that the reader may expect to encounter in a well-equipped materials characterisation laboratory: microscopy based techniques, scanning probe-based techniques, and thermal analysis based techniques.
Examples drawn from recent literature, and a concluding case study, will be used to explain the practical issues that arise in combining different techniques. We will consider how the complementary and varied information obtained by combining these techniques may be interpreted together to understand the sample in greater detail than was previously possible, and also how combining different techniques can simplify sample preparation and ensure reliable comparisons are made between multiple analyses on the same samples—a topic of particular importance as nanoscale technologies become more prevalent in applied and industrial research and development (R&D). The review will conclude with a brief outline of the emerging state of the art in the research laboratory, and a suggested approach to using hyphenated techniques, whether in the teaching, quality control or R&D laboratory.

  15. Impulse excitation scanning acoustic microscopy for local quantification of Rayleigh surface wave velocity using B-scan analysis

    NASA Astrophysics Data System (ADS)

    Cherry, M.; Dierken, J.; Boehnlein, T.; Pilchak, A.; Sathish, S.; Grandhi, R.

    2018-01-01

A new technique for performing quantitative scanning acoustic microscopy imaging of Rayleigh surface wave (RSW) velocity was developed based on B-scan processing. In this technique, the focused acoustic beam is moved through many defocus distances over the sample and excited with an impulse excitation, and advanced algorithms based on frequency filtering and the Hilbert transform are used to post-process the B-scans to estimate the Rayleigh surface wave velocity. The new method was used to estimate the RSW velocity on an optically flat E6 glass sample; the velocity was measured to within ±2 m/s with a scanning time per point on the order of 1.0 s, both improvements over the previous two-point defocus method. The new method was also applied to the analysis of two titanium samples, and the velocity was estimated with very low standard deviation within certain large grains on the sample. A new behavior was observed with the B-scan analysis technique, whereby the amplitude of the surface wave decayed dramatically for certain crystallographic orientations. The new technique was also compared with previous results and found to be much more reliable, with higher contrast than previously possible with impulse excitation.

  16. Urine sampling techniques in symptomatic primary-care patients: a diagnostic accuracy review.

    PubMed

    Holm, Anne; Aabenhus, Rune

    2016-06-08

Choice of urine sampling technique in urinary tract infection may impact diagnostic accuracy and thus lead to possible over- or undertreatment. Currently no evidence-based consensus exists regarding the correct technique for sampling urine from women with symptoms of urinary tract infection in primary care. The aim of this study was to determine the accuracy of urine culture obtained with different sampling techniques in symptomatic non-pregnant women in primary care. A systematic review was conducted by searching Medline and Embase for clinical studies conducted in primary care using a randomized or paired design to compare the result of urine culture obtained with two or more collection techniques in adult, female, non-pregnant patients with symptoms of urinary tract infection. We evaluated the quality of the studies and compared accuracy based on dichotomized outcomes. We included seven studies investigating urine sampling technique in 1062 symptomatic patients in primary care. Mid-stream clean-catch had a positive predictive value of 0.79 to 0.95 and a negative predictive value close to 1 compared to sterile techniques. Two randomized controlled trials found no difference in infection rate between mid-stream clean-catch, mid-stream urine and random samples. At present, no evidence suggests that sampling technique affects the accuracy of the microbiological diagnosis in non-pregnant women with symptoms of urinary tract infection in primary care. However, the evidence presented is indirect, and the difference between mid-stream clean-catch, mid-stream urine and random samples remains to be investigated in a paired design to verify the present findings.
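Predictive values like those reported above (a PPV of 0.79 to 0.95 against sterile techniques) come straight from the 2x2 diagnostic table. A quick sketch, with counts that are hypothetical and merely chosen to echo the lower end of the reported range:

```python
def ppv_npv(tp, fp, fn, tn):
    """Positive and negative predictive value from a 2x2 table of
    true/false positives and negatives against the reference standard."""
    return tp / (tp + fp), tn / (tn + fn)

# hypothetical counts: 79 true positives, 21 false positives,
# 1 false negative, 99 true negatives
ppv, npv = ppv_npv(tp=79, fp=21, fn=1, tn=99)
```

With these counts the PPV is 0.79 and the NPV is 0.99, i.e. a positive culture is right about four times out of five while a negative culture all but rules infection out.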

  17. On the Applications of IBA Techniques to Biological Samples Analysis: PIXE and RBS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Falcon-Gonzalez, J. M.; Bernal-Alvarado, J.; Sosa, M.

    2008-08-11

Analytical techniques based on ion beams, or IBA techniques, give quantitative information on elemental concentrations in samples of a wide variety of natures. In this work, we focus on the PIXE technique, analyzing thick-target biological specimens (TTPIXE) using 3 MeV protons produced by an electrostatic accelerator. A nuclear microprobe was used to perform PIXE and RBS simultaneously, in order to resolve the uncertainties produced in absolute PIXE quantification. The advantages of using both techniques and a nuclear microprobe are discussed. Quantitative results are shown to illustrate the multielemental resolution of the PIXE technique; for this, a blood standard was used.

  18. A Monte Carlo technique for signal level detection in implanted intracranial pressure monitoring.

    PubMed

    Avent, R K; Charlton, J D; Nagle, H T; Johnson, R N

    1987-01-01

Statistical monitoring techniques like CUSUM, Trigg's tracking signal and EMP filtering have a major advantage over more recent techniques, such as Kalman filtering, because of their inherent simplicity. In many biomedical applications, such as electronic implantable devices, these simpler techniques have greater utility because of the reduced requirements on power, logic complexity and sampling speed. The determination of signal means using some of the earlier techniques is reviewed in this paper, and a new Monte Carlo based method with greater capability to sparsely sample a waveform and obtain an accurate mean value is presented. This technique may find widespread use as a trend detection method when reduced power consumption is a requirement.
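The core idea, estimating a signal's mean level from a sparse random subset of its samples, can be sketched as follows. This is an illustration of the general approach only, not the paper's algorithm; the synthetic pressure waveform is an assumption.

```python
import math
import random

def monte_carlo_mean(signal, n_samples, rng):
    """Estimate the mean level of a waveform from a sparse random
    subset of its samples, trading accuracy for sampling effort."""
    idx = rng.sample(range(len(signal)), n_samples)
    return sum(signal[i] for i in idx) / n_samples

# synthetic waveform: a baseline level of 10 with a superimposed oscillation
signal = [10.0 + math.sin(2 * math.pi * i / 100) for i in range(1000)]
estimate = monte_carlo_mean(signal, 50, random.Random(42))
```

With only 5% of the samples the estimate lands close to the true level of 10, which is the property that makes sparse sampling attractive for low-power implanted monitors.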

  19. Advanced Navigation Strategies for an Asteroid Sample Return Mission

    NASA Technical Reports Server (NTRS)

    Bauman, J.; Getzandanner, K.; Williams, B.; Williams, K.

    2011-01-01

The proximity operations phases of a sample return mission to an asteroid have been analyzed using advanced navigation techniques derived from experience gained in planetary exploration. These techniques rely on tracking types such as Earth-based radio metric Doppler and ranging, spacecraft-based ranging, and optical navigation using images of landmarks on the asteroid surface. Navigation strategies for the orbital phases leading up to sample collection, the touch down for collecting the sample, and the post sample collection phase at the asteroid are included. Options for successfully executing the phases are studied using covariance analysis and Monte Carlo simulations of an example mission to the near Earth asteroid 4660 Nereus. Two landing options were studied including trajectories with either one or two burns from orbit to the surface. Additionally, a comparison of post-sample collection strategies is presented. These strategies include remaining in orbit about the asteroid or standing-off a given distance until departure to Earth.

  20. Wavelength-spacing-tunable multichannel filter incorporating a sampled chirped fiber Bragg grating based on a symmetrical chirp-tuning technique without center wavelength shift

    NASA Astrophysics Data System (ADS)

    Han, Young-Geun; Dong, Xinyong; Lee, Ju Han; Lee, Sang Bae

    2006-12-01

    We propose and experimentally demonstrate a simple and flexible scheme for a wavelength-spacing-tunable multichannel filter exploiting a sampled chirped fiber Bragg grating based on a symmetrical modification of the chirp ratio. Symmetrical bending along a sampled chirped fiber Bragg grating attached to a flexible cantilever beam induces a variation of the chirp ratio and a reflection chirp bandwidth of the grating without a center wavelength shift. Accordingly, the wavelength spacing of a sampled chirped fiber Bragg grating is continuously controlled by the reflection chirp bandwidth variation of the grating corresponding to the bending direction, which allows for realization of an effective wavelength-spacing-tunable multichannel filter. Based on the proposed technique, we achieve the continuous tunability of the wavelength spacing in a range from 1.51 to 6.11 nm, depending on the bending direction of the cantilever beam.

  1. Puzzle Imaging: Using Large-Scale Dimensionality Reduction Algorithms for Localization.

    PubMed

    Glaser, Joshua I; Zamft, Bradley M; Church, George M; Kording, Konrad P

    2015-01-01

    Current high-resolution imaging techniques require an intact sample that preserves spatial relationships. We here present a novel approach, "puzzle imaging," that allows imaging a spatially scrambled sample. This technique takes many spatially disordered samples, and then pieces them back together using local properties embedded within the sample. We show that puzzle imaging can efficiently produce high-resolution images using dimensionality reduction algorithms. We demonstrate the theoretical capabilities of puzzle imaging in three biological scenarios, showing that (1) relatively precise 3-dimensional brain imaging is possible; (2) the physical structure of a neural network can often be recovered based only on the neural connectivity matrix; and (3) a chemical map could be reproduced using bacteria with chemosensitive DNA and conjugative transfer. The ability to reconstruct scrambled images promises to enable imaging based on DNA sequencing of homogenized tissue samples.
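The "piecing back together using local properties" idea can be illustrated in one dimension: fragments that share boundary values with their neighbours can be chained greedily back into the original sequence. This is a toy sketch under the assumption of unique values, far simpler than the paper's dimensionality-reduction machinery:

```python
import random

def reassemble(fragments):
    """Chain shuffled overlapping fragments: each fragment's first
    element equals the last element of its true predecessor."""
    frags = list(fragments)
    tails = {f[-1] for f in frags}
    # the true first fragment is the only one whose head is nobody's tail
    start = next(f for f in frags if f[0] not in tails)
    chain = list(start)
    frags.remove(start)
    while frags:
        nxt = next(f for f in frags if f[0] == chain[-1])
        chain.extend(nxt[1:])
        frags.remove(nxt)
    return chain

# scramble a sequence of unique values into fragments overlapping by one
rng = random.Random(2)
original = rng.sample(range(100), 29)
fragments = [original[i:i + 5] for i in range(0, 25, 4)]
rng.shuffle(fragments)
```

Calling reassemble(fragments) recovers original exactly; the paper's dimensionality-reduction algorithms play the analogous role for 2-D and 3-D samples with noisy local cues.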

  2. Potentiometric chip-based multipumping flow system for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples.

    PubMed

    Chango, Gabriela; Palacio, Edwin; Cerdà, Víctor

    2018-08-15

A simple potentiometric chip-based multipumping flow system (MPFS) has been developed for the simultaneous determination of fluoride, chloride, pH, and redox potential in water samples. The proposed system uses a poly(methyl methacrylate) microfluidic chip, exploiting the advantages of flow techniques with potentiometric detection. For this purpose, an automatic system was designed and built, optimizing the variables involved in the process, such as pH, ionic strength, stirring and sample volume. The system was successfully applied to water samples, yielding a versatile system with an analysis frequency of 12 samples per hour. Good correlation between the chloride and fluoride concentrations measured with the ISEs and by ion chromatography suggests satisfactory reliability of the system. Copyright © 2018 Elsevier B.V. All rights reserved.

  3. Hybrid Stochastic Search Technique based Suboptimal AGC Regulator Design for Power System using Constrained Feedback Control Strategy

    NASA Astrophysics Data System (ADS)

    Ibraheem, Omveer, Hasan, N.

    2010-10-01

A new hybrid stochastic search technique is proposed for the design of a suboptimal AGC regulator for a two-area interconnected non-reheat thermal power system incorporating a DC link in parallel with the AC tie-line. The technique hybridizes a genetic algorithm (GA) with simulated annealing (SA) to obtain the regulator. GA-SA has been successfully applied to constrained feedback control problems where other PI-based techniques have often failed. The main idea in this scheme is to seek a feasible PI-based suboptimal solution at each sampling time: the feasible solution decreases the cost function rather than fully minimizing it.
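The simulated-annealing half of such a hybrid can be sketched as below. The quadratic cost standing in for the AGC performance index, the two-element gain vector, and all tuning constants are illustrative assumptions, not the paper's plant model:

```python
import math
import random

def anneal(cost, x0, rng, steps=2000, t0=1.0, cooling=0.995):
    """Simulated annealing: perturb the candidate PI gains and accept
    worse solutions with a temperature-dependent probability."""
    x, fx, t = list(x0), cost(x0), t0
    best, fbest = list(x), fx
    for _ in range(steps):
        cand = [xi + rng.gauss(0.0, 0.1) for xi in x]
        fc = cost(cand)
        # downhill moves always accepted; uphill with prob exp(-dE/T)
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fc < fbest:
                best, fbest = list(cand), fc
        t *= cooling
    return best, fbest

# hypothetical quadratic surrogate for the AGC cost, minimum at kp=2, ki=0.5
cost = lambda g: (g[0] - 2.0) ** 2 + (g[1] - 0.5) ** 2
gains, j = anneal(cost, [0.0, 0.0], random.Random(7))
```

In the GA-SA hybrid, a step like this would refine the candidate gain vectors produced by GA crossover and mutation at each generation, subject to the feasibility constraints described above.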

  4. An inverter-based capacitive trans-impedance amplifier readout with offset cancellation and temporal noise reduction for IR focal plane array

    NASA Astrophysics Data System (ADS)

    Chen, Hsin-Han; Hsieh, Chih-Cheng

    2013-09-01

This paper presents a readout integrated circuit (ROIC) with an inverter-based capacitive trans-impedance amplifier (CTIA) and a pseudo-multiple sampling technique for infrared focal plane arrays (IRFPA). The proposed inverter-based CTIA with a coupling capacitor [1], which executes an auto-zeroing technique to cancel the offset voltage arising from process variation, substitutes for the differential amplifier in a conventional CTIA. A tunable detector bias is applied from a global external bias before exposure. This scheme not only maintains a stable detector bias voltage and signal injection efficiency, but also reduces the pixel area. The pseudo-multiple sampling technique [2] is adopted to reduce the temporal noise of the readout circuit; its noise reduction is comparable to conventional multiple sampling without the readout time growing in proportion to the number of samples. A CMOS image sensor chip with a 55×65 pixel array has been fabricated in 0.18 μm CMOS technology. It achieves a 12 μm × 12 μm pixel size, a frame rate of 72 fps, a power of 0.66 μW per pixel, and a readout temporal noise of 1.06 mVrms (with 16× pseudo-multiple sampling).
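The noise benefit of multiple sampling is simply the averaging of uncorrelated read noise, which falls roughly as 1/sqrt(n). A minimal sketch, where the pixel and noise model are assumptions made for illustration:

```python
import random

def multi_sample_read(pixel_value, read_noise, n, rng):
    """Average n reads of one pixel; uncorrelated temporal read noise
    shrinks roughly as 1/sqrt(n), mimicking (pseudo-)multiple sampling."""
    return sum(pixel_value + rng.gauss(0.0, read_noise) for _ in range(n)) / n

def std(xs):
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# empirical spread of single reads vs 16-sample averages of the same pixel
rng = random.Random(5)
single = std([multi_sample_read(0.5, 0.01, 1, rng) for _ in range(400)])
sixteen = std([multi_sample_read(0.5, 0.01, 16, rng) for _ in range(400)])
```

The 16-sample averages spread about four times less than single reads; the pseudo-multiple scheme above aims for this reduction without paying 16x the readout time.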

  5. The Recent Developments in Sample Preparation for Mass Spectrometry-Based Metabolomics.

    PubMed

    Gong, Zhi-Gang; Hu, Jing; Wu, Xi; Xu, Yong-Jiang

    2017-07-04

Metabolomics is a critical component of systems biology. Although great progress has been achieved in metabolomics, problems remain in sample preparation, data processing and data interpretation. In this review, we explore the roles, challenges and trends in sample preparation for mass spectrometry- (MS-) based metabolomics. Newly emerged sample preparation methods are also critically examined, including laser microdissection, in vivo sampling, dried blood spots, microwave-, ultrasound- and enzyme-assisted extraction, as well as microextraction techniques. Finally, we provide some conclusions and perspectives for sample preparation in MS-based metabolomics.

  6. ANALYSIS OF SAMPLING TECHNIQUES FOR IMBALANCED DATA: AN N=648 ADNI STUDY

    PubMed Central

    Dubey, Rashmi; Zhou, Jiayu; Wang, Yalin; Thompson, Paul M.; Ye, Jieping

    2013-01-01

Many neuroimaging applications deal with imbalanced imaging data. For example, in Alzheimer’s Disease Neuroimaging Initiative (ADNI) dataset, the mild cognitive impairment (MCI) cases eligible for the study are nearly two times the Alzheimer’s disease (AD) patients for structural magnetic resonance imaging (MRI) modality and six times the control cases for proteomics modality. Constructing an accurate classifier from imbalanced data is a challenging task. Traditional classifiers that aim to maximize the overall prediction accuracy tend to classify all data into the majority class. In this paper, we study an ensemble system of feature selection and data sampling for the class imbalance problem. We systematically analyze various sampling techniques by examining the efficacy of different rates and types of undersampling, oversampling, and a combination of over- and undersampling approaches. We thoroughly examine six widely used feature selection algorithms to identify significant biomarkers and thereby reduce the complexity of the data. The efficacy of the ensemble techniques is evaluated using two different classifiers including Random Forest and Support Vector Machines based on classification accuracy, area under the receiver operating characteristic curve (AUC), sensitivity, and specificity measures. Our extensive experimental results show that for various problem settings in ADNI, (1) a balanced training set obtained with K-Medoids-based undersampling gives the best overall performance among different data sampling techniques and no sampling approach; and (2) sparse logistic regression with stability selection achieves competitive performance among various feature selection algorithms. Comprehensive experiments with various settings show that our proposed ensemble model of multiple undersampled datasets yields stable and promising results. PMID:24176869
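The rate-based combination of undersampling and oversampling examined above can be sketched generically. Duplication-based oversampling here is a simplification for illustration; the study's best performer instead used K-Medoids to pick representative majority cases:

```python
import random

def resample(majority, minority, under_rate=0.5, over_rate=2.0, seed=0):
    """Undersample the majority class to under_rate of its size and
    oversample the minority class by duplication to over_rate times."""
    rng = random.Random(seed)
    maj = rng.sample(majority, int(len(majority) * under_rate))
    whole, frac = int(over_rate), over_rate - int(over_rate)
    mino = minority * whole + rng.sample(minority, int(len(minority) * frac))
    return maj, mino
```

For example, resample with under_rate=0.5 and over_rate=1.5 turns 100 majority and 20 minority cases into 50 and 30, shrinking a 5:1 imbalance to 5:3; the study's grid over such rates is what identifies the best-performing balance.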

  7. Biology Procedural Knowledge at Eleventh Grade of Senior High School in West Lampung Based on Curriculum

    NASA Astrophysics Data System (ADS)

    Sari, T. M.; Paidi; Mercuriani, I. S.

    2018-03-01

This study aimed to determine the Biology procedural knowledge of eleventh-grade senior high school students in West Lampung, based on the curriculum, in the even semester. This was descriptive research. The population comprised all senior high school students in West Lampung. Purposive sampling was used, yielding 3 schools using K13 and 3 schools using KTSP. Data were collected with a test instrument and analysed using the Mann-Whitney U test. The result showed that p = 0.028 (p < 0.05), so there were significant differences between schools using K13 and those using KTSP. The procedural knowledge of schools using K13 is higher than that of schools using KTSP, with mean scores of 4.35 (K13) and 4.00 (KTSP).

  8. Cost minimization analysis for combinations of sampling techniques in bronchoscopy of endobronchial lesions.

    PubMed

    Roth, Kjetil; Hardie, Jon Andrew; Andreassen, Alf Henrik; Leh, Friedemann; Eagan, Tomas Mikal Lind

    2009-06-01

    The choice of sampling techniques in bronchoscopy with sampling from a visible lesion will depend on the expected diagnostic yields and the costs of the sampling techniques. The aim of this study was to determine the most economical combination of sampling techniques when approaching endobronchial visible lesions. A cost minimization analysis was performed. All bronchoscopies from 2003 and 2004 at Haukeland University Hospital, Bergen, Norway, were reviewed retrospectively for diagnostic yields. 162 patients with endobronchial disease were included. Potential sampling techniques were biopsy, brushing, endobronchial needle aspiration (EBNA) and washings. Costs were estimated based on registration of equipment costs and personnel costs. Sensitivity analyses were performed to determine threshold values. The combination of biopsy, brushing and EBNA was the most economical strategy, with an average cost of Euro 893 (95% CI: 657, 1336). The cost of brushing had to be below Euro 83, and it had to increase the diagnostic yield by more than 2.2%, for biopsy plus brushing to be more economical than biopsy alone. The combination of biopsy, brushing and EBNA was more economical than biopsy and brushing when the cost of EBNA was below Euro 205 and the increase in diagnostic yield was above 5.2%. In the current study setting, biopsy, brushing and EBNA was the most economical combination of sampling techniques for endobronchial visible lesions.

  9. Optimal spatial sampling techniques for ground truth data in microwave remote sensing of soil moisture

    NASA Technical Reports Server (NTRS)

    Rao, R. G. S.; Ulaby, F. T.

    1977-01-01

    The paper examines optimal sampling techniques for obtaining accurate spatial averages of soil moisture, at various depths and for cell sizes in the range 2.5-40 acres, with a minimum number of samples. Both simple random sampling and stratified sampling procedures are used to reach a set of recommended sample sizes for each depth and for each cell size. Major conclusions from statistical sampling test results are that (1) the number of samples required decreases with increasing depth; (2) when the total number of samples cannot be prespecified or the moisture in only a single layer is of interest, then a simple random sampling procedure should be used, based on the observed mean and SD for data from a single field; (3) when the total number of samples can be prespecified and the objective is to measure the soil moisture profile with depth, then stratified random sampling based on optimal allocation should be used; and (4) decreasing the sensor resolution cell size leads to fairly large decreases in sample sizes with stratified sampling procedures, whereas only a moderate decrease is obtained with simple random sampling procedures.
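
    The "optimal allocation" in conclusion (3) is commonly Neyman allocation, which assigns each stratum a sample size proportional to N_h * S_h (stratum size times stratum standard deviation). A minimal sketch, assuming hypothetical depth strata with illustrative sizes and SDs rather than values from the paper:

```python
def neyman_allocation(strata_sizes, strata_sds, total_n):
    """Neyman optimal allocation: stratum h receives samples in
    proportion to N_h * S_h (stratum size times stratum SD)."""
    weights = [n_h * s_h for n_h, s_h in zip(strata_sizes, strata_sds)]
    raw = [total_n * w / sum(weights) for w in weights]
    alloc = [int(r) for r in raw]
    # largest-remainder rounding so the allocations sum exactly to total_n
    leftovers = sorted(range(len(raw)), key=lambda i: raw[i] - alloc[i], reverse=True)
    for i in leftovers[: total_n - sum(alloc)]:
        alloc[i] += 1
    return alloc

# three hypothetical depth strata; near-surface moisture varies most
alloc = neyman_allocation([100, 100, 100], [2.0, 1.0, 1.0], total_n=20)
```

    Strata with higher variability receive proportionally more of the fixed sampling budget, which is what makes stratified allocation more efficient than simple random sampling for a full moisture profile.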

  10. Biology Factual Knowledge at Eleventh Grade of Senior High School Students in Pacitan based on Favorite Schools

    NASA Astrophysics Data System (ADS)

    Yustiana, I. A.; Paidi; Mercuriani, I. S.

    2018-03-01

    This study aimed to determine the Biology factual knowledge of eleventh-grade senior high school students in Pacitan based on favorite schools. This was a descriptive study using a survey method. The population was all senior high school students in Pacitan. Purposive sampling was used, yielding 3 favorite schools and 3 non-favorite schools. Data were collected using a test as the research instrument and analyzed with the Mann-Whitney U test. The test yielded p = 0.000 (p < 0.05), so there was a significant difference between the factual knowledge of students in favorite and non-favorite schools in Pacitan. The factual knowledge of students in favorite schools was higher, with an average of 5.32, while students in non-favorite schools averaged 4.36.
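
    Both this study and the procedural-knowledge study above rely on the Mann-Whitney U test. As a self-contained illustration (not the authors' software), the U statistic and a large-sample normal-approximation p-value can be computed directly from the pairwise definition:

```python
import math

def mann_whitney_u(a, b):
    """U statistic: number of pairs (x in a, y in b) with x > y; ties count 1/2."""
    return sum((x > y) + 0.5 * (x == y) for x in a for y in b)

def mann_whitney_p(a, b):
    """Two-sided p-value from the large-sample normal approximation
    (no tie correction; exact tables are preferable for small samples)."""
    n, m = len(a), len(b)
    u = mann_whitney_u(a, b)
    mu = n * m / 2.0
    sigma = math.sqrt(n * m * (n + m + 1) / 12.0)
    z = abs(u - mu) / sigma
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))
```

    A useful sanity check is the identity U(a, b) + U(b, a) = len(a) * len(b), which holds for any pair of samples.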

  11. Monolithic methacrylate packed 96-tips for high throughput bioanalysis.

    PubMed

    Altun, Zeki; Skoglund, Christina; Abdel-Rehim, Mohamed

    2010-04-16

    In the pharmaceutical industry the growing number of samples to be analyzed requires high throughput and fully automated analytical techniques. Commonly used sample-preparation methods are solid-phase extraction (SPE), liquid-liquid extraction (LLE) and protein precipitation. In this paper we discuss a new sample-preparation technique based on SPE for high throughput drug extraction developed and used by our group. This new sample-preparation method is based on a monolithic methacrylate polymer as the packing sorbent for a 96-tip robotic device. Using this device, a 96-well plate can be handled in 2-4 min. The key aspect of the monolithic phase is that monolithic material can offer both good binding capacity and low back-pressure properties compared to, e.g., silica phases. The present paper presents the successful application of monolithic 96-tips and LC-MS/MS to the sample preparation of busulphan, roscovitine, metoprolol, pindolol and local anaesthetics from human plasma samples and cyclophosphamide from mice blood samples. Copyright 2009 Elsevier B.V. All rights reserved.

  12. Evaluation of pyramid training as a method to increase diagnostic sampling capacity during an emergency veterinary response to a swine disease outbreak.

    PubMed

    Canon, Abbey J; Lauterbach, Nicholas; Bates, Jessica; Skoland, Kristin; Thomas, Paul; Ellingson, Josh; Ruston, Chelsea; Breuer, Mary; Gerardy, Kimberlee; Hershberger, Nicole; Hayman, Kristen; Buckley, Alexis; Holtkamp, Derald; Karriker, Locke

    2017-06-15

    OBJECTIVE To develop and evaluate a pyramid training method for teaching techniques for collection of diagnostic samples from swine. DESIGN Experimental trial. SAMPLE 45 veterinary students. PROCEDURES Participants went through a preinstruction assessment to determine their familiarity with the equipment needed and techniques used to collect samples of blood, nasal secretions, feces, and oral fluid from pigs. Participants were then shown a series of videos illustrating the correct equipment and techniques for collecting samples and were provided hands-on pyramid-based instruction wherein a single swine veterinarian trained 2 or 3 participants on each of the techniques and each of those participants, in turn, trained additional participants. Additional assessments were performed after the instruction was completed. RESULTS Following the instruction phase, percentages of participants able to collect adequate samples of blood, nasal secretions, feces, and oral fluid increased, as did scores on a written quiz assessing participants' ability to identify the correct equipment, positioning, and procedures for collection of samples. CONCLUSIONS AND CLINICAL RELEVANCE Results suggested that the pyramid training method may be a feasible way to rapidly increase diagnostic sampling capacity during an emergency veterinary response to a swine disease outbreak.

  13. Rope-based oral fluid sampling for early detection of classical swine fever in domestic pigs at group level.

    PubMed

    Dietze, Klaas; Tucakov, Anna; Engel, Tatjana; Wirtz, Sabine; Depner, Klaus; Globig, Anja; Kammerer, Robert; Mouchantat, Susan

    2017-01-05

    Non-invasive sampling techniques based on the analysis of oral fluid specimens have gained substantial importance in the field of swine herd management. Methodological advances have focused on endemic viral diseases in commercial pig production. More recently, these approaches have been adapted to non-invasive sampling of wild boar for transboundary animal disease detection, for which such effective population-level sampling methods have not been available. In this study, a rope-in-a-bait based oral fluid sampling technique was tested to detect classical swine fever virus nucleic acid shedding from experimentally infected domestic pigs. Separated into two identically treated groups, the course of the infection differed slightly in the onset of clinical signs and the levels of viral ribonucleic acid detected in blood and oral fluid. The technique was capable of detecting classical swine fever virus nucleic acid as of day 7 post infection, coinciding with the first detection in conventional oropharyngeal swab samples from some individual animals. Except for day 7 post infection in the "slower onset group", the chances of classical swine fever virus nucleic acid detection in ropes were equal to or higher than with individual sampling. With the provided evidence, non-invasive oral fluid sampling at group level can be considered an additional cost-effective detection tool in classical swine fever prevention and control strategies. The proposed methodology is of particular use in production systems with reduced access to veterinary services, such as backyard or scavenging pig production, where it can be integrated into feeding or baiting practices.

  14. Review of Fluorescence-Based Velocimetry Techniques to Study High-Speed Compressible Flows

    NASA Technical Reports Server (NTRS)

    Bathel, Brett F.; Johansen, Craig; Inman, Jennifer A.; Jones, Stephen B.; Danehy, Paul M.

    2013-01-01

    This paper reviews five laser-induced fluorescence-based velocimetry techniques that have been used to study high-speed compressible flows at NASA Langley Research Center. The techniques discussed in this paper include nitric oxide (NO) molecular tagging velocimetry (MTV), nitrogen dioxide photodissociation (NO2-to-NO) MTV, and NO and atomic oxygen (O-atom) Doppler-shift-based velocimetry. Measurements of both single-component and two-component velocity have been performed using these techniques. This paper details the specific application and experiment for which each technique has been used, the facility in which the experiment was performed, the experimental setup, sample results, and a discussion of the lessons learned from each experiment.

  15. Rapid Monitoring of Bacteria and Fungi aboard the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Gunter, D.; Flores, G.; Effinger, M.; Maule, J.; Wainwright, N.; Steele, A.; Damon, M.; Wells, M.; Williams, S.; Morris, H.

    2009-01-01

    Microorganisms within spacecraft have traditionally been monitored with culture-based techniques. These techniques involve growth of environmental samples (cabin water, air or surfaces) on agar-type media for several days, followed by visualization of resulting colonies or return of samples to Earth for ground-based analysis. Data obtained over the past 4 decades have enhanced our understanding of the microbial ecology within space stations. However, the approach has been limited by the following factors: i) many microorganisms (estimated > 95%) in the environment cannot grow on conventional growth media; ii) significant time lags (3-5 days for incubation and up to several months to return samples to ground); iii) condensation in contact slides hinders colony counting by crew; and iv) growth of potentially harmful microorganisms, which must then be disposed of safely. This report describes the operation of a new culture-independent technique onboard the ISS for rapid analysis (within minutes) of endotoxin and beta-1,3-glucan, found in the cell walls of gram-negative bacteria and fungi, respectively. The technique involves analysis of environmental samples with the Limulus Amebocyte Lysate (LAL) assay in a handheld device, known as the Lab-On-a-Chip Application Development Portable Test System (LOCAD-PTS). LOCAD-PTS was launched to the ISS in December 2006, and here we present data obtained from March 2007 until the present day. These data include a comparative study between LOCAD-PTS analysis and existing culture-based methods, and an exploratory survey of surface endotoxin and beta-1,3-glucan throughout the ISS. While a general correlation between LOCAD-PTS and traditional culture-based methods should not be expected, we will suggest new requirements for microbial monitoring based upon culture-independent parameters measured by LOCAD-PTS.

  16. A technique for sampling low shrub vegetation, by crown volume classes

    Treesearch

    Jay R. Bentley; Donald W. Seegrist; David A. Blakeman

    1970-01-01

    The effects of herbicides or other cultural treatments of low shrubs can be sampled by a new technique using crown volume as the key variable. Low shrubs were grouped in 12 crown volume classes with index values based on height times surface area of crown. The number of plants, by species, in each class is counted on quadrats. Many quadrats are needed for highly...

  17. Improvement of the efficient referencing and sample positioning system for micro focused synchrotron X-ray techniques

    NASA Astrophysics Data System (ADS)

    Spangenberg, T.; Göttlicher, J.; Steininger, R.

    2016-05-01

    An efficient referencing and sample positioning system is a basic tool for a micro focus beamline at a synchrotron. The command-line-based system introduced seven years ago has been upgraded at the SUL-X beamline at ANKA [1]. A new combination of current server-client techniques offers direct control and makes this frequently used tool easier for inexperienced users to handle.

  18. Puzzle Imaging: Using Large-Scale Dimensionality Reduction Algorithms for Localization

    PubMed Central

    Glaser, Joshua I.; Zamft, Bradley M.; Church, George M.; Kording, Konrad P.

    2015-01-01

    Current high-resolution imaging techniques require an intact sample that preserves spatial relationships. We here present a novel approach, “puzzle imaging,” that allows imaging a spatially scrambled sample. This technique takes many spatially disordered samples, and then pieces them back together using local properties embedded within the sample. We show that puzzle imaging can efficiently produce high-resolution images using dimensionality reduction algorithms. We demonstrate the theoretical capabilities of puzzle imaging in three biological scenarios, showing that (1) relatively precise 3-dimensional brain imaging is possible; (2) the physical structure of a neural network can often be recovered based only on the neural connectivity matrix; and (3) a chemical map could be reproduced using bacteria with chemosensitive DNA and conjugative transfer. The ability to reconstruct scrambled images promises to enable imaging based on DNA sequencing of homogenized tissue samples. PMID:26192446
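
    Scenario (2), recovering physical structure from a connectivity matrix alone, can be illustrated with a standard dimensionality reduction tool: Laplacian (spectral) embedding of the adjacency matrix. The sketch below is only one way to do this and is an assumption on my part, not the paper's algorithm; it recovers the linear order of a chain of nodes from nothing but who connects to whom.

```python
import numpy as np

def spectral_coords(A, dim=1):
    """Embed graph nodes using eigenvectors of the combinatorial
    Laplacian with the smallest nonzero eigenvalues."""
    L = np.diag(A.sum(axis=1)) - A
    vals, vecs = np.linalg.eigh(L)     # eigenvalues in ascending order
    return vecs[:, 1:1 + dim]          # skip the constant eigenvector

# ten "neurons" on a chain, each connected only to its neighbours
n = 10
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

# sorting by the Fiedler coordinate recovers the chain order (up to direction)
order = np.argsort(spectral_coords(A, dim=1)[:, 0])
```

    Up to direction, sorting nodes by the Fiedler eigenvector reproduces their positions along the chain, which is the essence of piecing spatial structure back together from local connectivity.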

  19. Using machine learning to accelerate sampling-based inversion

    NASA Astrophysics Data System (ADS)

    Valentine, A. P.; Sambridge, M.

    2017-12-01

    In most cases, a complete solution to a geophysical inverse problem (including robust understanding of the uncertainties associated with the result) requires a sampling-based approach. However, the computational burden is high, and proves intractable for many problems of interest. There is therefore considerable value in developing techniques that can accelerate sampling procedures. The main computational cost lies in evaluation of the forward operator (e.g. calculation of synthetic seismograms) for each candidate model. Modern machine learning techniques, such as Gaussian Processes, offer a route for constructing a computationally cheap approximation to this calculation, which can replace the accurate solution during sampling. Importantly, the accuracy of the approximation can be refined as inversion proceeds, to ensure high-quality results. In this presentation, we describe and demonstrate this approach, which can be seen as an extension of popular current methods such as the Neighbourhood Algorithm, and which bridges the gap between prior- and posterior-sampling frameworks.
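
    As a concrete sketch of the idea, a Gaussian Process can be trained on a handful of exact forward evaluations and then queried cheaply in place of the expensive operator, with new exact evaluations added ("refined") as sampling proceeds. The code below is illustrative only: a plain RBF-kernel GP in NumPy with a trivial stand-in forward model; the class and parameter names are my assumptions, not the authors' implementation.

```python
import numpy as np

def rbf(Xa, Xb, length):
    """Squared-exponential (RBF) covariance between two point sets."""
    d2 = ((Xa[:, None, :] - Xb[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length ** 2)

class GPSurrogate:
    """Gaussian Process stand-in for an expensive forward operator;
    exact evaluations can be added ('refined') as sampling proceeds."""

    def __init__(self, forward, length=1.0, noise=1e-8):
        self.forward, self.length, self.noise = forward, length, noise
        self.X, self.y = np.empty((0, 1)), np.empty(0)

    def refine(self, x):
        # run the exact (expensive) forward model once and remember it
        self.X = np.vstack([self.X, [[x]]])
        self.y = np.append(self.y, self.forward(x))

    def predict(self, x):
        # GP posterior mean: k(x, X) @ K(X, X)^-1 @ y
        K = rbf(self.X, self.X, self.length) + self.noise * np.eye(len(self.X))
        k = rbf(np.array([[x]]), self.X, self.length)
        return float((k @ np.linalg.solve(K, self.y))[0])

# toy stand-in for an expensive calculation (e.g. a synthetic seismogram misfit)
gp = GPSurrogate(forward=lambda m: m ** 2, length=0.5)
for x0 in np.linspace(-1.0, 1.0, 9):
    gp.refine(float(x0))
approx = gp.predict(0.3)   # cheap surrogate evaluation; true value is 0.09
```

    In a real inversion the forward callable would be something like a synthetic seismogram calculation, and refine would be invoked wherever the surrogate is judged too uncertain, so accuracy improves exactly where the sampler explores.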

  20. Solventless and solvent-minimized sample preparation techniques for determining currently used pesticides in water samples: a review.

    PubMed

    Tankiewicz, Maciej; Fenik, Jolanta; Biziuk, Marek

    2011-10-30

    The intensification of agriculture means that increasing amounts of toxic organic and inorganic compounds are entering the environment. The pesticides generally applied nowadays are regarded as some of the most dangerous contaminants of the environment. Their presence in the environment, especially in water, is hazardous because they cause human beings to become more susceptible to disease. For these reasons, it is essential to monitor pesticide residues in the environment with the aid of all accessible analytical methods. The analysis of samples for the presence of pesticides is problematic because of the laborious and time-consuming operations involved in preparing samples for analysis, which themselves may be a source of additional contamination and errors. To date, it has been standard practice to use large quantities of organic solvents in the sample preparation process; but as these solvents are themselves hazardous, solventless and solvent-minimized techniques are coming into use. This paper discusses the sample preparation techniques most commonly used over the last 15 years for monitoring organophosphorus and organonitrogen pesticide residues in water samples. Furthermore, a significant trend in sample preparation, in accordance with the principles of 'Green Chemistry', is the simplification, miniaturization and automation of analytical techniques. In view of this, several novel techniques are being developed to shorten the analysis, increase sample throughput and improve the quality and sensitivity of analytical methods.
The paper describes extraction techniques requiring the use of solvents - liquid-liquid extraction (LLE) and its modifications, membrane extraction techniques, hollow fibre-protected two-phase solvent microextraction, liquid phase microextraction based on the solidification of a floating organic drop (LPME-SFO), solid-phase extraction (SPE) and single-drop microextraction (SDME) - as well as solvent-free techniques - solid phase microextraction (SPME) and stir bar sorptive extraction (SBSE). The advantages and drawbacks of these techniques are also discussed, and some solutions to their limitations are proposed. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. Experimental technique for studying high-temperature phase equilibria in reactive molten metal based systems

    NASA Astrophysics Data System (ADS)

    Ermoline, Alexandre

    The general objective of this work is to develop an experimental technique for studying the high-temperature phase compositions and phase equilibria in molten metal-based binary and ternary systems, such as Zr-O-N, B-N-O, Al-O, and others. A specific material system, Zr-O-N, was selected for studying and testing this technique. Information about high-temperature phase equilibria in reactive metal-based systems is scarce, and studying them is difficult because chemical reactions occur between samples and essentially any container material, contaminating the system. Containerless microgravity experiments for studying equilibria in molten metal-gas systems were designed to be conducted onboard a NASA KC-135 aircraft flying parabolic trajectories. A uniaxial apparatus suitable for acoustic levitation, laser heating, and splat quenching of small samples was developed and equipped with a computer-based controller and optical diagnostics. Normal-gravity tests were conducted to determine the most suitable operating parameters of the levitator by direct observation of the levitated samples, as opposed to more traditional pressure mapping of the acoustic field. The size range of samples that could be reliably heated and quenched in this setup was determined to be on the order of 1-3 mm. In microgravity experiments, small spherical specimens (1-2 mm diameter), prepared as pressed, premixed solid components (ZrO2, ZrN, and Zr powders), were acoustically levitated inside an argon-filled chamber at one atmosphere and heated by a CO2 laser. The levitating samples could be continuously laser heated for about 1 s, resulting in local sample melting. The sample stability in the vertical direction was undisturbed by simultaneous laser heating.
Oscillations of the levitating sample in the horizontal direction increased while it was heated, which eventually moved the sample away from its stable levitation position and out of the laser beam. Follow-up on-ground experiments were conducted to study phase relations in the Zr-O-N system at high temperatures. Samples with specific compositions were laser-heated above melt formation and naturally cooled. Recovered samples were characterized using electron microscopy, energy-dispersive spectroscopy, and X-ray diffraction. Results of these analyses, combined with interpretations of the binary Zr-O and Zr-N phase diagrams, enabled us to outline the liquidus and the subsolidus equilibria for the ternary Zr-ZrO2-ZrN phase diagram. Further research is suggested to develop the microgravity techniques for detailed characterization of high-temperature relations in reactive, metal-based systems.

  2. An expert system shell for inferring vegetation characteristics: Implementation of additional techniques (task E)

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann

    1992-01-01

    The NASA VEGetation Workbench (VEG) is a knowledge based system that infers vegetation characteristics from reflectance data. The VEG subgoal PROPORTION.GROUND.COVER has been completed and a number of additional techniques that infer the proportion ground cover of a sample have been implemented. Some techniques operate on sample data at a single wavelength. The techniques previously incorporated in VEG for other subgoals operated on data at a single wavelength so implementing the additional single wavelength techniques required no changes to the structure of VEG. Two techniques which use data at multiple wavelengths to infer proportion ground cover were also implemented. This work involved modifying the structure of VEG so that multiple wavelength techniques could be incorporated. All the new techniques were tested using both the VEG 'Research Mode' and the 'Automatic Mode.'

  3. Terahertz thickness determination with interferometric vibration correction for industrial applications.

    PubMed

    Pfeiffer, Tobias; Weber, Stefan; Klier, Jens; Bachtler, Sebastian; Molter, Daniel; Jonuscheit, Joachim; Von Freymann, Georg

    2018-05-14

    In many industrial fields, such as the automotive and painting industries, the thickness of thin layers is a crucial parameter for quality control. Hence, the demand for thickness measurement techniques continuously grows. In particular, non-destructive and contact-free terahertz techniques access a wide range of thickness determination applications. However, terahertz time-domain spectroscopy based systems perform the measurement in a sampling manner, requiring fixed distances between measurement head and sample. In harsh industrial environments, vibrations of sample and measurement head distort the time-base and decrease measurement accuracy. We present an interferometer-based vibration correction for terahertz time-domain measurements, able to reduce thickness distortion by one order of magnitude for vibrations with frequencies up to 100 Hz and amplitudes up to 100 µm. We further verify the experimental results by numerical calculations and find very good agreement.

  4. The discrimination of geoforensic trace material from close proximity locations by organic profiling using HPLC and plant wax marker analysis by GC.

    PubMed

    McCulloch, G; Dawson, L A; Ross, J M; Morgan, R M

    2018-07-01

    There is a need to develop a wider empirical research base to expand the scope for utilising the organic fraction of soil in forensic geoscience, and to demonstrate the capability of the analytical techniques used in forensic geoscience to discriminate samples from close proximity locations. The determination of wax markers from soil samples by GC analysis has been used extensively in court and is known to be effective in discriminating samples from different land use types. A new HPLC method for the analysis of the organic fraction of forensic sediment samples has also recently been shown to add value, in conjunction with existing inorganic techniques, for the discrimination of samples derived from close proximity locations. This study compares the ability of these two organic techniques to discriminate samples derived from close proximity locations. The GC technique provided good discrimination at this scale, with quantification of known compounds, whilst the HPLC technique offered a shorter and simpler sample preparation method and provided very good discrimination between groups of samples of different provenance in most cases. The use of both data sets together further improved accuracy rates in some cases, suggesting that a combined organic approach can provide added benefits in certain case scenarios and crime reconstruction contexts. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  5. Whole arm manipulation planning based on feedback velocity fields and sampling-based techniques.

    PubMed

    Talaei, B; Abdollahi, F; Talebi, H A; Omidi Karkani, E

    2013-09-01

    Changing the configuration of a cooperative whole arm manipulator while enclosing an object is not easy, mainly because of the risk of jamming caused by kinematic constraints. To reduce this risk, this paper proposes a feedback manipulation planning algorithm that takes grasp kinematics into account. The idea is based on a vector field that imposes perturbation in object motion inducing directions when the movement is considerably along manipulator redundant directions. The obstacle avoidance problem is then addressed by combining the algorithm with sampling-based techniques. As experimental results confirm, the proposed algorithm is effective in avoiding jamming as well as obstacles for a 6-DOF dual arm whole arm manipulator. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Sampling methods, dispersion patterns, and fixed precision sequential sampling plans for western flower thrips (Thysanoptera: Thripidae) and cotton fleahoppers (Hemiptera: Miridae) in cotton.

    PubMed

    Parajulee, M N; Shrestha, R B; Leser, J F

    2006-04-01

    A 2-yr field study was conducted to examine the effectiveness of two sampling methods (visual and plant washing techniques) for western flower thrips, Frankliniella occidentalis (Pergande), and five sampling methods (visual, beat bucket, drop cloth, sweep net, and vacuum) for cotton fleahopper, Pseudatomoscelis seriatus (Reuter), in Texas cotton, Gossypium hirsutum (L.), and to develop sequential sampling plans for each pest. The plant washing technique gave similar results to the visual method in detecting adult thrips, but it detected a significantly higher number of thrips larvae than visual sampling. Visual sampling detected the highest number of fleahoppers, followed by beat bucket, drop cloth, vacuum, and sweep net sampling, with no significant difference in catch efficiency between the vacuum and sweep net methods. However, based on fixed precision cost reliability, sweep net sampling was the most cost-effective method, followed by vacuum, beat bucket, drop cloth, and visual sampling. Taylor's Power Law analysis revealed that the field dispersion patterns of both thrips and fleahoppers were aggregated throughout the crop growing season. For thrips management decisions based on visual sampling (0.25 precision), 15 plants were estimated to be the minimum sample size when the estimated population density was one thrips per plant, whereas the minimum sample size was nine plants when thrips density approached 10 thrips per plant. The minimum visual sample size for cotton fleahoppers was 16 plants when the density was one fleahopper per plant, but the sample size decreased rapidly with increasing fleahopper density, requiring only four plants to be sampled when the density was 10 fleahoppers per plant. Sequential sampling plans were developed and validated with independent data for both thrips and cotton fleahoppers.
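
    The fixed-precision sample sizes quoted above follow from Taylor's Power Law, which models the variance as s² = a·mᵇ; the minimum sample size at precision D (standard error divided by the mean) is then n = a·m^(b−2)/D². The sketch below uses illustrative coefficients; the values of a and b are assumptions, not the fitted values from this study.

```python
import math

def min_sample_size(mean_density, a, b, precision=0.25):
    """Minimum sample size n = a * m**(b - 2) / D**2 at fixed precision
    D = SE/mean, under Taylor's Power Law variance = a * mean**b."""
    return math.ceil(a * mean_density ** (b - 2) / precision ** 2)
```

    With aggregated dispersion (1 < b < 2), n falls as density rises, matching the pattern reported for both thrips and fleahoppers.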

  7. Non-contact Creep Resistance Measurement for Ultra-High Temperature Materials

    NASA Technical Reports Server (NTRS)

    Lee, J.; Bradshaw, C.; Rogers, J. R.; Rathz, T. J.; Wall, J. J.; Choo, H.; Liaw, P. K.; Hyers, R. W.

    2005-01-01

    Conventional techniques for measuring creep are limited to about 1700 C, so a new technique is required for higher temperatures. This technique is based on electrostatic levitation (ESL) of a spherical sample, which is rotated quickly enough to cause creep deformation by centrifugal acceleration. Creep of samples has been demonstrated at up to 2300 C in the ESL facility at NASA MSFC, while ESL itself has been applied at over 3000 C, and has no theoretical maximum temperature. The preliminary results and future directions of this NASA-funded research collaboration will be presented.

  8. Appreciating the difference between design-based and model-based sampling strategies in quantitative morphology of the nervous system.

    PubMed

    Geuna, S

    2000-11-20

    Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.

  9. Thermoelectrically cooled water trap

    DOEpatents

    Micheels, Ronald H [Concord, MA

    2006-02-21

    A water trap system based on a thermoelectric cooling device is employed to remove a major fraction of the water from air samples prior to analysis of these samples for chemical composition by a variety of analytical techniques in which water vapor interferes with the measurement process. These analytical techniques include infrared spectroscopy, mass spectrometry, ion mobility spectrometry and gas chromatography. The thermoelectric system for trapping water present in air samples can substantially improve detection sensitivity in these analytical techniques when it is necessary to measure trace analytes with concentrations in the ppm (parts per million) or ppb (parts per billion) partial pressure range. The thermoelectric trap design is compact and amenable to use in portable gas monitoring instrumentation.

  10. Neutron activation analysis of certified samples by the absolute method

    NASA Astrophysics Data System (ADS)

    Kadem, F.; Belouadah, N.; Idiri, Z.

    2015-07-01

    Nuclear reaction analysis techniques are mainly based on the relative method or on the use of activation cross sections. In order to validate nuclear data for calculated cross sections evaluated from systematic studies, we used neutron activation analysis (NAA) to determine the constituent concentrations of certified samples of animal blood, milk and hay. In this analysis, the absolute method is used. The neutron activation technique involves irradiating the sample and subsequently measuring its activity. The fundamental activation equation connects several physical parameters, including the cross section, which is essential for the quantitative determination of the different elements composing the sample without resorting to a standard sample. Called the absolute method, it allows measurements as accurate as the relative method. The results obtained by the absolute method showed that the values are as precise as those of the relative method, which requires a standard sample for each element to be quantified.
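
    The fundamental activation equation referred to here is commonly written A = (m·θ·N_A/M)·σ·φ·(1 − e^(−λt_irr))·e^(−λt_d), linking the induced activity A to the element mass m, isotopic abundance θ, molar mass M, cross section σ, and neutron flux φ. A minimal sketch of the absolute method inverts this equation for m; detector efficiency and gamma emission probability corrections are omitted for brevity, and the parameter names are illustrative rather than taken from the paper.

```python
import math

N_A = 6.02214076e23  # Avogadro's number (1/mol)

def element_mass(activity, M, theta, sigma_cm2, flux, lam, t_irr, t_decay):
    """Invert the activation equation
        A = (m * theta * N_A / M) * sigma * phi * S * D,
    with saturation factor S = 1 - exp(-lam * t_irr) and decay factor
    D = exp(-lam * t_decay), for the element mass m in grams."""
    S = 1.0 - math.exp(-lam * t_irr)
    D = math.exp(-lam * t_decay)
    return activity * M / (theta * N_A * sigma_cm2 * flux * S * D)
```

    Because every quantity on the right-hand side is either measured or tabulated, no standard sample is needed, which is the defining feature of the absolute method.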

  11. 11 CFR 9036.4 - Commission review of submissions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., in conducting its review, may utilize statistical sampling techniques. Based on the results of its... nonmatchable and the reason that it is not matchable; or if statistical sampling is used, the estimated amount...

  12. Imputation and Model-Based Updating Technique for Annual Forest Inventories

    Treesearch

    Ronald E. McRoberts

    2001-01-01

    The USDA Forest Service is developing an annual inventory system to establish the capability of producing annual estimates of timber volume and related variables. The inventory system features measurement of an annual sample of field plots with options for updating data for plots measured in previous years. One imputation and two model-based updating techniques are...

  13. Active learning for semi-supervised clustering based on locally linear propagation reconstruction.

    PubMed

    Chang, Chin-Chun; Lin, Po-Yi

    2015-03-01

    The success of semi-supervised clustering relies on the effectiveness of side information. To obtain effective side information, a new active learner that learns pairwise constraints, known as must-link and cannot-link constraints, is proposed in this paper. Three novel techniques are developed for learning effective pairwise constraints. The first technique is used to identify samples less important to cluster structures. This technique makes use of a kernel version of locally linear embedding for manifold learning. Samples neither important to the locally linear propagation reconstructions of other samples nor on flat patches in the learned manifold are regarded as unimportant. The second is a novel criterion for query selection. This criterion considers not only the importance of a sample to expanding the space coverage of the learned samples but also the expected number of queries needed to learn the sample. To facilitate semi-supervised clustering, the third technique yields inferred must-links for passing information about flat patches in the learned manifold to semi-supervised clustering algorithms. Experimental results have shown that the learned pairwise constraints can capture the underlying cluster structures and have proven the feasibility of the proposed approach. Copyright © 2014 Elsevier Ltd. All rights reserved.

  14. Ultra-sensitive high performance liquid chromatography-laser-induced fluorescence based proteomics for clinical applications.

    PubMed

    Patil, Ajeetkumar; Bhat, Sujatha; Pai, Keerthilatha M; Rai, Lavanya; Kartha, V B; Chidangil, Santhosh

    2015-09-08

    In recent years, proteomics techniques have advanced tremendously in the life and medical sciences for the detection and identification of proteins in body fluids, tissue homogenates and cellular samples, helping to elucidate the biochemical mechanisms leading to different diseases. Some of these methods include high performance liquid chromatography, 2D-gel electrophoresis, MALDI-TOF-MS, SELDI-TOF-MS, CE-MS and LC-MS techniques. Our group at Manipal has developed an ultra-sensitive high performance liquid chromatography-laser induced fluorescence (HPLC-LIF) based technique for screening, early detection, and staging of various cancers, using protein profiling of clinical samples such as body fluids, cellular specimens, and biopsy tissue. More than 300 protein profiles of different clinical samples (serum, saliva, cellular samples and tissue homogenates) from healthy volunteers and volunteers with different premalignant/malignant conditions were recorded using this set-up. The protein profiles were analyzed using principal component analysis (PCA) to achieve objective detection and classification of malignant, premalignant and healthy conditions with high sensitivity and specificity. The method is extremely sensitive, with a limit of detection of the order of femtomoles. HPLC-LIF protein profiling combined with PCA, as a routine method for screening, diagnosis, and staging of cervical cancer and oral cancer, is discussed in this paper. This article is part of a Special Issue entitled: Proteomics in India. Copyright © 2015 Elsevier B.V. All rights reserved.
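    As a rough illustration of the "PCA on profiles, then classify" step described above (this is a minimal numpy-only sketch with synthetic data standing in for protein profiles, not the authors' pipeline; the nearest-centroid classifier is an assumption for simplicity):

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project rows of X onto the top principal components via SVD."""
    Xc = X - X.mean(axis=0)              # mean-center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T      # scores in PC space

def nearest_centroid_predict(scores, labels, query_scores):
    """Assign each query to the class with the closest centroid in PC space."""
    classes = np.unique(labels)
    centroids = np.array([scores[labels == c].mean(axis=0) for c in classes])
    d = np.linalg.norm(query_scores[:, None, :] - centroids[None, :, :], axis=2)
    return classes[d.argmin(axis=1)]

# Synthetic "profiles": 40 samples x 200 retention-time bins, two conditions
rng = np.random.default_rng(42)
healthy = rng.normal(0.0, 1.0, size=(20, 200))
malignant = rng.normal(0.0, 1.0, size=(20, 200))
malignant[:, :10] += 4.0                 # a few bins systematically elevated

X = np.vstack([healthy, malignant])
y = np.array([0] * 20 + [1] * 20)
scores = pca_project(X, n_components=2)
pred = nearest_centroid_predict(scores, y, scores)
accuracy = (pred == y).mean()
```

    With a clear class difference concentrated in a few features, the first principal components capture the separation and the two groups form distinct clusters in score space; a held-out split would be needed for an honest accuracy estimate.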

  15. Combined X-ray CT and mass spectrometry for biomedical imaging applications

    NASA Astrophysics Data System (ADS)

    Schioppa, E., Jr.; Ellis, S.; Bruinen, A. L.; Visser, J.; Heeren, R. M. A.; Uher, J.; Koffeman, E.

    2014-04-01

    Imaging technologies play a key role in many branches of science, especially in biology and medicine. They provide an invaluable insight into both internal structure and processes within a broad range of samples. There are many techniques that allow one to obtain images of an object. Different techniques are based on the analysis of a particular sample property by means of a dedicated imaging system, and as such, each imaging modality provides the researcher with different information. The use of multimodal imaging (imaging with several different techniques) can provide additional and complementary information that is not possible when employing a single imaging technique alone. In this study, we present for the first time a multi-modal imaging technique where X-ray computerized tomography (CT) is combined with mass spectrometry imaging (MSI). While X-ray CT provides 3-dimensional information regarding the internal structure of the sample based on X-ray absorption coefficients, MSI of thin sections acquired from the same sample allows the spatial distribution of many elements/molecules, each distinguished by its unique mass-to-charge ratio (m/z), to be determined within a single measurement and with a spatial resolution as low as 1 μm or even less. The aim of the work is to demonstrate how molecular information from MSI can be spatially correlated with 3D structural information acquired from X-ray CT. In these experiments, frozen samples are imaged in an X-ray CT setup using Medipix based detectors equipped with a CO2 cooled sample holder. Single projections are pre-processed before tomographic reconstruction using a signal-to-thickness calibration. In the second step, the object is sliced into thin sections (circa 20 μm) that are then imaged using both matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) and secondary ion (SIMS) mass spectrometry, where the spatial distribution of specific molecules within the sample is determined. 
The combination of two vastly different imaging approaches provides complementary information (i.e., anatomical and molecular distributions) that allows the correlation of distinct structural features with specific molecular distributions, leading to unique insights into disease development.

  16. Enzyme-based electrochemical biosensors for determination of organophosphorus and carbamate pesticides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Everett, W.R.; Rechnitz, G.A.

    1999-01-01

    A mini review of enzyme-based electrochemical biosensors for inhibition analysis of organophosphorus and carbamate pesticides is presented. Discussion includes the most recent literature to present advances in detection limits, selectivity and real sample analysis. Recent reviews on the monitoring of pesticides and their residues suggest that the classical analytical techniques of gas and liquid chromatography are the most widely used methods of detection. These techniques, although very accurate in their determinations, can be quite time consuming and expensive and usually require extensive sample clean up and preconcentration. For these and many other reasons, the classical techniques are very difficult to adapt for field use. Numerous researchers, in the past decade, have developed and made improvements on biosensors for use in pesticide analysis. This mini review will focus on recent advances made in enzyme-based electrochemical biosensors for the determinations of organophosphorus and carbamate pesticides.

  17. Novel On-wafer Radiation Pattern Measurement Technique for MEMS Actuator Based Reconfigurable Patch Antennas

    NASA Technical Reports Server (NTRS)

    Simons, Rainee N.

    2002-01-01

    The paper presents a novel on-wafer, antenna far field pattern measurement technique for microelectromechanical systems (MEMS) based reconfigurable patch antennas. The measurement technique significantly reduces the time and the cost associated with the characterization of printed antennas fabricated on a semiconductor wafer or dielectric substrate. To measure the radiation patterns, the RF probe station is modified to accommodate an open-ended rectangular waveguide as the rotating linearly polarized sampling antenna. The open-ended waveguide is attached through a coaxial rotary joint to a Plexiglas(Trademark) arm and is driven along an arc by a stepper motor. Thus, the spinning open-ended waveguide can sample the relative field intensity of the patch as a function of the angle from boresight. The experimental results include the measured linearly polarized and circularly polarized radiation patterns for MEMS-based frequency reconfigurable rectangular and polarization reconfigurable nearly square patch antennas, respectively.

  18. Impact of sampling techniques on measured stormwater quality data for small streams

    USDA-ARS?s Scientific Manuscript database

    Science-based sampling methodologies are needed to enhance water quality characterization for developing Total Maximum Daily Loads (TMDLs), setting appropriate water quality standards, and managing nonpoint source pollution. Storm event sampling, which is vital for adequate assessment of water qual...

  19. The signature-based radiation-scanning approach to standoff detection of improvised explosive devices.

    PubMed

    Brewer, R L; Dunn, W L; Heider, S; Matthew, C; Yang, X

    2012-07-01

    The signature-based radiation-scanning technique for detection of improvised explosive devices is described. The technique seeks to detect nitrogen-rich chemical explosives present in a target. The technology compares a set of "signatures" obtained from a test target to a collection of "templates", sets of signatures for a target that contains an explosive in a specific configuration. Interrogation of nitrogen-rich fertilizer samples, which serve as surrogates for explosives, is shown experimentally to be able to discriminate samples of 3.8 L and larger. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Predictors of community therapists' use of therapy techniques in a large public mental health system.

    PubMed

    Beidas, Rinad S; Marcus, Steven; Aarons, Gregory A; Hoagwood, Kimberly E; Schoenwald, Sonja; Evans, Arthur C; Hurford, Matthew O; Hadley, Trevor; Barg, Frances K; Walsh, Lucia M; Adams, Danielle R; Mandell, David S

    2015-04-01

    Few studies have examined the effects of individual and organizational characteristics on the use of evidence-based practices in mental health care. Improved understanding of these factors could guide future implementation efforts to ensure effective adoption, implementation, and sustainment of evidence-based practices. To estimate the relative contribution of individual and organizational factors on therapist self-reported use of cognitive-behavioral, family, and psychodynamic therapy techniques within the context of a large-scale effort to increase use of evidence-based practices in an urban public mental health system serving youth and families. In this observational, cross-sectional study of 23 organizations, data were collected from March 1 through July 25, 2013. We used purposive sampling to recruit the 29 largest child-serving agencies, which together serve approximately 80% of youth receiving publicly funded mental health care. The final sample included 19 agencies with 23 sites, 130 therapists, 36 supervisors, and 22 executive administrators. Therapist self-reported use of cognitive-behavioral, family, and psychodynamic therapy techniques, as measured by the Therapist Procedures Checklist-Family Revised. Individual factors accounted for the following percentages of the overall variation: cognitive-behavioral therapy techniques, 16%; family therapy techniques, 7%; and psychodynamic therapy techniques, 20%. Organizational factors accounted for the following percentages of the overall variation: cognitive-behavioral therapy techniques, 23%; family therapy techniques, 19%; and psychodynamic therapy techniques, 7%. Older therapists and therapists with more open attitudes were more likely to endorse use of cognitive-behavioral therapy techniques, as were those in organizations that had spent fewer years participating in evidence-based practice initiatives, had more resistant cultures, and had more functional climates.
Women were more likely to endorse use of family therapy techniques, as were those in organizations employing more fee-for-service staff and with more stressful climates. Therapists with more divergent attitudes and less knowledge about evidence-based practices were more likely to use psychodynamic therapy techniques. This study suggests that individual and organizational factors are important in explaining therapist behavior and use of evidence-based practices, but the relative importance varies by therapeutic technique.

  1. A single-sampling hair trap for mesocarnivores

    Treesearch

    Jonathan N. Pauli; Matthew B. Hamilton; Edward B. Crain; Steven W. Buskirk

    2007-01-01

    Although techniques to analyze and quantify DNA-based data have progressed, methods to noninvasively collect samples lag behind. Samples are generally collected from devices that permit coincident sampling of multiple individuals. Because of cross-contamination, substantive genotyping errors can arise. We developed a cost-effective (US$4.60/trap) single-capture hair...

  2. Phase contrast STEM for thin samples: Integrated differential phase contrast.

    PubMed

    Lazić, Ivan; Bosch, Eric G T; Lazar, Sorin

    2016-01-01

    It has been known since the 1970s that the movement of the center of mass (COM) of a convergent beam electron diffraction (CBED) pattern is linearly related to the (projected) electrical field in the sample. We re-derive a contrast transfer function (CTF) for a scanning transmission electron microscopy (STEM) imaging technique based on this movement from the point of view of image formation and continue by performing a two-dimensional integration on the two images based on the two components of the COM movement. The resulting integrated COM (iCOM) STEM technique yields a scalar image that is linear in the phase shift caused by the sample and therefore also in the local (projected) electrostatic potential field of a thin sample. We confirm that the differential phase contrast (DPC) STEM technique using a segmented detector with 4 quadrants (4Q) yields a good approximation for the COM movement. Performing a two-dimensional integration, just as for the COM, we obtain an integrated DPC (iDPC) image which is approximately linear in the phase of the sample. Besides deriving the CTFs of iCOM and iDPC, we clearly point out the objects of the two corresponding imaging techniques, and highlight the differences to objects corresponding to COM-, DPC-, and (HA) ADF-STEM. The theory is validated with simulations and we present first experimental results of the iDPC-STEM technique showing its capability for imaging both light and heavy elements with atomic resolution and a good signal to noise ratio (SNR). Copyright © 2015 Elsevier B.V. All rights reserved.
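    The two-dimensional integration step described above, which turns the two COM/DPC components into a single scalar phase image, can be carried out in Fourier space. The following numpy sketch (an illustrative implementation under periodic boundary conditions, not the authors' code) recovers a phase map from its two gradient components:

```python
import numpy as np

def spectral_gradient(phi):
    """FFT-based gradient of a periodic 2D field (per-sample units)."""
    ny, nx = phi.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx)[None, :]
    ky = 2 * np.pi * np.fft.fftfreq(ny)[:, None]
    F = np.fft.fft2(phi)
    gx = np.fft.ifft2(1j * kx * F).real
    gy = np.fft.ifft2(1j * ky * F).real
    return gx, gy

def integrate_gradient(gx, gy):
    """Recover phi (up to a constant) from (gx, gy) = grad(phi) via FFT.

    This mirrors the 2D integration that combines the two COM/DPC
    components into an iCOM/iDPC-style scalar image."""
    ny, nx = gx.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx)[None, :]
    ky = 2 * np.pi * np.fft.fftfreq(ny)[:, None]
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # avoid division by zero at DC
    F = (kx * np.fft.fft2(gx) + ky * np.fft.fft2(gy)) / (1j * k2)
    F[0, 0] = 0.0                        # mean of phi is unrecoverable
    return np.fft.ifft2(F).real

# Zero-mean test phase on a periodic grid: recovery should be exact
x = np.linspace(0, 2 * np.pi, 64, endpoint=False)
X, Y = np.meshgrid(x, x)
phi = np.sin(X) + np.cos(2 * Y)
gx, gy = spectral_gradient(phi)
phi_rec = integrate_gradient(gx, gy)
max_err = np.abs(phi_rec - phi).max()
```

    Because the zero-frequency term is lost, the phase is recovered only up to an additive constant; in practice segmented-detector DPC signals also carry noise and CTF effects that this idealized sketch ignores.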

  3. Analysis of sampling techniques for imbalanced data: An n = 648 ADNI study.

    PubMed

    Dubey, Rashmi; Zhou, Jiayu; Wang, Yalin; Thompson, Paul M; Ye, Jieping

    2014-02-15

    Many neuroimaging applications deal with imbalanced imaging data. For example, in Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset, the mild cognitive impairment (MCI) cases eligible for the study are nearly two times the Alzheimer's disease (AD) patients for structural magnetic resonance imaging (MRI) modality and six times the control cases for proteomics modality. Constructing an accurate classifier from imbalanced data is a challenging task. Traditional classifiers that aim to maximize the overall prediction accuracy tend to classify all data into the majority class. In this paper, we study an ensemble system of feature selection and data sampling for the class imbalance problem. We systematically analyze various sampling techniques by examining the efficacy of different rates and types of undersampling, oversampling, and a combination of over and undersampling approaches. We thoroughly examine six widely used feature selection algorithms to identify significant biomarkers and thereby reduce the complexity of the data. The efficacy of the ensemble techniques is evaluated using two different classifiers including Random Forest and Support Vector Machines based on classification accuracy, area under the receiver operating characteristic curve (AUC), sensitivity, and specificity measures. Our extensive experimental results show that for various problem settings in ADNI, (1) a balanced training set obtained with K-Medoids technique based undersampling gives the best overall performance among different data sampling techniques and no sampling approach; and (2) sparse logistic regression with stability selection achieves competitive performance among various feature selection algorithms. Comprehensive experiments with various settings show that our proposed ensemble model of multiple undersampled datasets yields stable and promising results. © 2013 Elsevier Inc. All rights reserved.
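    The study above found K-Medoids-based undersampling most effective; as a hedged sketch of the same family of techniques, the simplest baseline is random undersampling of the majority class (synthetic data below; K-Medoids would pick representative medoids rather than random rows):

```python
import numpy as np

def random_undersample(X, y, seed=0):
    """Balance a dataset by randomly undersampling every class down to
    the size of the smallest class (duplicate-free, rows kept as-is)."""
    rng = np.random.default_rng(seed)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    keep = np.concatenate([
        rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
        for c in classes
    ])
    rng.shuffle(keep)
    return X[keep], y[keep]

# Imbalanced toy data: 120 majority vs 40 minority samples, 5 features
rng = np.random.default_rng(1)
X = rng.normal(size=(160, 5))
y = np.array([0] * 120 + [1] * 40)
Xb, yb = random_undersample(X, y)
```

    Undersampling discards majority-class information, which is why the paper pairs it with ensembles of multiple undersampled datasets rather than a single balanced draw.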

  4. Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission

    NASA Astrophysics Data System (ADS)

    Hampton, Jesse Clay

    The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 × 15 × 25 cm to 30 × 30 × 25 cm in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Individual microcrack characterization weighting cloud based techniques correlated well with post-test damage evaluations.

  5. When continuous observations just won't do: developing accurate and efficient sampling strategies for the laying hen.

    PubMed

    Daigle, Courtney L; Siegford, Janice M

    2014-03-01

    Continuous observation is the most accurate way to determine animals' actual time budget and can provide a 'gold standard' representation of resource use, behavior frequency, and duration. Continuous observation is useful for capturing behaviors that are of short duration or occur infrequently. However, collecting continuous data is labor intensive and time consuming, making multiple individual or long-term data collection difficult. Six non-cage laying hens were video recorded for 15 h and behavioral data collected every 2 s were compared with data collected using scan sampling intervals of 5, 10, 15, 30, and 60 min and subsamples of 2 second observations performed for 10 min every 30 min, 15 min every 1 h, 30 min every 1.5 h, and 15 min every 2 h. Three statistical approaches were used to provide a comprehensive analysis to examine the quality of the data obtained via different sampling methods. General linear mixed models identified how the time budget from the sampling techniques differed from continuous observation. Correlation analysis identified how strongly results from the sampling techniques were associated with those from continuous observation. Regression analysis identified how well the results from the sampling techniques were associated with those from continuous observation, changes in magnitude, and whether a sampling technique had bias. Static behaviors were well represented with scan and time sampling techniques, while dynamic behaviors were best represented with time sampling techniques. Methods for identifying an appropriate sampling strategy based upon the type of behavior of interest are outlined and results for non-caged laying hens are presented. Copyright © 2013 Elsevier B.V. All rights reserved.
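    The trade-off the hen study quantifies, scan-sampling estimates versus the continuous "gold standard", can be sketched in a few lines (an illustrative simulation with an iid stand-in for a static behavior occupying roughly 30% of the time budget; real behavior sequences are autocorrelated, which the study's statistical approaches address):

```python
import numpy as np

def scan_estimate(record, interval_s, step_s=2):
    """Estimate a behavior's time budget from instantaneous scans.

    record     : 0/1 array, one entry per `step_s` seconds (1 = behavior seen)
    interval_s : scan sampling interval in seconds (e.g. 300 for 5 min)
    """
    stride = interval_s // step_s
    return record[::stride].mean()

# Simulated 15 h of observation at 2-s resolution, as in the hen study
rng = np.random.default_rng(0)
n = 15 * 3600 // 2
record = (rng.random(n) < 0.30).astype(int)

continuous = record.mean()                # "gold standard" time budget
scan_5min = scan_estimate(record, 300)    # 5-min scan sampling
scan_30min = scan_estimate(record, 1800)  # 30-min scan sampling
err_5min = abs(scan_5min - continuous)
err_30min = abs(scan_30min - continuous)
```

    Longer scan intervals mean fewer instantaneous samples and larger sampling error, which is why static behaviors tolerate coarse scans while short or infrequent behaviors do not.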

  6. Soil Sampling Techniques For Alabama Grain Fields

    NASA Technical Reports Server (NTRS)

    Thompson, A. N.; Shaw, J. N.; Mask, P. L.; Touchton, J. T.; Rickman, D.

    2003-01-01

    Characterizing the spatial variability of nutrients facilitates precision soil sampling. Questions exist regarding the best technique for directed soil sampling based on a priori knowledge of soil and crop patterns. The objective of this study was to evaluate zone delineation techniques for Alabama grain fields to determine which method best minimized the soil test variability. Site one (25.8 ha) and site three (20.0 ha) were located in the Tennessee Valley region, and site two (24.2 ha) was located in the Coastal Plain region of Alabama. Tennessee Valley soils ranged from well drained Rhodic and Typic Paleudults to somewhat poorly drained Aquic Paleudults and Fluventic Dystrudepts. Coastal Plain soils ranged from coarse-loamy Rhodic Kandiudults to loamy Arenic Kandiudults. Soils were sampled by grid soil sampling methods (grid sizes of 0.40 ha and 1 ha) consisting of: 1) twenty composited cores collected randomly throughout each grid (grid-cell sampling) and, 2) six composited cores collected randomly from a 3 × 3 m area at the center of each grid (grid-point sampling). Zones were established from 1) an Order 1 Soil Survey, 2) corn (Zea mays L.) yield maps, and 3) airborne remote sensing images. All soil properties were moderately to strongly spatially dependent as per semivariogram analyses. Differences in grid-point and grid-cell soil test values suggested grid-point sampling does not accurately represent grid values. Zones created by soil survey, yield data, and remote sensing images displayed lower coefficients of variation (CV) for soil test values than overall field values, suggesting these techniques group soil test variability. However, few differences were observed between the three zone delineation techniques. Results suggest directed sampling using zone delineation techniques outlined in this paper would result in more efficient soil sampling for these Alabama grain fields.

  7. Impacts of Sampling and Handling Procedures on DNA- and RNA-based Microbial Characterization and Quantification of Groundwater and Saturated Soil

    DTIC Science & Technology

    2012-07-01

    use of molecular biological techniques (MBTs) has allowed microbial ecologists and environmental engineers to determine microbial community...metabolic genes). The most common approaches used in bioremediation research are those based on the polymerase chain reaction (PCR) amplification of... bioremediation. Because of its sensitivity compared to direct hybridization/probing, PCR is increasingly used to analyze groundwater samples and soil samples

  8. [Potentials in the regionalization of health indicators using small-area estimation methods : Exemplary results based on the 2009, 2010 and 2012 GEDA studies].

    PubMed

    Kroll, Lars Eric; Schumann, Maria; Müters, Stephan; Lampert, Thomas

    2017-12-01

    Nationwide health surveys can be used to estimate regional differences in health. With traditional estimation techniques, the spatial depth of these estimates is limited by the constrained sample size; so far, without special refreshment samples, results have only been available for the more populous federal states of Germany. An alternative is regression-based small-area estimation techniques. These models can generate smaller-scale data, but are also subject to greater statistical uncertainties because of their model assumptions. In the present article, exemplary regionalized results for the self-rated health status of the respondents, based on the "Gesundheit in Deutschland aktuell" studies (GEDA studies) 2009, 2010 and 2012, are presented. The aim of the article is to analyze the range of regional estimates in order to assess the usefulness of these techniques for health reporting more adequately. The results show that the estimated prevalence is relatively stable across different samples. Important determinants of the variation of the estimates are the achieved sample size at the district level and the type of district (cities vs. rural regions). Overall, the present study shows that small-area modeling of prevalence is associated with additional uncertainties compared with conventional estimates, which should be taken into account when interpreting the corresponding findings.

  9. Smart Sampling and HPC-based Probabilistic Look-ahead Contingency Analysis Implementation and its Evaluation with Real-world Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yousu; Etingov, Pavel V.; Ren, Huiying

    This paper describes a probabilistic look-ahead contingency analysis application that incorporates smart sampling and high-performance computing (HPC) techniques. Smart sampling techniques are implemented to effectively represent the structure and statistical characteristics of uncertainty introduced by different sources in the power system. They can significantly reduce the data set size required for multiple look-ahead contingency analyses, and therefore reduce the time required to compute them. High-performance-computing (HPC) techniques are used to further reduce computational time. These two techniques enable a predictive capability that forecasts the impact of various uncertainties on potential transmission limit violations. The developed package has been tested with real-world data from the Bonneville Power Administration. Case study results are presented to demonstrate the performance of the applications developed.

  10. Technology for Elevated Temperature Tests of Structural Panels

    NASA Technical Reports Server (NTRS)

    Thornton, E. A.

    1999-01-01

    A technique for full-field measurement of surface temperature and in-plane strain using a single grid imaging technique was demonstrated on a sample subjected to thermally-induced strain. The technique is based on digital imaging of a sample marked by an alternating line array of La2O2S:Eu(3+) thermographic phosphor and chromium illuminated by a UV lamp. Digital images of this array in unstrained and strained states were processed using a modified spin filter. Normal strain distribution was determined by combining unstrained and strained grid images using a single grid digital moire technique. Temperature distribution was determined by ratioing images of phosphor intensity at two wavelengths. Combined strain and temperature measurements demonstrated on the thermally heated sample were Δε = ±250 με and ΔT = ±5 K, respectively, with a spatial resolution of 0.8 mm.

  11. Joint use of over- and under-sampling techniques and cross-validation for the development and assessment of prediction models.

    PubMed

    Blagus, Rok; Lusa, Lara

    2015-11-04

    Prediction models are used in clinical research to develop rules that can be used to accurately predict the outcome of the patients based on some of their characteristics. They represent a valuable tool in the decision making process of clinicians and health policy makers, as they enable them to estimate the probability that patients have or will develop a disease, will respond to a treatment, or that their disease will recur. The interest devoted to prediction models in the biomedical community has been growing in the last few years. Often the data used to develop the prediction models are class-imbalanced as only few patients experience the event (and therefore belong to minority class). Prediction models developed using class-imbalanced data tend to achieve sub-optimal predictive accuracy in the minority class. This problem can be diminished by using sampling techniques aimed at balancing the class distribution. These techniques include under- and oversampling, where a fraction of the majority class samples are retained in the analysis or new samples from the minority class are generated. The correct assessment of how the prediction model is likely to perform on independent data is of crucial importance; in the absence of an independent data set, cross-validation is normally used. While the importance of correct cross-validation is well documented in the biomedical literature, the challenges posed by the joint use of sampling techniques and cross-validation have not been addressed. We show that care must be taken to ensure that cross-validation is performed correctly on sampled data, and that the risk of overestimating the predictive accuracy is greater when oversampling techniques are used. Examples based on the re-analysis of real datasets and simulation studies are provided. 
We identify some results from the biomedical literature where the incorrect cross-validation was performed, where we expect that the performance of oversampling techniques was heavily overestimated.
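    The pitfall the paper documents is oversampling the whole dataset before splitting, which places copies of training samples in the test fold and inflates the apparent accuracy. A minimal sketch of the correct ordering (random oversampling by duplication; synthetic labels, not the paper's data) looks like this:

```python
import numpy as np

def oversample(idx, y, rng):
    """Random oversampling: duplicate minority-class training indices
    until the classes are balanced (duplicates only, no synthesis)."""
    classes, counts = np.unique(y[idx], return_counts=True)
    n_max = counts.max()
    out = []
    for c in classes:
        c_idx = idx[y[idx] == c]
        extra = rng.choice(c_idx, size=n_max - len(c_idx), replace=True)
        out.append(np.concatenate([c_idx, extra]))
    return np.concatenate(out)

def cv_folds_with_oversampling(y, k=5, seed=0):
    """Yield (train, test) index arrays for k-fold cross-validation.

    Oversampling is applied to the training part of each fold only,
    AFTER the split; oversampling before splitting would leak copies
    of training samples into the test fold."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(len(y))
    for test in np.array_split(perm, k):
        train = np.setdiff1d(perm, test)
        yield oversample(train, y, rng), test

y = np.array([0] * 90 + [1] * 10)   # imbalanced toy labels
folds = list(cv_folds_with_oversampling(y))
```

    Each augmented training fold may contain duplicated minority rows, but none of its indices ever appear in the corresponding test fold, which is the property the incorrect procedure violates.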

  12. Forestry inventory based on multistage sampling with probability proportional to size

    NASA Technical Reports Server (NTRS)

    Lee, D. C. L.; Hernandez, P., Jr.; Shimabukuro, Y. E.

    1983-01-01

    A multistage sampling technique, with probability proportional to size, is developed for a forest volume inventory using remote sensing data. LANDSAT data, panchromatic aerial photographs, and field data are collected. Based on age and homogeneity, pine and eucalyptus classes are identified. Selection of tertiary sampling units is made through aerial photographs to minimize field work. The sampling errors for eucalyptus and pine ranged from 8.34 to 21.89 percent and from 7.18 to 8.60 percent, respectively.
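    The first-stage selection with probability proportional to size (PPS) can be sketched as follows (a hedged illustration with hypothetical stand areas, not the study's data; classical PPS designs often draw with replacement, as here):

```python
import numpy as np

def pps_sample(sizes, n, seed=0):
    """Select n primary units with probability proportional to size
    (with replacement), returning the chosen unit indices."""
    sizes = np.asarray(sizes, dtype=float)
    p = sizes / sizes.sum()              # selection probability per unit
    rng = np.random.default_rng(seed)
    return rng.choice(len(sizes), size=n, replace=True, p=p)

# Hypothetical primary units: forest stands with areas in hectares
areas = [120.0, 40.0, 15.0, 250.0, 75.0]
chosen = pps_sample(areas, n=1000, seed=3)

# Empirically, selection frequency tracks relative area
freq = np.bincount(chosen, minlength=len(areas)) / len(chosen)
```

    In the estimation step, each selected unit's observed volume is weighted by the inverse of its selection probability, so large stands are visited more often but each visit counts for less.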

  13. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    PubMed

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques integrate several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. This work presents information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques, namely solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid-phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention is paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, make it possible to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  14. ESDA®-Lite collection of DNA from latent fingerprints on documents.

    PubMed

    Plaza, Dane T; Mealy, Jamia L; Lane, J Nicholas; Parsons, M Neal; Bathrick, Abigail S; Slack, Donia P

    2015-05-01

    The ability to detect and non-destructively collect biological samples for DNA processing would benefit the forensic community by preserving the physical integrity of evidentiary items for more thorough evaluations by other forensic disciplines. The Electrostatic Detection Apparatus (ESDA®) was systematically evaluated for its ability to non-destructively collect DNA from latent fingerprints deposited on various paper substrates for short tandem repeat (STR) DNA profiling. Fingerprints were deposited on a variety of paper substrates that included resume paper, cotton paper, magazine paper, currency, copy paper, and newspaper. Three DNA collection techniques were performed: ESDA collection, dry swabbing, and substrate cutting. Efficacy of each collection technique was evaluated by the quantity of DNA present in each sample and the percent profile generated by each sample. Both the ESDA and dry swabbing non-destructive sampling techniques outperformed the destructive methodology of substrate cutting. A greater number of full profiles were generated from samples collected with the non-destructive dry swabbing collection technique than were generated from samples collected with the ESDA; however, the ESDA also allowed the user to visualize the area of interest while non-destructively collecting the biological material. The ability to visualize the biological material made sampling straightforward and eliminated the need for numerous, random swabbings/cuttings. Based on these results, the evaluated non-destructive ESDA collection technique has great potential for real-world forensic implementation. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  15. A low-rank matrix recovery approach for energy efficient EEG acquisition for a wireless body area network.

    PubMed

    Majumdar, Angshul; Gogna, Anupriya; Ward, Rabab

    2014-08-25

    We address the problem of acquiring and transmitting EEG signals in Wireless Body Area Networks (WBAN) in an energy efficient fashion. In WBANs, the energy is consumed by three operations: sensing (sampling), processing and transmission. Previous studies only addressed the problem of reducing the transmission energy. For the first time, in this work, we propose a technique to reduce sensing and processing energy as well: this is achieved by randomly under-sampling the EEG signal. We depart from previous Compressed Sensing based approaches and formulate signal recovery (from under-sampled measurements) as a matrix completion problem. A new algorithm to solve the matrix completion problem is derived here. We test our proposed method and find that the reconstruction accuracy of our method is significantly better than state-of-the-art techniques; and we achieve this while saving sensing, processing and transmission energy. Simple power analysis shows that our proposed methodology consumes considerably less power compared to previous CS based techniques.
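    The recovery step can be sketched on synthetic data. The paper derives its own solver; the alternating-projection loop below is a generic stand-in that assumes the target rank is known.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic low-rank stand-in for a multichannel EEG block: rank 2,
# 16 channels x 128 time samples.
U, V = rng.normal(size=(16, 2)), rng.normal(size=(2, 128))
X = U @ V

mask = rng.random(X.shape) < 0.5           # randomly observe ~50% of the entries
Y = np.where(mask, X, 0.0)

def complete_rank_r(Y, mask, r=2, iters=200):
    """Alternate between the rank-r set and the observed-data constraint."""
    Z = Y.copy()
    for _ in range(iters):
        u, s, vt = np.linalg.svd(Z, full_matrices=False)
        Z = (u[:, :r] * s[:r]) @ vt[:r]    # project onto rank-r matrices
        Z[mask] = Y[mask]                  # re-impose the observed entries
    return Z

X_hat = complete_rank_r(Y, mask)
rel_err = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
```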

  16. Ion mobility spectrometry as a detector for molecular imprinted polymer separation and metronidazole determination in pharmaceutical and human serum samples.

    PubMed

    Jafari, M T; Rezaei, B; Zaker, B

    2009-05-01

    Application of ion mobility spectrometry (IMS) as the detection technique for a separation method based on molecular imprinted polymer (MIP) was investigated and evaluated for the first time. On the basis of the results obtained in this work, the MIP-IMS system can be used as a powerful technique for separation, preconcentration, and detection of the metronidazole drug in pharmaceutical and human serum samples. The method is exhaustively validated in terms of sensitivity, selectivity, recovery, reproducibility, and column capacity. The linear dynamic range of 0.05-70.00 microg/mL was obtained for the determination of metronidazole with IMS. The recovery of analyzed drug was calculated to be above 89%, and the relative standard deviation (RSD) was lower than 6% for all experiments. Various real samples were analyzed with the coupled techniques, and the results obtained revealed the efficient cleanup of the samples using MIP separation before the analysis by IMS as a detection technique.

  17. Development of a versatile user-friendly IBA experimental chamber

    NASA Astrophysics Data System (ADS)

    Kakuee, Omidreza; Fathollahi, Vahid; Lamehi-Rachti, Mohammad

    2016-03-01

    Reliable performance of the Ion Beam Analysis (IBA) techniques is based on the accurate geometry of the experimental setup, employment of the reliable nuclear data and implementation of dedicated analysis software for each of the IBA techniques. It has already been shown that geometrical imperfections lead to significant uncertainties in quantifications of IBA measurements. To minimize these uncertainties, a user-friendly experimental chamber with a heuristic sample positioning system for IBA analysis was recently developed in the Van de Graaff laboratory in Tehran. This system enhances IBA capabilities and in particular Nuclear Reaction Analysis (NRA) and Elastic Recoil Detection Analysis (ERDA) techniques. The newly developed sample manipulator provides the possibility of both controlling the tilt angle of the sample and analyzing samples with different thicknesses. Moreover, a reasonable number of samples can be loaded in the sample wheel. A comparison of the measured cross section data of the 16O(d,p1)17O reaction with the data reported in the literature confirms the performance and capability of the newly developed experimental chamber.

  18. Approaches to Recruiting 'Hard-To-Reach' Populations into Research: A Review of the Literature.

    PubMed

    Shaghaghi, Abdolreza; Bhopal, Raj S; Sheikh, Aziz

    2011-01-01

    'Hard-to-reach' is a term used to describe those sub-groups of the population that may be difficult to reach or involve in research or public health programmes. Applying a single term to these sub-sections of populations implies a homogeneity within distinct groups which does not necessarily exist. Various sampling techniques have been introduced to recruit hard-to-reach populations. In this article, we have reviewed a range of approaches that have been used to widen participation in studies. We performed a Pubmed and Google search for relevant English language articles using the keywords and phrases: (hard-to-reach AND population* OR sampl*), (hidden AND population* OR sample*) and ("hard to reach" AND population* OR sample*), together with a consultation of the retrieved articles' bibliographies, to extract empirical evidence from publications that discussed or examined the use of sampling techniques to recruit hidden or hard-to-reach populations in health studies. Reviewing the literature identified a range of techniques to recruit hard-to-reach populations, including snowball sampling, respondent-driven sampling (RDS), indigenous field worker sampling (IFWS), facility-based sampling (FBS), targeted sampling (TS), time-location (space) sampling (TLS), conventional cluster sampling (CCS) and capture-recapture sampling (CR). The degree of compliance with a study by a certain 'hard-to-reach' group depends on the characteristics of that group, the recruitment technique used and the subject of interest. Irrespective of the potential advantages or limitations of the recruitment techniques reviewed, their successful use depends mainly upon our knowledge of the specific characteristics of the target populations. Thus, in line with attempts to expand the current boundaries of our knowledge about recruitment techniques in health studies and their applications in varying situations, we should also focus on all possibly contributing factors which may have an impact on participation rate within a defined population group.

  19. A comparison of ruminal or reticular digesta sampling as an alternative to sampling from the omasum of lactating dairy cows.

    PubMed

    Fatehi, F; Krizsan, S J; Gidlund, H; Huhtanen, P

    2015-05-01

    The objective of this study was to develop and compare techniques for determining nutrient flow based on digesta samples collected from the reticulum or rumen of lactating dairy cows with estimates generated by the omasal sampling technique. Pre-experimental method development, based on a comparison with the particle size distribution of feces, suggested primary sieving of ruminal and reticular digesta from lactating cows through an 11.6-mm sieve, implying that digesta particles smaller than this were eligible to flow out of the rumen. For flow measurements at the different sampling sites, 4 multiparous, lactating Nordic Red cows fitted with ruminal cannulas were used in a Latin square design with 4 dietary treatments, in which crimped barley was replaced with 3 incremental levels of protein supplementation of canola meal. Digesta was collected from the rumen, reticulum, and omasum to represent a 24-h feeding cycle. Nutrient flow was calculated using the reconstitution system based on Cr, Yb, and indigestible neutral detergent fiber and using (15)N as microbial marker. Large and small particles and the fluid phase were recovered from digesta collected at all sampling sites. Bacterial samples were isolated from the digesta collected from the omasum. Several differences existed for digesta composition, nutrient flows, and estimates of ruminal digestibility among the 3 different sampling sites. Sampling site × diet interactions were not significant. The estimated flows of DM, potentially digestible neutral detergent fiber, nonammonia N, and microbial N were significantly different between all sampling sites. However, the difference between DM flow based on sampling from the reticulum and the omasum was small (0.13 kg/d greater in the omasum). The equality between the reticulum and the omasum as sampling sites was supported by the following regression: omasal DM flow = 0.37 (±0.649) + 0.94 (±0.054) × reticular DM flow (R(2)=0.96 and root mean square error=0.438 kg/d). The larger deviation of nutrient-flow estimates when sampling digesta from the rumen rather than the reticulum, compared with the omasum, suggested that sampling from the reticulum is the most promising alternative to the omasal sampling technique. To definitively promote sampling from the reticulum as an alternative to the omasal sampling technique, more research is needed to determine selection criteria of reticular digesta for accurate and precise flow estimates across a range of diets. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  20. NBIT Program Phase I (2007-2010). Part 1, Chapters 1 Through 4

    DTIC Science & Technology

    2009-08-27

    2 schematically shows the sample prepared before hydrothermal synthesis. The thin layer of Zn was converted to ZnO nanowires during hydrothermal ... Nanoparticle-Based Magnetically Amplified Surface Plasmon Resonance (Mag-SPR) Techniques; Jinwoo Cheon (Yonsei University, Korea) and A. Paul ... Ion; Chapter 3 - Ultra-Sensitive Biological Detection via Nanoparticle-Based Magnetically Amplified Surface Plasmon Resonance (Mag-SPR) Techniques

  1. Dense mesh sampling for video-based facial animation

    NASA Astrophysics Data System (ADS)

    Peszor, Damian; Wojciechowska, Marzena

    2016-06-01

    The paper describes an approach for selecting feature points on a three-dimensional triangle mesh obtained using various techniques from several video footages. This approach has a dual purpose. First, it minimizes the data stored for the purpose of facial animation, so that instead of storing the position of each vertex in each frame, one can store only a small subset of vertices for each frame and calculate the positions of the others based on the subset. The second purpose is to select feature points that could be used for anthropometry-based retargeting of recorded mimicry to another model, with a sampling density beyond that which can be achieved using marker-based performance capture techniques. The developed approach was successfully tested on artificial models, models constructed using a structured light scanner, and models constructed from video footage using stereophotogrammetry.
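    One way to realise the first purpose, storing only a feature subset and computing the remaining vertices from it, is to learn fixed linear weights from a few training frames; this is a hypothetical sketch, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(7)

n_feat, n_other, n_train = 10, 40, 6

# Hypothetical setup: the "other" vertices are linear functions of the feature
# vertices across training frames, as in blendshape-like deformation.
W_true = rng.normal(size=(n_feat, n_other))
F_train = rng.normal(size=(n_train * 3, n_feat))   # stacked x/y/z rows per frame
O_train = F_train @ W_true

# Learn per-vertex weights once from the training frames (least squares)...
W, *_ = np.linalg.lstsq(F_train, O_train, rcond=None)

# ...then store only the feature vertices of a new frame and reconstruct the rest.
F_new = rng.normal(size=(3, n_feat))               # one new frame, x/y/z rows
O_rec = F_new @ W
O_ref = F_new @ W_true
err = np.abs(O_rec - O_ref).max()                  # reconstruction error
```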

  2. A Note on Sample Size and Solution Propriety for Confirmatory Factor Analytic Models

    ERIC Educational Resources Information Center

    Jackson, Dennis L.; Voth, Jennifer; Frey, Marc P.

    2013-01-01

    Determining an appropriate sample size for use in latent variable modeling techniques has presented ongoing challenges to researchers. In particular, small sample sizes are known to present concerns over sampling error for the variances and covariances on which model estimation is based, as well as for fit indexes and convergence failures. The…

  3. Constructed-Response Matching to Sample and Spelling Instruction.

    ERIC Educational Resources Information Center

    Dube, William V.; And Others

    1991-01-01

    This paper describes a computer-based spelling program grounded in programmed instructional techniques and using constructed-response matching-to-sample procedures. Following use of the program, two mentally retarded men successfully spelled previously misspelled words. (JDD)

  4. Cell culture-based biosensing techniques for detecting toxicity in water.

    PubMed

    Tan, Lu; Schirmer, Kristin

    2017-06-01

    The significant increase of contaminants entering fresh water bodies calls for the development of rapid and reliable methods to monitor the aquatic environment and to detect water toxicity. Cell culture-based biosensing techniques utilise the overall cytotoxic response to external stimuli, mediated by a transduced signal, to specify the toxicity of aqueous samples. These biosensing techniques can effectively indicate water toxicity for human safety and aquatic organism health. In this review we account for the recent developments of the mainstream cell culture-based biosensing techniques for water quality evaluation, discuss their key features, potentials and limitations, and outline the future prospects of their development. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  5. Rapid method for the flameless AAS determination of trace elements in geological samples

    NASA Astrophysics Data System (ADS)

    Schrön, W.; Bombach, G.; Beuge, P.

    This paper reports experience with direct quantitative trace element determinations in powdered geological samples by flameless atomic absorption spectroscopy. Two methods were explored. The first is based on the production of a sample aerosol by laser radiation in a specifically designed sample chamber and the subsequent transport of the aerosol into a graphite tube preheated to a stable temperature. This technique is suited to a large range of concentrations and is relatively free from matrix interferences. It was tested for the elements Ag, As, Bi, Cd, Co, Mn, Ni, Pb, Sb, Se, Sr and Tl. The described sample chamber can also be used in combination with other spectroscopic techniques. The second method permits the quantitative determination of trace elements at very low concentrations. Essentially, an accurately weighed amount of sample is placed on a graphite rod and introduced into a graphite furnace by inserting the rod through the sample injection port. Atomization likewise takes place under stable temperature conditions. Using this technique, detection limits were found to be 10⁻¹¹ g for Ag, 2 × 10⁻¹¹ g for Cd and 10⁻¹⁰ g for Sb in silicate materials.

  6. A review on creatinine measurement techniques.

    PubMed

    Mohabbati-Kalejahi, Elham; Azimirad, Vahid; Bahrami, Manouchehr; Ganbari, Ahmad

    2012-08-15

    This paper reviews recent global trends in creatinine measurement. Creatinine biosensors involve complex relationships between biology and the micro-mechatronics to which the blood is subjected. Comparison between new and old methods shows that new techniques (e.g. molecular imprinted polymer-based methods) are better than old methods (e.g. ELISA) in terms of stability and linear range. All methods, and their details for serum, plasma, urine and blood samples, are surveyed. They are categorized into five main approaches: optical, electrochemical, impedometric, Ion-Selective Field-Effect Transistor (ISFET)-based and chromatographic techniques. Response time, detection limit, linear range and selectivity of the reported sensors are discussed. The potentiometric measurement technique has the lowest response time, 4-10 s, while the lowest detection limit, 0.28 nmol L(-1), belongs to the chromatographic technique. Comparison between the various measurement techniques indicates that the best selectivity belongs to the MIP-based and chromatographic techniques. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Surface characterization based on optical phase shifting interferometry

    DOEpatents

    Mello, Michael; Rosakis, Ares J. [Altadena, CA]

    2011-08-02

    Apparatus, techniques and systems for implementing an optical interferometer to measure surfaces, including mapping of instantaneous curvature or in-plane and out-of-plane displacement field gradients of a sample surface, based on obtaining and processing four optical interferograms from a common optical beam reflected from the sample surface that are separated in phase by π/2 relative to one another.
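    Four interferograms separated in phase by π/2 admit the classical four-step phase-retrieval formula; the sketch below runs it on synthetic fringes and is not the patent's processing chain.

```python
import numpy as np

# Synthetic fringe signals: I_k = A + B*cos(phi + k*pi/2) for k = 0..3.
x = np.linspace(0.0, 1.0, 256)
phi = 2 * np.pi * 3 * x                   # assumed "true" phase ramp
A, B = 1.0, 0.5
I = [A + B * np.cos(phi + k * np.pi / 2) for k in range(4)]

# Classical four-step phase retrieval: phi = atan2(I3 - I1, I0 - I2),
# since I3 - I1 = 2B*sin(phi) and I0 - I2 = 2B*cos(phi).
phi_wrapped = np.arctan2(I[3] - I[1], I[0] - I[2])

# Agreement with the true phase, compared modulo 2*pi.
phi_ref = np.angle(np.exp(1j * phi))
err = np.abs(np.angle(np.exp(1j * (phi_wrapped - phi_ref)))).max()
```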

  8. Nano-yttria dispersed stainless steel composites composed by the 3 dimensional fiber deposition technique

    NASA Astrophysics Data System (ADS)

    Verhiest, K.; Mullens, S.; De Wispelaere, N.; Claessens, S.; DeBremaecker, A.; Verbeken, K.

    2012-09-01

    In this study, oxide dispersion strengthened (ODS) 316L steel samples were manufactured by the 3-dimensional fiber deposition (3DFD) technique. The performance of 3DFD as a colloidal consolidation technique to obtain porous green bodies based on yttria (Y2O3) nano-slurries or paste is discussed within this experimental work. The influence of sintering temperature and time on sample densification and grain growth was investigated. Hot consolidation was performed to obtain final product quality in terms of residual porosity reduction and final dispersion homogeneity.

  9. Field results of antifouling techniques for optical instruments

    USGS Publications Warehouse

    Strahle, W.J.; Hotchkiss, F.S.; Martini, Marinna A.

    1998-01-01

    An anti-fouling technique is developed for the protection of optical instruments from biofouling; it leaches a bromide compound into a sample chamber and pumps fresh water into the chamber prior to measurement. The primary advantage of using bromide is that it is less toxic than metal-based antifoulants. The drawback of the bromide technique is also discussed.

  10. Development of Gold Standard Ion-Selective Electrode-Based Methods for Fluoride Analysis

    PubMed Central

    Martínez-Mier, E.A.; Cury, J.A.; Heilman, J.R.; Katz, B.P.; Levy, S.M.; Li, Y.; Maguire, A.; Margineda, J.; O’Mullane, D.; Phantumvanit, P.; Soto-Rojas, A.E.; Stookey, G.K.; Villa, A.; Wefel, J.S.; Whelton, H.; Whitford, G.M.; Zero, D.T.; Zhang, W.; Zohouri, V.

    2011-01-01

    Background/Aims: Currently available techniques for fluoride analysis are not standardized. Therefore, this study was designed to develop standardized methods for analyzing fluoride in biological and nonbiological samples used for dental research. Methods: A group of nine laboratories analyzed a set of standardized samples for fluoride concentration using their own methods. The group then reviewed existing analytical techniques for fluoride analysis, identified inconsistencies in the use of these techniques and conducted testing to resolve differences. Based on the results of the testing undertaken to define the best approaches for the analysis, the group developed recommendations for direct and microdiffusion methods using the fluoride ion-selective electrode. Results: Initial results demonstrated that there was no consensus regarding the choice of analytical techniques for different types of samples. Although for several types of samples the results of the fluoride analyses were similar among some laboratories, greater differences were observed for saliva, food and beverage samples. In spite of these initial differences, precise and true values of fluoride concentration, as well as smaller differences between laboratories, were obtained once the standardized methodologies were used. Intraclass correlation coefficients ranged from 0.90 to 0.93 for the analysis of a certified reference material using the standardized methodologies. Conclusion: The results of this study demonstrate that the development and use of standardized protocols for F analysis significantly decreased differences among laboratories and resulted in more precise and true values. PMID:21160184
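    A typical fluoride ion-selective-electrode calibration (sketched here with invented standards and potentials, not the study's data) fits a Nernst-type line of potential against log10 concentration and inverts it for unknowns.

```python
import numpy as np

# Hypothetical fluoride standards (ppm) and measured electrode potentials (mV),
# following a near-Nernstian slope of about -59 mV per decade at 25 °C.
std_ppm = np.array([0.1, 0.5, 1.0, 5.0, 10.0])
potentials_mV = np.array([120.1, 78.9, 61.0, 19.8, 2.1])

# Fit E = E0 + S * log10(C) ...
S, E0 = np.polyfit(np.log10(std_ppm), potentials_mV, 1)

# ... and invert the line to read an unknown sample off the calibration.
def ppm_from_mV(E):
    return 10.0 ** ((E - E0) / S)

unknown = float(ppm_from_mV(40.0))        # concentration at a 40 mV reading
```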

  11. Dynamic magnetic resonance imaging method based on golden-ratio cartesian sampling and compressed sensing.

    PubMed

    Li, Shuo; Zhu, Yanchun; Xie, Yaoqin; Gao, Song

    2018-01-01

    Dynamic magnetic resonance imaging (DMRI) is used to noninvasively trace the movements of organs and the process of drug delivery. The results can provide quantitative or semiquantitative pathology-related parameters, thus giving DMRI great potential for clinical applications. However, conventional DMRI techniques suffer from low temporal resolution and long scan time owing to the limitations of the k-space sampling scheme and image reconstruction algorithm. In this paper, we propose a novel DMRI sampling scheme based on a golden-ratio Cartesian trajectory in combination with a compressed sensing reconstruction algorithm. The results of two simulation experiments, designed according to the two major DMRI techniques, showed that the proposed method can improve the temporal resolution and shorten the scan time and provide high-quality reconstructed images.
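    The golden-ratio ordering can be sketched as follows; this is a generic illustration of the idea, not the paper's exact trajectory.

```python
import numpy as np

def golden_ratio_lines(n_lines, n_acquired):
    """Phase-encode line indices chosen by stepping through normalised
    k-space in increments of the inverse golden ratio, so that any
    contiguous run of acquisitions covers k-space fairly uniformly
    (useful for retrospective selection of temporal windows)."""
    g = (np.sqrt(5.0) - 1.0) / 2.0        # inverse golden ratio, ~0.618
    positions = (np.arange(n_acquired) * g) % 1.0
    return np.floor(positions * n_lines).astype(int)

lines = golden_ratio_lines(n_lines=256, n_acquired=64)

# Fraction of acquisitions that hit distinct k-space lines.
coverage = len(np.unique(lines)) / 64.0
```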

  12. Systematic comparison of static and dynamic headspace sampling techniques for gas chromatography.

    PubMed

    Kremser, Andreas; Jochmann, Maik A; Schmidt, Torsten C

    2016-09-01

    Six automated, headspace-based sample preparation techniques were used to extract volatile analytes from water, with the goal of establishing a systematic comparison between commonly available instrumental alternatives. To that end, these six techniques were used in conjunction with the same gas chromatography instrument for analysis of a common set of volatile organic compound (VOC) analytes. The methods were divided into three classes: static sampling (by syringe or loop), static enrichment (SPME and PAL SPME Arrow), and dynamic enrichment (ITEX and trap sampling). For PAL SPME Arrow, different sorption phase materials were also included in the evaluation. To enable an effective comparison, method detection limits (MDLs), relative standard deviations (RSDs), and extraction yields were determined and are discussed for all techniques. While static sampling techniques exhibited extraction yields (approx. 10-20 %) sufficient for reliable use down to approx. 100 ng L(-1), enrichment techniques displayed extraction yields of up to 80 %, resulting in MDLs down to the picogram per liter range. RSDs for all techniques were below 27 %. The choice among the different instrumental modes of operation (the aforementioned classes) was the most influential parameter in terms of extraction yields and MDLs. Individual methods within each class showed smaller deviations, and the smallest influence was observed when evaluating different sorption phase materials for the individual enrichment techniques. The option of selecting specialized sorption phase materials may, however, be more important when analyzing analytes with different properties, such as high polarity or the capability of specific molecular interactions. Graphical abstract: PAL SPME Arrow during the extraction of volatile analytes from the headspace of an aqueous sample.
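    Method detection limits of the kind compared above are commonly derived from replicate low-level spikes; this is a generic EPA-style sketch with invented replicate values, not the paper's MDL procedure.

```python
import numpy as np

# Seven replicate measurements of a low-level spike (invented values, ng/L).
replicates = np.array([102.0, 95.0, 110.0, 98.0, 105.0, 99.0, 101.0])

# EPA-style MDL: one-sided Student-t quantile (99% confidence, n-1 = 6 df)
# multiplied by the standard deviation of the replicates.
t_99_6df = 3.143
mdl = float(t_99_6df * replicates.std(ddof=1))
```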

  13. Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.

    PubMed

    Wang, Zuozhen

    2018-01-01

    The bootstrapping technique is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial based on a relatively small sample. In this paper, sample size estimation for comparing two parallel-design arms with continuous data by the bootstrap procedure is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculation by mathematical formulas (under the normal distribution assumption) for the identical data is also carried out. Consequently, the power difference between the two calculation methods is acceptably small for all test types, showing that the bootstrap procedure is a credible technique for sample size estimation. After that, we compared the powers determined using the two methods based on data that violate the normal distribution assumption. To accommodate the feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during the bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation at the outset, and that the same statistical method as used in the subsequent statistical analysis be employed for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
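    The recommended procedure can be sketched as follows, with invented skewed pilot data; a normal-approximation rank-sum test stands in for the Wilcoxon test used in the paper.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical skewed pilot data for two arms (violating normality).
pilot_a = rng.lognormal(mean=0.0, sigma=0.6, size=40)
pilot_b = rng.lognormal(mean=0.5, sigma=0.6, size=40)

def rank_sum_p(x, y):
    """Two-sided p-value of the Wilcoxon rank-sum test (normal approximation)."""
    n, m = len(x), len(y)
    ranks = np.argsort(np.argsort(np.concatenate([x, y]))) + 1
    w = ranks[:n].sum()                   # rank sum of the first sample
    mu = n * (n + m + 1) / 2
    sd = math.sqrt(n * m * (n + m + 1) / 12)
    z = (w - mu) / sd
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

def bootstrap_power(n_per_arm, n_boot=300, alpha=0.05):
    """Resample the pilot arms at a candidate sample size and count how
    often the planned test rejects: the bootstrap estimate of power."""
    hits = 0
    for _ in range(n_boot):
        a = rng.choice(pilot_a, size=n_per_arm, replace=True)
        b = rng.choice(pilot_b, size=n_per_arm, replace=True)
        hits += rank_sum_p(a, b) < alpha
    return hits / n_boot

# Power grows with the sample size; one would pick the smallest n
# reaching the target power.
power_n10, power_n60 = bootstrap_power(10), bootstrap_power(60)
```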

  14. Study on fast measurement of sugar content of yogurt using Vis/NIR spectroscopy techniques

    NASA Astrophysics Data System (ADS)

    He, Yong; Feng, Shuijuan; Wu, Di; Li, Xiaoli

    2006-09-01

    To measure the sugar content of yogurt rapidly, a fast measurement method using Vis/NIR spectroscopy techniques was established. 25 samples, selected separately from five different brands of yogurt, were measured by Vis/NIR spectroscopy. The sugar content at the positions scanned by the spectrometer was measured with a sugar content meter. A mathematical model relating sugar content to the Vis/NIR spectral measurements was established and developed based on partial least squares (PLS). The correlation coefficient of sugar content based on the PLS model is more than 0.894, the standard error of calibration (SEC) is 0.356, and the standard error of prediction (SEP) is 0.389. In quantitative prediction of the sugar content of 35 yogurt samples from 5 different brands, the correlation coefficient between the predicted and measured values is more than 0.934. The results show good to excellent prediction performance, and the Vis/NIR spectroscopy technique had significantly greater accuracy for determining the sugar content. It was concluded that the Vis/NIR measurement technique is reliable for the fast measurement of the sugar content of yogurt, and that a new method for this measurement was established.
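    The calibration pipeline can be sketched with a minimal NIPALS implementation of PLS1 on synthetic "spectra" (the study used measured Vis/NIR spectra; everything below is invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic spectra: 25 samples x 50 wavelengths, driven by a sugar-related
# band plus an interfering band and a little noise.
n, p = 25, 50
sugar = rng.uniform(5, 15, n)
interf = rng.normal(size=n)
grid = np.arange(p)
band_sugar = np.exp(-0.5 * ((grid - 15) / 4.0) ** 2)
band_interf = np.exp(-0.5 * ((grid - 35) / 6.0) ** 2)
X = (np.outer(sugar, band_sugar) + np.outer(interf, band_interf)
     + 0.01 * rng.normal(size=(n, p)))
y = sugar.copy()

def pls1_fit(X, y, n_comp=2):
    """Minimal NIPALS PLS1; returns regression coefficients and the means."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector from X-y covariance
        t = Xc @ w
        tt = t @ t
        p_load, q_load = Xc.T @ t / tt, yc @ t / tt
        Xc -= np.outer(t, p_load)         # deflate X and y
        yc -= q_load * t
        W.append(w); P.append(p_load); q.append(q_load)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.solve(P.T @ W, q)   # coefficients for centered data
    return B, X.mean(0), y.mean()

B, x_mean, y_mean = pls1_fit(X, y)
y_hat = (X - x_mean) @ B + y_mean
r = np.corrcoef(y, y_hat)[0, 1]           # calibration correlation coefficient
```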

  15. The Effects of Sampling Probe Design and Sampling Techniques on Aerosol Measurements

    DTIC Science & Technology

    1975-05-01

    Schematic of Extraction and Sampling System; Filter Housing; Theoretical Isokinetic Flow Requirements of the EPA Sampling ... from the flow parameters based on a zero-error assumption at isokinetic sampling conditions. Isokinetic, or equal-velocity, sampling was ... prior to testing the probes. It was also used to measure the flow field adjacent to the probe inlets to determine the isokinetic condition of the

  16. Review of the quantification techniques for polycyclic aromatic hydrocarbons (PAHs) in food products.

    PubMed

    Bansal, Vasudha; Kumar, Pawan; Kwon, Eilhann E; Kim, Ki-Hyun

    2017-10-13

    There is a growing need for accurate detection of trace-level PAHs in food products due to the numerous detrimental effects caused by their contamination (e.g., toxicity, carcinogenicity, and teratogenicity). This review discusses up-to-date knowledge of the measurement techniques available for PAHs contained in food and related products, with the aim of helping to reduce their deleterious impacts on human health through accurate quantification. The main part of this review is dedicated to the opportunities and practical options for the treatment of various food samples and for accurate quantification of the PAHs contained in those samples. Basic information regarding all available analytical measurement techniques for PAHs in food samples is also evaluated with respect to their performance in terms of quality assurance.

  17. Mass spectrometry for fragment screening.

    PubMed

    Chan, Daniel Shiu-Hin; Whitehouse, Andrew J; Coyne, Anthony G; Abell, Chris

    2017-11-08

    Fragment-based approaches in chemical biology and drug discovery have been widely adopted worldwide in both academia and industry. Fragment hits tend to interact weakly with their targets, necessitating the use of sensitive biophysical techniques to detect their binding. Common fragment screening techniques include differential scanning fluorimetry (DSF) and ligand-observed NMR. Validation and characterization of hits is usually performed using a combination of protein-observed NMR, isothermal titration calorimetry (ITC) and X-ray crystallography. In this context, MS is a relatively underutilized technique in fragment screening for drug discovery. MS-based techniques have the advantage of high sensitivity, low sample consumption and being label-free. This review highlights recent examples of the emerging use of MS-based techniques in fragment screening. © 2017 The Author(s). Published by Portland Press Limited on behalf of the Biochemical Society.

  18. Identification of Metal Oxide Nanoparticles in Histological Samples by Enhanced Darkfield Microscopy and Hyperspectral Mapping.

    PubMed

    Roth, Gary A; Sosa Peña, Maria del Pilar; Neu-Baker, Nicole M; Tahiliani, Sahil; Brenner, Sara A

    2015-12-08

    Nanomaterials are increasingly prevalent throughout industry, manufacturing, and biomedical research. The need for tools and techniques that aid in the identification, localization, and characterization of nanoscale materials in biological samples is on the rise. Currently available methods, such as electron microscopy, tend to be resource-intensive, making their use prohibitive for much of the research community. Enhanced darkfield microscopy complemented with a hyperspectral imaging system may provide a solution to this bottleneck by enabling rapid and less expensive characterization of nanoparticles in histological samples. This method allows for high-contrast nanoscale imaging as well as nanomaterial identification. For this technique, histological tissue samples are prepared as they would be for light-based microscopy. First, positive control samples are analyzed to generate the reference spectra that will enable the detection of a material of interest in the sample. Negative controls without the material of interest are also analyzed in order to improve specificity (reduce false positives). Samples can then be imaged and analyzed using methods and software for hyperspectral microscopy or matched against these reference spectra in order to provide maps of the location of materials of interest in a sample. The technique is particularly well-suited for materials with highly unique reflectance spectra, such as noble metals, but is also applicable to other materials, such as semi-metallic oxides. This technique provides information that is difficult to acquire from histological samples without the use of electron microscopy techniques, which may provide higher sensitivity and resolution, but are vastly more resource-intensive and time-consuming than light microscopy.
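    The matching step described above (comparing each pixel spectrum against reference spectra built from positive-control samples) can be sketched with a simple spectral-angle classifier. This is a generic illustration, not the actual hyperspectral-mapping software's algorithm; the reference spectra, material names, and angular threshold below are hypothetical.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two reflectance spectra; smaller = closer match."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_pixel(pixel, reference_library, threshold=0.1):
    """Return the best-matching reference material for a pixel spectrum,
    or None if nothing in the library is within the angular threshold."""
    best_name, best_angle = None, threshold
    for name, ref in reference_library.items():
        angle = spectral_angle(pixel, ref)
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name

# Hypothetical 4-band reference spectra derived from positive controls.
library = {
    "TiO2":       [0.10, 0.50, 0.90, 0.40],
    "background": [0.40, 0.40, 0.40, 0.40],
}
print(classify_pixel([0.11, 0.52, 0.88, 0.41], library))  # prints: TiO2
```

    Spectra from negative-control samples can be screened the same way to tune the threshold so that false positives are suppressed.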

  19. A rapid method for the sampling of atmospheric water vapour for isotopic analysis.

    PubMed

    Peters, Leon I; Yakir, Dan

    2010-01-01

    Analysis of the stable isotopic composition of atmospheric moisture is widely applied in the environmental sciences. Traditional methods for obtaining isotopic compositional data from ambient moisture have required complicated sampling procedures, expensive and sophisticated distillation lines, hazardous consumables, and lengthy treatments prior to analysis. Newer laser-based techniques are expensive and usually not suitable for large-scale field campaigns, especially in cases where access to mains power is not feasible or high spatial coverage is required. Here we outline the construction and usage of a novel vapour-sampling system based on a battery-operated Stirling cycle cooler, which is simple to operate, does not require any consumables, or post-collection distillation, and is light-weight and highly portable. We demonstrate the ability of this system to reproduce delta(18)O isotopic compositions of ambient water vapour, with samples taken simultaneously by a traditional cryogenic collection technique. Samples were collected over 1 h directly into autosampler vials and were analysed by mass spectrometry after pyrolysis of 1 microL aliquots to CO. This yielded an average error of < +/-0.5 per thousand, approximately equal to the signal-to-noise ratio of traditional approaches. This new system provides a rapid and reliable alternative to conventional cryogenic techniques, particularly in cases requiring high sample throughput or where access to distillation lines, slurry maintenance or mains power is not feasible. Copyright 2009 John Wiley & Sons, Ltd.

  20. The application of compressive sampling in rapid ultrasonic computerized tomography (UCT) technique of steel tube slab (STS).

    PubMed

    Jiang, Baofeng; Jia, Pengjiao; Zhao, Wen; Wang, Wentao

    2018-01-01

    This paper explores a new method for rapid structural damage inspection of steel tube slab (STS) structures along randomly measured paths, based on a combination of compressive sampling (CS) and ultrasonic computerized tomography (UCT). In the measurement stage, using fewer randomly selected paths rather than the whole measurement net is proposed to detect the underlying damage of a concrete-filled steel tube. In the imaging stage, the ℓ1-minimization algorithm is employed to recover the information of the microstructures based on the measurement data related to the internal situation of the STS structure. A numerical concrete tube model, with various levels of damage, was studied to demonstrate the performance of the rapid UCT technique. Real-world concrete-filled steel tubes in the Shenyang Metro stations were inspected using the proposed UCT technique in a CS framework. Both the numerical and experimental results show that the rapid UCT technique is capable of damage detection in an STS structure with a high level of accuracy and fewer required measurements, making it more convenient and efficient than the traditional UCT technique.

  1. Enantiospecific Detection of Chiral Nanosamples Using Photoinduced Force

    NASA Astrophysics Data System (ADS)

    Kamandi, Mohammad; Albooyeh, Mohammad; Guclu, Caner; Veysi, Mehdi; Zeng, Jinwei; Wickramasinghe, Kumar; Capolino, Filippo

    2017-12-01

    We propose a high-resolution microscopy technique for enantiospecific detection of chiral samples down to sub-100-nm size based on force measurement. We delve into the differential photoinduced optical force Δ F exerted on an achiral probe in the vicinity of a chiral sample when left and right circularly polarized beams separately excite the sample-probe interactive system. We analytically prove that Δ F is entangled with the enantiomer type of the sample enabling enantiospecific detection of chiral inclusions. Moreover, we demonstrate that Δ F is linearly dependent on both the chiral response of the sample and the electric response of the tip and is inversely related to the quartic power of probe-sample distance. We provide physical insight into the transfer of optical activity from the chiral sample to the achiral tip based on a rigorous analytical approach. We support our theoretical achievements by several numerical examples highlighting the potential application of the derived analytic properties. Lastly, we demonstrate the sensitivity of our method to enantiospecify nanoscale chiral samples with chirality parameter on the order of 0.01 and discuss how the sensitivity of our proposed technique can be further improved.
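    The derived analytic properties (ΔF linear in the sample chirality and the tip's electric response, and inversely proportional to the quartic power of the probe-sample distance) can be checked in a toy model; the prefactor C and all parameter values below are arbitrary placeholders, not values from the paper.

```python
def delta_force(kappa, alpha_tip, distance, C=1.0):
    """Toy model of the differential photoinduced force: linear in the sample
    chirality parameter (kappa) and the tip's electric polarizability
    (alpha_tip), falling off as 1/distance**4. C is an assumed prefactor
    standing in for the field and geometry factors."""
    return C * kappa * alpha_tip / distance ** 4

# Opposite enantiomers (kappa of opposite sign) flip the sign of delta F,
# which is what makes the detection enantiospecific.
f_left = delta_force(+0.01, 1.0, 10e-9)
f_right = delta_force(-0.01, 1.0, 10e-9)

# Quartic distance dependence: halving the probe-sample distance
# increases the differential signal 16-fold.
ratio = delta_force(0.01, 1.0, 5e-9) / delta_force(0.01, 1.0, 10e-9)
```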

  2. Comparison of Two Methods for Estimating the Sampling-Related Uncertainty of Satellite Rainfall Averages Based on a Large Radar Data Set

    NASA Technical Reports Server (NTRS)

    Lau, William K. M. (Technical Monitor); Bell, Thomas L.; Steiner, Matthias; Zhang, Yu; Wood, Eric F.

    2002-01-01

    The uncertainty of rainfall estimated from averages of discrete samples collected by a satellite is assessed using a multi-year radar data set covering a large portion of the United States. The sampling-related uncertainty of rainfall estimates is evaluated for all combinations of 100 km, 200 km, and 500 km space domains, 1 day, 5 day, and 30 day rainfall accumulations, and regular sampling time intervals of 1 h, 3 h, 6 h, 8 h, and 12 h. These extensive analyses are combined to characterize the sampling uncertainty as a function of space and time domain, sampling frequency, and rainfall characteristics by means of a simple scaling law. Moreover, it is shown that both parametric and non-parametric statistical techniques of estimating the sampling uncertainty produce comparable results. Sampling uncertainty estimates, however, do depend on the choice of technique for obtaining them. They can also vary considerably from case to case, reflecting the great variability of natural rainfall, and should therefore be expressed in probabilistic terms. Rainfall calibration errors are shown to affect comparison of results obtained by studies based on data from different climate regions and/or observation platforms.
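    The non-parametric route to the sampling-related uncertainty can be sketched as follows: subsample a continuous record at a regular interval, compare the subsampled average with the full average over all sampling phases, and report the relative RMS difference. The synthetic "radar" data below is a crude stand-in for real rainfall.

```python
import numpy as np

def sampling_uncertainty(rain, step):
    """Relative RMS error of time averages built from samples taken every
    `step` hours, versus the full average, over all sampling phases.
    rain: (n_boxes, n_hours) array of hourly rain rates."""
    truth = rain.mean(axis=1)
    errors = []
    for phase in range(step):
        estimate = rain[:, phase::step].mean(axis=1)
        errors.append(estimate - truth)
    errors = np.concatenate(errors)
    return np.sqrt(np.mean(errors ** 2)) / truth.mean()

# Synthetic hourly "radar rain rate": 500 grid boxes over 30 days (720 h),
# intermittent and highly skewed, loosely mimicking natural rainfall.
rng = np.random.default_rng(1)
rain = rng.gamma(0.2, 2.0, size=(500, 720)) * (rng.random((500, 720)) < 0.15)

u3 = sampling_uncertainty(rain, 3)     # sampled every 3 h
u12 = sampling_uncertainty(rain, 12)   # sampled every 12 h
# Sparser sampling gives a larger sampling-related uncertainty.
```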

  3. Efficacy of the FilmArray blood culture identification panel for direct molecular diagnosis of infectious diseases from samples other than blood.

    PubMed

    Micó, Miquel; Navarro, Ferran; de Miniac, Daniela; González, Yésica; Brell, Albert; López, Cristina; Sánchez-Reus, Ferran; Mirelis, Beatriz; Coll, Pere

    2015-12-01

    Molecular-based techniques reduce the delay in diagnosing infectious diseases and therefore contribute to better patient outcomes. We assessed the FilmArray blood culture identification (BCID) panel (Biofire Diagnostics/bioMérieux) directly on clinical specimens other than blood: cerebrospinal, joint, pleural and ascitic fluids, bronchoscopy samples and abscesses. We compared the results from 88 samples obtained by culture-based techniques. The percentage of agreement between the two methods was 75 % with a Cohen κ value of 0.51. Global sensitivity and specificity using the FilmArray BCID panel were 71 and 97 %, respectively. Sensitivity was poorer in samples with a low bacterial load, such as ascitic and pleural fluids (25 %), whereas the sensitivity for abscess samples was high (89 %). These findings suggest that the FilmArray BCID panel could be useful to perform microbiological diagnosis directly from samples other than positive blood cultures, as it offers acceptable sensitivity and moderate agreement with conventional microbiological methods. Nevertheless, cost-benefit studies should be performed before introducing this method into algorithms for microbiological diagnostics.
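    The agreement statistics quoted above (percentage agreement, Cohen's κ, sensitivity, specificity) are standard and easy to compute; a minimal sketch with hypothetical binary results follows.

```python
def cohen_kappa(a, b):
    """Cohen's kappa for two binary raters: chance-corrected agreement."""
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n    # observed agreement
    pa, pb = sum(a) / n, sum(b) / n               # positive rates of each rater
    pe = pa * pb + (1 - pa) * (1 - pb)            # agreement expected by chance
    return (po - pe) / (1 - pe)

def sensitivity_specificity(test, truth):
    """Sensitivity and specificity of a binary test against a reference method."""
    tp = sum(t and r for t, r in zip(test, truth))
    tn = sum((not t) and (not r) for t, r in zip(test, truth))
    fn = sum((not t) and r for t, r in zip(test, truth))
    fp = sum(t and (not r) for t, r in zip(test, truth))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical paired results (1 = organism detected) for six specimens.
filmarray = [1, 1, 1, 0, 0, 0]
culture   = [1, 1, 0, 0, 0, 1]
kappa = cohen_kappa(filmarray, culture)
sens, spec = sensitivity_specificity(filmarray, culture)
```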

  4. Study of Vis/NIR spectroscopy measurement on acidity of yogurt

    NASA Astrophysics Data System (ADS)

    He, Yong; Feng, Shuijuan; Wu, Di; Li, Xiaoli

    2006-09-01

    A fast method for measuring the pH of yogurt using Vis/NIR spectroscopy was established in order to determine the acidity of yogurt rapidly. 27 samples selected from five different brands of yogurt were measured by Vis/NIR spectroscopy, and the pH at the positions scanned by the spectrometer was measured with a pH meter. A mathematical model relating pH to the Vis/NIR spectral measurements was developed based on partial least squares (PLS) using Unscrambler V9.2. Then 25 unknown samples from the 5 brands were predicted with this model. The results show that the correlation coefficient of the PLS calibration model is above 0.890, with a standard error of calibration (SEC) of 0.037 and a standard error of prediction (SEP) of 0.043. For the 25 predicted samples, the correlation coefficient between predicted and measured values is above 0.918, indicating good to excellent prediction performance. The Vis/NIR spectroscopy technique thus determines pH with high accuracy, and it was concluded that this measurement technique can be used to measure the pH of yogurt quickly and accurately, establishing a new method for the measurement of the pH of yogurt.
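    The PLS calibration step can be sketched with a minimal single-response PLS (NIPALS) implementation; the spectra and the pH relation below are synthetic stand-ins for the Vis/NIR measurements, not the study's data.

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal PLS1 regression (NIPALS). Returns the regression vector and
    the centering terms needed for prediction."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    X0, y0 = X - x_mean, y - y_mean
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X0.T @ y0
        w /= np.linalg.norm(w)            # weight vector
        t = X0 @ w                        # scores
        tt = t @ t
        p = X0.T @ t / tt                 # X loadings
        qk = y0 @ t / tt                  # y loading
        X0 = X0 - np.outer(t, p)          # deflate X and y
        y0 = y0 - qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    B = W @ np.linalg.inv(P.T @ W) @ q    # regression vector
    return B, x_mean, y_mean

def pls1_predict(X, B, x_mean, y_mean):
    return (X - x_mean) @ B + y_mean

# Synthetic stand-in for Vis/NIR spectra: 27 calibration samples x 100
# wavelengths, with pH driven by a few latent spectral features plus noise.
rng = np.random.default_rng(0)
latent = rng.standard_normal((27, 3))
loadings = rng.standard_normal((3, 100))
X = latent @ loadings + 0.05 * rng.standard_normal((27, 100))
pH = 4.2 + 0.3 * latent[:, 0] - 0.2 * latent[:, 1]   # hypothetical relation

B, xm, ym = pls1_fit(X, pH, n_components=3)
pred = pls1_predict(X, B, xm, ym)
r = np.corrcoef(pred, pH)[0, 1]                  # calibration correlation
sec = np.sqrt(np.mean((pred - pH) ** 2))         # standard error of calibration
```

    In practice one would hold out a prediction set (as the study does with its 25 unknown samples) and report the SEP on that set rather than the calibration fit.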

  5. Comparative Evaluation of Marginal Accuracy of a Cast Fixed Partial Denture Compared to Soldered Fixed Partial Denture Made of Two Different Base Metal Alloys and Casting Techniques: An In vitro Study.

    PubMed

    Jei, J Brintha; Mohan, Jayashree

    2014-03-01

    The periodontal health of abutment teeth and the durability of a fixed partial denture depend on the marginal adaptation of the prosthesis. Any discrepancy in the marginal area leads to dissolution of the luting agent and plaque accumulation. This study was done with the aim of evaluating the accuracy of marginal fit of four-unit crown and bridge frameworks made of Ni-Cr and Cr-Co alloys under induction and centrifugal casting, comparing cast fixed partial dentures (FPDs) to soldered FPDs. For the purpose of this study a metal model was fabricated. A total of 40 samples (4-unit crown and bridge) were prepared: 20 Cr-Co samples and 20 Ni-Cr samples. Within the 20 samples of each group, 10 were prepared by the induction casting technique and the other 10 by the centrifugal casting technique. The cast FPD samples were seated on the model and measured with a travelling microscope having a precision of 0.001 cm. The samples were then sectioned between the two pontics, measured, and soldered with a torch soldering unit. The marginal discrepancy of the soldered samples was measured and all findings were statistically analysed. The results revealed minimal marginal discrepancy with Cr-Co samples compared to Ni-Cr samples prepared by the induction casting technique. Compared to the cast FPD samples, the soldered group showed reduced marginal discrepancy.

  6. Comparing the accuracy and precision of three techniques used for estimating missing landmarks when reconstructing fossil hominin crania.

    PubMed

    Neeser, Rudolph; Ackermann, Rebecca Rogers; Gain, James

    2009-09-01

    Various methodological approaches have been used for reconstructing fossil hominin remains in order to increase sample sizes and to better understand morphological variation. Among these, morphometric quantitative techniques for reconstruction are increasingly common. Here we compare the accuracy of three approaches--mean substitution, thin plate splines, and multiple linear regression--for estimating missing landmarks of damaged fossil specimens. Comparisons are made varying the number of missing landmarks, sample sizes, and the reference species of the population used to perform the estimation. The testing is performed on landmark data from individuals of Homo sapiens, Pan troglodytes and Gorilla gorilla, and nine hominin fossil specimens. Results suggest that when a small, same-species fossil reference sample is available to guide reconstructions, thin plate spline approaches perform best. However, if no such sample is available (or if the species of the damaged individual is uncertain), estimates of missing morphology based on a single individual (or even a small sample) of close taxonomic affinity are less accurate than those based on a large sample of individuals drawn from more distantly related extant populations using a technique (such as a regression method) able to leverage the information (e.g., variation/covariation patterning) contained in this large sample. Thin plate splines also show an unexpectedly large amount of error in estimating landmarks, especially over large areas. Recommendations are made for estimating missing landmarks under various scenarios. Copyright 2009 Wiley-Liss, Inc.
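    The regression approach to landmark estimation can be sketched as follows: learn a least-squares mapping from the observed landmarks to the missing one on a complete reference sample, then apply it to the damaged specimen. The 2-D landmark data below is synthetic, and mean substitution is included for comparison.

```python
import numpy as np

def fit_landmark_regression(reference, missing_idx):
    """Least-squares map from the observed landmark coordinates to the missing
    landmark, learned on a complete reference sample.
    reference: (n_specimens, n_landmarks, 2) array of 2-D landmarks."""
    n = reference.shape[0]
    X = np.delete(reference, missing_idx, axis=1).reshape(n, -1)
    X = np.hstack([X, np.ones((n, 1))])              # intercept column
    y = reference[:, missing_idx, :]
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def estimate_missing(specimen, missing_idx, beta):
    x = np.delete(specimen, missing_idx, axis=0).ravel()
    return np.hstack([x, 1.0]) @ beta

# Synthetic "population": 200 specimens with 5 correlated 2-D landmarks,
# so the preserved landmarks carry information about the missing one.
rng = np.random.default_rng(2)
base = rng.standard_normal((200, 1, 2))
ref = np.tile(base, (1, 5, 1)) + 0.1 * rng.standard_normal((200, 5, 2))

train, test = ref[50:], ref[:50]
beta = fit_landmark_regression(train, missing_idx=4)
reg_err = np.mean([np.linalg.norm(estimate_missing(s, 4, beta) - s[4])
                   for s in test])
# Baseline: mean substitution ignores the specimen's own geometry entirely.
mean_sub = train[:, 4, :].mean(axis=0)
mean_err = np.mean([np.linalg.norm(mean_sub - s[4]) for s in test])
# On this correlated population, regression beats mean substitution.
```

    This mirrors the study's finding that a regression method can exploit the variation/covariation patterning of a large reference sample, whereas mean substitution cannot.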

  7. A High-Throughput Method for Direct Detection of Therapeutic Oligonucleotide-Induced Gene Silencing In Vivo

    PubMed Central

    Coles, Andrew H.; Osborn, Maire F.; Alterman, Julia F.; Turanov, Anton A.; Godinho, Bruno M.D.C.; Kennington, Lori; Chase, Kathryn; Aronin, Neil

    2016-01-01

    Preclinical development of RNA interference (RNAi)-based therapeutics requires a rapid, accurate, and robust method of simultaneously quantifying mRNA knockdown in hundreds of samples. The most well-established method to achieve this is quantitative real-time polymerase chain reaction (qRT-PCR), a labor-intensive methodology that requires sample purification, which increases the potential to introduce additional bias. Here, we describe that the QuantiGene® branched DNA (bDNA) assay linked to a 96-well Qiagen TissueLyser II is a quick and reproducible alternative to qRT-PCR for quantitative analysis of mRNA expression in vivo directly from tissue biopsies. The bDNA assay is a high-throughput, plate-based, luminescence technique, capable of directly measuring mRNA levels from tissue lysates derived from various biological samples. We have performed a systematic evaluation of this technique for in vivo detection of RNAi-based silencing. We show that similar quality data is obtained from purified RNA and tissue lysates. In general, we observe low intra- and inter-animal variability (around 10% for control samples), and high intermediate precision. This allows minimization of sample size for evaluation of oligonucleotide efficacy in vivo. PMID:26595721

  8. Tackling sampling challenges in biomolecular simulations.

    PubMed

    Barducci, Alessandro; Pfaendtner, Jim; Bonomi, Massimiliano

    2015-01-01

    Molecular dynamics (MD) simulations are a powerful tool to give an atomistic insight into the structure and dynamics of proteins. However, the time scales accessible in standard simulations, which often do not match those in which interesting biological processes occur, limit their predictive capabilities. Many advanced sampling techniques have been proposed over the years to overcome this limitation. This chapter focuses on metadynamics, a method based on the introduction of a time-dependent bias potential to accelerate sampling and recover equilibrium properties of a few descriptors that are able to capture the complexity of a process at a coarse-grained level. The theory of metadynamics and its combination with other popular sampling techniques such as the replica exchange method is briefly presented. Practical applications of these techniques to the study of the Trp-Cage miniprotein folding are also illustrated. The examples contain a guide for performing these calculations with PLUMED, a plugin to perform enhanced sampling simulations in combination with many popular MD codes.
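    A minimal 1-D sketch of metadynamics (not PLUMED itself) conveys the core idea: periodically deposit repulsive Gaussian hills at the current value of the collective variable, so the accumulating time-dependent bias pushes the system over barriers it would rarely cross unbiased. The double-well potential and all parameters below are illustrative.

```python
import numpy as np

def grad_potential(x):
    """Gradient of a double-well potential V(x) = (x^2 - 1)^2."""
    return 4.0 * x * (x * x - 1.0)

def grad_bias(x, centers, height=0.15, width=0.2):
    """Gradient of the metadynamics bias: a sum of Gaussian hills deposited
    at previously visited values of the collective variable x."""
    if centers.size == 0:
        return 0.0
    d = x - centers
    return float(np.sum(-height * d / width**2 * np.exp(-d * d / (2 * width**2))))

# Overdamped Langevin dynamics in the biased potential; a hill is deposited
# every 40 steps. The barrier is ~10 kT, so unbiased crossings are rare.
rng = np.random.default_rng(0)
x, dt, kT = -1.0, 1e-3, 0.1
centers = np.empty(0)
crossed = False
for step in range(40000):
    force = -(grad_potential(x) + grad_bias(x, centers))
    x += force * dt + np.sqrt(2 * kT * dt) * rng.standard_normal()
    if step % 40 == 0:
        centers = np.append(centers, x)    # grow the time-dependent bias
    if x > 0.8:
        crossed = True                     # escaped the starting well
```

    In real metadynamics the deposited bias also yields an estimate of the free-energy surface along the collective variable, which this sketch omits.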

  9. Demonstration of full 4×4 Mueller polarimetry through an optical fiber for endoscopic applications.

    PubMed

    Manhas, Sandeep; Vizet, Jérémy; Deby, Stanislas; Vanel, Jean-Charles; Boito, Paola; Verdier, Mireille; De Martino, Antonello; Pagnoux, Dominique

    2015-02-09

    A novel technique to measure the full 4 × 4 Mueller matrix of a sample through an optical fiber is proposed, opening the way for endoscopic applications of Mueller polarimetry for biomedical diagnosis. The technique is based on two subsequent Mueller matrices measurements: one for characterizing the fiber only, and another for the assembly of fiber and sample. From this differential measurement, we proved theoretically that the polarimetric properties of the sample can be deduced. The proof of principle was experimentally validated by measuring various polarimetric parameters of known optical components. Images of manufactured and biological samples acquired by using this approach are also presented.
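    The differential idea can be sketched in a simplified single-pass model: measure the fiber's Mueller matrix alone, then the fiber-plus-sample assembly, and strip the fiber's contribution by matrix inversion. This ignores the double pass through the fiber that the actual endoscopic geometry involves, and the fiber model (a pure rotator) and sample (a linear polarizer) are illustrative.

```python
import numpy as np

def rotator(theta):
    """Mueller matrix of a polarization rotator (a crude stand-in for the
    polarimetric contribution of the fiber)."""
    c, s = np.cos(2 * theta), np.sin(2 * theta)
    return np.array([[1, 0, 0, 0],
                     [0, c, -s, 0],
                     [0, s,  c, 0],
                     [0, 0, 0, 1]], float)

def horizontal_polarizer():
    """Mueller matrix of a horizontal linear polarizer (the 'sample')."""
    return 0.5 * np.array([[1, 1, 0, 0],
                           [1, 1, 0, 0],
                           [0, 0, 0, 0],
                           [0, 0, 0, 0]], float)

M_fiber = rotator(0.3)                   # first measurement: fiber alone
M_sample_true = horizontal_polarizer()
M_assembly = M_sample_true @ M_fiber     # second measurement: fiber + sample
# Differential step: remove the fiber's polarimetric contribution.
M_sample = M_assembly @ np.linalg.inv(M_fiber)
```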

  10. [Determination of calcium, magnesium and potassium in nurtured cell by AAS with quick-pulsed nebulization technique and NaOH base digestion].

    PubMed

    Shi, C; Gao, S; Gun, S

    1997-06-01

    The sample is digested with 6% NaOH solution, and an aliquot of 50 microl is used for protein content analysis by the Coomassie Brilliant Blue G250 method; the residue is diluted with an equal volume of 0.4% lanthanum-EDTA solution. Its calcium, magnesium and potassium contents are determined by AAS with a quick-pulsed nebulization technique. When a self-made micro-sampling device is used, only 20 microl of sample is needed, i.e. 1/10 to 1/20 of the sample volume required for conventional determination. Sensitivity, precision and recovery agree well with those of the regular wet ashing method.

  11. A comparative study of quantitative microsegregation analyses performed during the solidification of the Ni-base superalloy CMSX-10

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Seo, Seong-Moon, E-mail: castme@kims.re.kr; Jeong, Hi-Won; Ahn, Young-Keun

    Quantitative microsegregation analyses were systematically carried out during the solidification of the Ni-base superalloy CMSX-10 to clarify the methodological effect on the quantification of microsegregation and to fully understand the solidification microstructure. Three experimental techniques, namely, mushy zone quenching (MZQ), planar directional solidification followed by quenching (PDSQ), and random sampling (RS), were implemented for the analysis of microsegregation tendency and the magnitude of solute elements by electron probe microanalysis. The microprobe data and the calculation results of the diffusion field ahead of the solid/liquid (S/L) interface of PDSQ samples revealed that the liquid composition at the S/L interface is significantly influenced by quenching. By applying the PDSQ technique, it was also found that the partition coefficients of all solute elements do not change appreciably during the solidification of primary γ. All three techniques could reasonably predict the segregation behavior of most solute elements. Nevertheless, the RS approach has a tendency to overestimate the magnitude of segregation for most solute elements when compared to the MZQ and PDSQ techniques. Moreover, the segregation direction of Cr and Mo predicted by the RS approach was found to be opposite from the results obtained by the MZQ and PDSQ techniques. This conflicting segregation behavior of Cr and Mo was discussed intensively. It was shown that the formation of Cr-rich areas near the γ/γ′ eutectic in various Ni-base superalloys, including the CMSX-10 alloy, could be successfully explained by the results of microprobe analysis performed on a sample quenched during the planar directional solidification of γ/γ′ eutectic. - Highlights: • Methodological effect on the quantification of microsegregation was clarified. • The liquid composition at the S/L interface was influenced by quenching. • The segregation direction of Cr varied depending on the experimental techniques. • Cr and Mo segregation in Ni-base superalloys was fully understood.

  12. Efficient Stochastic Rendering of Static and Animated Volumes Using Visibility Sweeps.

    PubMed

    von Radziewsky, Philipp; Kroes, Thomas; Eisemann, Martin; Eisemann, Elmar

    2017-09-01

    Stochastically solving the rendering integral (particularly visibility) is the de-facto standard for physically-based light transport, but it is computationally expensive, especially when displaying heterogeneous volumetric data. In this work, we present efficient techniques to speed up the rendering process via a novel visibility-estimation method in concert with an unbiased importance sampling (involving environmental lighting and visibility inside the volume), filtering, and update techniques for both static and animated scenes. Our major contributions include a progressive estimate of partial occlusions based on a fast sweeping-plane algorithm. These occlusions are stored in an octahedral representation, which can be conveniently transformed into a quadtree-based hierarchy suited for a joint importance sampling. Further, we propose sweep-space filtering, which suppresses the occurrence of fireflies, and investigate different update schemes for animated scenes. Our technique is unbiased, requires little precomputation, is highly parallelizable, and is applicable to various volume data sets, dynamic transfer functions, animated volumes, and changing environmental lighting.

  13. Heterogeneity and frequency of BRAF mutations in primary melanoma: Comparison between molecular methods and immunohistochemistry

    PubMed Central

    Bruno, William; Martinuzzi, Claudia; Andreotti, Virginia; Pastorino, Lorenza; Spagnolo, Francesco; Dalmasso, Bruna; Cabiddu, Francesco; Gualco, Marina; Ballestrero, Alberto; Bianchi-Scarrà, Giovanna; Queirolo, Paola

    2017-01-01

    Finding the best technique to identify BRAF mutations with high sensitivity and specificity is mandatory for accurate patient selection for targeted therapy. BRAF mutation frequency ranges from 40 to 60% depending on melanoma clinical characteristics and the detection technique used. Intertumoral heterogeneity could lead to misinterpretation of BRAF mutational status; this is especially important if testing is performed on primary specimens, when metastatic lesions are unavailable. The aim of this study was to identify the best combination of methods for detecting BRAF mutations (among peptide nucleic acid (PNA)-clamping real-time PCR, immunohistochemistry and capillary sequencing) and to investigate BRAF mutation heterogeneity in a series of 100 primary melanomas and a subset of 25 matched metastatic samples. Overall, we obtained a BRAF mutation frequency of 62%, based on the combination of at least two techniques. Concordance between mutation status in primary and metastatic tumors was good but not complete (67%), when agreement of at least two techniques was considered. Next-generation sequencing was used to quantify the threshold of detected mutant alleles in discordant samples. Combining different methods excludes the possibility that the observed heterogeneity is technique-based. We propose an algorithm for BRAF mutation testing based on agreement between immunohistochemistry and PNA; a third molecular method could be added in case of discordant results. Testing the primary tumor when the metastatic sample is unavailable is a good option if at least two methods of detection are used; however, the presence of intertumoral heterogeneity or the occurrence of additional primaries should be carefully considered. PMID:28039443

  14. Microextraction techniques combined with capillary electrophoresis in bioanalysis.

    PubMed

    Kohler, Isabelle; Schappler, Julie; Rudaz, Serge

    2013-01-01

    Over the past two decades, many environmentally sustainable sample-preparation techniques have been proposed, with the objective of reducing the use of toxic organic solvents or substituting these with environmentally friendly alternatives. Microextraction techniques (MEs), in which only a small amount of organic solvent is used, have several advantages, including reduced sample volume, analysis time, and operating costs. Thus, MEs are well adapted in bioanalysis, in which sample preparation is mandatory because of the complexity of a sample that is available in small quantities (mL or even μL only). Capillary electrophoresis (CE) is a powerful and efficient separation technique in which no organic solvents are required for analysis. Combination of CE with MEs is regarded as a very attractive environmentally sustainable analytical tool, and numerous applications have been reported over the last few decades for bioanalysis of low-molecular-weight compounds or for peptide analysis. In this paper we review the use of MEs combined with CE in bioanalysis. The review is divided into two sections: liquid and solid-based MEs. A brief practical and theoretical description of each ME is given, and the techniques are illustrated by relevant applications.

  15. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
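    The proposed quantitative link can be sketched as a simple mass balance: the contaminant mass extracted from the filter dust, divided by the volume of air that passed through the filter and the filter's capture efficiency, gives a time-averaged air concentration. The formula and all parameter names and values below are an illustrative assumption, not the authors' published model.

```python
def air_concentration(mass_ng, flow_m3_per_h, runtime_h, capture_efficiency):
    """Time-averaged airborne concentration (ng/m^3) implied by the mass of a
    contaminant extracted from HVAC filter dust. Illustrative mass balance:
    everything that landed on the filter arrived with the air that flowed
    through it, scaled by how efficiently the filter captures the particle
    size range of interest."""
    air_volume_m3 = flow_m3_per_h * runtime_h      # total air filtered
    return mass_ng / (air_volume_m3 * capture_efficiency)

# Hypothetical example: 5000 ng of a contaminant recovered from a filter that
# ran 300 h at 2000 m^3/h with 60% capture efficiency for the relevant sizes.
c = air_concentration(5000, 2000, 300, 0.6)   # ng per m^3 of indoor air
```

    The hard parts in practice, which this sketch hides, are estimating the runtime and flow rate of the HVAC system and the size-resolved capture efficiency of the filter.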

  16. Liquid sample delivery techniques for serial femtosecond crystallography

    PubMed Central

    Weierstall, Uwe

    2014-01-01

    X-ray free-electron lasers overcome the problem of radiation damage in protein crystallography and allow structure determination from micro- and nanocrystals at room temperature. To ensure that consecutive X-ray pulses do not probe previously exposed crystals, the sample needs to be replaced with the X-ray repetition rate, which ranges from 120 Hz at warm linac-based free-electron lasers to 1 MHz at superconducting linacs. Liquid injectors are therefore an essential part of a serial femtosecond crystallography experiment at an X-ray free-electron laser. Here, we compare different techniques of injecting microcrystals in solution into the pulsed X-ray beam in vacuum. Sample waste due to mismatch of the liquid flow rate to the X-ray repetition rate can be addressed through various techniques. PMID:24914163

  17. Application of electrochemical impedance spectroscopy: A phase behavior study of babassu biodiesel-based microemulsions.

    PubMed

    Pereira, Thulio C; Conceição, Carlos A F; Khan, Alamgir; Fernandes, Raquel M T; Ferreira, Maira S; Marques, Edmar P; Marques, Aldaléa L B

    2016-11-05

    Microemulsions are thermodynamically stable systems of two immiscible liquids, one aqueous and the other of organic nature, with a surfactant and/or co-surfactant adsorbed at the interface between the two phases. Biodiesel-based microemulsions, consisting of alkyl esters of fatty acids, open a new means of analysis for the application of electroanalytical techniques, and are advantageous as they eliminate the required pre-treatment of a sample. In this work, the phase behaviours of biodiesel-based microemulsions were investigated through the electrochemical impedance spectroscopy (EIS) technique. We observed that an increase in the amount of biodiesel in the microemulsion formulation increases the resistance to charge transfer at the interface. Also, the electrical conductivity measurements revealed that a decrease or increase in electrical properties depends on the amount of biodiesel. EIS studies of the biodiesel-based microemulsion samples showed the presence of two capacitive arcs: one at high frequency and the other at low frequency. Thus, the formulation of microemulsions plays an important role in estimating the electrical properties through the electrochemical impedance spectroscopy technique. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maxwell, D.P.; Richardson, C.F.

    Three mercury measurement techniques were performed on synthesis gas streams before and after an amine-based sulfur removal system. The syngas was sampled using (1) gas impingers containing a nitric acid-hydrogen peroxide solution, (2) coconut-based charcoal sorbent, and (3) an on-line atomic absorption spectrophotometer equipped with a gold amalgamation trap and cold vapor cell. Various impinger solutions were applied upstream of the gold amalgamation trap to remove hydrogen sulfide and isolate oxidized and elemental species of mercury. The results from the three techniques are compared to assess their performance in reducing gas atmospheres.

  20. Calibrationless parallel magnetic resonance imaging: a joint sparsity model.

    PubMed

    Majumdar, Angshul; Chaudhury, Kunal Narayan; Ward, Rabab

    2013-12-05

    State-of-the-art parallel MRI techniques either explicitly or implicitly require certain parameters to be estimated, e.g., the sensitivity maps for SENSE and SMASH and the interpolation weights for GRAPPA and SPIRiT. Thus, all of these techniques are sensitive to the calibration (parameter estimation) stage. In this work, we have proposed a parallel MRI technique that does not require any calibration but yields reconstruction results that are on par with (or even better than) state-of-the-art methods in parallel MRI. Our proposed method requires solving non-convex analysis and synthesis prior joint-sparsity problems, and this work also derives the algorithms for solving them. Experimental validation was carried out on two datasets: an eight-channel brain scan and an eight-channel Shepp-Logan phantom. Two sampling methods were used: Variable Density Random sampling and non-Cartesian Radial sampling. An acceleration factor of 4 was used for the brain data and a factor of 6 for the phantom. The reconstruction results were quantitatively evaluated based on the Normalised Mean Squared Error between the reconstructed image and the original; the qualitative evaluation was based on the reconstructed images themselves. We compared our work with four state-of-the-art parallel imaging techniques: two calibrated methods (CS SENSE and l1SPIRiT) and two calibration-free techniques (Distributed CS and SAKE). Our method yields better reconstruction results than all of them.
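
    The quantitative evaluation above relies on the Normalised Mean Squared Error. A minimal sketch of that metric, assuming the common convention of normalising by the squared norm of the original image (the authors' exact convention is an assumption here):

```python
def nmse(original, reconstructed):
    """Normalised Mean Squared Error between two images, here flattened
    to plain sequences of pixel values: ||x - x_hat||^2 / ||x||^2."""
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return num / den

err = nmse([2.0, 0.0], [1.0, 0.0])   # (2-1)^2 / (2^2 + 0^2) = 0.25
```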

  1. Sampling free energy surfaces as slices by combining umbrella sampling and metadynamics.

    PubMed

    Awasthi, Shalini; Kapil, Venkat; Nair, Nisanth N

    2016-06-15

    Metadynamics (MTD) is a very powerful technique to sample high-dimensional free energy landscapes, and due to its self-guiding property, the method has been successful in studying complex reactions and conformational changes. MTD sampling is based on filling the free energy basins by biasing potentials, and thus for cases with flat, broad, and unbound free energy wells, the computational time to sample them becomes very large. To alleviate this problem, we combine the standard Umbrella Sampling (US) technique with MTD to sample orthogonal collective variables (CVs) in a simultaneous way. Within this scheme, we construct the equilibrium distribution of CVs from biased distributions obtained from independent MTD simulations with umbrella potentials. Reweighting is carried out by a procedure that combines US reweighting and Tiwary-Parrinello MTD reweighting within the Weighted Histogram Analysis Method (WHAM). The approach is ideal for a controlled sampling of a CV in a MTD simulation, making it computationally efficient in sampling flat, broad, and unbound free energy surfaces. This technique also allows for a distributed sampling of a high-dimensional free energy surface, further increasing the computational efficiency in sampling. We demonstrate the application of this technique in sampling high-dimensional surfaces for various chemical reactions using ab initio and QM/MM hybrid molecular dynamics simulations. Further, to carry out MTD bias reweighting for computing forward reaction barriers in ab initio or QM/MM simulations, we propose a computationally affordable approach that does not require recrossing trajectories. © 2016 Wiley Periodicals, Inc.
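
    The reweighting step above rests on recovering unbiased statistics from a simulation run under an added bias potential. A minimal single-window sketch, assuming the textbook umbrella-sampling weight w(x) proportional to exp(+beta*U_bias(x)); the paper's full scheme additionally combines this with Tiwary-Parrinello MTD reweighting inside WHAM, which is more involved:

```python
import math

def unbias_weights(samples, bias_potential, beta=1.0):
    """Normalised importance weights that undo a bias potential:
    w(x) proportional to exp(+beta * U_bias(x)). Unbiased averages
    are then <O> = sum_i w_i * O(x_i)."""
    w = [math.exp(beta * bias_potential(x)) for x in samples]
    total = sum(w)
    return [wi / total for wi in w]

# harmonic umbrella window centred at x0 (illustrative parameters)
x0, k = 1.0, 2.0
bias = lambda x: 0.5 * k * (x - x0) ** 2
probs = unbias_weights([0.9, 1.0, 1.1], bias)
```

Samples near the umbrella centre were over-represented in the biased run, so they receive the smallest corrective weights.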

  2. A comparative analysis of conventional cytopreparatory and liquid based cytological techniques (Sure Path) in evaluation of serous effusion fluids.

    PubMed

    Dadhich, Hrishikesh; Toi, Pampa Ch; Siddaraju, Neelaiah; Sevvanthi, Kalidas

    2016-11-01

    Clinically, detection of malignant cells in serous body fluids is critical, as their presence implies upstaging of the disease. Cytology of body cavity fluids serves as an important tool when other diagnostic tests cannot be performed. In most laboratories, effusion fluid samples are currently analysed chiefly by the conventional cytopreparatory (CCP) technique. Although there are several studies comparing liquid-based cytology (LBC) with the CCP technique in the field of cervicovaginal cytology, the literature on such comparisons for serous body fluid examination is sparse. One hundred samples of serous body fluids were processed by both the CCP and LBC techniques. Slides prepared by these techniques were studied using six parameters, and a comparative analysis of the advantages and disadvantages of the techniques in detecting malignant cells was carried out with appropriate statistical tests. The samples comprised 52 pleural, 44 peritoneal and four pericardial fluids. No statistically significant difference was noted with respect to cellularity (P = 0.22), cell distribution (P = 0.39) or diagnosis of malignancy (P = 0.20). As for the remaining parameters, LBC provided a statistically significantly clearer smear background (P < 0.0001) and shorter screening time (P < 0.0001), while the CCP technique provided significantly better staining quality (P = 0.01) and sharper cytomorphologic features (P = 0.05). Although a reduced screening time and a clearer smear background are the two major advantages of LBC, the CCP technique provides better staining quality with sharper cytomorphologic features, which is more critical from the standpoint of cytologic interpretation. Diagn. Cytopathol. 2016;44:874-879. © 2016 Wiley Periodicals, Inc.

  3. Three-dimensional dominant frequency mapping using autoregressive spectral analysis of atrial electrograms of patients in persistent atrial fibrillation.

    PubMed

    Salinet, João L; Masca, Nicholas; Stafford, Peter J; Ng, G André; Schlindwein, Fernando S

    2016-03-08

    Areas with high frequency activity within the atrium are thought to be 'drivers' of the rhythm in patients with atrial fibrillation (AF), and ablation of these areas seems to be an effective therapy in eliminating the dominant frequency (DF) gradient and restoring sinus rhythm. Clinical groups have applied the traditional FFT-based approach to generate three-dimensional DF (3D DF) maps during electrophysiology (EP) procedures, but the literature on alternative spectral estimation techniques, which can offer better frequency resolution than FFT-based spectral estimation, is limited. Autoregressive (AR) model-based spectral estimation techniques, with emphasis on selection of appropriate sampling rate and AR model order, were implemented to generate high-density 3D DF maps of atrial electrograms (AEGs) in persistent atrial fibrillation (persAF). For each patient, 2048 simultaneous AEGs were recorded for 20.478 s-long segments in the left atrium (LA) and exported for analysis, together with their anatomical locations. After the DFs were identified using AR-based spectral estimation, they were colour coded to produce sequential 3D DF maps. These maps were systematically compared with maps found using the Fourier-based approach. 3D DF maps can be obtained using AR-based spectral estimation after AEG downsampling (DS), and the resulting maps are very similar to those obtained using FFT-based spectral estimation (mean agreement 90.23%). There were no significant differences between AR techniques (p = 0.62). The processing time for the AR-based approach was considerably shorter (reduced from 5.44 to 5.05 s) when lower sampling frequencies and model order values were used. Higher levels of DS presented higher rates of DF agreement (sampling frequency of 37.5 Hz).
We have demonstrated the feasibility of using AR spectral estimation methods for producing 3D DF maps and characterised their differences to the maps produced using the FFT technique, offering an alternative approach for 3D DF computation in human persAF studies.
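
    AR-based dominant frequency estimation can be illustrated with a generic Yule-Walker/Levinson-Durbin sketch (not the authors' implementation; the 37.5 Hz rate echoes the downsampling mentioned above, while the test signal, model order, and frequency grid are arbitrary assumptions):

```python
import math, random

def yule_walker(x, order):
    """AR(p) coefficients via the Levinson-Durbin recursion on biased
    sample autocorrelations. Model: x[t] = sum_k a[k]*x[t-k] + e[t]."""
    n = len(x)
    mean = sum(x) / n
    xc = [v - mean for v in x]
    r = [sum(xc[i] * xc[i + k] for i in range(n - k)) / n
         for k in range(order + 1)]
    a = [0.0] * (order + 1)
    err = r[0]
    for k in range(1, order + 1):
        lam = (r[k] - sum(a[j] * r[k - j] for j in range(1, k))) / err
        a_new = a[:]
        a_new[k] = lam
        for j in range(1, k):
            a_new[j] = a[j] - lam * a[k - j]
        a = a_new
        err *= (1.0 - lam * lam)
    return a[1:], err

def dominant_frequency(x, fs, order=8, nfreq=512):
    """Frequency (Hz) at the peak of the AR power spectrum
    sigma2 / |1 - sum_k a_k exp(-i*w*k)|^2, scanned up to Nyquist."""
    a, sigma2 = yule_walker(x, order)
    best_f, best_p = 0.0, -1.0
    for i in range(1, nfreq):
        f = 0.5 * fs * i / nfreq
        w = 2.0 * math.pi * f / fs
        re = 1.0 - sum(ak * math.cos(w * (k + 1)) for k, ak in enumerate(a))
        im = sum(ak * math.sin(w * (k + 1)) for k, ak in enumerate(a))
        p = sigma2 / (re * re + im * im)
        if p > best_p:
            best_f, best_p = f, p
    return best_f

random.seed(0)
fs = 37.5                        # Hz, echoing the downsampled rate in the study
t = [i / fs for i in range(256)]
sig = [math.sin(2 * math.pi * 6.0 * ti) + 0.05 * random.gauss(0, 1) for ti in t]
df = dominant_frequency(sig, fs)
print(df)                        # close to the 6 Hz tone
```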

  4. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in the overall classification accuracies.

  5. Recent Application of Solid Phase Based Techniques for Extraction and Preconcentration of Cyanotoxins in Environmental Matrices.

    PubMed

    Mashile, Geaneth Pertunia; Nomngongo, Philiswa N

    2017-03-04

    Cyanotoxins are toxic and are found in eutrophic, municipal, and residential water supplies. For this reason, their occurrence in drinking water systems has become a global concern. Therefore, monitoring, control, risk assessment, and prevention of these contaminants in the environmental bodies are important subjects associated with public health. Thus, rapid, sensitive, selective, simple, and accurate analytical methods for the identification and determination of cyanotoxins are required. In this paper, the sampling methodologies and applications of solid phase-based sample preparation methods for the determination of cyanotoxins in environmental matrices are reviewed. The sample preparation techniques mainly include solid phase micro-extraction (SPME), solid phase extraction (SPE), and solid phase adsorption toxin tracking technology (SPATT). In addition, advantages and disadvantages and future prospects of these methods have been discussed.

  6. Simulations of multi-contrast x-ray imaging using near-field speckles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zdora, Marie-Christine; Diamond Light Source, Harwell Science and Innovation Campus, Didcot, Oxfordshire, OX11 0DE, United Kingdom and Department of Physics & Astronomy, University College London, London, WC1E 6BT; Thibault, Pierre

    2016-01-28

    X-ray dark-field and phase-contrast imaging using near-field speckles is a novel technique that overcomes limitations inherent in conventional absorption x-ray imaging, i.e. poor contrast for features with similar density. Speckle-based imaging yields a wealth of information with a simple setup tolerant to polychromatic and divergent beams, and simple data acquisition and analysis procedures. Here, we present a simulation software used to model the image formation with the speckle-based technique, and we compare simulated results on a phantom sample with experimental synchrotron data. Thorough simulation of a speckle-based imaging experiment will help for better understanding and optimising the technique itself.

  7. Method for improving accuracy in full evaporation headspace analysis.

    PubMed

    Xie, Wei-Qi; Chai, Xin-Sheng

    2017-05-01

    We report a new headspace analytical method in which multiple headspace extraction is combined with the full evaporation technique. The pressure uncertainty caused by changes in the solid content of the samples has a great impact on measurement accuracy in conventional full evaporation headspace analysis. The results (using ethanol solution as the model sample) showed that the present technique effectively minimizes this problem. The proposed full evaporation multiple headspace extraction technique is also automated and practical, and could greatly broaden the applications of full-evaporation-based headspace analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
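
    Multiple headspace extraction conventionally exploits the geometric decay of peak areas across successive extractions to estimate the total analyte signal. A sketch based on that standard relation (A_i = A_1 * q**(i-1), total = A_1 / (1 - q)); this is the generic MHE identity, not the authors' specific procedure:

```python
import math

def mhe_total_area(areas):
    """Total analyte signal from multiple headspace extractions.
    Peak areas are assumed to decay geometrically, A_i = A_1 * q**(i-1),
    so the infinite-series sum is A_1 / (1 - q). The decay ratio q is
    estimated from the slope of ln(A_i) versus extraction number."""
    n = len(areas)
    xs = list(range(n))
    ys = [math.log(a) for a in areas]
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
            / sum((x - xbar) ** 2 for x in xs)
    q = math.exp(slope)
    return areas[0] / (1 - q)

# exact geometric decay with q = 0.5: total = 100 / (1 - 0.5)
total = mhe_total_area([100.0, 50.0, 25.0, 12.5])   # ≈ 200
```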

  8. Surveillance of Endoscopes: Comparison of Different Sampling Techniques.

    PubMed

    Cattoir, Lien; Vanzieleghem, Thomas; Florin, Lisa; Helleputte, Tania; De Vos, Martine; Verhasselt, Bruno; Boelens, Jerina; Leroux-Roels, Isabel

    2017-09-01

    OBJECTIVE To compare different techniques of endoscope sampling to assess residual bacterial contamination. DESIGN Diagnostic study. SETTING The endoscopy unit of a 1,100-bed university hospital performing ~13,000 endoscopic procedures annually. METHODS In total, 4 sampling techniques, combining flushing fluid with or without a commercial endoscope brush, were compared in an endoscope model. Based on these results, sterile physiological saline flushing with or without a PULL THRU brush was selected for evaluation on 40 flexible endoscopes by adenosine triphosphate (ATP) measurement and bacterial culture. Acceptance criteria from the French National guideline (<25 colony-forming units [CFU] per endoscope and absence of indicator microorganisms) were used as part of the evaluation. RESULTS On biofilm-coated PTFE tubes, physiological saline in combination with a PULL THRU brush generated higher mean ATP values (2,579 relative light units [RLU]) compared with saline alone (1,436 RLU; P=.047). In the endoscope samples, culture yield using saline plus the PULL THRU brush (mean, 43 CFU; range, 1-400 CFU) was significantly higher than that of saline alone (mean, 17 CFU; range, 0-500 CFU; P<.001). In samples obtained using the saline+PULL THRU brush method, ATP values of samples classified as unacceptable were significantly higher than those of samples classified as acceptable (P=.001). CONCLUSION Physiological saline flushing combined with a PULL THRU brush to sample endoscopes generated higher ATP values and increased the yield of microbial surveillance culture. Consequently, the acceptance rate of endoscopes based on a defined CFU limit was significantly lower when the saline+PULL THRU method was used instead of saline alone. Infect Control Hosp Epidemiol 2017;38:1062-1069.

  9. Development of a PCR technique specific for Demodex injai in biological specimens.

    PubMed

    Sastre, N; Ravera, I; Ferreira, D; Altet, L; Sánchez, A; Bardagí, M; Francino, O; Ferrer, L

    2013-09-01

    The identification of Demodex injai as a second Demodex species of dog opened new questions and challenges in the understanding on the Demodex-host relationships. In this paper, we describe the development of a conventional PCR technique based on published genome sequences of D. injai from GenBank that specifically detects DNA from D. injai. This technique amplifies a 238-bp fragment corresponding to a region of the mitochondrial 16S rDNA of D. injai. The PCR was positive in DNA samples obtained from mites identified morphologically as D. injai, which served as positive controls, as well as in samples from three cases of demodicosis associated with proliferation of mites identified as D. injai. Furthermore, the PCR was positive in 2 out of 19 healthy dogs. Samples of Demodex canis and Demodex folliculorum were consistently negative. Skin samples from seven dogs with generalized demodicosis caused by D. canis were all negative in the D. injai-specific PCR, demonstrating that in generalized canine demodicosis, mite proliferation is species-specific. This technique can be a useful tool in the diagnosis and in epidemiologic and pathogenic studies.

  10. Estimation of Microbial Concentration in Food Products from Qualitative, Microbiological Test Data with the MPN Technique.

    PubMed

    Fujikawa, Hiroshi

    2017-01-01

    Microbial concentration in samples of a food product lot has been generally assumed to follow the log-normal distribution in food sampling, but this distribution cannot accommodate the concentration of zero. In the present study, first, a probabilistic study with the most probable number (MPN) technique was done for a target microbe present at a low (or zero) concentration in food products. Namely, based on the number of target pathogen-positive samples in the total samples of a product found by a qualitative, microbiological examination, the concentration of the pathogen in the product was estimated by means of the MPN technique. The effects of the sample size and the total sample number of a product were then examined. Second, operating characteristic (OC) curves for the concentration of a target microbe in a product lot were generated on the assumption that the concentration of a target microbe could be expressed with the Poisson distribution. OC curves for Salmonella and Cronobacter sakazakii in powdered formulae for infants and young children were successfully generated. The present study suggested that the MPN technique and the Poisson distribution would be useful for qualitative microbiological test data analysis for a target microbe whose concentration in a lot is expected to be low.
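
    For a single dilution level, the Poisson assumption above gives P(sample negative) = exp(-c*v), so the concentration can be estimated from the observed negative fraction. A minimal sketch with illustrative numbers (real MPN schemes combine several dilution levels and use published tables):

```python
import math

def mpn_single_dilution(total, positive, volume_ml):
    """MPN estimate from one dilution level: under a Poisson model the
    probability that a sample of volume v is negative is exp(-c*v),
    estimated here by the observed negative fraction."""
    negative = total - positive
    if negative == 0:
        raise ValueError("all samples positive: the MPN is unbounded")
    return -math.log(negative / total) / volume_ml

# 3 positive samples out of 10, each containing 1 mL of product suspension
c = mpn_single_dilution(10, 3, 1.0)
print(round(c, 3))   # → 0.357 organisms per mL
```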

  11. Direct typing of Canine parvovirus (CPV) from infected dog faeces by rapid mini sequencing technique.

    PubMed

    V, Pavana Jyothi; S, Akila; Selvan, Malini K; Naidu, Hariprasad; Raghunathan, Shwethaa; Kota, Sathish; Sundaram, R C Raja; Rana, Samir Kumar; Raj, G Dhinakar; Srinivasan, V A; Mohana Subramanian, B

    2016-12-01

    Canine parvovirus (CPV) is a non-enveloped single stranded DNA virus with an icosahedral capsid. Mini-sequencing based CPV typing was developed earlier to detect and differentiate all the CPV types and FPV in a single reaction. This technique was further evaluated in the present study by performing the mini-sequencing directly from fecal samples which avoided tedious virus isolation steps by cell culture system. Fecal swab samples were collected from 84 dogs with enteritis symptoms, suggestive of parvoviral infection from different locations across India. Seventy six of these samples were positive by PCR; the subsequent mini-sequencing reaction typed 74 of them as type 2a virus, and 2 samples as type 2b. Additionally, 25 of the positive samples were typed by cycle sequencing of PCR products. Direct CPV typing from fecal samples using mini-sequencing showed 100% correlation with CPV typing by cycle sequencing. Moreover, CPV typing was achieved by mini-sequencing even with faintly positive PCR amplicons which was not possible by cycle sequencing. Therefore, the mini-sequencing technique is recommended for regular epidemiological follow up of CPV types, since the technique is rapid, highly sensitive and high capacity method for CPV typing. Copyright © 2016. Published by Elsevier B.V.

  12. Biosensor-based microRNA detection: techniques, design, performance, and challenges.

    PubMed

    Johnson, Blake N; Mutharasan, Raj

    2014-04-07

    The current state of biosensor-based techniques for amplification-free microRNA (miRNA) detection is critically reviewed. Comparison with non-sensor and amplification-based molecular techniques (MTs), such as polymerase-based methods, is made in terms of transduction mechanism, associated protocol, and sensitivity. Challenges associated with miRNA hybridization thermodynamics which affect assay selectivity and amplification bias are briefly discussed. Electrochemical, electromechanical, and optical classes of miRNA biosensors are reviewed in terms of transduction mechanism, limit of detection (LOD), time-to-results (TTR), multiplexing potential, and measurement robustness. Current trends suggest that biosensor-based techniques (BTs) for miRNA assay will complement MTs due to the advantages of amplification-free detection, LOD being femtomolar (fM)-attomolar (aM), short TTR, multiplexing capability, and minimal sample preparation requirement. Areas of future importance in miRNA BT development are presented which include focus on achieving high measurement confidence and multiplexing capabilities.
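
    Limits of detection such as those quoted above are often computed with the common 3.3*sd(blank)/slope convention; whether each reviewed study follows exactly this convention is an assumption, and the replicate signals and slope below are hypothetical:

```python
import statistics

def limit_of_detection(blank_signals, slope):
    """LOD = 3.3 * sd(blank replicates) / calibration slope
    (a widely used IUPAC-style convention)."""
    return 3.3 * statistics.stdev(blank_signals) / slope

# hypothetical blank replicate signals and a slope in signal units per fM
lod_fM = limit_of_detection([0.10, 0.12, 0.11, 0.09, 0.13], 0.5)
```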

  13. Effect of background correction on peak detection and quantification in online comprehensive two-dimensional liquid chromatography using diode array detection.

    PubMed

    Allen, Robert C; John, Mallory G; Rutan, Sarah C; Filgueira, Marcelo R; Carr, Peter W

    2012-09-07

    A singular value decomposition-based background correction (SVD-BC) technique is proposed for the reduction of background contributions in online comprehensive two-dimensional liquid chromatography (LC×LC) data. The SVD-BC technique was compared to simply subtracting a blank chromatogram from a sample chromatogram and to a previously reported background correction technique for one-dimensional chromatography, which uses an asymmetric weighted least squares (AWLS) approach. AWLS was the only background correction technique to completely remove the background artifacts from the samples as evaluated by visual inspection. However, the SVD-BC technique greatly reduced or eliminated the background artifacts as well and preserved the peak intensity better than AWLS. The loss in peak intensity by AWLS resulted in lower peak counts at the detection thresholds established using standard samples. However, the SVD-BC technique was found to introduce noise which led to detection of false peaks at the lower detection thresholds. As a result, the AWLS technique gave more precise peak counts than the SVD-BC technique, particularly at the lower detection thresholds. While the AWLS technique resulted in more consistent percent residual standard deviation values, a statistical improvement in peak quantification after background correction was not found regardless of the background correction technique used. Copyright © 2012 Elsevier B.V. All rights reserved.
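
    The idea behind SVD-based background correction is to estimate the dominant low-rank background component of the data matrix and subtract its reconstruction. A pure-Python sketch using power iteration to remove a rank-1 component (illustrative only; the published SVD-BC procedure differs in how background components are identified):

```python
def rank1_background_removal(data, iters=200):
    """Estimate the leading singular vectors of a data matrix by power
    iteration and subtract the rank-1 reconstruction, i.e. remove the
    dominant (background-like) component. data: list of rows, e.g.
    spectra collected at successive time points."""
    m, n = len(data), len(data[0])
    v = [1.0] * n
    for _ in range(iters):
        # u = A v, then v = A^T u, then normalise v
        u = [sum(data[i][j] * v[j] for j in range(n)) for i in range(m)]
        v = [sum(data[i][j] * u[i] for i in range(m)) for j in range(n)]
        norm = sum(x * x for x in v) ** 0.5
        v = [x / norm for x in v]
    # with unit v, u = A v already carries the singular value, so the
    # rank-1 reconstruction is the outer product u v^T
    u = [sum(data[i][j] * v[j] for j in range(n)) for i in range(m)]
    return [[data[i][j] - u[i] * v[j] for j in range(n)] for i in range(m)]

# a matrix that is exactly rank 1 is reduced to (numerically) zero
flat = rank1_background_removal([[2.0, 4.0], [1.0, 2.0]])
```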

  14. Feasibility of track-based multiple scattering tomography

    NASA Astrophysics Data System (ADS)

    Jansen, H.; Schütze, P.

    2018-04-01

    We present a tomographic technique making use of a gigaelectronvolt electron beam for the determination of the material budget distribution of centimeter-sized objects by means of simulations and measurements. In both cases, the trajectory of electrons traversing a sample under test is reconstructed using a pixel beam-telescope. The width of the deflection angle distribution of electrons undergoing multiple Coulomb scattering at the sample is estimated. Basing the sinogram on position-resolved estimators enables the reconstruction of the original sample using an inverse radon transform. We exemplify the feasibility of this tomographic technique via simulations of two structured cubes—made of aluminium and lead—and via an in-beam measured coaxial adapter. The simulations yield images with FWHM edge resolutions of (177 ± 13) μm and a contrast-to-noise ratio of 5.6 ± 0.2 (7.8 ± 0.3) for aluminium (lead) compared to air. The tomographic reconstruction of a coaxial adapter serves as experimental evidence of the technique and yields a contrast-to-noise ratio of 15.3 ± 1.0 and a FWHM edge resolution of (117 ± 4) μm.

  15. Coarse kMC-based replica exchange algorithms for the accelerated simulation of protein folding in explicit solvent.

    PubMed

    Peter, Emanuel K; Shea, Joan-Emma; Pivkin, Igor V

    2016-05-14

    In this paper, we present a coarse replica exchange molecular dynamics (REMD) approach based on kinetic Monte Carlo (kMC). The new development can significantly reduce the number of replicas and the computational cost needed to enhance sampling in protein simulations. We introduce two different methods, which differ primarily in the exchange scheme between the parallel ensembles. We apply this approach to the folding of two different β-stranded peptides: the C-terminal β-hairpin fragment of GB1 and TrpZip4. Additionally, we use the new simulation technique to study the folding of TrpCage, a small fast-folding α-helical peptide. Subsequently, we apply the new methodology to conformational changes involved in signaling of the light-oxygen-voltage (LOV) sensitive domain from Avena sativa (AsLOV2). Our results agree well with data reported in the literature. In simulations of dialanine, we compare the statistical sampling of the two techniques with conventional REMD and analyze their performance. The new techniques can reduce the computational cost of REMD significantly and can be used in enhanced sampling simulations of biomolecules.
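
    For reference, conventional temperature REMD (the baseline the coarse kMC-based scheme is compared against) exchanges configurations between replicas with the standard Metropolis criterion p = min(1, exp((beta_i - beta_j)*(E_i - E_j))). A sketch of that criterion; the paper's methods modify the exchange step itself:

```python
import math, random

def swap_probability(beta_i, beta_j, e_i, e_j):
    """Metropolis acceptance probability for exchanging the configurations
    of two replicas at inverse temperatures beta_i, beta_j whose current
    potential energies are e_i, e_j."""
    return min(1.0, math.exp((beta_i - beta_j) * (e_i - e_j)))

def attempt_swap(beta_i, beta_j, e_i, e_j, rng):
    return rng.random() < swap_probability(beta_i, beta_j, e_i, e_j)

# when the colder replica (larger beta) holds the higher energy,
# the swap is always accepted; the reverse direction is rare
p_always = swap_probability(1.0, 0.5, -5.0, -10.0)   # 1.0
p_rare = swap_probability(1.0, 0.5, -10.0, -5.0)     # exp(-2.5)
```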

  16. Comparison of two novel in-syringe dispersive liquid-liquid microextraction techniques for the determination of iodide in water samples using spectrophotometry.

    PubMed

    Kaykhaii, Massoud; Sargazi, Mona

    2014-01-01

    Two new, rapid methodologies have been developed and applied successfully for the determination of trace levels of iodide in real water samples. Both techniques are based on a combination of in-syringe dispersive liquid-liquid microextraction (IS-DLLME) and micro-volume UV-Vis spectrophotometry. In the first technique, iodide is oxidized with nitrous acid to the colorless ICl2(-) anion at a high concentration of hydrochloric acid; Rhodamine B is added, and by means of one-step IS-DLLME the ion pair formed is extracted into toluene and measured spectrophotometrically, with acetone as the dispersive solvent. The second method is based on IS-DLLME of the colored ion pair formed between iodide and the 1,10-phenanthroline-iron(II) chelate cation into nitrobenzene, with methanol as the dispersive solvent. Optimal conditions for iodide extraction were determined for both approaches, and the methods are compared in terms of analytical parameters such as precision, accuracy, speed and limit of detection. Both methods were successfully applied to determining iodide in tap and river water samples. Copyright © 2013 Elsevier B.V. All rights reserved.
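
    Spectrophotometric quantification of the extracted ion pairs ultimately rests on a linear calibration curve. A generic least-squares sketch with hypothetical absorbance and concentration values (not data from the study):

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    slope = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) \
            / sum((xi - xbar) ** 2 for xi in x)
    return slope, ybar - slope * xbar

# hypothetical absorbances of iodide standards (concentrations in ug/L)
conc = [0.0, 10.0, 20.0, 40.0]
absorbance = [0.01, 0.11, 0.21, 0.41]
m, b = fit_line(conc, absorbance)
unknown = (0.26 - b) / m   # concentration of a sample reading A = 0.26; ≈ 25 ug/L
```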

  17. Approaches to Recruiting ‘Hard-To-Reach’ Populations into Research: A Review of the Literature

    PubMed Central

    Shaghaghi, Abdolreza; Bhopal, Raj S; Sheikh, Aziz

    2011-01-01

    Background: ‘Hard-to-reach’ is a term used to describe those sub-groups of the population that may be difficult to reach or involve in research or public health programmes. Applying a single term to these sub-sections of populations implies a homogeneity within distinct groups that does not necessarily exist. Various sampling techniques have been introduced to recruit hard-to-reach populations. In this article, we review a range of approaches that have been used to widen participation in studies. Methods: We performed a PubMed and Google search for relevant English-language articles using the keywords and phrases: (hard-to-reach AND population* OR sampl*), (hidden AND population* OR sample*) and (“hard to reach” AND population* OR sample*), and consulted the retrieved articles’ bibliographies to extract empirical evidence from publications that discussed or examined the use of sampling techniques to recruit hidden or hard-to-reach populations in health studies. Results: The literature review identified a range of techniques to recruit hard-to-reach populations, including snowball sampling, respondent-driven sampling (RDS), indigenous field worker sampling (IFWS), facility-based sampling (FBS), targeted sampling (TS), time-location (space) sampling (TLS), conventional cluster sampling (CCS) and capture-recapture sampling (CR). Conclusion: The degree of compliance with a study by a certain ‘hard-to-reach’ group depends on the characteristics of that group, the recruitment technique used and the subject of interest.
    Irrespective of the potential advantages or limitations of the recruitment techniques reviewed, their successful use depends mainly upon our knowledge of the specific characteristics of the target populations. Thus, in line with attempts to expand the current boundaries of our knowledge about recruitment techniques in health studies and their applications in varying situations, we should also focus on all contributing factors which may have an impact on the participation rate within a defined population group. PMID:24688904
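
    Snowball sampling, the first technique listed above, can be sketched as bounded breadth-first recruitment over a contact network. This is a toy model under strong assumptions (a known contact graph and random referral); real recruitment depends on participant behaviour:

```python
import random
from collections import deque

def snowball_sample(contacts, seeds, waves, k, rng):
    """Snowball sampling sketch: starting from seed participants, each
    recruit names up to k not-yet-sampled contacts, for a fixed number
    of recruitment waves. `contacts` maps person -> list of peers."""
    sampled = set(seeds)
    frontier = deque((s, 0) for s in seeds)
    while frontier:
        person, wave = frontier.popleft()
        if wave >= waves:
            continue
        peers = [p for p in contacts.get(person, []) if p not in sampled]
        for p in rng.sample(peers, min(k, len(peers))):
            sampled.add(p)
            frontier.append((p, wave + 1))
    return sampled

contacts = {"a": ["b", "c"], "b": ["a", "d"], "c": ["a"], "d": ["b"]}
s = snowball_sample(contacts, ["a"], waves=2, k=2, rng=random.Random(0))
print(sorted(s))   # two waves reach the whole toy network
```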

  18. System health monitoring using multiple-model adaptive estimation techniques

    NASA Astrophysics Data System (ADS)

    Sifford, Stanley Ryan

    Monitoring system health for fault detection and diagnosis by tracking system parameters concurrently with state estimates is approached using a new multiple-model adaptive estimation (MMAE) method. This novel method is called GRid-based Adaptive Parameter Estimation (GRAPE). GRAPE expands existing MMAE methods by using new techniques to sample the parameter space, under the hypothesis that sample models can be applied and resampled without relying on a predefined set of models. GRAPE is initially implemented in a linear framework using Kalman filter models; a more generalized GRAPE formulation is presented using extended Kalman filter (EKF) models to represent nonlinear systems. GRAPE can handle both time-invariant and time-varying systems, as it is designed to track parameter changes. Two techniques are presented to generate parameter samples for the parallel filter models. The first approach is called selected grid-based stratification (SGBS); SGBS divides the parameter space into equally spaced strata. The second approach uses Latin Hypercube Sampling (LHS) to determine the parameter locations and minimize the total number of required models. LHS is particularly useful as the parameter dimensions grow, since adding more parameters does not require the model count to increase. Each resample is independent of the prior sample set other than the location of the parameter estimate. SGBS and LHS can be used for both the initial sample and subsequent resamples, and resamples are not required to use the same technique. Both techniques are demonstrated for both linear and nonlinear frameworks. The GRAPE framework further formalizes the parameter tracking process through a general approach for nonlinear systems.
These additional methods allow GRAPE to either narrow the focus to converged values within a parameter range or expand the range in the appropriate direction to track the parameters outside the current parameter range boundary. Customizable rules define the specific resample behavior when the GRAPE parameter estimates converge. Convergence itself is determined from the derivatives of the parameter estimates using a simple moving average window to filter out noise. The system can be tuned to match the desired performance goals by making adjustments to parameters such as the sample size, convergence criteria, resample criteria, initial sampling method, resampling method, confidence in prior sample covariances, sample delay, and others.
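
    The Latin Hypercube Sampling step described above can be sketched as follows: each parameter dimension is split into equally probable strata, each stratum is used exactly once, and the pairing of strata across dimensions is randomised. This is a generic LHS implementation, not the GRAPE code:

```python
import random

def latin_hypercube(n_samples, bounds, rng):
    """Latin hypercube sample: each dimension's range is divided into
    n_samples equal strata, each stratum contributes exactly one value,
    and the pairing of strata across dimensions is randomised."""
    cols = []
    for lo, hi in bounds:
        strata = list(range(n_samples))
        rng.shuffle(strata)
        width = (hi - lo) / n_samples
        cols.append([lo + (s + rng.random()) * width for s in strata])
    return [tuple(col[i] for col in cols) for i in range(n_samples)]

# 5 points over two parameter ranges; adding a dimension would not
# change the number of points, the property the abstract highlights
pts = latin_hypercube(5, [(0.0, 1.0), (-2.0, 2.0)], random.Random(42))
```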

  19. Evaluation of mechanical properties of Aluminum-Copper cold sprayed and alloy 625 wire arc sprayed coatings

    NASA Astrophysics Data System (ADS)

    Bashirzadeh, Milad

    This study examines microstructure-based mechanical properties of an Al-Cu composite deposited by cold spraying and of a wire arc sprayed nickel-based alloy 625 coating, using numerical modeling and experimental techniques. The microhardness and elastic modulus of the samples were determined using the Knoop hardness technique, with hardness measured in both the transverse and longitudinal directions on the sample cross-sections. An image-based finite element simulation algorithm was employed to determine the mechanical properties through an inverse analysis. In addition, mechanical tests including tensile, bending, and nano-indentation tests were performed on alloy 625 wire arc sprayed samples. Overall, results from the experimental tests are in relatively good agreement for the deposited Al-Cu composites and the alloy 625 coating. However, results obtained from numerical simulation are significantly higher in value than the experimentally obtained results. Examination and comparison of the results strongly indicate the influence of microstructure characteristics on the mechanical properties of thermally spray deposited coatings.
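
    The Knoop microhardness measurements mentioned above follow the standard relation HK = 14.229 * P / d**2, with the load P in kgf and the long indentation diagonal d in mm. A sketch with illustrative values (not the study's measurements):

```python
def knoop_hardness(load_kgf, diagonal_mm):
    """Knoop hardness number HK = 14.229 * P / d**2,
    with load P in kgf and long diagonal d in mm."""
    return 14.229 * load_kgf / diagonal_mm ** 2

hk = knoop_hardness(0.3, 0.15)   # about 190 HK for these illustrative values
```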

  20. An ARMS-based technique for sex determination of red panda (Ailurus fulgens).

    PubMed

    Li, Yuzhi; Xu, Xiao; Zhang, Liang; Zhang, Zhihe; Shen, Fujun; Zhang, Wenping; Yue, Bisong

    2011-03-01

Molecular sexing is a key component in the investigation of wild populations. In this study, we developed a fast, accurate and reliable amplification refractory mutation system (ARMS) technique for sex determination of the red panda based on exon 4 of the ZFX/ZFY gene. The amplicons were distinguished simply by agarose gel electrophoresis, exhibiting one fragment in females (X: 300 bp) and two in males (X: 300 bp, Y: 166 bp). The robustness of this ARMS system was confirmed by testing 43 captive red pandas using DNA samples of known sex and 10 wild red pandas using faecal DNA samples of unknown sex. © 2010 Blackwell Publishing Ltd.

  1. Detection of Mycoplasma hyopneumoniae by polymerase chain reaction in swine presenting respiratory problems

    PubMed Central

    Yamaguti, M.; Muller, E.E.; Piffer, A.I.; Kich, J.D.; Klein, C.S.; Kuchiishi, S.S.

    2008-01-01

Since Mycoplasma hyopneumoniae isolation in appropriate media is a difficult task and impractical for daily routine diagnostics, Nested-PCR (N-PCR) techniques are currently used to improve the direct diagnostic sensitivity of Swine Enzootic Pneumonia. In a first experiment, this paper describes an N-PCR technique optimization based on three variables: different sampling sites, sample transport media, and DNA extraction methods, using eight pigs. Based on the optimization results, a second experiment was conducted to test validity using 40 animals. In conclusion, the results of the N-PCR optimization and validation allow us to recommend this test as a routine monitoring diagnostic method for Mycoplasma hyopneumoniae infection in swine herds. PMID:24031248

  2. Detection of Cryptosporidium and Cyclospora Oocysts from Environmental Water for Drinking and Recreational Activities in Sarawak, Malaysia.

    PubMed

    Bilung, Lesley Maurice; Tahar, Ahmad Syatir; Yunos, Nur Emyliana; Apun, Kasing; Lim, Yvonne Ai-Lian; Nillian, Elexson; Hashim, Hashimatul Fatma

    2017-01-01

Cryptosporidiosis and cyclosporiasis are caused by waterborne coccidian protozoan parasites of the genera Cryptosporidium and Cyclospora, respectively. This study was conducted to detect Cryptosporidium and Cyclospora oocysts from environmental water abstracted by drinking water treatment plants and recreational activities in Sarawak, Malaysia. Water samples (12 each) were collected from Sungai Sarawak Kanan in Bau and Sungai Sarawak Kiri in Batu Kitang, respectively. In addition, 6 water samples each were collected from Ranchan Recreational Park and UNIMAS Lake at Universiti Malaysia Sarawak, Kota Samarahan, respectively. Water physicochemical parameters were also recorded. All samples were concentrated by the iron sulfate flocculation method followed by the sucrose flotation technique. Cryptosporidium and Cyclospora were detected by the modified Ziehl-Neelsen technique. Correlation of the parasite distribution with water physicochemical parameters was analysed using bivariate Pearson correlation. Based on the 24 total samples of environmental water abstracted by drinking water treatment plants, all the samples (24/24; 100%) were positive with Cryptosporidium, and only 2 samples (2/24; 8.33%) were positive with Cyclospora. Based on the 12 total samples of water for recreational activities, 4 samples (4/12; 33%) were positive with Cryptosporidium, while 2 samples (2/12; 17%) were positive with Cyclospora. Cryptosporidium oocysts were negatively correlated with dissolved oxygen (DO).
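The bivariate Pearson correlation used here to relate oocyst counts to physicochemical parameters can be computed as follows. The oocyst counts and dissolved-oxygen values are invented for illustration; they are not taken from the study.

```python
import math

def pearson(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical oocyst counts vs. dissolved oxygen (mg/L), illustration only:
oocysts = [12, 9, 7, 5, 3, 2]
do_mgl = [4.1, 5.0, 5.8, 6.5, 7.2, 8.0]
r = pearson(oocysts, do_mgl)   # strongly negative, as the study reports for DO
```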

  3. Detection of Cryptosporidium and Cyclospora Oocysts from Environmental Water for Drinking and Recreational Activities in Sarawak, Malaysia

    PubMed Central

    Tahar, Ahmad Syatir; Yunos, Nur Emyliana; Apun, Kasing; Nillian, Elexson; Hashim, Hashimatul Fatma

    2017-01-01

Cryptosporidiosis and cyclosporiasis are caused by waterborne coccidian protozoan parasites of the genera Cryptosporidium and Cyclospora, respectively. This study was conducted to detect Cryptosporidium and Cyclospora oocysts from environmental water abstracted by drinking water treatment plants and recreational activities in Sarawak, Malaysia. Water samples (12 each) were collected from Sungai Sarawak Kanan in Bau and Sungai Sarawak Kiri in Batu Kitang, respectively. In addition, 6 water samples each were collected from Ranchan Recreational Park and UNIMAS Lake at Universiti Malaysia Sarawak, Kota Samarahan, respectively. Water physicochemical parameters were also recorded. All samples were concentrated by the iron sulfate flocculation method followed by the sucrose flotation technique. Cryptosporidium and Cyclospora were detected by the modified Ziehl-Neelsen technique. Correlation of the parasite distribution with water physicochemical parameters was analysed using bivariate Pearson correlation. Based on the 24 total samples of environmental water abstracted by drinking water treatment plants, all the samples (24/24; 100%) were positive with Cryptosporidium, and only 2 samples (2/24; 8.33%) were positive with Cyclospora. Based on the 12 total samples of water for recreational activities, 4 samples (4/12; 33%) were positive with Cryptosporidium, while 2 samples (2/12; 17%) were positive with Cyclospora. Cryptosporidium oocysts were negatively correlated with dissolved oxygen (DO). PMID:29234679

  4. Mass spectrometry-based proteomics for translational research: a technical overview.

    PubMed

    Paulo, Joao A; Kadiyala, Vivek; Banks, Peter A; Steen, Hanno; Conwell, Darwin L

    2012-03-01

    Mass spectrometry-based investigation of clinical samples enables the high-throughput identification of protein biomarkers. We provide an overview of mass spectrometry-based proteomic techniques that are applicable to the investigation of clinical samples. We address sample collection, protein extraction and fractionation, mass spectrometry modalities, and quantitative proteomics. Finally, we examine the limitations and further potential of such technologies. Liquid chromatography fractionation coupled with tandem mass spectrometry is well suited to handle mixtures of hundreds or thousands of proteins. Mass spectrometry-based proteome elucidation can reveal potential biomarkers and aid in the development of hypotheses for downstream investigation of the molecular mechanisms of disease.

  5. Mass Spectrometry-Based Proteomics for Translational Research: A Technical Overview

    PubMed Central

    Paulo, Joao A.; Kadiyala, Vivek; Banks, Peter A.; Steen, Hanno; Conwell, Darwin L.

    2012-01-01

    Mass spectrometry-based investigation of clinical samples enables the high-throughput identification of protein biomarkers. We provide an overview of mass spectrometry-based proteomic techniques that are applicable to the investigation of clinical samples. We address sample collection, protein extraction and fractionation, mass spectrometry modalities, and quantitative proteomics. Finally, we examine the limitations and further potential of such technologies. Liquid chromatography fractionation coupled with tandem mass spectrometry is well suited to handle mixtures of hundreds or thousands of proteins. Mass spectrometry-based proteome elucidation can reveal potential biomarkers and aid in the development of hypotheses for downstream investigation of the molecular mechanisms of disease. PMID:22461744

  6. Microwave Heating of Synthetic Skin Samples for Potential Treatment of Gout Using the Metal-Assisted and Microwave-Accelerated Decrystallization Technique

    PubMed Central

    2016-01-01

Physical stability of synthetic skin samples during their exposure to microwave heating was investigated to demonstrate the use of the metal-assisted and microwave-accelerated decrystallization (MAMAD) technique for potential biomedical applications. In this regard, optical microscopy and temperature measurements were employed for the qualitative and quantitative assessment of damage to synthetic skin samples during 20 s intermittent microwave heating using a monomode microwave source (at 8 GHz, 2–20 W) for up to 120 s. The extent of damage to synthetic skin samples, assessed by the change in their surface area, was negligible for microwave powers of ≤7 W, while more extensive damage (>50%) occurred at powers of >7 W in the initial temperature range of 20–39 °C. The initial temperature of the synthetic skin samples significantly affected the extent of their temperature change during microwave heating. The proof-of-principle use of the MAMAD technique was demonstrated for the decrystallization of a model biological crystal (l-alanine) placed under synthetic skin samples in the presence of gold nanoparticles. Our results showed that the size (initial size ∼850 μm) of l-alanine crystals can be reduced by up to 60% in 120 s without damage to synthetic skin samples using the MAMAD technique. Finite-difference time-domain-based simulations of the electric field distribution of an 8 GHz monomode microwave radiation predicted that synthetic skin samples absorb ∼92.2% of the microwave radiation. PMID:27917407

  7. Applying a low energy HPGe detector gamma ray spectrometric technique for the evaluation of Pu/Am ratio in biological samples.

    PubMed

    Singh, I S; Mishra, Lokpati; Yadav, J R; Nadar, M Y; Rao, D D; Pradeepkumar, K S

    2015-10-01

The estimation of the Pu/(241)Am ratio in biological samples is an important input for the assessment of the internal dose received by workers. The radiochemical separation of Pu isotopes and (241)Am in a sample followed by alpha spectrometry is a widely used technique for the determination of the Pu/(241)Am ratio. However, this method is time-consuming, and a quick estimate is often required. In this work, the Pu/(241)Am ratio in biological samples was estimated with HPGe detector-based measurements using the gamma/X-rays emitted by these radionuclides. The results were compared with those obtained from alpha spectrometry of the samples after radiochemical analysis and were found to be in good agreement. Copyright © 2015 Elsevier Ltd. All rights reserved.
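A gamma-spectrometric activity ratio of the kind described is obtained by correcting net photopeak areas for counting time, detection efficiency, and gamma emission probability. All numerical values below are illustrative assumptions, not the paper's data.

```python
def activity_bq(net_counts, live_time_s, efficiency, emission_prob):
    """Activity (Bq) inferred from a net photopeak area."""
    return net_counts / (live_time_s * efficiency * emission_prob)

# Illustrative values only:
a_pu = activity_bq(net_counts=500, live_time_s=3600,
                   efficiency=0.02, emission_prob=4e-4)   # a weak Pu gamma line
a_am = activity_bq(net_counts=8000, live_time_s=3600,
                   efficiency=0.05, emission_prob=0.36)   # 241Am 59.5 keV line
pu_am_ratio = a_pu / a_am
```

Note the very small emission probabilities of Pu gamma lines, which is why long counts or high-efficiency low-energy HPGe detectors are needed for this kind of quick estimate.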

  8. Intracavity optogalvanic spectroscopy. An analytical technique for 14C analysis with subattomole sensitivity.

    PubMed

    Murnick, Daniel E; Dogru, Ozgur; Ilkmen, Erhan

    2008-07-01

We show a new ultrasensitive laser-based analytical technique, intracavity optogalvanic spectroscopy, allowing extremely high sensitivity for detection of (14)C-labeled carbon dioxide. Capable of replacing large accelerator mass spectrometers, the technique quantifies attomoles of (14)C in submicrogram samples. Based on the specificity of narrow laser resonances coupled with the sensitivity provided by standing waves in an optical cavity and detection via impedance variations, limits of detection near 10(-15) (14)C/(12)C ratios are obtained. Using a 15-W (14)CO2 laser, a linear calibration with samples from 10(-15) to >1.5 x 10(-12) in (14)C/(12)C ratios, as determined by accelerator mass spectrometry, is demonstrated. Possible applications include microdosing studies in drug development, individualized subtherapeutic tests of drug metabolism, carbon dating, and real-time monitoring of atmospheric radiocarbon. The method can also be applied to the detection of other trace entities.
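The linear calibration against AMS-determined ratios can be sketched as an ordinary least-squares fit that is then inverted to quantify an unknown sample. The signal values below are invented for illustration.

```python
def linear_fit(x, y):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return my - b * mx, b

# Hypothetical calibration: AMS-determined 14C/12C ratios vs. optogalvanic signal
ratios = [1e-15, 1e-14, 1e-13, 1e-12]
signal = [0.02, 0.11, 1.05, 10.3]          # arbitrary units, invented
a, b = linear_fit(ratios, signal)
unknown = (5.0 - a) / b                    # invert the calibration for a new sample
```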

  9. Chemical and biological threat-agent detection using electrophoresis-based lab-on-a-chip devices.

    PubMed

    Borowsky, Joseph; Collins, Greg E

    2007-10-01

The ability to separate complex mixtures of analytes has made capillary electrophoresis (CE) a powerful analytical tool since its modern configuration was first introduced over 25 years ago. The technique found new utility with its application to the microfluidics-based lab-on-a-chip platform (i.e., microchip), which resulted in ever smaller footprints, sample volumes, and analysis times. These features, coupled with the technique's potential for portability, have prompted recent interest in the development of novel analyzers for chemical and biological threat agents. This article will comment on three main areas of microchip CE as applied to the separation and detection of threat agents: detection techniques and their corresponding limits of detection, sampling protocol and preparation time, and system portability. These three areas typify the broad utility of lab-on-a-chip devices for meeting critical, present-day security needs, in addition to illustrating areas wherein advances are necessary.

  10. Thermophysical Properties Measurements of Zr62Cu20Al10Ni8

    NASA Technical Reports Server (NTRS)

    Bradshaw, Richard C.; Waren, Mary; Rogers, Jan R.; Rathz, Thomas J.; Gangopadhyay, Anup K.; Kelton, Ken F.; Hyers, Robert W.

    2006-01-01

Thermophysical property studies performed at high temperature can prove challenging because of reactivity problems brought on by the elevated temperatures. Contaminants from measuring devices and container walls can cause changes in properties. To prevent this, containerless processing techniques can be employed to isolate a sample during study. A common method used for this is levitation. Typical levitation methods used for containerless processing are aerodynamically, electromagnetically, and electrostatically based. All levitation methods reduce heterogeneous nucleation sites, which in turn provides access to metastable undercooled phases. In particular, electrostatic levitation is appealing because sample motion and stirring are minimized, and by combining it with optically based non-contact measuring techniques, many thermophysical properties can be measured. Applying some of these techniques, surface tension, viscosity, and density have been measured for the glass-forming alloy Zr62Cu20Al10Ni8 and will be presented with a brief overview of the non-contact measuring method used.
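In electrostatic levitation work of this kind, surface tension is commonly extracted from the resonant oscillation frequency of the levitated drop via the Rayleigh relation for the fundamental (l = 2) mode, f² = 8σ/(3πm). The droplet mass and frequency below are illustrative, not measured values from this study.

```python
import math

def rayleigh_surface_tension(mass_kg, freq_hz):
    """Surface tension (N/m) from the l = 2 drop oscillation frequency,
    using the Rayleigh relation f^2 = 8*sigma / (3*pi*m)."""
    return 3 * math.pi * mass_kg * freq_hz ** 2 / 8

# Illustrative: a 40 mg droplet oscillating at 160 Hz
sigma = rayleigh_surface_tension(40e-6, 160.0)   # ≈1.21 N/m
```

Viscosity is obtained analogously from the damping time of the same oscillation, and density from the drop's imaged volume and known mass, which is why the three properties are often reported together.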

  11. Mars chronology: Assessing techniques for quantifying surficial processes

    USGS Publications Warehouse

    Doran, P.T.; Clifford, S.M.; Forman, S.L.; Nyquist, Larry; Papanastassiou, D.A.; Stewart, B.W.; Sturchio, N.C.; Swindle, T.D.; Cerling, T.; Kargel, J.; McDonald, G.; Nishiizumi, K.; Poreda, R.; Rice, J.W.; Tanaka, K.

    2004-01-01

Currently, the absolute chronology of Martian rocks, deposits and events is based mainly on crater counting and remains highly imprecise with epoch boundary uncertainties in excess of 2 billion years. Answers to key questions concerning the comparative origin and evolution of Mars and Earth will not be forthcoming without a rigid Martian chronology, enabling the construction of a time scale comparable to Earth's. Priorities for exploration include calibration of the cratering rate, dating major volcanic and fluvial events and establishing chronology of the polar layered deposits. If extinct and/or extant life is discovered, the chronology of the biosphere will be of paramount importance. Many radiometric and cosmogenic techniques applicable on Earth and the Moon will apply to Mars after certain baselines (e.g. composition of the atmosphere, trace species, chemical and physical characteristics of Martian dust) are established. The high radiation regime may pose a problem for dosimetry-based techniques (e.g. luminescence). The unique isotopic composition of nitrogen in the Martian atmosphere may permit a Mars-specific chronometer for tracing the time-evolution of the atmosphere and of lithic phases with trapped atmospheric gases. Other Mars-specific chronometers include measurement of gas fluxes and accumulation of platinum group elements (PGE) in the regolith. Putting collected samples into geologic context is deemed essential, as is using multiple techniques on multiple samples. If in situ measurements are restricted to a single technique it must be shown to give consistent results on multiple samples, but in all cases, using two or more techniques (e.g. on the same lander) will reduce error. While there is no question that returned samples will yield the best ages, in situ techniques have the potential to be flown on multiple missions providing a larger data set and broader context in which to place the more accurate dates. © 2004 Elsevier B.V. All rights reserved.

  12. Intelligent evaluation of color sensory quality of black tea by visible-near infrared spectroscopy technology: A comparison of spectra and color data information

    NASA Astrophysics Data System (ADS)

    Ouyang, Qin; Liu, Yan; Chen, Quansheng; Zhang, Zhengzhu; Zhao, Jiewen; Guo, Zhiming; Gu, Hang

    2017-06-01

Instrumental testing of black tea samples, instead of human panel tests, has recently attracted considerable attention. This study investigated the feasibility of estimating the color sensory quality of black tea samples using the VIS-NIR spectroscopy technique, comparing the performances of models based on the spectra and on the color information. In model calibration, the variables were first selected by a genetic algorithm (GA); then nonlinear back propagation-artificial neural network (BPANN) models were established based on the optimal variables. In comparison with the other models, the GA-BPANN models built from the spectral data showed the best performance, with a correlation coefficient of 0.8935 and a root mean square error of 0.392 in the prediction set. In addition, models based on the spectral information performed better than those based on the color parameters. Therefore, the VIS-NIR spectroscopy technique is a promising tool for rapid and accurate evaluation of the sensory quality of black tea samples.

  13. Intelligent evaluation of color sensory quality of black tea by visible-near infrared spectroscopy technology: A comparison of spectra and color data information.

    PubMed

    Ouyang, Qin; Liu, Yan; Chen, Quansheng; Zhang, Zhengzhu; Zhao, Jiewen; Guo, Zhiming; Gu, Hang

    2017-06-05

Instrumental testing of black tea samples, instead of human panel tests, has recently attracted considerable attention. This study investigated the feasibility of estimating the color sensory quality of black tea samples using the VIS-NIR spectroscopy technique, comparing the performances of models based on the spectra and on the color information. In model calibration, the variables were first selected by a genetic algorithm (GA); then nonlinear back propagation-artificial neural network (BPANN) models were established based on the optimal variables. In comparison with the other models, the GA-BPANN models built from the spectral data showed the best performance, with a correlation coefficient of 0.8935 and a root mean square error of 0.392 in the prediction set. In addition, models based on the spectral information performed better than those based on the color parameters. Therefore, the VIS-NIR spectroscopy technique is a promising tool for rapid and accurate evaluation of the sensory quality of black tea samples. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Determination of Ammonia in Household Cleaners: An Instrumental Analysis Experiment.

    ERIC Educational Resources Information Center

    Graham, Richard C.; DePew, Steven

    1983-01-01

    Briefly discusses three techniques for assessing amount of ammonia present in household cleaners. Because of disadvantages with these methods, the thermometric titration technique is suggested in which students judge the best buy based on relative cost of ammonia present in samples. Laboratory procedures, typical results, and reactions involved…

  15. Evaluation and recommendation of sensitivity analysis methods for application to Stochastic Human Exposure and Dose Simulation models.

    PubMed

    Mokhtari, Amirhossein; Christopher Frey, H; Zheng, Junyu

    2006-11-01

Sensitivity analyses of exposure or risk models can help identify the most significant factors to aid in risk management or to prioritize additional research to reduce uncertainty in the estimates. However, sensitivity analysis is challenged by non-linearity, interactions between inputs, and multiple days or time scales. Selected sensitivity analysis methods are evaluated with respect to their applicability to human exposure models with such features using a testbed. The testbed is a simplified version of the US Environmental Protection Agency's Stochastic Human Exposure and Dose Simulation (SHEDS) model. The methods evaluated include the Pearson and Spearman correlation, sample and rank regression, analysis of variance, Fourier amplitude sensitivity test (FAST), and Sobol's method. The first five methods are known as "sampling-based" techniques, whereas the latter two methods are known as "variance-based" techniques. The main objective of the test cases was to identify the main and total contributions of individual inputs to the output variance. Sobol's method and FAST directly quantified these measures of sensitivity. Results show that sensitivity of an input typically changed when evaluated under different time scales (e.g., daily versus monthly). All methods provided similar insights regarding less important inputs; however, Sobol's method and FAST provided more robust insights with respect to sensitivity of important inputs compared to the sampling-based techniques. Thus, the sampling-based methods can be used in a screening step to identify unimportant inputs, followed by application of more computationally intensive refined methods to a smaller set of inputs. The implications of time variation in sensitivity results for risk management are briefly discussed.
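A minimal sketch of a variance-based sensitivity estimate of the kind FAST and Sobol's method provide, using a Monte Carlo "pick-and-freeze" estimator on a toy additive model (not the SHEDS testbed; all names and values are illustrative):

```python
import random

def first_order_index(model, n_inputs, i, n=20000, seed=0):
    """Monte Carlo 'pick-and-freeze' estimate of the first-order Sobol
    index S_i: input i is kept fixed between two model runs while all
    other inputs are independently resampled."""
    rng = random.Random(seed)
    ya, yb = [], []
    for _ in range(n):
        x = [rng.random() for _ in range(n_inputs)]
        x2 = [rng.random() for _ in range(n_inputs)]
        x2[i] = x[i]                      # freeze input i
        ya.append(model(x))
        yb.append(model(x2))
    my = sum(ya) / n
    var = sum((y - my) ** 2 for y in ya) / n
    cov = sum((a - my) * (b - my) for a, b in zip(ya, yb)) / n
    return cov / var

# Toy additive model whose output is dominated by x0:
model = lambda x: 10 * x[0] + x[1]
s0 = first_order_index(model, 2, 0)   # analytically 100/101 ~ 0.99
s1 = first_order_index(model, 2, 1)   # analytically 1/101 ~ 0.01
```

Sampling-based screening methods (correlation, rank regression) would rank these two inputs the same way at far lower computational cost, which matches the screening-then-refine workflow the abstract recommends.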

  16. Countering imbalanced datasets to improve adverse drug event predictive models in labor and delivery.

    PubMed

    Taft, L M; Evans, R S; Shyu, C R; Egger, M J; Chawla, N; Mitchell, J A; Thornton, S N; Bray, B; Varner, M

    2009-04-01

The IOM report, Preventing Medication Errors, emphasizes the overall lack of knowledge of the incidence of adverse drug events (ADE). Operating rooms, emergency departments and intensive care units are known to have a higher incidence of ADE. Labor and delivery (L&D) is an emergency care unit that could have an increased risk of ADE, where reported rates remain low and under-reporting is suspected. Risk factor identification with electronic pattern recognition techniques could improve ADE detection rates. The objective of the present study is to apply the Synthetic Minority Over-sampling Technique (SMOTE) as an enhanced sampling method in a sparse dataset to generate prediction models that identify ADE in women admitted for labor and delivery based on patient risk factors and comorbidities. By creating synthetic cases with the SMOTE algorithm and using a 10-fold cross-validation technique, we demonstrated improved performance of the Naïve Bayes and decision tree algorithms. The true positive rate (TPR) of 0.32 in the raw dataset increased to 0.67 in the 800% over-sampled dataset. Enhanced performance from classification algorithms can be attained with the use of synthetic minority class oversampling techniques in sparse clinical datasets. Predictive models created in this manner can be used to develop evidence-based ADE monitoring systems.
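The core interpolation step of SMOTE can be sketched in pure Python; this is an illustrative re-implementation of the technique, not the code used in the study.

```python
import random

def smote(minority, n_synthetic, k=3, seed=42):
    """Generate synthetic minority-class samples by interpolating between
    a randomly chosen sample and one of its k nearest minority neighbours."""
    rng = random.Random(seed)

    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))

    synthetic = []
    for _ in range(n_synthetic):
        x = rng.choice(minority)
        neighbours = sorted((m for m in minority if m is not x),
                            key=lambda m: dist2(x, m))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()                 # random point along the segment
        synthetic.append(tuple(xi + gap * (ni - xi)
                               for xi, ni in zip(x, nb)))
    return synthetic

# Four minority cases, oversampled by 200% (8 synthetic cases):
minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.2)]
new_cases = smote(minority, 8)
```

Because each synthetic case lies on a segment between real minority cases, SMOTE densifies the minority region of feature space rather than duplicating records, which is what lets classifiers such as Naïve Bayes learn a better decision boundary on sparse data.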

  17. Comet composition and density analyzer

    NASA Technical Reports Server (NTRS)

    Clark, B. C.

    1982-01-01

Distinctions between cometary material and other extraterrestrial materials (meteorite suites and stratospherically-captured cosmic dust) are addressed. The technique of X-ray fluorescence (XRF) is employed for analysis of elemental composition. Concomitant with these investigations, the problem of collecting representative samples of comet dust (for rendezvous missions) was solved, and several related techniques such as mineralogic analysis (X-ray diffraction), direct analysis of the nucleus without docking (electron macroprobe), dust flux rate measurement, and test sample preparation were evaluated. An explicit experiment concept based upon X-ray fluorescence analysis of biased and unbiased sample collections was scoped and proposed for a future rendezvous mission with a short-period comet.

  18. Optical fiber-based full Mueller polarimeter for endoscopic imaging using a two-wavelength simultaneous measurement method

    NASA Astrophysics Data System (ADS)

    Vizet, Jérémy; Manhas, Sandeep; Tran, Jacqueline; Validire, Pierre; Benali, Abdelali; Garcia-Caurel, Enric; Pierangelo, Angelo; Martino, Antonello De; Pagnoux, Dominique

    2016-07-01

This paper reports a technique based on spectrally differential measurement for determining the full Mueller matrix of a biological sample through an optical fiber. In this technique, two close wavelengths were used simultaneously, one for characterizing the fiber and the other for characterizing the assembly of fiber and sample. The characteristics of the fiber measured at one wavelength were used to decouple its contribution from the measurement on the assembly of fiber and sample and then to extract the sample Mueller matrix at the second wavelength. The proof of concept was experimentally validated by measuring polarimetric parameters of various calibrated optical components through the optical fiber. Then, polarimetric images of histological cuts of human colon tissues were measured, and retardance, diattenuation, and orientation of the main axes of fibrillar regions were displayed. Finally, these images were successfully compared with images obtained by a free space Mueller microscope. As the reported method does not use any moving component, it offers attractive integration possibilities with an endoscopic probe.
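The decoupling idea can be illustrated with matrix algebra: if the measured matrix factorizes as M_total = M_fiber · M_sample (a deliberate simplification of the actual double-pass endoscopic geometry), the sample matrix is recovered by inverting the fiber matrix measured at the nearby wavelength. The matrices below are illustrative, not data from the paper.

```python
import math

def mat_mul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_inv(A):
    """Gauss-Jordan inversion with partial pivoting."""
    n = len(A)
    M = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [rv - f * cv for rv, cv in zip(M[r], M[c])]
    return [row[n:] for row in M]

# Illustrative Mueller matrices: fiber as a pure retarder, sample as a diattenuator.
d = math.radians(30)
M_fiber = [[1, 0, 0, 0],
           [0, 1, 0, 0],
           [0, 0, math.cos(d), math.sin(d)],
           [0, 0, -math.sin(d), math.cos(d)]]
M_sample = [[1.0, 0.3, 0.0, 0.0],
            [0.3, 1.0, 0.0, 0.0],
            [0.0, 0.0, 0.95, 0.0],
            [0.0, 0.0, 0.0, 0.95]]

M_total = mat_mul(M_fiber, M_sample)              # what the polarimeter measures
M_recovered = mat_mul(mat_inv(M_fiber), M_total)  # decoupled sample matrix
```

The key experimental assumption, as the abstract states, is that the fiber's matrix changes negligibly between the two close wavelengths, so the matrix measured at one can be inverted out of the measurement at the other.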

  19. The Detection Method of Escherichia coli in Water Resources: A Review

    NASA Astrophysics Data System (ADS)

    Nurliyana, M. R.; Sahdan, M. Z.; Wibowo, K. M.; Muslihati, A.; Saim, H.; Ahmad, S. A.; Sari, Y.; Mansor, Z.

    2018-04-01

This article reviews several approaches for Escherichia coli (E. coli) detection, from conventional methods through emerging methods to biosensor-based techniques. Detection and enumeration of E. coli usually require a long time to obtain results, since a laboratory-based approach is normally used: processing the cultured samples takes 24 to 72 hours after sampling before results are available. Although faster techniques for detecting E. coli in water, such as the Polymerase Chain Reaction (PCR) and the Enzyme-Linked Immunosorbent Assay (ELISA), have been developed, they still require transporting samples from the water source to the laboratory, and their high cost, complicated equipment, complex procedures, and need for skilled specialists limit their widespread use in water quality monitoring. Recently, the development of biosensor devices that are easy to operate, portable, and highly sensitive and selective has become indispensable for detecting extremely low concentrations of pathogenic E. coli in water samples.

  20. Hyperspectral imaging as a technique for investigating the effect of consolidating materials on wood

    NASA Astrophysics Data System (ADS)

    Bonifazi, Giuseppe; Serranti, Silvia; Capobianco, Giuseppe; Agresti, Giorgia; Calienno, Luca; Picchio, Rodolfo; Lo Monaco, Angela; Santamaria, Ulderico; Pelosi, Claudia

    2017-01-01

The focus of this study was to investigate the potential of hyperspectral imaging (HSI) in the monitoring of commercial consolidant products applied on wood samples. Poplar (Populus spp.) and walnut (Juglans regia L.) were chosen for the consolidant application. Both traditional and innovative products were selected, based on acrylic, epoxy, and aliphatic compounds. Wood samples were stressed by freeze/thaw cycles in order to cause material degradation without the loss of wood components. The consolidant was then applied under vacuum, and the samples were finally artificially aged for 168 h in a solar box chamber. The samples were acquired in the short wave infrared (1000 to 2500 nm) range by a SISUChema XL™ device (Specim, Finland) after 168 h of irradiation. For comparison, color measurement was also used as an economical, simple, and noninvasive technique to evaluate the deterioration and consolidation effects on wood. All data were then processed using a chemometric approach aimed at defining HSI-based correlation models between consolidating materials, wood species, and short-term aging effects.

  1. Operational considerations for the application of remotely sensed forest data from LANDSAT or other airborne platforms

    NASA Technical Reports Server (NTRS)

    Baker, G. R.; Fethe, T. P.

    1975-01-01

Research in the application of remotely sensed data from LANDSAT or other airborne platforms to the efficient management of a large timber-based forest industry was divided into three phases: (1) establishment of a photo/ground sample correlation, (2) investigation of techniques for multi-spectral digital analysis, and (3) development of a semi-automated multi-level sampling system. To properly verify results, three distinct test areas were selected: (1) Jacksonville Mill Region, Lower Coastal Plain, Flatwoods, (2) Pensacola Mill Region, Middle Coastal Plain, and (3) Mississippi Mill Region, Middle Coastal Plain. The following conclusions were reached: (1) the probability of establishing an information base suitable for management requirements through a photo/ground double sampling procedure, thereby alleviating the ground sampling effort, is encouraging, (2) known classification techniques must be investigated to ascertain the level of precision possible in separating the many densities involved, and (3) the multi-level approach must be related to an information system that is executable and feasible.

  2. Optical fiber-based full Mueller polarimeter for endoscopic imaging using a two-wavelength simultaneous measurement method.

    PubMed

    Vizet, Jérémy; Manhas, Sandeep; Tran, Jacqueline; Validire, Pierre; Benali, Abdelali; Garcia-Caurel, Enric; Pierangelo, Angelo; De Martino, Antonello; Pagnoux, Dominique

    2016-07-01

This paper reports a technique based on spectrally differential measurement for determining the full Mueller matrix of a biological sample through an optical fiber. In this technique, two close wavelengths were used simultaneously, one for characterizing the fiber and the other for characterizing the assembly of fiber and sample. The characteristics of the fiber measured at one wavelength were used to decouple its contribution from the measurement on the assembly of fiber and sample and then to extract the sample Mueller matrix at the second wavelength. The proof of concept was experimentally validated by measuring polarimetric parameters of various calibrated optical components through the optical fiber. Then, polarimetric images of histological cuts of human colon tissues were measured, and retardance, diattenuation, and orientation of the main axes of fibrillar regions were displayed. Finally, these images were successfully compared with images obtained by a free space Mueller microscope. As the reported method does not use any moving component, it offers attractive integration possibilities with an endoscopic probe.

  3. A Technique for Thermal Desorption Analyses Suitable for Thermally-Labile, Volatile Compounds.

    PubMed

    Alborn, Hans T

    2018-02-01

Many plant and insect interactions are governed by odors released by the plants or insects, and there exists a continual need for new or improved methods to collect and identify these odors. Our group has for some time studied below-ground, plant-produced volatile signals affecting nematode and insect behavior. The research requires repeated sampling of volatiles of intact plant/soil systems in the laboratory as well as in the field, with the help of probes to minimize unwanted effects on the systems we are studying. After evaluating solid adsorbent filters with solvent extraction or solid-phase microextraction fiber sample collection, we found dynamic sampling of small air volumes on Tenax TA filters followed by thermal desorption sample introduction to be the most suitable analytical technique for our applications. Here we present the development and evaluation of a low-cost and relatively simple thermal desorption technique where a cold trap cooled with liquid carbon dioxide is added as an integral part of a splitless injector. Temperature gradient-based focusing and low thermal mass minimize aerosol formation and eliminate the need for flash heating, resulting in low sample degradation comparable to solvent-based on-column injections. Additionally, since the presence of the cold trap does not affect normal splitless injections, on-the-fly switching between splitless and thermal desorption modes can be used for external standard quantification.

  4. The use of multilevel sampling techniques for determining shallow aquifer nitrate profiles.

    PubMed

    Lasagna, Manuela; De Luca, Domenico Antonio

    2016-10-01

    Nitrate is a worldwide pollutant in aquifers. Shallow aquifer nitrate concentrations generally display vertical stratification, with a maximum concentration immediately below the water level; the concentration then gradually decreases with depth. Different techniques can be used to highlight this stratification. The paper compares the advantages and limitations of three open-hole multilevel sampling techniques (packer system, dialysis membrane samplers and bailer), chosen on the basis of a literature review, for highlighting vertical nitrate stratification under the assumption of (sub)horizontal flow in the aquifer. The sampling systems were employed at three different times of the year in a shallow aquifer piezometer in northern Italy. The optimal purge time, equilibration time and water volume losses during the time in the piezometer were evaluated. The multilevel techniques highlighted a similar vertical nitrate stratification, present throughout the year: nitrate concentrations generally decreased with depth, but with significantly different levels between the sampling campaigns. Moreover, the sampling techniques produced different degrees of accuracy. More specifically, the dialysis membrane samplers provided the most accurate hydrochemical profile of the shallow aquifer, and they appear to be necessary when the objective is to detect discontinuities in the nitrate profile. The bailer and packer system showed the same nitrate profile with small differences in concentration; however, the bailer was much easier to use.

  5. Effect of non-Poisson samples on turbulence spectra from laser velocimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sree, D.; Kjelgaard, S.O.; Sellers, W.L. III

    1994-12-01

    Spectral estimations from LV data are typically based on the assumption of a Poisson sampling process. It is demonstrated here that the sampling distribution must be considered before spectral estimates are used to infer turbulence scales. A non-Poisson sampling process can occur if there is a nonhomogeneous distribution of particles in the flow. Based on the study of a simulated first-order spectrum, it has been shown that a non-Poisson sampling process causes the estimated spectrum to deviate from the true spectrum. Also, in this case prefiltering techniques do not improve the spectral estimates at higher frequencies.
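
    As a minimal illustration of the Poisson-sampling assumption (not the paper's simulated first-order spectrum), the sketch below draws sample times from a homogeneous Poisson process, i.e. exponential inter-arrival times, and shows that a direct non-uniform periodogram recovers the dominant frequency of the sampled signal; the rate, duration and frequency are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Homogeneous Poisson sampling: exponential inter-arrival times.
rate, duration, f0 = 200.0, 20.0, 5.0          # Hz, s, Hz (all hypothetical)
t = np.cumsum(rng.exponential(1.0 / rate, size=int(2 * rate * duration)))
t = t[t < duration]
u = np.sin(2 * np.pi * f0 * t)                 # sampled velocity fluctuation

# Direct (non-uniform) periodogram on a coarse candidate frequency grid.
freqs = np.arange(1.0, 11.0)
P = np.abs(np.exp(-2j * np.pi * freqs[:, None] * t[None, :]) @ u) ** 2 / t.size

print(freqs[np.argmax(P)])                     # 5.0 — peak at the true frequency
```

A nonhomogeneous (clustered) arrival process would bias this estimate, which is the effect the record above warns about.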

  6. Thermal Conductivity Measurements of Helium 4 Near the Lambda-Transition Using a Magnetostrictive Low Gravity Simulator

    NASA Technical Reports Server (NTRS)

    Larson, Melora; Israelsson, Ulf E.

    1995-01-01

    There has been a recent increase in interest, both experimentally and theoretically, in the study of liquid helium very near the lambda-transition in the presence of a heat current. In traditional ground-based experiments there are gravitationally induced pressure variations in any macroscopic helium sample that limit how closely the transition can be approached. We have taken advantage of the finite magnetic susceptibility of ⁴He to build a magnetostrictive low-gravity simulator. The simulator consists of a superconducting magnet with a field profile shaped to counteract the force of gravity in a helium sample. When the magnet is operated with B·dB/dz = 21 T²/cm at the location of the cell, the gravitationally induced pressure variations will be canceled to within 1% over a volume 0.5 cm in height and 0.5 cm in diameter. This technique for canceling the pressure variations in a long sample cell allows the lambda-transition to be studied much closer in reduced temperature and under a wider range of applied heat currents than is possible using other ground-based techniques. Preliminary results using this low-gravity simulator and the limitations of the magnetostrictive technique in comparison to space-based experiments will be presented.

  7. Sequential time interleaved random equivalent sampling for repetitive signal.

    PubMed

    Zhao, Yijiu; Liu, Jingjing

    2016-12-01

    Compressed sensing (CS) based sampling techniques exhibit many advantages over other existing approaches for sparse signal spectrum sensing; they have also been incorporated into non-uniform sampling signal reconstruction to improve efficiency, as in random equivalent sampling (RES). However, in CS-based RES, only one sample of each acquisition is considered in the signal reconstruction stage, which results in more acquisition runs and a longer sampling time. In this paper, a sampling sequence is taken in each RES acquisition run, and the corresponding block measurement matrix is constructed using the Whittaker-Shannon interpolation formula. All the block matrices are combined into an equivalent measurement matrix with respect to all sampling sequences. We implemented the proposed approach with a multi-core analog-to-digital converter (ADC) whose cores are time-interleaved. A prototype realization of the proposed CS-based sequential random equivalent sampling method has been developed. It is able to capture an analog waveform at an equivalent sampling rate of 40 GHz while physically sampling at 1 GHz. Experiments indicate that, for a sparse signal, the proposed CS-based sequential random equivalent sampling exhibits high efficiency.
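
    The reconstruction stage can be illustrated generically. The sketch below uses a random Gaussian measurement matrix and orthogonal matching pursuit as a stand-in solver, rather than the paper's Whittaker-Shannon block matrices, to show how a sparse vector is recovered from far fewer measurements than its length; all sizes and coefficients are hypothetical.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))   # most correlated atom
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 256, 128, 3                         # signal length, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = [1.5, -1.0, 0.75]
A = rng.standard_normal((m, n)) / np.sqrt(m)  # stand-in measurement matrix
y = A @ x_true                                # "compressed" acquisitions

x_hat = omp(A, y, k)
print(np.allclose(x_hat, x_true))             # True
```

The paper's contribution is in how the measurement matrix is built from interleaved sampling sequences; the recovery step is a standard sparse solver of this kind.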

  8. Redesigning flow injection after 40 years of development: Flow programming.

    PubMed

    Ruzicka, Jaromir Jarda

    2018-01-01

    Automation of reagent-based assays by means of Flow Injection (FI) is based on sample processing in which a sample flows continuously towards and through a detector for quantification of the target analyte. The Achilles heel of this methodology, the legacy of the AutoAnalyzer®, is continuous reagent consumption and continuous generation of chemical waste. However, flow programming, assisted by recent advances in precise pumping and combined with the lab-on-valve technique, allows the FI manifold to be designed around a single confluence point through which sample and reagents are sequentially directed by means of a series of flow reversals. This approach results in sample/reagent mixing analogous to traditional FI, reduces sample and reagent consumption, and uses the stopped-flow technique to enhance the yield of chemical reactions. The feasibility of programmable Flow Injection (pFI) is documented by examples of commonly used spectrophotometric assays of phosphate, nitrate, nitrite and glucose. Experimental details and additional information are available in the online tutorial http://www.flowinjectiontutorial.com/. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments.

    PubMed

    Bras, Wim; Koizumi, Satoshi; Terrill, Nicholas J

    2014-11-01

    Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments.

  10. Beyond simple small-angle X-ray scattering: developments in online complementary techniques and sample environments

    PubMed Central

    Bras, Wim; Koizumi, Satoshi; Terrill, Nicholas J

    2014-01-01

    Small- and wide-angle X-ray scattering (SAXS, WAXS) are standard tools in materials research. The simultaneous measurement of SAXS and WAXS data in time-resolved studies has gained popularity due to the complementary information obtained. Furthermore, the combination of these data with non X-ray based techniques, via either simultaneous or independent measurements, has advanced understanding of the driving forces that lead to the structures and morphologies of materials, which in turn give rise to their properties. The simultaneous measurement of different data regimes and types, using either X-rays or neutrons, and the desire to control parameters that initiate and control structural changes have led to greater demands on sample environments. Examples of developments in technique combinations and sample environment design are discussed, together with a brief speculation about promising future developments. PMID:25485128

  11. Probing dynamics of micro-magnets with multi-mode superconducting resonator

    NASA Astrophysics Data System (ADS)

    Golovchanskiy, I. A.; Abramov, N. N.; Stolyarov, V. S.; Shchetinin, I. V.; Dzhumaev, P. S.; Averkin, A. S.; Kozlov, S. N.; Golubov, A. A.; Ryazanov, V. V.; Ustinov, A. V.

    2018-05-01

    In this work, we propose and explore a sensitive technique for investigating ferromagnetic resonance and the corresponding magnetic properties of individual micro-scaled and/or weak ferromagnetic samples. The technique is based on coupling the investigated sample to a high-Q transmission-line superconducting resonator, where the response of the sample is studied at the eigenfrequencies of the resonator. The high quality factor of the resonator enables sensitive detection of weak absorption losses at multiple ferromagnetic resonance frequencies. Studying the microwave response of individual micro-scaled permalloy rectangles, we have confirmed the superiority of the fluxometric demagnetizing factor over the commonly accepted magnetometric one and have characterized the demagnetization of the sample, as well as magnetostatic standing-wave resonances.

  12. Nonprobability and probability-based sampling strategies in sexual science.

    PubMed

    Catania, Joseph A; Dolcini, M Margaret; Orellana, Roberto; Narayanan, Vasudah

    2015-01-01

    With few exceptions, much of sexual science builds upon data from opportunistic nonprobability samples of limited generalizability. Although probability-based studies are considered the gold standard in terms of generalizability, they are costly to apply to many of the hard-to-reach populations of interest to sexologists. The present article discusses recent conclusions by sampling experts that are relevant to the parts of sexual science that advocate for nonprobability methods. In this regard, we provide an overview of Internet sampling as a useful, cost-efficient, nonprobability sampling method of value to sex researchers conducting modeling work or clinical trials. We also argue that probability-based sampling methods may be more readily applied in sex research with hard-to-reach populations than is typically thought. In this context, we provide three case studies that utilize qualitative and quantitative techniques directed at reducing the limitations of applying probability-based sampling to hard-to-reach populations: indigenous Peruvians, African American youth, and urban men who have sex with men (MSM). Recommendations are made with regard to presampling studies, adaptive and disproportionate sampling methods, and strategies that may be utilized in evaluating nonprobability and probability-based sampling methods.
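
    The disproportionate sampling mentioned above is undone at analysis time with standard design weights N_h/n_h. The sketch below is a generic stratified estimator with made-up stratum sizes and responses, not a procedure from the article.

```python
import numpy as np

def stratified_mean(values, strata, pop_sizes):
    """Design-weighted mean for a disproportionately stratified sample.

    Each unit in stratum h carries weight N_h / n_h, so an oversampled
    hard-to-reach stratum does not bias the population estimate.
    """
    values, strata = np.asarray(values, float), np.asarray(strata)
    total = sum(pop_sizes.values())
    return sum((N_h / total) * values[strata == h].mean()
               for h, N_h in pop_sizes.items())

# Hypothetical example: stratum "B" holds 10% of the population but is
# deliberately oversampled to half of the interviews.
vals   = [1, 1, 0, 1] * 25 + [0, 0, 0, 1] * 25    # 100 "A" units, 100 "B" units
labels = ["A"] * 100 + ["B"] * 100
print(stratified_mean(vals, labels, {"A": 9000, "B": 1000}))
# design-weighted: 0.9*0.75 + 0.1*0.25 = 0.70, not the raw mean 0.50
```

The unweighted sample mean (0.50) would badly overstate the contribution of the oversampled stratum; the design weights restore the population proportions.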

  13. Method and apparatus for measuring butterfat and protein content using microwave absorption techniques

    DOEpatents

    Fryer, Michael O.; Hills, Andrea J.; Morrison, John L.

    2000-01-01

    A self-calibrating method and apparatus for measuring butterfat and protein content based on measuring the microwave absorption of a milk sample at several microwave frequencies. A microwave energy source injects microwave energy into the resonant cavity for absorption and reflection by the sample undergoing evaluation. A sample tube is centrally located in the resonant cavity, passing through it and exposing the sample to the microwave energy. A portion of the energy is absorbed by the sample while another portion is reflected back to an evaluation device such as a network analyzer. The frequency at which the reflected radiation is at a minimum within the cavity is combined with the scattering coefficient S₁₁ as well as a phase change to calculate the butterfat content of the sample. The protein within the sample may be calculated in a similar manner using the frequency, S₁₁ and phase variables. A differential technique using a second resonant cavity containing a reference standard as the sample normalizes the measurements from the unknown sample, making the method self-calibrating. A shutter mechanism switches the microwave excitation between the unknown and reference cavities. An integrated apparatus for measuring the butterfat content of milk using microwave absorption techniques is also presented.

  14. An electrochemical and structural study of highly uniform tin oxide nanowires fabricated by a novel, scalable solvoplasma technique as anode material for sodium ion batteries

    NASA Astrophysics Data System (ADS)

    Mukherjee, Santanu; Schuppert, Nicholas; Bates, Alex; Jasinski, Jacek; Hong, Jong-Eun; Choi, Moon Jong; Park, Sam

    2017-04-01

    A novel solvoplasma-based technique was used to fabricate highly uniform SnO2 nanowires (NWs) for application as an anode in sodium-ion batteries (SIBs). This technique is scalable, rapid, and utilizes a rigorous cleaning process to produce very pure SnO2 NWs with enhanced porosity, which improves sodium-ion hosting and reaction kinetics. The batch of NWs obtained from the plasma process was named the "as-made" sample and, after cleaning, the "pure" sample. Structural characterization showed that the as-made sample has a K+ ion impurity which is absent in the pure samples. The pure samples have a higher maximum specific capacity, 400.71 mAh g⁻¹, and Coulombic efficiency, 85%, compared to the as-made samples, which have a maximum specific capacity of 174.69 mAh g⁻¹ and Coulombic efficiency of 74% upon cycling. A study of the electrochemical impedance spectra showed that the as-made samples have a higher interfacial and diffusion resistance than the pure samples, and resistances increased after 50 cycles of cell operation for both samples due to progressive electrode degradation. Specific energy versus specific power plots were employed to analyze the performance of the system with respect to the working conditions.

  15. Two phase sampling for wheat acreage estimation. [large area crop inventory experiment

    NASA Technical Reports Server (NTRS)

    Thomas, R. W.; Hay, C. M.

    1977-01-01

    A two-phase LANDSAT-based sample allocation and wheat proportion estimation method was developed. This technique employs manual, LANDSAT full-frame-based wheat or cultivated land proportion estimates from a large number of segments comprising a first sample phase to optimally allocate a smaller phase-two sample of computer- or manually-processed segments. Application to the Kansas Southwest CRD for 1974 produced a wheat acreage estimate for that CRD within 2.42 percent of the USDA SRS-based estimate, using a lower CRD inventory budget than for a simulated reference LACIE system. Improvements in cost or precision by a factor of 2 or more relative to the reference system were obtained.
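
    A two-phase (double) sampling estimator of the kind used here can be sketched with the classic regression form: the cheap phase-1 auxiliary variable (e.g. a manual full-frame proportion estimate) calibrates the expensive phase-2 measurement. The data below are simulated and the linear relationship is assumed purely for illustration.

```python
import numpy as np

def two_phase_regression_estimate(x1, x2, y2):
    """Double-sampling regression estimator of the population mean of y.

    x1: cheap auxiliary values on the large phase-1 sample
    x2: auxiliary values on the phase-2 subsample
    y2: expensive measurements on the phase-2 subsample
    """
    x1, x2, y2 = (np.asarray(a, float) for a in (x1, x2, y2))
    b = np.cov(x2, y2)[0, 1] / np.var(x2, ddof=1)     # regression slope
    return y2.mean() + b * (x1.mean() - x2.mean())

# Simulated example: 400 cheap manual proportion estimates (phase 1), of
# which 40 segments also receive the expensive measurement (phase 2).
rng = np.random.default_rng(2)
x1 = rng.uniform(0.2, 0.6, 400)
idx = rng.choice(400, 40, replace=False)
x2 = x1[idx]
y2 = 0.9 * x2 + 0.05 + rng.normal(0.0, 0.01, 40)      # assumed linear relation

est = two_phase_regression_estimate(x1, x2, y2)
print(est)
```

Because the phase-2 estimate is calibrated against the much larger phase-1 sample, its variance is lower than that of the 40 expensive measurements used alone, which is the source of the cost/precision gains reported above.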

  16. A wall-free climate unit for acoustic levitators.

    PubMed

    Schlegel, M C; Wenzel, K-J; Sarfraz, A; Panne, U; Emmerling, F

    2012-05-01

    Acoustic levitation traps a sample in a standing acoustic wave without contact with the wave-generating device. For the last three decades, sample holders based on this effect have been commonly used for contact-free handling of samples coupled with a number of analytical techniques. In this study, a wall-free climate unit is presented which allows control of the environmental conditions of suspended samples. The insulation is based on a continuous cold/hot gas flow around the sample and thus does not require any additional insulation material. This provides direct access to the levitated sample and circumvents any influence of the climate-unit material on the analyses.

  17. A wall-free climate unit for acoustic levitators

    NASA Astrophysics Data System (ADS)

    Schlegel, M. C.; Wenzel, K.-J.; Sarfraz, A.; Panne, U.; Emmerling, F.

    2012-05-01

    Acoustic levitation traps a sample in a standing acoustic wave without contact with the wave-generating device. For the last three decades, sample holders based on this effect have been commonly used for contact-free handling of samples coupled with a number of analytical techniques. In this study, a wall-free climate unit is presented which allows control of the environmental conditions of suspended samples. The insulation is based on a continuous cold/hot gas flow around the sample and thus does not require any additional insulation material. This provides direct access to the levitated sample and circumvents any influence of the climate-unit material on the analyses.

  18. Tight-frame based iterative image reconstruction for spectral breast CT

    PubMed Central

    Zhao, Bo; Gao, Hao; Ding, Huanjun; Molloi, Sabee

    2013-01-01

    Purpose: To investigate the tight-frame based iterative reconstruction (TFIR) technique for spectral breast computed tomography (CT), using fewer projections while achieving greater image quality. Methods: The experimental data were acquired with a fan-beam breast CT system based on a cadmium zinc telluride photon-counting detector. The images were reconstructed with a varying number of projections using the TFIR and filtered backprojection (FBP) techniques, and the image quality of the two techniques was evaluated. Spatial resolution was evaluated using a high-resolution phantom, and the contrast-to-noise ratio (CNR) was evaluated using a postmortem breast sample. The postmortem breast samples were decomposed into water, lipid, and protein contents based on images reconstructed from TFIR with 204 projections and FBP with 614 projections. The volumetric fractions of water, lipid, and protein from the image-based measurements in both TFIR and FBP were compared to chemical analysis. Results: The spatial resolution and CNR were comparable for the images reconstructed by TFIR with 204 projections and FBP with 614 projections. Both reconstruction techniques provided accurate quantification of the water, lipid, and protein composition of the breast tissue when compared with data from the reference-standard chemical analysis. Conclusions: Accurate breast tissue decomposition can be achieved with threefold fewer projection images by the TFIR technique without any reduction in image spatial resolution or CNR. This can result in a two-thirds reduction of the patient dose in a multislit, multislice spiral CT system, in addition to reduced scanning time. PMID:23464320

  19. Rock surface roughness measurement using CSI technique and analysis of surface characterization by qualitative and quantitative results

    NASA Astrophysics Data System (ADS)

    Mukhtar, Husneni; Montgomery, Paul; Gianto; Susanto, K.

    2016-01-01

    To further develop the image processing methods widely used in geo-processing and analysis, we introduce an alternative technique for the characterization of rock samples. The technique that we have used for characterizing inhomogeneous surfaces is based on Coherence Scanning Interferometry (CSI). An optical probe is first used to scan over the depth of the surface roughness of the sample. Then, to analyse the measured fringe data, we use the Five Sample Adaptive method to obtain quantitative results of the surface shape. To analyse the surface roughness parameters, Hmm and Rq, a new window-resizing analysis technique is employed. The results of the morphology and surface roughness analysis show micron- and nano-scale information which is characteristic of each rock type and its history. These could be used for mineral identification and studies of rock movement on different surfaces. Image processing is thus used to define the physical parameters of the rock surface.
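
    Of the roughness parameters mentioned, Rq is the standard root-mean-square roughness and can be computed directly from a measured height profile, as sketched below with a toy profile; the paper's window-resizing analysis is its own contribution and is not reproduced here.

```python
import numpy as np

def rq(heights):
    """Root-mean-square roughness Rq: RMS deviation of the surface
    heights about their mean level."""
    z = np.asarray(heights, float)
    return np.sqrt(np.mean((z - z.mean()) ** 2))

profile = np.array([1.0, -1.0, 1.0, -1.0])   # toy height profile (e.g. in µm)
print(rq(profile))                            # 1.0
```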

  20. Optimization of Native and Formaldehyde iPOND Techniques for Use in Suspension Cells.

    PubMed

    Wiest, Nathaniel E; Tomkinson, Alan E

    2017-01-01

    The isolation of proteins on nascent DNA (iPOND) technique developed by the Cortez laboratory allows a previously unparalleled ability to examine proteins associated with replicating and newly synthesized DNA in mammalian cells. Both the original, formaldehyde-based iPOND technique and a more recent derivative, accelerated native iPOND (aniPOND), have mostly been performed in adherent cell lines. Here, we describe modifications to both protocols for use with suspension cell lines. These include cell culture, pulse, and chase conditions that optimize sample recovery in both protocols using suspension cells, and several key improvements to the published aniPOND technique that reduce sample loss, increase the signal-to-noise ratio, and maximize sample recovery. Additionally, we directly and quantitatively compare the iPOND and aniPOND protocols to test the strengths and limitations of both. Finally, we present a detailed protocol for performing the optimized aniPOND protocol in suspension cell lines. © 2017 Elsevier Inc. All rights reserved.

  1. Quantitative X-ray dark-field and phase tomography using single directional speckle scanning technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hongchang, E-mail: hongchang.wang@diamond.ac.uk; Kashyap, Yogesh; Sawhney, Kawal

    2016-03-21

    X-ray dark-field contrast tomography can provide important information inside a sample that is complementary to conventional absorption tomography. Recently, the X-ray speckle based technique has been proposed to provide qualitative two-dimensional dark-field imaging with a simple experimental arrangement. In this letter, we deduce a relationship between the second moment of the scattering angle distribution and the cross-correlation degradation of the speckle, and establish a quantitative basis for X-ray dark-field tomography using the single directional speckle scanning technique. In addition, the phase contrast images can be simultaneously retrieved, permitting tomographic reconstruction, which yields enhanced contrast in weakly absorbing materials. Such a complementary tomography technique can allow systematic investigation of complex samples containing both soft and hard materials.

  2. X-ray absorption and Mössbauer spectroscopies characterization of iron nanoclusters prepared by the gas aggregation technique.

    PubMed

    Sánchez-Marcos, J; Laguna-Marco, M A; Martínez-Morillas, R; Céspedes, E; Menéndez, N; Jiménez-Villacorta, F; Prieto, C

    2012-11-01

    Partially oxidized iron nanoclusters have been prepared by the gas-phase aggregation technique with typical sizes of 2-3 nm. This preparation technique has been reported to yield clusters with interesting magnetic properties, such as very large exchange bias. In this paper, a study of sample composition carried out by Mössbauer and X-ray absorption spectroscopies is reported. The information provided by these techniques, which probe the iron short-range order, is ideal for characterizing the whole sample, since the obtained data are averaged over a very large number of the clusters. In addition, our results indicate the presence of ferrihydrite, a compound typically ignored when studying this type of system.

  3. Quality of the antimalarial medicine artemether - lumefantrine in eight cities of the Democratic Republic of the Congo.

    PubMed

    Mufusama, Jean-Pierre; Ioset, Karine Ndjoko; Feineis, Doris; Hoellein, Ludwig; Holzgrabe, Ulrike; Bringmann, Gerhard

    2018-06-12

    In the context of post-marketing surveillance supporting public-health authorities to take evidence-based decisions to fight the spread of poor-quality medicines, the quality of antimalarial artemether-lumefantrine (AL) medicines was assessed in the Democratic Republic of the Congo (DRC). A total of 150 samples of AL-containing products were collected from private pharmaceutical outlets in eight main cities: Goma, Kikwit, Kinshasa, Kisangani, Lubumbashi, Matadi, Mbandaka, and Mbuji-Mayi. All drug samples were successively analyzed by visual inspection, thin-layer chromatography (TLC), and high-performance liquid chromatography (HPLC) following The International Pharmacopoeia. Out of the 150 collected drug samples, three (2%) failed the visual inspection as they had shelf lives different from those of other samples with the same brand name. Four samples (2.7%) did not pass the TLC test as they contained only one or even none of the two declared active pharmaceutical ingredients (APIs). HPLC assays showed that 46 (30.7%) samples had artemether contents below 90% and 17 (11.3%) above 110% of the content claimed on the label. For lumefantrine, 32 (21.7%) samples had contents below 90%, and eight (5.3%) had contents above 110%. This survey in the DRC gives evidence that poor-quality antimalarial medicines are widely present. Based on three detection techniques, the study shows the necessity to equip developing countries with modern techniques such as HPLC, which, if combined with affordable techniques like TLC, could provide a pertinent analytical strategy to combat drug counterfeiting and poor manufacturing. This article is protected by copyright. All rights reserved.

  4. Detection of Wuchereria bancrofti DNA in paired serum and urine samples using polymerase chain reaction-based systems.

    PubMed

    Ximenes, Camila; Brandão, Eduardo; Oliveira, Paula; Rocha, Abraham; Rego, Tamisa; Medeiros, Rafael; Aguiar-Santos, Ana; Ferraz, João; Reis, Christian; Araujo, Paulo; Carvalho, Luiz; Melo, Fabio L

    2014-12-01

    The Global Program for the Elimination of Lymphatic Filariasis (GPELF) aims to eliminate this disease by the year 2020. However, the development of more specific and sensitive tests is important for the success of the GPELF. The present study aimed to standardise polymerase chain reaction (PCR)-based systems for the diagnosis of filariasis in serum and urine. Twenty paired biological urine and serum samples from individuals already known to be positive for Wuchereria bancrofti were collected during the day. Conventional PCR and semi-nested PCR assays were optimised. The detection limit of the technique for purified W. bancrofti DNA extracted from adult worms was 10 fg for the internal systems (WbF/Wb2) and 0.1 fg by using semi-nested PCR. The specificity of the primers was confirmed experimentally by amplification of 1 ng of purified genomic DNA from other species of parasites. Evaluation of the paired urine and serum samples by the semi-nested PCR technique indicated only two of the 20 tested individuals were positive, whereas the simple internal PCR system (WbF/Wb2), which has highly promising performance, revealed that all the patients were positive using both samples. This study successfully demonstrated the possibility of using the PCR technique on urine for the diagnosis of W. bancrofti infection.

  5. Estimation of rumen outflow in dairy cows fed grass silage-based diets by use of reticular sampling as an alternative to sampling from the omasal canal

    USDA-ARS?s Scientific Manuscript database

    A study was conducted to compare nutrient flows determined by a reticular sampling technique with those made by sampling of digesta from the omasal canal. Six lactating dairy cows fitted with ruminal cannulas were used in a design with a 3 x 2 factorial arrangement of treatments and 4 periods. Trea...

  6. Analysis of creative mathematic thinking ability in problem based learning model based on self-regulation learning

    NASA Astrophysics Data System (ADS)

    Munahefi, D. N.; Waluya, S. B.; Rochmad

    2018-03-01

    The purpose of this research was to identify the effectiveness of a Problem Based Learning (PBL) model based on Self-Regulated Learning (SRL) for mathematical creative thinking ability, and to analyze the mathematical creative thinking ability of high school students in solving mathematical problems. The population of this study was grade X students of SMA N 3 Klaten. The research method was sequential explanatory. The quantitative stage used a simple random sampling technique in which two classes were selected at random: the experimental class was taught with the SRL-based PBL model and the control class with an expository model. Sample selection in the qualitative stage used a non-probability sampling technique in which three students each were selected from the high, medium, and low academic levels. The SRL-based PBL model was effective for students' mathematical creative thinking ability. Low-academic-level students taught with the SRL-based PBL model achieved the fluency and flexibility aspects. Medium-academic-level students achieved the fluency and flexibility aspects well, but their originality was not yet well developed. High-academic-level students could reach the originality aspect.

  7. Use of randomized sampling for analysis of metabolic networks.

    PubMed

    Schellenberger, Jan; Palsson, Bernhard Ø

    2009-02-27

    Genome-scale metabolic network reconstructions in microorganisms have been formulated and studied for about 8 years. The constraint-based approach has shown great promise in analyzing the systemic properties of these network reconstructions. Notably, constraint-based models have been used successfully to predict the phenotypic effects of knock-outs and for metabolic engineering. The inherent uncertainty in both parameters and variables of large-scale models is significant and is well suited to study by Monte Carlo sampling of the solution space. These techniques have been applied extensively to the reaction rate (flux) space of networks, with more recent work focusing on dynamic/kinetic properties. Monte Carlo sampling as an analysis tool has many advantages, including the ability to work with missing data, the ability to apply post-processing techniques, and the ability to quantify uncertainty and to optimize experiments to reduce uncertainty. We present an overview of this emerging area of research in systems biology.
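
    Hit-and-run is one of the Monte Carlo samplers commonly applied to the flux polytope {v : Sv = 0, lb ≤ v ≤ ub}. The sketch below runs it on a toy one-metabolite network whose feasible set, after parameterizing the steady-state constraint, reduces to a 2-D triangle; the network and bounds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy steady-state flux space: v_in = v_a + v_b with 0 <= v_a, v_b and
# v_a + v_b <= 10. In the (v_a, v_b) plane this is the polytope G p <= h.
G = np.array([[-1.0, 0.0],
              [0.0, -1.0],
              [1.0, 1.0]])
h = np.array([0.0, 0.0, 10.0])

def chord(p, d):
    """Feasible interval [lo, hi] of t such that G @ (p + t*d) <= h."""
    lo, hi = -np.inf, np.inf
    for g, c in zip(G, h):
        gd, slack = g @ d, c - g @ p
        if gd > 1e-12:
            hi = min(hi, slack / gd)
        elif gd < -1e-12:
            lo = max(lo, slack / gd)
    return lo, hi

def hit_and_run(p, n_samples, burn=200):
    """Uniform samples from the polytope via the hit-and-run random walk."""
    out = []
    for i in range(n_samples + burn):
        d = rng.standard_normal(2)
        d /= np.linalg.norm(d)                 # random direction
        lo, hi = chord(p, d)                   # feasible chord through p
        p = p + rng.uniform(lo, hi) * d        # jump uniformly along it
        if i >= burn:
            out.append(p.copy())
    return np.array(out)

samples = hit_and_run(np.array([1.0, 1.0]), 2000)
print(samples.mean(axis=0))   # ≈ the triangle centroid (10/3, 10/3)
```

Genome-scale models replace the toy triangle with a high-dimensional polytope defined by the stoichiometric matrix, but the walk itself is the same.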

  8. Quantitative elemental analysis of an industrial mineral talc, using accelerator-based analytical technique

    NASA Astrophysics Data System (ADS)

    Olabanji, S. O.; Ige, A. O.; Mazzoli, C.; Ceccato, D.; Ajayi, E. O. B.; De Poli, M.; Moschini, G.

    2005-10-01

    The accelerator-based PIXE technique was employed to determine the elemental concentrations of an industrial mineral, talc. Talc is a very versatile mineral in industry, with several applications. Because of this, there is a need to know its constituents to ensure that workers are not exposed to health risks. Moreover, microscopic tests on some talc samples in Nigeria confirm that they fall within the British Pharmacopoeia (BP) standard for tablet formation. However, for these samples to become a local source of raw material for pharmaceutical-grade talc, the precise elemental compositions should be established, which is the focus of this work. The proton beam produced by the 2.5 MV AN 2000 Van de Graaff accelerator at INFN, LNL, Legnaro, Padova, Italy was used for the PIXE measurements. The results, which show the concentrations of different elements in the talc samples, their health implications and metabolic roles, are presented and discussed.

  9. The application of compressive sampling in rapid ultrasonic computerized tomography (UCT) technique of steel tube slab (STS)

    PubMed Central

    Jiang, Baofeng; Jia, Pengjiao; Zhao, Wen; Wang, Wentao

    2018-01-01

This paper explores a new method for rapid structural damage inspection of steel tube slab (STS) structures along randomly measured paths, based on a combination of compressive sampling (CS) and ultrasonic computerized tomography (UCT). In the measurement stage, using fewer randomly selected paths rather than the whole measurement net is proposed to detect the underlying damage of a concrete-filled steel tube. In the imaging stage, the ℓ1-minimization algorithm is employed to recover the information of the microstructures from the measurement data related to the internal state of the STS structure. A numerical concrete tube model, with various levels of damage, was studied to demonstrate the performance of the rapid UCT technique. Real-world concrete-filled steel tubes in the Shenyang Metro stations were then inspected using the proposed UCT technique in a CS framework. Both the numerical and experimental results show that the rapid UCT technique is capable of damage detection in an STS structure with a high level of accuracy and fewer required measurements, making it more convenient and efficient than the traditional UCT technique. PMID:29293593
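The imaging-stage idea can be sketched in a few lines; the example below uses plain ISTA as a generic ℓ1 solver (not the authors' implementation), with an invented sparse "damage" vector and a random sensing matrix standing in for the randomly selected measurement paths:

```python
import numpy as np

# Toy compressive-sampling recovery: a k-sparse "damage map" x is observed
# through m < n random measurements y = Phi x and reconstructed by
# l1-regularized least squares via ISTA. All names and sizes are illustrative.
rng = np.random.default_rng(1)
n, m, k = 64, 32, 3
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.uniform(1.0, 2.0, k)
Phi = rng.standard_normal((m, n)) / np.sqrt(m)    # random sensing matrix
y = Phi @ x_true                                   # compressed measurements

lam = 0.01
L = np.linalg.norm(Phi, 2) ** 2                    # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):
    g = x + Phi.T @ (y - Phi @ x) / L              # gradient step
    x = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)  # soft threshold

support = set(np.flatnonzero(np.abs(x) > 0.1))
print(support == set(np.flatnonzero(x_true)))      # damage locations recovered
```

With m = 32 measurements instead of the full n = 64, the three "damaged" entries are still located correctly, which is the essence of the fewer-paths argument in the abstract.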

  10. Photon event distribution sampling: an image formation technique for scanning microscopes that permits tracking of sub-diffraction particles with high spatial and temporal resolutions.

    PubMed

    Larkin, J D; Publicover, N G; Sutko, J L

    2011-01-01

    In photon event distribution sampling, an image formation technique for scanning microscopes, the maximum likelihood position of origin of each detected photon is acquired as a data set rather than binning photons in pixels. Subsequently, an intensity-related probability density function describing the uncertainty associated with the photon position measurement is applied to each position and individual photon intensity distributions are summed to form an image. Compared to pixel-based images, photon event distribution sampling images exhibit increased signal-to-noise and comparable spatial resolution. Photon event distribution sampling is superior to pixel-based image formation in recognizing the presence of structured (non-random) photon distributions at low photon counts and permits use of non-raster scanning patterns. A photon event distribution sampling based method for localizing single particles derived from a multi-variate normal distribution is more precise than statistical (Gaussian) fitting to pixel-based images. Using the multi-variate normal distribution method, non-raster scanning and a typical confocal microscope, localizations with 8 nm precision were achieved at 10 ms sampling rates with acquisition of ~200 photons per frame. Single nanometre precision was obtained with a greater number of photons per frame. In summary, photon event distribution sampling provides an efficient way to form images when low numbers of photons are involved and permits particle tracking with confocal point-scanning microscopes with nanometre precision deep within specimens. © 2010 The Authors Journal of Microscopy © 2010 The Royal Microscopical Society.
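A minimal sketch of the two ideas in the abstract, using invented numbers (the uncertainty width, grid, and photon count are illustrative assumptions, not the paper's values): the image is formed by summing one Gaussian probability density per detected photon, and the particle is localized as the mean of the photon positions, which is the maximum-likelihood estimate for an isotropic Gaussian:

```python
import numpy as np

# Each detected photon contributes a small Gaussian "uncertainty cloud" at its
# estimated position of origin; the image is the sum of those clouds.
rng = np.random.default_rng(2)
true_pos = np.array([5.2, 4.7])                    # particle position (arbitrary units)
photons = true_pos + 0.5 * rng.standard_normal((200, 2))  # ~200 photons per frame

sigma = 0.5                                        # per-photon position uncertainty
gx, gy = np.meshgrid(np.linspace(0, 10, 101), np.linspace(0, 10, 101))
image = np.zeros_like(gx)
for px, py in photons:                             # sum one Gaussian per photon
    image += np.exp(-((gx - px) ** 2 + (gy - py) ** 2) / (2 * sigma ** 2))

# ML localization for an isotropic Gaussian is the mean of the photon
# positions -- no pixel binning or Gaussian curve fit required.
estimate = photons.mean(axis=0)
print(np.linalg.norm(estimate - true_pos) < 0.15)
```

Note that the localization step never touches `image`: the precision comes from the raw photon positions, which is why the method outperforms fitting to a pixel-binned image at low photon counts.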

  11. Surface profilometry using the incoherent self-imaging technique in reflection mode

    NASA Astrophysics Data System (ADS)

    Hassani, Khosrow; Nahal, Arashmid; Tirandazi, Negin

    2018-01-01

In this paper, we introduce a highly sensitive and cost-effective surface profilometry technique based on the Lau self-imaging phenomenon in reflection mode, combined with the Moiré technique. Standard incoherent grating imaging with two Ronchi rulings is deployed to produce localized Fresnel pseudoimages, except that the light wavefront gets modulated after reflecting off the surface under test and before the final image forms. A third grating is superimposed on the pseudoimage to take advantage of the magnification property of the Moiré fringes and enhance the surface-induced modulations. A five-step phase-shifting technique is used to extract the 2D surface profile of the sample from the recorded Moiré patterns. To demonstrate our technique, we measure the profile of a 250 nm step-like metallic sample. The results show uncertainties of a few nanometers, very good reproducibility, and agreement with other established optical and mechanical surface profilometry methods.
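The five-step phase extraction can be sketched as follows; the fringe model, modulation depth, and phase profile below are illustrative assumptions (the standard five-frame formula with π/2 steps, not necessarily the authors' exact algorithm):

```python
import numpy as np

# Synthesize five fringe patterns with phase steps of pi/2 and recover the
# surface-induced phase with the standard five-frame arctangent formula.
x = np.linspace(0, 1, 200)
phi = 1.2 * np.sin(2 * np.pi * x)          # "surface" phase to recover
I0, gamma = 1.0, 0.8                       # background and modulation depth
frames = [I0 * (1 + gamma * np.cos(phi + (k - 3) * np.pi / 2))
          for k in range(1, 6)]
I1, I2, I3, I4, I5 = frames

# tan(phi) = 2 (I2 - I4) / (2 I3 - I1 - I5)
phi_rec = np.arctan2(2 * (I2 - I4), 2 * I3 - I1 - I5)
print(np.max(np.abs(phi_rec - phi)) < 1e-9)   # exact for |phi| < pi
```

The arctangent form cancels both the background intensity I0 and the modulation γ, which is why phase-shifting profilometry is insensitive to uneven illumination.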

  12. Hybrid real-code ant colony optimisation for constrained mechanical design

    NASA Astrophysics Data System (ADS)

    Pholdee, Nantiwat; Bureerat, Sujin

    2016-01-01

    This paper proposes a hybrid meta-heuristic based on integrating a local search simplex downhill (SDH) method into the search procedure of real-code ant colony optimisation (ACOR). This hybridisation leads to five hybrid algorithms where a Monte Carlo technique, a Latin hypercube sampling technique (LHS) and a translational propagation Latin hypercube design (TPLHD) algorithm are used to generate an initial population. Also, two numerical schemes for selecting an initial simplex are investigated. The original ACOR and its hybrid versions along with a variety of established meta-heuristics are implemented to solve 17 constrained test problems where a fuzzy set theory penalty function technique is used to handle design constraints. The comparative results show that the hybrid algorithms are the top performers. Using the TPLHD technique gives better results than the other sampling techniques. The hybrid optimisers are a powerful design tool for constrained mechanical design problems.
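As a minimal sketch of one of the initialization schemes named above, the snippet below generates a Latin hypercube sample for an initial population (population size and dimension are invented; the TPLHD variant used in the paper additionally arranges the strata deterministically):

```python
import numpy as np

# Latin hypercube sampling: each of the n_pop strata of every design
# variable is sampled exactly once, giving better coverage than pure
# Monte Carlo for the same population size.
def latin_hypercube(n_pop, n_var, rng):
    u = rng.uniform(size=(n_pop, n_var))             # jitter inside each stratum
    strata = np.array([rng.permutation(n_pop) for _ in range(n_var)]).T
    return (strata + u) / n_pop                      # points in [0, 1)^n_var

rng = np.random.default_rng(3)
pop = latin_hypercube(20, 5, rng)
# Each column hits every stratum [i/20, (i+1)/20) exactly once:
print(all(sorted(np.floor(pop[:, j] * 20).astype(int)) == list(range(20))
          for j in range(5)))
```

The returned unit-cube points would then be scaled to the design-variable bounds before seeding the optimiser.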

  13. Estimation variance bounds of importance sampling simulations in digital communication systems

    NASA Technical Reports Server (NTRS)

    Lu, D.; Yao, K.

    1991-01-01

    In practical applications of importance sampling (IS) simulation, two basic problems are encountered, that of determining the estimation variance and that of evaluating the proper IS parameters needed in the simulations. The authors derive new upper and lower bounds on the estimation variance which are applicable to IS techniques. The upper bound is simple to evaluate and may be minimized by the proper selection of the IS parameter. Thus, lower and upper bounds on the improvement ratio of various IS techniques relative to the direct Monte Carlo simulation are also available. These bounds are shown to be useful and computationally simple to obtain. Based on the proposed technique, one can readily find practical suboptimum IS parameters. Numerical results indicate that these bounding techniques are useful for IS simulations of linear and nonlinear communication systems with intersymbol interference in which bit error rate and IS estimation variances cannot be obtained readily using prior techniques.
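The basic IS mechanism the bounds apply to can be illustrated with a Gaussian tail probability standing in for a bit error rate; the mean-shifted proposal below is a common near-optimal choice for this toy case, not the authors' parameter-selection procedure:

```python
import math
import numpy as np

# Importance sampling of P(X > t) for X ~ N(0,1): draw from the shifted
# density N(t,1) and weight each sample by the likelihood ratio f(x)/g(x).
rng = np.random.default_rng(4)
t, n = 4.0, 100_000
x = rng.standard_normal(n) + t                  # samples from the proposal N(t, 1)
w = np.exp(-t * x + t * t / 2)                  # N(0,1)/N(t,1) likelihood ratio
p_is = np.mean((x > t) * w)

p_true = 0.5 * math.erfc(t / math.sqrt(2))      # exact Gaussian tail, ~3.2e-5
print(abs(p_is - p_true) / p_true < 0.05)
```

A direct Monte Carlo estimate with the same n would see only a handful of threshold crossings; the shift parameter t plays the role of the "IS parameter" whose choice the variance bounds in the abstract are designed to guide.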

  14. Properties of hypothesis testing techniques and (Bayesian) model selection for exploration-based and theory-based (order-restricted) hypotheses.

    PubMed

    Kuiper, Rebecca M; Nederhoff, Tim; Klugkist, Irene

    2015-05-01

    In this paper, the performance of six types of techniques for comparisons of means is examined. These six emerge from the distinction between the method employed (hypothesis testing, model selection using information criteria, or Bayesian model selection) and the set of hypotheses that is investigated (a classical, exploration-based set of hypotheses containing equality constraints on the means, or a theory-based limited set of hypotheses with equality and/or order restrictions). A simulation study is conducted to examine the performance of these techniques. We demonstrate that, if one has specific, a priori specified hypotheses, confirmation (i.e., investigating theory-based hypotheses) has advantages over exploration (i.e., examining all possible equality-constrained hypotheses). Furthermore, examining reasonable order-restricted hypotheses has more power to detect the true effect/non-null hypothesis than evaluating only equality restrictions. Additionally, when investigating more than one theory-based hypothesis, model selection is preferred over hypothesis testing. Because of the first two results, we further examine the techniques that are able to evaluate order restrictions in a confirmatory fashion by examining their performance when the homogeneity of variance assumption is violated. Results show that the techniques are robust to heterogeneity when the sample sizes are equal. When the sample sizes are unequal, the performance is affected by heterogeneity. The size and direction of the deviations from the baseline, where there is no heterogeneity, depend on the effect size (of the means) and on the trend in the group variances with respect to the ordering of the group sizes. Importantly, the deviations are less pronounced when the group variances and sizes exhibit the same trend (e.g., are both increasing with group number). © 2014 The British Psychological Society.

  15. Parametric, bootstrap, and jackknife variance estimators for the k-Nearest Neighbors technique with illustrations using forest inventory and satellite image data

    Treesearch

    Ronald E. McRoberts; Steen Magnussen; Erkki O. Tomppo; Gherardo Chirici

    2011-01-01

    Nearest neighbors techniques have been shown to be useful for estimating forest attributes, particularly when used with forest inventory and satellite image data. Published reports of positive results have been truly international in scope. However, for these techniques to be more useful, they must be able to contribute to scientific inference which, for sample-based...

  16. Nano-Al Based Energetics: Rapid Heating Studies and a New Preparation Technique

    NASA Astrophysics Data System (ADS)

    Sullivan, Kyle; Kuntz, Josh; Gash, Alex; Zachariah, Michael

    2011-06-01

Nano-Al based thermites have become an attractive alternative to traditional energetic formulations due to their increased energy density and high reactivity. Understanding the intrinsic reaction mechanism has been a difficult task, largely due to the lack of experimental techniques capable of rapidly and uniformly heating a sample (~10⁴-10⁸ K/s). The current work presents several studies on nano-Al based thermites using rapid heating techniques. A new mechanism, termed a Reactive Sintering Mechanism, is proposed for nano-Al based thermites. In addition, new experimental techniques for nanocomposite thermite deposition onto thin Pt electrodes will be discussed. This combined technique will offer more precise control of the deposition and will serve to further our understanding of the intrinsic reaction mechanism of rapidly heated energetic systems. An improved mechanistic understanding will lead to the development of optimized formulations and architectures. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  17. Ultra-high dynamic range electro-optic sampling for detecting millimeter and sub-millimeter radiation

    PubMed Central

    Ibrahim, Akram; Férachou, Denis; Sharma, Gargi; Singh, Kanwarpal; Kirouac-Turmel, Marie; Ozaki, Tsuneyuki

    2016-01-01

Time-domain spectroscopy using coherent millimeter and sub-millimeter radiation (also known as terahertz radiation) is rapidly expanding its application, owing greatly to the remarkable advances in generating and detecting such radiation. However, many current techniques for coherent terahertz detection have limited dynamic range, thus making it difficult to perform some basic experiments that need to directly compare strong and weak terahertz signals. Here, we propose and demonstrate a novel technique based on cross-polarized spectral-domain interferometry to achieve ultra-high dynamic range electro-optic sampling measurement of coherent millimeter and sub-millimeter radiation. In our scheme, we exploit the birefringence in a single-mode polarization maintaining fiber in order to measure the phase change induced by the electric field of terahertz radiation in the detection crystal. With our new technique, we have achieved a dynamic range of 7 × 10⁶, which is 4 orders of magnitude higher than conventional electro-optic sampling techniques, while maintaining comparable signal-to-noise ratio. The present technique is foreseen to have great impact on experiments such as linear terahertz spectroscopy of optically thick materials (such as aqueous samples) and nonlinear terahertz spectroscopy, where the higher dynamic range is crucial for proper interpretation of experimentally obtained results. PMID:26976363

  18. Ultra-high dynamic range electro-optic sampling for detecting millimeter and sub-millimeter radiation.

    PubMed

    Ibrahim, Akram; Férachou, Denis; Sharma, Gargi; Singh, Kanwarpal; Kirouac-Turmel, Marie; Ozaki, Tsuneyuki

    2016-03-15

Time-domain spectroscopy using coherent millimeter and sub-millimeter radiation (also known as terahertz radiation) is rapidly expanding its application, owing greatly to the remarkable advances in generating and detecting such radiation. However, many current techniques for coherent terahertz detection have limited dynamic range, thus making it difficult to perform some basic experiments that need to directly compare strong and weak terahertz signals. Here, we propose and demonstrate a novel technique based on cross-polarized spectral-domain interferometry to achieve ultra-high dynamic range electro-optic sampling measurement of coherent millimeter and sub-millimeter radiation. In our scheme, we exploit the birefringence in a single-mode polarization maintaining fiber in order to measure the phase change induced by the electric field of terahertz radiation in the detection crystal. With our new technique, we have achieved a dynamic range of 7 × 10⁶, which is 4 orders of magnitude higher than conventional electro-optic sampling techniques, while maintaining comparable signal-to-noise ratio. The present technique is foreseen to have great impact on experiments such as linear terahertz spectroscopy of optically thick materials (such as aqueous samples) and nonlinear terahertz spectroscopy, where the higher dynamic range is crucial for proper interpretation of experimentally obtained results.

  19. Ag-doped Lithium alumino silicate photostructurable glass for microdevice fabrication

    NASA Astrophysics Data System (ADS)

    Mishra, Richa; Goswami, Madhumita; Krishnan, Madangopal

    2018-04-01

Ag-doped LAS glasses of composition (wt.%) 74SiO2-6Al2O3-15Li2O-5X (X = other additives) were prepared by the melt-quench technique and characterized for thermal and optical properties using DTA and UV-Visible spectrometry. The XRD technique was used for phase identification in the heat-treated glasses. Glass samples were exposed to UV light to convert Ce3+ to the Ce4+ state and Ag+ into the metallic Ag0 state. DTA shows a lower crystallization temperature (Tp) of around 605°C for exposed samples, compared to around 625°C for the unexposed base glass. The UV-Visible spectra show a broad band at around 305 nm, indicating Ce3+ in the base glass, whereas the reduced peak intensity in the UV-exposed sample indicates conversion of Ce3+ to Ce4+ ions, which also confirms the formation of Ag0 in the glass samples. Ag agglomeration was also confirmed from the band position at 430 nm in the heat-treated sample and was found responsible for the early growth of the meta-silicate phase in the exposed sample. The meta-silicate phase was selectively etched for the fabrication of micro-devices.

  20. Laboratory-based x-ray phase-contrast tomography enables 3D virtual histology

    NASA Astrophysics Data System (ADS)

    Töpperwien, Mareike; Krenkel, Martin; Quade, Felix; Salditt, Tim

    2016-09-01

    Due to the large penetration depth and small wavelength hard x-rays offer a unique potential for 3D biomedical and biological imaging, combining capabilities of high resolution and large sample volume. However, in classical absorption-based computed tomography, soft tissue only shows a weak contrast, limiting the actual resolution. With the advent of phase-contrast methods, the much stronger phase shift induced by the sample can now be exploited. For high resolution, free space propagation behind the sample is particularly well suited to make the phase shift visible. Contrast formation is based on the self-interference of the transmitted beam, resulting in object-induced intensity modulations in the detector plane. As this method requires a sufficiently high degree of spatial coherence, it was since long perceived as a synchrotron-based imaging technique. In this contribution we show that by combination of high brightness liquid-metal jet microfocus sources and suitable sample preparation techniques, as well as optimized geometry, detection and phase retrieval, excellent three-dimensional image quality can be obtained, revealing the anatomy of a cobweb spider in high detail. This opens up new opportunities for 3D virtual histology of small organisms. Importantly, the image quality is finally augmented to a level accessible to automatic 3D segmentation.

  1. Neutron Bragg-edge-imaging for strain mapping under in situ tensile loading

    NASA Astrophysics Data System (ADS)

    Woracek, R.; Penumadu, D.; Kardjilov, N.; Hilger, A.; Strobl, M.; Wimpory, R. C.; Manke, I.; Banhart, J.

    2011-05-01

Wavelength selective neutron radiography at a cold neutron reactor source was used to measure strain and determine (residual) stresses in a steel sample under plane stress conditions. We present a new technique that uses an energy-resolved neutron imaging system based on a double crystal monochromator and is equipped with a specially developed (in situ) biaxial load frame to perform Bragg edge based transmission imaging. The neutron imaging technique provides a viewing area of 7 cm by 7 cm with a spatial resolution on the order of ~100 μm. The stress-induced shifts of the Bragg edge corresponding to the (110) lattice plane were resolved spatially for a ferritic steel alloy A36 (ASTM international) sample. Furthermore it is demonstrated that results agree with comparative data obtained using neutron diffraction and resistance based strain-gauge rosettes.
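The conversion from an edge shift to strain is simple arithmetic, sketched below per pixel; the measured edge position is a hypothetical number chosen only to illustrate the calculation (the unstrained (110) Bragg edge of ferritic iron sits near 2d ≈ 4.05 Å):

```python
# Strain from a Bragg-edge shift: the edge wavelength equals twice the
# lattice spacing (lambda = 2d at grazing exit), so a measured edge shift
# maps directly to elastic strain in the beam direction.
lambda_0 = 4.05    # unstrained (110) Bragg edge of alpha-iron, angstroms
lambda_m = 4.0541  # edge position measured in a loaded pixel (hypothetical)

strain = (lambda_m - lambda_0) / lambda_0    # dimensionless elastic strain
print(round(strain * 1e6))                   # in microstrain
```

Repeating this per pixel over the 7 cm × 7 cm field of view yields the 2-D strain map described in the abstract.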

  2. An Object-Based Requirements Modeling Method.

    ERIC Educational Resources Information Center

    Cordes, David W.; Carver, Doris L.

    1992-01-01

    Discusses system modeling and specification as it relates to object-based information systems development and software development. An automated system model based on the objects in the initial requirements document is described, the requirements document translator is explained, and a sample application of the technique is provided. (12…

  3. Large antenna experiments aboard the space shuttle: Application of nonuniform sampling techniques

    NASA Technical Reports Server (NTRS)

    Rahmatsamii, Y.

    1988-01-01

    Future satellite communication and scientific spacecraft will utilize antennas with dimensions as large as 20 meters. In order to commercially use these large, low sidelobe and multiple beam antennas, a high level of confidence must be established as to their performance in the 0-g and space environment. Furthermore, it will be desirable to demonstrate the applicability of surface compensation techniques for slowly varying surface distortions which could result from thermal effects. An overview of recent advances in performing RF measurements on large antennas is presented with emphasis given to the application of a space based far-field range utilizing the Space Shuttle and the concept of a newly developed nonuniform sampling technique.

  4. Investigation of the tone-burst tube for duct lining attenuation measurement

    NASA Technical Reports Server (NTRS)

    Soffel, A. R.; Morrow, P. F.

    1972-01-01

The tone-burst technique makes practical the laboratory evaluation of potential inlet and discharge duct treatments. The tone-burst apparatus requires only simple machined parts and standard components. Small, simply made lining samples are quickly and easily installed in the system. Two small electromagnetic loudspeaker drivers produce peak sound pressure levels of over 166 dB in the 3-square-inch sample duct. An air pump available in most laboratories can produce air flows of over plus and minus Mach 0.3 in the sample duct. The technique uses short shaped pulses of sound propagated down a progressive wave tube containing the sample duct. The peak pressure level output of the treated duct is compared with that of a substituted reference duct; the difference between the levels is the attenuation, or insertion loss, of the treated duct. Evaluations of resonant absorber linings by the tone-burst technique agree with attenuation values predicted by empirical formulas based on full-scale ducts.

  5. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions, which can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short burn-in period in which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning. This conditioning process can be viewed as an optimization approach in which each new sample is proposed based on the sampling history to improve the data-mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
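The feedback loop described above can be caricatured in one dimension; everything below (field size, acceptance threshold, the mismatch function standing in for a flow simulator, the clipping of the map) is an invented toy, intended only to show the accept/update/propose cycle:

```python
import numpy as np

# Toy adaptive conditioning: unconditional binary facies fields are accepted
# when their mismatch with "observed" data is small, and accepted samples are
# averaged into a facies probability map that biases later proposals.
rng = np.random.default_rng(5)
n = 25
truth = (np.arange(n) % 5 < 2).astype(float)     # "true" facies (channel bands)

def mismatch(sample):                            # stand-in for a flow simulator
    return np.mean(np.abs(sample - truth))

prob_map = np.full(n, 0.5)                       # burn-in: uninformative map
accepted = []
for it in range(3000):
    proposal = (rng.uniform(size=n) < prob_map).astype(float)
    if mismatch(proposal) < 0.3:                 # acceptance/rejection test
        accepted.append(proposal)
        prob_map = np.mean(accepted, axis=0)     # update map from history
        prob_map = np.clip(prob_map, 0.05, 0.95) # keep proposals exploratory

print(len(accepted) > 0)
```

Once a few samples are accepted, the map concentrates near their common features and the acceptance rate climbs, which is the efficiency gain the abstract claims over non-adaptive rejection.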

  6. The application of novel nano-thermal and imaging techniques for monitoring drug microstructure and distribution within PLGA microspheres.

    PubMed

    Yang, Fan; Chen, De; Guo, Zhe-Fei; Zhang, Yong-Ming; Liu, Yi; Askin, Sean; Craig, Duncan Q M; Zhao, Min

    2017-04-30

Poly (d,l-lactic-co-glycolic) acid (PLGA) based microspheres have been extensively used as controlled drug release systems. However, the burst effect has been a persistent issue associated with such systems, especially for those prepared by the double emulsion technique. An effective approach to preventing the burst effect and achieving a more ideal drug release profile is to improve the drug distribution within the polymeric matrix. Therefore, it is of great importance to establish a rapid and robust tool for screening and optimizing the drug distribution during pre-formulation. Transition Temperature Microscopy (TTM), a novel nano-thermal and imaging technique, is an extension of nano-thermal analysis (nano-TA): a transition temperature is detected at a localized region of a sample and assigned a color from a temperature/color palette, and a series of nano-TA measurements across the surface of the sample yields a map coded by transition temperature. In this study, we investigate the feasibility of applying this technique, combined with other thermal, imaging and structural techniques, for monitoring the drug microstructure and spatial distribution within bovine serum albumin (BSA) loaded and nimodipine loaded PLGA microspheres, with a view to better predicting the in vitro drug release performance. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Fabrication of setup for high temperature thermal conductivity measurement.

    PubMed

    Patel, Ashutosh; Pandey, Sudhir K

    2017-01-01

In this work, we report the fabrication of an experimental setup for high-temperature thermal conductivity (κ) measurement. It can characterize samples with various dimensions and shapes. A steady-state axial heat flow technique is used for the κ measurement, and heat loss is measured using the parallel thermal conductance technique. A simple, lightweight, and compact sample holder is developed using a thin heater and a limited number of components. A low heat loss value is achieved by using a very low-thermal-conductivity insulator block with a small cross-sectional area. The power delivered to the heater is measured accurately using the 4-wire technique, for which the heater is fabricated with four wires. The setup is validated using Bi0.36Sb1.45Te3, polycrystalline bismuth, gadolinium, and alumina samples. The data obtained for these samples are found to be in good agreement with the reported data, with a maximum deviation of 6% in the value of κ, observed for the gadolinium sample. We also report the thermal conductivity of polycrystalline tellurium from 320 K to 550 K, where nonmonotonic behavior of κ with temperature is observed.
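The arithmetic behind a steady-state axial measurement of this kind is Fourier's law; the sketch below uses entirely hypothetical numbers, chosen only to illustrate the calculation (not values from the paper):

```python
# Steady-state axial heat flow: kappa = P * L / (A * dT), with the parasitic
# heat loss subtracted from the electrical heater power first.
P_heater = 0.50   # electrical power into the heater, W (4-wire measurement)
P_loss = 0.05     # parasitic loss from the parallel-conductance step, W
L = 0.010         # sample length along the heat flow, m
A = 2.5e-5        # sample cross-section, m^2
dT = 12.0         # temperature drop across the sample, K

kappa = (P_heater - P_loss) * L / (A * dT)   # W/(m K)
print(round(kappa, 2))                       # -> 15.0
```

The sensitivity of κ to the loss term is why the setup measures heat loss separately: with P_loss ignored, this example would read about 11% high.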

  8. The Life Cycle of Academic Management Fads. ASHE Annual Meeting Paper.

    ERIC Educational Resources Information Center

    Birnbaum, Robert

    This study reviewed the literature to trace the evolution and life cycles of seven management techniques related to higher education. The seven case studies involved analysis of a selected sample of periodical, monograph, and technical literature from 1960 to the present. The literature base on each management technique was reviewed in reference…

  9. Biotechnical use of polymerase chain reaction for microbiological analysis of biological samples.

    PubMed

    Lantz, P G; Abu al-Soud, W; Knutsson, R; Hahn-Hägerdal, B; Rådström, P

    2000-01-01

Since its introduction in the mid-1980s, polymerase chain reaction (PCR) technology has been recognised as a rapid, sensitive and specific molecular diagnostic tool for the analysis of micro-organisms in clinical, environmental and food samples. Although this technique can be extremely effective with pure solutions of nucleic acids, its sensitivity may be reduced dramatically when applied directly to biological samples. This review describes PCR technology as a microbial detection method, PCR inhibitors in biological samples, and various sample preparation techniques that can be used to facilitate PCR detection by separating the micro-organisms from PCR inhibitors and/or concentrating the micro-organisms to detectable concentrations. Parts of this review are updated and based on a doctoral thesis by Lantz [1] and on a review discussing methods to overcome PCR inhibition in foods [2].

  10. Ultrasonic attenuation - Q measurements on 70215,29. [lunar rock

    NASA Technical Reports Server (NTRS)

    Warren, N.; Trice, R.; Stephens, J.

    1974-01-01

Ultrasonic attenuation measurements have been made on an aluminum alloy, obsidian, and rock samples including lunar sample 70215,29. The measurement technique is based on a combination of the pulse transmission method and the forced resonance method, and is designed to explore the problem of experimentally defining the Q of a medium or sample in which mode conversion may occur. If modes are coupled, the measured attenuation is strongly dependent on the individual modes of vibration, and a range of Q-factors may be measured over various resonances or from various portions of a transient signal. On 70215,29, measurements were made over a period of a month while the sample outgassed in hard vacuum. During this period, the highest measured Q of the sample increased from a few hundred into the range of 1000-1300.

  11. Toward quantitative estimation of material properties with dynamic mode atomic force microscopy: a comparative study.

    PubMed

    Ghosal, Sayan; Gannepalli, Anil; Salapaka, Murti

    2017-08-11

    In this article, we explore methods that enable estimation of material properties with the dynamic mode atomic force microscopy suitable for soft matter investigation. The article presents the viewpoint of casting the system, comprising of a flexure probe interacting with the sample, as an equivalent cantilever system and compares a steady-state analysis based method with a recursive estimation technique for determining the parameters of the equivalent cantilever system in real time. The steady-state analysis of the equivalent cantilever model, which has been implicitly assumed in studies on material property determination, is validated analytically and experimentally. We show that the steady-state based technique yields results that quantitatively agree with the recursive method in the domain of its validity. The steady-state technique is considerably simpler to implement, however, slower compared to the recursive technique. The parameters of the equivalent system are utilized to interpret storage and dissipative properties of the sample. Finally, the article identifies key pitfalls that need to be avoided toward the quantitative estimation of material properties.

  12. Implementation of authentic assessment in the project based learning to improve student's concept mastering

    NASA Astrophysics Data System (ADS)

    Sambeka, Yana; Nahadi, Sriyati, Siti

    2017-05-01

The study aimed to obtain scientific information about the improvement of students' concept mastering in project-based learning that used authentic assessment. The research was conducted in May 2016 at a junior high school in Bandung in the 2015/2016 academic year. The research method was a weak experiment with a one-group pretest-posttest design. The sample, 24 students, was taken by the random cluster sampling technique. Data were collected through instruments, i.e., a written test, an observation sheet, and a questionnaire sheet. The concept mastering test yielded an N-gain of 0.236, in the low category. Based on the result of a paired-sample t-test, the implementation of authentic assessment in project-based learning increased students' concept mastering significantly (sig < 0.05).
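The N-gain reported above is Hake's normalized gain, the fraction of the possible improvement actually achieved between pretest and posttest; the scores in the example below are hypothetical (chosen to reproduce a gain of about 0.236), not the study's data:

```python
# Hake's normalized gain: (post - pre) / (max_score - pre).
def n_gain(pre, post, max_score=100.0):
    return (post - pre) / (max_score - pre)

print(round(n_gain(pre=40.0, post=54.16), 3))  # -> 0.236, "low" category (< 0.3)
```

The conventional bands are low (< 0.3), medium (0.3-0.7), and high (> 0.7), which is why 0.236 is classed as low despite the statistically significant t-test.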

  13. Forensic analysis of explosives using isotope ratio mass spectrometry (IRMS)--preliminary study on TATP and PETN.

    PubMed

    Benson, Sarah J; Lennard, Christopher J; Maynard, Philip; Hill, David M; Andrew, Anita S; Roux, Claude

    2009-06-01

The application of isotopic techniques to investigations requiring the provision of evidence to a court is limited. The objective of this research was to investigate the application of light stable isotopes and isotope ratio mass spectrometry (IRMS) to solve complex forensic cases by providing a level of discrimination not achievable utilising traditional forensic techniques. Due to the current threat of organic peroxide explosives, such as triacetone triperoxide (TATP), research was undertaken to determine the potential of IRMS to differentiate samples of TATP that had been manufactured utilising different starting materials and/or manufacturing processes. In addition, due to the prevalence of pentaerythritol tetranitrate (PETN) in detonators, detonating cord, and boosters, the potential of the IRMS technique to differentiate PETN samples from different sources was also investigated. Carbon isotope values were measured in fourteen TATP samples, with three definite groups appearing in the initial sample set based on the carbon data alone. Four additional TATP samples (in a second set of samples) were distinguishable utilising the carbon and hydrogen isotopic compositions individually, and also in combination with the oxygen isotope values. The 3D plot of the carbon, oxygen and hydrogen data demonstrated the clear discrimination of the four samples of TATP. The carbon and nitrogen isotope values measured from fifteen PETN samples allowed samples from different sources to be readily discriminated. This paper demonstrates the successful application of IRMS to the analysis of explosives of forensic interest to assist in discriminating samples from different sources. This research represents a preliminary evaluation of the IRMS technique for the measurement of stable isotope values in TATP and PETN samples, and supports the dedication of resources for a full evaluation of this application in order to achieve court-reportable IRMS results.

  14. Rapid assessment of insect fauna based on local knowledge: comparing ecological and ethnobiological methods.

    PubMed

    Lima, Daniele Cristina de Oliveira; Ramos, Marcelo Alves; da Silva, Henrique Costa Hermenegildo; Alves, Angelo Giuseppe Chaves

    2016-03-01

The rapid assessment of biodiversity making use of surveys of local knowledge has been successful for different biological taxa. However, there are no reports on the testing of such tools for sampling insect fauna. The present study aimed to evaluate the efficiency of different ethnobiological techniques for rapid sampling of insect fauna. Field research for the conventional survey of insect fauna was conducted on a private farm (9°43'38.95"S, 37°45'11.97"W), where there was intensive cultivation of okra (Abelmoschus esculentus L. (Moench)). The survey of local entomological knowledge was conducted among all the producers of okra living in the rural villages Pereira, Santa Luzia, and Nassau de Souza, within the Jacaré Curituba irrigated settlement scheme. The combined use of the techniques "free list" and projective interviews was analyzed, using two types of visual stimuli: stock photos and an entomological box. During the conventional survey of insect fauna, the species Bemisia tabaci biotype B, Aphis gossypii, Phenacoccus sp., Icerya purchasi and Lagria villosa were the primary pests found in the okra crop. Regarding the survey of insect pests, the results were convergent in both techniques (conventional sampling and free list). Comparing the interview with visual stimuli (pictures) and specimen witnesses (entomological box) revealed that the latter was more effective. Techniques based on the recording and analysis of local knowledge about insects are effective for quick sampling of pest insects, but ineffective in sampling predator insects. The utilization of collected insects, infested branches, or photos of the symptoms of damage caused by pests in projective interviews is recommended.

  15. New method for estimating bacterial cell abundances in natural samples by use of sublimation

    NASA Technical Reports Server (NTRS)

    Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.

    2004-01-01

We have developed a new method, based on the sublimation of adenine from Escherichia coli, to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples, including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert, were heated to a temperature of 500 degrees C for several seconds under reduced pressure. The sublimate was collected on a cold finger, and the amount of adenine released from the samples was then determined by high-performance liquid chromatography with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10^5 to 10^9 E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI (4',6-diamidino-2-phenylindole) staining.
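The final conversion from recovered adenine to cell equivalents is simple arithmetic; a hedged sketch follows, in which both the per-cell adenine content and the recovered amount are illustrative assumptions, not the paper's values.

```python
# Back-of-the-envelope sketch of the sublimation-based estimate: convert a
# measured amount of adenine into E. coli cell equivalents per gram of sample.
AVOGADRO = 6.022e23

adenine_per_cell = 6.5e7          # ASSUMED adenine molecules per cell (DNA + RNA)
adenine_recovered_mol = 2.0e-11   # hypothetical HPLC result for a 1 g sample (mol)

molecules = adenine_recovered_mol * AVOGADRO
cells_per_gram = molecules / adenine_per_cell
print(f"{cells_per_gram:.2e} cell equivalents per gram")
```

With these assumed numbers the estimate lands near 10^5 cells per gram, the low end of the range the abstract reports; the actual figure scales linearly with whatever per-cell adenine content is adopted.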

  16. In Situ Noble-Gas Based Chronology on Mars

    NASA Technical Reports Server (NTRS)

    Swindle, T. D.

    2000-01-01

    Determining radiometric ages in situ on another planet's surface has never been done, and there are good reasons to think that it will be extremely difficult. It is certainly hard to imagine that such ages could be measured as precisely as they could be measured on returned samples in state-of-the-art terrestrial laboratories. However, it may be possible, by using simple noble-gas-based chronology techniques, to determine ages on Mars to a precision that is scientifically useful. This abstract will: (1) describe the techniques we envision; (2) give some examples of how such information might be scientifically useful; and (3) describe the system we are developing, including the requirements in terms of mass, power, volume, and sample selection and preparation.

  17. Image contrast mechanisms in dynamic friction force microscopy: Antimony particles on graphite

    NASA Astrophysics Data System (ADS)

    Mertens, Felix; Göddenhenrich, Thomas; Dietzel, Dirk; Schirmeisen, Andre

    2017-01-01

Dynamic Friction Force Microscopy (DFFM) is a technique based on Atomic Force Microscopy (AFM) in which resonance oscillations of the cantilever are excited by lateral actuation of the sample. During this process, the AFM tip in contact with the sample undergoes a complex movement consisting of alternating periods of sticking and sliding. DFFM can therefore give access to dynamic transition effects in friction that are not accessible by alternative techniques. Using antimony nanoparticles on graphite as a model system, we analyzed how the combined influences of friction and topography can affect different experimental configurations of DFFM. Based on the experimental results, for example the contrast inversion between fractional-resonance and band-excitation imaging, strategies to extract reliable tribological information from DFFM images are devised.

  18. 24 CFR 35.86 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and/or Lead-Based Paint Hazards Upon Sale or Lease of Residential Property § 35.86 Definitions. The... purchase and sale of residential real property means any contract or agreement in which one party agrees to... sampling or other environmental sampling techniques; (4) Other activity as may be appropriate; and (5...

  19. Dielectric studies on PEG-LTMS based polymer composites

    NASA Astrophysics Data System (ADS)

    Patil, Ravikumar V.; Praveen, D.; Damle, R.

    2018-02-01

PEG-LTMS-based polymer composites were prepared, and their dielectric constant was studied as a function of frequency and temperature, as potential candidates with improved dielectric properties. A solution-casting technique was used to prepare the polymer composite in five different compositions. The samples show variation in dielectric constant with frequency and temperature: the dielectric constant is large at low frequencies and at higher temperatures, and samples with larger space charge show a larger dielectric constant. The highest dielectric constant observed was about 29,244 for the PEG25LTMS sample at 100 Hz and 312 K.

  20. [A comparison of convenience sampling and purposive sampling].

    PubMed

    Suen, Lee-Jen Wu; Huang, Hui-Man; Lee, Hao-Hsien

    2014-06-01

Convenience sampling and purposive sampling are two different sampling methods. This article first explains sampling terms such as target population, accessible population, simple random sampling, intended sample, actual sample, and statistical power analysis. These terms are then used to explain the difference between "convenience sampling" and "purposive sampling." Convenience sampling is a non-probability sampling technique applicable to qualitative or quantitative studies, although it is most frequently used in quantitative studies. In convenience samples, subjects more readily accessible to the researcher are more likely to be included. Thus, in quantitative studies, the opportunity to participate is not equal for all qualified individuals in the target population, and study results are not necessarily generalizable to this population. As in all quantitative studies, increasing the sample size increases the statistical power of the convenience sample. In contrast, purposive sampling is typically used in qualitative studies. Researchers who use this technique carefully select subjects based on the study purpose, with the expectation that each participant will provide unique and rich information of value to the study. As a result, members of the accessible population are not interchangeable, and sample size is determined by data saturation, not by statistical power analysis.
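A toy simulation can make the generalizability concern concrete: when ease of access correlates with the measured trait, a convenience sample is systematically off while a simple random sample is not. All numbers below are invented for illustration.

```python
# Toy demonstration (not from the article): convenience sampling bias when
# accessibility correlates with the trait being measured.
import random
from statistics import mean

random.seed(1)
# population trait values; the "accessible" half is systematically higher
base = [random.gauss(50, 10) for _ in range(10000)]
accessible = [x + 8 for x in base[:5000]]      # easier-to-reach subjects differ
full = accessible + base[5000:]

random_sample = random.sample(full, 200)       # equal chance for every member
convenience_sample = random.sample(accessible, 200)  # only accessible subjects

print(round(mean(full), 1), round(mean(random_sample), 1),
      round(mean(convenience_sample), 1))
```

The random sample's mean tracks the population mean, while the convenience sample's mean reflects only the accessible subgroup, which is exactly why convenience-sample results are "not necessarily generalizable" to the target population.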

  1. Capillary pressure curves for low permeability chalk obtained by NMR imaging of core saturation profiles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Norgaard, J.V.; Olsen, D.; Springer, N.

    1995-12-31

A new technique for obtaining water-oil capillary pressure curves, based on NMR imaging of the saturation distribution in flooded cores, is presented. In this technique, a steady-state fluid saturation profile is developed by flooding the core at a constant flow rate. At the steady state, where the saturation distribution no longer changes, the local pressure difference between the wetting and non-wetting phases represents the capillary pressure. The saturation profile is measured using an NMR technique, and for a drainage case, the pressure in the non-wetting phase is calculated numerically. The paper presents the NMR technique and the procedure for calculating the pressure distribution in the sample. Inhomogeneous samples produce irregular saturation profiles, which may be interpreted in terms of variation in permeability, porosity, and capillary pressure. Capillary pressure curves for North Sea chalk obtained by the new technique show good agreement with capillary pressure curves obtained by traditional techniques.

  2. Misrepresenting random sampling? A systematic review of research papers in the Journal of Advanced Nursing.

    PubMed

    Williamson, Graham R

    2003-11-01

This paper discusses the theoretical limitations of the use of random sampling and probability theory in the production of a significance level (or P-value) in nursing research. Potential alternatives, in the form of randomization tests, are proposed. Research papers in nursing, medicine and psychology frequently misrepresent their statistical findings, as the P-values reported assume random sampling. In this systematic review of studies published between January 1995 and June 2002 in the Journal of Advanced Nursing, 89 (68%) studies broke this assumption because they used convenience samples or entire populations. As a result, some of the findings may be questionable. The key ideas of random sampling and probability theory for statistical testing (for generating a P-value) are outlined. The result of a systematic review of research papers published in the Journal of Advanced Nursing is then presented, showing how frequently random sampling appears to have been misrepresented. Useful alternative techniques that might overcome these limitations are then discussed. This review is limited in scope because it is applied to one journal, and so the findings cannot be generalized to other nursing journals or to nursing research in general. However, it is possible that other nursing journals are also publishing research articles based on the misrepresentation of random sampling. The review is also limited because in several of the articles the sampling method was not stated completely clearly, and in these circumstances a judgment was made as to the sampling method employed, based on the indications given by the author(s). Quantitative researchers in nursing should be very careful that the statistical techniques they use are appropriate for the design and sampling methods of their studies. If the techniques they employ are not appropriate, they run the risk of misinterpreting findings by using inappropriate, unrepresentative and biased samples.
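A randomization (permutation) test of the kind the paper proposes can be sketched in a few lines: the group labels are reshuffled many times to build a null distribution for the observed difference in means, so no random-sampling assumption is needed. The data below are invented.

```python
# Sketch of a randomization test for a two-group difference in means.
import random
from statistics import mean

random.seed(0)
group_a = [5.1, 6.2, 5.8, 7.0, 6.5, 5.9]   # hypothetical outcome scores
group_b = [4.2, 4.8, 5.0, 4.5, 5.3, 4.1]
observed = mean(group_a) - mean(group_b)

pooled = group_a + group_b
n_a, count, n_perm = len(group_a), 0, 10000
for _ in range(n_perm):
    random.shuffle(pooled)                  # permute the group labels
    diff = mean(pooled[:n_a]) - mean(pooled[n_a:])
    if abs(diff) >= abs(observed):          # as or more extreme than observed
        count += 1
p_value = count / n_perm
print(round(observed, 2), p_value)
```

The P-value here is the proportion of label permutations producing a difference at least as extreme as the one observed, a justification that rests on the random allocation of labels rather than on random sampling from a population.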

  3. Comprehensive Non-Destructive Conservation Documentation of Lunar Samples Using High-Resolution Image-Based 3D Reconstructions and X-Ray CT Data

    NASA Technical Reports Server (NTRS)

    Blumenfeld, E. H.; Evans, C. A.; Oshel, E. R.; Liddle, D. A.; Beaulieu, K.; Zeigler, R. A.; Hanna, R. D.; Ketcham, R. A.

    2015-01-01

Established contemporary conservation methods within the fields of Natural and Cultural Heritage encourage an interdisciplinary approach to preservation of heritage material (both tangible and intangible) that holds "Outstanding Universal Value" for our global community. NASA's lunar samples were acquired from the moon for the primary purpose of intensive scientific investigation. These samples, however, also invoke cultural significance, as evidenced by the millions of people per year who visit lunar displays in museums and heritage centers around the world. Being both scientifically and culturally significant, the lunar samples require a unique conservation approach. Government mandate dictates that NASA's Astromaterials Acquisition and Curation Office develop and maintain protocols for "documentation, preservation, preparation and distribution of samples for research, education and public outreach" for both current and future collections of astromaterials. Documentation, considered the first stage within the conservation methodology, has evolved many new techniques since curation protocols for the lunar samples were first implemented, and the development of new documentation strategies for current and future astromaterials is beneficial to keeping curation protocols up to date. We have developed and tested a comprehensive non-destructive documentation technique using high-resolution image-based 3D reconstruction and X-ray CT (XCT) data in order to create interactive 3D models of lunar samples that would ultimately be served to both researchers and the public. These data enhance preliminary scientific investigations, including targeted sample requests, and also provide a new visual platform for the public to experience and interact with the lunar samples. We intend to serve these data as they are acquired on NASA's Astromaterials Acquisition and Curation website at http://curator.jsc.nasa.gov/. Providing 3D interior and exterior documentation of astromaterial samples addresses the increasing demands for accessibility to data and contemporary techniques for documentation, which can be realized for both current collections and future sample return missions.

  4. Flow injection gas chromatography with sulfur chemiluminescence detection for the analysis of total sulfur in complex hydrocarbon matrixes.

    PubMed

    Hua, Yujuan; Hawryluk, Myron; Gras, Ronda; Shearer, Randall; Luong, Jim

    2018-01-01

A fast and reliable analytical technique for the determination of total sulfur levels in complex hydrocarbon matrices is introduced. The method employs a flow injection technique, using a gas chromatograph as a sample introduction device and a gas-phase dual-plasma sulfur chemiluminescence detector for sulfur quantification. Using the technique described, total sulfur measurement in challenging hydrocarbon matrices can be achieved in less than 10 s, with a sample-to-sample time of <2 min. The detector's high degree of selectivity and sensitivity toward sulfur compounds offers the ability to measure low sulfur levels, with a detection limit in the range of 20 ppb w/w S. The equimolar response characteristic of the detector allows the quantitation of unknown sulfur compounds and simplifies the calibration process. Response is linear over a concentration range of five orders of magnitude, with a high degree of repeatability. The detector's lack of response to hydrocarbons enables direct analysis without the need for time-consuming sample preparation and chromatographic separation processes. This flow injection-based sulfur chemiluminescence detection technique is ideal for fast analysis or trace sulfur analysis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Improvement of the tetrachloromercurate absorption technique for measuring low atmospheric SO2 mixing ratios

    NASA Astrophysics Data System (ADS)

    Jaeschke, W.; Beltz, N.; Haunold, W.; Krischke, U.

    1997-07-01

    During the Gas-Phase Sulfur Intercomparison Experiment (GASIE) in 1994 an analytical system for measuring sulfur dioxide mixing ratios at low parts per trillion (pptv) levels was employed. It is based on the absorption of SO2 on a tetrachloromercurate(II)-impregnated filter. The subsequent analysis uses a chemiluminescence reaction by treating the resulting disulfitomercurate(II) complex with an acidic cerium sulfate solution. An improved sampling device has been introduced that increases the maximum sampling volume from 200 L to 500 L. It is also possible to determine the blank value accurately for each sample. The absorption efficiency of the sampling system is 98.7±6.4% at a nominal flow rate of 10 L/min. The calculated (3σ) detection limit is 3±1 pptv SO2. The sample solution is stable for up to 30 days, which allows the samples to be safely stored or shipped before analysis. This permits the use of a sensitive, compact, and reliable sampling system in the field with subsequent analysis under optimal conditions in the laboratory. A continuous flow chemiluminescence (CFCL) analyzer for on-line measurements is also presented. The system is based on the same chemical principles as the described filter technique.
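The quoted (3σ) detection limit follows the usual convention of three standard deviations of replicate blank measurements, converted through the calibration slope; a minimal sketch with hypothetical blank values and slope:

```python
# Minimal 3-sigma detection-limit calculation. Blank signals and the
# calibration slope below are invented, not the GASIE values.
from statistics import stdev

blanks = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2]   # replicate blank signals (a.u.)
slope = 1.05                                    # calibration: signal units per pptv SO2

lod_pptv = 3 * stdev(blanks) / slope
print(round(lod_pptv, 1), "pptv")
```

The ability to determine a blank value for each sample, as the improved sampling device allows, is what makes this blank-based detection limit well defined in practice.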

  6. An Improved Computational Technique for Calculating Electromagnetic Forces and Power Absorptions Generated in Spherical and Deformed Body in Levitation Melting Devices

    NASA Technical Reports Server (NTRS)

    Zong, Jin-Ho; Szekely, Julian; Schwartz, Elliot

    1992-01-01

An improved computational technique for calculating the electromagnetic force field, the power absorption, and the deformation of an electromagnetically levitated metal sample is described. The technique is based on the volume integral method but represents a substantial refinement; the coordinate transformation employed allows the efficient treatment of a broad class of rotationally symmetric bodies. Computed results are presented to represent the behavior of levitation-melted metal samples in a multi-coil, multi-frequency levitation unit to be used in microgravity experiments. The theoretical predictions are compared both with analytical solutions and with the results of previous computational efforts for spherical samples, and the agreement has been very good. The treatment of problems involving deformed surfaces, and actually predicting the deformed shape of the specimens, breaks new ground and should be the major usefulness of the proposed method.

  7. Applications of derivatization reactions to trace organic compounds during sample preparation based on pressurized liquid extraction.

    PubMed

    Carro, Antonia M; González, Paula; Lorenzo, Rosa A

    2013-06-28

    Pressurized liquid extraction (PLE) is an exhaustive technique used for the extraction of analytes from solid samples. Temperature, pressure, solvent type and volume, and the addition of other reagents notably influence the efficiency of the extraction. The analytical applications of this technique can be improved by coupling with appropriate derivatization reactions. The aim of this review is to discuss the recent applications of the sequential combination of PLE with derivatization and the approaches that involve simultaneous extraction and in situ derivatization. The potential of the latest developments to the trace analysis of environmental, food and biological samples is also analyzed. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Investigation of laser Doppler anemometry in developing a velocity-based measurement technique

    NASA Astrophysics Data System (ADS)

    Jung, Ki Won

    2009-12-01

Acoustic properties of porous materials, such as the characteristic impedance and the complex propagation constant, have traditionally been characterized with pressure-based measurement techniques using microphones. Although microphone techniques have evolved since their introduction, the most general form employs two microphones to characterize the acoustic field in one continuous medium. The shortcomings of determining the acoustic field from only two microphones can be overcome by using numerous microphones; however, the use of many microphones requires a careful and intricate calibration procedure. This dissertation uses laser Doppler anemometry (LDA) to establish a new measurement technique that resolves the issues microphone techniques have. First, it is based on a single sensor, so calibration is unnecessary when only the overall ratio of the acoustic field is required to characterize a system; this includes measurements of the characteristic impedance and the complex propagation constant. Second, it can handle multiple positional measurements without calibrating the signal at each position. Third, it can measure the three-dimensional components of velocity, even in a system with complex geometry. Fourth, it is adaptable and not restricted to a particular type of apparatus, provided the apparatus is transparent. LDA is known to have several disadvantages, such as the requirement of a transparent apparatus, high cost, and the necessity of seeding particles. The technique, based on LDA combined with a curve-fitting algorithm, is validated through measurements on three systems. First, the complex propagation constant of the air is measured in a rigidly terminated cylindrical pipe, which has very low dissipation. Second, the radiation impedance of an open-ended pipe is measured. These two parameters can be characterized by the ratio of the acoustic field measured at multiple locations. Third, the power dissipated in a variable RLC load is measured. The three experiments validate the proposed LDA technique. The utility of the LDA method is then extended to the measurement of the complex propagation constant of the air inside a 100 ppi reticulated vitreous carbon (RVC) sample. Compared to measurements in the available studies, the measurement with the 100 ppi RVC sample supports the LDA technique in that it achieves a low uncertainty in the determined quantity. This dissertation concludes by using the LDA technique for modal decomposition of the plane-wave mode and the (1,1) mode driven simultaneously. This modal decomposition suggests that the LDA technique surpasses microphone-based techniques, because they are unable to determine the acoustic field based on an acoustic model with unconfined propagation constants for each modal component.
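At its core, the curve-fitting step described above is a linear least-squares fit of two complex wave amplitudes to velocities measured at several positions. The sketch below assumes a simple two-wave model u(x) = A·e^(-jkx) + B·e^(jkx) with an invented wavenumber, positions, and amplitudes; it is not the dissertation's actual algorithm.

```python
# Recover two complex wave amplitudes A, B from multi-position velocity data
# by solving the 2x2 normal equations of a complex least-squares problem.
import cmath

k = 18.0                                   # assumed wavenumber (rad/m)
A_true, B_true = 1.0 + 0.5j, 0.3 - 0.2j    # hypothetical forward/backward amplitudes
xs = [0.00, 0.05, 0.10, 0.15, 0.20, 0.25]  # measurement positions (m)
u = [A_true * cmath.exp(-1j * k * x) + B_true * cmath.exp(1j * k * x) for x in xs]

c1 = [cmath.exp(-1j * k * x) for x in xs]  # basis column for A
c2 = [cmath.exp(1j * k * x) for x in xs]   # basis column for B

def hdot(a, b):
    """Hermitian inner product sum(conj(a) * b)."""
    return sum(ai.conjugate() * bi for ai, bi in zip(a, b))

# normal equations (C^H C) theta = C^H u for the two-column design matrix C
m11, m12, m22 = hdot(c1, c1), hdot(c1, c2), hdot(c2, c2)
b1, b2 = hdot(c1, u), hdot(c2, u)
det = m11 * m22 - m12 * m12.conjugate()
A = (m22 * b1 - m12 * b2) / det
B = (m11 * b2 - m12.conjugate() * b1) / det
print(A, B)  # recovers A_true and B_true for noiseless data
```

With more than two positions the system is overdetermined, which is how the multi-point LDA scans reduce the uncertainty relative to a two-sensor measurement.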

  9. Measures of precision for dissimilarity-based multivariate analysis of ecological communities

    PubMed Central

    Anderson, Marti J; Santana-Garcon, Julia

    2015-01-01

Ecological studies require key decisions regarding the appropriate size and number of sampling units. No methods currently exist to measure precision for multivariate assemblage data when dissimilarity-based analyses are intended to follow. Here, we propose a pseudo multivariate dissimilarity-based standard error (MultSE) as a useful quantity for assessing sample-size adequacy in studies of ecological communities. Based on sums of squared dissimilarities, MultSE measures variability in the position of the centroid in the space of a chosen dissimilarity measure under repeated sampling for a given sample size. We describe a novel double resampling method to quantify uncertainty in MultSE values with increasing sample size. For more complex designs, values of MultSE can be calculated from the pseudo residual mean square of a PERMANOVA model, with the double resampling done within appropriate cells in the design. R code functions for implementing these techniques, along with ecological examples, are provided. PMID:25438826
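One common formulation of MultSE for a single group can be sketched directly from the pairwise dissimilarities: the multivariate pseudo variance is V = SS/(n-1) with SS = (1/n)·Σ d²ᵢⱼ over all pairs, and MultSE = √(V/n). The abundance matrix below is invented, and Bray-Curtis is used as the dissimilarity measure; this is a simplified reading of the quantity, not the authors' R code.

```python
# Sketch of the pseudo multivariate standard error (MultSE) for one group of
# n sampling units, from pairwise Bray-Curtis dissimilarities.
from math import sqrt

def bray_curtis(a, b):
    num = sum(abs(x - y) for x, y in zip(a, b))
    den = sum(x + y for x, y in zip(a, b))
    return num / den if den else 0.0

# hypothetical abundance matrix: 5 sampling units x 4 species
samples = [
    [12, 0, 3, 5],
    [10, 2, 4, 6],
    [14, 1, 2, 4],
    [9,  0, 5, 7],
    [11, 3, 3, 5],
]
n = len(samples)
# SS = (1/n) * sum of squared dissimilarities over all pairs i < j
ss = sum(bray_curtis(samples[i], samples[j]) ** 2
         for i in range(n) for j in range(i + 1, n)) / n
mult_se = sqrt(ss / (n - 1) / n)   # sqrt(V / n) with V = SS / (n - 1)
print(round(mult_se, 4))
```

Plotting this quantity against increasing n (the double-resampling step the abstract describes) shows where adding sampling units stops buying meaningful precision.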

  10. Comparison of Quantifiler(®) Trio and InnoQuant™ human DNA quantification kits for detection of DNA degradation in developed and aged fingerprints.

    PubMed

    Goecker, Zachary C; Swiontek, Stephen E; Lakhtakia, Akhlesh; Roy, Reena

    2016-06-01

The development techniques employed to visualize fingerprints collected from crime scenes, as well as post-development aging, may result in the degradation of the DNA present in low quantities in such evidence samples. Amplification of the DNA samples with short tandem repeat (STR) amplification kits may then result in partial DNA profiles. A comparative study of two commercially available quantification kits, Quantifiler(®) Trio and InnoQuant™, was performed on latent fingerprint samples that were either (i) developed using one of three different techniques and then aged in ambient conditions or (ii) undeveloped and then aged in ambient conditions. The three fingerprint development techniques used were: cyanoacrylate fuming, dusting with black powder, and the columnar-thin-film (CTF) technique. In order to determine the differences between the expected and actual quantities of DNA, manually degraded samples generated by controlled exposure of DNA standards to ultraviolet radiation were also analyzed. A total of 144 fingerprint and 42 manually degraded DNA samples were processed in this study. The results indicate that the InnoQuant™ kit is capable of producing higher degradation ratios compared to the Quantifiler(®) Trio kit. This was an expected result, since the degradation ratio is a relative value specific to a kit, based on the length and extent of amplification of the two amplicons, which vary from one kit to the other. Additionally, samples with lower concentrations of DNA yielded non-linear relationships of degradation ratio with the duration of aging, whereas samples with higher concentrations of DNA yielded quasi-linear relationships. None of the three development techniques produced a noticeably different degradation pattern when compared to undeveloped fingerprints, and therefore they do not impede downstream DNA analysis. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
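The degradation ratio such kits report is simply the concentration measured from a short autosomal target divided by that from a long target; fragmentation knocks out the long amplicon first, so ratios well above 1 indicate degraded DNA. A minimal sketch with hypothetical concentrations:

```python
# Degradation ratio = short-target concentration / long-target concentration.
# The ng/uL values below are invented for illustration.
def degradation_ratio(short_target_ng_ul, long_target_ng_ul):
    return short_target_ng_ul / long_target_ng_ul

fresh = degradation_ratio(0.050, 0.048)   # intact sample: ratio near 1
aged  = degradation_ratio(0.050, 0.005)   # degraded sample: long target drops out
print(round(fresh, 2), round(aged, 1))
```

Because the ratio depends on each kit's particular amplicon lengths, its absolute value is kit-specific, which is why the two kits in the study yield different ratios for the same samples.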

  11. Recovery of 1,3-, 2,3-dichloropropenes, 1,2-dibromo-3-chloropropane, and o-, p-dichlorobenzenes from fatty and non-fat foodstuffs by liquid extraction technique.

    PubMed

    Daft, J L

    1990-01-01

Food samples, including fatty, non-fatty, grain-based, and non-grain-based types, were fortified with the following five nematocides and fumigants: 1,3-dichloropropene, 2,3-dichloropropene, 1,2-dibromo-3-chloropropane, o-dichlorobenzene, and p-dichlorobenzene. Then, depending on sample consistency and type, the samples were diluted in, or extracted with, an organic solvent such as isooctane. A few of the high-fat extracts were passed through Florisil to remove excess fat or endogenous interferences. Analysis of the initial or cleaned-up extracts was done by gas chromatography (GC) at 90 degrees C. The dichloropropenes were determined on 20% OV-101 columns with electron-capture and Hall electroconductivity detectors. The dichlorobenzenes and 1,2-dibromo-3-chloropropane, which elute beyond 30 min on the above columns, were determined on 5%-loaded columns using the same detectors. All five analytes were recovered by these techniques. Mean analyte recovery following a direct dilution or extraction was 83%, and following the Florisil cleanup step was 52%. In 1986, a fumigant survey of about 200 foodstuffs using this overall technique gave no findings of the five compounds studied here.

  12. [Theoretical and methodological uses of research in Social and Human Sciences in Health].

    PubMed

    Deslandes, Suely Ferreira; Iriart, Jorge Alberto Bernstein

    2012-12-01

    The current article aims to map and critically reflect on the current theoretical and methodological uses of research in the subfield of social and human sciences in health. A convenience sample was used to select three Brazilian public health journals. Based on a reading of 1,128 abstracts published from 2009 to 2010, 266 articles were selected that presented the empirical base of research stemming from social and human sciences in health. The sample was classified thematically as "theoretical/ methodological reference", "study type/ methodological design", "analytical categories", "data production techniques", and "analytical procedures". We analyze the sample's emic categories, drawing on the authors' literal statements. All the classifications and respective variables were tabulated in Excel. Most of the articles were self-described as qualitative and used more than one data production technique. There was a wide variety of theoretical references, in contrast with the almost total predominance of a single type of data analysis (content analysis). In several cases, important gaps were identified in expounding the study methodology and instrumental use of the qualitative research techniques and methods. However, the review did highlight some new objects of study and innovations in theoretical and methodological approaches.

  13. CpG PatternFinder: a Windows-based utility program for easy and rapid identification of the CpG methylation status of DNA.

    PubMed

    Xu, Yi-Hua; Manoharan, Herbert T; Pitot, Henry C

    2007-09-01

The bisulfite genomic sequencing technique is one of the most widely used techniques to study sequence-specific DNA methylation because of its unambiguous ability to reveal DNA methylation status down to a single nucleotide. One characteristic feature of the bisulfite genomic sequencing technique is that a number of sample sequence files are produced from a single DNA sample. The PCR products of bisulfite-treated DNA samples cannot be sequenced directly because they are heterogeneous in nature; they must therefore be cloned into suitable plasmids and then sequenced. This procedure generates an enormous number of sample DNA sequence files and also adds extra bases belonging to the plasmids to the sequence, which causes problems in the final sequence comparison. Determining the methylation status of each CpG in each sample sequence is not an easy job, and CpG PatternFinder was developed for this purpose. The main functions of CpG PatternFinder are: (i) to analyze the reference sequence to obtain CpG and non-CpG-C residue position information; (ii) to tailor sample sequence files (delete insertions and mark deletions) based on a configuration of a ClustalW multiple alignment; (iii) to align sample sequence files with a reference file to obtain the bisulfite conversion efficiency and CpG methylation status; and (iv) to produce graphics, highlighted aligned sequence text, and a summary report that can be easily exported to the Microsoft Office suite. CpG PatternFinder is designed to operate cooperatively with BioEdit, freeware available on the internet. It can handle up to 100 sample DNA sequence files simultaneously, and the entire CpG pattern analysis process can be finished in minutes. CpG PatternFinder is an ideal software tool for DNA methylation studies to determine the differential methylation pattern in a large number of individuals in a population. Previously we developed the CpG Analyzer program; CpG PatternFinder is our further effort to create software tools for DNA methylation studies.

  14. Physicochemical, bioactive, and sensory properties of persimmon-based ice cream: technique for order preference by similarity to ideal solution to determine optimum concentration.

    PubMed

    Karaman, Safa; Toker, Ömer Said; Yüksel, Ferhat; Çam, Mustafa; Kayacier, Ahmed; Dogan, Mahmut

    2014-01-01

In the present study, persimmon puree was incorporated into the ice cream mix at different concentrations (8, 16, 24, 32, and 40%) and some physicochemical (dry matter, ash, protein, pH, sugar, fat, mineral, color, and viscosity), textural (hardness, stickiness, and work of penetration), bioactive (antiradical activity and total phenolic content), and sensory properties of samples were investigated. The technique for order preference by similarity to ideal solution approach was used for the determination of optimum persimmon puree concentration based on the sensory and bioactive characteristics of final products. Increasing the persimmon puree content resulted in a decrease in the dry matter, ash, fat, and protein contents and the viscosity of the ice cream mix. Glucose, fructose, sucrose, and lactose were determined to be the major sugars in the persimmon-containing ice cream samples, and increasing the persimmon puree concentration increased the fructose and glucose content. Better melting properties and textural characteristics were observed for the samples with the addition of persimmon. Magnesium, K, and Ca were determined to be the major minerals in the samples, and only the K concentration increased with increasing persimmon content. Bioactive properties of the ice cream samples improved and, in general, acetone-water extracts showed higher bioactivity compared with those obtained using methanol-water extracts. The technique for order preference by similarity to ideal solution approach showed that the most preferred sample was the ice cream containing 24% persimmon puree. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
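The TOPSIS procedure used above to pick the optimum puree concentration can be sketched in a few lines: vector-normalize the decision matrix, weight it, locate the ideal and anti-ideal solutions, and rank alternatives by relative closeness to the ideal. The scores and weights below are hypothetical placeholders, not the study's data.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix:  (n_alternatives, n_criteria) raw scores
    weights: criterion weights
    benefit: True where larger is better, False where smaller is better
    """
    m = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
    benefit = np.asarray(benefit)
    # Ideal and anti-ideal values depend on benefit vs. cost direction.
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    # Relative closeness: 1 = ideal solution, 0 = anti-ideal solution.
    return d_worst / (d_best + d_worst)

# Hypothetical (sensory score, bioactivity) pairs for five concentrations.
scores = [[6.1, 0.40], [6.8, 0.55], [7.9, 0.70], [7.2, 0.82], [6.0, 0.90]]
closeness = topsis(scores, weights=[0.5, 0.5], benefit=[True, True])
best = int(np.argmax(closeness))  # index of the most-preferred formulation
```

Both criteria are treated as benefit criteria here; a cost criterion (e.g., off-flavor intensity) would simply flip its entry in `benefit`.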

  15. Laser-induced breakdown spectroscopy (LIBS) technique for the determination of the chemical composition of complex inorganic materials

    NASA Astrophysics Data System (ADS)

    Łazarek, Łukasz; Antończak, Arkadiusz J.; Wójcik, Michał R.; Kozioł, Paweł E.; Stepak, Bogusz; Abramski, Krzysztof M.

    2014-08-01

Laser-induced breakdown spectroscopy (LIBS) is a fast, fully optical method that needs little or no sample preparation. In this technique, qualitative and quantitative analysis is based on comparison: the determination of composition generally relies on the construction of a calibration curve, namely the LIBS signal versus the concentration of the analyte. Typically, certified reference materials with known elemental composition are used to calibrate the system. Nevertheless, because such samples differ in overall composition from the complex inorganic materials under study, they can significantly affect the accuracy. There are also intermediate factors that can cause imprecision in measurements, such as optical absorption, surface structure, thermal conductivity, etc. This paper presents a calibration procedure performed with specially prepared pellets of the tested materials, whose composition was previously defined. We also propose post-processing methods that mitigate matrix effects and allow for reliable and accurate analysis. This technique was implemented for the determination of trace elements in industrial copper concentrates standardized by conventional atomic absorption spectroscopy with a flame atomizer. A series of copper flotation concentrate samples was analyzed for the contents of three elements: silver, cobalt, and vanadium. It has been shown that the described technique can be used for qualitative and quantitative analyses of complex inorganic materials, such as copper flotation concentrates.
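The calibration-curve step described above reduces to fitting line intensity against known analyte concentration and inverting the fit for unknowns. A minimal sketch, with hypothetical intensities and concentrations (not the paper's data):

```python
import numpy as np

# Hypothetical calibration data: known Ag concentrations (ppm) in
# matrix-matched pellets and background-corrected LIBS line intensities.
conc = np.array([10.0, 25.0, 50.0, 100.0, 200.0])
signal = np.array([120.0, 290.0, 610.0, 1180.0, 2400.0])

# Least-squares calibration line: signal = slope * concentration + intercept.
slope, intercept = np.polyfit(conc, signal, 1)

def predict_concentration(measured_signal):
    """Invert the calibration curve for an unknown sample."""
    return (measured_signal - intercept) / slope

unknown = predict_concentration(900.0)  # ppm estimate for a measured line
```

Matrix-matched pellets matter precisely because `slope` and `intercept` absorb matrix effects; a curve built from mismatched reference materials would bias `predict_concentration`.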

  16. Sample-space-based feature extraction and class preserving projection for gene expression data.

    PubMed

    Wang, Wenjun

    2013-01-01

In order to overcome the problems of high computational complexity and serious matrix singularity in feature extraction using Principal Component Analysis (PCA) and Fisher's Linear Discriminant Analysis (LDA) on high-dimensional data, sample-space-based feature extraction is presented, which transfers the computation of feature extraction from gene space to sample space by representing the optimal transformation vector as a weighted sum of samples. The technique is used in the implementation of PCA, LDA, and Class Preserving Projection (CPP), a newly proposed method for discriminant feature extraction, and experimental results on gene expression data demonstrate the effectiveness of the method.
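The core trick, writing each transformation vector as a weighted sum of samples so the eigenproblem lives in the small n × n sample space instead of the huge gene space, can be sketched for PCA as follows (synthetic data; the paper's CPP variant is not reproduced here):

```python
import numpy as np

def sample_space_pca(X, k):
    """PCA computed in sample space (n x n) rather than gene space (p x p).

    X: (n_samples, p_genes) with p >> n; k: number of components.
    Each gene-space principal axis is a weighted sum of the samples,
    so it suffices to eigen-decompose the small Gram matrix.
    """
    Xc = X - X.mean(axis=0)              # center each gene
    gram = Xc @ Xc.T                     # (n, n) instead of (p, p)
    vals, vecs = np.linalg.eigh(gram)    # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:k]     # keep the k largest
    vals, vecs = vals[idx], vecs[:, idx]
    # Map sample-space eigenvectors back to gene space and normalize.
    axes = Xc.T @ vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Xc @ axes                     # projected samples, shape (n, k)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5000))          # 20 samples, 5000 genes
Z = sample_space_pca(X, 3)
```

The eigenproblem solved here is 20 × 20 rather than 5000 × 5000, which is exactly the complexity reduction the abstract describes; the singularity issue disappears because the Gram matrix has at most n − 1 nonzero eigenvalues.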

  17. Steady-state low thermal resistance characterization apparatus: The bulk thermal tester

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burg, Brian R.; Kolly, Manuel; Blasakis, Nicolas

The reliability of microelectronic devices is largely dependent on electronic packaging, which includes heat removal. Appropriate packaging design therefore necessitates precise knowledge of the relevant material properties, including thermal resistance and thermal conductivity. Thin materials and high-conductivity layers make their thermal characterization challenging. A steady-state measurement technique is presented and evaluated with the purpose of characterizing samples with a thermal resistance below 100 mm² K/W. It is based on the heat flow meter bar approach, made up of two copper blocks, and relies exclusively on temperature measurements from thermocouples. The importance of thermocouple calibration is emphasized in order to obtain accurate temperature readings. An in-depth error analysis, based on Gaussian error propagation, is carried out. An error sensitivity analysis highlights the importance of precise knowledge of the thermal interface materials required for the measurements. Reference measurements on Mo samples reveal a measurement uncertainty in the range of 5%, and the most accurate measurements are obtained at high heat fluxes. Measurement techniques for homogeneous bulk samples, layered materials, and protruding cavity samples are discussed. Ultimately, a comprehensive overview of a steady-state thermal characterization technique is provided, evaluating the accuracy of sample measurements with thermal resistances well below those of state-of-the-art setups. Accurate characterization of materials used in heat removal applications, such as electronic packaging, will enable more efficient designs and ultimately contribute to energy savings.
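As an illustration of the kind of Gaussian error propagation used in such an error analysis, a minimal sketch for a thermal resistance R = ΔT/q, with hypothetical readings (not the paper's data):

```python
import math

def thermal_resistance_uncertainty(dT, s_dT, q, s_q):
    """Gaussian error propagation for R = dT / q.

    dT:  temperature drop across the sample (K)
    q:   heat flux through the sample (W/mm^2)
    s_*: one-sigma uncertainties of the two inputs
    Returns (R, s_R) in mm^2 K/W.
    """
    R = dT / q
    # For a quotient, relative variances add in quadrature.
    s_R = R * math.sqrt((s_dT / dT) ** 2 + (s_q / q) ** 2)
    return R, s_R

# Hypothetical readings: 0.50 K drop (+/- 0.02 K from thermocouple
# calibration) at a heat flux of 0.01 W/mm^2 (+/- 2%).
R, s_R = thermal_resistance_uncertainty(0.50, 0.02, 0.01, 0.0002)
```

Note that raising the heat flux at a fixed absolute thermocouple uncertainty increases dT and therefore shrinks s_R/R, consistent with the observation that the most accurate measurements are obtained at high heat fluxes.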

  18. Sampling enhancement for the quantum mechanical potential based molecular dynamics simulations: a general algorithm and its extension for free energy calculation on rugged energy surface.

    PubMed

    Li, Hongzhi; Yang, Wei

    2007-03-21

An approach is developed in the replica exchange framework to enhance conformational sampling for quantum mechanical (QM) potential based molecular dynamics simulations. Importantly, with our enhanced sampling treatment, decent convergence of the electronic structure self-consistent-field calculation is robustly guaranteed, which is made possible in our replica exchange design by avoiding direct structure exchanges between the QM-related replicas and the activated (scaled by low scaling parameters or treated with high "effective temperatures") molecular mechanical (MM) replicas. Although the present approach represents one of the early efforts in enhanced sampling developed specifically for quantum mechanical potentials, QM-based simulations treated with the present technique can possess sampling efficiency similar to that of MM-based simulations treated with the Hamiltonian replica exchange method (HREM). In the present paper, by combining this sampling method with one of our recent developments (the dual-topology alchemical HREM approach), we also introduce a method for sampling-enhanced QM-based free energy calculations.

  19. Integrating machine learning techniques into robust data enrichment approach and its application to gene expression data.

    PubMed

    Erdoğdu, Utku; Tan, Mehmet; Alhajj, Reda; Polat, Faruk; Rokne, Jon; Demetrick, Douglas

    2013-01-01

The availability of enough samples for effective analysis and knowledge discovery has been a challenge in the research community, especially in the area of gene expression data analysis. Thus, the approaches being developed for data analysis have mostly suffered from the lack of enough data to train and test the constructed models. We argue that the process of sample generation could be successfully automated by employing some sophisticated machine learning techniques. An automated sample generation framework could successfully complement the actual sample generation from real cases. This argument is validated in this paper by describing a framework that integrates multiple models (perspectives) for sample generation. We illustrate its applicability for producing new gene expression data samples, a highly demanding area that has not received attention. The three perspectives employed in the process are based on models that are not closely related. This independence eliminates the bias of having the approach cover only certain characteristics of the domain and produce samples skewed in one direction. The first model is based on the Probabilistic Boolean Network (PBN) representation of the gene regulatory network underlying the given gene expression data. The second model integrates a Hierarchical Markov Model (HIMM), and the third model employs a genetic algorithm in the process. Each model learns as many characteristics of the analysed domain as possible and tries to incorporate the learned characteristics into generating new samples. In other words, the models base their analysis on domain knowledge implicitly present in the data itself. The developed framework has been extensively tested by checking how the new samples complement the original samples. The produced results are very promising in showing the effectiveness, usefulness and applicability of the proposed multi-model framework.

  20. Insights into organic-aerosol sources via a novel laser-desorption/ionization mass spectrometry technique applied to one year of PM10 samples from nine sites in central Europe

    NASA Astrophysics Data System (ADS)

    Daellenbach, Kaspar R.; El-Haddad, Imad; Karvonen, Lassi; Vlachou, Athanasia; Corbin, Joel C.; Slowik, Jay G.; Heringa, Maarten F.; Bruns, Emily A.; Luedin, Samuel M.; Jaffrezo, Jean-Luc; Szidat, Sönke; Piazzalunga, Andrea; Gonzalez, Raquel; Fermo, Paola; Pflueger, Valentin; Vogel, Guido; Baltensperger, Urs; Prévôt, André S. H.

    2018-02-01

We assess the benefits of offline laser-desorption/ionization mass spectrometry in understanding ambient particulate matter (PM) sources. The technique was optimized for measuring PM collected on quartz-fiber filters using silver nitrate as an internal standard for m/z calibration. This is the first application of this technique to samples collected at nine sites in central Europe throughout the entire year of 2013 (819 samples). Different PM sources were identified by positive matrix factorization (PMF), also including concomitant measurements (such as NOx, levoglucosan, and temperature). By comparison to reference mass spectral signatures from laboratory wood burning experiments as well as samples from a traffic tunnel, three biomass burning factors and two traffic factors were identified. The wood burning factors could be linked to the burning conditions; the factors related to inefficient burns had a larger impact on air quality in southern Alpine valleys than in northern Switzerland. The traffic factors were identified as primary tailpipe exhaust and, most likely, aged/secondary traffic emissions. The latter attribution was supported by radiocarbon analyses of both the organic and elemental carbon. Besides these sources, factors related to secondary organic aerosol were also separated. The contribution of the wood burning emissions based on LDI-PMF (laser-desorption/ionization PMF) correlates well with that based on AMS-PMF (aerosol mass spectrometer PMF) analyses, while the comparison between the two techniques for other components is more complex.

  1. The application of compressed sensing to long-term acoustic emission-based structural health monitoring

    NASA Astrophysics Data System (ADS)

    Cattaneo, Alessandro; Park, Gyuhae; Farrar, Charles; Mascareñas, David

    2012-04-01

The acoustic emission (AE) phenomena generated by a rapid release in the internal stress of a material represent a promising technique for structural health monitoring (SHM) applications. AE events typically result in a discrete number of short-time, transient signals. The challenge associated with capturing these events using classical techniques is that very high sampling rates must be used over extended periods of time. The result is that a very large amount of data is collected to capture a phenomenon that rarely occurs. Furthermore, the high energy consumption associated with the required high sampling rates makes the implementation of high-endurance, low-power, embedded AE sensor nodes difficult to achieve. The relatively rare occurrence of AE events over long time scales implies that these measurements are inherently sparse in the spike domain. The sparse nature of AE measurements makes them an attractive candidate for the application of compressed sampling techniques. Collecting compressed measurements of sparse AE signals will relax the requirements on the sampling rate and memory demands. The focus of this work is to investigate the suitability of compressed sensing techniques for AE-based SHM. The work explores estimating AE signal statistics in the compressed domain for low-power classification applications. In the event that compressed classification finds an event of interest, ℓ1 norm minimization will be used to reconstruct the measurement for further analysis. The impact of structured noise on compressive measurements is specifically addressed. The suitability of a particular algorithm, called Justice Pursuit, to increase robustness to a small amount of arbitrary measurement corruption is investigated.
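As a toy illustration of compressed sampling of spike-sparse signals, the sketch below recovers a sparse vector from far fewer random measurements than signal samples. It uses greedy orthogonal matching pursuit as a simple stand-in for the ℓ1-norm minimization described in the abstract; all dimensions and data are synthetic.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal matching pursuit: recover a k-sparse x from y = A @ x."""
    residual, support = y.copy(), []
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the selected columns, then update residual.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x = np.zeros(A.shape[1])
    x[support] = coef
    return x

rng = np.random.default_rng(1)
n, m, k = 256, 64, 3                       # signal length, measurements, sparsity
x_true = np.zeros(n)                       # spike-sparse "AE" signal
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k) + 5.0
A = rng.normal(size=(m, n)) / np.sqrt(m)   # random Gaussian sensing matrix
y = A @ x_true                             # compressed measurements (m << n)
x_hat = omp(A, y, k)
```

The point of the example is the ratio m/n = 1/4: a sensor node storing `y` instead of the full record cuts memory and sampling-rate demands accordingly, and reconstruction is only invoked when classification in the compressed domain flags an event.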

  2. Portable Electronic Nose Based on Electrochemical Sensors for Food Quality Assessment

    PubMed Central

    Dymerski, Tomasz; Gębicki, Jacek; Namieśnik, Jacek

    2017-01-01

    The steady increase in global consumption puts a strain on agriculture and might lead to a decrease in food quality. Currently used techniques of food analysis are often labour-intensive and time-consuming and require extensive sample preparation. For that reason, there is a demand for novel methods that could be used for rapid food quality assessment. A technique based on the use of an array of chemical sensors for holistic analysis of the sample’s headspace is called electronic olfaction. In this article, a prototype of a portable, modular electronic nose intended for food analysis is described. Using the SVM method, it was possible to classify samples of poultry meat based on shelf-life with 100% accuracy, and also samples of rapeseed oil based on the degree of thermal degradation with 100% accuracy. The prototype was also used to detect adulterations of extra virgin olive oil with rapeseed oil with 82% overall accuracy. Due to the modular design, the prototype offers the advantages of solutions targeted for analysis of specific food products, at the same time retaining the flexibility of application. Furthermore, its portability allows the device to be used at different stages of the production and distribution process. PMID:29186754

  3. The implementation of liquid-based cytology for lung and pleural-based diseases.

    PubMed

    Michael, Claire W; Bedrossian, Carlos C W M

    2014-01-01

    First introduced for the processing of cervico-vaginal samples, liquid-based cytology (LBC) soon found application in nongynecological specimens, including bronchoscopic brushings, washings and transcutaneous and transbronchial aspiration biopsy of the lung as well as pleural effusions. This article reviews the existing literature related to these specimens along with the authors' own experience. A literature review was conducted through Ovid MEDLINE and PubMed search engines using several key words. Most of the literature is based on data collected through the use of split samples. The data confirms that the use of LBC is an acceptable, and sometimes superior, alternative to the conventional preparations (CP). LBC offers several advantages, including the ability to transport in a stable collecting media, elimination of obscuring elements, ease of screening, excellent preservation, random representative sample, and application of ancillary techniques on additional preparations. Some diagnostic pitfalls related to the introduced artifacts were reported. The utilization of LBC offers many advantages over CP and has a diagnostic accuracy that is equal to or surpasses that of CP. LBC affords a bridge to the future application of molecular and other ancillary techniques to cytology. Knowledge of the morphological artifacts is useful at the early stages of implementation.

  4. Using next-generation sequencing for high resolution multiplex analysis of copy number variation from nanogram quantities of DNA from formalin-fixed paraffin-embedded specimens.

    PubMed

    Wood, Henry M; Belvedere, Ornella; Conway, Caroline; Daly, Catherine; Chalkley, Rebecca; Bickerdike, Melissa; McKinley, Claire; Egan, Phil; Ross, Lisa; Hayward, Bruce; Morgan, Joanne; Davidson, Leslie; MacLennan, Ken; Ong, Thian K; Papagiannopoulos, Kostas; Cook, Ian; Adams, David J; Taylor, Graham R; Rabbitts, Pamela

    2010-08-01

The use of next-generation sequencing technologies to produce genomic copy number data has recently been described. Most approaches, however, rely on optimal starting DNA, and are therefore unsuitable for the analysis of formalin-fixed paraffin-embedded (FFPE) samples, which largely precludes the analysis of many tumour series. We have sought to challenge the limits of this technique with regards to quality and quantity of starting material and the depth of sequencing required. We confirm that the technique can be used to interrogate DNA from cell lines, fresh frozen material and FFPE samples to assess copy number variation. We show that as little as 5 ng of DNA is needed to generate a copy number karyogram, and follow this up with data from a series of FFPE biopsies and surgical samples. We have used various levels of sample multiplexing to demonstrate the adjustable resolution of the methodology, depending on the number of samples and available resources. We also demonstrate reproducibility by use of replicate samples and comparison with microarray-based comparative genomic hybridization (aCGH) and digital PCR. This technique can be valuable in both the analysis of routine diagnostic samples and in examining large repositories of fixed archival material.

  5. Critical review of analytical techniques for safeguarding the thorium-uranium fuel cycle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hakkila, E.A.

    1978-10-01

    Conventional analytical methods applicable to the determination of thorium, uranium, and plutonium in feed, product, and waste streams from reprocessing thorium-based nuclear reactor fuels are reviewed. Separations methods of interest for these analyses are discussed. Recommendations concerning the applicability of various techniques to reprocessing samples are included. 15 tables, 218 references.

  6. Case-based reasoning for space applications: Utilization of prior experience in knowledge-based systems

    NASA Technical Reports Server (NTRS)

    King, James A.

    1987-01-01

    The goal is to explain Case-Based Reasoning as a vehicle to establish knowledge-based systems based on experimental reasoning for possible space applications. This goal will be accomplished through an examination of reasoning based on prior experience in a sample domain, and also through a presentation of proposed space applications which could utilize Case-Based Reasoning techniques.

  7. Tensorial dynamic time warping with articulation index representation for efficient audio-template learning.

    PubMed

    Le, Long N; Jones, Douglas L

    2018-03-01

    Audio classification techniques often depend on the availability of a large labeled training dataset for successful performance. However, in many application domains of audio classification (e.g., wildlife monitoring), obtaining labeled data is still a costly and laborious process. Motivated by this observation, a technique is proposed to efficiently learn a clean template from a few labeled, but likely corrupted (by noise and interferences), data samples. This learning can be done efficiently via tensorial dynamic time warping on the articulation index-based time-frequency representations of audio data. The learned template can then be used in audio classification following the standard template-based approach. Experimental results show that the proposed approach outperforms both (1) the recurrent neural network approach and (2) the state-of-the-art in the template-based approach on a wildlife detection application with few training samples.
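The tensorial dynamic time warping above operates on articulation-index time-frequency representations; the plain 1-D DTW it generalizes can be sketched as follows, using a toy template rather than real audio:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Extend the cheapest of diagonal match, insertion, or deletion.
            D[i, j] = cost + min(D[i - 1, j - 1], D[i - 1, j], D[i, j - 1])
    return D[n, m]

# A time-stretched copy of a template still matches it exactly under DTW,
# whereas a point-by-point distance would penalize the misalignment.
template = [0.0, 1.0, 2.0, 1.0, 0.0]
stretched = [0.0, 0.0, 1.0, 2.0, 2.0, 1.0, 0.0]
```

This warping invariance is what lets a clean template learned from a few corrupted samples still match calls of varying duration in the wildlife-monitoring setting.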

  8. Fast Measurement of Soluble Solid Content in Mango Based on Visible and Infrared Spectroscopy Technique

    NASA Astrophysics Data System (ADS)

    Yu, Jiajia; He, Yong

Mango is a popular tropical fruit, and soluble solid content is an important quality index. In this study, the visible and short-wave near-infrared spectroscopy (VIS/SWNIR) technique was applied in order to investigate the feasibility of using VIS/SWNIR spectroscopy to measure the soluble solid content in mango and to validate the performance of selected sensitive bands. The calibration set was formed by 135 mango samples, while the remaining 45 mango samples formed the prediction set. The combination of partial least squares and backpropagation artificial neural networks (PLS-BP) was used to build the prediction model based on raw spectrum data. Based on PLS-BP, the determination coefficient for prediction (Rp) was 0.757, and the process is simple and easy to operate. Compared with the partial least squares (PLS) result, the performance of PLS-BP is better.

  9. Containerless Studies of Nucleation and Undercooling

    NASA Technical Reports Server (NTRS)

    Trinh, E. H.

    1985-01-01

The long term research goals are to perform experiments to determine the achievable limits of undercooling, the characteristics of heterogeneous nucleation, and the physical properties of significantly undercooled melts. The techniques used are based on the newly developed containerless manipulation methods afforded by acoustic levitation. Ground-based investigations involved 0.1 to 2 mm specimens of pure metals and alloys (In, Ga, Sn, Ga-In, ...) as well as glass-forming organic compounds (O-Terphenyl). A currently operating ultrasonic high temperature apparatus has allowed the ground-based levitation of 1 to 2 mm samples of solid aluminum at 550 deg C in an argon atmosphere. Present work is concentrating on the undercooling of pure metal samples (In, Sn), and on the measurements of surface tension and viscosity of the undercooled melts via shape oscillation techniques monitored through optical detection methods. The sound velocity of undercooled O-Terphenyl is being measured in an immiscible liquid levitation cell.

  10. Veterinary extension on sampling techniques related to heartwater research.

    PubMed

    Steyn, H C; McCrindle, C M E; Du Toit, D

    2010-09-01

    Heartwater, a tick-borne disease caused by Ehrlichia ruminantium, is considered to be a significant cause of mortality amongst domestic and wild ruminants in South Africa. The main vector is Amblyomma hebraeum and although previous epidemiological studies have outlined endemic areas based on mortalities, these have been limited by diagnostic methods which relied mainly on positive brain smears. The indirect fluorescent antibody test (IFA) has a low specificity for heartwater organisms as it cross-reacts with some other species. Since the advent of biotechnology and genomics, molecular epidemiology has evolved using the methodology of traditional epidemiology coupled with the new molecular techniques. A new quantitative real-time polymerase chain reaction (qPCR) test has been developed for rapid and accurate diagnosis of heartwater in the live animal. This method can also be used to survey populations of A. hebraeum ticks for heartwater. Sampling whole blood and ticks for this qPCR differs from routine serum sampling, which is used for many serological tests. Veterinary field staff, particularly animal health technicians, are involved in surveillance and monitoring of controlled and other diseases of animals in South Africa. However, it was found that the sampling of whole blood was not done correctly, probably because it is a new sampling technique specific for new technology, where the heartwater organism is much more labile than the serum antibodies required for other tests. This qPCR technique is highly sensitive and can diagnose heartwater in the living animal within 2 hours, in time to treat it. Poor sampling techniques that decrease the sensitivity of the test will, however, result in a false negative diagnosis. This paper describes the development of a skills training programme for para-veterinary field staff, to facilitate research into the molecular epidemiology of heartwater in ruminants and eliminate any sampling bias due to collection errors. 
Humane handling techniques were also included in the training, in line with the current focus on improved livestock welfare.

  11. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa.

    PubMed

    Siegel, Chloe S; Stevenson, Florence O; Zimmer, Elizabeth A

    2017-02-01

    An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)-based extraction methods from silica-dried samples. DNA was extracted using FTA cards according to the manufacturer's protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation.

  12. Highly efficient peptide separations in proteomics. Part 2: bi- and multidimensional liquid-based separation techniques.

    PubMed

    Sandra, Koen; Moshir, Mahan; D'hondt, Filip; Tuytten, Robin; Verleysen, Katleen; Kas, Koen; François, Isabelle; Sandra, Pat

    2009-04-15

    Multidimensional liquid-based separation techniques are described for maximizing the resolution of the enormous number of peptides generated upon tryptic digestion of proteomes, and hence, reduce the spatial and temporal complexity of the sample to a level that allows successful mass spectrometric analysis. This review complements the previous contribution on unidimensional high performance liquid chromatography (HPLC). Both chromatography and electrophoresis will be discussed albeit with reversed-phase HPLC (RPLC) as the final separation dimension prior to MS analysis.

  13. Matrix Assisted Laser Desorption Ionization Mass Spectrometric Analysis of Bacillus anthracis: From Fingerprint Analysis of the Bacterium to Quantification of its Toxins in Clinical Samples

    NASA Astrophysics Data System (ADS)

    Woolfitt, Adrian R.; Boyer, Anne E.; Quinn, Conrad P.; Hoffmaster, Alex R.; Kozel, Thomas R.; de, Barun K.; Gallegos, Maribel; Moura, Hercules; Pirkle, James L.; Barr, John R.

    A range of mass spectrometry-based techniques have been used to identify, characterize and differentiate Bacillus anthracis, both in culture for forensic applications and for diagnosis during infection. This range of techniques could usefully be considered to exist as a continuum, based on the degrees of specificity involved. We show two examples here, a whole-organism fingerprinting method and a high-specificity assay for one unique protein, anthrax lethal factor.

  14. Concentration and separation of biological organisms by ultrafiltration and dielectrophoresis

    DOEpatents

    Simmons, Blake A.; Hill, Vincent R.; Fintschenko, Yolanda; Cummings, Eric B.

    2010-10-12

Disclosed is a method for monitoring sources of public water supply for a variety of pathogens by using a combination of ultrafiltration techniques together with dielectrophoretic separation techniques. Because water-borne pathogens, whether present due to "natural" contamination or intentional introduction, would likely be present in drinking water at low concentrations when samples are collected for monitoring or outbreak investigations, an approach is needed to quickly and efficiently concentrate and separate particles such as viruses, bacteria, and parasites in large volumes of water (e.g., 100 L or more) while simultaneously reducing the sample volume to levels sufficient for detecting low concentrations of microbes (e.g., <10 mL). The technique is also designed to screen the separated microbes based on specific conductivity and size.

  15. Numerical dispersion compensation for Partial Coherence Interferometry and Optical Coherence Tomography.

    PubMed

    Fercher, A; Hitzenberger, C; Sticker, M; Zawadzki, R; Karamata, B; Lasser, T

    2001-12-03

Dispersive samples introduce a wavelength-dependent phase distortion to the probe beam. This leads to a noticeable loss of depth resolution in high resolution OCT using broadband light sources. The standard technique to avoid this consequence is to balance the dispersion of the sample by arranging a dispersive material in the reference arm. However, the impact of dispersion is depth dependent. A corresponding depth-dependent dispersion balancing technique is difficult to implement. Here we present a numerical dispersion compensation technique for Partial Coherence Interferometry (PCI) and Optical Coherence Tomography (OCT) based on numerical correlation of the depth scan signal with a depth-variant kernel. It can be used a posteriori and provides depth-dependent dispersion compensation. Examples of dispersion compensated depth scan signals obtained from microscope cover glasses are presented.

  16. Non-contact measurement of diamagnetic susceptibility change by a magnetic levitation technique

    NASA Astrophysics Data System (ADS)

    Takahashi, K.; Mogi, I.; Awaji, S.; Watanabe, K.

    2011-03-01

    A new method for measuring the temperature dependence of the diamagnetic susceptibility is described. It is based on the Faraday method and employs a magnetic levitation technique. The susceptibility of a magnetically levitating diamagnetic sample is determined from the product of the magnetic flux density and the field gradient at the levitating position observed using a micro CCD camera. The susceptibility of a sample during containerless melting and solidification can be measured to a precision of better than ±0.05%. The temperature dependence of the susceptibility of paraffin wax was measured by the magnetic levitation technique with an accuracy of ±0.25%. This method enables sensitive and contactless measurements of the diamagnetic susceptibility across the melting point with in situ observations.
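The susceptibility determination from the product of flux density and field gradient can be illustrated with the underlying force balance. A sketch under stated assumptions (SI units, volume susceptibility, and the simple balance (chi/mu0)·B·dB/dz = -rho·g at the levitation position); the authors' apparatus details and calibration are not reproduced here.

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, T*m/A

def susceptibility_from_levitation(density, B, dBdz, g=9.81):
    # Force balance for a levitating diamagnetic sample:
    #   (chi / mu0) * B * dB/dz = -density * g
    # => chi = -mu0 * density * g / (B * dB/dz)
    # Sign convention (assumption): dB/dz taken positive, so a levitating
    # diamagnetic sample yields chi < 0.
    return -MU0 * density * g / (B * dBdz)
```

With water-like density (1000 kg/m^3) levitating where B·dB/dz is about 1370 T^2/m, this returns roughly -9e-6, the familiar order of magnitude for water's volume susceptibility.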

  17. Measuring molecular biomarkers in epidemiologic studies: laboratory techniques and biospecimen considerations.

    PubMed

    Erickson, Heidi S

    2012-09-28

    The future of personalized medicine depends on the ability to efficiently and rapidly elucidate a reliable set of disease-specific molecular biomarkers. High-throughput molecular biomarker analysis methods have been developed to identify disease risk, diagnostic, prognostic, and therapeutic targets in human clinical samples. Currently, high throughput screening allows us to analyze thousands of markers from one sample or one marker from thousands of samples and will eventually allow us to analyze thousands of markers from thousands of samples. Unfortunately, the inherent nature of current high throughput methodologies, clinical specimens, and cost of analysis is often prohibitive for extensive high throughput biomarker analysis. This review summarizes the current state of high throughput biomarker screening of clinical specimens applicable to genetic epidemiology and longitudinal population-based studies with a focus on considerations related to biospecimens, laboratory techniques, and sample pooling. Copyright © 2012 John Wiley & Sons, Ltd.

  18. Manual versus automated blood sampling: impact of repeated blood sampling on stress parameters and behavior in male NMRI mice

    PubMed Central

    Kalliokoski, Otto; Sørensen, Dorte B; Hau, Jann; Abelson, Klas S P

    2014-01-01

    Facial vein (cheek blood) and caudal vein (tail blood) phlebotomy are two commonly used techniques for obtaining blood samples from laboratory mice, while automated blood sampling through a permanent catheter is a relatively new technique in mice. The present study compared physiological parameters, glucocorticoid dynamics as well as the behavior of mice sampled repeatedly for 24 h by cheek blood, tail blood or automated blood sampling from the carotid artery. Mice subjected to cheek blood sampling lost significantly more body weight, had elevated levels of plasma corticosterone, excreted more fecal corticosterone metabolites, and expressed more anxious behavior than did the mice of the other groups. Plasma corticosterone levels of mice subjected to tail blood sampling were also elevated, although less significantly. Mice subjected to automated blood sampling were less affected with regard to the parameters measured, and expressed less anxious behavior. We conclude that repeated blood sampling by automated blood sampling and from the tail vein is less stressful than cheek blood sampling. The choice between automated blood sampling and tail blood sampling should be based on the study requirements, the resources of the laboratory and skills of the staff. PMID:24958546

  19. Toward More Evidence-Based Practice

    PubMed Central

    Hotelling, Barbara A.

    2005-01-01

    Childbirth educators are responsible for providing expectant parents with evidence-based information. In this column, the author suggests resources where educators can find evidence-based research for best practices. Additionally, the author describes techniques for childbirth educators to use in presenting research-based information in their classes. A sample of Web sites and books that offer evidence-based resources for expectant parents is provided. PMID:17273422

  20. Analytical model for real time, noninvasive estimation of blood glucose level.

    PubMed

    Adhyapak, Anoop; Sidley, Matthew; Venkataraman, Jayanti

    2014-01-01

    The paper presents an analytical model to estimate blood glucose level from measurements made non-invasively and in real time by an antenna strapped to a patient's wrist. The RIT ETA Lab research group has shown promising evidence that an antenna's resonant frequency can track changes in glucose concentration in real time. Based on an in-vitro study of blood samples of diabetic patients, the paper presents a modified Cole-Cole model that incorporates a factor to represent the change in glucose level. A calibration technique using the input impedance technique is discussed, and the results show a good estimation as compared to the glucose meter readings. An alternate calibration methodology has been developed that is based on the shift in the antenna resonant frequency, using an equivalent circuit model containing a shunt capacitor to represent the shift in resonant frequency with changing glucose levels. Work in progress includes optimizing the technique with a larger sample of patients.
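A single-pole Cole-Cole model with a glucose-dependent modification can be sketched as follows. The linear scaling of the dielectric increment with glucose concentration, and every parameter value below, are illustrative assumptions for the sketch, not the paper's fitted model.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def cole_cole(omega, eps_inf, d_eps, tau, alpha, sigma):
    # Standard single-pole Cole-Cole relaxation plus a static-conductivity term:
    #   eps*(w) = eps_inf + d_eps / (1 + (j*w*tau)^(1-alpha)) + sigma / (j*w*eps0)
    return (eps_inf
            + d_eps / (1 + (1j * omega * tau) ** (1 - alpha))
            + sigma / (1j * omega * EPS0))

def cole_cole_glucose(omega, glucose_mg_dl, k=-1e-4, **params):
    # Illustrative modification (an assumption, not the authors' model):
    # the dielectric increment is scaled linearly with glucose concentration,
    # with k a hypothetical sensitivity coefficient.
    p = dict(params)
    p["d_eps"] = p["d_eps"] * (1 + k * glucose_mg_dl)
    return cole_cole(omega, **p)
```

With a negative sensitivity coefficient, higher glucose lowers the real permittivity, which would shift an antenna's resonant frequency in a consistent direction.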

  1. Description of Student’s Metacognitive Ability in Understanding and Solving Mathematics Problem

    NASA Astrophysics Data System (ADS)

    Ahmad, Herlina; Febryanti, Fatimah; Muthmainnah

    2018-01-01

    This qualitative research aimed to describe students' metacognitive ability in understanding and solving mathematics problems. The subjects were first-year students in the computer and networking department of SMK Mega Link Majene, selected using a purposive sampling technique. Data on student achievement were collected using an achievement test and an interview guide, and observation was used to confirm that the teacher employed a teaching model that develops metacognition. Data were analyzed through data reduction, presentation, and conclusion drawing. The overall findings show that students' metacognitive ability generally does not develop optimally. This is attributed to the limited scope of the materials and to cognitive teaching strategies that rely on verbal presentation and continuous drill in cognitive tasks such as understanding and solving problems.

  2. Taking a statistical approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wild, M.; Rouhani, S.

    1995-02-01

    A typical site investigation entails extensive sampling and monitoring. In the past, sampling plans have been designed on purely ad hoc bases, leading to significant expenditures and, in some cases, collection of redundant information. In many instances, sampling costs exceed the true worth of the collected data. The US Environmental Protection Agency (EPA) therefore has advocated the use of geostatistics to provide a logical framework for sampling and analysis of environmental data. Geostatistical methodology uses statistical techniques for the spatial analysis of a variety of earth-related data. Geostatistics was developed by the mining industry to estimate ore concentrations. The same procedure is effective in quantifying environmental contaminants in soils for risk assessments. Unlike classical statistical techniques, geostatistics offers procedures to incorporate the underlying spatial structure of the investigated field. Sample points spaced close together tend to be more similar than samples spaced further apart. This can guide sampling strategies and determine complex contaminant distributions. Geostatistical techniques can be used to evaluate site conditions on the basis of regular, irregular, random and even spatially biased samples. In most environmental investigations, it is desirable to concentrate sampling in areas of known or suspected contamination. The rigorous mathematical procedures of geostatistics allow for accurate estimates at unsampled locations, potentially reducing sampling requirements. The use of geostatistics serves as a decision-aiding and planning tool and can significantly reduce short-term site assessment costs, long-term sampling and monitoring needs, as well as lead to more accurate and realistic remedial design criteria.
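The spatial-structure idea (samples close together are more similar than samples far apart) is quantified by the empirical semivariogram, the basic tool of geostatistical analysis. A minimal sketch in pure Python, assuming 2D isotropic data and a single lag bin:

```python
import math

def empirical_semivariogram(points, values, lag, tol):
    # gamma(h) = (1 / 2N(h)) * sum over pairs (i, j) with |dist - lag| <= tol
    # of (z_i - z_j)^2.  Smaller gamma at short lags indicates spatial
    # correlation that kriging can exploit for estimates at unsampled locations.
    count, acc = 0, 0.0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            dx = points[i][0] - points[j][0]
            dy = points[i][1] - points[j][1]
            dist = math.hypot(dx, dy)
            if abs(dist - lag) <= tol:
                acc += (values[i] - values[j]) ** 2
                count += 1
    return acc / (2 * count) if count else None
```

Evaluating gamma at several lags and fitting a model (spherical, exponential, etc.) is the usual next step before kriging; that fitting step is not shown here.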

  3. Fuel Load (FL)

    Treesearch

    Duncan C. Lutes; Robert E. Keane

    2006-01-01

    The Fuel Load method (FL) is used to sample dead and down woody debris, determine depth of the duff/litter profile, estimate the proportion of litter in the profile, and estimate total vegetative cover and dead vegetative cover. Down woody debris (DWD) is sampled using the planar intercept technique based on the methodology developed by Brown (1974). Pieces of dead...
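The planar intercept technique estimates down woody fuel from the squared diameters of pieces crossing a vertical sampling plane. A hedged sketch using the generic line-intersect volume estimator (volume per unit area = pi^2 * sum(d^2) / 8L, after Van Wagner); Brown's operational method adds species, slope, and tilt corrections not shown here, and the wood-density default is an illustrative assumption:

```python
import math

def fuel_load_line_intersect(diameters_m, transect_len_m, density_kg_m3=400.0):
    # Generic line-intersect estimator:
    #   volume per unit area = pi^2 * sum(d_i^2) / (8 * L)
    # multiplied by an assumed wood density to give mass per unit area (kg/m^2).
    vol_per_area = math.pi ** 2 * sum(d * d for d in diameters_m) / (8 * transect_len_m)
    return vol_per_area * density_kg_m3
```

For example, a single 10 cm piece intersecting a 10 m transect contributes roughly half a kilogram per square meter under the assumed density.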

  4. A parameter selection for Raman spectroscopy-based detection of chemical contaminants in food powders

    USDA-ARS?s Scientific Manuscript database

    Raman spectroscopy technique has proven to be a reliable method for detection of chemical contaminants in food ingredients and products. To detect each contaminant particle in a food sample, it is important to determine the effective depth of penetration of laser through the food sample and the corr...

  5. Oral Reading Fluency Growth: A Sample of Methodology and Findings. Research Brief 6

    ERIC Educational Resources Information Center

    Tindal, Gerald; Nese, Joseph F. T.

    2013-01-01

    For the past 20 years, the growth of students' oral reading fluency has been investigated by a number of researchers using curriculum-based measurement. These researchers have used varied methods (student samples, measurement procedures, and analytical techniques) and yet have converged on a relatively consistent finding: General education…

  6. Integration of gel-based and gel-free proteomic data for functional analysis of proteins through Soybean Proteome Database.

    PubMed

    Komatsu, Setsuko; Wang, Xin; Yin, Xiaojian; Nanjo, Yohei; Ohyanagi, Hajime; Sakata, Katsumi

    2017-06-23

    The Soybean Proteome Database (SPD) stores data on soybean proteins obtained with gel-based and gel-free proteomic techniques. The database was constructed to provide information on proteins for functional analyses. The majority of the data is focused on soybean (Glycine max 'Enrei'). The growth and yield of soybean are strongly affected by environmental stresses such as flooding. The database was originally constructed using data on soybean proteins separated by two-dimensional polyacrylamide gel electrophoresis, which is a gel-based proteomic technique. Since 2015, the database has been expanded to incorporate data obtained by label-free mass spectrometry-based quantitative proteomics, which is a gel-free proteomic technique. Here, the portions of the database consisting of gel-free proteomic data are described. The gel-free proteomic database contains 39,212 proteins identified in 63 sample sets, such as temporal and organ-specific samples of soybean plants grown under flooding stress or non-stressed conditions. In addition, data on organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored. Furthermore, the database integrates multiple omics data such as genomics, transcriptomics, metabolomics, and proteomics. The SPD database is accessible at http://proteome.dc.affrc.go.jp/Soybean/. The Soybean Proteome Database stores data obtained from both gel-based and gel-free proteomic techniques. The gel-free proteomic database comprises 39,212 proteins identified in 63 sample sets, such as different organs of soybean plants grown under flooding stress or non-stressed conditions in a time-dependent manner. In addition, organellar proteins identified in mitochondria, nuclei, and endoplasmic reticulum are stored in the gel-free proteomics database. A total of 44,704 proteins, including 5490 proteins identified using a gel-based proteomic technique, are stored in the SPD. 
It accounts for approximately 80% of all predicted proteins from genome sequences, though there are overlapping proteins. Based on the demonstrated application of data stored in the database for functional analyses, it is suggested that these data will be useful for analyses of biological mechanisms in soybean. Furthermore, coupled with recent advances in information and communication technology, the usefulness of this database would increase in analyses of biological mechanisms. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. AFM-porosimetry: density and pore volume measurements of particulate materials.

    PubMed

    Sörensen, Malin H; Valle-Delgado, Juan J; Corkery, Robert W; Rutland, Mark W; Alberius, Peter C

    2008-06-01

    We introduced the novel technique of AFM-porosimetry and applied it to measure the total pore volume of porous particles with a spherical geometry. The methodology is based on using an atomic force microscope as a balance to measure masses of individual particles. Several particles within the same batch were measured, and by plotting particle mass versus particle volume, the bulk density of the sample can be extracted from the slope of the linear fit. The pore volume is then calculated from the densities of the bulk and matrix materials, respectively. In contrast to nitrogen sorption and mercury porosimetry, this method is capable of measuring the total pore volume regardless of pore size distribution and pore connectivity. In this study, three porous samples were investigated by AFM-porosimetry: one ordered mesoporous sample and two disordered foam structures. All samples were based on a matrix of amorphous silica templated by a block copolymer, Pluronic F127, swollen to various degrees with poly(propylene glycol). In addition, the density of silica spheres without a template was measured by two independent techniques: AFM and the Archimedes principle.
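The balance idea, bulk density from the slope of particle mass versus particle volume and pore volume from the bulk and matrix densities, can be sketched directly. Assumed here (as illustration, not the authors' exact fitting procedure): a least-squares line through the origin, and specific pore volume v_p = 1/rho_bulk - 1/rho_matrix.

```python
def bulk_density(masses, volumes):
    # Least-squares slope through the origin of mass vs. volume:
    #   rho = sum(m * V) / sum(V^2)
    # Each (m, V) pair is one particle weighed with the AFM "balance".
    return sum(m * v for m, v in zip(masses, volumes)) / sum(v * v for v in volumes)

def specific_pore_volume(rho_bulk, rho_matrix):
    # Pore volume per unit mass from the bulk (particle) density and the
    # matrix (skeletal) density of the non-porous material:
    #   v_p = 1/rho_bulk - 1/rho_matrix
    return 1.0 / rho_bulk - 1.0 / rho_matrix
```

Unlike nitrogen sorption or mercury porosimetry, this route needs no assumption about pore size distribution or connectivity, since only masses and external volumes enter the calculation.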

  8. Accounting for sensor calibration, data validation, measurement and sampling uncertainties in monitoring urban drainage systems.

    PubMed

    Bertrand-Krajewski, J L; Bardin, J P; Mourad, M; Béranger, Y

    2003-01-01

    Assessing the functioning and the performance of urban drainage systems on both rainfall event and yearly time scales is usually based on online measurements of flow rates and on samples of influent and effluent for some rainfall events per year. In order to draw pertinent scientific and operational conclusions from the measurement results, it is absolutely necessary to use appropriate methods and techniques in order to i) calibrate sensors and analytical methods, ii) validate raw data, iii) evaluate measurement uncertainties, iv) evaluate the number of rainfall events to sample per year in order to determine performance indicators with a given uncertainty. Based on previous work, the paper gives a synthetic review of the required methods and techniques, and illustrates their application to storage and settling tanks. Experiments show that, under controlled and careful experimental conditions, relative uncertainties are about 20% for flow rates in sewer pipes, 6-10% for volumes, 25-35% for TSS concentrations and loads, and 18-276% for TSS removal rates. In order to evaluate the annual pollutant interception efficiency of storage and settling tanks with a given uncertainty, efforts should first be devoted to decrease the sampling uncertainty by increasing the number of sampled events.
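Point (iv), choosing how many rainfall events to sample for a target uncertainty, follows the usual 1/sqrt(n) shrinkage of the standard error of a mean. A generic sketch (not the paper's procedure), assuming independent events and a normal approximation with coverage factor z:

```python
import math

def events_needed(rel_std_single, target_rel_uncertainty, z=1.96):
    # Standard error of the mean of n independent events scales as s / sqrt(n),
    # so to reach a target half-width E at coverage factor z:
    #   n = (z * s / E)^2
    # rel_std_single: relative standard deviation of a single event's indicator.
    n = (z * rel_std_single / target_rel_uncertainty) ** 2
    return math.ceil(n)
```

For instance, an event-to-event spread of 30% and a target of 10% at 95% coverage calls for sampling about 35 events, which makes concrete why reducing sampling uncertainty dominates the effort for annual efficiency estimates.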

  9. Individual and organizational factors related to community clinicians’ use of therapy techniques in a large public mental health system

    PubMed Central

    Marcus, Steven; Aarons, Gregory A.; Hoagwood, Kimberly E.; Schoenwald, Sonja; Evans, Arthur C.; Hurford, Matthew O.; Hadley, Trevor; Barg, Frances K.; Walsh, Lucia M.; Adams, Danielle R.; Mandell, David S.

    2015-01-01

    Importance Few studies have examined the effects of both clinician and organizational characteristics on the use of evidence-based practices in mental healthcare. Improved understanding of these factors could guide future implementation efforts to ensure effective adoption, implementation, and sustainment of evidence-based practices. Objective To estimate the relative contribution of clinician and organizational factors on clinician self-reported use of cognitive-behavioral, family, and psychodynamic techniques within the context of a large-scale effort to increase use of evidence-based practices in an urban public mental health system serving youth and families. Design Observational and cross-sectional. Data collected in 2013. Setting Twenty-three organizations. Participants We used purposive sampling to recruit the 29 largest child-serving agencies, which together serve approximately 80% of youth receiving publicly funded mental health care. The final sample included 19 agencies with 23 sites, 130 therapists, 36 supervisors, and 22 executive administrators. Main Outcome Measures Clinician self-reported use of cognitive-behavioral, family, and psychodynamic techniques, as measured by the Therapist Procedures Checklist – Family Revised. Results Linear mixed-effects regression models were used; models included random intercepts for organization to account for nesting of clinicians within organization. Clinician factors accounted for the following percentage of the overall variation: cognitive-behavioral (16%), family (7%), psychodynamic (20%). Organizational factors accounted for the following percentage of the overall variation: cognitive-behavioral (23%), family (19%), psychodynamic (7%). 
Older clinicians and clinicians with more open attitudes were more likely to endorse use of cognitive behavioral techniques, as were those in organizations that had spent fewer years participating in evidence-based practice initiatives, had more resistant cultures, and had more functional climates. Female clinicians were more likely to endorse use of family techniques, as were those in organizations employing more fee-for-service staff and with more stressful climates. Clinicians with more divergent attitudes and less knowledge about evidence-based practices were more likely to use psychodynamic techniques. Conclusions & Relevance This study suggests that both clinician and organizational factors are important in explaining clinician behavior and the use of evidence-based practices, but that their relative importance varies by therapeutic technique. PMID:25686473

  10. Review of in situ derivatization techniques for enhanced bioanalysis using liquid chromatography with mass spectrometry.

    PubMed

    Baghdady, Yehia Z; Schug, Kevin A

    2016-01-01

    Accurate and specific analysis of target molecules in complex biological matrices remains a significant challenge, especially when ultra-trace detection limits are required. Liquid chromatography with mass spectrometry is often the method of choice for bioanalysis. Conventional sample preparation and clean-up methods prior to the analysis of biological fluids such as liquid-liquid extraction, solid-phase extraction, or protein precipitation are time-consuming, tedious, and can negatively affect target recovery and detection sensitivity. An alternative or complementary strategy is the use of an off-line or on-line in situ derivatization technique. In situ derivatization can be incorporated to directly derivatize target analytes in their native biological matrices, without any prior sample clean-up methods, to substitute or even enhance the extraction and preconcentration efficiency of these traditional sample preparation methods. Designed appropriately, it can reduce the number of sample preparation steps necessary prior to analysis. Moreover, in situ derivatization can be used to enhance the performance of the developed liquid chromatography with mass spectrometry-based bioanalysis methods regarding stability, chromatographic separation, selectivity, and ionization efficiency. This review presents an overview of the commonly used in situ derivatization techniques coupled to liquid chromatography with mass spectrometry-based bioanalysis to guide and to stimulate future research. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Surface sampling techniques for 3D object inspection

    NASA Astrophysics Data System (ADS)

    Shih, Chihhsiong S.; Gerhardt, Lester A.

    1995-03-01

    While the uniform sampling method is quite popular for pointwise measurement of manufactured parts, this paper proposes three novel sampling strategies which emphasize 3D non-uniform inspection capability. They are: (a) the adaptive sampling, (b) the local adjustment sampling, and (c) the finite element centroid sampling techniques. The adaptive sampling strategy is based on a recursive surface subdivision process. Two different approaches are described for this adaptive sampling strategy. One uses triangle patches while the other uses rectangle patches. Several real world objects were tested using these two algorithms. Preliminary results show that sample points are distributed more closely around edges, corners, and vertices as desired for many classes of objects. Adaptive sampling using triangle patches is shown to generally perform better than both uniform and adaptive sampling using rectangle patches. The local adjustment sampling strategy uses a set of predefined starting points and then finds the local optimum position of each nodal point. This method approximates the object by moving the points toward object edges and corners. In a hybrid approach, uniform point sets and non-uniform point sets, first preprocessed by the adaptive sampling algorithm on a real world object, were then tested using the local adjustment sampling method. The results show that the initial point sets, when preprocessed by adaptive sampling using triangle patches, are moved the least amount of distance by the subsequently applied local adjustment method, again showing the superiority of this method. The finite element sampling technique samples the centroids of the surface triangle meshes produced from the finite element method. The performance of this algorithm was compared to that of the adaptive sampling using triangular patches. The adaptive sampling with triangular patches was once again shown to be better on different classes of objects.
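The recursive-subdivision idea behind adaptive sampling is easiest to see in one dimension: subdivide wherever a chord approximates the surface poorly, so samples concentrate near edges and regions of high curvature. A 1D sketch (the paper works on 3D triangle and rectangle patches; the flatness test below is an illustrative analogue, not the paper's criterion):

```python
def adaptive_sample(f, a, b, tol=1e-3, depth=10):
    # Recursive subdivision: if the midpoint of f deviates from the chord
    # between the endpoints by more than tol, split the interval and recurse.
    # Flat regions stop early; curved regions accumulate more sample points.
    mid = 0.5 * (a + b)
    chord = 0.5 * (f(a) + f(b))
    if depth == 0 or abs(f(mid) - chord) <= tol:
        return [a, b]
    left = adaptive_sample(f, a, mid, tol, depth - 1)
    right = adaptive_sample(f, mid, b, tol, depth - 1)
    return left[:-1] + right  # merge, dropping the duplicated midpoint
```

The 3D triangle-patch version replaces the chord test with a planarity test on each patch and splits triangles instead of intervals, but the recursion has the same shape.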

  12. Development and evaluation of an automatic labeling technique for spring small grains

    NASA Technical Reports Server (NTRS)

    Crist, E. P.; Malila, W. A. (Principal Investigator)

    1981-01-01

    A labeling technique is described which seeks to associate a sampling entity with a particular crop or crop group based on similarity of growing season and temporal-spectral patterns of development. Human analyst provide contextual information, after which labeling decisions are made automatically. Results of a test of the technique on a large, multi-year data set are reported. Grain labeling accuracies are similar to those achieved by human analysis techniques, while non-grain accuracies are lower. Recommendations for improvments and implications of the test results are discussed.

  13. Detection, characterization and quantification of inorganic engineered nanomaterials: A review of techniques and methodological approaches for the analysis of complex samples.

    PubMed

    Laborda, Francisco; Bolea, Eduardo; Cepriá, Gemma; Gómez, María T; Jiménez, María S; Pérez-Arantegui, Josefina; Castillo, Juan R

    2016-01-21

    The increasing demand of analytical information related to inorganic engineered nanomaterials requires the adaptation of existing techniques and methods, or the development of new ones. The challenge for the analytical sciences has been to consider the nanoparticles as a new sort of analytes, involving both chemical (composition, mass and number concentration) and physical information (e.g. size, shape, aggregation). Moreover, information about the species derived from the nanoparticles themselves and their transformations must also be supplied. Whereas techniques commonly used for nanoparticle characterization, such as light scattering techniques, show serious limitations when applied to complex samples, other well-established techniques, like electron microscopy and atomic spectrometry, can provide useful information in most cases. Furthermore, separation techniques, including flow field flow fractionation, capillary electrophoresis and hydrodynamic chromatography, are moving to the nano domain, mostly hyphenated to inductively coupled plasma mass spectrometry as element specific detector. Emerging techniques based on the detection of single nanoparticles by using ICP-MS, but also coulometry, are in their way to gain a position. Chemical sensors selective to nanoparticles are in their early stages, but they are very promising considering their portability and simplicity. Although the field is in continuous evolution, at this moment it is moving from proofs-of-concept in simple matrices to methods dealing with matrices of higher complexity and relevant analyte concentrations. To achieve this goal, sample preparation methods are essential to manage such complex situations. Apart from size fractionation methods, matrix digestion, extraction and concentration methods capable of preserving the nature of the nanoparticles are being developed. 
This review presents and discusses the state-of-the-art analytical techniques and sample preparation methods suitable for dealing with complex samples. Single- and multi-method approaches applied to solve the nanometrological challenges posed by a variety of stakeholders are also presented. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Investigating Effect of Origami-Based Instruction on Elementary Students' Spatial Skills and Perceptions

    ERIC Educational Resources Information Center

    Cakmak, Sedanur; Isiksal, Mine; Koc, Yusuf

    2014-01-01

    The authors' purpose was to investigate the effect of origami-based instruction on elementary students' spatial ability. The students' self-reported perceptions related to the origami-based instruction were also examined. Data was collected via purposive sampling techniques from students enrolled in a private elementary school. A spatial ability…

  15. Transforming Classrooms through Game-Based Learning: A Feasibility Study in a Developing Country

    ERIC Educational Resources Information Center

    Vate-U-Lan, Poonsri

    2015-01-01

    This article reports an exploratory study which investigated attitudes towards the practice of game-based learning in teaching STEM (science, technology, engineering and mathematics) within a Thai educational context. This self-administered Internet-based survey yielded 169 responses from a snowball sampling technique. Three fifths of respondents…

  16. Interferometric Dynamic Measurement: Techniques Based on High-Speed Imaging or a Single Photodetector

    PubMed Central

    Fu, Yu; Pedrini, Giancarlo

    2014-01-01

    In recent years, optical interferometry-based techniques have been widely used to perform noncontact measurement of dynamic deformation in different industrial areas. In these applications, various physical quantities need to be measured in any instant and the Nyquist sampling theorem has to be satisfied along the time axis on each measurement point. Two types of techniques were developed for such measurements: one is based on high-speed cameras and the other uses a single photodetector. The limitation of the measurement range along the time axis in camera-based technology is mainly due to the low capturing rate, while the photodetector-based technology can only do the measurement on a single point. In this paper, several aspects of these two technologies are discussed. For the camera-based interferometry, the discussion includes the introduction of the carrier, the processing of the recorded images, the phase extraction algorithms in various domains, and how to increase the temporal measurement range by using multiwavelength techniques. For the detector-based interferometry, the discussion mainly focuses on the single-point and multipoint laser Doppler vibrometers and their applications for measurement under extreme conditions. The results show the effort done by researchers for the improvement of the measurement capabilities using interferometry-based techniques to cover the requirements needed for the industrial applications. PMID:24963503

  17. Quantification of transuranic elements by time interval correlation spectroscopy of the detected neutrons

    PubMed

    Baeten; Bruggeman; Paepen; Carchon

    2000-03-01

    The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.

  18. Field Measurement of the Acoustic Nonlinearity Parameter in Turbine Blades

    NASA Technical Reports Server (NTRS)

    Hinton, Yolanda L.; Na, Jeong K.; Yost, William T.; Kessel, Gregory L.

    2000-01-01

    Nonlinear acoustics techniques were used to measure fatigue in turbine blades in a power generation plant. The measurements were made in the field using a reference based measurement technique, and a reference sample previously measured in the laboratory. The acoustic nonlinearity parameter showed significant increase with fatigue in the blades, as indicated by service age and areas of increased stress. The technique shows promise for effectively measuring fatigue in field applications and predicting subsequent failures.

  19. Instrumental color control for metallic coatings

    NASA Astrophysics Data System (ADS)

    Chou, W.; Han, Bing; Cui, Guihua; Rigg, Bryan; Luo, Ming R.

    2002-06-01

    This paper describes work investigating a suitable color quality control method for metallic coatings. A set of psychological experiments was carried out based upon 50 pairs of samples. The results were used to test the performance of various color difference formulae. Different techniques were developed by optimising the weights and/or the lightness parametric factors of color differences calculated from the four measuring angles. The results show that the new techniques give a significant improvement compared to conventional techniques.

  20. Light Scattering based detection of food pathogens

    USDA-ARS?s Scientific Manuscript database

    The current methods for detecting foodborne pathogens are mostly destructive (i.e., samples need to be pretreated), and require time, personnel, and laboratories for analyses. Optical methods including light scattering based techniques have gained a lot of attention recently due to their rapid a...

  1. Practical no-gold-standard evaluation framework for quantitative imaging methods: application to lesion segmentation in positron emission tomography

    PubMed Central

    Jha, Abhinav K.; Mena, Esther; Caffo, Brian; Ashrafinia, Saeed; Rahmim, Arman; Frey, Eric; Subramaniam, Rathan M.

    2017-01-01

    Abstract. Recently, a class of no-gold-standard (NGS) techniques has been proposed to evaluate quantitative imaging methods using patient data. These techniques provide figures of merit (FoMs) quantifying the precision of the estimated quantitative value without requiring repeated measurements and without requiring a gold standard. However, applying these techniques to patient data presents several practical difficulties including assessing the underlying assumptions, accounting for patient-sampling-related uncertainty, and assessing the reliability of the estimated FoMs. To address these issues, we propose statistical tests that provide confidence in the underlying assumptions and in the reliability of the estimated FoMs. Furthermore, the NGS technique is integrated within a bootstrap-based methodology to account for patient-sampling-related uncertainty. The developed NGS framework was applied to evaluate four methods for segmenting lesions from 18F-Fluoro-2-deoxyglucose positron emission tomography images of patients with head-and-neck cancer on the task of precisely measuring the metabolic tumor volume. The NGS technique consistently predicted the same segmentation method as the most precise method. The proposed framework provided confidence in these results, even when gold-standard data were not available. The bootstrap-based methodology indicated improved performance of the NGS technique with larger numbers of patient studies, as was expected, and yielded consistent results as long as data from more than 80 lesions were available for the analysis. PMID:28331883
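The bootstrap wrapper that accounts for patient-sampling uncertainty can be sketched generically: resample patients with replacement and recompute the figure of merit each time. A minimal percentile-bootstrap sketch; the NGS estimation itself is abstracted into the `stat` callable, which is an assumption of this illustration rather than the authors' estimator:

```python
import random

def bootstrap_ci(values, stat, n_boot=2000, alpha=0.05, seed=1):
    # Percentile bootstrap: draw n_boot resamples (with replacement) of the
    # patient-level values, recompute the statistic on each, and report the
    # central (1 - alpha) interval of the resampled statistics.
    rng = random.Random(seed)  # seeded for reproducibility
    stats = sorted(
        stat([rng.choice(values) for _ in values]) for _ in range(n_boot)
    )
    lo = stats[int(alpha / 2 * n_boot)]
    hi = stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi
```

Widening intervals as the patient sample shrinks is exactly the behavior the abstract reports: reliability improves with more patient studies and stabilizes once enough lesions are available.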

  2. Studies on the effect of dispersoid (ZrO2) in PVdF-co-HFP-based gel polymer electrolytes

    NASA Astrophysics Data System (ADS)

    Sivakumar, M.; Subadevi, R.; Muthupradeepa, R.

    2013-06-01

    Gel polymer electrolytes containing poly(vinylidene fluoride-co-hexafluoropropylene) (P(VdF-co-HFP)) / lithium bis(trifluoromethanesulfonyl)imide (LiTFSI) / a mixture of ethylene carbonate and propylene carbonate (EC+PC) with different concentrations of ZrO2 have been prepared using the solution casting technique. The conductivity of the prepared electrolyte samples has been determined by the AC impedance technique in the range 303-353 K. The temperature-dependent ionic conductivity plot appears to obey the VTF relation. The maximum ionic conductivity value of 4.46 × 10⁻³ S/cm has been obtained for the PVdF-co-HFP (32%) - LiTFSI (8%) - EC+PC (60%) + ZrO2 (6 wt%) based polymer electrolyte. The surface morphology of the prepared electrolyte samples has been studied using SEM.
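
    The VTF (Vogel-Tammann-Fulcher) behaviour mentioned above has the standard form sigma(T) = A·T^(-1/2)·exp(-B/(T - T0)). A minimal sketch follows; the parameter values are illustrative only, not fitted to the paper's data.

```python
import math

def vtf_conductivity(T, A, B, T0):
    """VTF relation commonly used for gel polymer electrolytes:
    sigma(T) = A * T**-0.5 * exp(-B / (T - T0))  [S/cm],
    with B a pseudo-activation-energy term and T0 an ideal glass
    transition (reference) temperature."""
    return A * T ** -0.5 * math.exp(-B / (T - T0))

# Illustrative parameters: conductivity rises smoothly over 303-353 K
sigma_303 = vtf_conductivity(303.0, A=10.0, B=900.0, T0=200.0)
sigma_353 = vtf_conductivity(353.0, A=10.0, B=900.0, T0=200.0)
```

    Plotting ln(sigma·T^1/2) against 1/(T - T0) linearizes this relation, which is how VTF behaviour is usually verified from AC impedance data.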

  3. Neutron/Gamma-ray discrimination through measures of fit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, Moslem; Prenosil, Vaclav; Cvachovec, Frantisek

    2015-07-01

    Statistical tests and their underlying measures of fit can be utilized to separate neutron/gamma-ray pulses in a mixed radiation field. In this article, first the application of a sample statistical test is explained. Fit-measurement-based methods require true pulse shapes to be used as references for discrimination. This requirement makes practical implementation of these methods difficult; typically, another discrimination approach must be employed to capture samples of neutrons and gamma-rays before running the fit-based technique. In this article, we also propose a technique to eliminate this requirement. These approaches are applied to several sets of mixed neutron and gamma-ray pulses obtained through different digitizers using a stilbene scintillator in order to analyze them and measure their discrimination quality.
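
    The core idea of fit-measure discrimination can be sketched as follows, under the stated assumption that reference neutron and gamma pulse shapes are available: area-normalize the pulse and assign it to whichever template it fits better under a chi-square-style measure. The templates here are synthetic, not measured stilbene pulses.

```python
def chi2(pulse, template):
    """Pearson-style measure of fit between an area-normalized pulse and a template."""
    return sum((p - t) ** 2 / max(t, 1e-12) for p, t in zip(pulse, template))

def normalize(pulse):
    s = sum(pulse)
    return [p / s for p in pulse]

def classify(pulse, neutron_tpl, gamma_tpl):
    """Assign the pulse to whichever reference shape it fits better."""
    norm = normalize(pulse)
    return "neutron" if chi2(norm, neutron_tpl) < chi2(norm, gamma_tpl) else "gamma"

# Synthetic templates: neutron pulses decay more slowly (larger slow component)
neutron_tpl = normalize([1.0, 0.80, 0.60, 0.45, 0.35, 0.25])
gamma_tpl = normalize([1.0, 0.60, 0.30, 0.15, 0.08, 0.04])
label = classify([50, 40, 30, 22.5, 17.5, 12.5], neutron_tpl, gamma_tpl)
```

    Area normalization makes the comparison amplitude-independent, which is why a scaled copy of a template is still matched to the correct class.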

  4. A PLL-based resampling technique for vibration analysis in variable-speed wind turbines with PMSG: A bearing fault case

    NASA Astrophysics Data System (ADS)

    Pezzani, Carlos M.; Bossio, José M.; Castellino, Ariel M.; Bossio, Guillermo R.; De Angelo, Cristian H.

    2017-02-01

    Condition monitoring in permanent magnet synchronous machines has gained interest due to their increasing use in applications such as electric traction and power generation. Particularly in wind power generation, non-invasive condition monitoring techniques are of great importance. Usually, in such applications the access to the generator is complex and costly, while unexpected breakdowns result in high repair costs. This paper presents a technique which allows using vibration analysis for bearing fault detection in permanent magnet synchronous generators used in wind turbines. Given that in wind power applications the generator rotational speed may vary during normal operation, it is necessary to use special sampling techniques to apply spectral analysis of mechanical vibrations. In this work, a resampling technique based on order tracking without measuring the rotor position is proposed. To synchronize sampling with rotor position, an estimation of the rotor position obtained from the angle of the voltage vector is proposed. This angle is obtained from a phase-locked loop synchronized with the generator voltages. The proposed strategy is validated by laboratory experimental results obtained from a permanent magnet synchronous generator. Results with single-point defects in the outer race of a bearing under variable speed and load conditions are presented.
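
    The order-tracking resampling step can be sketched generically: given a rotor-angle estimate θ(t) (taken as given here; in the paper it comes from a PLL on the generator voltages), the vibration signal is re-interpolated onto uniform angle increments so that speed-varying fault frequencies become fixed "orders". This is a minimal sketch, not the authors' implementation.

```python
import numpy as np

def angular_resample(t, x, theta, samples_per_rev=64):
    """Resample x(t) at uniform rotor-angle increments (computed order tracking).
    t     : sample times [s]
    x     : vibration samples
    theta : monotonically increasing rotor angle [rad] at each time in t
    Returns (theta_u, x_u): uniform angle grid and resampled signal."""
    n_rev = (theta[-1] - theta[0]) / (2 * np.pi)
    theta_u = np.linspace(theta[0], theta[-1], int(n_rev * samples_per_rev))
    t_u = np.interp(theta_u, theta, t)   # time instant of each uniform angle
    return theta_u, np.interp(t_u, t, x)

# A 3rd-order vibration under a speed ramp: constant in angle, chirped in time
t = np.linspace(0.0, 1.0, 10000)
theta = 2 * np.pi * (10 * t + 5 * t ** 2)     # speed ramps from 10 to 20 rev/s
x = np.sin(3 * theta)
theta_u, x_u = angular_resample(t, x, theta)
```

    After resampling, x_u is an ordinary sinusoid of 3 cycles per revolution, so a standard FFT resolves bearing fault orders despite the speed variation that would smear a time-domain spectrum.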

  5. HMM-based lexicon-driven and lexicon-free word recognition for online handwritten Indic scripts.

    PubMed

    Bharath, A; Madhvanath, Sriganesh

    2012-04-01

    Research for recognizing online handwritten words in Indic scripts is at its early stages when compared to Latin and Oriental scripts. In this paper, we address this problem specifically for two major Indic scripts: Devanagari and Tamil. In contrast to previous approaches, the techniques we propose are largely data driven and script independent. We propose two different techniques for word recognition based on Hidden Markov Models (HMM): lexicon driven and lexicon free. The lexicon-driven technique models each word in the lexicon as a sequence of symbol HMMs according to a standard symbol writing order derived from the phonetic representation. The lexicon-free technique uses a novel Bag-of-Symbols representation of the handwritten word that is independent of symbol order and allows rapid pruning of the lexicon. On handwritten Devanagari word samples featuring both standard and nonstandard symbol writing orders, a combination of lexicon-driven and lexicon-free recognizers significantly outperforms either of them used in isolation. In contrast, most Tamil word samples feature the standard symbol order, and the lexicon-driven recognizer outperforms the lexicon-free one as well as their combination. The best recognition accuracies obtained for 20,000-word lexicons are 87.13 percent for Devanagari when the two recognizers are combined, and 91.8 percent for Tamil using the lexicon-driven technique.

  6. Ultrasonic characterization of granites obtained from industrial quarries of Extremadura (Spain).

    PubMed

    del Río, L M; López, F; Esteban, F J; Tejado, J J; Mota, M; González, I; San Emeterio, J L; Ramos, A

    2006-12-22

    The industry of ornamental rocks, such as granites, represents one of the most important industrial activities in the region of Extremadura, SW Spain. A detailed knowledge of the intrinsic properties of this natural stone and its environmental evolution is a required goal in order to fully characterize its quality. In this work, two independent NDT acoustic techniques have been used to measure the acoustic velocity of longitudinal waves in different prismatic granite samples from industrial quarries. A low-frequency transceiver set-up, based on a high-voltage BPV Steinkamp instrument and two 50 kHz probes, has been used to measure pulse travel times by ultrasonic through-transmission testing. In a complementary fashion, an Erudite MK3 test equipment with an electromagnetic vibrator and two piezoelectric sensors has also been employed to measure ultrasonic velocity by means of a resonance-based method, using the same granite varieties. In addition, a comprehensive set of physical/mechanical properties has also been analyzed, according to Spanish regulations in force, by means of alternative methods including destructive techniques such as strength, porosity, and absorption testing. A large number of samples, representing the most important varieties of granites from quarries of Extremadura, have been analyzed using the above-mentioned procedures. Some results obtained by destructive techniques have been correlated with those found using ultrasonic techniques. Our experimental setting allowed a complementary characterization of granite samples and a thorough validation of the different techniques employed, thus providing the industry of ornamental rocks with a non-destructive tool that will facilitate a more detailed insight into the properties of the rocks under study.
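
    Both NDT measurements reduce to simple relations: through-transmission gives v = L / t from the pulse travel time, while the resonance method gives v = 2·L·f0 from the fundamental longitudinal resonance of a free bar. The numbers below are hypothetical, merely in the range typical of granite, not values from the study.

```python
def through_transmission_velocity(path_length_m, travel_time_s):
    """Longitudinal pulse velocity from ultrasonic through-transmission: v = L / t."""
    return path_length_m / travel_time_s

def resonance_velocity(sample_length_m, fundamental_hz):
    """Longitudinal velocity from the fundamental resonance of a free bar: v = 2 * L * f0."""
    return 2.0 * sample_length_m * fundamental_hz

v_tt = through_transmission_velocity(0.20, 40e-6)   # 0.20 m prism, 40 us transit
v_res = resonance_velocity(0.20, 12500.0)           # 12.5 kHz fundamental
```

    Agreement between the two estimates on the same prism is one cross-check the two independent techniques allow.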

  7. Compressive Sampling based Image Coding for Resource-deficient Visual Communication.

    PubMed

    Liu, Xianming; Zhai, Deming; Zhou, Jiantao; Zhang, Xinfeng; Zhao, Debin; Gao, Wen

    2016-04-14

    In this paper, a new compressive sampling based image coding scheme is developed to achieve competitive coding efficiency at lower encoder computational complexity, while supporting error resilience. This technique is particularly suitable for visual communication with resource-deficient devices. At the encoder, a compact image representation is produced, which is a polyphase down-sampled version of the input image; however, the conventional low-pass filter prior to down-sampling is replaced by a local random binary convolution kernel. The pixels of the resulting down-sampled pre-filtered image are local random measurements placed in the original spatial configuration. The advantages of local random measurements are twofold: (1) they preserve high-frequency image features that would otherwise be discarded by low-pass filtering; (2) they remain a conventional image and can therefore be coded by any standardized codec to remove statistical redundancy at larger scales. Moreover, measurements generated by different kernels can be considered as multiple descriptions of the original image, and the proposed scheme therefore has the advantage of multiple description coding. At the decoder, a unified sparsity-based soft-decoding technique is developed to recover the original image from the received measurements in a compressive sensing framework. Experimental results demonstrate that the proposed scheme is competitive compared with existing methods, with a unique strength in recovering fine details and sharp edges at low bit-rates.
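
    The encoder side of such a scheme can be sketched as: convolve with a local random binary kernel (in place of the anti-aliasing low-pass filter), then polyphase down-sample; the measurements stay in pixel range and form an ordinary image. Kernel size and the sum-normalization are our assumptions for illustration, not the paper's exact design.

```python
import numpy as np

def cs_encode(img, factor=2, ksize=3, seed=0):
    """Local random binary convolution followed by polyphase down-sampling."""
    rng = np.random.default_rng(seed)
    kernel = rng.integers(0, 2, (ksize, ksize)).astype(float)
    if kernel.sum() == 0:                 # degenerate all-zero draw: use identity
        kernel[ksize // 2, ksize // 2] = 1.0
    kernel /= kernel.sum()                # keep measurements in the pixel range
    pad = ksize // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    H, W = img.shape
    out = np.empty((H, W))
    for i in range(H):                    # direct 2-D correlation (clarity over speed)
        for j in range(W):
            out[i, j] = (padded[i:i + ksize, j:j + ksize] * kernel).sum()
    return out[::factor, ::factor]        # polyphase down-sampled measurements

measurements = cs_encode(np.arange(64.0).reshape(8, 8))
```

    Because the output is still a regular image, it can be handed to any standard codec; different seeds give different kernels and hence independent descriptions of the same input.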

  8. Improving the accuracy of effect-directed analysis: the role of bioavailability.

    PubMed

    You, Jing; Li, Huizhen

    2017-12-13

    Aquatic ecosystems have been suffering from contamination by multiple stressors. Traditional chemical-based risk assessment usually fails to explain the toxicity contributions from contaminants that are not regularly monitored or that have an unknown identity. Diagnosing the causes of observed adverse outcomes in the environment is of great importance in ecological risk assessment, and in this regard effect-directed analysis (EDA) has been designed to fulfill this purpose. The EDA approach is now increasingly used in aquatic risk assessment owing to its ability to achieve effect-directed nontarget analysis; however, a lack of environmental relevance makes conventional EDA less favorable. In particular, ignoring bioavailability in EDA may cause a biased and even erroneous identification of causative toxicants in a mixture. Taking bioavailability into consideration is therefore of great importance to improve the accuracy of EDA diagnosis. The present article reviews the current status and applications of EDA practices that incorporate bioavailability. The use of biological samples is the most obvious way to include bioavailability in EDA applications, but its development is limited due to small sample sizes and a lack of evidence for metabolizable compounds. Bioavailability/bioaccessibility-based extraction (bioaccessibility-directed and partitioning-based extraction) and passive-dosing techniques are recommended for integrating bioavailability into EDA diagnosis of abiotic samples. Lastly, future perspectives on expanding and standardizing the use of biological samples and bioavailability-based techniques in EDA are discussed.

  9. Evaluating structural connectomics in relation to different Q-space sampling techniques.

    PubMed

    Rodrigues, Paulo; Prats-Galino, Alberto; Gallardo-Pujol, David; Villoslada, Pablo; Falcon, Carles; Prckovska, Vesna

    2013-01-01

    Brain networks are becoming forefront research in neuroscience. Network-based analysis of the functional and structural connectomes can lead to powerful imaging markers for brain diseases. However, constructing the structural connectome can be based upon different acquisition and reconstruction techniques whose information content and mutual differences have not yet been properly studied in a unified framework. The variations of the structural connectome, if not properly understood, can lead to dangerous conclusions when performing these types of studies. In this work we present an evaluation of the structural connectome by analysing and comparing graph-based measures on real data acquired by the three most important diffusion-weighted imaging techniques: DTI, HARDI and DSI. We thus come to several important conclusions, demonstrating that even though the different techniques show differences in the anatomy of the reconstructed fibers, the respective connectomes show variations of 20%.
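
    Graph-based comparison of connectomes typically starts from measures such as node degree and characteristic path length computed on each reconstruction's adjacency matrix. A stdlib-only sketch on a toy graph (not DWI data) illustrates the two measures:

```python
from collections import deque

def degree_and_path_length(adj):
    """adj: symmetric 0/1 adjacency matrix (list of lists) of one connectome.
    Returns (node degrees, characteristic path length over connected pairs)."""
    n = len(adj)
    degrees = [sum(row) for row in adj]
    total, pairs = 0, 0
    for s in range(n):                      # BFS shortest paths from each node
        dist = [-1] * n
        dist[s] = 0
        q = deque([s])
        while q:
            u = q.popleft()
            for v in range(n):
                if adj[u][v] and dist[v] < 0:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for d in dist:
            if d > 0:
                total += d
                pairs += 1
    return degrees, total / pairs

# Toy connectome: a 4-node ring
ring = [[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]]
degrees, cpl = degree_and_path_length(ring)
```

    Comparing such measures across DTI-, HARDI-, and DSI-derived adjacency matrices for the same subject is the kind of unified evaluation the abstract describes.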

  10. Environmental DNA sampling is more sensitive than a traditional survey technique for detecting an aquatic invader.

    PubMed

    Smart, Adam S; Tingley, Reid; Weeks, Andrew R; van Rooyen, Anthony R; McCarthy, Michael A

    2015-10-01

    Effective management of alien species requires detecting populations in the early stages of invasion. Environmental DNA (eDNA) sampling can detect aquatic species at relatively low densities, but few studies have directly compared detection probabilities of eDNA sampling with those of traditional sampling methods. We compare the ability of a traditional sampling technique (bottle trapping) and eDNA to detect a recently established invader, the smooth newt Lissotriton vulgaris vulgaris, at seven field sites in Melbourne, Australia. Over a four-month period, per-trap detection probabilities ranged from 0.01 to 0.26 among sites where L. v. vulgaris was detected, whereas per-sample eDNA estimates were much higher (0.29-1.0). Detection probabilities of both methods varied temporally (across days and months), but temporal variation appeared to be uncorrelated between methods. Only estimates of spatial variation were strongly correlated across the two sampling techniques. Environmental variables (water depth, rainfall, ambient temperature) were not clearly correlated with detection probabilities estimated via trapping, whereas eDNA detection probabilities were negatively correlated with water depth, possibly reflecting higher eDNA concentrations at lower water levels. Our findings demonstrate that eDNA sampling can be an order of magnitude more sensitive than traditional methods, and illustrate that traditional- and eDNA-based surveys can provide independent information on species distributions when occupancy surveys are conducted over short timescales.
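
    The practical implication of the reported per-sample probabilities can be made concrete under the standard independence assumption P(detect in n samples) = 1 − (1 − p)^n; the 95% target below is our choice, and independence across samples is an idealization.

```python
import math

def samples_needed(p_single, target=0.95):
    """Smallest n with 1 - (1 - p_single)**n >= target."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_single))

n_trap = samples_needed(0.01)   # low end of the per-trap estimates
n_edna = samples_needed(0.29)   # low end of the per-sample eDNA estimates
```

    With these inputs, trapping would need 299 trap-checks versus 9 water samples to reach the same 95% confidence of detection, consistent with the abstract's order-of-magnitude claim.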

  11. A Compact, Solid-State UV (266 nm) Laser System Capable of Burst-Mode Operation for Laser Ablation Desorption Processing

    NASA Technical Reports Server (NTRS)

    Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William

    2015-01-01

    Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g⁻¹); and the destruction of minimal quantities of sample (μg) compared to traditional solution and/or pyrolysis analyses (mg).

  12. Method And Apparatus For Evaluation Of High Temperature Superconductors

    DOEpatents

    Fishman, Ilya M.; Kino, Gordon S.

    1996-11-12

    A technique for evaluation of high-T.sub.c superconducting films and single crystals is based on measurement of temperature dependence of differential optical reflectivity of high-T.sub.c materials. In the claimed method, specific parameters of the superconducting transition such as the critical temperature, anisotropy of the differential optical reflectivity response, and the part of the optical losses related to sample quality are measured. The apparatus for performing this technique includes pump and probe sources, cooling means for sweeping sample temperature across the critical temperature and polarization controller for controlling a state of polarization of a probe light beam.

  13. Fast discrimination of traditional Chinese medicine according to geographical origins with FTIR spectroscopy and advanced pattern recognition techniques

    NASA Astrophysics Data System (ADS)

    Li, Ning; Wang, Yan; Xu, Kexin

    2006-08-01

    Combined with Fourier transform infrared (FTIR) spectroscopy and three kinds of pattern recognition techniques, 53 traditional Chinese medicine danshen samples were rapidly discriminated according to geographical origin. The results showed that discrimination using FTIR spectroscopy was feasible, as ascertained by principal component analysis (PCA). An effective model was built by employing Soft Independent Modeling of Class Analogy (SIMCA) and PCA, and 82% of the samples were discriminated correctly. Through use of the artificial neural network (ANN)-based back propagation (BP) network, the origins of danshen were completely classified.
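
    The chemometric pipeline (PCA scores feeding a classifier) can be sketched minimally; the "spectra" below are toy vectors, and a nearest-centroid rule stands in for SIMCA/ANN, which the study actually used.

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project mean-centered spectra onto the first principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def nearest_centroid(scores, labels, queries):
    """Classify score vectors by the closest class centroid in PCA space."""
    cents = {c: scores[labels == c].mean(axis=0) for c in set(labels)}
    return [min(cents, key=lambda c: np.linalg.norm(q - cents[c]))
            for q in queries]

# Toy "spectra" from two geographical origins
X = np.array([[1.0, 0.0, 0.1], [1.1, 0.1, 0.0], [0.9, 0.0, 0.0],
              [0.0, 1.0, 0.1], [0.1, 1.1, 0.0], [0.0, 0.9, 0.0]])
labels = np.array(["A", "A", "A", "B", "B", "B"])
scores = pca_scores(X)
preds = nearest_centroid(scores, labels, scores)
```

    PCA compresses hundreds of correlated FTIR wavenumbers into a few scores, after which even simple distance-based rules can separate well-clustered origins.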

  14. Comparison of specimen adequacy and smear quality in oral smears prepared by manual liquid-based cytology and conventional methods

    PubMed Central

    Shukla, Surabhi; Einstein, A; Shukla, Abhilasha; Mishra, Deepika

    2015-01-01

    Background: Liquid-based cytology (LBC), recommended in the mass screening of potentially malignant cervical and oral lesions, suffers from high cost owing to the use of expensive automated devices and materials. Considering the need for cost-effective LBC techniques, we evaluated the efficacy of an inexpensive manual LBC (MLBC) technique against conventional cytological technique in terms of specimen adequacy and smear quality of oral smears. Materials and Methods: Cytological samples were collected from 21 patients using a cytobrush device. After preparation of a conventional smear, the brush containing the remaining sample was immersed in the preservative vial. The preserved material was processed by an MLBC technique and subsequently, direct smears were made from the prepared cell button. Both conventional and MLBC smears were stained by routine Papanicolaou technique and evaluated by an independent observer for the thickness of the smear, cellular distribution, resolution/clarity of cells, cellular staining characteristics and the presence of unsatisfactory background/artifacts. Each parameter was graded as satisfactory; or satisfactory, but limited; or unsatisfactory. Chi-square test was used to compare the values obtained (significance set at P ≤ 0.05). Results: MLBC technique produced a significant number of satisfactory smears with regard to cell distribution, clarity/resolution, staining characteristics and background/artifacts compared to conventional methods. Conclusions: MLBC is a cost-effective cytological technique that may produce oral smears with excellent cytomorphology and longer storage life. PMID:26980958

  15. Comparative performance of three sampling techniques to detect airborne Salmonella species in poultry farms.

    PubMed

    Adell, Elisa; Moset, Verónica; Zhao, Yang; Jiménez-Belenguer, Ana; Cerisuelo, Alba; Cambra-López, María

    2014-01-01

    Sampling techniques to detect airborne Salmonella species (spp.) in two pilot-scale broiler houses were compared. Broilers were inoculated at seven days of age with a marked strain of Salmonella enteritidis. The rearing cycle lasted 42 days during the summer. Airborne Salmonella spp. were sampled weekly using impaction, gravitational settling, and impingement techniques. Additionally, Salmonella spp. were sampled on feeders, drinkers, walls, and in the litter. Environmental conditions (temperature, relative humidity, and airborne particulate matter (PM) concentration) were monitored during the rearing cycle. The presence of Salmonella spp. was determined by culture-dependent and molecular methods. No cultivable Salmonella spp. were recovered from the poultry houses' surfaces, the litter, or the air before inoculation. After inoculation, cultivable Salmonella spp. were recovered from the surfaces and in the litter. Airborne cultivable Salmonella spp. were detected using impaction and gravitational settling one or two weeks after the detection of Salmonella spp. in the litter. No cultivable Salmonella spp. were recovered using impingement based on culture-dependent techniques. At low airborne concentrations, the use of impingement for the quantification or detection of cultivable airborne Salmonella spp. is not recommended. In these cases, a combination of culture-dependent and culture-independent methods is recommended. These data are valuable for improving current measures to control the transmission of pathogens in livestock environments and for optimising the sampling and detection of airborne Salmonella spp. under practical conditions.

  16. A review of microdialysis coupled to microchip electrophoresis for monitoring biological events

    PubMed Central

    Saylor, Rachel A.; Lunte, Susan M.

    2015-01-01

    Microdialysis is a powerful sampling technique that enables monitoring of dynamic processes in vitro and in vivo. The combination of microdialysis with chromatographic or electrophoretic methods, along with selective detection methods, yields a "separation-based sensor" capable of monitoring multiple analytes in near real time. Analysis of microdialysis samples requires techniques that are fast (<1 min), have low volume requirements (nL–pL), and, ideally, can be employed on-line. Microchip electrophoresis fulfills these requirements and also permits the possibility of integrating sample preparation and manipulation with detection strategies directly on-chip. Microdialysis coupled to microchip electrophoresis has been employed for monitoring biological events in vivo and in vitro. This review discusses technical considerations for coupling microdialysis sampling and microchip electrophoresis, including various interface designs, and current applications in the field. PMID:25637011

  17. Screening for trace explosives by AccuTOF™-DART®: an in-depth validation study.

    PubMed

    Sisco, Edward; Dake, Jeffrey; Bridge, Candice

    2013-10-10

    Ambient ionization mass spectrometry is finding increasing utility as a rapid analysis technique in a number of fields. In forensic science specifically, analysis of many types of samples, including drugs, explosives, inks, bank dye, and lotions, has been shown to be possible using these techniques [1]. This paper focuses on one type of ambient ionization mass spectrometry, Direct Analysis in Real Time Mass Spectrometry (DART-MS or DART), and its viability as a screening tool for trace explosives analysis. In order to assess viability, a validation study was completed which focused on the analysis of trace amounts of nitro and peroxide based explosives. Topics which were studied, and are discussed, include method optimization, reproducibility, sensitivity, development of a search library, discrimination of mixtures, and blind sampling. Advantages and disadvantages of this technique over other similar screening techniques are also discussed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Techniques for cytologic sampling of pancreatic and bile duct lesions: The Papanicolaou Society of Cytopathology Guidelines.

    PubMed

    Brugge, William R; De Witt, John; Klapman, Jason B; Ashfaq, Raheela; Shidham, Vinod; Chhieng, David; Kwon, Richard; Baloch, Zubair; Zarka, Matthew; Staerkel, Gregg

    2014-01-01

    The Papanicolaou Society of Cytopathology has developed a set of guidelines for pancreatobiliary cytology, including indications for endoscopic ultrasound-guided fine-needle aspiration biopsy, techniques of endoscopic retrograde cholangiopancreatography, terminology and nomenclature of pancreatobiliary disease, ancillary testing, and postbiopsy management. All documents are based on the expertise of the authors, a review of the literature, discussions of the draft document at several national and international meetings over an 18-month period, and synthesis of online comments on the draft document on the Papanicolaou Society of Cytopathology website [www.papsociety.org]. This document presents the results of these discussions regarding the use of sampling techniques in the cytological diagnosis of biliary and pancreatic lesions. It summarizes the current state of the art for techniques of acquiring cytology specimens from the biliary tree as well as from solid and cystic lesions of the pancreas.

  19. Application of Raman spectroscopy to identification and sorting of post-consumer plastics for recycling

    DOEpatents

    Sommer, Edward J.; Rich, John T.

    2001-01-01

    A high accuracy rapid system for sorting a plurality of waste products by polymer type. The invention involves the application of Raman spectroscopy and complex identification techniques to identify and sort post-consumer plastics for recycling. The invention reads information unique to the molecular structure of the materials to be sorted to identify their chemical compositions and uses rapid high volume sorting techniques to sort them into product streams at commercially viable throughput rates. The system employs a laser diode (20) for irradiating the material sample (10), a spectrograph (50) is used to determine the Raman spectrum of the material sample (10) and a microprocessor based controller (70) is employed to identify the polymer type of the material sample (10).

  20. Rapid, all-optical crystal orientation imaging of two-dimensional transition metal dichalcogenide monolayers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    David, Sabrina N.; Zhai, Yao; van der Zande, Arend M.

    Two-dimensional (2D) atomic materials such as graphene and transition metal dichalcogenides (TMDCs) have attracted significant research and industrial interest for their electronic, optical, mechanical, and thermal properties. While large-area crystal growth techniques such as chemical vapor deposition have been demonstrated, the presence of grain boundaries and the orientation of grains arising in such growths substantially affect the physical properties of the materials. There is currently no scalable characterization method for determining these boundaries and orientations over a large sample area. We here present a second-harmonic generation based microscopy technique for rapidly mapping grain orientations and boundaries of 2D TMDCs. We experimentally demonstrate the capability to map large samples to an angular resolution of ±1° with minimal sample preparation and without involved analysis. A direct comparison of the all-optical grain orientation maps against results obtained by diffraction-filtered dark-field transmission electron microscopy plus selected-area electron diffraction on identical TMDC samples is provided. This rapid and accurate tool should enable large-area characterization of TMDC samples for expedited studies of grain boundary effects and the efficient characterization of industrial-scale production techniques.

  1. Characterization of anisotropic thermal conductivity of suspended nm-thick black phosphorus with frequency-resolved Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Wang, Tianyu; Han, Meng; Wang, Ridong; Yuan, Pengyu; Xu, Shen; Wang, Xinwei

    2018-04-01

    Frequency-resolved Raman spectroscopy (FR-Raman) is a new technique for nondestructive thermal characterization. Here, we apply this new technique to measure the anisotropic thermal conductivity of suspended nm-thick black phosphorus samples without the need for optical absorption and temperature coefficient data. Four samples with thicknesses between 99.8 and 157.6 nm are studied. Based on steady-state laser heating and Raman measurement of samples with a specifically designed thermal transport path, the thermal conductivity ratio (κZZ/κAC) is determined to be 1.86-3.06. Based on the FR-Raman measurements, the armchair thermal conductivity is measured as 14-22 W m⁻¹ K⁻¹, while the zigzag thermal conductivity is 40-63 W m⁻¹ K⁻¹. FR-Raman has great potential for studying the thermal properties of various nanomaterials. This study significantly advances our understanding of thermal transport in black phosphorus and facilitates the application of black phosphorus in novel devices.

  2. TL and ESR based identification of gamma-irradiated frozen fish using different hydrolysis techniques

    NASA Astrophysics Data System (ADS)

    Ahn, Jae-Jun; Akram, Kashif; Shahbaz, Hafiz Muhammad; Kwon, Joong-Ho

    2014-12-01

    Frozen fish fillets (walleye pollock and Japanese Spanish mackerel) were selected as samples for irradiation (0-10 kGy) detection trials using different hydrolysis methods. Photostimulated luminescence (PSL)-based screening analysis of the gamma-irradiated frozen fillets showed low sensitivity due to the limited silicate mineral content on the samples. The same limitations were found in the thermoluminescence (TL) analysis of mineral samples isolated by the density separation method. However, acid (HCl) and alkali (KOH) hydrolysis methods were effective in obtaining enough minerals to carry out TL analysis, which was reconfirmed through the normalization step by calculating the TL ratios (TL1/TL2). For improved electron spin resonance (ESR) analysis, alkali and enzyme (alcalase) hydrolysis methods were compared for separating minute bone fractions. The enzymatic method provided clearer radiation-specific hydroxyapatite radicals than the alkaline method. Different hydrolysis methods could extend the application of TL and ESR techniques in identifying the irradiation history of frozen fish fillets.

  3. Probing metamaterials with structured light

    DOE PAGES

    Xu, Yun; Sun, Jingbo; Walasik, Wiktor; ...

    2016-11-03

    Photonic metamaterials and metasurfaces are nanostructured optical materials engineered to enable properties that have not been found in nature. Optical characterization of these structures is a challenging task. We report a reliable technique that is particularly useful for characterization of phase properties introduced by small and spatially inhomogeneous samples of metamaterials and metasurfaces. The proposed structured-light, or vortex-based, interferometric method is used to directly visualize phase changes introduced by subwavelength-thick nanostructures. In order to demonstrate the efficiency of the proposed technique, we designed and fabricated several metasurface samples consisting of metal nano-antennas introducing different phase shifts and experimentally measured the phase shifts of the transmitted light. The experimental results are in good agreement with numerical simulations and with the designed properties of the antenna arrays. Finally, due to the presence of the singularity in the vortex beam, one of the potential applications of the proposed approach based on structured light is step-by-step probing of small fractions of micro-scale samples or images.

  4. Detection of Hepatic Fibrosis in Ex Vivo Liver Samples Using an Open-Photoacoustic-Cell Method: Feasibility Study

    NASA Astrophysics Data System (ADS)

    Stolik, S.; Fabila, D. A.; de la Rosa, J. M.; Escobedo, G.; Suárez-Álvarez, K.; Tomás, S. A.

    2015-09-01

    Design of non-invasive and accurate novel methods for liver fibrosis diagnosis has gained growing interest. Different stages of liver fibrosis were induced in Wistar rats by intraperitoneally administering different doses of carbon tetrachloride. The liver fibrosis degree was conventionally determined by means of histological examination. An open-photoacoustic-cell (OPC) technique for the assessment of liver fibrosis was developed and is reported here. The OPC technique is based on the fact that the thermal diffusivity can be accurately measured by photoacoustics, taking into consideration the photoacoustic signal amplitude versus the modulation frequency. This technique directly measures the heat generated in a sample, due to non-radiative de-excitation processes, following the absorption of light. The thermal diffusivity was measured with a home-made open-photoacoustic-cell system that was specially designed to perform the measurement on ex vivo liver samples. The liver tissue showed a significant increase in thermal diffusivity depending on the fibrosis stage. Specifically, liver samples from rats exhibiting hepatic fibrosis showed a significantly higher value of the thermal diffusivity than those from control animals.
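
    In the thermally thick regime commonly assumed for OPC measurements, the signal amplitude decays as exp(−l·sqrt(πf/α)), so a linear fit of ln A versus sqrt(f) yields the thermal diffusivity α from the slope. A sketch with synthetic data follows; the model regime and all numbers are our assumptions, not values from the paper.

```python
import math

def diffusivity_from_opc(freqs_hz, amps, thickness_m):
    """Thermally thick OPC model: ln(A) = c + b*sqrt(f), with
    b = -l*sqrt(pi/alpha), hence alpha = pi * l**2 / b**2."""
    xs = [math.sqrt(f) for f in freqs_hz]
    ys = [math.log(a) for a in amps]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))       # least-squares slope
    return math.pi * thickness_m ** 2 / b ** 2

# Synthetic OPC amplitudes for alpha = 1.5e-7 m^2/s and a 500-um-thick sample
alpha_true, l = 1.5e-7, 500e-6
freqs = [float(f) for f in range(10, 101, 10)]
amps = [math.exp(5.0 - l * math.sqrt(math.pi * f / alpha_true)) for f in freqs]
alpha_est = diffusivity_from_opc(freqs, amps, l)
```

    A diffusivity that rises with fibrosis stage, as reported, would show up here as a shallower amplitude decay with modulation frequency.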

  5. Genetic programming based ensemble system for microarray data classification.

    PubMed

    Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To

    2015-01-01

    Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a new genetic programming (GP) based ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and these become more and more accurate over the evolutionary process. A feature selection technique and a balanced subsampling technique are applied to increase the diversity within each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary-class and six multiclass microarray datasets, and the results show that the algorithm achieves better results in most cases compared with some other ensemble systems. By using more elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved.
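The Min, Max, and Average combination operators named above can be sketched as simple fusions of per-class scores from the base classifiers. This is an illustrative reconstruction, not the authors' code, and the score values are made up:

```python
# Fuse per-class scores from several base classifiers with one of the
# three combination operators named in the abstract: Min, Max, Average.
def combine(scores, op):
    """scores: one score vector per classifier, one entry per class."""
    cols = list(zip(*scores))  # regroup scores by class
    if op == "min":
        fused = [min(c) for c in cols]
    elif op == "max":
        fused = [max(c) for c in cols]
    else:  # "average"
        fused = [sum(c) / len(c) for c in cols]
    return fused.index(max(fused))  # predicted class = highest fused score

# Three toy base classifiers scoring two classes (e.g. tumour vs normal).
scores = [[0.6, 0.95], [0.7, 0.1], [0.8, 0.2]]
print(combine(scores, "average"), combine(scores, "max"))  # prints "0 1"
```

The example shows why the choice of operator matters: Max follows the single most confident classifier, while Average follows the consensus, and the two can disagree on the same score matrix.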

  6. Genetic Programming Based Ensemble System for Microarray Data Classification

    PubMed Central

    Liu, Kun-Hong; Tong, Muchenxuan; Xie, Shu-Tong; Yee Ng, Vincent To

    2015-01-01

    Recently, more and more machine learning techniques have been applied to microarray data analysis. The aim of this study is to propose a new genetic programming (GP) based ensemble system (named GPES), which can be used to effectively classify different types of cancers. Decision trees are deployed as base classifiers in this ensemble framework with three operators: Min, Max, and Average. Each individual of the GP is an ensemble system, and these become more and more accurate over the evolutionary process. A feature selection technique and a balanced subsampling technique are applied to increase the diversity within each ensemble system. The final ensemble committee is selected by a forward search algorithm, which is shown to be capable of fitting data automatically. The performance of GPES is evaluated using five binary-class and six multiclass microarray datasets, and the results show that the algorithm achieves better results in most cases compared with some other ensemble systems. By using more elaborate base classifiers or applying other sampling techniques, the performance of GPES may be further improved. PMID:25810748

  7. Comparison of gel column, card, and cartridge techniques for dog erythrocyte antigen 1.1 blood typing

    PubMed Central

    Seth, Mayank; Jackson, Karen V.; Winzelberg, Sarah; Giger, Urs

    2012-01-01

    Objective To compare accuracy and ease of use of a card agglutination assay, an immunochromatographic cartridge method, and a gel-based method for canine blood typing. Sample Blood samples from 52 healthy blood donor dogs, 10 dogs with immune-mediated hemolytic anemia (IMHA), and 29 dogs with other diseases. Procedures Blood samples were tested in accordance with manufacturer guidelines. Samples with low PCVs were created by the addition of autologous plasma to separately assess the effects of anemia on test results. Results Compared with a composite reference standard of agreement between 2 methods, the gel-based method was found to be 100% accurate. The card agglutination assay was 89% to 91% accurate, depending on test interpretation, and the immunochromatographic cartridge method was 93% accurate but 100% specific. Errors were observed more frequently in samples from diseased dogs, particularly those with IMHA. In the presence of persistent autoagglutination, dog erythrocyte antigen (DEA) 1.1 typing was not possible, except with the immunochromatographic cartridge method. Conclusions and Clinical Relevance The card agglutination assay and immunochromatographic cartridge method, performed by trained personnel, were suitable for in-clinic emergency DEA 1.1 blood typing. There may be errors, particularly for samples from dogs with IMHA, and the immunochromatographic cartridge method may have an advantage of allowing typing of samples with persistent autoagglutination. The laboratory gel-based method would be preferred for routine DEA 1.1 typing of donors and patients if it is available and time permits. Current DEA 1.1 typing techniques appear to be appropriately standardized and easy to use. PMID:22280380

  8. Sampling and storage of blood for pH and blood gas analysis.

    PubMed

    Haskins, S C

    1977-02-15

    Techniques used in the sampling and storage of a blood sample for pH and blood gas measurements can have an important effect on the measured values. Adherence to these techniques and principles will minimize in vitro alteration of the pH and blood gas values. To consider that a significant change has occurred in a pH or blood gas measurement from previous values, the change must exceed 0.015 for pH, 3 mm Hg for PCO2, 5 mm Hg for PO2, and 2 mEq/L for [HCO-3] or base excess/deficit. In vitro dilution of the blood sample with anticoagulant should be avoided because it will alter the measured PCO2 and base excess/deficit values. Arterial samples should be collected for meaningful pH and blood gas values. Central venous and free-flowing capillary blood can be used for screening procedures in normal patients but are subject to considerable error. A blood sample can be stored for up to 30 minutes at room temperature without significant change in acid-base values, but only up to 12 minutes before significant changes occur in PO2. A blood sample can be stored for up to 3.5 hours in an ice-water bath without significant change in pH, and for 6 hours without significant change in PCO2 or PO2. Variations of body temperature from normal will cause a measurable change in pH and blood gas values when the blood is exposed to the normal water bath temperatures of the analyzer.

  9. Direct Detection of Pharmaceuticals and Personal Care Products from Aqueous Samples with Thermally-Assisted Desorption Electrospray Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Campbell, Ian S.; Ton, Alain T.; Mulligan, Christopher C.

    2011-07-01

    An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.

  10. Direct detection of pharmaceuticals and personal care products from aqueous samples with thermally-assisted desorption electrospray ionization mass spectrometry.

    PubMed

    Campbell, Ian S; Ton, Alain T; Mulligan, Christopher C

    2011-07-01

    An ambient mass spectrometric method based on desorption electrospray ionization (DESI) has been developed to allow rapid, direct analysis of contaminated water samples, and the technique was evaluated through analysis of a wide array of pharmaceutical and personal care product (PPCP) contaminants. Incorporating direct infusion of aqueous sample and thermal assistance into the source design has allowed low ppt detection limits for the target analytes in drinking water matrices. With this methodology, mass spectral information can be collected in less than 1 min, consuming ~100 μL of total sample. Quantitative ability was also demonstrated without the use of an internal standard, yielding decent linearity and reproducibility. Initial results suggest that this source configuration is resistant to carryover effects and robust towards multi-component samples. The rapid, continuous analysis afforded by this method offers advantages in terms of sample analysis time and throughput over traditional hyphenated mass spectrometric techniques.

  11. Studying Ultradisperse Diamond Structure within Explosively Synthesized Samples via X-Ray Techniques

    NASA Astrophysics Data System (ADS)

    Sharkov, M. D.; Boiko, M. E.; Ivashevskaya, S. N.; Belyakova, N. S.

    2013-08-01

    XRD (X-Ray Diffraction) and SAXS (Small-Angle X-Ray Scattering) data have been measured for a pair of samples produced with the help of explosives. XRD peaks show that both samples contain crystalline diamond components as well as graphite ones. Based on SAXS analysis, the possible presence of grains with radii up to 30-50 nm in all the samples has been shown. Structural components with fractal dimension between 1 and 2 have been detected in the sample, in agreement with the assumption that diamond grain coatings resemble onion shells. To broaden the rocking curve analysis, the standard SAXS treatment technique has been complemented by a Fourier filtering procedure. For sample #1, rocking curve components corresponding to individual interplanar distances with magnitudes from 5 nm up to 15 nm have been separated. A hypothesis relating these values to the distances between concentric onion-like shells of diamond grains has been formulated.

  12. Raman spectroscopy-based detection of chemical contaminants in food powders

    USDA-ARS?s Scientific Manuscript database

    Raman spectroscopy technique has proven to be a reliable method for qualitative detection of chemical contaminants in food ingredients and products. For quantitative imaging-based detection, each contaminant particle in a food sample must be detected and it is important to determine the necessary sp...

  13. Fission gas bubble identification using MATLAB's image processing toolbox

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collette, R.; King, J.; Keiser, Jr., D.

    Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person to person or sample to sample. This study presents several MATLAB-based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods.
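The Sauvola rule mentioned above computes a per-pixel threshold from the local mean m and local standard deviation s as T = m * (1 + k * (s / R - 1)). A pure-Python sketch of the idea (illustrative only; the study used MATLAB's Image Processing Toolbox, and the window size and the constants k and R here are common defaults, not the study's settings):

```python
# Minimal Sauvola adaptive thresholding on a toy grayscale "micrograph".
# Threshold per pixel: T = m * (1 + k * (s / R - 1)), with local mean m and
# local standard deviation s over a square window.
def sauvola_threshold(image, window=3, k=0.2, R=128.0):
    """Return a binary mask: True where pixel < local Sauvola threshold."""
    h, w = len(image), len(image[0])
    half = window // 2
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Gather the local neighbourhood, clipped at the image borders.
            vals = [image[j][i]
                    for j in range(max(0, y - half), min(h, y + half + 1))
                    for i in range(max(0, x - half), min(w, x + half + 1))]
            m = sum(vals) / len(vals)
            s = (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5
            T = m * (1 + k * (s / R - 1))
            mask[y][x] = image[y][x] < T  # dark pixels = candidate voids
    return mask

# Toy 5x5 image: bright fuel matrix (200) with one dark void pixel (20).
img = [[200] * 5 for _ in range(5)]
img[2][2] = 20
voids = sauvola_threshold(img)
porosity = sum(v for row in voids for v in row) / 25.0
print(porosity)  # 0.04 -- one void pixel out of 25
```

Because the threshold adapts to local statistics, the uniform bright background stays unsegmented while the single dark void is picked out, which is the property that makes the method robust across contrast variations in micrographs.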

  14. Fission gas bubble identification using MATLAB's image processing toolbox

    DOE PAGES

    Collette, R.; King, J.; Keiser, Jr., D.; ...

    2016-06-08

    Automated image processing routines have the potential to aid in the fuel performance evaluation process by eliminating bias in human judgment that may vary from person to person or sample to sample. This study presents several MATLAB-based image analysis routines designed for fission gas void identification in post-irradiation examination of uranium molybdenum (U–Mo) monolithic-type plate fuels. Frequency domain filtration, enlisted as a pre-processing technique, can eliminate artifacts from the image without compromising the critical features of interest. This process is coupled with a bilateral filter, an edge-preserving noise removal technique aimed at preparing the image for optimal segmentation. Adaptive thresholding proved to be the most consistent gray-level feature segmentation technique for U–Mo fuel microstructures. The Sauvola adaptive threshold technique segments the image based on histogram weighting factors in stable contrast regions and local statistics in variable contrast regions. Once all processing is complete, the algorithm outputs the total fission gas void count, the mean void size, and the average porosity. The final results demonstrate an ability to extract fission gas void morphological data faster, more consistently, and at least as accurately as manual segmentation methods.

  15. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues related to the reproducibility and quantitative capability of measurements using this technique persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, by evaluating the EBIC equipment performance and numerically optimizing the equipment settings, constant acquisition of high-contrast images has become possible, improving reproducibility as well as yield regardless of operator skill or test environment. The proposed technique is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  16. Online-LASIL: Laser Ablation of Solid Samples in Liquid with online-coupled ICP-OES detection for direct determination of the stoichiometry of complex metal oxide thin layers.

    PubMed

    Bonta, Maximilian; Frank, Johannes; Taibl, Stefanie; Fleig, Jürgen; Limbeck, Andreas

    2018-02-13

    Advanced materials such as complex metal oxides are used in a wide range of applications and have further promising perspectives in the form of thin films. The exact chemical composition essentially influences the electronic properties of these materials, which makes correct assessment of their composition necessary. However, due to high chemical resistance and, in the case of thin films, low absolute analyte amounts, this procedure is in most cases not straightforward and is extremely time-consuming. Commonly applied techniques either lack ease of use (i.e., solution-based analysis with preceding sample dissolution) or adequately accurate quantification (i.e., solid sampling techniques). An analysis approach that combines the beneficial aspects of solution-based analysis as well as direct solid sampling is Laser Ablation of a Sample in Liquid (LASIL). In this work, it is shown that the analysis of major as well as minor sample constituents is possible using a novel online-LASIL setup, allowing sample analysis without manual sample handling after placing it in an ablation chamber. Strontium titanate (STO) thin layers with different compositions were analyzed in the course of this study. Precision of the newly developed online-LASIL method is comparable to that of conventional wet chemical approaches. With only about 15-20 min required for the analysis per sample, the time demand is significantly reduced compared to the often necessary fusion procedures lasting multiple hours. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Immunoexpression of vascular endothelial growth factor in gingival mucosa with papilloma and condyloma acuminata.

    PubMed

    Scrieciu, Monica; MercuŢ, Veronica; Andrei, Oana Cella; Predescu, Anca Mihaela; Niculescu, Mihaela; Pisoschi, Cătălina Gabriela; BaniŢă, Ileana Monica

    2015-01-01

    The histological change of the oral mucosa in contact with metal alloy dentures is one of the current issues widely debated in the literature. The aim was to highlight the expression of vascular endothelial growth factor (VEGF) in human paraprosthetic gingival mucosa exposed to nickel and copper compounds using an immunohistochemical technique. The selected participants were wearers of fixed dentures made of nickel-based and copper-based alloys. The gingival mucosa fragments were collected through excision after removing the fixed denture and extracting one of its affected teeth. The fragments were processed through the histological technique of paraffin inclusion. The paraffin-embedded tissue sections were stained with Hematoxylin-Eosin and processed by the immunohistochemical technique with a VEGF antibody. The gingival mucosa fragments from wearers of nickel-based alloy dentures were diagnosed with papilloma, while the samples collected from wearers of copper-based alloy dentures were diagnosed with condyloma acuminata. The immunohistochemical reaction for VEGF differed between the fragments with papilloma and the condyloma acuminata samples. In the papillomatous gingival mucosa fragments, VEGF was implicated principally in vasodilatation and the inflammatory process, and secondarily in angiogenesis. In the fragments with condyloma acuminata, the principal role of VEGF was in angiogenesis and only secondarily in inflammation.

  18. Comparison of mass spectrometry-based electronic nose and solid phase microextraction gas chromatography-mass spectrometry technique to assess infant formula oxidation.

    PubMed

    Fenaille, François; Visani, Piero; Fumeaux, René; Milo, Christian; Guy, Philippe A

    2003-04-23

    Two headspace techniques based on mass spectrometry (MS) detection, an electronic nose and solid phase microextraction coupled to gas chromatography-mass spectrometry (SPME-GC/MS), were evaluated for their ability to differentiate various infant formula powders based on changes in their volatiles upon storage. The electronic nose gave unresolved MS fingerprints of the sample gas phases, which were further submitted to principal component analysis (PCA). Such direct MS recording combined with multivariate treatment enabled rapid differentiation of the infant formulas over a 4-week storage test. Although the advantages of the MS-based electronic nose are its ease of use and meaningful data interpretation at high throughput (100 samples per 24 h), its greatest disadvantage is that the compounds present could not be identified or quantified. For these reasons, a SPME-GC/MS measurement was also investigated. This technique allowed the identification of saturated aldehydes as the main volatiles present in the headspace of infant milk powders. An isotope dilution assay was further developed to quantitate hexanal as a potential indicator of infant milk powder oxidation. Hexanal content was found to vary between roughly 500 and 3500 microg/kg for relatively non-oxidized and oxidized infant formulas, respectively.
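The isotope dilution assay mentioned above reduces to a ratio calculation: the native analyte amount equals the peak-area ratio of native to isotopically labelled hexanal times the known labelled spike. A hedged sketch with invented numbers, assuming a response factor of 1 for the labelled standard:

```python
# Isotope-dilution quantitation sketch (illustrative, not the study's method).
def hexanal_ug_per_kg(area_native, area_labelled, spike_ng, sample_g):
    # Area ratio * known labelled spike = native analyte mass (ng).
    # 1 ng/g equals 1 ug/kg, so no further unit conversion is needed.
    return (area_native / area_labelled) * spike_ng / sample_g

# Invented example: equal peak areas, 1000 ng spike in a 2 g powder sample.
print(hexanal_ug_per_kg(2.0e6, 2.0e6, 1000.0, 2.0))  # 500.0 ug/kg
```

Because the labelled standard co-extracts and co-ionizes with the native compound, losses during sample preparation cancel in the ratio, which is the point of the technique.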

  19. SOIL AND SEDIMENT SAMPLING METHODS | Science ...

    EPA Pesticide Factsheets

    The EPA Office of Solid Waste and Emergency Response's (OSWER) Office of Superfund Remediation and Technology Innovation (OSRTI) needs innovative methods and techniques to solve new and difficult sampling and analytical problems found at the numerous Superfund sites throughout the United States. Inadequate site characterization and a lack of knowledge of surface and subsurface contaminant distributions hinder EPA's ability to make the best decisions on remediation options and to conduct the most effective cleanup efforts. To assist OSWER and to improve its risk-based decision-making capabilities, NERL conducts research to improve its capability to more accurately, precisely, and efficiently characterize Superfund, RCRA, LUST, oil spill, and brownfield sites; among the many research programs and tasks performed at ESD-LV, research is being conducted on improving soil and sediment sampling techniques and on improving the sampling and handling of volatile organic compound (VOC) contaminated soils. Under this task, improved sampling approaches and devices will be developed for characterizing the concentration of VOCs in soils. Approaches and devices used today can lose up to 99% of the VOCs present in the sample due to inherent weaknesses in the device and improper or inadequate collection techniques. This error generally causes decision makers to markedly underestimate the soil VOC concentrations and, therefore, to greatly underestimate the ecological

  20. Sample and data processing considerations for the NIST quantitative infrared database

    NASA Astrophysics Data System (ADS)

    Chu, Pamela M.; Guenther, Franklin R.; Rhoderick, George C.; Lafferty, Walter J.; Phillips, William

    1999-02-01

    Fourier-transform infrared (FT-IR) spectrometry has become a useful real-time in situ analytical technique for quantitative gas phase measurements. In fact, the U.S. Environmental Protection Agency (EPA) has recently approved open-path FT-IR monitoring for the determination of hazardous air pollutants (HAP) identified in EPA's Clean Air Act of 1990. To support infrared based sensing technologies, the National Institute of Standards and Technology (NIST) is currently developing a standard quantitative spectral database of the HAPs based on gravimetrically prepared standard samples. The procedures developed to ensure the quantitative accuracy of the reference data are discussed, including sample preparation, residual sample contaminants, data processing considerations, and estimates of error.

  1. Cancer classification through filtering progressive transductive support vector machine based on gene expression data

    NASA Astrophysics Data System (ADS)

    Lu, Xinguo; Chen, Dan

    2017-08-01

    Traditional supervised classifiers work only with labeled data and neglect the large amount of data that lacks sufficient follow-up information. Consequently, the small sample size limits the design of an appropriate classifier. In this paper, a transductive learning method is addressed that combines a filtering strategy within the transductive framework with a progressive labeling strategy. The progressive labeling strategy does not need to consider the distribution of labeled samples to evaluate the distribution of unlabeled samples, and can effectively solve the problem of evaluating the proportion of positive and negative samples in the working set. Our experimental results demonstrate that the proposed technique has great potential in cancer prediction based on gene expression.

  2. Photonic devices based on patterning by two photon induced polymerization techniques

    NASA Astrophysics Data System (ADS)

    Fortunati, I.; Dainese, T.; Signorini, R.; Bozio, R.; Tagliazucca, V.; Dirè, S.; Lemercier, G.; Mulatier, J.-C.; Andraud, C.; Schiavuta, P.; Rinaldi, A.; Licoccia, S.; Bottazzo, J.; Franco Perez, A.; Guglielmi, M.; Brusatin, G.

    2008-04-01

    Two- and three-dimensional structures with micron and submicron resolution have been achieved in commercial resists, polymeric materials, and sol-gel materials by several lithographic techniques. In this context, silicon-based sol-gel materials are particularly interesting because of their versatility, chemical and thermal stability, and the amount of active compounds that can be embedded. Compared with other micro- and nano-fabrication schemes, Two Photon Induced Polymerization is unique in its 3D processing capability. The photopolymerization is performed with a laser beam in the near-IR region, where samples show less absorption and less scattering, giving rise to a deeper penetration of the light. The use of ultrashort laser pulses allows nonlinear processes such as multiphoton absorption to start at relatively low average power without thermally damaging the samples. In this work we report results on the photopolymerization process in hybrid organic-inorganic films based on photopolymerizable methacrylate-containing Si nanobuilding blocks. The films, obtained through sol-gel synthesis, are doped with a photo-initiator allowing radical polymerization of the methacrylic groups. The photo-initiator is activated by a femtosecond laser source at different input energies. The development of the unexposed regions is performed with a suitable solvent, and the photopolymerized structures are characterized by microscopy techniques.

  3. Radiomics-based Prognosis Analysis for Non-Small Cell Lung Cancer

    NASA Astrophysics Data System (ADS)

    Zhang, Yucheng; Oikonomou, Anastasia; Wong, Alexander; Haider, Masoom A.; Khalvati, Farzad

    2017-04-01

    Radiomics characterizes tumor phenotypes by extracting large numbers of quantitative features from radiological images. Radiomic features have been shown to provide prognostic value in predicting clinical outcomes in several studies. However, several challenges including feature redundancy, unbalanced data, and small sample sizes have led to relatively low predictive accuracy. In this study, we explore different strategies for overcoming these challenges and improving predictive performance of radiomics-based prognosis for non-small cell lung cancer (NSCLC). CT images of 112 patients (mean age 75 years) with NSCLC who underwent stereotactic body radiotherapy were used to predict recurrence, death, and recurrence-free survival using a comprehensive radiomics analysis. Different feature selection and predictive modeling techniques were used to determine the optimal configuration of prognosis analysis. To address feature redundancy, comprehensive analysis indicated that Random Forest models and Principal Component Analysis were optimum predictive modeling and feature selection methods, respectively, for achieving high prognosis performance. To address unbalanced data, Synthetic Minority Over-sampling technique was found to significantly increase predictive accuracy. A full analysis of variance showed that data endpoints, feature selection techniques, and classifiers were significant factors in affecting predictive accuracy, suggesting that these factors must be investigated when building radiomics-based predictive models for cancer prognosis.
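The Synthetic Minority Over-sampling Technique (SMOTE) credited above with improving accuracy on unbalanced data creates new minority-class points by interpolating between a minority sample and one of its nearest minority-class neighbours. A minimal pure-Python sketch with toy 2-D feature vectors (an illustrative reconstruction, not the study's implementation):

```python
import random

# Minimal SMOTE sketch: each synthetic point lies on the segment between a
# randomly chosen minority sample and one of its k nearest minority neighbours.
def smote(minority, n_new, k=2, seed=0):
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # k nearest minority neighbours of a (squared Euclidean distance).
        neighbours = sorted((p for p in minority if p is not a),
                            key=lambda p: sum((x - y) ** 2
                                              for x, y in zip(a, p)))[:k]
        b = rng.choice(neighbours)
        lam = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(x + lam * (y - x) for x, y in zip(a, b)))
    return synthetic

# Toy minority class of three radiomic feature vectors.
minority = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]
new_points = smote(minority, n_new=3)
print(len(new_points))  # 3 synthetic minority samples
```

Interpolating rather than duplicating keeps the synthetic points inside the minority region of feature space, which is what lets the classifier see a denser, but still plausible, minority class.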

  4. Differentiation of four Aspergillus species and one Zygosaccharomyces with two electronic tongues based on different measurement techniques.

    PubMed

    Söderström, C; Rudnitskaya, A; Legin, A; Krantz-Rülcker, C

    2005-09-29

    Two electronic tongues based on different measurement techniques were applied to the discrimination of four molds and one yeast. The chosen microorganisms were different species of Aspergillus and the yeast species Zygosaccharomyces bailii, which are known as food contaminants. The electronic tongue developed at Linköping University was based on voltammetry; four working electrodes made of noble metals were used in a standard three-electrode configuration. The St. Petersburg electronic tongue consisted of 27 potentiometric chemical sensors with enhanced cross-sensitivity; sensors with chalcogenide glass and plasticized PVC membranes were used. Two sets of samples were measured using both electronic tongues. Firstly, broths were measured in which either one of the molds or the yeast had grown until the late logarithmic phase or the border of the stationary phase. Secondly, broths inoculated with either one of the molds or the yeast were measured at five different times during microorganism growth. Data were evaluated using principal component analysis (PCA), partial least squares regression (PLS), and linear discriminant analysis (LDA). It was found that both measurement techniques could differentiate between the fungal species. Merging data from both electronic tongues improved differentiation of the samples in selected cases.

  5. Molecular cancer classification using a meta-sample-based regularized robust coding method.

    PubMed

    Wang, Shu-Lin; Sun, Liuchao; Fang, Jianwen

    2014-01-01

    Previous studies have demonstrated that machine learning-based molecular cancer classification using gene expression profiling (GEP) data is promising for the clinical diagnosis and treatment of cancer. Novel classification methods with high efficiency and prediction accuracy are still needed to deal with the high dimensionality and small sample size of typical GEP data. Recently, the sparse representation (SR) method has been successfully applied to cancer classification. Nevertheless, its efficiency needs to be improved when analyzing large-scale GEP data. In this paper we present meta-sample-based regularized robust coding classification (MRRCC), a novel effective cancer classification technique that combines the idea of the meta-sample-based cluster method with the regularized robust coding (RRC) method. It assumes that the coding residual and the coding coefficient are respectively independent and identically distributed. Similar to meta-sample-based SR classification (MSRC), MRRCC extracts a set of meta-samples from the training samples and then encodes a testing sample as a sparse linear combination of these meta-samples. The representation fidelity is measured by the l2-norm or l1-norm of the coding residual. Extensive experiments on publicly available GEP datasets demonstrate that the proposed method is more efficient, while its prediction accuracy is equivalent to existing MSRC-based methods and better than other state-of-the-art dimension reduction-based methods.

  6. Environmental Survey Plans, Fort Sheridan, Sampling and Analysis Plan, Fort Sheridan, Illinois

    DTIC Science & Technology

    1990-07-01

    technique. The flameless AA procedure is based on the absorption of radiation at 253.7 nm by Hg vapor. The Hg is reduced to the elemental state and aerated...desiccator, cupric oxide is added, and the sample is combusted in an induction furnace. The organic carbon content is determined through a calculation in

  7. Developing sediment remediation goals at superfund sites based on pore water for the protection of benthic organisms from direct toxicity to non-ionic organic contaminants (presentation)

    EPA Science Inventory

    Passive sampling is becoming a frequently used measurement technique at Superfund sites with contaminated sediments. Passive sampling measures the concentrations of freely dissolved chemicals (Cfrees) in the sediment pore water. Cfree has been found to be a very practical means f...

  8. A Development of Participation of Primary School Students in Conservation of School Environments

    ERIC Educational Resources Information Center

    Klongyut, Somsak; Singseewo, Adisak; Suksringarm, Paitool

    2015-01-01

    This study aimed to investigate and compare knowledge, attitudes and participating behaviors of students who participated in a training session. A training manual based on the participatory process was used. The sample consisted of 30 grade 5 students and 30 grade 6 students using a voluntary sampling technique. Research instruments included 1) a…

  9. Implementation of Structured Inquiry Based Model Learning toward Students' Understanding of Geometry

    ERIC Educational Resources Information Center

    Salim, Kalbin; Tiawa, Dayang Hjh

    2015-01-01

    The purpose of this study is implementation of a structured inquiry learning model in instruction of geometry. The model used is a model with a quasi-experimental study amounted to two classes of samples selected from the population of the ten classes with cluster random sampling technique. Data collection tool consists of a test item…

  10. Assessing the precision of a time-sampling-based study among GPs: balancing sample size and measurement frequency.

    PubMed

    van Hassel, Daniël; van der Velden, Lud; de Bakker, Dinny; van der Hoek, Lucas; Batenburg, Ronald

    2017-12-04

    Our research is based on a technique for time sampling, an innovative method for measuring the working hours of Dutch general practitioners (GPs), which was deployed in an earlier study. In that study, 1051 GPs were questioned about their activities in real time by sending them one SMS text message every 3 h during 1 week. The required sample size for this method is important for health workforce planners to know if they want to apply it to target groups who are hard to reach or if fewer resources are available. For this time-sampling method, however, a standard power analysis is not sufficient for calculating the required sample size, as it accounts only for sample fluctuation and not for the fluctuation of measurements taken from every participant. We investigated the impact of the number of participants and the frequency of measurements per participant upon the confidence intervals (CIs) for the hours worked per week. Statistical analyses of the time-use data we obtained from GPs were performed. Ninety-five percent CIs were calculated, using equations and simulation techniques, for different numbers of GPs included in the dataset and different frequencies of measurements per participant. Our results showed that the one-tailed CI, including sample and measurement fluctuation, decreased from 21 h to 3 h as the number of GPs increased from one to 50. Beyond that point, precision continued to improve, but the gain from each additional GP became smaller. Likewise, the analyses showed how the number of participants required decreased if more measurements per participant were taken. For example, one measurement per 3-h time slot during the week requires 300 GPs to achieve a CI of 1 h, while one measurement per hour requires 100 GPs to obtain the same result. The sample size needed for time-use research based on a time-sampling technique depends on the design and aim of the study. 
In this paper, we showed how the precision of the measurement of hours worked each week by GPs varied strongly according to the number of GPs included and the frequency of measurements per GP during the week measured. The best balance between the two dimensions depends on the circumstances, such as the target group and the budget available.
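
    The two sources of fluctuation can be made concrete with a small Monte-Carlo sketch: between-GP variation in true working hours, plus within-GP noise from probing only a limited number of time slots. All numbers below (working fraction 0.26, between-GP spread, slot counts) are invented for illustration and are not the study's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def ci_halfwidth(n_gps, n_slots, p_work=0.26, week_h=168, reps=1000):
    """Monte-Carlo half-width of the 95% CI for mean weekly working hours,
    combining sample fluctuation (between GPs) and measurement fluctuation
    (random time-slot probes within each GP's week)."""
    means = np.empty(reps)
    for r in range(reps):
        p_i = np.clip(rng.normal(p_work, 0.05, n_gps), 0, 1)  # between-GP variation
        frac = rng.binomial(n_slots, p_i) / n_slots           # within-GP probe noise
        means[r] = (frac * week_h).mean()
    return 1.96 * means.std()
```

    More GPs or more probes per GP both tighten the interval, mirroring the trade-off between sample size and measurement frequency discussed above.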

  11. Enhancement of low sampling frequency recordings for ECG biometric matching using interpolation.

    PubMed

    Sidek, Khairul Azami; Khalil, Ibrahim

    2013-01-01

    Electrocardiogram (ECG) based biometric matching suffers from high misclassification error with lower sampling frequency data. This situation may lead to an unreliable and vulnerable identity authentication process in high security applications. In this paper, quality enhancement techniques for ECG data with low sampling frequency are proposed for person identification, based on piecewise cubic Hermite interpolation (PCHIP) and piecewise cubic spline interpolation (SPLINE). A total of 70 ECG recordings from 4 different public ECG databases with 2 different sampling frequencies were used for development and performance comparison purposes. An analytical method was used for feature extraction. The ECG recordings were segmented into two parts: the enrolment and recognition datasets. Three biometric matching methods, namely Cross Correlation (CC), Percent Root-Mean-Square Deviation (PRD) and Wavelet Distance Measurement (WDM), were used for performance evaluation before and after applying the interpolation techniques. Results of the experiments suggest that biometric matching with interpolated ECG data on average achieved matching percentage values higher by up to 4% for CC, 3% for PRD and 94% for WDM, compared with the existing method using ECG recordings with lower sampling frequency. Moreover, increasing the sample size from 56 to 70 subjects improved the results of the experiment by 4% for CC, 14.6% for PRD and 0.3% for WDM. Furthermore, higher classification accuracy of up to 99.1% for PCHIP and 99.2% for SPLINE with interpolated ECG data, compared with up to 97.2% without interpolation, supports the study's claim that applying interpolation techniques enhances the quality of the ECG data. Crown Copyright © 2012. Published by Elsevier Ireland Ltd. All rights reserved.
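
    A minimal sketch of the upsampling step using SciPy's `PchipInterpolator` and `CubicSpline`. The Gaussian pulse stands in for a real ECG beat, and the 128 Hz to 512 Hz rates are example values, not the databases' actual frequencies.

```python
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

def upsample(sig, fs_in, fs_out, kind="pchip"):
    """Resample a 1-D signal onto a denser uniform time grid with
    piecewise-cubic interpolation (PCHIP avoids overshoot; SPLINE is smoother)."""
    t_in = np.arange(len(sig)) / fs_in
    t_out = np.arange(0, t_in[-1], 1.0 / fs_out)
    f = PchipInterpolator(t_in, sig) if kind == "pchip" else CubicSpline(t_in, sig)
    return f(t_out)

t = np.arange(0, 1, 1.0 / 128)
pulse = np.exp(-((t - 0.5) ** 2) / 0.002)   # stand-in for an R-wave at 128 Hz
dense = upsample(pulse, 128, 512)           # 4x denser version for matching
```

    The interpolant passes through the original samples, so matching statistics such as cross correlation can be computed on the denser grid without altering the recorded sample values.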

  12. Separation techniques for the clean-up of radioactive mixed waste for ICP-AES/ICP-MS analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swafford, A.M.; Keller, J.M.

    1993-03-17

    Two separation techniques were investigated for the clean-up of typical radioactive mixed waste samples requiring elemental analysis by Inductively Coupled Plasma-Atomic Emission Spectroscopy (ICP-AES) or Inductively Coupled Plasma-Mass Spectrometry (ICP-MS). These measurements frequently involve regulatory or compliance criteria which include the determination of elements on the EPA Target Analyte List (TAL). These samples usually consist of both an aqueous phase and a solid phase which is mostly an inorganic sludge. Frequently, samples taken from the waste tanks contain high levels of uranium and thorium which can cause spectral interferences in ICP-AES or ICP-MS analysis. The removal of these interferences is necessary to determine the presence of the EPA TAL elements in the sample. Two clean-up methods were studied on simulated aqueous waste samples containing the EPA TAL elements. The first method studied was a classical procedure based upon liquid-liquid extraction using tri-n-octylphosphine oxide (TOPO) dissolved in cyclohexane. The second method investigated was based on more recently developed techniques using extraction chromatography, specifically the use of a commercially available Eichrom TRU·Spec™ column. Literature on these two methods indicates the efficient removal of uranium and thorium from properly prepared samples and provides considerable qualitative information on the extraction behavior of many other elements. However, there is a lack of quantitative data on the extraction behavior of elements on the EPA Target Analyte List. Experimental studies on these two methods consisted of determining whether any of the analytes were extracted by these methods and the recoveries obtained. Both methods produced similar results; the EPA target analytes were only slightly extracted or not extracted at all. Advantages and disadvantages of each method were evaluated and found to be comparable.

  13. Quantum-classical boundary for precision optical phase estimation

    NASA Astrophysics Data System (ADS)

    Birchall, Patrick M.; O'Brien, Jeremy L.; Matthews, Jonathan C. F.; Cable, Hugo

    2017-12-01

    Understanding the fundamental limits on the precision to which an optical phase can be estimated is of key interest for many investigative techniques utilized across science and technology. We study the estimation of a fixed optical phase shift due to a sample which has an associated optical loss, and compare phase estimation strategies using classical and nonclassical probe states. These comparisons are based on the attainable (quantum) Fisher information calculated per number of photons absorbed or scattered by the sample throughout the sensing process. We find that for a given number of incident photons upon the unknown phase, nonclassical techniques in principle provide less than a 20% reduction in root-mean-square error (RMSE) in comparison with ideal classical techniques in multipass optical setups. Using classical techniques in a different optical setup that we analyze, which incorporates additional stages of interference during the sensing process, the achievable reduction in RMSE afforded by nonclassical techniques falls to only ≃4%. We explain how these conclusions change when nonclassical techniques are compared to classical probe states in nonideal multipass optical setups, with additional photon losses due to the measurement apparatus.

  14. Refined genetic algorithm -- Economic dispatch example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sheble, G.B.; Brittig, K.

    1995-02-01

    A genetic-based algorithm is used to solve an economic dispatch (ED) problem. The algorithm utilizes payoff information of prospective solutions to evaluate optimality. Thus, the constraints of classical Lagrangian techniques on unit curves are eliminated. Using an economic dispatch problem as a basis for comparison, several different techniques which enhance program efficiency and accuracy, such as mutation prediction, elitism, interval approximation and penalty factors, are explored. Two unique genetic algorithms are also compared. The results are verified for a sample problem using a classical technique.
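
    A minimal genetic algorithm for a toy three-unit dispatch problem illustrates two of the enhancements named above, elitism and penalty factors. The quadratic cost coefficients, unit limits, demand, and GA settings are invented for illustration and are not from the paper.

```python
import random

random.seed(1)

# (c, b, a) coefficients of cost c*P^2 + b*P + a per unit, and P limits [MW]
UNITS = [(0.008, 7.0, 200.0), (0.009, 6.3, 180.0), (0.007, 6.8, 140.0)]
LIMITS = [(100.0, 600.0), (100.0, 400.0), (50.0, 300.0)]
DEMAND = 850.0

def cost(p):
    """Fuel cost plus a penalty factor enforcing the power-balance constraint."""
    fuel = sum(c * x * x + b * x + a for (c, b, a), x in zip(UNITS, p))
    return fuel + 1000.0 * abs(sum(p) - DEMAND)

def ga(pop=60, gens=300, mut=0.1):
    popn = [[random.uniform(lo, hi) for lo, hi in LIMITS] for _ in range(pop)]
    for _ in range(gens):
        popn.sort(key=cost)
        nxt = popn[:2]                            # elitism: carry the two best over
        while len(nxt) < pop:
            pa, pb = random.sample(popn[:20], 2)  # parents drawn from the fittest
            cut = random.randrange(1, len(LIMITS))
            child = pa[:cut] + pb[cut:]           # one-point crossover
            for i, (lo, hi) in enumerate(LIMITS):
                if random.random() < mut:         # bounded Gaussian mutation
                    child[i] = min(hi, max(lo, child[i] + random.gauss(0, 20)))
            nxt.append(child)
        popn = nxt
    return min(popn, key=cost)

best = ga()
```

    Because the penalty term prices imbalance far above any fuel-cost saving, the surviving individuals respect the demand constraint without any Lagrangian machinery, which is the point made in the abstract.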

  15. Evaluation of three techniques for classifying urban land cover patterns using LANDSAT MSS data. [New Orleans, Louisiana

    NASA Technical Reports Server (NTRS)

    Baumann, P. R. (Principal Investigator)

    1979-01-01

    Three computer quantitative techniques for determining urban land cover patterns are evaluated. The techniques examined deal with the selection of training samples by an automated process, the overlaying of two scenes from different seasons of the year, and the use of individual pixels as training points. Evaluation is based on the number and type of land cover classes generated and the marks obtained from an accuracy test. New Orleans, Louisiana and its environs form the study area.

  16. Evaluation of a native vegetation masking technique

    NASA Technical Reports Server (NTRS)

    Kinsler, M. C.

    1984-01-01

    A crop masking technique based on Ashburn's vegetative index (AVI) was used to evaluate native vegetation as an indicator of crop moisture condition. A mask of the range areas (native vegetation) was generated for each of thirteen Great Plains LANDSAT MSS sample segments. These masks were compared to the digitized ground truth and accuracies were computed. An analysis of the types of errors indicates a consistency in errors among the segments. The mask represents a simple quick-look technique for evaluating vegetative cover.

  17. Electrical field-induced extraction and separation techniques: promising trends in analytical chemistry--a review.

    PubMed

    Yamini, Yadollah; Seidi, Shahram; Rezazadeh, Maryam

    2014-03-03

    Sample preparation is an important issue in analytical chemistry, and is often a bottleneck in chemical analysis. Thus, the major incentive for recent research has been to attain faster, simpler, less expensive, and more environmentally friendly sample preparation methods. The use of auxiliary energies, such as heat, ultrasound, and microwaves, is one of the strategies that have been employed in sample preparation to reach the above purposes. Application of an electrical driving force is the current state of the art, which presents new possibilities for simplifying and shortening the sample preparation process as well as enhancing its selectivity. The electrical driving force has scarcely been utilized in comparison with other auxiliary energies. In this review, the different roles of the electrical driving force (as a powerful auxiliary energy) in various extraction techniques, including liquid-, solid-, and membrane-based methods, are considered. References relevant to developments in separation techniques and Lab-on-a-Chip (LOC) systems are also provided. Not all aspects of the electrical driving force in extraction and separation methods can be treated in detail in this contribution. Rather, the main aim of this review is to provide brief knowledge of the different fields of analytical chemistry, with an emphasis on the latest efforts put into electrically assisted membrane-based sample preparation systems. The advantages and disadvantages of these approaches as well as the new achievements in these areas are discussed, which might be helpful for further progress in the future. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Application of the angle measure technique as image texture analysis method for the identification of uranium ore concentrate samples: New perspective in nuclear forensics.

    PubMed

    Fongaro, Lorenzo; Ho, Doris Mer Lin; Kvaal, Knut; Mayer, Klaus; Rondinella, Vincenzo V

    2016-05-15

    The identification of interdicted nuclear or radioactive materials requires the application of dedicated techniques. In this work, a new approach for characterizing powders of uranium ore concentrates (UOCs) is presented. It is based on image texture analysis and multivariate data modelling. 26 different UOC samples were evaluated by applying the Angle Measure Technique (AMT) algorithm to extract textural features from sample images acquired at 250× and 1000× magnification by Scanning Electron Microscope (SEM). At both magnifications, this method proved effective for classifying the different types of UOC powder based on the surface characteristics that depend on particle size, homogeneity, and graininess and are related to the composition and processes used in the production facilities. Using the outcome data from the application of the AMT algorithm, the total explained variance was higher than 90% with Principal Component Analysis (PCA), while partial least squares discriminant analysis (PLS-DA), applied only to the 14 black UOC powder samples, allowed their classification solely on the basis of their surface texture features (sensitivity > 0.6; specificity > 0.6). This preliminary study shows that the method was able to distinguish samples with similar composition but obtained from different facilities. The mean angle spectral data obtained by image texture analysis using the AMT algorithm can be considered a specific fingerprint or signature of UOCs and could be used for nuclear forensic investigation. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Method for concentration and separation of biological organisms by ultrafiltration and dielectrophoresis

    DOEpatents

    Simmons, Blake A.; Hill, Vincent R.; Fintschenko, Yolanda; Cummings, Eric B.

    2012-09-04

    Disclosed is a method for monitoring sources of public water supply for a variety of pathogens by using a combination of ultrafiltration techniques together with dielectrophoretic separation techniques. Because water-borne pathogens, whether present due to "natural" contamination or intentional introduction, would likely be present in drinking water at low concentrations when samples are collected for monitoring or outbreak investigations, an approach is needed to quickly and efficiently concentrate and separate particles such as viruses, bacteria, and parasites in large volumes of water (e.g., 100 L or more) while simultaneously reducing the sample volume to levels sufficient for detecting low concentrations of microbes (e.g., <10 mL). The technique is also designed to screen the separated microbes based on specific conductivity and size.

  20. Automation of static and dynamic non-dispersive liquid phase microextraction. Part 1: Approaches based on extractant drop-, plug-, film- and microflow-formation.

    PubMed

    Alexovič, Michal; Horstkotte, Burkhard; Solich, Petr; Sabo, Ján

    2016-02-04

    Simplicity, effectiveness, swiftness, and environmental friendliness - these are the typical requirements for state-of-the-art development of green analytical techniques. Liquid phase microextraction (LPME) stands for a family of elegant sample pretreatment and analyte preconcentration techniques preserving these principles in numerous applications. By using only fractions of solvent and sample compared to classical liquid-liquid extraction, the extraction kinetics, the preconcentration factor, and the cost efficiency can be increased. Moreover, significant improvements can be made by automation, which is still a hot topic in analytical chemistry. This review surveys comprehensively, and in two parts, the developments in automation of non-dispersive LPME methodologies performed in static and dynamic modes. Their advantages and limitations and the reported analytical performances are discussed and put into perspective with the corresponding manual procedures. The automation strategies, techniques, and their operational advantages as well as their potentials are further described and discussed. In this first part, an introduction to LPME, its static and dynamic operation modes, and its automation methodologies is given. The LPME techniques are classified according to the different approaches to protecting the extraction solvent, using either a tip-like (needle/tube/rod) support (drop-based approaches), a wall support (film-based approaches), or microfluidic devices. In the second part, the LPME techniques based on porous supports for the extraction solvent, such as membranes and porous media, are overviewed. An outlook on future demands and perspectives in this promising area of analytical chemistry is finally given. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Microbiological Sampling Methods and Sanitation of Edible Plants Grown on ISS

    NASA Technical Reports Server (NTRS)

    Parrish, Charles H. II; Khodadad, Christina L.; Garland, Nathaniel T.; Larson, Brian D.; Hummreick, Mary E.

    2013-01-01

    Pathogenic microbes on the surfaces of salad crops and growth chambers pose a threat to the health of crew on the International Space Station. For astronauts to safely consume space-grown vegetables produced in NASA's new vegetable production unit, VEGGIE, three technical challenges must be overcome: real-time sampling, microbiological analysis, and sanitation. Raphanus sativus cultivar Cherry Bomb II and Lactuca sativa cultivar Outredgeous, two salad crops to be grown in VEGGIE, were inoculated with Salmonella enterica serovar Typhimurium (S. Typhimurium), a bacterium known to cause food-borne illness. Tape- and swab-based sampling techniques were optimized for use in microgravity and assessed for effectiveness in recovery of bacteria from crop surfaces. Rapid pathogen detection and molecular analyses were performed via quantitative real-time polymerase chain reaction using LightCycler® 480 and RAZOR® EX, a scaled-down instrument that is undergoing evaluation and testing for future flight hardware. These methods were compared with conventional, culture-based methods for the recovery of S. Typhimurium colonies. A sterile wipe saturated with a citric acid-based, food-grade sanitizer was applied to two different surface materials used in VEGGIE flight hardware that had been contaminated with the bacterium Pseudomonas aeruginosa, another known human pathogen. To sanitize surfaces, wipes were saturated with either the sanitizer or sterile deionized water and applied to each surface. Colony-forming units of P. aeruginosa grown on tryptic soy agar plates were enumerated from surface samples after sanitization treatments. Depending on the VEGGIE hardware material, 2- to 4.5-log10 reductions in colony-forming units were observed after sanitization. The difference in recovery of S. Typhimurium between tape- and swab-based sampling techniques was insignificant. RAZOR® EX rapidly detected S. Typhimurium present in both raw culture and extracted DNA samples.

  2. Validation and application of Acoustic Mapping Velocimetry

    NASA Astrophysics Data System (ADS)

    Baranya, Sandor; Muste, Marian

    2016-04-01

    The goal of this paper is to introduce a novel methodology to estimate bedload transport in rivers based on an improved bedform tracking procedure. The measurement technique combines components and processing protocols from two contemporary nonintrusive instruments: acoustic and image-based. The bedform mapping is conducted with acoustic surveys while the estimation of the velocity of the bedforms is obtained with processing techniques pertaining to image-based velocimetry. The technique is therefore called Acoustic Mapping Velocimetry (AMV). The implementation of this technique produces a whole-field velocity map associated with the multi-directional bedform movement. Based on the calculated two-dimensional bedform migration velocity field, the bedload transport estimation is done using the Exner equation. A proof-of-concept experiment was performed to validate the AMV based bedload estimation in a laboratory flume at IIHR-Hydroscience & Engineering (IIHR). The bedform migration was analysed at three different flow discharges. Repeated bed geometry mapping, using a multiple transducer array (MTA), provided acoustic maps, which were post-processed with a particle image velocimetry (PIV) method. Bedload transport rates were calculated along longitudinal sections using the streamwise components of the bedform velocity vectors and the measured bedform heights. The bulk transport rates were compared with the results from concurrent direct physical samplings and acceptable agreement was found. As a first field implementation of the AMV an attempt was made to estimate bedload transport for a section of the Ohio River in the United States, where bed geometry maps produced by repeated multibeam echo sounder (MBES) surveys served as input data. 
Cross-sectional distributions of bedload transport rates from the AMV based method were compared with those obtained from ISSDOTv2, another non-intrusive technique, developed by the US Army Corps of Engineers (direct samplings were not available). The good agreement between the results from the two different methods is encouraging and suggests further field tests in varying hydro-morphological situations.
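
    The Exner-based conversion at the core of AMV reduces, for idealized triangular dunes, to a one-line formula relating bedform migration velocity and height to the unit bedload rate. The porosity, shape factor, and example numbers below are illustrative, not values from the study.

```python
def bedload_rate(v_bedform, height, porosity=0.4, shape=0.5):
    """Unit bedload transport rate q_b = (1 - p) * shape * H * V_c  [m^2/s],
    with shape = 0.5 for triangular bedforms (Exner-equation form used in
    bedform-tracking methods)."""
    return (1.0 - porosity) * shape * height * v_bedform

# A 0.2 m high dune migrating at 1 m per hour:
q_b = bedload_rate(v_bedform=1.0 / 3600.0, height=0.2)
```

    Integrating such per-section rates across the channel width gives the bulk transport rate that the study compares against direct physical samplings.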

  3. DART-MS: A New Analytical Technique for Forensic Paint Analysis.

    PubMed

    Marić, Mark; Marano, James; Cody, Robert B; Bridge, Candice

    2018-06-05

    Automotive paint evidence is one of the most significant forms of evidence obtained in automotive-related incidents. Therefore, the analysis of automotive paint evidence is imperative in forensic casework. Most analytical schemes for automotive paint characterization involve optical microscopy, followed by infrared spectroscopy and pyrolysis-gas chromatography mass spectrometry (py-GCMS) if required. The main drawback with py-GCMS, aside from its destructive nature, is that this technique is relatively time-intensive in comparison to other techniques. Direct analysis in real time-time-of-flight mass spectrometry (DART-TOFMS) may provide an alternative to py-GCMS, as the rapidity of analysis and minimal sample preparation afford a significant advantage. In this study, automotive clear coats from four vehicles were characterized by DART-TOFMS and a standard py-GCMS protocol. Principal component analysis was utilized to interpret the resultant data and suggested the two techniques provided analogous sample discrimination. Moreover, in some instances DART-TOFMS was able to identify components not observed by py-GCMS and vice versa, which indicates that the two techniques may provide complementary information. Additionally, a thermal desorption/pyrolysis DART-TOFMS methodology was also evaluated to characterize intact paint chips from the vehicles, to ascertain whether the linear temperature gradient provided additional discriminatory information. All the paint samples could be discriminated based on the distinctive thermal desorption plots afforded by this technique, which may also be utilized for sample discrimination. On the basis of the results, DART-TOFMS may provide an additional tool for the forensic paint examiner.
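
    The PCA step used for sample discrimination can be sketched with a plain SVD. The four-channel "spectra" below are synthetic stand-ins, not DART-TOFMS data.

```python
import numpy as np

def pca_scores(X, k=2):
    """Scores of the rows of X on the first k principal components,
    computed via SVD of the column-centered data matrix."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

# Two groups of noisy synthetic "spectra" separate cleanly along PC1
rng = np.random.default_rng(0)
a, b = np.array([1.0, 0.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0, 0.0])
X = np.vstack([a + 0.05 * rng.standard_normal(4) for _ in range(5)]
              + [b + 0.05 * rng.standard_normal(4) for _ in range(5)])
pc1 = pca_scores(X, 1).ravel()
```

    When the between-group difference dominates the within-group noise, as here, the first component aligns with the group difference and the score plot shows two well-separated clusters.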

  4. Determinations of rare earth element abundance and U-Pb age of zircons using multispot laser ablation-inductively coupled plasma mass spectrometry.

    PubMed

    Yokoyama, Takaomi D; Suzuki, Toshihiro; Kon, Yoshiaki; Hirata, Takafumi

    2011-12-01

    We have developed a new calibration technique for multielement determination and U-Pb dating of zircon samples using laser ablation-inductively coupled plasma mass spectrometry (ICPMS) coupled with galvanometric optics. With the galvanometric optics, laser ablation of two or more sample materials could be achieved in very short time intervals (~10 ms). The resulting sample aerosols released from different ablation pits or different solid samples were mixed and homogenized within the sample cell and then transported into the ICP ion source. Multiple spot laser ablation enables spiking of analytes or internal standard elements directly into the solid samples, and therefore the standard addition calibration method can be applied for the determination of trace elements in solid samples. In this study, we have measured the rare earth element (REE) abundances of two zircon samples (Nancy 91500 and Plešovice) based on the standard addition technique, using a direct spiking of analytes through a multispot laser ablation of the glass standard material (NIST SRM612). The resulting REE abundance data show good agreement with previously reported values within the analytical uncertainties achieved in this study (10% for most elements). Our experiments demonstrated that nonspectroscopic interferences on 14 REEs could be significantly reduced by the standard addition technique employed here. Another advantage of galvanometric devices is the accumulation of sample aerosol released from multiple spots. In this study we have measured the U-Pb age of a zircon sample (LMR) using an accumulation of sample aerosols released from 10 separate ablation pits of small diameters (~8 μm). The resulting (238)U-(206)Pb age data for the LMR zircons was 369 ± 64 Ma, which is in good agreement with previously reported age data (367.6 ± 1.5 Ma). 
The data obtained here clearly demonstrate that the multiple spot laser ablation-ICPMS technique can become a powerful approach for elemental and isotopic ratio measurements in solid materials.
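
    The standard addition calibration at the heart of the method extrapolates the spiked response line back to zero signal. A minimal numeric sketch with invented concentrations and response factor:

```python
import numpy as np

def standard_addition(added, signal):
    """Fit signal = m*added + b by least squares; the unknown concentration
    is b/m (the magnitude of the x-intercept of the addition line)."""
    m, b = np.polyfit(added, signal, 1)
    return b / m

# Ideal linear response: true concentration 2.0, sensitivity 5 counts/unit
added = np.array([0.0, 1.0, 2.0, 3.0])
signal = 5.0 * (2.0 + added)
conc = standard_addition(added, signal)
```

    Because the unknown and the spikes share the same matrix, multiplicative (nonspectroscopic) interferences cancel in the ratio b/m, which is why the abstract reports reduced interferences on the REEs.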

  5. Development and Application of Optical Coherence Elastography for Detection of Mechanical Property Changes Occurring in Early Osteoarthritis

    NASA Astrophysics Data System (ADS)

    Hirota, Koji

    We demonstrate a computationally efficient method for optical coherence elastography (OCE) based on the fringe washout method for a spectral-domain OCT (SD-OCT) system. By sending short pulses of mechanical perturbation with ultrasound or shock waves during the image acquisition of alternating depth profiles, we can extract cross-sectional mechanical assessment of tissue in real time. This was achieved through a simple comparison of the intensity in adjacent depth profiles acquired during the states of perturbation and non-perturbation, in order to quantify the degree of induced fringe washout. Although the results indicate that our OCE technique based on the fringe washout effect is sensitive enough to detect mechanical property changes in biological samples, there is some loss of sensitivity in comparison to previous techniques, traded for computational efficiency and minimal modification of both hardware and software in the OCT system. A tissue phantom study was carried out with samples of various agar densities to characterize our OCE technique. Young's modulus measurements were performed with atomic force microscopy (AFM) to correlate with our OCE assessment. Knee cartilage samples from monosodium iodoacetate (MIA) rat models were utilized to replicate cartilage damage in a human model. Our proposed OCE technique, along with intensity and AFM measurements, was applied to the MIA models to assess the damage. The results from both the phantom study and the MIA model study demonstrated the strong capability of the OCE technique to assess changes in mechanical properties. The correlation between the OCE measurements and the Young's modulus values showed that stiffer material exhibited a smaller magnitude of fringe washout. This result is attributed to the fact that, under axial motion, the displacement of the scatterers in stiffer samples in response to the external perturbation induces less fringe washout.

  6. Recent advancements in nanoelectrodes and nanopipettes used in combined scanning electrochemical microscopy techniques.

    PubMed

    Kranz, Christine

    2014-01-21

    In recent years, major developments in scanning electrochemical microscopy (SECM) have significantly broadened the application range of this electroanalytical technique from high-resolution electrochemical imaging via nanoscale probes to large scale mapping using arrays of microelectrodes. A major driving force in advancing the SECM methodology is based on developing more sophisticated probes beyond conventional micro-disc electrodes usually based on noble metals or carbon microwires. This critical review focuses on the design and development of advanced electrochemical probes particularly enabling combinations of SECM with other analytical measurement techniques to provide information beyond exclusively measuring electrochemical sample properties. Consequently, this critical review will focus on recent progress and new developments towards multifunctional imaging.

  7. New approaches to the analysis of complex samples using fluorescence lifetime techniques and organized media

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hertz, P.R.

    Fluorescence spectroscopy is a highly sensitive and selective tool for the analysis of complex systems. In order to investigate the efficacy of several steady state and dynamic techniques for the analysis of complex systems, this work focuses on two types of complex, multicomponent samples: petrolatums and coal liquids. It is shown in these studies that dynamic, fluorescence lifetime-based measurements provide enhanced discrimination between complex petrolatum samples. Additionally, improved quantitative analysis of multicomponent systems is demonstrated via incorporation of organized media in coal liquid samples. This research provides the first systematic studies of (1) multifrequency phase-resolved fluorescence spectroscopy for dynamic fluorescence spectral fingerprinting of complex samples, and (2) the incorporation of bile salt micellar media to improve accuracy and sensitivity for characterization of complex systems. In the petroleum studies, phase-resolved fluorescence spectroscopy is used to combine spectral and lifetime information through the measurement of phase-resolved fluorescence intensity. The intensity is collected as a function of excitation and emission wavelengths, angular modulation frequency, and detector phase angle. This multidimensional information enhances the ability to distinguish between complex samples with similar spectral characteristics. Examination of the eigenvalues and eigenvectors from factor analysis of phase-resolved and steady state excitation-emission matrices, using chemometric methods of data analysis, confirms that phase-resolved fluorescence techniques offer improved discrimination between complex samples as compared with conventional steady state methods.

  8. Measures of precision for dissimilarity-based multivariate analysis of ecological communities.

    PubMed

    Anderson, Marti J; Santana-Garcon, Julia

    2015-01-01

    Ecological studies require key decisions regarding the appropriate size and number of sampling units. No methods currently exist to measure precision for multivariate assemblage data when dissimilarity-based analyses are intended to follow. Here, we propose a pseudo multivariate dissimilarity-based standard error (MultSE) as a useful quantity for assessing sample-size adequacy in studies of ecological communities. Based on sums of squared dissimilarities, MultSE measures variability in the position of the centroid in the space of a chosen dissimilarity measure under repeated sampling for a given sample size. We describe a novel double resampling method to quantify uncertainty in MultSE values with increasing sample size. For more complex designs, values of MultSE can be calculated from the pseudo residual mean square of a permanova model, with the double resampling done within appropriate cells in the design. R code functions for implementing these techniques, along with ecological examples, are provided. © 2014 The Authors. Ecology Letters published by John Wiley & Sons Ltd and CNRS.
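    The MultSE quantity described above can be sketched numerically. The following is a minimal illustration, assuming the commonly stated sums-of-squares form (SS computed from all unordered pairs of dissimilarities, pseudo variance V = SS/(n-1), MultSE = sqrt(V/n)); the paper and its R code should be consulted for the exact definitions and the double resampling procedure.

```python
import numpy as np

def multse(d):
    """Pseudo multivariate standard error (MultSE) from a symmetric
    n x n dissimilarity matrix d.  Uses SS = (1/n) * sum_{i<j} d_ij**2,
    pseudo variance V = SS / (n - 1), and MultSE = sqrt(V / n).
    This is a sketch of one common formulation, not the authors' code."""
    n = d.shape[0]
    iu = np.triu_indices(n, k=1)      # each unordered pair counted once
    ss = np.sum(d[iu] ** 2) / n
    v = ss / (n - 1)
    return np.sqrt(v / n)

# Example: three samples with Euclidean dissimilarities 1, 2, 1
d = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
print(multse(d))
```

Repeating such a computation over resampled subsets of increasing size yields the kind of sample-size adequacy curve the abstract describes.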

  9. Evaluation and comparison of FTA card and CTAB DNA extraction methods for non-agricultural taxa1

    PubMed Central

    Siegel, Chloe S.; Stevenson, Florence O.; Zimmer, Elizabeth A.

    2017-01-01

    Premise of the study: An efficient, effective DNA extraction method is necessary for comprehensive analysis of plant genomes. This study analyzed the quality of DNA obtained using paper FTA cards prepared directly in the field when compared to the more traditional cetyltrimethylammonium bromide (CTAB)–based extraction methods from silica-dried samples. Methods: DNA was extracted using FTA cards according to the manufacturer’s protocol. In parallel, CTAB-based extractions were done using the automated AutoGen DNA isolation system. DNA quality for both methods was determined for 15 non-agricultural species collected in situ, by gel separation, spectrophotometry, fluorometry, and successful amplification and sequencing of nuclear and chloroplast gene markers. Results: The FTA card extraction method yielded less concentrated, but also less fragmented samples than the CTAB-based technique. The card-extracted samples provided DNA that could be successfully amplified and sequenced. The FTA cards are also useful because the collected samples do not require refrigeration, extensive laboratory expertise, or as many hazardous chemicals as extractions using the CTAB-based technique. Discussion: The relative success of the FTA card method in our study suggested that this method could be a valuable tool for studies in plant population genetics and conservation biology that may involve screening of hundreds of individual plants. The FTA cards, like the silica gel samples, do not contain plant material capable of propagation, and therefore do not require permits from the U.S. Department of Agriculture (USDA) Animal and Plant Health Inspection Service (APHIS) for transportation. PMID:28224056

  10. Can cloud point-based enrichment, preservation, and detection methods help to bridge gaps in aquatic nanometrology?

    PubMed

    Duester, Lars; Fabricius, Anne-Lena; Jakobtorweihen, Sven; Philippe, Allan; Weigl, Florian; Wimmer, Andreas; Schuster, Michael; Nazar, Muhammad Faizan

    2016-11-01

    Coacervate-based techniques are intensively used in environmental analytical chemistry to enrich and extract different kinds of analytes. Most methods focus on the total content or the speciation of inorganic and organic substances. Size fractionation is less commonly addressed. Within coacervate-based techniques, cloud point extraction (CPE) is characterized by a phase separation of non-ionic surfactants dispersed in an aqueous solution when the respective cloud point temperature is exceeded. In this context, the feature article raises the following question: May CPE in future studies serve as a key tool (i) to enrich and extract nanoparticles (NPs) from complex environmental matrices prior to analyses and (ii) to preserve the colloidal status of unstable environmental samples? With respect to engineered NPs, a significant gap between environmental concentrations and size- and element-specific analytical capabilities is still visible. CPE may support efforts to overcome this "concentration gap" via the analyte enrichment. In addition, most environmental colloidal systems are known to be unstable, dynamic, and sensitive to changes of the environmental conditions during sampling and sample preparation. This delivers a so far unsolved "sample preparation dilemma" in the analytical process. The authors are of the opinion that CPE-based methods have the potential to preserve the colloidal status of these instable samples. Focusing on NPs, this feature article aims to support the discussion on the creation of a convention called the "CPE extractable fraction" by connecting current knowledge on CPE mechanisms and on available applications, via the uncertainties visible and modeling approaches available, with potential future benefits from CPE protocols.

  11. Molecular dynamics based enhanced sampling of collective variables with very large time steps.

    PubMed

    Chen, Pei-Yang; Tuckerman, Mark E

    2018-01-14

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.
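    The multiple time-stepping idea the abstract builds on can be illustrated with the standard reversible RESPA scheme (the baseline whose resonance limit the paper's isokinetic methods remove). This is a generic sketch of r-RESPA, not the resonance-free algorithm of the paper: the slow force is applied as half-kicks at the outer step, while the fast force is integrated with velocity Verlet at a finer inner step.

```python
import numpy as np

def respa_step(x, v, m, f_fast, f_slow, dt, n_inner):
    """One reversible RESPA step for a 1D particle: slow (expensive)
    force kicked at the outer time step dt, fast (cheap) force
    integrated by velocity Verlet at dt / n_inner."""
    v += 0.5 * dt * f_slow(x) / m        # outer half-kick (slow force)
    h = dt / n_inner
    for _ in range(n_inner):             # inner velocity-Verlet loop
        v += 0.5 * h * f_fast(x) / m
        x += h * v
        v += 0.5 * h * f_fast(x) / m
    v += 0.5 * dt * f_slow(x) / m        # outer half-kick (slow force)
    return x, v

# Example: stiff harmonic "fast" force, no slow force; energy is conserved
f_fast = lambda x: -x
f_slow = lambda x: 0.0
x, v = 1.0, 0.0
for _ in range(1000):
    x, v = respa_step(x, v, 1.0, f_fast, f_slow, 0.1, 10)
print(0.5 * v * v + 0.5 * x * x)   # close to the initial energy 0.5
```

In this plain form, pushing dt much past half the fast period triggers the resonance instability the abstract describes; the cited isokinetic Nosé-Hoover constructions are what lift that limit.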

  12. Molecular dynamics based enhanced sampling of collective variables with very large time steps

    NASA Astrophysics Data System (ADS)

    Chen, Pei-Yang; Tuckerman, Mark E.

    2018-01-01

    Enhanced sampling techniques that target a set of collective variables and that use molecular dynamics as the driving engine have seen widespread application in the computational molecular sciences as a means to explore the free-energy landscapes of complex systems. The use of molecular dynamics as the fundamental driver of the sampling requires the introduction of a time step whose magnitude is limited by the fastest motions in a system. While standard multiple time-stepping methods allow larger time steps to be employed for the slower and computationally more expensive forces, the maximum achievable increase in time step is limited by resonance phenomena, which inextricably couple fast and slow motions. Recently, we introduced deterministic and stochastic resonance-free multiple time step algorithms for molecular dynamics that solve this resonance problem and allow ten- to twenty-fold gains in the large time step compared to standard multiple time step algorithms [P. Minary et al., Phys. Rev. Lett. 93, 150201 (2004); B. Leimkuhler et al., Mol. Phys. 111, 3579-3594 (2013)]. These methods are based on the imposition of isokinetic constraints that couple the physical system to Nosé-Hoover chains or Nosé-Hoover Langevin schemes. In this paper, we show how to adapt these methods for collective variable-based enhanced sampling techniques, specifically adiabatic free-energy dynamics/temperature-accelerated molecular dynamics, unified free-energy dynamics, and by extension, metadynamics, thus allowing simulations employing these methods to employ similarly very large time steps. The combination of resonance-free multiple time step integrators with free-energy-based enhanced sampling significantly improves the efficiency of conformational exploration.

  13. Supervised target detection in hyperspectral images using one-class Fukunaga-Koontz Transform

    NASA Astrophysics Data System (ADS)

    Binol, Hamidullah; Bal, Abdullah

    2016-05-01

    A novel hyperspectral target detection technique based on the Fukunaga-Koontz transform (FKT) is presented. FKT offers significant properties for feature selection and ordering. However, it can only be used to solve multi-pattern classification problems. Target detection may be considered a two-class classification problem, i.e., target versus background clutter. Nevertheless, background clutter typically contains different types of materials, so target detection techniques differ from classification methods in the way they model clutter. To avoid modeling the background clutter, we have developed a one-class FKT (OC-FKT) for target detection. The statistical properties of the target training samples are used to define a tunnel-like boundary of the target class. Non-target samples are then created synthetically so as to lie outside this boundary. Thus, a limited number of target samples becomes adequate for training the FKT. The hyperspectral image experiments confirm that the proposed OC-FKT technique provides an effective means for target detection.
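    The classical two-class FKT underlying the method can be sketched as follows. This is the standard transform (whiten the summed class correlation matrices, then eigen-decompose one class in the whitened space), not the one-class variant proposed in the paper; in the FKT basis, eigenvectors that best represent class 1 are the worst for class 2, since the transformed eigenvalues sum to one.

```python
import numpy as np

def fkt_basis(X1, X2):
    """Two-class Fukunaga-Koontz transform.  X1, X2 hold one sample per
    column.  Returns the shared basis B and the class-1 eigenvalues w;
    the class-2 eigenvalues are 1 - w."""
    P1 = X1 @ X1.T / X1.shape[1]          # class correlation matrices
    P2 = X2 @ X2.T / X2.shape[1]
    lam, phi = np.linalg.eigh(P1 + P2)
    W = phi / np.sqrt(lam)                # whitening operator for P1 + P2
    w, V = np.linalg.eigh(W.T @ P1 @ W)   # shared eigenbasis in whitened space
    return W @ V, w

# Example with synthetic full-rank data
rng = np.random.default_rng(0)
X1 = rng.standard_normal((4, 50))
X2 = rng.standard_normal((4, 50))
B, w = fkt_basis(X1, X2)                  # w lies in (0, 1)
```

Projecting a pixel onto the basis vectors with the largest w scores its resemblance to the target class relative to the clutter class.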

  14. [Developments in preparation and experimental method of solid phase microextraction fibers].

    PubMed

    Yi, Xu; Fu, Yujie

    2004-09-01

    Solid phase microextraction (SPME) is a simple and effective adsorption and desorption technique, which concentrates volatile or nonvolatile compounds from liquid samples or headspace of samples. SPME is compatible with analyte separation and detection by gas chromatography, high performance liquid chromatography, and other instrumental methods. It can provide many advantages, such as wide linear scale, low solvent and sample consumption, short analytical times, low detection limits, simple apparatus, and so on. The theory of SPME is introduced, which includes equilibrium theory and non-equilibrium theory. The novel development of fiber preparation methods and related experimental techniques are discussed. In addition to commercial fiber preparation, different newly developed fabrication techniques, such as sol-gel, electronic deposition, carbon-based adsorption, and high-temperature epoxy immobilization, are presented. Effects of extraction modes, selection of fiber coating, optimization of operating conditions, method sensitivity and precision, and systematic automation are taken into consideration in the analytical process of SPME. Finally, a brief perspective on SPME is presented.

  15. Optimization and Standardization of Fluorescent Cell Barcoding for Multiplexed Flow Cytometric Phenotyping

    PubMed Central

    Giudice, Valentina; Feng, Xingmin; Kajigaya, Sachiko; Young, Neal S.; Biancotto, Angélique

    2017-01-01

    Fluorescent cell barcoding (FCB) is a cell-based multiplexing technique for high-throughput flow cytometry. Barcoded samples can be stained and acquired collectively, minimizing staining variability and antibody consumption, and decreasing required sample volumes. Combined with functional measurements, FCB can be used for drug screening, signaling profiling, and cytokine detection, but technical issues are present. We optimized the FCB technique for routine utilization using DyLight 350, DyLight 800, Pacific Orange, and CBD500 for barcoding six, nine, or 36 human peripheral blood specimens. Working concentrations of FCB dyes ranging from 0 to 500 μg/ml were tested, and viability dye staining was optimized to increase robustness of data. A five-color staining with surface markers for Vβ usage analysis in CD4+ and CD8+ T cells was achieved in combination with nine sample barcoding. We provide improvements of the FCB technique that should be useful for multiplex drug screening and for lymphocyte characterization and perturbations in the diagnosis and during the course of disease. PMID:28692789

  16. Nonlinear Wave Mixing Technique for Nondestructive Assessment of Infrastructure Materials

    NASA Astrophysics Data System (ADS)

    Ju, Taeho

    To operate safely, structures and components need to be inspected or monitored either periodically or in real time for potential failure. For this purpose, ultrasonic nondestructive evaluation (NDE) techniques have been used extensively. Most of these ultrasonic NDE techniques utilize only the linear behavior of the ultrasound. These linear techniques are effective in detecting discontinuities in materials such as cracks, voids, interfaces, inclusions, etc. However, in many engineering materials, it is the accumulation of microdamage that leads to degradation and eventual failure of a component. Unfortunately, it is difficult for linear ultrasonic NDE techniques to characterize or quantify such damage. On the other hand, the acoustic nonlinearity parameter (ANLP) of a material is often positively correlated with such damage in a material. Thus, nonlinear ultrasonic NDE methods have been used in recent years to characterize cumulative damage such as fatigue in metallic materials, aging in polymeric materials, and degradation of cement-based materials due to chemical reactions. In this thesis, we focus on developing a suite of novel nonlinear ultrasonic NDE techniques based on the interactions of nonlinear ultrasonic waves, namely wave mixing. First, a noncollinear wave mixing technique is developed to detect localized damage in a homogeneous material by using a noncollinear pair consisting of a longitudinal wave (L-wave) and a shear wave (S-wave). This pair of incident waves makes it possible to conduct NDE from a single side of the component, a condition that is often encountered in practical applications. The proposed noncollinear wave mixing technique is verified experimentally by carrying out measurements on aluminum alloy (AA 6061) samples. Numerical simulations using the Finite Element Method (FEM) are also conducted to further demonstrate the potential of the proposed technique to detect localized damage in structural components. 
Second, the aforementioned nonlinear mixing technique is adapted to develop an NDE technique for characterizing thermal aging of adhesive joints. To this end, a nonlinear spring model is used to simulate the effect of the adhesive layer. Based on this nonlinear spring model, analytical expressions of the resonant wave generated by the adhesive layer are obtained through an asymptotic analysis when the adhesive layer thickness is much smaller than the pertinent wavelength. The solutions are expressed in terms of the properties of the adhesive layer. The nonlinear spring model shows a good agreement with the finite layer model solutions in the limit of a small thickness to wavelength ratio. Third, to demonstrate the effectiveness of this newly developed technique, measurements are conducted on adhesive joint samples made of two aluminum adherends bonded together by a polymer adhesive tape. The samples are aged in a thermal chamber to induce thermal aging degradation in the adhesive layer. Using the developed wave-mixing technique in conjunction with the nonlinear spring model, we show that the thermal aging damage of the adhesive layer can be quantified from only one side of the sample. Finally, by mixing two L-waves, we develop a mixing technique to nondestructively evaluate the damage induced by alkali-silica reaction (ASR) in concrete. Experimental measurements are conducted on concrete prism samples that contain reactive aggregates and have been subjected to different ASR conditioning. This new technique takes into consideration the significant attenuation caused by ASR-induced microcracks and by scattering from the aggregates. The measurement results show that the ANLP has a much greater sensitivity to ASR damage than other parameters such as attenuation and wave speed. More remarkably, it is also found that the measured acoustic nonlinearity parameter is well-correlated with the reduction of the compressive strength induced by ASR damage. 
Thus, ANLP can be used to nondestructively track ASR damage in concrete.

  17. Effect of pigment concentration on fastness and color values of thermal and UV curable pigment printing

    NASA Astrophysics Data System (ADS)

    Baysal, Gulcin; Kalav, Berdan; Karagüzel Kayaoğlu, Burçak

    2017-10-01

    The current study aims to determine the effect of pigment concentration on the fastness and color values of thermal and ultraviolet (UV) curable pigment printing on synthetic leather. For this purpose, thermal curable solvent-based and UV curable water-based formulations were prepared with different pigment concentrations (3, 5 and 7%) separately and applied by the screen printing technique using a screen printing machine. Samples printed with solvent-based formulations were thermally cured, and samples printed with water-based formulations were cured using a UV curing machine equipped with gallium and mercury (Ga/Hg) lamps at room temperature. The crock fastness values of samples printed with solvent-based formulations showed that increasing the pigment concentration did not affect either dry or wet crock fastness values. On the other hand, in samples printed with UV curable water-based formulations, dry crock fastness was improved and evaluated as very good for all pigment concentrations. However, increasing the pigment concentration affected the wet crock fastness values adversely, and lower values were observed. As the energy level increased for each irradiation source, the fastness values were improved. In comparison with samples printed with solvent-based formulations, samples printed with UV curable water-based formulations yielded higher K/S values at all pigment concentrations. The results suggested that higher K/S values can be obtained in samples printed with UV curable water-based formulations at a lower pigment concentration compared to samples printed with solvent-based formulations.

  18. Note: development of high speed confocal 3D profilometer.

    PubMed

    Ang, Kar Tien; Fang, Zhong Ping; Tay, Arthur

    2014-11-01

    A high-speed confocal 3D profilometer based on the chromatic confocal technology and spinning Nipkow disk technique has been developed and tested. It can measure a whole surface topography by taking only one image, which requires less than 0.3 s. Surface height information is retrieved based on the ratios of red, green, and blue color information. A new vector projection technique has been developed to enhance the vertical resolution of the measurement. The measurement accuracy of the prototype system has been verified via different test samples.

  19. Ultra-fast quantitative imaging using ptychographic iterative engine based digital micro-mirror device

    NASA Astrophysics Data System (ADS)

    Sun, Aihui; Tian, Xiaolin; Kong, Yan; Jiang, Zhilong; Liu, Fei; Xue, Liang; Wang, Shouyu; Liu, Cheng

    2018-01-01

    As a lensfree imaging technique, the ptychographic iterative engine (PIE) method can provide both quantitative sample amplitude and phase distributions while avoiding aberration. However, it requires field of view (FoV) scanning, often relying on mechanical translation, which not only slows down the measurement but also introduces mechanical errors that decrease both the resolution and the accuracy of the retrieved information. In order to achieve high-accuracy quantitative imaging at high speed, a digital micromirror device (DMD) is adopted in PIE for large-FoV scanning, controlled by on/off state coding of the DMD. Measurements were implemented using biological samples as well as a USAF resolution target, proving high resolution in quantitative imaging with the proposed system. Considering its fast and accurate imaging capability, it is believed that the DMD-based PIE technique provides a potential solution for medical observation and measurement.

  20. A method to generate soft shadows using a layered depth image and warping.

    PubMed

    Im, Yeon-Ho; Han, Chang-Young; Kim, Lee-Sup

    2005-01-01

    We present an image-based method for propagating area light illumination through a Layered Depth Image (LDI) to generate soft shadows from opaque and nonrefractive transparent objects. In our approach, using the depth peeling technique, we render an LDI from a reference light sample on a planar light source. Light illumination of all pixels in an LDI is then determined for all the other sample points via warping, an image-based rendering technique, which approximates ray tracing in our method. We use an image-warping equation and McMillan's warp ordering algorithm to find the intersections between rays and polygons and to find the order of intersections. Experiments for opaque and nonrefractive transparent objects are presented. Results indicate that our approach generates soft shadows quickly and effectively. Advantages and disadvantages of the proposed method are also discussed.

  1. Laser etching of austenitic stainless steels for micro-structural evaluation

    NASA Astrophysics Data System (ADS)

    Baghra, Chetan; Kumar, Aniruddha; Sathe, D. B.; Bhatt, R. B.; Behere, P. G.; Afzal, Mohd

    2015-06-01

    Etching is a key step in metallography to reveal the microstructure of a polished specimen under an optical microscope. A conventional technique for producing micro-structural contrast is chemical etching. As an alternative, laser etching is investigated, since it does not involve the use of corrosive reagents and can be carried out without any physical contact with the sample. A laser-induced etching technique will be beneficial especially in the nuclear industry, where materials, being radioactive in nature, are handled inside a glove box. In this paper, experimental results of pulsed Nd-YAG laser based etching of a few austenitic stainless steels, such as SS 304, SS 316 LN and SS alloy D9, chosen as structural materials for fabrication of various components of the upcoming Prototype Fast Breeder Reactor (PFBR) at Kalpakkam, India, are reported. Laser etching was done by irradiating samples using a nanosecond pulsed Nd-YAG laser beam, which was transported into a glass-paneled glove box using optics. Experiments were carried out to understand the effect of laser beam parameters such as wavelength, fluence, pulse repetition rate, and number of exposures required for etching of austenitic stainless steel samples. Laser etching of PFBR fuel tube and plug welded joint was also carried out to evaluate base metal grain size, depth of fusion at the welded joint, and the heat affected zone in the base metal. Experimental results demonstrated that pulsed Nd-YAG laser etching is a fast and effortless technique which can be effectively employed for non-contact remote etching of austenitic stainless steels for micro-structural evaluation.

  2. Automatic transfer function generation for volume rendering of high-resolution x-ray 3D digital mammography images

    NASA Astrophysics Data System (ADS)

    Alyassin, Abdal M.

    2002-05-01

    3D Digital mammography (3DDM) is a new technology that provides high resolution X-ray breast tomographic data. As with other tomographic medical imaging modalities, viewing a stack of tomographic images can be time consuming, especially if the images are of large matrix size. In addition, it can be difficult to conceptually construct 3D breast structures from the stack. Therefore, there is a need to readily visualize the data in 3D. However, one of the issues that hinder the usage of volume rendering (VR) is finding an automatic way to generate transfer functions that efficiently map the important diagnostic information in the data. We have developed a method that randomly samples the volume. Based on the mean and the standard deviation of these samples, the technique determines the lower limit and upper limit of a piecewise linear ramp transfer function. We have volume rendered several 3DDM data sets using this technique and visually compared the outcome with the result from a conventional automatic technique. The transfer function generated through the proposed technique provided superior VR images over the conventional technique. Furthermore, the improvement in the reproducibility of the transfer function correlated with the number of samples taken from the volume, at the expense of processing time.
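    The sampling-based limit estimation described above can be sketched as follows. The abstract does not state how many standard deviations bound the ramp, so the width factor k here is an assumption for illustration only.

```python
import numpy as np

def auto_transfer_limits(volume, n_samples=10000, k=2.0, seed=0):
    """Estimate lower/upper limits of a piecewise-linear ramp transfer
    function from random voxel samples: limits are mean +/- k * std of
    the sampled intensities, clamped to the volume's range.
    k is an assumed width factor, not taken from the paper."""
    rng = np.random.default_rng(seed)
    flat = volume.reshape(-1)
    s = flat[rng.integers(0, flat.size, n_samples)]   # random voxel sample
    lo = s.mean() - k * s.std()
    hi = s.mean() + k * s.std()
    return max(lo, float(flat.min())), min(hi, float(flat.max()))

def ramp(values, lo, hi):
    """Map intensities to [0, 1] opacity along the linear ramp."""
    return np.clip((values - lo) / (hi - lo), 0.0, 1.0)

# Example on a synthetic intensity volume
vol = np.linspace(0.0, 1.0, 1000).reshape(10, 10, 10)
lo, hi = auto_transfer_limits(vol)
opacity = ramp(vol, lo, hi)
```

The opacity array would then be fed to the VR compositing step; reproducibility of lo/hi improves with n_samples, mirroring the trade-off noted in the abstract.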

  3. Measurement of complex terahertz dielectric properties of polymers using an improved free-space technique

    NASA Astrophysics Data System (ADS)

    Chang, Tianying; Zhang, Xiansheng; Yang, Chuanfa; Sun, Zhonglin; Cui, Hong-Liang

    2017-04-01

    The complex dielectric properties of non-polar solid polymer materials were measured in the terahertz (THz) band by a free-space technique employing a frequency-extended vector network analyzer (VNA), and by THz time-domain spectroscopy (TDS). Mindful of THz wave’s unique characteristics, the free-space method for measurement of material dielectric properties in the microwave band was expanded and improved for application in the THz frequency region. To ascertain the soundness and utility of the proposed method, measurements of the complex dielectric properties of a variety of polymers were carried out, including polytetrafluoroethylene (PTFE, known also by the brand name Teflon), polypropylene (PP), polyethylene (PE), and glass fiber resin (Composite Stone). The free-space method relies on the determination of electromagnetic scattering parameters (S-parameters) of the sample, with the gated-reflect-line (GRL) calibration technique commonly employed using a VNA. Subsequently, based on the S-parameters, the dielectric constant and loss characteristic of the sample were calculated by using a Newtonian iterative algorithm. To verify the calculated results, THz TDS technique, which produced Fresnel parameters such as reflection and transmission coefficients, was also used to independently determine the dielectric properties of these polymer samples, with results satisfactorily corroborating those obtained by the free-space extended microwave technique.

  4. Comparison of streamflow and water-quality data collection techniques for the Saginaw River, Michigan

    USGS Publications Warehouse

    Hoard, C.J.; Holtschlag, D.J.; Duris, J.W.; James, D.A.; Obenauer, D.J.

    2012-01-01

    In 2009, the Michigan Department of Environmental Quality and the U.S. Geological Survey developed a plan to compare the effect of various streamgaging and water-quality collection techniques on streamflow and stream water-quality data for the Saginaw River, Michigan. The Saginaw River is the primary contributor of surface runoff to Saginaw Bay, Lake Huron, draining approximately 70 percent of the Saginaw Bay watershed. The U.S. Environmental Protection Agency has listed the Saginaw Bay system as an "Area of Concern" due to many factors, including excessive sediment and nutrient concentrations in the water. Current efforts to estimate loading of sediment and nutrients to Saginaw Bay utilize water-quality samples collected using a surface-grab technique and flow data that are uncertain during specific conditions. Comparisons of current flow and water-quality sampling techniques to alternative techniques were assessed between April 2009 and September 2009 at two locations in the Saginaw River. Streamflow estimated using acoustic Doppler current profiling technology was compared to a traditional stage-discharge technique. Complex conditions resulting from the influence of Saginaw Bay on the Saginaw River were able to be captured using the acoustic technology, while the traditional stage-discharge technique failed to quantify these effects. Water-quality samples were collected at two locations and on eight different dates, utilizing both surface-grab and depth-integrating multiple-vertical techniques. Sixteen paired samples were collected and analyzed for suspended sediment, turbidity, total phosphorus, total nitrogen, orthophosphate, nitrite, nitrate, and ammonia. Results indicate that concentrations of constituents associated with suspended material, such as suspended sediment, turbidity, and total phosphorus, are underestimated when samples are collected using the surface-grab technique. 
The median magnitude of the relative percent difference in concentration based on sampling technique was 37 percent for suspended sediment, 26 percent for turbidity, and 9.7 percent for total phosphorus samples collected at both locations. Acoustic techniques were also used to assist in the determination of the effectiveness of using acoustic-backscatter information for estimating the suspended-sediment concentration of the river water. Backscatter data were collected by use of an acoustic Doppler current profiler, and a Van Dorn manual sampler was simultaneously used to collect discrete water samples at 10 depths (3.5, 7.5, 11, 14, 15.5, 17.5, 19.5, 20.5, 22, and 24.5 ft below the water surface) along two vertical profiles near the center of the Saginaw River near Bay City. The Van Dorn samples were analyzed for suspended-sediment concentrations, and these data were then used to develop a relationship between suspended-sediment concentration and acoustic-backscatter data. Acoustic-backscatter data were strongly correlated to sediment concentrations and, by using a linear regression, were able to explain 89 percent of the variability. Although this regression technique showed promise for using acoustic backscatter to estimate suspended-sediment concentration, attempts to compare suspended-sediment concentrations to the acoustic signal-to-noise ratio estimates, recorded at the fixed acoustic streamflow-gaging station near Bay City (04157061), resulted in a poor correlation.
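    The backscatter-to-concentration regression described above is a plain least-squares fit scored by the coefficient of determination (the 89 percent of variability explained). A minimal sketch, with illustrative values rather than the study's data:

```python
import numpy as np

def fit_backscatter_ssc(backscatter_db, ssc_mg_l):
    """Ordinary least-squares fit of suspended-sediment concentration
    against acoustic backscatter; returns slope, intercept, and R^2
    (fraction of variability explained).  Inputs are 1D arrays."""
    A = np.column_stack([backscatter_db, np.ones_like(backscatter_db)])
    coef, *_ = np.linalg.lstsq(A, ssc_mg_l, rcond=None)
    pred = A @ coef
    ss_res = np.sum((ssc_mg_l - pred) ** 2)
    ss_tot = np.sum((ssc_mg_l - ssc_mg_l.mean()) ** 2)
    return coef[0], coef[1], 1.0 - ss_res / ss_tot

# Illustrative (hypothetical) backscatter and concentration values
bs = np.array([60.0, 65.0, 70.0, 75.0, 80.0])      # dB
ssc = np.array([72.0, 81.0, 88.0, 101.0, 110.0])   # mg/L
slope, intercept, r2 = fit_backscatter_ssc(bs, ssc)
```

An R^2 near 0.89, as reported, would indicate that backscatter alone explains most of the concentration variability at this site.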

  5. Computer-aided boundary delineation of agricultural lands

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1989-01-01

    The National Agricultural Statistics Service of the United States Department of Agriculture (USDA) presently uses labor-intensive aerial photographic interpretation techniques to divide large geographical areas into manageable-sized units for estimating domestic crop and livestock production. Prototype software, the computer-aided stratification (CAS) system, was developed to automate the procedure, and currently runs on a Sun-based image processing system. With a background display of LANDSAT Thematic Mapper and United States Geological Survey Digital Line Graph data, the operator uses a cursor to delineate agricultural areas, called sampling units, which are assigned to strata of land-use and land-cover types. The resultant stratified sampling units are used as input into subsequent USDA sampling procedures. As a test, three counties in Missouri were chosen for application of the CAS procedures. Subsequent analysis indicates that CAS was five times faster in creating sampling units than the manual techniques were.

  6. A structural study of bone changes in knee osteoarthritis by synchrotron-based X-ray fluorescence and X-ray absorption spectroscopy techniques

    NASA Astrophysics Data System (ADS)

    Sindhupakorn, Bura; Thienpratharn, Suwittaya; Kidkhunthod, Pinit

    2017-10-01

    Osteoarthritis (OA) is characterized by degeneration of articular cartilage and thickening of subchondral bone. The present study investigated changes in the biochemical components of cartilage and bone, comparing normal and OA subjects. Synchrotron-based X-ray fluorescence (SR-XRF) and X-ray absorption spectroscopy (XAS) techniques, including X-ray absorption near edge structure (XANES) and extended X-ray absorption fine structure (EXAFS), were employed to study bone changes in knee osteoarthritis. The bone samples were collected from osteoarthritis patients, both male and female, ranging in age from 20 to 74 years. SR-XRF results excited at 4240 eV for Ca show three main groups based on their XRF intensities: 20-36 years, 40-60 years, and over 70 years. By employing XAS techniques, XANES features can be used to clearly explain the electronic transitions occurring in bone samples affected by osteoarthritis symptoms. Moreover, a structural change around Ca ions in the bone samples is evident from the EXAFS results, indicating an increase in the Ca-amorphous phase with increasing age.

  7. A hybrid sensing approach for pure and adulterated honey classification.

    PubMed

    Subari, Norazian; Mohamad Saleh, Junita; Md Shakaff, Ali Yeon; Zakaria, Ammar

    2012-10-17

    This paper presents a comparison between data from single modality and fusion methods to classify Tualang honey as pure or adulterated using Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) statistical classification approaches. Ten different brands of certified pure Tualang honey were obtained throughout peninsular Malaysia and Sumatera, Indonesia. Various concentrations of two types of sugar solution (beet and cane sugar) were used in this investigation to create honey samples of 20%, 40%, 60% and 80% adulteration concentrations. Honey data extracted from an electronic nose (e-nose) and Fourier Transform Infrared Spectroscopy (FTIR) were gathered, analyzed and compared based on fusion methods. Visual observation of classification plots revealed that the PCA approach was able to distinguish pure and adulterated honey samples better than the LDA technique. Overall, the validated classification results based on FTIR data (88.0%) gave higher classification accuracy than e-nose data (76.5%) using the LDA technique. Honey classification based on normalized low-level and intermediate-level FTIR and e-nose fusion data scored classification accuracies of 92.2% and 88.7%, respectively, using the Stepwise LDA method. The results suggested that pure and adulterated honey samples were better classified using FTIR and e-nose fusion data than single modality data.
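
The PCA step underlying the classification plots can be sketched in miniature. The two-feature readings below are hypothetical stand-ins for fused e-nose/FTIR data; the first principal axis is computed in closed form from the 2x2 covariance matrix:

```python
import math

def pca_first_component(data):
    # Center the two-feature data and take the leading eigenvector of the
    # 2x2 covariance matrix (closed form), i.e. the first principal axis.
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    cxx = sum((p[0] - mx) ** 2 for p in data) / n
    cyy = sum((p[1] - my) ** 2 for p in data) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in data) / n
    # Leading eigenvalue of [[cxx, cxy], [cxy, cyy]].
    lam = 0.5 * (cxx + cyy) + math.sqrt(0.25 * (cxx - cyy) ** 2 + cxy ** 2)
    v = (cxy, lam - cxx) if abs(cxy) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(v[0], v[1])
    return (v[0] / norm, v[1] / norm), (mx, my)

def score(p, axis, mean):
    # Projection of a centered point onto the first principal axis.
    return (p[0] - mean[0]) * axis[0] + (p[1] - mean[1]) * axis[1]

# Hypothetical fused sensor readings (feature1, feature2).
pure = [(1.0, 2.1), (1.2, 2.3), (0.9, 2.0), (1.1, 2.2)]
adulterated = [(3.0, 4.8), (3.2, 5.1), (2.9, 4.7), (3.1, 5.0)]

axis, mean = pca_first_component(pure + adulterated)
pure_scores = [score(p, axis, mean) for p in pure]
adult_scores = [score(p, axis, mean) for p in adulterated]
```

With well-separated clusters, the two groups occupy disjoint score ranges along the first principal component, which is what the visual separation in a PCA plot reflects.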

  8. Strategies for In situ and Sample Return Analyses

    NASA Astrophysics Data System (ADS)

    Papanastassiou, D. A.

    2006-12-01

    There is general agreement that planetary exploration proceeds from orbital reconnaissance of a planet, to surface and near-surface in situ exploration, to sample return missions, which bring back samples for investigations in terrestrial laboratories, using the panoply of state-of-the-art analytical techniques. The applicable techniques may depend on the nature of the returned material, and complementary and multidisciplinary techniques can be used to best advantage. High precision techniques also serve to provide the "ground truth" and calibrate past and future orbital and in situ measurements on a planet. It is also recognized that returned samples may continue to be analyzed by novel techniques as the techniques become developed, in part to address specific characteristics of returned samples. There are geophysical measurements, such as those of the moment of inertia of a planet, seismic activity, and surface morphology, that depend on orbital and in-situ science. Other characteristics, such as isotopic ages and isotopic compositions (e.g., initial Sr and Nd) as indicators of planetary mantle or crust evolution and sample provenance, require returned samples. In situ analyses may be useful for preliminary characterization and for optimization of sample selection for sample return. In situ analyses by Surveyor on the Moon helped identify the major element chemistry of lunar samples and the need for high precision mass spectrometry (e.g., for Rb-Sr ages, based on extremely low alkali contents). The discussion of in-situ investigations vs. investigations on returned samples must be directly related to available instrumentation and to instrumentation that can be developed in the foreseeable future. The discussion of choices is not philosophical but very practical: what precision is required for key investigations, and what instrumentation meets or exceeds the required precision?
    This must be applied to potential in situ instruments and to laboratory instruments. Age determinations and use of isotopes for deciphering planetary evolution are viewed as off-limits for in-situ determinations, as they require: a) typically high precision mass spectrometry (at 0.01% and below); b) the determination of parent-daughter element ratios at least at the percent level; c) the measurement of coexisting minerals (for internal isochron determinations); d) low contamination (e.g., for U-Pb and Pb-Pb); and e) removal of adhering phases and contaminants not related to the samples to be analyzed. Total K-Ar age determinations are subject to fewer requirements and may be feasible in situ, but in the absence of neutron activation, as required for 39Ar-40Ar, the expected precision is at the level of ~20%, with trapped Ar in the samples introducing further uncertainty. A precision of 20% for K-Ar may suffice to address some key cratering-rate uncertainties on Mars, especially as applicable to the Middle Amazonian (1). For in situ work, the key issues that must be addressed for all measurements are what precision is required and whether instruments are available at the required precision levels. These issues must be addressed many years before a mission gets defined. Low precision instruments on several in situ missions that do not address key scientific questions may in fact be more expensive, in their sum, than a sample return mission. In summary, all missions should undergo similar intense scrutiny with regard to desired science and feasibility, based on available instrumentation (with demonstrated and known capabilities) and cost. 1. P. T. Doran et al. (2004) Earth Sci. Rev. 67, 313-337.

  9. 100 GHz pulse waveform measurement based on electro-optic sampling

    NASA Astrophysics Data System (ADS)

    Feng, Zhigang; Zhao, Kejia; Yang, Zhijun; Miao, Jingyuan; Chen, He

    2018-05-01

    We present an ultrafast pulse waveform measurement system based on an electro-optic sampling technique at 1560 nm and prepare LiTaO3-based electro-optic modulators with a coplanar waveguide structure. The transmission and reflection characteristics of electrical pulses on a coplanar waveguide terminated with an open circuit and a resistor are investigated by analyzing the corresponding time-domain pulse waveforms. We measure the output electrical pulse waveform of a 100 GHz photodiode and the obtained rise times of the impulse and step responses are 2.5 and 3.4 ps, respectively.
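
Rise times such as the 2.5 and 3.4 ps figures quoted above are typically read off a sampled waveform as the 10%-90% transition interval. A sketch with a synthetic exponential step response (illustrative values, not the measured photodiode data):

```python
import math

def rise_time_10_90(t, y):
    # Estimate the 10%-90% rise time of a sampled step response by linear
    # interpolation between sample points.
    lo, hi = min(y), max(y)
    y10 = lo + 0.10 * (hi - lo)
    y90 = lo + 0.90 * (hi - lo)

    def crossing(level):
        for i in range(1, len(y)):
            if y[i - 1] < level <= y[i]:
                frac = (level - y[i - 1]) / (y[i] - y[i - 1])
                return t[i - 1] + frac * (t[i] - t[i - 1])
        raise ValueError("level not crossed")

    return crossing(y90) - crossing(y10)

# Synthetic step response with an exponential rise (tau = 1.5 ps),
# sampled every 0.1 ps as an electro-optic sampler would.
t = [0.1 * i for i in range(100)]
y = [1.0 - math.exp(-ti / 1.5) for ti in t]
rt = rise_time_10_90(t, y)  # analytically tau * ln(9) ~ 3.3 ps for an ideal exponential
```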

  10. Adaptive swarm cluster-based dynamic multi-objective synthetic minority oversampling technique algorithm for tackling binary imbalanced datasets in biomedical data classification.

    PubMed

    Li, Jinyan; Fong, Simon; Sung, Yunsick; Cho, Kyungeun; Wong, Raymond; Wong, Kelvin K L

    2016-01-01

    An imbalanced dataset is defined as a training dataset that has imbalanced proportions of data in both interesting and uninteresting classes. Often in biomedical applications, samples from the class of interest are rare in a population, such as medical anomalies, positive clinical tests, and particular diseases. Although the target samples in the primitive dataset are small in number, the induction of a classification model over such training data leads to poor prediction performance due to insufficient training from the minority class. In this paper, we use a novel class-balancing method named adaptive swarm cluster-based dynamic multi-objective synthetic minority oversampling technique (ASCB_DmSMOTE) to solve this imbalanced dataset problem, which is common in biomedical applications. The proposed method combines under-sampling and over-sampling into a swarm optimisation algorithm. It adaptively selects suitable parameters for the rebalancing algorithm to find the best solution. Compared with the other versions of the SMOTE algorithm, significant improvements, which include higher accuracy and credibility, are observed with ASCB_DmSMOTE. Our proposed method tactfully combines two rebalancing techniques. It reasonably re-allocates the majority class and dynamically optimises the two parameters of SMOTE to synthesise an appropriate amount of minority-class data for each clustered sub-imbalanced dataset. The proposed method ultimately outperforms conventional methods, attaining higher credibility and greater accuracy of the classification model.
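
For contrast with the adaptive variant described above, plain SMOTE synthesizes minority-class points by interpolating between a sample and one of its nearest minority-class neighbours. A minimal sketch of that base technique (point coordinates and neighbour count are illustrative, and this is not ASCB_DmSMOTE itself):

```python
import random

def smote(minority, n_new, k=3, rng=None):
    # Each synthetic point is a random interpolation between a minority
    # sample and one of its k nearest minority-class neighbours.
    rng = rng or random.Random(0)

    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        base = rng.choice(minority)
        neighbours = sorted((p for p in minority if p is not base),
                            key=lambda p: dist2(base, p))[:k]
        nb = rng.choice(neighbours)
        gap = rng.random()  # interpolation fraction in [0, 1)
        synthetic.append(tuple(b + gap * (n - b) for b, n in zip(base, nb)))
    return synthetic

minority = [(1.0, 1.0), (1.2, 0.9), (0.8, 1.1), (1.1, 1.3)]
new_points = smote(minority, n_new=4)
```

Every synthetic point lies on a line segment between two existing minority samples, so the oversampled class stays inside its original convex region.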

  11. Comparative testing of nondestructive examination techniques for concrete structures

    NASA Astrophysics Data System (ADS)

    Clayton, Dwight A.; Smith, Cyrus M.

    2014-03-01

    A multitude of concrete-based structures are typically part of a light water reactor (LWR) plant to provide foundation, support, shielding, and containment functions. Concrete has been used in the construction of nuclear power plants (NPPs) because of three primary properties: its low cost, its structural strength, and its ability to shield radiation. Examples of concrete structures important to the safety of LWR plants include the containment building, spent fuel pool, and cooling towers. Comparative testing of the various NDE concrete measurement techniques requires concrete samples with known material properties, voids, internal microstructure flaws, and reinforcement locations. These samples can be artificially created under laboratory conditions where the various properties can be controlled. Other than in NPPs, there are few applications where critical concrete structures are as thick and heavily reinforced. Therefore, few industries other than the nuclear power industry are interested in performing NDE on thick, heavily reinforced concrete structures. This leads to a lack of readily available samples of thick and heavily reinforced concrete for performing NDE evaluations, research, and training. The industry that typically performs the most NDE on concrete structures is the bridge and roadway industry. While bridge and roadway structures are thinner and less reinforced, they have a good base of NDE research to support their field NDE programs to detect, identify, and repair concrete failures. This paper summarizes the initial comparative testing of two concrete samples with an emphasis on how these techniques could perform on NPP concrete structures.

  12. FDTD based model of ISOCT imaging for validation of nanoscale sensitivity (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Eid, Aya; Zhang, Di; Yi, Ji; Backman, Vadim

    2017-02-01

    Many of the earliest structural changes associated with neoplasia occur on the micro- and nanometer scale, and thus appear histologically normal. Our group has established Inverse Spectroscopic OCT (ISOCT), a spectrally based technique to extract nanoscale-sensitive metrics derived from the OCT signal. Thus, there is a need to model light transport through relatively large volumes (< 50 um^3) of media with nanoscale-level resolution. Finite Difference Time Domain (FDTD) is an iterative approach which directly solves Maxwell's equations to robustly estimate the electric and magnetic fields propagating through a sample. The sample's refractive index for every spatial voxel and wavelength is specified upon a grid with voxel sizes on the order of λ/20, making it an ideal modelling technique for nanoscale structure analysis. Here, we utilize the FDTD technique to validate the nanoscale sensing ability of ISOCT. The use of FDTD for OCT modelling requires three components: calculating the source beam as it propagates through the optical system, computing the sample's scattered field using FDTD, and finally propagating the scattered field back through the optical system. The principles of Fourier optics are employed to focus this interference field through a 4f optical system and onto the detector. Three-dimensional numerical samples are generated from a given refractive index correlation function with known parameters, and subsequent OCT images and mass density correlation function metrics are computed. We show that while the resolvability of the OCT image remains diffraction limited, spectral analysis allows nanoscale-sensitive metrics to be extracted.
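
The leapfrog update at the heart of any FDTD solver can be illustrated in one dimension. This free-space sketch in normalized units (grid size, step count, and the Gaussian source are assumptions for illustration, not the authors' 3-D ISOCT model) shows the alternating E/H update that a full solver generalizes to three dimensions:

```python
import math

def fdtd_1d(n_cells=200, n_steps=300, src=20):
    # 1-D FDTD leapfrog scheme, normalized units, Courant number 0.5
    # (<= 1 is required for stability).
    ez = [0.0] * n_cells  # electric field
    hy = [0.0] * n_cells  # magnetic field
    for t in range(n_steps):
        # Update E from the curl of H.
        for i in range(1, n_cells):
            ez[i] += 0.5 * (hy[i - 1] - hy[i])
        # Soft Gaussian source injected at one grid cell.
        ez[src] += math.exp(-0.5 * ((t - 40) / 12.0) ** 2)
        # Update H from the curl of E.
        for i in range(n_cells - 1):
            hy[i] += 0.5 * (ez[i] - ez[i + 1])
    return ez

field = fdtd_1d()
```

The injected pulse propagates outward at half a cell per step; with the Courant condition satisfied the field values remain bounded.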

  13. Inference for multivariate regression model based on multiply imputed synthetic data generated via posterior predictive sampling

    NASA Astrophysics Data System (ADS)

    Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.

    2017-06-01

    The recent popularity of the use of synthetic data as a Statistical Disclosure Control technique has enabled the development of several methods of generating and analyzing such data, but these methods almost always rely on asymptotic distributions and are consequently inadequate for small-sample datasets. Thus, a likelihood-based exact inference procedure is derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via Posterior Predictive Sampling. Since it is based on exact distributions, this procedure may be used even for small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with the results obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.
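
The Posterior Predictive Sampling step can be sketched for the simplest case, a univariate normal model (the paper treats the multivariate regression model; the approximate posterior draw below is an assumption for illustration): draw a posterior value of the parameter, then generate a complete synthetic copy of the data, repeated m times:

```python
import random
import statistics

def synthetic_datasets(y, m, rng=None):
    # Posterior Predictive Sampling sketch: for each of m imputations,
    # draw a posterior value of the mean, then synthesise a full dataset
    # of the same size from the model at that drawn parameter.
    rng = rng or random.Random(1)
    n = len(y)
    ybar = statistics.fmean(y)
    s = statistics.stdev(y)
    datasets = []
    for _ in range(m):
        mu_star = rng.gauss(ybar, s / n ** 0.5)  # approximate posterior draw
        datasets.append([rng.gauss(mu_star, s) for _ in range(n)])
    return datasets

rng0 = random.Random(7)
observed = [rng0.gauss(10.0, 2.0) for _ in range(200)]
synth = synthetic_datasets(observed, m=5)
```

The m synthetic copies preserve the distributional features of the observed data while containing none of the original records, which is the disclosure-control point.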

  14. A performance model for GPUs with caches

    DOE PAGES

    Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...

    2014-06-24

    To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies, we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, the Radeon HD 6970, the model estimates with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
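
The sampling-based linear model can be illustrated in miniature: time a workload at a few small sizes, fit runtime = a·n + b by least squares, and extrapolate to the full size. The CPU loop below is a stand-in assumption; the paper samples GPU thread blocks of an OpenCL kernel, not loop sizes:

```python
import time

def fit_line(xs, ys):
    # Ordinary least-squares fit of y = a*x + b.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

def kernel(n):
    # Stand-in workload with runtime roughly linear in n.
    s = 0
    for i in range(n):
        s += i * i
    return s

# Sample the workload at a few small sizes, taking the best of 3 runs
# to reduce timing noise.
sizes = [100_000, 200_000, 300_000, 400_000]
times = []
for n in sizes:
    runs = []
    for _ in range(3):
        t0 = time.perf_counter()
        kernel(n)
        runs.append(time.perf_counter() - t0)
    times.append(min(runs))

a, b = fit_line(sizes, times)
predicted_1m = a * 1_000_000 + b  # extrapolated runtime at n = 1,000,000
```

The paper's insight is analogous: choosing the earliest sampling points that still sit on the linear regime keeps the sampling cost low while preserving extrapolation accuracy.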

  15. A study of tablet dissolution by magnetic resonance electric current density imaging.

    PubMed

    Mikac, Ursa; Demsar, Alojz; Demsar, Franci; Sersa, Igor

    2007-03-01

    The electric current density imaging technique (CDI) was used to monitor the dissolution of ion releasing tablets (made of various carboxylic acids and of sodium chloride) by following conductivity changes in an agar-agar gel surrounding the tablet. Conductivity changes in the sample were used to calculate spatial and temporal changes of ionic concentrations in the sample. The experimental data for ion migration were compared to a mathematical model based on a solution of the diffusion equation with moving boundary conditions for the tablet geometry. Diffusion constants for different acids were determined by fitting the model to the experimental data. The experiments with dissolving tablets were used to demonstrate the potential of the CDI technique for measurement of ion concentration in the vicinity of ion releasing samples.
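
The mathematical model described above rests on the diffusion equation. Setting the moving-boundary treatment aside, its core behaviour can be sketched with an explicit finite-difference scheme in one dimension (grid spacing, time step, and the fixed-concentration boundary are illustrative assumptions, not the paper's model):

```python
def diffuse(nx=50, nt=2000, D=1.0, dx=1.0, dt=0.2, c_surface=1.0):
    # Explicit finite-difference solution of dc/dt = D * d2c/dx2 with the
    # tablet face at x = 0 held at a fixed concentration.
    c = [0.0] * nx
    c[0] = c_surface
    r = D * dt / dx ** 2  # must be <= 0.5 for stability
    for _ in range(nt):
        new = c[:]
        for i in range(1, nx - 1):
            new[i] = c[i] + r * (c[i + 1] - 2 * c[i] + c[i - 1])
        new[0] = c_surface  # dissolving surface keeps releasing ions
        c = new
    return c

profile = diffuse()
```

The resulting concentration profile decays monotonically away from the tablet face, the qualitative behaviour the CDI maps of ion concentration visualize.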

  16. A simple thermometric technique for reaction-rate determination of inorganic species, based on the iodide-catalysed cerium(IV)-arsenic(III) reaction.

    PubMed

    Grases, F; Forteza, R; March, J G; Cerda, V

    1985-02-01

    A very simple reaction-rate thermometric technique is used for determination of iodide (5-20 ng/ml), based on its catalytic action on the cerium(IV)-arsenic(III) reaction, and for determination of mercury(II) (1.5-10 ng/ml) and silver(I) (2-10 ng/ml), based on their inhibitory effect on this reaction. The reaction is followed by measuring the rate of temperature increase. The method suffers from very few interferences and is applied to determination of iodide in biological and inorganic samples, and Hg(II) and Ag(I) in pharmaceutical products.

  17. Impact of sampling techniques on measured stormwater quality data for small streams

    USGS Publications Warehouse

    Harmel, R.D.; Slade, R.M.; Haney, R.L.

    2010-01-01

    Science-based sampling methodologies are needed to enhance water quality characterization for setting appropriate water quality standards, developing Total Maximum Daily Loads, and managing nonpoint source pollution. Storm event sampling, which is vital for adequate assessment of water quality in small (wadeable) streams, is typically conducted by manual grab or integrated sampling or with an automated sampler. Although it is typically assumed that samples from a single point adequately represent mean cross-sectional concentrations, especially for dissolved constituents, this assumption of well-mixed conditions has received limited evaluation. Similarly, the impact of temporal (within-storm) concentration variability is rarely considered. Therefore, this study evaluated differences in stormwater quality measured in small streams with several common sampling techniques, which in essence evaluated within-channel and within-storm concentration variability. Constituent concentrations from manual grab samples and from integrated samples were compared for 31 events, then concentrations were also compared for seven events with automated sample collection. Comparison of sampling techniques indicated varying degrees of concentration variability within channel cross sections for both dissolved and particulate constituents, which is contrary to common assumptions of substantial variability in particulate concentrations and of minimal variability in dissolved concentrations. Results also indicated the potential for substantial within-storm (temporal) concentration variability for both dissolved and particulate constituents. Thus, failing to account for potential cross-sectional and temporal concentration variability in stormwater monitoring projects can introduce additional uncertainty in measured water quality data. Copyright © 2010 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.

  18. Improving Ramsey spectroscopy in the extreme-ultraviolet region with a random-sampling approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eramo, R.; Bellini, M.; European Laboratory for Non-linear Spectroscopy

    2011-04-15

    Ramsey-like techniques, based on the coherent excitation of a sample by delayed and phase-correlated pulses, are promising tools for high-precision spectroscopic tests of QED in the extreme-ultraviolet (xuv) spectral region, but currently suffer experimental limitations related to long acquisition times and critical stability issues. Here we propose a random subsampling approach to Ramsey spectroscopy that, by allowing experimentalists to reach a given spectral resolution goal in a fraction of the usual acquisition time, leads to substantial improvements in high-resolution spectroscopy and may open the way to a widespread application of Ramsey-like techniques to precision measurements in the xuv spectral region.
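
The subsampling idea can be illustrated numerically: the frequency of a fringe signal can still be recovered from a small random subset of delay points by evaluating a periodogram on the non-uniform samples. All quantities below are in normalized units and are assumptions for illustration, not the xuv experiment:

```python
import math
import random

def periodogram_peak(ts, xs, freqs):
    # Evaluate |sum_j x_j * exp(-2*pi*i*f*t_j)|^2 on a frequency grid and
    # return the frequency at which the power peaks (works for non-uniform
    # sample times ts).
    best_f, best_p = None, -1.0
    for f in freqs:
        re = sum(x * math.cos(2 * math.pi * f * t) for t, x in zip(ts, xs))
        im = sum(x * math.sin(2 * math.pi * f * t) for t, x in zip(ts, xs))
        p = re * re + im * im
        if p > best_p:
            best_f, best_p = f, p
    return best_f

rng = random.Random(3)
f_true = 0.173
# Randomly subsample 60 of 1000 possible delay settings.
ts = rng.sample([i * 0.05 for i in range(1000)], 60)
xs = [math.cos(2 * math.pi * f_true * t) for t in ts]
freqs = [0.001 * k for k in range(1, 400)]
f_hat = periodogram_peak(ts, xs, freqs)
```

Because the retained delay points still span the full acquisition window, the spectral resolution is essentially preserved while the number of measurements, and hence the acquisition time, drops substantially.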

  19. Size measuring techniques as tool to monitor pea proteins intramolecular crosslinking by transglutaminase treatment.

    PubMed

    Djoullah, Attaf; Krechiche, Ghali; Husson, Florence; Saurel, Rémi

    2016-01-01

    In this work, techniques for monitoring the intramolecular transglutaminase cross-links of pea proteins, based on protein size determination, were developed. Sodium dodecyl sulfate-polyacrylamide gel electrophoresis profiles of transglutaminase-treated low concentration (0.01% w/w) pea albumin samples, compared to the untreated one (control), showed a higher electrophoretic migration of the major albumin fraction band (26 kDa), reflecting a decrease in protein size. This protein size decrease was confirmed, after DEAE column purification, by dynamic light scattering (DLS) where the hydrodynamic radius of treated samples appears to be reduced compared to the control one. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Rapid wide-field Mueller matrix polarimetry imaging based on four photoelastic modulators with no moving parts.

    PubMed

    Alali, Sanaz; Gribble, Adam; Vitkin, I Alex

    2016-03-01

    A new polarimetry method is demonstrated to image the entire Mueller matrix of a turbid sample using four photoelastic modulators (PEMs) and a charge coupled device (CCD) camera, with no moving parts. Accurate wide-field imaging is enabled with a field-programmable gate array (FPGA) optical gating technique and an evolutionary algorithm (EA) that optimizes imaging times. This technique accurately and rapidly measured the Mueller matrices of air, polarization elements, and turbid phantoms. The system should prove advantageous for Mueller matrix analysis of turbid samples (e.g., biological tissues) over large fields of view, in less than a second.

  1. Remote monitoring of environmental particulate pollution - A problem in inversion of first-kind integral equations

    NASA Technical Reports Server (NTRS)

    Fymat, A. L.

    1975-01-01

    The determination of the microstructure, chemical nature, and dynamical evolution of scattering particulates in the atmosphere is considered. A description is given of indirect sampling techniques which can circumvent most of the difficulties associated with direct sampling techniques, taking into account methods based on scattering, extinction, and diffraction of an incident light beam. Approaches for reconstructing the particulate size distribution from the direct and the scattered radiation are discussed. A new method is proposed for determining the chemical composition of the particulates and attention is given to the relevance of methods of solution involving first kind Fredholm integral equations.

  2. Compressive self-interference Fresnel digital holography with faithful reconstruction

    NASA Astrophysics Data System (ADS)

    Wan, Yuhong; Man, Tianlong; Han, Ying; Zhou, Hongqiang; Wang, Dayong

    2017-05-01

    We developed a compressive self-interference digital holographic approach that allows retrieving three-dimensional information of spatially incoherent objects from a single-shot captured hologram. Fresnel incoherent correlation holography is combined with a parallel phase-shifting technique to instantaneously obtain spatially multiplexed phase-shifting holograms. The recording scheme is regarded as a compressive forward sensing model; thus, a compressive-sensing-based reconstruction algorithm is implemented to reconstruct the original object from the undersampled demultiplexed sub-holograms. The concept was verified by simulations and by experiments simulating the use of the polarizer array. The proposed technique has great potential to be applied in 3D tracking of spatially incoherent samples.

  3. Ultra fine grained Ti prepared by severe plastic deformation

    NASA Astrophysics Data System (ADS)

    Lukáč, F.; Čížek, J.; Knapp, J.; Procházka, I.; Zháňal, P.; Islamgaliev, R. K.

    2016-01-01

    Positron annihilation spectroscopy was employed for characterisation of defects in pure Ti with an ultra fine grained (UFG) structure. UFG Ti samples were prepared by two techniques based on severe plastic deformation (SPD): (i) high pressure torsion (HPT) and (ii) equal channel angular pressing (ECAP). Although HPT is the most efficient technique for grain refinement, the size of HPT-deformed specimens is limited. On the other hand, ECAP is less efficient in grain refinement but makes it possible to produce larger samples more suitable for industrial applications. Characterisation of defects by positron annihilation spectroscopy was accompanied by hardness testing in order to monitor the development of the mechanical properties of UFG Ti.

  4. Network Model-Assisted Inference from Respondent-Driven Sampling Data

    PubMed Central

    Gile, Krista J.; Handcock, Mark S.

    2015-01-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population. PMID:26640328

  5. Network Model-Assisted Inference from Respondent-Driven Sampling Data.

    PubMed

    Gile, Krista J; Handcock, Mark S

    2015-06-01

    Respondent-Driven Sampling is a widely-used method for sampling hard-to-reach human populations by link-tracing over their social networks. Inference from such data requires specialized techniques because the sampling process is both partially beyond the control of the researcher, and partially implicitly defined. Therefore, it is not generally possible to directly compute the sampling weights for traditional design-based inference, and likelihood inference requires modeling the complex sampling process. As an alternative, we introduce a model-assisted approach, resulting in a design-based estimator leveraging a working network model. We derive a new class of estimators for population means and a corresponding bootstrap standard error estimator. We demonstrate improved performance compared to existing estimators, including adjustment for an initial convenience sample. We also apply the method and an extension to the estimation of HIV prevalence in a high-risk population.
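
A concrete example of design-based RDS estimation is the Volz-Heckathorn (RDS-II) estimator, which weights each respondent by the inverse of their reported network degree; the model-assisted estimators introduced in the paper refine this idea with a working network model. The data below are hypothetical:

```python
def rds_ii_estimate(values, degrees):
    # Volz-Heckathorn (RDS-II) estimator: a design-based prevalence
    # estimate weighting each respondent by the inverse of their reported
    # network degree (not the paper's model-assisted estimator).
    w = [1.0 / d for d in degrees]
    return sum(wi * v for wi, v in zip(w, values)) / sum(w)

# Hypothetical sample: trait status (1/0) and reported network degree.
status = [1, 0, 1, 1, 0, 0, 1, 0]
degrees = [10, 2, 8, 12, 3, 4, 9, 2]
est = rds_ii_estimate(status, degrees)  # ~ 0.209, below the naive mean of 0.5
```

Because high-degree individuals are more likely to be recruited through link-tracing, down-weighting them corrects the raw sample proportion toward the population prevalence.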

  6. Automated Broad-Range Molecular Detection of Bacteria in Clinical Samples

    PubMed Central

    Hoogewerf, Martine; Vandenbroucke-Grauls, Christina M. J. E.; Savelkoul, Paul H. M.

    2016-01-01

    Molecular detection methods, such as quantitative PCR (qPCR), have found their way into clinical microbiology laboratories for the detection of an array of pathogens. Most routinely used methods, however, are directed at specific species. Thus, anything that is not explicitly searched for will be missed. This greatly limits the flexibility and universal application of these techniques. We investigated the application of a rapid universal bacterial molecular identification method, IS-pro, to routine patient samples received in a clinical microbiology laboratory. IS-pro is a eubacterial technique based on the detection and categorization of 16S-23S rRNA gene interspace regions with lengths that are specific for each microbial species. As this is an open technique, clinicians do not need to decide in advance what to look for. We compared routine culture to IS-pro using 66 samples sent in for routine bacterial diagnostic testing. The samples were obtained from patients with infections in normally sterile sites (without a resident microbiota). The results were identical in 20 (30%) samples, IS-pro detected more bacterial species than culture in 31 (47%) samples, and five of the 10 culture-negative samples were positive with IS-pro. The case histories of the five patients from whom these culture-negative/IS-pro-positive samples were obtained suggest that the IS-pro findings are highly clinically relevant. Our findings indicate that an open molecular approach, such as IS-pro, may have a high added value for clinical practice. PMID:26763956

  7. Development of a Multiplexed Liquid Chromatography Multiple-Reaction-Monitoring Mass Spectrometry (LC-MRM/MS) Method for Evaluation of Salivary Proteins as Oral Cancer Biomarkers*

    PubMed Central

    Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting

    2017-01-01

    Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. PMID:28235782

  8. A hydrophobic ionic liquid compartmentalized sampling/labeling and its separation techniques in polydimethylsiloxane microchip capillary electrophoresis.

    PubMed

    Quan, Hong Hua; Li, Ming; Huang, Yan; Hahn, Jong Hoon

    2017-01-01

    This paper demonstrates a novel compartmentalized sampling/labeling method and its separation techniques using a hydrophobic ionic liquid (IL), 1-butyl-3-methylimidazolium bis(trifluoromethylsulfonyl)imide (BmimNTf₂), as the immiscible phase, which is capable of minimizing signal losses during microchip capillary electrophoresis (MCE). The MCE device consists of a silica tube connected to a straight polydimethylsiloxane (PDMS) separation channel. Poly(diallyldimethylammonium chloride) (PDDAC) was coated on the inner surface of the channel to ease the introduction of IL plugs and to enhance IL wetting on the PDMS surface for sample release. Electroosmotic flow (EOF)-based sample compartmentalization was carried out through sequenced injection into sampling tubes in the following order: leading IL plug/sample segment/terminal IL plug. The movement of the sample segment was easily controlled by applying a voltage across both ends of the chip without any change in sample volume. This approach effectively prevented analyte diffusion before injection into the MCE channel. When the sample segment was driven into the PDDAC-modified PDMS channel, the sample plug was released from isolation under EOF, while the IL plugs adsorbed onto the channel surfaces owing to strong adhesion. A mixture of flavin adenine dinucleotide (FAD) and flavin mononucleotide (FMN) was successfully separated on a 2.5 cm long separation channel, with theoretical plate numbers of 15 000 and 17 000, respectively. The obtained peak intensity was 6.3-fold higher than the corresponding value from conventional electrokinetic injection with the same sampling time. Furthermore, with the compartmented sample segment serving as an interim reactor, on-chip fluorescence labeling is demonstrated. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Differentiation of Candida albicans, Candida glabrata, and Candida krusei by FT-IR and chemometrics by CHROMagar™ Candida.

    PubMed

    Wohlmeister, Denise; Vianna, Débora Renz Barreto; Helfer, Virginia Etges; Calil, Luciane Noal; Buffon, Andréia; Fuentefria, Alexandre Meneghello; Corbellini, Valeriano Antonio; Pilger, Diogo André

    2017-10-01

    Pathogenic Candida species are detected in clinical infections. CHROMagar™ is a phenotypic method used to identify Candida species, although it has limitations, which indicates the need for more sensitive and specific techniques. Fourier-transform infrared (FT-IR) spectroscopy is an analytical vibrational technique used to identify patterns in the metabolic fingerprint of biological matrixes, particularly whole microbial cell systems such as Candida sp., in association with classificatory chemometric algorithms. Soft Independent Modeling of Class Analogy (SIMCA) is one such algorithm, still little employed in microbiological classification. This study demonstrates the applicability of the FT-IR technique by specular reflectance, associated with SIMCA, to discriminate Candida species isolated from vaginal discharges and grown on CHROMagar™. The differences in the spectra of C. albicans, C. glabrata, and C. krusei were suitable for discriminating these species, as observed by PCA. A SIMCA model was then constructed with standard samples of the three species using the spectral region of 1792-1561 cm⁻¹. All samples (n=48) were properly classified based on the chromogenic method using CHROMagar™ Candida. In total, 93.4% (n=45) of the samples were correctly and unambiguously classified (Class I). Two samples of C. albicans were classified correctly, though these could also have been C. glabrata (Class II), and one C. glabrata sample could also have been classified as C. krusei (Class II). For these three samples, one triplicate of each was included in Class II and two in Class I. Therefore, FT-IR associated with SIMCA can be used to identify samples of C. albicans, C. glabrata, and C. krusei grown on CHROMagar™ Candida, aiming to improve the clinical applications of this technique. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Detection of Hepatitis A Virus by the Nucleic Acid Sequence-Based Amplification Technique and Comparison with Reverse Transcription-PCR

    PubMed Central

    Jean, Julie; Blais, Burton; Darveau, André; Fliss, Ismaïl

    2001-01-01

    A nucleic acid sequence-based amplification (NASBA) technique for the detection of hepatitis A virus (HAV) in foods was developed and compared to the traditional reverse transcription (RT)-PCR technique. Oligonucleotide primers targeting the VP1 and VP2 genes encoding the major HAV capsid proteins were used for the amplification of viral RNA in an isothermal process resulting in the accumulation of RNA amplicons. Amplicons were detected by hybridization with a digoxigenin-labeled oligonucleotide probe in a dot blot assay format. Using NASBA, as little as 0.4 ng of target RNA/ml was detected, compared to 4 ng/ml for RT-PCR. When crude HAV viral lysate was used, a detection limit of 2 PFU (4 × 10² PFU/ml) was obtained with NASBA, compared to 50 PFU (1 × 10⁴ PFU/ml) obtained with RT-PCR. No interference was encountered in the amplification of HAV RNA in the presence of excess nontarget RNA or DNA. The NASBA system successfully detected HAV recovered from experimentally inoculated samples of waste water, lettuce, and blueberries. Compared to RT-PCR and other amplification techniques, the NASBA system offers several advantages in terms of sensitivity, rapidity, and simplicity. This technique should be readily adaptable for detection of other RNA viruses in both foods and clinical samples. PMID:11722911
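    The detection limits quoted above imply a 10-fold sensitivity gain for NASBA on purified RNA and a 25-fold gain on crude lysate; the arithmetic can be checked directly from the reported numbers.

```python
# Detection limits reported in the abstract.
nasba_rna, rtpcr_rna = 0.4, 4.0   # ng of target RNA per ml
nasba_pfu, rtpcr_pfu = 2, 50      # PFU of crude HAV lysate

rna_gain = rtpcr_rna / nasba_rna  # fold improvement on purified RNA
pfu_gain = rtpcr_pfu / nasba_pfu  # fold improvement on crude lysate
print(rna_gain, pfu_gain)
```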

  12. A multiplex real-time PCR assay for identification of Pneumocystis jirovecii, Histoplasma capsulatum, and Cryptococcus neoformans/Cryptococcus gattii in samples from AIDS patients with opportunistic pneumonia.

    PubMed

    Gago, Sara; Esteban, Cristina; Valero, Clara; Zaragoza, Oscar; Puig de la Bellacasa, Jorge; Buitrago, María José

    2014-04-01

    A molecular diagnostic technique based on real-time PCR was developed for the simultaneous detection of three of the most frequent causative agents of fungal opportunistic pneumonia in AIDS patients: Pneumocystis jirovecii, Histoplasma capsulatum, and Cryptococcus neoformans/Cryptococcus gattii. This technique was tested in cultured strains and in clinical samples from HIV-positive patients. The methodology used involved species-specific molecular beacon probes targeted to the internal transcribed spacer regions of the rDNA. An internal control was also included in each assay. The multiplex real-time PCR assay was tested in 24 clinical strains and 43 clinical samples from AIDS patients with proven fungal infection. The technique developed showed high reproducibility (r² > 0.98) and specificity (100%). For H. capsulatum and Cryptococcus spp., the detection limits of the method were 20 and 2 fg of genomic DNA/20 μl reaction mixture, respectively, while for P. jirovecii the detection limit was 2.92 log₁₀ copies/20 μl reaction mixture. The in vitro sensitivity was 100% for clinical strains and 90.7% for clinical samples. The assay was positive for 92.5% of the patients. For one of the patients with proven histoplasmosis, P. jirovecii was also detected in a bronchoalveolar lavage sample. No PCR inhibition was detected. This multiplex real-time PCR technique is fast, sensitive, and specific and may have clinical applications.

  13. Recruitment Techniques and Strategies in a Community-Based Colorectal Cancer Screening Study of Men and Women of African Ancestry.

    PubMed

    Davis, Stacy N; Govindaraju, Swapamthi; Jackson, Brittany; Williams, Kimberly R; Christy, Shannon M; Vadaparampil, Susan T; Quinn, Gwendolyn P; Shibata, David; Roetzheim, Richard; Meade, Cathy D; Gwede, Clement K

    Recruiting ethnically diverse Black participants to an innovative, community-based research study to reduce colorectal cancer screening disparities requires multipronged recruitment techniques. This article describes active, passive, and snowball recruitment techniques, and challenges and lessons learned in recruiting a diverse sample of Black participants. For each of the three recruitment techniques, data were collected on strategies, enrollment efficiency (participants enrolled/participants evaluated), and reasons for ineligibility. Five hundred sixty individuals were evaluated, and 330 individuals were enrolled. Active recruitment yielded the highest number of enrolled participants, followed by passive and snowball. Snowball recruitment was the most efficient technique, with enrollment efficiency of 72.4%, followed by passive (58.1%) and active (55.7%) techniques. There were significant differences in gender, education, country of origin, health insurance, and having a regular physician by recruitment technique (p < .05). Multipronged recruitment techniques should be employed to increase reach, diversity, and study participation rates among Blacks. Although each recruitment technique had a variable enrollment efficiency, the use of multipronged recruitment techniques can lead to successful enrollment of diverse Blacks into cancer prevention and control interventions.
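    Enrollment efficiency as defined in the study is simply participants enrolled divided by participants evaluated; applied to the overall totals reported above (330 enrolled of 560 evaluated), it gives the study-wide figure.

```python
def enrollment_efficiency(enrolled, evaluated):
    """Enrollment efficiency (%), as defined in the study:
    participants enrolled / participants evaluated."""
    return 100.0 * enrolled / evaluated

# Overall figures reported in the abstract.
overall = enrollment_efficiency(330, 560)
print(round(overall, 1))  # 58.9
```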

  14. Development of a simple and low-cost enzymatic methodology for quantitative analysis of carbamates in meat samples of forensic interest.

    PubMed

    Sabino, Bruno Duarte; Torraca, Tathiana Guilliod; Moura, Claudia Melo; Rozenbaum, Hannah Felicia; de Castro Faria, Mauro Velho

    2010-05-01

    Foods contaminated with a granulated material similar to Temik (a commercial pesticide formulation containing the carbamate insecticide aldicarb) are often involved in accidental ingestion, suicides, and homicides in Brazil. We developed a simple technique to detect aldicarb. This technique is based on the inhibition of a stable preparation of the enzyme acetylcholinesterase, and it is specially adapted for forensic purposes. It comprises an initial extraction step with the solvent methylene chloride followed by a colorimetric acetylcholinesterase assay. We propose that results of testing contaminated forensic samples be expressed in aldicarb equivalents because, even though all other carbamates are also potent enzyme inhibitors, aldicarb is the contaminant most frequently found in forensic samples. The method is rapid (several samples can be run in a period of 2 h) and low cost; it also proved to be precise and accurate, detecting concentrations as low as 40 µg/kg of aldicarb in meat samples.
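    Inhibition-based colorimetric assays of this type report carbamate (aldicarb-equivalent) exposure as the fractional loss of enzyme activity relative to an uninhibited control. A minimal sketch of that calculation follows; the absorbance readings are hypothetical, and an Ellman-type readout (higher absorbance = more residual activity) is assumed.

```python
def percent_inhibition(a_control, a_sample):
    """Percent acetylcholinesterase inhibition from blank-corrected
    colorimetric absorbance readings (higher absorbance means more
    residual enzyme activity, i.e. less inhibitor present)."""
    return 100.0 * (a_control - a_sample) / a_control

# Hypothetical blank-corrected absorbances at 412 nm.
print(round(percent_inhibition(0.80, 0.20), 1))  # 75.0
```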

  15. High speed wavefront sensorless aberration correction in digital micromirror based confocal microscopy.

    PubMed

    Pozzi, P; Wilding, D; Soloviev, O; Verstraete, H; Bliek, L; Vdovin, G; Verhaegen, M

    2017-01-23

    The quality of fluorescence microscopy images is often impaired by the presence of sample-induced optical aberrations. Adaptive optical elements such as deformable mirrors or spatial light modulators can be used to correct aberrations. However, previously reported techniques either require special sample preparation or time-consuming optimization procedures for the correction of static aberrations. This paper reports a technique for optical sectioning fluorescence microscopy capable of correcting dynamic aberrations in any fluorescent sample during the acquisition. This is achieved by implementing adaptive optics in a nonconventional confocal microscopy setup with multiple programmable confocal apertures, in which out-of-focus light can be separately detected and used to optimize the correction performance with a sampling frequency an order of magnitude faster than the imaging rate of the system. The paper reports results comparing the correction performance to traditional image optimization algorithms, and demonstrates how the system can compensate for dynamic changes in the aberrations, such as those introduced during a focal stack acquisition through a thick sample.

  16. 3D Image Analysis of Geomaterials using Confocal Microscopy

    NASA Astrophysics Data System (ADS)

    Mulukutla, G.; Proussevitch, A.; Sahagian, D.

    2009-05-01

    Confocal microscopy is one of the most significant advances in optical microscopy of the last century. It is widely used in the biological sciences, but its application to geomaterials lingers due to a number of technical problems. Potentially, the technique can perform non-invasive testing on a laser-illuminated sample that fluoresces, using a unique optical sectioning capability that rejects out-of-focus light reaching the confocal aperture. Fluorescence in geomaterials is commonly induced using epoxy doped with a fluorochrome that is impregnated into the sample to enable discrimination of various features such as void space or material boundaries. However, for many geomaterials this method cannot be used, because they do not naturally fluoresce and because epoxy cannot be impregnated into inaccessible parts of the sample due to lack of permeability. As a result, confocal images of most geomaterials that have not undergone extensive sample preparation are of poor quality and lack the image and edge contrast necessary to apply common segmentation techniques for quantitative study of features such as vesicularity and internal structure. In our present work, we are developing a methodology to conduct a quantitative 3D analysis of images of geomaterials collected using a confocal microscope with a minimal amount of prior sample preparation and no added fluorescence. Two sample geomaterials, a volcanic melt sample and a crystal chip containing fluid inclusions, are used to assess the feasibility of the method. A step-by-step process of image analysis includes application of image filtration to enhance the edges or material interfaces and is based on two segmentation techniques: geodesic active contours and region competition. Both techniques have been applied extensively to the analysis of medical MRI images to segment anatomical structures.
Preliminary analysis suggests that there is distortion in the shapes of the segmented vesicles, vapor bubbles, and void spaces due to the optical measurements, so corrective actions are being explored. This will establish a practical and reliable framework for an adaptive 3D image processing technique for the analysis of geomaterials using confocal microscopy.
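    As a much-simplified stand-in for the segmentation step (not the geodesic active contour or region competition machinery the authors use), a seeded region-growing pass on a filtered image illustrates the basic idea of delineating a vesicle from its surroundings. The synthetic image and the intensity tolerance below are hypothetical.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol):
    """Grow a region from `seed`, adding 4-connected pixels whose
    intensity is within `tol` of the seed intensity."""
    mask = np.zeros(img.shape, dtype=bool)
    q = deque([seed])
    mask[seed] = True
    ref = img[seed]
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                    and not mask[rr, cc] and abs(img[rr, cc] - ref) <= tol):
                mask[rr, cc] = True
                q.append((rr, cc))
    return mask

# Synthetic "vesicle": a bright disc in a dark matrix.
yy, xx = np.mgrid[0:32, 0:32]
img = ((yy - 16) ** 2 + (xx - 16) ** 2 < 36).astype(float)  # radius-6 disc
mask = region_grow(img, (16, 16), tol=0.5)
print(mask.sum())  # area (pixel count) of the segmented disc
```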

  17. Thermoelectric Properties of Pulsed Electric Current Sintered Samples of AgPbₘSbSe₁₇ (m = 16 or 17)

    NASA Astrophysics Data System (ADS)

    Wu, Chun-I.; Todorov, Ilyia; Kanatzidis, Mercouri G.; Timm, Edward; Case, Eldon D.; Schock, Harold; Hogan, Timothy P.

    2012-06-01

    Lead chalcogenide materials have drawn attention in recent years because of their outstanding thermoelectric properties. Bulk n-type materials of AgPbₘSbTe₂₊ₘ have been reported to exhibit a figure of merit, ZT, as high as 1.7 at 700 K. Recent reports have shown p-type lead selenide-based compounds with comparable ZT. The analogous material AgPbₘSbSe₁₇ shares a similar cubic rock-salt structure with PbTe-based compounds; however, it exhibits a higher melting point, and selenium is more abundant than tellurium. Using solid solution chemistry, we have fabricated cast AgPb₁₅SbSe₁₇ samples that show a peak power factor of approximately 17 μW/cm·K² at 450 K. Increasing the strength of such materials is commonly achieved through powder processing, which also helps to homogenize the source materials. Pulsed electric current sintering (PECS) is a hot-pressing technique that utilizes electric current through the die and sample for direct Joule heating during pressing. The mechanisms present during PECS processing have captured significant research interest and have led to some notable improvements in sample properties compared with other densification techniques. We report the thermoelectric properties of PECS samples of AgPbₘSbSe₁₇ along with sample fabrication and processing details.
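    The figures of merit above follow from the standard definitions: power factor PF = S²σ and ZT = S²σT/κ. As a quick sketch, a hypothetical Seebeck coefficient of 130 μV/K at an electrical conductivity of 1000 S/cm reproduces the ~17 μW/cm·K² scale reported here; these inputs are illustrative, not measured values from the paper.

```python
def power_factor(seebeck_V_per_K, sigma_S_per_cm):
    """Thermoelectric power factor PF = S^2 * sigma, in W cm^-1 K^-2."""
    return seebeck_V_per_K ** 2 * sigma_S_per_cm

def figure_of_merit(seebeck_V_per_K, sigma_S_per_cm, kappa_W_per_cmK, T):
    """Dimensionless ZT = S^2 * sigma * T / kappa (cm-based units)."""
    return power_factor(seebeck_V_per_K, sigma_S_per_cm) * T / kappa_W_per_cmK

# Hypothetical inputs: S = 130 uV/K, sigma = 1000 S/cm.
pf = power_factor(130e-6, 1000.0)
print(round(pf * 1e6, 1))  # power factor in uW/cm K^2
```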

  18. Automated liver sampling using a gradient dual-echo Dixon-based technique.

    PubMed

    Bashir, Mustafa R; Dale, Brian M; Merkle, Elmar M; Boll, Daniel T

    2012-05-01

    Magnetic resonance spectroscopy of the liver requires input from a physicist or physician at the time of acquisition to ensure proper voxel selection, while in multiecho chemical shift imaging, numerous regions of interest must be manually selected in order to ensure analysis of a representative portion of the liver parenchyma. A fully automated technique could improve workflow by selecting representative portions of the liver prior to human analysis. Complete volumes from three-dimensional gradient dual-echo acquisitions with two-point Dixon reconstruction acquired at 1.5 and 3 T were analyzed in 100 subjects, using an automated liver sampling algorithm based on ratio pairs calculated from signal intensity image data as fat-only/water-only and log(in-phase/opposed-phase) on a voxel-by-voxel basis. Using different gridding variations of the algorithm, the average correct liver volume samples ranged from 527 to 733 mL. The average percentage of sample located within the liver ranged from 95.4 to 97.1%, whereas the average incorrect volume selected was 16.5-35.4 mL (2.9-4.6%). Average run time was 19.7-79.0 s. The algorithm consistently selected large samples of the hepatic parenchyma with small amounts of erroneous extrahepatic sampling, and run times were feasible for execution on an MRI system console during exam acquisition. Copyright © 2011 Wiley Periodicals, Inc.
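    The voxel-wise features named above (fat-only/water-only and log(in-phase/opposed-phase)) are straightforward to compute with numpy. The sketch below builds them on a tiny synthetic "volume"; the classification threshold is hypothetical and not the published decision rule.

```python
import numpy as np

def dixon_ratio_features(fat, water, in_phase, opposed_phase, eps=1e-6):
    """Per-voxel ratio pair used by the sampling algorithm:
    fat-only/water-only and log(in-phase/opposed-phase).
    `eps` guards against division by zero and log of zero."""
    r1 = fat / (water + eps)
    r2 = np.log((in_phase + eps) / (opposed_phase + eps))
    return r1, r2

# Synthetic 2x2 "volume": liver-like voxels have a low fat/water ratio.
fat   = np.array([[0.1, 0.1], [5.0, 0.2]])
water = np.array([[1.0, 0.9], [1.0, 1.1]])
ip    = fat + water            # in-phase signal model: water + fat
op    = np.abs(water - fat)    # opposed-phase signal model: |water - fat|

r1, r2 = dixon_ratio_features(fat, water, ip, op)
liver_like = r1 < 0.5          # hypothetical threshold for illustration
print(liver_like.sum())        # number of voxels classified liver-like
```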

  19. Detection of stiff nanoparticles within cellular structures by contact resonance atomic force microscopy subsurface nanomechanical imaging.

    PubMed

    Reggente, Melania; Passeri, Daniele; Angeloni, Livia; Scaramuzzo, Francesca Anna; Barteri, Mario; De Angelis, Francesca; Persiconi, Irene; De Stefano, Maria Egle; Rossi, Marco

    2017-05-04

    Detecting stiff nanoparticles buried in soft biological matrices by atomic force microscopy (AFM)-based techniques represents a new frontier in the field of scanning probe microscopies, originally developed as surface characterization methods. Here we report the detection of stiff (magnetic) nanoparticles (NPs) internalized in cells by using contact resonance AFM (CR-AFM) employed as a potentially non-destructive subsurface characterization tool. Magnetite (Fe₃O₄) NPs were internalized by phagocytosis in microglial cells from the cerebral cortices of 18-day mouse embryos. Nanomechanical imaging of cells was performed by detecting the contact resonance frequencies (CRFs) of an AFM cantilever held in contact with the sample. Agglomerates of NPs internalized in cells were visualized on the basis of the local increase in contact stiffness with respect to the surrounding biological matrix. A second AFM-based technique for nanomechanical imaging, i.e., HarmoniX™, as well as magnetic force microscopy and light microscopy, were used to confirm the CR-AFM results. Thus, CR-AFM was demonstrated to be a promising technique for subsurface imaging of nanomaterials in biological samples.

  20. A highly selective dispersive liquid-liquid microextraction approach based on the unique fluorous affinity for the extraction and detection of per- and polyfluoroalkyl substances coupled with high performance liquid chromatography tandem-mass spectrometry.

    PubMed

    Wang, Juan; Shi, Yali; Cai, Yaqi

    2018-04-06

    In the present study, a highly selective fluorous affinity-based dispersive liquid-liquid microextraction (DLLME) technique was developed for the extraction and analysis of per- and polyfluoroalkyl substances (PFASs) followed by high performance liquid chromatography tandem-mass spectrometry. Perfluoro-tert-butanol with multiple C-F bonds was chosen as the extraction solvent, which was injected into the aqueous samples with a dispersive solvent (acetonitrile) in a 120:800 (μL, v/v) mixture for PFASs enrichment. The fluorous affinity-based extraction mechanism was confirmed by the significantly higher extraction recoveries for PFASs containing multiple fluorine atoms than those for compounds with fewer or no fluorine atoms. The extraction recoveries of medium and long-chain PFASs (CF₂ > 5) exceeded 70%, except perfluoroheptanoic acid, while those of short-chain PFASs were lower than 50%, implying that the proposed DLLME may not be suitable for their extraction due to weak fluorous affinity. This highly fluoroselective DLLME technique can greatly decrease the matrix effect that occurs in mass spectrometry detection when applied to the analysis of urine samples. Under the optimum conditions, the relative recoveries of PFASs with CF₂ > 5 ranged from 80.6-121.4% for tap water, river water and urine samples spiked with concentrations of 10, 50 and 100 ng/L. The method limits of quantification for PFASs in water and urine samples were in the range of 0.6-8.7 ng/L. Furthermore, comparable concentrations of PFASs were obtained via DLLME and solid-phase extraction, confirming that the developed DLLME technique is a promising method for the extraction of PFASs in real samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Fast, metadynamics-based method for prediction of the stereochemistry-dependent relative free energies of ligand-receptor interactions.

    PubMed

    Plazinska, Anita; Plazinski, Wojciech; Jozwiak, Krzysztof

    2014-04-30

    A computational approach applicable to molecular dynamics (MD)-based techniques is proposed to predict ligand-protein binding affinities dependent on ligand stereochemistry. All possible stereoconfigurations are expressed in terms of one set of force-field parameters [a stereoconfiguration-independent potential (SIP)], which allows all relative free energies to be calculated from only a single simulation. SIP can be used for studying diverse, stereoconfiguration-dependent phenomena by means of various computational techniques of enhanced sampling. The method has been successfully tested on the β2-adrenergic receptor (β2-AR) binding the four fenoterol stereoisomers by both metadynamics simulations and replica-exchange MD. Both methods gave very similar results, fully confirming the presence of stereoselective effects in the fenoterol-β2-AR interactions. However, the metadynamics-based approach offered much better sampling efficiency, which allows a significant reduction of the unphysical region in SIP. Copyright © 2014 Wiley Periodicals, Inc.

  2. Experimental study on GMM-based speaker recognition

    NASA Astrophysics Data System (ADS)

    Ye, Wenxing; Wu, Dapeng; Nucci, Antonio

    2010-04-01

    Speaker recognition plays a very important role in the field of biometric security. In order to improve recognition performance, many pattern recognition techniques have been explored in the literature. Among these techniques, the Gaussian Mixture Model (GMM) has proven to be an effective statistical model for speaker recognition and is used in most state-of-the-art speaker recognition systems. The GMM is used to represent the 'voice print' of a speaker by modeling the spectral characteristics of the speaker's speech signals. In this paper, we implement a speaker recognition system that consists of preprocessing, Mel-Frequency Cepstrum Coefficient (MFCC)-based feature extraction, and GMM-based classification. We test our system with the TIDIGITS data set (325 speakers) and our own recordings of more than 200 speakers; our system achieves a 100% correct recognition rate. Moreover, we also test our system under the scenario in which training samples are from one language but test samples are from a different language; our system again achieves a 100% correct recognition rate, which indicates that our system is language independent.
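    GMM-based speaker identification scores the MFCC frames of a test utterance under each enrolled speaker's model and picks the speaker with the highest log-likelihood. The numpy sketch below uses a single diagonal-covariance Gaussian per speaker (the 1-component special case of a GMM) and synthetic stand-ins for MFCC vectors; it is an illustration of the decision rule, not the authors' system.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_gaussian(frames):
    """Fit a single diagonal-covariance Gaussian (a 1-component 'GMM')."""
    return frames.mean(axis=0), frames.var(axis=0) + 1e-6

def log_likelihood(frames, model):
    """Total log-likelihood of all frames under a diagonal Gaussian."""
    mu, var = model
    ll = -0.5 * (np.log(2 * np.pi * var) + (frames - mu) ** 2 / var)
    return ll.sum()

# Synthetic "MFCC" frames (13-dim) for two enrolled speakers.
spk_a = rng.normal(0.0, 1.0, size=(200, 13))
spk_b = rng.normal(2.0, 1.0, size=(200, 13))
models = {"A": fit_gaussian(spk_a), "B": fit_gaussian(spk_b)}

# A test utterance drawn from speaker B's distribution: the maximum
# log-likelihood rule should identify speaker B.
test = rng.normal(2.0, 1.0, size=(50, 13))
best = max(models, key=lambda s: log_likelihood(test, models[s]))
print(best)  # B
```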

  3. Grazing-incidence coherent x-ray imaging in true reflection geometry

    NASA Astrophysics Data System (ADS)

    Sun, Tao; Jiang, Zhang; Strzalka, Joseph; Wang, Jin

    2012-02-01

    The development of third- and fourth-generation synchrotrons has stimulated extensive research activities in x-ray imaging techniques. Among these, coherent diffractive imaging (CDI) shows great promise, as its resolution is limited only by the wavelength of the source. Most of the CDI work reported thus far used transmission geometry, which, however, is not suitable for samples on opaque substrates or samples in which only the surfaces are the regions of interest. Even though two groups have performed CDI experiments (using laser or x-ray sources) in reflection geometry and succeeded in reconstructing planar images of the surface, the theoretical underpinnings and analysis approaches of their techniques are essentially identical to transmission CDI. Most importantly, they could not obtain structural information along the sample thickness direction. Here, we introduce a reflection CDI technique that works at grazing-incidence geometry. By visualizing Au nanostructures fabricated on a Si substrate, we demonstrate that this imaging technique is capable of obtaining both 2D and 3D information on surfaces or buried structures in the samples. We will also explain the grazing-incidence-scattering-based algorithm developed for 3D phase retrieval.
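    CDI reconstruction rests on iterative phase retrieval: alternately enforcing the measured Fourier magnitudes and real-space constraints (support, nonnegativity). The 1D error-reduction (Gerchberg-Saxton-type) sketch below illustrates that generic loop; it is not the grazing-incidence algorithm the authors developed, and the object and support are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ground-truth object: nonnegative values confined to a known support.
n = 64
support = np.zeros(n, dtype=bool)
support[20:30] = True
obj = np.zeros(n)
obj[20:30] = rng.random(10) + 0.5
measured_mag = np.abs(np.fft.fft(obj))  # "diffraction" magnitudes

def mag_error(x):
    """Distance of x's Fourier magnitudes from the measured ones."""
    return np.linalg.norm(np.abs(np.fft.fft(x)) - measured_mag)

# Error-reduction iterations: impose the measured Fourier magnitudes,
# then the support and nonnegativity constraints in real space.
x = rng.random(n)
err0 = mag_error(x)
for _ in range(500):
    X = np.fft.fft(x)
    X = measured_mag * np.exp(1j * np.angle(X))  # keep phase, fix magnitude
    x = np.real(np.fft.ifft(X))
    x[~support] = 0.0  # support constraint
    x[x < 0.0] = 0.0   # nonnegativity constraint
err = mag_error(x)
print(err < err0)  # the Fourier-magnitude error shrinks from the random start
```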

  4. Using a novel micro-sampling technique to monitor the effects of methylmercury on the eggs of wild birds

    USGS Publications Warehouse

    Klimstra, J.D.; Stebbins, K.R.; Heinz, G.H.

    2007-01-01

    Methylmercury is the predominant chemical form of mercury reported in the eggs of wild birds. The embryo is the life stage at which birds are most sensitive to methylmercury. Protective guidelines have been based largely on captive-breeding studies done with chickens (Gallus domesticus), mallards (Anas platyrhynchos), and ring-necked pheasants (Phasianus colchicus). Typically these studies are cost and time prohibitive. In the past, researchers have used either egg injections or the "sample egg" technique to determine contaminant effects on bird eggs. Both techniques have their limitations. As an alternative to the above methods, and because most of the methylmercury is found in the albumen, we have developed a novel, less invasive technique to micro-sample the albumen of eggs in the field. An albumen sample would be analyzed and then compared to the hatching success of that egg. Using the micro-sampling procedure, the egg is oriented with the blunt end up and the pointed end down. A vent hole is drilled at the top to relieve pressure. Approximately one third up from the bottom, a withdrawal site is drilled just until the inner shell membrane is exposed. A syringe with a 21 or 18 gauge needle is gently inserted just into the egg and approximately 200-300 µl of albumen is removed. Almost concurrently, this site and then the vent are sealed. Thus far we have experimented with both chicken and mallard eggs in the laboratory. We sampled chicken eggs at days 0 and 3 of incubation with a hatching success of 76% and 70%, respectively. Neither group was significantly different from control eggs (P=0.52, 0.54). Field studies are in progress using this technique in which birds are allowed to incubate their own eggs. We envision micro-sampling to be a tool that researchers and managers could use in the field to determine the effects of mercury or other contaminants in bird populations.
Micro-sampling would reduce the impact on the sampled population and could be used to monitor sensitive species without impacting reproduction and recruitment.

  5. Navigating complex sample analysis using national survey data.

    PubMed

    Saylor, Jennifer; Friedmann, Erika; Lee, Hyeon Joo

    2012-01-01

    The National Center for Health Statistics conducts the National Health and Nutrition Examination Survey and other national surveys with probability-based complex sample designs. The goal of national surveys is to provide valid data for the population of the United States. Analyses of data from population surveys present unique challenges in the research process but are valuable avenues for studying the health of the United States population. The aim of this study was to demonstrate the importance of using complex data analysis techniques for data obtained with a complex multistage sampling design and to provide an example of analysis using the SPSS Complex Samples procedure. Challenges and solutions specific to secondary data analysis of national databases are illustrated using the National Health and Nutrition Examination Survey as the exemplar. Oversampling of small or sensitive groups provides necessary estimates of variability within small groups. Use of weights without complex samples accurately estimates population means and frequencies from the sample after accounting for over- or undersampling of specific groups. Weighting alone, however, leads to inappropriate population estimates of variability, because these are computed as if the measures were from the entire population rather than from a sample. The SPSS Complex Samples procedure allows inclusion of all sampling design elements: stratification, clusters, and weights. Use of national data sets allows use of extensive, expensive, and well-documented survey data for exploratory questions but limits analysis to those variables included in the data set. The large sample permits examination of multiple predictors and interactive relationships. Merging data files, availability of data in several waves of surveys, and complex sampling are techniques used to provide a representative sample but present unique challenges. With sophisticated data analysis techniques, use of these data is optimized.
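    The weighting issue described above can be made concrete: survey weights give a valid point estimate, but treating the weighted data as a simple random sample understates variance roughly by the design effect. The numpy sketch below uses a hypothetical two-group survey with an oversampled and an undersampled subgroup, and Kish's approximate design effect from unequal weighting; it illustrates only the weighting component, not the strata and clusters that SPSS Complex Samples also handles.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical survey: an oversampled subgroup (weight 0.5) and an
# undersampled subgroup (weight 2.0); y is the measured outcome.
y = np.concatenate([rng.normal(10, 2, 400), rng.normal(20, 2, 100)])
w = np.concatenate([np.full(400, 0.5), np.full(100, 2.0)])

# Weighted mean: a valid point estimate of the population mean.
wmean = np.sum(w * y) / np.sum(w)

# Kish's approximate design effect from unequal weighting: SRS-style
# variance estimates are understated by roughly this factor.
n = len(y)
deff = n * np.sum(w ** 2) / np.sum(w) ** 2
print(round(wmean, 1), round(deff, 2))
```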

  6. Spectrum simulation in DTSA-II.

    PubMed

    Ritchie, Nicholas W M

    2009-10-01

    Spectrum simulation is a useful practical and pedagogical tool. Particularly with complex samples or trace constituents, a simulation can help in understanding the limits of the technique and the instrument parameters for the optimal measurement. DTSA-II, software for electron probe microanalysis, provides both easy-to-use and flexible tools for simulating common and less common sample geometries and materials. Analytical models based on φ(ρz) curves provide quick simulations of simple samples. Monte Carlo models based on electron and X-ray transport provide more sophisticated simulations of arbitrarily complex samples. DTSA-II provides a broad range of simulation tools in a framework with many different interchangeable physical models. In addition, DTSA-II provides tools for visualizing, comparing, manipulating, and quantifying simulated and measured spectra.

  7. Raman sorting and identification of single living micro-organisms with optical tweezers

    NASA Astrophysics Data System (ADS)

    Xie, Changan; Chen, De; Li, Yong-Qing

    2005-07-01

    We report on a novel technique for sorting and identification of single biological cells and food-borne bacteria based on laser tweezers and Raman spectroscopy (LTRS). With this technique, biological cells of different physiological states in a sample chamber were identified by their Raman spectral signatures and then they were selectively manipulated into a clean collection chamber with optical tweezers through a microchannel. As an example, we sorted the live and dead yeast cells into the collection chamber and validated this with a standard staining technique. We also demonstrated that bacteria existing in spoiled foods could be discriminated from a variety of food particles based on their characteristic Raman spectra and then isolated with laser manipulation. This label-free LTRS sorting technique may find broad applications in microbiology and rapid examination of food-borne diseases.

  8. Comparative study of nail sampling techniques in onychomycosis.

    PubMed

    Shemer, Avner; Davidovici, Batya; Grunwald, Marcelo H; Trau, Henri; Amichai, Boaz

    2009-07-01

    Onychomycosis is a common problem. Obtaining accurate laboratory test results before treatment is important in clinical practice. The purpose of this study was to compare the results of the curettage and drilling techniques of nail sampling in the diagnosis of onychomycosis, and to establish the best technique and location of sampling. We evaluated 60 patients suffering from distal and lateral subungual onychomycosis using curettage and vertical and horizontal drilling sampling techniques at three different sites of the infected nail. KOH examination and fungal culture were used for detection and identification of fungal infection. At each sample site, the horizontal drilling technique had better culture sensitivity than curettage. Trichophyton rubrum was by far the most common pathogen detected by both techniques at all sampling sites. The drilling technique was found to be statistically better than curettage at each site of sampling; furthermore, vertical drilling from the proximal part of the affected nail was found to be the best procedure for nail sampling. With each technique we found that culture sensitivity improved as the sampling location became more proximal. More types of pathogens were detected in samples taken by both methods from the proximal parts of the affected nails.

  9. Analysis of the differentially expressed low molecular weight peptides in human serum via an N-terminal isotope labeling technique combining nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry.

    PubMed

    Leng, Jiapeng; Zhu, Dong; Wu, Duojiao; Zhu, Tongyu; Zhao, Ningwei; Guo, Yinlong

    2012-11-15

    Peptidomics analysis of human serum is challenging due to the low abundance of serum peptides and interference from the complex matrix. This study analyzed the differentially expressed (DE) low molecular weight peptides in human serum integrating a DMPITC-based N-terminal isotope labeling technique with nano-liquid chromatography and matrix-assisted laser desorption/ionization mass spectrometry (nano-LC/MALDI-MS). The workflow introduced a [d(6)]-4,6-dimethoxypyrimidine-2-isothiocyanate (DMPITC)-labeled mixture of aliquots from test samples as the internal standard. The spiked [d(0)]-DMPITC-labeled samples were separated by nano-LC then spotted on the MALDI target. Both quantitative and qualitative studies for serum peptides were achieved based on the isotope-labeled peaks. The DMPITC labeling technique combined with nano-LC/MALDI-MS not only minimized the errors in peptide quantitation, but also allowed convenient recognition of the labeled peptides due to the 6 Da mass difference. The data showed that the entire research procedure as well as the subsequent data analysis method were effective, reproducible, and sensitive for the analysis of DE serum peptides. This study successfully established a research model for DE serum peptides using DMPITC-based N-terminal isotope labeling and nano-LC/MALDI-MS. Application of the DMPITC-based N-terminal labeling technique is expected to provide a promising tool for the investigation of peptides in vivo, especially for the analysis of DE peptides under different biological conditions. Copyright © 2012 John Wiley & Sons, Ltd.

  10. Split Bregman multicoil accelerated reconstruction technique: A new framework for rapid reconstruction of cardiac perfusion MRI

    PubMed Central

    Kamesh Iyer, Srikant; Tasdizen, Tolga; Likhite, Devavrat; DiBella, Edward

    2016-01-01

    Purpose: Rapid reconstruction of undersampled multicoil MRI data with iterative constrained reconstruction methods is a challenge. The authors sought to develop a new substitution-based variable splitting algorithm for faster reconstruction of multicoil cardiac perfusion MRI data. Methods: The new method, the split Bregman multicoil accelerated reconstruction technique (SMART), uses a combination of split Bregman-based variable splitting and iterative reweighting techniques to achieve fast convergence. Total variation constraints are used along the spatial and temporal dimensions. The method is tested on nine ECG-gated dog perfusion datasets, acquired with a 30-ray golden ratio radial sampling pattern, and ten ungated human perfusion datasets, acquired with a 24-ray golden ratio radial sampling pattern. Image quality and reconstruction speed are evaluated and compared to a gradient descent (GD) implementation and to multicoil k-t SLR, a reconstruction technique that uses a combination of sparsity and low-rank constraints. Results: Comparisons based on a blur metric and visual inspection showed that SMART images had lower blur and better texture than the GD implementation. On average, the GD-based images had an ∼18% higher blur metric than SMART images. Reconstruction of dynamic contrast enhanced (DCE) cardiac perfusion images using the SMART method was ∼6 times faster than standard gradient descent methods. k-t SLR and SMART produced images with comparable image quality, though SMART was ∼6.8 times faster than k-t SLR. Conclusions: The SMART method is a promising approach to rapidly reconstruct good quality multicoil images from undersampled DCE cardiac perfusion data. PMID:27036592

  11. Convenient mounting method for electrical measurements of thin samples

    NASA Technical Reports Server (NTRS)

    Matus, L. G.; Summers, R. L.

    1986-01-01

    A method for mounting thin samples for electrical measurements is described. The technique is based on a vacuum chuck concept in which the vacuum chuck simultaneously holds the sample and establishes electrical contact. The mounting plate is composed of a glass-ceramic insulating material, and the surfaces of the plate and vacuum chuck are polished. The operation of the vacuum chuck is examined. The contacts on the sample and mounting plate, which are sputter-deposited through metal masks, are analyzed. The mounting method was utilized for van der Pauw measurements.

  12. Hard X-ray full field microscopy and magnifying microtomography using compound refractive lenses

    NASA Astrophysics Data System (ADS)

    Schroer, Christian G.; Günzler, Til Florian; Benner, Boris; Kuhlmann, Marion; Tümmler, Johannes; Lengeler, Bruno; Rau, Christoph; Weitkamp, Timm; Snigirev, Anatoly; Snigireva, Irina

    2001-07-01

    For hard X-rays, parabolic compound refractive lenses (PCRLs) are genuine imaging devices like glass lenses for visible light. Based on these new lenses, a hard X-ray full field microscope has been constructed that is ideally suited to image the interior of opaque samples with a minimum of sample preparation. As a result of a large depth of field, CRL micrographs are sharp projection images of most samples. To obtain 3D information about a sample, tomographic techniques are combined with magnified imaging.

  13. Absolute method of measuring magnetic susceptibility

    USGS Publications Warehouse

    Thorpe, A.; Senftle, F.E.

    1959-01-01

    An absolute method of standardization and measurement of the magnetic susceptibility of small samples is presented which can be applied to most techniques based on the Faraday method. The fact that the susceptibility is a function of the area under the curve of sample displacement versus distance of the magnet from the sample offers a simple method of measuring the susceptibility without recourse to a standard sample. Typical results on a few substances are compared with reported values, and an error of less than 2% can be achieved. © 1959 The American Institute of Physics.
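As a purely illustrative sketch of the area-under-the-curve idea (all numbers invented; the real method requires calibrated apparatus and a known proportionality constant), the integral of displacement versus magnet distance can be approximated by the trapezoidal rule:

```python
import numpy as np

# Invented displacement-vs-distance data standing in for Faraday-method
# measurements; the susceptibility chi is taken proportional to the area
# under this curve via a hypothetical calibration constant k.
distance = np.linspace(0.0, 5.0, 50)                  # magnet distance, cm
displacement = 0.3 * np.exp(-distance)                # sample displacement, mm
# Trapezoidal rule for the area under the curve.
area = np.sum((displacement[1:] + displacement[:-1]) / 2 * np.diff(distance))
k = 2.0e-6                                            # hypothetical calibration
chi = k * area
print(area, chi)
```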

  14. An improved survivability prognosis of breast cancer by using sampling and feature selection technique to solve imbalanced patient classification data.

    PubMed

    Wang, Kung-Jeng; Makond, Bunjira; Wang, Kung-Min

    2013-11-09

    Breast cancer is one of the most critical cancers and is a major cause of cancer death among women. It is essential to know the survivability of patients in order to ease decision making regarding medical treatment and financial preparation. Recently, breast cancer data sets have been imbalanced (i.e., the number of surviving patients greatly outnumbers the number of non-surviving patients), whereas standard classifiers are not applicable to imbalanced data sets. Methods to improve the survivability prognosis of breast cancer therefore need study. Two well-known five-year prognosis models/classifiers [i.e., logistic regression (LR) and decision tree (DT)] are constructed by combining the synthetic minority over-sampling technique (SMOTE), the cost-sensitive classifier technique (CSC), under-sampling, bagging, and boosting. A feature selection method is used to select relevant variables, while a pruning technique is applied to obtain low information-burden models. These methods are applied to data obtained from the Surveillance, Epidemiology, and End Results database. The improvements in survivability prognosis of breast cancer are investigated based on the experimental results. Experimental results confirm that the DT and LR models combined with SMOTE, CSC, and under-sampling consistently generate higher predictive performance than the original ones. Most of the time, DT and LR models combined with SMOTE and CSC use fewer informative features when a feature selection method and a pruning technique are applied. LR is found to have better statistical power than DT in predicting five-year survivability. CSC is superior to SMOTE, under-sampling, bagging, and boosting in improving the prognostic performance of DT and LR.
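Two of the ideas named in this abstract, cost-sensitive weighting (CSC) and under-sampling, can be sketched with scikit-learn on synthetic data. This is an illustrative stand-in, not the paper's SEER pipeline: logistic regression's `class_weight` option plays the role of CSC, and SMOTE is omitted.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score
from sklearn.model_selection import train_test_split

# Imbalanced two-class problem: ~90% majority ("survival"), ~10% minority.
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, stratify=y, random_state=0)

# Baseline: a standard classifier tends to neglect the minority class.
base = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
rec_base = recall_score(yte, base.predict(Xte))

# Cost-sensitive variant: errors on the minority class are penalized more.
csc = LogisticRegression(max_iter=1000, class_weight="balanced").fit(Xtr, ytr)
rec_csc = recall_score(yte, csc.predict(Xte))

# Random under-sampling: shrink the majority class to the minority size.
rng = np.random.default_rng(0)
maj, mino = np.flatnonzero(ytr == 0), np.flatnonzero(ytr == 1)
keep = np.concatenate([rng.choice(maj, size=mino.size, replace=False), mino])
under = LogisticRegression(max_iter=1000).fit(Xtr[keep], ytr[keep])
rec_under = recall_score(yte, under.predict(Xte))

print(f"minority recall: base={rec_base:.2f} csc={rec_csc:.2f} under={rec_under:.2f}")
```

Both corrections typically raise minority-class recall relative to the baseline, at some cost in overall accuracy.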

  15. An improved survivability prognosis of breast cancer by using sampling and feature selection technique to solve imbalanced patient classification data

    PubMed Central

    2013-01-01

    Background Breast cancer is one of the most critical cancers and is a major cause of cancer death among women. It is essential to know the survivability of patients in order to ease decision making regarding medical treatment and financial preparation. Recently, breast cancer data sets have been imbalanced (i.e., the number of surviving patients greatly outnumbers the number of non-surviving patients), whereas standard classifiers are not applicable to imbalanced data sets. Methods to improve the survivability prognosis of breast cancer therefore need study. Methods Two well-known five-year prognosis models/classifiers [i.e., logistic regression (LR) and decision tree (DT)] are constructed by combining the synthetic minority over-sampling technique (SMOTE), the cost-sensitive classifier technique (CSC), under-sampling, bagging, and boosting. A feature selection method is used to select relevant variables, while a pruning technique is applied to obtain low information-burden models. These methods are applied to data obtained from the Surveillance, Epidemiology, and End Results database. The improvements in survivability prognosis of breast cancer are investigated based on the experimental results. Results Experimental results confirm that the DT and LR models combined with SMOTE, CSC, and under-sampling consistently generate higher predictive performance than the original ones. Most of the time, DT and LR models combined with SMOTE and CSC use fewer informative features when a feature selection method and a pruning technique are applied. Conclusions LR is found to have better statistical power than DT in predicting five-year survivability. CSC is superior to SMOTE, under-sampling, bagging, and boosting in improving the prognostic performance of DT and LR. PMID:24207108

  16. Determination of renewable energy yield from mixed waste material from the use of novel image analysis methods.

    PubMed

    Wagland, S T; Dudley, R; Naftaly, M; Longhurst, P J

    2013-11-01

    Two novel techniques are presented in this study which together aim to provide a system able to determine the renewable energy potential of mixed waste materials. An image analysis tool was applied to two waste samples prepared using known quantities of source-segregated recyclable materials. The technique was used to determine the composition of the wastes, where through the use of waste component properties the biogenic content of the samples was calculated. The percentage renewable energy determined by image analysis for each sample was accurate to within 5% of the actual calculated values. Microwave-based multiple-point imaging (AutoHarvest) was used to demonstrate the ability of such a technique to determine the moisture content of mixed samples. This proof-of-concept experiment was shown to produce moisture measurements accurate to within 10%. Overall, the image analysis tool was able to determine the renewable energy potential of the mixed samples, and the AutoHarvest should enable net calorific value calculations through the provision of moisture content measurements. The proposed system is suitable for combustion facilities and enables the operator to understand the renewable energy potential of the waste prior to combustion. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. A Novel Technique for the Connection of Ceramic and Titanium Implant Components Using Glass Solder Bonding

    PubMed Central

    Mick, Enrico; Tinschert, Joachim; Mitrovic, Aurica; Bader, Rainer

    2015-01-01

    Both titanium and ceramic materials provide specific advantages in dental implant technology. However, some problems, such as hypersensitivity reactions, corrosion and mechanical failure, have been reported. Therefore, combining both materials to exploit their respective advantages while eliminating their drawbacks would be desirable. Hence, we introduce a new technique to bond titanium and ceramic materials by means of a silica-based glass ceramic solder. Cylindrical compound samples (Ø10 mm × 56 mm) made of alumina-toughened zirconia (ATZ), as well as titanium grade 5, were bonded by glass solder on their end faces. As a control, a two-component adhesive glue was utilized. The samples were investigated without further treatment, after 30 and 90 days of storage in distilled water at room temperature, and after aging. All samples were subjected to quasi-static four-point bending tests. We found that the glass solder bonding provided significantly higher bending strength than the adhesive glue bonding. In contrast to the glued samples, the bending strength of the soldered samples remained unaltered by the storage and aging treatments. Scanning electron microscopy (SEM) and energy-dispersive X-ray (EDX) analyses confirmed the presence of a stable solder-ceramic interface. Therefore, the glass solder technique represents a promising method for optimizing dental and orthopedic implant bondings. PMID:28793440

  18. EVALUATE THE UTILITY OF ENTEROCOCCI AS INDICATORS OF THE SOURCES OF FECAL CONTAMINATION IN IMPAIRED SUBWATERSHEDS THROUGH DNA-BASED MOLECULAR TECHNIQUES

    EPA Science Inventory

    Microbial source tracking (MST) is based on the assumption that specific strains of bacteria are associated with specific host species. MST methods are attractive because their application on environmental samples could help define the nature of water quality problems in impaire...

  19. The presence-absence coliform test for monitoring drinking water quality.

    PubMed Central

    Rice, E W; Geldreich, E E; Read, E J

    1989-01-01

    The concern for improved monitoring of the sanitary quality of drinking water has prompted interest in alternative methods for the detection of total coliform bacteria. A simplified qualitative presence-absence test has been proposed as an alternative procedure for detecting coliform bacteria in potable water. In this paper, data from four comparative studies were analyzed to compare the recovery of total coliform bacteria from drinking water using the presence-absence test, the multiple fermentation tube procedure, and the membrane filter technique. The four studies were of water samples taken from four different geographic areas of the United States: Hawaii, New England (Vermont and New Hampshire), Oregon, and Pennsylvania. The results of these studies were compared based upon the number of positive samples detected by each method. Combined recoveries showed that the presence-absence test detected significantly more samples with coliforms than either the fermentation tube or membrane filter method, P less than 0.01. The fermentation tube procedure detected significantly more positive samples than the membrane filter technique, P less than 0.01. Based upon the analysis of the combined data base, it is clear that the presence-absence test is as sensitive as the current coliform methods for the examination of potable water. The presence-absence test offers a viable alternative to water utility companies that elect to use the frequency-of-occurrence approach for compliance monitoring. PMID:2493663

  20. Image analysis for quantification of bacterial rock weathering.

    PubMed

    Puente, M Esther; Rodriguez-Jaramillo, M Carmen; Li, Ching Y; Bashan, Yoav

    2006-02-01

    A fast, quantitative image analysis technique was developed to assess potential rock weathering by bacteria. The technique is based on the reduction in surface area of rock particles and on counting the relative increase in the number of small particles in ground rock slurries. This was done by recording changes in ground rock samples with an electronic image analyzing process. The slurries were previously amended with three carbon sources, ground to a uniform particle size, and incubated with rock-weathering bacteria for 28 days. The technique was developed and tested using two rock-weathering bacteria, Pseudomonas putida R-20 and Azospirillum brasilense Cd, on marble, granite, apatite, quartz, limestone, and volcanic rock as substrates. The image analyzer processed a large number of particles (10(7)-10(8) per sample), so that the weathering capacity of bacteria could be detected.

  1. Kinetic quantitation of cerebral PET-FDG studies without concurrent blood sampling: statistical recovery of the arterial input function.

    PubMed

    O'Sullivan, F; Kirrane, J; Muzi, M; O'Sullivan, J N; Spence, A M; Mankoff, D A; Krohn, K A

    2010-03-01

    Kinetic quantitation of dynamic positron emission tomography (PET) studies via compartmental modeling usually requires the time-course of the radio-tracer concentration in the arterial blood as an arterial input function (AIF). For human and animal imaging applications, significant practical difficulties are associated with direct arterial sampling, and as a result there is substantial interest in alternative methods that require no blood sampling at the time of the study. A fixed population template input function derived from prior experience with directly sampled arterial curves is one possibility. Image-based extraction, including the requisite adjustment for spillover and recovery, is another approach. The present work considers a hybrid statistical approach based on a penalty formulation in which information derived from a priori studies is combined in a Bayesian manner with information contained in the sampled image data in order to obtain an input function estimate. The absolute scaling of the input is achieved by an empirical calibration equation involving the injected dose together with the subject's weight, height and gender. The technique is illustrated in the context of (18)F-fluorodeoxyglucose (FDG) PET studies in humans. A collection of 79 arterially sampled FDG blood curves is used as a basis for a priori characterization of input function variability, including scaling characteristics. Data from a series of 12 dynamic cerebral FDG PET studies in normal subjects are used to evaluate the performance of the penalty-based AIF estimation technique. The focus of the evaluations is on quantitation of FDG kinetics over a set of 10 regional brain structures. As well as the new method, a fixed population template AIF and a direct AIF estimate based on segmentation are also considered. Kinetic analyses resulting from these three AIFs are compared with those resulting from arterially sampled AIFs. The proposed penalty-based AIF extraction method is found to achieve significant improvements over the fixed template and the segmentation methods. As well as achieving acceptable kinetic parameter accuracy, the quality of fit of the region of interest (ROI) time-course data based on the extracted AIF matches results based on arterially sampled AIFs. In comparison, significant deviation in the estimation of FDG flux and degradation in ROI data fit are found with the template and segmentation methods. The proposed AIF extraction method is recommended for practical use.
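The penalty idea, shrinking a data-driven estimate toward a population prior, can be caricatured in a few lines. The curves below are entirely synthetic and the penalty is a pointwise quadratic one with a closed-form minimizer; the paper's actual model and calibration are far more elaborate.

```python
import numpy as np

# Fit an input function f to noisy image-derived data d while shrinking
# toward a population template f0, minimizing ||d - f||^2 + lam*||f - f0||^2.
t = np.linspace(0, 60, 120)
f_true = t * np.exp(-t / 8)                  # "true" AIF shape (invented)
f0 = 0.9 * t * np.exp(-t / 9)                # population template (prior mean)
rng = np.random.default_rng(1)
d = f_true + rng.normal(0, 0.2, t.size)      # noisy image-derived samples

lam = 1.0
f_hat = (d + lam * f0) / (1 + lam)           # closed-form pointwise minimizer

# The penalized estimate sits between the data and the template, trading a
# little bias for a large reduction in noise variance.
err_data_only = np.mean((d - f_true) ** 2)
err_penalized = np.mean((f_hat - f_true) ** 2)
print(err_data_only, err_penalized)
```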

  2. Glow discharge sources for atomic and molecular analyses

    NASA Astrophysics Data System (ADS)

    Storey, Andrew Patrick

    Two types of glow discharges were used and characterized for chemical analysis. The flowing atmospheric pressure afterglow (FAPA) source, based on a helium glow discharge (GD), was utilized to analyze samples with molecular mass spectrometry. A second GD, operated at reduced pressure in argon, was employed to map the elemental composition of a solid surface with novel optical detection systems, enabling new applications and perspectives for GD emission spectrometry. Like many plasma-based ambient desorption-ionization sources in use around the world, the FAPA requires a supply of helium to operate effectively. With increased pressure on the global helium supply and pricing, the use of an interrupted stream of helium for analysis was explored for vapor and solid samples. In addition to the mass spectra generated by the FAPA source, schlieren imaging and infrared thermography were employed to map the behavior of the source and its surroundings under the altered conditions. Additionally, a new annular microplasma variation of the FAPA source was developed and characterized. A spectroscopic imaging system that utilized an adjustable-tilt interference filter was used to map the elemental composition of a sample surface by glow discharge emission spectroscopy. This apparatus was compared to other GD imaging techniques for mapping elemental surface composition. The wide bandpass of the filter resulted in significant spectral interferences that could be partially overcome with chemometric data processing. Because time-resolved GD emission spectroscopy can provide fine depth-profiling measurements, a natural extension of GD imaging would be its application to three-dimensional characterization of samples. However, the simultaneous cathodic sputtering that occurs across the sample results in a sampling process that is not completely predictable. These issues are frequently encountered when laterally varied samples are explored with glow discharge imaging techniques. These insights are described with respect to their consequences for both imaging and conventional GD spectroscopic techniques.

  3. Determination of aluminium in groundwater samples by GF-AAS, ICP-AES, ICP-MS and modelling of inorganic aluminium complexes.

    PubMed

    Frankowski, Marcin; Zioła-Frankowska, Anetta; Kurzyca, Iwona; Novotný, Karel; Vaculovič, Tomas; Kanický, Viktor; Siepak, Marcin; Siepak, Jerzy

    2011-11-01

    The paper presents the results of aluminium determinations in groundwater samples of the Miocene aquifer from the area of the city of Poznań (Poland). The determined aluminium content ranged from <0.0001 to 752.7 μg L(-1). The aluminium determinations were performed using three analytical techniques: graphite furnace atomic absorption spectrometry (GF-AAS), inductively coupled plasma atomic emission spectrometry (ICP-AES) and inductively coupled plasma mass spectrometry (ICP-MS). The results of the aluminium determinations in groundwater samples for the particular analytical techniques were compared. The results were used to identify the ascent of groundwater from the Mesozoic aquifer to the Miocene aquifer in the area of the fault graben. Using the Mineql+ program, modelling of the occurrence of aluminium and of the following aluminium complexes was performed: hydroxy complexes and complexes with fluorides and sulphates. The paper presents the results of aluminium determinations in groundwater using different analytical techniques as well as the chemical modelling in the Mineql+ program, which was performed for the first time and which enabled the identification of aluminium complexes in the investigated samples. The study confirms the occurrence of aluminium hydroxy complexes and aluminium fluoride complexes in the analysed groundwater samples. Despite the dominance of sulphates and organic matter in the samples, the modelling did not indicate major participation of complexes with these ligands.

  4. Laser induced breakdown spectroscopy (LIBS) as a rapid tool for material analysis

    NASA Astrophysics Data System (ADS)

    Hussain, T.; Gondal, M. A.

    2013-06-01

    Laser induced breakdown spectroscopy (LIBS) is a novel technique for elemental analysis based on laser-generated plasma. In this technique, laser pulses are applied for ablation of the sample, resulting in the vaporization and ionization of the sample in a hot plasma, which is finally analyzed by the spectrometer. The elements are identified by their unique spectral signatures. A LIBS system was developed for elemental analysis of solid and liquid samples. The developed system was applied for qualitative as well as quantitative measurement of the elemental concentrations present in iron slag and open pit ore samples. The plasma was generated by focusing a pulsed Nd:YAG laser at 1064 nm on test samples to study the capabilities of LIBS as a rapid tool for material analysis. The concentrations of various elements of environmental significance such as cadmium, calcium, magnesium, chromium, manganese, titanium, barium, phosphorus, copper, iron, zinc, etc., in these samples were determined. Optimal experimental conditions were evaluated for improving the sensitivity of the developed LIBS system through a parametric dependence study. The LIBS results were compared with the results obtained using a standard analytical technique, inductively coupled plasma emission spectroscopy (ICP). Limits of detection (LODs) of our LIBS system were also estimated for the above-mentioned elements. This study demonstrates that LIBS could be highly appropriate for rapid online analysis of iron slag and open pit waste.

  5. [MicroRNA Target Prediction Based on Support Vector Machine Ensemble Classification Algorithm of Under-sampling Technique].

    PubMed

    Chen, Zhiru; Hong, Wenxue

    2016-02-01

    Considering the low accuracy of prediction for positive samples and the poor overall classification caused by the unbalanced sample data of microRNA (miRNA) targets, we propose a support vector machine (SVM)-integration of under-sampling and weight (IUSM) algorithm in this paper, an under-sampling method based on ensemble learning. The algorithm adopts SVM as the learning algorithm and AdaBoost as the integration framework, and embeds clustering-based under-sampling into the iterative process, aiming at reducing the degree of imbalance between positive and negative samples. Meanwhile, in the process of adaptive weight adjustment of the samples, the SVM-IUSM algorithm eliminates abnormal negative samples with a robust sample-weight smoothing mechanism so as to avoid over-learning. Finally, prediction by the integrated miRNA target classifier is achieved by combining multiple weak classifiers through a voting mechanism. Experiments revealed that SVM-IUSM, compared with other algorithms on unbalanced data sets, could not only improve the accuracy on positive targets and the overall classification effect, but also enhance the generalization ability of the miRNA target classifier.
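A rough sketch of the clustering-based under-sampling plus SVM-ensemble idea, using scikit-learn on synthetic data. This is not the authors' SVM-IUSM implementation: it omits the AdaBoost weighting and the weight-smoothing mechanism, keeping only cluster-based majority reduction, an SVM ensemble, and voting.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Imbalanced synthetic problem: class 1 is the scarce "positive target" class.
X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
maj, mino = np.flatnonzero(y == 0), np.flatnonzero(y == 1)

models = []
for seed in range(5):
    # Cluster the majority class and keep the point nearest each centroid,
    # so the reduced majority set still covers its distribution.
    km = KMeans(n_clusters=len(mino), n_init=4, random_state=seed).fit(X[maj])
    nearest = np.array([
        maj[np.argmin(np.linalg.norm(X[maj] - c, axis=1))]
        for c in km.cluster_centers_
    ])
    idx = np.concatenate([nearest, mino])
    models.append(SVC().fit(X[idx], y[idx]))

# Majority vote across the ensemble of SVMs.
votes = np.mean([m.predict(X) for m in models], axis=0)
pred = (votes >= 0.5).astype(int)
minority_recall = np.mean(pred[mino] == 1)
print("minority recall:", minority_recall)
```

Keeping one representative per cluster, rather than dropping majority points at random, is what distinguishes clustering-based under-sampling from plain random under-sampling.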

  6. Nickel speciation in several serpentine (ultramafic) topsoils via bulk synchrotron-based techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebecker, Matthew G.; Chaney, Rufus L.; Sparks, Donald L.

    2017-07-01

    Serpentine soils have elevated concentrations of trace metals including nickel, cobalt, and chromium compared to non-serpentine soils. Identifying the nickel-bearing minerals allows for prediction of the potential mobility of nickel. Synchrotron-based techniques can identify the solid-phase chemical forms of nickel with minimal sample treatment. Element concentrations are known to vary among soil particle sizes in serpentine soils. Sonication is a useful method to physically disperse sand, silt and clay particles in soils. Synchrotron-based techniques and sonication were employed to identify nickel species in discrete particle size fractions in several serpentine (ultramafic) topsoils to better understand solid-phase nickel geochemistry. Nickel commonly resided in primary serpentine parent material such as layered phyllosilicate and chain-inosilicate minerals and was associated with iron oxides. In the clay fractions, nickel was associated with iron oxides and primary serpentine minerals, such as lizardite. Linear combination fitting (LCF) was used to characterize nickel species. Total metal concentration did not correlate with nickel speciation and is not an indicator of the major nickel species in the soil. Differences in soil texture were related to different nickel speciation for several particle-size-fractionated samples. A discussion of LCF illustrates the importance of choosing standards based not only on statistical methods such as target transformation but also on sample mineralogy and particle size. Results from the F-test (Hamilton test), an underutilized tool in the literature for LCF in soils, highlight its usefulness in determining the appropriate number of standards to use for LCF. EXAFS shell fitting illustrates that the destructive interference commonly found for light and heavy elements in layered double hydroxides and in phyllosilicates can also occur in inosilicate minerals, causing similar structural features and leading to false positive results in LCF.
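Linear combination fitting as described here reduces to constrained least squares: a sample spectrum is expressed as a non-negative weighted sum of standard (reference) spectra. A toy version with synthetic "standards" (Gaussians standing in for reference spectra; the fractions and noise are invented) using SciPy's non-negative least squares:

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic reference spectra: three Gaussians on an arbitrary energy grid.
E = np.linspace(0, 10, 500)
gauss = lambda mu, s: np.exp(-((E - mu) ** 2) / (2 * s**2))
standards = np.column_stack([gauss(3, 0.8), gauss(5, 1.0), gauss(7, 0.6)])

# "Measured" sample: 60% of standard A plus 40% of standard C, plus noise.
true_frac = np.array([0.6, 0.0, 0.4])
rng = np.random.default_rng(2)
sample = standards @ true_frac + rng.normal(0, 0.01, E.size)

# Non-negative least squares recovers the component weights; normalizing
# them gives the fitted fractions reported in an LCF.
weights, resid = nnls(standards, sample)
fractions = weights / weights.sum()
print(np.round(fractions, 2))
```

The statistical question the abstract raises, how many standards to include, is exactly the question the Hamilton F-test answers: an extra standard is kept only if the drop in residual justifies the added parameter.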

  7. High-resolution correlation

    NASA Astrophysics Data System (ADS)

    Nelson, D. J.

    2007-09-01

    In the basic correlation process, a sequence of time-lag-indexed correlation coefficients is computed as the inner or dot product of segments of two signals. The time lag(s) for which the magnitude of the correlation coefficient sequence is maximized is the estimated relative time delay of the two signals. For discrete sampled signals, the delay estimated in this manner is quantized with the same relative accuracy as the clock used in sampling the signals. In addition, the correlation coefficients are real if the input signals are real. Many methods have been proposed, with some success, to estimate signal delay to better accuracy than the sample interval of the digitizer clock. These methods include interpolation of the correlation coefficients, estimation of the signal delay from the group delay function, and beam-forming techniques such as the MUSIC algorithm. For spectral estimation, techniques based on phase differentiation have been popular, but these techniques have apparently not been applied to the correlation problem. We propose a phase-based delay estimation method (PBDEM) based on the phase of the correlation function that provides a significant improvement in the accuracy of time delay estimation. In the process, the standard correlation function is first calculated. A time-lag error function is then calculated from the correlation phase and is used to interpolate the correlation function. The signal delay is shown to be accurately estimated as the zero crossing of the correlation phase near the index of the peak correlation magnitude. This process is nearly as fast as the conventional correlation function on which it is based. For real-valued signals, a simple modification is provided, which results in the same correlation accuracy as is obtained for complex-valued signals.
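The idea of recovering sub-sample delay from correlation phase can be demonstrated with the cross-power spectrum, whose phase slope is proportional to the delay. This is a generic illustration of phase-based delay estimation, not the author's exact PBDEM algorithm, and uses a noiseless synthetic signal.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1024
x = rng.normal(size=n)
true_delay = 5.3                                   # samples (non-integer)

# Apply a fractional circular delay to x in the frequency domain.
freqs = np.fft.fftfreq(n)
y = np.real(np.fft.ifft(np.fft.fft(x) * np.exp(-2j * np.pi * freqs * true_delay)))

# Cross-power spectrum Y * conj(X) has phase -2*pi*f*delay; fitting the
# unwrapped phase over a low-frequency band recovers the delay to far
# better than one-sample resolution.
cross = np.fft.fft(y) * np.conj(np.fft.fft(x))
half = slice(1, n // 4)                            # well-conditioned band
phase = np.unwrap(np.angle(cross[half]))
slope = np.polyfit(freqs[half], phase, 1)[0]
est_delay = -slope / (2 * np.pi)
print(round(est_delay, 3))
```

A plain argmax of the correlation sequence would quantize this estimate to the nearest whole sample; the phase fit removes that quantization.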

  8. Multiplexed Paper Analytical Device for Quantification of Metals using Distance-Based Detection

    PubMed Central

    Cate, David M.; Noblitt, Scott D.; Volckens, John; Henry, Charles S.

    2015-01-01

Exposure to metal-containing aerosols has been linked with adverse health outcomes for almost every organ in the human body. Commercially available techniques for quantifying particulate metals are time-intensive, laborious, and expensive; sample analysis often exceeds $100. We report a simple technique, based upon a distance-based detection motif, for quantifying metal concentrations of Ni, Cu, and Fe in airborne particulate matter using microfluidic paper-based analytical devices. Paper substrates are used to create sensors that are self-contained, self-timing, and require only a drop of sample for operation. Unlike other colorimetric approaches in paper microfluidics that rely on optical instrumentation for analysis, with distance-based detection the analyte is quantified visually, based on the distance over which a colorimetric reaction develops, similar to reading temperature on a thermometer. To demonstrate the effectiveness of this approach, Ni, Cu, and Fe were measured individually in single-channel devices; detection limits as low as 0.1, 0.1, and 0.05 µg were achieved for Ni, Cu, and Fe, respectively. Multiplexed analysis of all three metals was achieved with detection limits of 1, 5, and 1 µg for Ni, Cu, and Fe, respectively. We also extended the dynamic range for multi-analyte detection by printing concentration gradients of colorimetric reagents using an off-the-shelf inkjet printer. Analyte selectivity was demonstrated for common interferences. To demonstrate the utility of the method, Ni, Cu, and Fe were measured in samples of certified welding fume; levels measured with the paper sensors matched known values determined gravimetrically. PMID:26009988
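Distance-based readout ultimately reduces quantification to a calibration curve: the length of the developed colour bar maps back to the deposited analyte mass. A minimal sketch of such a calibration with a linear fit follows; every number below is invented for illustration (the paper reports the actual response curves):

```python
import numpy as np

# Hypothetical calibration: colour-bar length vs deposited analyte mass.
# A linear fit over the working range converts a measured distance into
# an estimated mass. Values are invented, not from the paper.
mass_ug = np.array([0.1, 0.5, 1.0, 2.0, 5.0])     # deposited mass (µg)
dist_mm = np.array([2.0, 8.5, 15.0, 27.0, 63.0])  # measured colour distance (mm)

slope, intercept = np.polyfit(mass_ug, dist_mm, 1)

def mass_from_distance(d_mm):
    """Invert the linear calibration to read off a mass from a distance."""
    return (d_mm - intercept) / slope

print(round(mass_from_distance(20.0), 2))
```

In practice the response need not be strictly linear over the full range, which is one reason the paper extends the dynamic range with printed reagent gradients rather than relying on a single fit.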

  9. Quantitation of dissolved gas content in emulsions and in blood using mass spectrometric detection

    PubMed Central

    Grimley, Everett; Turner, Nicole; Newell, Clayton; Simpkins, Cuthbert; Rodriguez, Juan

    2011-01-01

Quantitation of dissolved gases in blood or in other biological media is essential for understanding the dynamics of metabolic processes. Current detection techniques, while enabling rapid and convenient assessment of dissolved gases, provide direct information only on the partial pressure of gases dissolved in the aqueous fraction of the fluid. The more relevant quantity, known as gas content, which refers to the total amount of the gas in all fractions of the sample, can be inferred from those partial pressures, but only indirectly through mathematical modeling. Here we describe a simple mass spectrometric technique for rapid and direct quantitation of gas content for a wide range of gases. The technique is based on a mass spectrometer detector that continuously monitors gases that are rapidly extracted from samples injected into a purge vessel. The accuracy and sample processing speed of the system are demonstrated with experiments that reproduce within minutes literature values for the solubility of various gases in water. The capability of the technique is further demonstrated through accurate determination of O2 content in a lipid emulsion and in whole blood, using as little as 20 μL of sample. The approach to gas content quantitation described here should greatly expand the range of animals and conditions that may be used in studies of metabolic gas exchange, and facilitate the development of artificial oxygen carriers and resuscitation fluids. PMID:21497566
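The quantitation step in such a purge-and-detect scheme amounts to integrating the transient detector peak produced as gas is stripped from the injected sample, then scaling by a calibration factor from a standard of known content. A toy sketch of that arithmetic, with entirely invented numbers (the paper's actual detector traces and calibration are not reproduced here):

```python
import numpy as np

# Invented detector trace: a transient Gaussian peak as dissolved gas is
# purged out of the injected sample and swept into the mass spectrometer.
t = np.linspace(0.0, 60.0, 601)                        # time (s)
signal = 5.0 * np.exp(-0.5 * ((t - 15.0) / 4.0) ** 2)  # detector response (a.u.)

# Trapezoidal integration of the peak area.
area = np.sum(0.5 * (signal[1:] + signal[:-1]) * np.diff(t))

# Hypothetical calibration: a standard of known content (0.02 µmol O2)
# produced a peak area of 50.0 in the same units.
cal_factor = 0.02 / 50.0
content_umol = area * cal_factor
print(round(content_umol, 3))
```

The same area-times-calibration logic applies whatever the gas, which is why a single purge vessel and detector can cover the wide range of gases the abstract mentions.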

  10. Machine compliance in compression tests

    NASA Astrophysics Data System (ADS)

    Sousa, Pedro; Ivens, Jan; Lomov, Stepan V.

    2018-05-01

The compression behavior of a material cannot be accurately determined unless the machine compliance is accounted for prior to the measurements. This work discusses machine compliance during a compressibility test with fiberglass fabrics. The thickness variation was measured during loading and unloading cycles, with a 30-minute relaxation stage between them. The measurements were performed using an indirect technique based on comparing the displacement during a free compression cycle (without a sample) with the displacement measured with a sample. For the free test, no machine relaxation was observed during the relaxation stage. Whether relaxation is considered or not, the characteristic curves for a free compression cycle overlap precisely at most points. For the compression test with a sample, a non-physical thickness decrease of about 30 µm was observed during the relaxation stage, which can be explained by the fabric relaxing more than the machine. In addition to the technique normally used, a second technique was applied that allows a constant thickness during relaxation: the machine displacement measured without a sample is simply subtracted from the machine displacement measured with a sample and imposed as constant. If imposed as constant, the thickness remains constant during the relaxation stage and decreases suddenly after relaxation; if calculated continuously, it decreases gradually during the relaxation stage. Independently of the technique used, the final result remains unchanged. The uncertainty introduced by this imprecision is about ±15 µm.
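The compliance correction itself is a simple subtraction: the machine-only displacement from the free run, interpolated onto the loads of the test with a sample, is removed from the measured displacement before the thickness is computed. A minimal sketch with hypothetical load and displacement values (the paper's actual data are not reproduced):

```python
import numpy as np

# Free compression cycle: displacement of the machine alone vs load.
load_free = np.array([0.0, 10.0, 20.0, 40.0, 80.0])   # kPa, hypothetical
disp_free = np.array([0.0, 12.0, 21.0, 35.0, 52.0])   # µm, machine only

# Test with a fabric sample, recorded at a different set of loads.
load_test = np.array([0.0, 15.0, 30.0, 60.0, 80.0])   # kPa
disp_test = np.array([0.0, 360.0, 455.0, 540.0, 575.0])  # µm, machine + sample

h0 = 1000.0  # initial fabric stack thickness (µm), hypothetical

# Interpolate the machine-only curve onto the test loads, then subtract
# it so that only the sample's own compression remains.
machine = np.interp(load_test, load_free, disp_free)
thickness = h0 - (disp_test - machine)
print(thickness)
```

Without this correction the machine's own deflection would be misread as extra fabric compression, which is exactly the error the abstract warns about.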

  11. Improving membrane based multiplex immunoassays for semi-quantitative detection of multiple cytokines in a single sample

    PubMed Central

    2014-01-01

Background Inflammatory mediators can serve as biomarkers for monitoring disease progression or prognosis in many conditions. In the present study we introduce an adaptation of a membrane-based technique in which the levels of up to 40 cytokines and chemokines can be determined in both human and rodent blood in a semi-quantitative way. The planar assay was modified to use the LI-COR® detection system (fluorescence-based) rather than chemiluminescence, and semi-quantitative outcomes were achieved by normalizing the readouts using the automated exposure settings of the Odyssey readout device. The results were compared to the gold standard assay, namely ELISA. Results The improved planar assay allowed the detection of a considerably higher number of analytes (n = 30 and n = 5 for fluorescent and chemiluminescent detection, respectively). The improved planar method showed high sensitivity, down to 17 pg/ml, and a linear correlation of the normalized fluorescence intensity with the results from the ELISA (r = 0.91). Conclusions The results show that the membrane-based technique is a semi-quantitative assay that correlates satisfactorily with the gold standard when enhanced by the use of fluorescence and subsequent semi-quantitative analysis. This promising technique can be used to investigate inflammatory profiles in multiple conditions, particularly in studies with constraints on sample sizes and/or budget. PMID:25022797
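The normalization idea described above — dividing each membrane's spot intensities by its automated exposure setting so that membranes captured at different exposures become directly comparable — can be sketched in a few lines. The intensities and exposure values below are invented, and the real Odyssey workflow involves more than this single division:

```python
import numpy as np

# Three membranes, two cytokine spots each; the third membrane was
# captured at a much shorter automated exposure. Values are invented.
exposure_s = np.array([2.0, 2.0, 0.5])                 # auto-exposure per membrane
raw = np.array([[400.0, 900.0],
                [380.0, 880.0],
                [100.0, 230.0]])                       # raw spot intensities

# Normalize to intensity per unit exposure so membranes are comparable.
norm = raw / exposure_s[:, None]
print(norm)
```

After normalization the short-exposure membrane's spots land on the same scale as the others, which is what makes semi-quantitative comparison across membranes possible.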

  12. An aircraft measurement technique for formaldehyde and soluble carbonyl compounds

    NASA Astrophysics Data System (ADS)

    Lee, Yin-Nan; Zhou, Xianliang; Leaitch, W. Richard; Banic, Catharine M.

    1996-12-01

An aircraft technique was developed for measuring ambient concentrations of formaldehyde and a number of soluble carbonyl compounds, including glycolaldehyde, glyoxal, methylglyoxal, glyoxylic acid, and pyruvic acid. Sampling was achieved by liquid scrubbing using a glass coil scrubber in conjunction with an autosampler which collected 5-min integrated liquid samples in septum-sealed vials. Analysis was performed on the ground after flight using high-performance liquid chromatography following derivatization of the carbonyl analytes with 2,4-dinitrophenylhydrazine; the limit of detection was 0.01 to 0.02 parts per billion by volume (ppbv) in the gas phase. Although lacking a real-time capability, this technique offers the advantages of simultaneous measurement of six carbonyl compounds, savings in space and power on the aircraft, and dependable ground-based analysis. This technique was deployed on the Canadian National Research Council DHC-6 Twin Otter during the 1993 summer intensive of the North Atlantic Regional Experiment. The data obtained on August 28, 1993, during a pollutant transport episode are presented as an example of the performance and capability of this technique.

  13. Calculation of three-dimensional, inviscid, supersonic, steady flows

    NASA Technical Reports Server (NTRS)

    Moretti, G.

    1981-01-01

A detailed description of a computational program for the evaluation of three-dimensional, supersonic, inviscid, steady flow past airplanes is presented. Emphasis is placed on how a powerful, automatic mapping technique is coupled to the fluid mechanical analysis. Each of the three constituents of the analysis (body geometry, mapping technique, and gas dynamical effects) is carefully coded and described. Results of computations based on sample geometries, with discussion, are also presented.

  14. Ozone data and mission sampling analysis

    NASA Technical Reports Server (NTRS)

    Robbins, J. L.

    1980-01-01

    A methodology was developed to analyze discrete data obtained from the global distribution of ozone. Statistical analysis techniques were applied to describe the distribution of data variance in terms of empirical orthogonal functions and components of spherical harmonic models. The effects of uneven data distribution and missing data were considered. Data fill based on the autocorrelation structure of the data is described. Computer coding of the analysis techniques is included.
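The variance decomposition described above can be illustrated with a minimal empirical-orthogonal-function (EOF) analysis: remove the time mean, take the SVD of the anomaly matrix, and read off the fraction of variance each mode explains. The data matrix below is synthetic (a single sinusoidal mode plus noise), not actual ozone data:

```python
import numpy as np

# Synthetic data: 200 time steps at 4 stations, dominated by one mode.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 4.0 * np.pi, 200)
pattern = np.outer(np.sin(t), [1.0, 0.5, -0.5, -1.0])   # one dominant mode
data = pattern + 0.05 * rng.standard_normal((200, 4))

# EOF analysis: subtract the time mean, then SVD of the anomalies.
anom = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)

# Fraction of total variance captured by each EOF mode.
var_frac = s ** 2 / np.sum(s ** 2)
print(round(float(var_frac[0]), 3))
```

With real, unevenly distributed observations the anomaly matrix has gaps, which is where the autocorrelation-based data fill mentioned in the abstract comes in before a decomposition like this can be applied.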

  15. Comparative measurements of stratospheric particulate content by aircraft and ground-based lidar. [aerosol sampling and scattering data analysis

    NASA Technical Reports Server (NTRS)

    Viezee, W.; Russell, P. B.; Hake, R. D., Jr.

    1974-01-01

The matching method of lidar data analysis is explained, and the results from two flights studying the stratospheric aerosol using lidar techniques are summarized and interpreted. The results lend support to the matching method, but it is not yet apparent that the analysis technique leads to acceptable results on all nights in all seasons.

  16. Non-destructive evaluation of teeth restored with different composite resins using synchrotron based micro-imaging.

    PubMed

    Fatima, A; Kulkarni, V K; Banda, N R; Agrawal, A K; Singh, B; Sarkar, P S; Tripathi, S; Shripathi, T; Kashyap, Y; Sinha, A

    2016-01-01

This study applies high-resolution synchrotron micro-imaging to the study of microdefects in restored dental samples. The purpose was to identify and compare defects in restorations made with two different resin systems on tooth samples, using the synchrotron-based micro-imaging techniques of Phase Contrast Imaging (PCI) and micro-computed tomography (MCT). The acquired image quality was also compared with that of a routinely used RVG (radiovisiograph). Crowns of human tooth samples were fractured mechanically, involving only enamel and dentin without exposure of the pulp chamber, and were divided into two groups depending on the restorative composite material used. Group A samples were restored using a submicron hybrid composite material, and Group B samples were restored using a nano-hybrid restorative composite material. Synchrotron-based PCI and MCT were performed to visualize the tooth structure, the composite resin, and their interface. The quantitative and qualitative comparison of phase contrast and absorption contrast images, along with MCT, shows a comparatively large number of voids in the Group A samples. Quality assessment of the dental restorations using synchrotron-based micro-imaging suggests that the nano-hybrid resin restorations (Group B) are better than those of Group A.

  17. A quantitative and non-contact technique to characterise microstructural variations of skin tissues during photo-damaging process based on Mueller matrix polarimetry.

    PubMed

    Dong, Yang; He, Honghui; Sheng, Wei; Wu, Jian; Ma, Hui

    2017-10-31

    Skin tissue consists of collagen and elastic fibres, which are highly susceptible to damage when exposed to ultraviolet radiation (UVR), leading to skin aging and cancer. However, a lack of non-invasive detection methods makes determining the degree of UVR damage to skin in real time difficult. As one of the fundamental features of light, polarization can be used to develop imaging techniques capable of providing structural information about tissues. In particular, Mueller matrix polarimetry is suitable for detecting changes in collagen and elastic fibres. Here, we demonstrate a novel, quantitative, non-contact and in situ technique based on Mueller matrix polarimetry for monitoring the microstructural changes of skin tissues during UVR-induced photo-damaging. We measured the Mueller matrices of nude mouse skin samples, then analysed the transformed parameters to characterise microstructural changes during the skin photo-damaging and self-repairing processes. Comparisons between samples with and without the application of a sunscreen showed that the Mueller matrix-derived parameters are potential indicators for fibrous microstructure in skin tissues. Histological examination and Monte Carlo simulations confirmed the relationship between the Mueller matrix parameters and changes to fibrous structures. This technique paves the way for non-contact evaluation of skin structure in cosmetics and dermatological health.
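The abstract does not list the specific Mueller matrix-derived parameters it uses, but two standard scalar descriptors computed from a measured 4×4 Mueller matrix — diattenuation (from the first row) and polarizance (from the first column) — illustrate the kind of quantitative readout such polarimetry provides. The matrix below is a made-up partial diattenuator, not skin data:

```python
import numpy as np

# Illustrative measured Mueller matrix (a partial linear diattenuator);
# values are invented for demonstration, not from the skin experiments.
M = np.array([
    [1.00, 0.30, 0.00, 0.00],
    [0.30, 1.00, 0.00, 0.00],
    [0.00, 0.00, 0.95, 0.00],
    [0.00, 0.00, 0.00, 0.95],
])

# Diattenuation: dependence of transmitted intensity on input polarization.
diattenuation = np.linalg.norm(M[0, 1:]) / M[0, 0]
# Polarizance: degree of polarization produced from unpolarized input.
polarizance = np.linalg.norm(M[1:, 0]) / M[0, 0]
print(diattenuation, polarizance)
```

Parameters like these are rotation-insensitive scalars, which is what makes them usable as quantitative indicators of fibrous microstructure rather than raw matrix elements that depend on sample orientation.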

  18. Successful Treatment of Postpeak Stage Patients with Class II Division 1 Malocclusion Using Non-extraction and Multiloop Edgewise Archwire Therapy: A Report on 16 Cases

    PubMed Central

    Liu, Jun; Zou, Ling; Zhao, Zhi-he; Welburn, Neala; Yang, Pu; Tang, Tian; Li, Yu

    2009-01-01

Aim To determine cephalometrically the mechanism of the treatment effects of a non-extraction and multiloop edgewise archwire (MEAW) technique on postpeak Class II Division 1 patients. Methodology In this retrospective study, 16 postpeak Class II Division 1 patients successfully corrected using a non-extraction and MEAW technique were cephalometrically evaluated and compared with 16 matched control subjects treated using an extraction technique. Using CorelDRAW® software, standardized digital cephalograms taken pre- and post-active treatment were traced and a reference grid was set up. The superimpositions were based on the cranial base, the mandibular, and the maxillary regions, and skeletal and dental changes were measured. Changes following treatment were evaluated using the paired-sample t-test. Student's t-test for unpaired samples was used to assess the differences in changes between the MEAW and the extraction control groups. Results The correction of the molar relationships comprised 54% skeletal change (mainly the advancement of the mandible) and 46% dental change. Correction of the anterior teeth relationships comprised 30% skeletal change and 70% dental change. Conclusion The MEAW technique can produce the desired vertical and sagittal movement of the tooth segment and then effectively stimulate mandibular advancement by utilizing the residual growth potential of the condyle. PMID:20690424
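The paired-sample t-test used to evaluate pre- versus post-treatment changes reduces to a simple statistic: the mean of the per-patient differences divided by its standard error. A minimal sketch with hypothetical cephalometric angles (degrees), not data from the study:

```python
import numpy as np

# Hypothetical pre- and post-treatment measurements for six patients.
pre = np.array([102.1, 98.4, 105.0, 99.7, 101.3, 103.8])
post = np.array([99.0, 96.2, 101.5, 97.1, 100.0, 100.9])

# Paired t statistic: mean difference over its standard error.
diff = pre - post
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(len(diff)))
print(round(t_stat, 2))
```

The pairing matters: each patient serves as their own control, so between-patient variability cancels out of the differences, giving the test far more power than an unpaired comparison on the same data.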

  19. Damage Detection in Rotorcraft Composite Structures Using Thermography and Laser-Based Ultrasound

    NASA Technical Reports Server (NTRS)

    Anastasi, Robert F.; Zalameda, Joseph N.; Madaras, Eric I.

    2004-01-01

New rotorcraft structural composite designs incorporate lower structural weight, reduced manufacturing complexity, and improved threat protection. These new structural concepts require nondestructive evaluation inspection technologies that can potentially be field-portable and able to inspect complex geometries for damage or structural defects. Two candidate technologies were considered: Thermography and Laser-Based Ultrasound (Laser UT). Thermography and Laser UT have the advantage of being non-contact inspection methods, with Thermography being a full-field imaging method and Laser UT a point scanning technique. These techniques were used to inspect composite samples that contained both embedded flaws and impact damage of various sizes and shapes. Results showed that the inspection techniques were able to detect both embedded and impact damage with varying degrees of success.

  20. Co-detection: ultra-reliable nanoparticle-based electrical detection of biomolecules in the presence of large background interference.

    PubMed

    Liu, Yang; Gu, Ming; Alocilja, Evangelyn C; Chakrabartty, Shantanu

    2010-11-15

An ultra-reliable technique for detecting trace quantities of biomolecules is reported. The technique called "co-detection" exploits the non-linear redundancy amongst synthetically patterned biomolecular logic circuits for deciphering the presence or absence of target biomolecules in a sample. In this paper, we verify the "co-detection" principle on gold-nanoparticle-based conductimetric soft-logic circuits which use a silver-enhancement technique for signal amplification. Using co-detection, we have been able to demonstrate a great improvement in the reliability of detecting mouse IgG at concentration levels that are 10⁵ times lower than the concentration of rabbit IgG, which serves as background interference. Copyright © 2010 Elsevier B.V. All rights reserved.
