Yan, Xiaowen; Yang, Limin; Wang, Qiuquan
2013-07-01
Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization and usually used for elemental analysis, has unique advantages for absolute quantification of proteins through determination of an element present with a definite stoichiometry in a protein or attached to it. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based approaches for universal, selective, or targeted quantification of proteins and cells in a biological sample are assessed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
Standardless quantification by parameter optimization in electron probe microanalysis
NASA Astrophysics Data System (ADS)
Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.
2012-11-01
A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
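As a rough illustration of the parameter-optimization principle described above (not the POEMA implementation), the sketch below fits a toy spectrum model, Gaussian characteristic lines on a linear background, to a simulated measurement by least squares; the line energies, detector width and all intensities are assumptions made only for illustration.

```python
# Minimal sketch of standardless quantification by parameter optimization
# (illustrative only; not the POEMA algorithm). A toy spectrum model with
# Gaussian characteristic lines on a linear background is fitted to an
# "experimental" spectrum by least squares; the optimized line areas stand
# in for the parameters from which concentrations would be derived.
import numpy as np
from scipy.optimize import least_squares

energy = np.linspace(0.5, 10.0, 2000)          # keV, illustrative axis
line_energies = np.array([1.74, 3.69, 6.40])   # e.g. Si, Ca, Fe K-alpha (approximate)

def model(params, e):
    b0, b1 = params[:2]                        # linear background
    areas = params[2:5]                        # characteristic line intensities
    sigma = 0.06                               # fixed detector width (keV), assumed
    spec = b0 + b1 * e
    for area, e0 in zip(areas, line_energies):
        spec = spec + area * np.exp(-0.5 * ((e - e0) / sigma) ** 2)
    return spec

# Synthetic "measured" spectrum with Poisson-like counting noise
true = np.array([50.0, -2.0, 800.0, 400.0, 1200.0])
rng = np.random.default_rng(0)
measured = rng.poisson(np.clip(model(true, energy), 1, None)).astype(float)

# Minimize the quadratic difference between measurement and analytical model
fit = least_squares(lambda p: model(p, energy) - measured,
                    x0=[10.0, 0.0, 100.0, 100.0, 100.0])
print("optimized line areas:", fit.x[2:5])
```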
NASA Astrophysics Data System (ADS)
Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon
2015-10-01
An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop for the first time an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion. We validated the method and applied it to certify a candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, size-exclusion chromatography was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained an SI-traceable certification value. The quantification result obtained with the present method, based on sulfur analysis, was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be used generally for SI-traceable absolute quantification of proteins, especially pure-protein standards.
Krachler, M; Irgolic, K J
1999-11-01
The advantages accruing to biochemical and clinical investigations from a method that allows the simultaneous quantification (RSD ≤ 10%) of many elements in blood, plasma, and serum at concentrations equal to one-hundredth of the lower limits of the normal ranges are undeniable. The suitability of inductively coupled argon plasma low-resolution quadrupole mass spectrometry (ICP-MS), a simultaneous method with low detection limits, is evaluated for the quantification of inorganic constituents in whole blood, plasma, and serum with consideration of the dilution associated with the mineralization of the samples, of isobaric and polyatomic interferences, and of normal ranges. Of the 3 bulk elements, the 3 major electrolytes, the 15 essential elements, the 8 toxic elements, the 4 therapeutic elements, and the 14 elements of potential interest (a total of 47 elements), only 7 elements (Ca, Cu, K, Mg, Rb, Sr, Zn) can be simultaneously quantified under these rigorous conditions in serum, and only 8 elements (with Pb as the additional element) in whole blood. Quantification of elements in the Seronorm Standards "Whole Blood" and "Serum" showed that this list of simultaneously determinable elements in these matrices is reasonable. Although this list is disappointingly short, the number of elements determinable simultaneously by ICP-MS is still larger than that by ICP-AES or GFAAS. Improved detectors, more efficient nebulizers, avoidance of interferences, better instrument design, and high-resolution mass spectrometers promise to increase the number of elements that can be determined simultaneously.
Simple, Fast, and Sensitive Method for Quantification of Tellurite in Culture Media
Molina, Roberto C.; Burra, Radhika; Pérez-Donoso, José M.; Elías, Alex O.; Muñoz, Claudia; Montes, Rebecca A.; Chasteen, Thomas G.; Vásquez, Claudio C.
2010-01-01
A fast, simple, and reliable chemical method for tellurite quantification is described. The procedure is based on the NaBH₄-mediated reduction of TeO₃²⁻ followed by the spectrophotometric determination of elemental tellurium in solution. The method is highly reproducible, is stable at different pH values, and exhibits linearity over a broad range of tellurite concentrations. PMID:20525868
Protein Quantification by Elemental Mass Spectrometry: An Experiment for Graduate Students
ERIC Educational Resources Information Center
Schwarz, Gunnar; Ickert, Stefanie; Wegner, Nina; Nehring, Andreas; Beck, Sebastian; Tiemann, Ruediger; Linscheid, Michael W.
2014-01-01
A multiday laboratory experiment was designed to integrate inductively coupled plasma-mass spectrometry (ICP-MS) in the context of protein quantification into an advanced practical course in analytical and environmental chemistry. Graduate students were familiar with the analytical methods employed, whereas the combination of bioanalytical assays…
2016-02-01
…Flight Mass Spectrometry: Quantification of Free GB from Various Food Matrices (report ECBC-TR-1351; Sue Y. Bae and Mark D. Winemiller, Research and Technology Directorate). The report concerns methylphosphonofluoridate (sarin, GB) in various food matrices and the development of a solid-phase extraction method using a normal-phase silica gel column for…
Quantification of Methylated Selenium, Sulfur, and Arsenic in the Environment
Vriens, Bas; Ammann, Adrian A.; Hagendorfer, Harald; Lenz, Markus; Berg, Michael; Winkel, Lenny H. E.
2014-01-01
Biomethylation and volatilization of trace elements may contribute to their redistribution in the environment. However, quantification of volatile, methylated species in the environment is complicated by a lack of straightforward and field-deployable air sampling methods that preserve element speciation. This paper presents a robust and versatile gas trapping method for the simultaneous preconcentration of volatile selenium (Se), sulfur (S), and arsenic (As) species. Using HPLC-HR-ICP-MS and ESI-MS/MS analyses, we demonstrate that volatile Se and S species efficiently transform into specific non-volatile compounds during trapping, which enables the deduction of the original gaseous speciation. With minor adaptations, the presented HPLC-HR-ICP-MS method also allows for the quantification of 13 non-volatile methylated species and oxyanions of Se, S, and As in natural waters. Application of these methods in a peatland indicated that, at the selected sites, fluxes varied between 190–210 ng Se·m⁻²·d⁻¹, 90–270 ng As·m⁻²·d⁻¹, and 4–14 µg S·m⁻²·d⁻¹, and contained at least 70% methylated Se and S species. In the surface water, methylated species were particularly abundant for As (>50% of total As). Our results indicate that methylation plays a significant role in the biogeochemical cycles of these elements. PMID:25047128
A Fatigue Crack Size Evaluation Method Based on Lamb Wave Simulation and Limited Experimental Data
He, Jingjing; Ran, Yunmeng; Liu, Bin; Yang, Jinsong; Guan, Xuefei
2017-01-01
This paper presents a systematic and general method for Lamb wave-based crack size quantification using finite element simulations and Bayesian updating. The method consists of construction of a baseline quantification model using finite element simulation data and Bayesian updating with limited Lamb wave data from the target structure. The baseline model correlates two proposed damage-sensitive features, namely the normalized amplitude and phase change, with the crack length through a response surface model. The two damage-sensitive features are extracted from the first received S0-mode wave packet. The parameters of the baseline model are estimated using finite element simulation data. To account for uncertainties from numerical modeling, geometry, material and manufacturing between the baseline model and the target model, a Bayesian method is employed to update the baseline model with a few measurements acquired from the actual target structure. A rigorous validation is made using in-situ fatigue testing and Lamb wave data from coupon specimens and realistic lap-joint components. The effectiveness and accuracy of the proposed method are demonstrated under different loading and damage conditions. PMID:28902148
Zhang, Chi; Fang, Xin; Qiu, Haopu; Li, Ning
2015-01-01
Real-time PCR amplification of mitochondrial genes could not be used for DNA quantification, and amplification of single-copy DNA did not allow ideal sensitivity. Moreover, cross-reactions among similar species were commonly observed with published methods that amplify repetitive sequences, which has hindered their further application. The purpose of this study was to establish a short interspersed nuclear element (SINE)-based real-time PCR approach with high specificity for species detection that could be used for DNA quantification. After massive screening of candidate Sus scrofa SINEs, one optimal combination of primers and probe was selected, which showed no cross-reaction with other common meat species. The LOD of the method was 44 fg DNA/reaction. Further, quantification tests showed this approach was practical for DNA estimation without tissue variance. Thus, this study provides a new tool for qualitative detection of the porcine component, which could be promising for the QC of meat products.
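The quantification step of a real-time PCR assay like the one above generally relies on a standard curve of Cq versus log DNA input; the sketch below shows that generic calculation with made-up dilution-series values, not the validated porcine SINE assay itself.

```python
# Generic real-time PCR standard-curve quantification (illustrative values,
# not the validated porcine SINE assay). Cq values of a dilution series are
# regressed against log10(DNA input); an unknown is then interpolated.
import numpy as np

dna_pg = np.array([1e4, 1e3, 1e2, 1e1, 1e0])      # standard inputs, pg/reaction
cq     = np.array([18.1, 21.5, 24.9, 28.3, 31.7]) # hypothetical Cq values

slope, intercept = np.polyfit(np.log10(dna_pg), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0           # amplification efficiency

cq_unknown = 26.4
log_dna = (cq_unknown - intercept) / slope
print(f"slope={slope:.2f}, efficiency={efficiency:.1%}, "
      f"estimated input={10**log_dna:.1f} pg/reaction")
```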
Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.
Chahrour, Osama; Cobice, Diego; Malone, John
2015-09-10
Mass spectrometry-based proteomics has evolved as a promising technology over the last decade and is undergoing dramatic development in a number of different areas, such as mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA) are possible, each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS targets the fast-growing field of proteomics-related research, allowing absolute protein quantification using suitable element-based tags. This document describes the different stable isotope labelling methods, which incorporate metabolic labelling in live cells, ICP-MS based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.
PCR technology for screening and quantification of genetically modified organisms (GMOs).
Holst-Jensen, Arne; Rønning, Sissel B; Løvseth, Astrid; Berdal, Knut G
2003-04-01
Although PCR technology has obvious limitations, its potentially high degree of sensitivity and specificity explains why it has been the first choice of most analytical laboratories interested in detection of genetically modified (GM) organisms (GMOs) and derived materials. Because the products that laboratories receive for analysis are often processed and refined, the quality and quantity of the target analyte (e.g. protein or DNA) frequently challenge the sensitivity of any detection method. Among the currently available methods, PCR methods are generally accepted as the most sensitive and reliable for detection of GM-derived material in routine applications. The choice of target sequence motif is the single most important factor controlling the specificity of a PCR method. The target sequence is normally a part of the modified gene construct, for example a promoter, a terminator, a gene, or a junction between two of these elements. However, the elements may originate from wild-type organisms, they may be present in more than one GMO, and their copy number may also vary from one GMO to another. They may even be combined in a similar way in more than one GMO. Thus, the choice of method should fit the purpose. Recent developments include event-specific methods, which are particularly useful for identification and quantification of GM content. Thresholds for labelling are now in place in many countries, including those in the European Union. The success of the labelling schemes depends upon the efficiency with which GM-derived material can be detected. We present an overview of currently available PCR methods for screening and quantification of GM-derived DNA, and discuss their applicability and limitations. In addition, we discuss some of the major challenges related to determination of the limits of detection (LOD) and quantification (LOQ), and to validation of methods.
NASA Astrophysics Data System (ADS)
Wellenreuther, G.; Fittschen, U. E. A.; Achard, M. E. S.; Faust, A.; Kreplin, X.; Meyer-Klaucke, W.
2008-12-01
Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and the protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF) with special emphasis on (1) the influence of the sample support and (2) the protein/buffer system used. In the first part, a model protein was studied on well-established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.
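The internal-standard quantification that underlies this kind of TXRF analysis is commonly written as C_x = (N_x/N_IS)·(S_IS/S_x)·C_IS; the snippet below is a minimal, hedged illustration of that relation with hypothetical counts and relative sensitivities, not values from this study.

```python
# Internal-standard quantification as commonly used in TXRF (illustrative
# numbers; relative sensitivities S are instrument-specific and would be
# calibrated, e.g. against a Sc/Ga internal standard as described above).
def txrf_concentration(n_analyte, n_is, s_analyte, s_is, c_is):
    """C_x = (N_x / N_IS) * (S_IS / S_x) * C_IS  (net peak counts N,
    relative sensitivities S, known internal-standard concentration C_IS)."""
    return (n_analyte / n_is) * (s_is / s_analyte) * c_is

# Hypothetical example: Zn quantified against a Ga internal standard
c_zn = txrf_concentration(n_analyte=15200, n_is=48000,
                          s_analyte=1.10, s_is=1.00, c_is=5.0)  # mg/L
print(f"Zn concentration ~ {c_zn:.2f} mg/L")
```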
NASA Astrophysics Data System (ADS)
Schalm, O.; Janssens, K.
2003-04-01
Quantitative analysis by means of electron probe X-ray microanalysis (EPXMA) of low-Z materials such as silicate glasses can be hampered by the fact that ice or other contaminants build up on the beryllium window of the Si(Li) detector or (in the case of a windowless detector) on the Si(Li) crystal itself. These layers act as an additional absorber in front of the detector crystal, decreasing the detection efficiency at low energies (<5 keV). Since the layer thickness gradually changes with time, the detector efficiency in the low-energy region is also not constant. Using the normal ZAF approach to quantification of EPXMA data is cumbersome under these conditions, because spectra from reference materials and from unknown samples must be acquired within a fairly short period of time to avoid the effect of the change in efficiency. To avoid this problem, an alternative approach to quantification of EPXMA data is proposed, following a philosophy often employed in quantitative analysis of X-ray fluorescence (XRF) and proton-induced X-ray emission (PIXE) data. This approach is based on the (experimental) determination of thin-film element yields, rather than starting from infinitely thick, single-element calibration standards. These thin-film sensitivity coefficients can also be interpolated to allow quantification of elements for which no suitable standards are available. The change in detector efficiency can be monitored by collecting an X-ray spectrum of one multi-element glass standard. This information is used to adapt the previously determined thin-film sensitivity coefficients to the actual detector efficiency conditions valid on the day the experiments were carried out. The main advantage of this method is that spectra collected from the standards and from the unknown samples need not be acquired within a short period of time. The new approach is evaluated for glass and metal matrices and is compared with a standard ZAF method.
NASA Astrophysics Data System (ADS)
Giovanis, D. G.; Shields, M. D.
2018-07-01
This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character, discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics, and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.
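A minimal sketch of the refinement indicator only (not the full adaptive UQ scheme): vertex solutions are projected onto the Grassmann manifold with a thin SVD, and their pairwise geodesic distances, computed from principal angles, flag simplices for refinement. Dimensions, ranks and data below are assumptions for illustration.

```python
# Sketch of the refinement indicator only (not the full adaptive scheme):
# full-field snapshots at the vertices of a probability-space simplex are
# projected onto the Grassmann manifold via a thin SVD, and the pairwise
# geodesic distances (from principal angles) flag simplices for refinement.
import numpy as np

def grassmann_point(snapshot_matrix, rank):
    """Orthonormal basis of the dominant subspace (a point on Gr(rank, n))."""
    u, _, _ = np.linalg.svd(snapshot_matrix, full_matrices=False)
    return u[:, :rank]

def grassmann_distance(u1, u2):
    """Geodesic distance from the principal angles between two subspaces."""
    sigma = np.clip(np.linalg.svd(u1.T @ u2, compute_uv=False), -1.0, 1.0)
    return np.linalg.norm(np.arccos(sigma))

rng = np.random.default_rng(1)
n, rank = 5000, 4                       # full-field dimension, subspace rank (assumed)
vertices = [grassmann_point(rng.standard_normal((n, 10)), rank) for _ in range(3)]

total = sum(grassmann_distance(vertices[i], vertices[j])
            for i in range(3) for j in range(i + 1, 3))
print("total pairwise Grassmann distance of this simplex:", round(total, 3))
# A simplex with large total distance would be sub-sampled and refined.
```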
Iterative fitting method for the evaluation and quantification of PAES spectra
NASA Astrophysics Data System (ADS)
Zimnik, Samantha; Hackenberg, Mathias; Hugenschmidt, Christoph
2017-01-01
The elemental composition of surfaces is of great importance for the understanding of many surface processes such as catalysis. For a reliable analysis and a comparison of results, the quantification of the measured data is indispensable. Positron annihilation induced Auger Electron Spectroscopy (PAES) is a spectroscopic technique that measures the elemental composition with outstanding surface sensitivity, but up to now no standardized evaluation procedure for PAES spectra has been available. In this paper we present a new approach for the evaluation of PAES spectra of compounds, using the spectra obtained for the pure elements as references. The measured spectrum is fitted by a linear combination of the reference spectra, varying their intensities. Comparison of the results of the fitting routine with a calculation over the full parameter range shows excellent agreement. We present the results of the new analysis method applied to PAES spectra of sub-monolayers of Ni on a Pd substrate.
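A minimal sketch of the fitting idea, assuming synthetic reference spectra and a non-negativity constraint on the intensities (the published routine may handle backgrounds and constraints differently):

```python
# Minimal sketch of fitting a measured PAES spectrum as a non-negative
# linear combination of pure-element reference spectra (illustrative data).
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
n_channels = 400
ref_ni = np.abs(rng.normal(1.0, 0.3, n_channels))   # stand-in Ni reference spectrum
ref_pd = np.abs(rng.normal(1.0, 0.3, n_channels))   # stand-in Pd reference spectrum
references = np.column_stack([ref_ni, ref_pd])

true_weights = np.array([0.3, 0.7])                 # assumed surface fractions
measured = references @ true_weights + rng.normal(0, 0.02, n_channels)

weights, residual = nnls(references, measured)      # non-negative least squares
fractions = weights / weights.sum()
print("fitted intensity fractions (Ni, Pd):", np.round(fractions, 3))
```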
Kemeny, Steven Frank; Clyne, Alisa Morss
2011-04-01
Fiber alignment plays a critical role in the structure and function of cells and tissues. While fiber alignment quantification is important to experimental analysis and several different methods for quantifying fiber alignment exist, many studies focus on qualitative rather than quantitative analysis, perhaps due to the complexity of current fiber alignment methods. Speed and sensitivity of edge detection and the fast Fourier transform (FFT) were compared for measuring actin fiber alignment in cells exposed to shear stress. While edge detection using matrix multiplication was consistently more sensitive than FFT, image processing time was significantly longer. However, when MATLAB functions were used to implement edge detection, MATLAB's efficient element-by-element calculations and fast filtering techniques reduced the computation cost 100-fold compared to the matrix multiplication edge detection method. The new computation time was comparable to the FFT method, and MATLAB edge detection produced well-distributed fiber angle distributions that statistically distinguished aligned and unaligned fibers in half as many sample images. When the FFT sensitivity was improved by dividing images into smaller subsections, processing time grew larger than the time required for MATLAB edge detection. Implementation of edge detection in MATLAB is simpler, faster, and more sensitive than FFT for fiber alignment quantification.
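As an illustration of edge-orientation-based alignment quantification (a Python stand-in, not the authors' MATLAB implementation), the sketch below builds a gradient-weighted orientation histogram from a synthetic striped image; the threshold and bin settings are arbitrary assumptions.

```python
# Simple gradient-orientation histogram for fiber alignment (a Python
# stand-in for an edge-detection approach; not the authors' MATLAB code).
import numpy as np
from scipy import ndimage

def orientation_histogram(image, n_bins=36, grad_threshold=0.1):
    gx = ndimage.sobel(image, axis=1, mode="reflect")
    gy = ndimage.sobel(image, axis=0, mode="reflect")
    magnitude = np.hypot(gx, gy)
    # Edge orientation is perpendicular to the gradient direction
    angles = (np.degrees(np.arctan2(gy, gx)) + 90.0) % 180.0
    mask = magnitude > grad_threshold * magnitude.max()
    hist, edges = np.histogram(angles[mask], bins=n_bins, range=(0, 180),
                               weights=magnitude[mask])
    return hist / hist.sum(), edges

# Synthetic "aligned fiber" image: horizontal stripes plus noise
rng = np.random.default_rng(3)
y = np.arange(256)[:, None]
image = np.sin(2 * np.pi * y / 16.0) + 0.2 * rng.standard_normal((256, 256))
hist, edges = orientation_histogram(image)
print("dominant fiber angle (deg):", edges[np.argmax(hist)])
```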
Reduction of Solvent Effect in Reverse Phase Gradient Elution LC-ICP-MS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Patrick Allen
2005-12-17
Quantification in liquid chromatography (LC) is becoming very important as more researchers are using LC not as an analytical tool in itself but as a sample introduction system for other analytical instruments. The ability of LC instrumentation to quickly separate a wide variety of compounds makes it ideal for analysis of complex mixtures. For elemental speciation, LC is joined with inductively coupled plasma mass spectrometry (ICP-MS) to separate and detect metal-containing organic compounds in complex mixtures, such as biological samples. Often, the solvent gradients required to perform complex separations cause matrix effects within the plasma. This limits the sensitivity of the ICP-MS and the quantification methods available for use in such analyses. Traditionally, isotope dilution has been the method of choice for LC-ICP-MS quantification. The use of naturally abundant isotopes of a single element in quantification corrects for most of the effects that LC solvent gradients produce within the plasma. However, not all elements of interest in speciation studies have multiple naturally occurring isotopes, and polyatomic interferences for a given isotope can develop within the plasma, depending on the solvent matrix. This is the case for reverse-phase LC separations, where increasing amounts of organic solvent are required. For such separations, an alternative to isotope dilution for quantification is needed. To this end, a new method was developed using the Apex-Q desolvation system (ESI, Omaha, NE) to couple LC instrumentation with an ICP-MS device. The desolvation power of the system allowed greater concentrations of methanol to be introduced to the plasma before destabilization than with direct methanol injection into the plasma. Studies were performed, using simulated and actual linear methanol gradients, to find analyte-internal standard (AIS) pairs whose ratio remains consistent (deviations ±10%) over methanol concentration ranges of 5%-35% (simulated) and 8%-32% (actual). Quadrupole (low-resolution) and sector-field (high-resolution) ICP-MS instrumentation were used in these studies. Once an AIS pair is determined, quantification studies can be performed. First, an analysis is performed by adding both elements of the AIS pair post-column while performing the gradient elution without sample injection. A comparison of the ratio of the measured intensities to the atomic ratio of the two standards is used to determine a correction factor that accounts for the matrix effects caused by the mobile phase. Then, organic and/or biological molecules containing one of the two elements in the AIS pair are injected into the LC column. A gradient method is used to vary the methanol-water mixture in the mobile phase and to separate the compounds in a given sample. A standard solution of the second element of the AIS pair is added continuously post-column. By comparing the ratio of the measured intensities to the atomic ratio of the eluting compound and internal standard, the concentration of the injected compound can be determined.
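The post-column AIS correction described above reduces to simple ratio arithmetic; the sketch below illustrates it with invented numbers and is not code from the report.

```python
# Sketch of the analyte/internal-standard (AIS) ratio quantification idea
# (illustrative numbers): a correction factor from a post-column standard
# run accounts for gradient-induced matrix effects, after which the analyte
# amount follows from the measured intensity ratio.
def correction_factor(intensity_ratio_standards, atomic_ratio_standards):
    """Both elements of the AIS pair added post-column, no sample injected."""
    return atomic_ratio_standards / intensity_ratio_standards

def analyte_amount(intensity_ratio_sample, internal_std_amount, f_corr):
    """Amount of the analyte element eluting, from the corrected ratio."""
    return intensity_ratio_sample * f_corr * internal_std_amount

f = correction_factor(intensity_ratio_standards=0.85,   # measured I_A / I_IS
                      atomic_ratio_standards=1.00)      # known atomic ratio
amount = analyte_amount(intensity_ratio_sample=1.6,     # during the gradient run
                        internal_std_amount=2.0,        # nmol added post-column
                        f_corr=f)
print(f"correction factor = {f:.2f}, eluting analyte ~ {amount:.2f} nmol")
```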
Austin, Christine; Gennings, Chris; Tammimies, Kristiina; Bölte, Sven; Arora, Manish
2017-01-01
Environmental exposures to essential and toxic elements may alter health trajectories, depending on the timing, intensity, and mixture of exposures. In epidemiologic studies, these factors are typically analyzed as a function of elemental concentrations in biological matrices measured at one or more points in time. Such an approach, however, fails to account for the temporal cyclicity in the metabolism of environmental chemicals, which, if perturbed, may lead to adverse health outcomes. Here, we conceptualize and apply a non-linear method, recurrence quantification analysis (RQA), to quantify cyclical components of prenatal and early postnatal exposure profiles for elements essential to normal development, including Zn, Mn, Mg, and Ca, and elements associated with deleterious health effects or narrow tolerance ranges, including Pb, As, and Cr. We found robust evidence of cyclical patterns in the metabolic profiles of nutrient elements, which we validated against randomized twin-surrogate time-series, and further found that nutrient dynamical properties differ from those of Cr, As, and Pb. Furthermore, we extended this approach to provide a novel method of quantifying dynamic interactions between two environmental exposures. To achieve this, we used cross-recurrence quantification analysis (CRQA) and found that elemental nutrient-nutrient interactions differed from those involving toxicants. These rhythmic regulatory interactions, which we characterize in two geographically distinct cohorts, have not previously been uncovered using traditional regression-based approaches, and may provide a critical unit of analysis for environmental and dietary exposures in epidemiological studies. PMID:29112980
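A bare-bones illustration of what recurrence quantification computes, here just the recurrence rate of a one-dimensional series with a fixed radius; the embedding, normalization and CRQA extension used in the study are not reproduced, and the series are synthetic.

```python
# Minimal recurrence quantification sketch: recurrence rate of a 1-D
# time series for a fixed radius (illustrative only; the study's RQA/CRQA
# settings such as embedding and normalization are not reproduced).
import numpy as np

def recurrence_matrix(x, radius):
    d = np.abs(x[:, None] - x[None, :])      # pairwise distances (1-D series)
    return (d <= radius).astype(int)

def recurrence_rate(rec):
    n = rec.shape[0]
    off_diag = rec.sum() - n                 # exclude the trivial main diagonal
    return off_diag / (n * (n - 1))

rng = np.random.default_rng(4)
t = np.arange(200)
cyclic = np.sin(2 * np.pi * t / 20) + 0.1 * rng.standard_normal(t.size)
noise = rng.standard_normal(t.size)

for name, series in [("cyclic", cyclic), ("noise", noise)]:
    rr = recurrence_rate(recurrence_matrix(series, radius=0.2))
    print(f"{name}: recurrence rate = {rr:.3f}")
```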
Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D
2018-02-01
Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has emerged as a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distributions at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of isotope dilution (ID) for quantification by LA-ICP-MS has also been described, but so far it has mainly been useful for bulk analysis and not feasible for spatially resolved measurements. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform elemental quantification in different areas of the bio-images. For the ID experiments, ¹⁹⁴Pt-enriched platinum was used. The methodology was validated by deposition of natural Pt standard droplets with a known amount of Pt onto the surface of a control tissue, where as little as 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
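The underlying isotope-dilution mass balance can be sketched as follows; the abundances, spike composition and amounts are illustrative assumptions, and the paper's printed-ink, spatially resolved workflow involves additional steps beyond this generic equation.

```python
# Generic isotope-dilution mass balance (illustrative; not the printed-ink
# LA-ICP-MS imaging workflow itself).
# Isotope 1 = 195Pt (reference), isotope 2 = 194Pt (enriched in the spike).
def idms_moles(n_spike, a1_sample, a2_sample, a1_spike, a2_spike, r_measured):
    """Moles of Pt contributed by the sample, from the measured 195/194 blend
    ratio. Mass balance: R = (n_x*a1_x + n_y*a1_y) / (n_x*a2_x + n_y*a2_y),
    solved for n_x."""
    return n_spike * (a1_spike - r_measured * a2_spike) / (
        r_measured * a2_sample - a1_sample)

# Hypothetical values: approximate natural Pt abundances and a 95%-enriched
# 194Pt spike; amounts in picomole.
n_pt = idms_moles(n_spike=10.0,
                  a1_sample=0.338, a2_sample=0.330,   # ~natural 195Pt, 194Pt
                  a1_spike=0.03,  a2_spike=0.95,      # assumed spike composition
                  r_measured=0.45)                    # measured 195/194 ratio
print(f"Pt from sample ~ {n_pt:.2f} pmol")
```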
Dubascoux, Stéphane; Andrey, Daniel; Vigo, Mario; Kastenmayer, Peter; Poitevin, Eric
2018-09-01
Nutritional information about human milk is essential, as early human growth and development have been closely linked to the status and requirements of several macro- and micro-elements. However, methods addressing whole mineral profiling in human milk have been scarce, due in part to the technical complexity of accurately and simultaneously measuring the concentrations of micro- and macro-trace elements in low volumes of human milk. In the present study, a single-laboratory validation was performed using a "dilute and shoot" approach for the quantification of sodium (Na), magnesium (Mg), phosphorus (P), potassium (K), calcium (Ca), manganese (Mn), iron (Fe), copper (Cu), zinc (Zn), selenium (Se), molybdenum (Mo) and iodine (I) in both human milk and milk preparations. Performance in terms of limits of detection and quantification, repeatability, reproducibility and trueness was assessed and verified using various reference or certified materials. For the certified human milk sample (NIST 1953), recoveries obtained for reference or spiked values ranged from 93% to 108% (except for Mn, at 151%). This robust method, using the new ICP-MS/MS technology without high-pressure digestion, is suited both to routine, rapid analysis of human milk micro-samples (i.e. less than 250 μL) in the frame of clinical trials and to extension to the mineral profiling of milk preparations such as infant formula and adult nutritionals. Copyright © 2018 Elsevier GmbH. All rights reserved.
de Oliveira, Fernanda Ataide; de Abreu, Adriana Trópia; de Oliveira Nascimento, Nathália; Froes-Silva, Roberta Eliane Santos; Antonini, Yasmine; Nalini, Hermínio Arias; de Lena, Jorge Carvalho
2017-01-01
Bees are considered the main pollinators in natural and agricultural environments. Chemical elements in honey and pollen have been used for monitoring the environment, the health of bees and the quality of their products. Nevertheless, there are few studies on the honey and pollen of native Brazilian bees. The goal of this work was to determine important chemical elements (Sc, Y, La, Ce, Pr, Nd, Sm, Eu, Gd, Dy, Ho, Er, Tm, Lu and Yb), along with As, Bi, Cd, Pb, Se and In, in honey and pollen of native Brazilian bees, assessing analytical interferences from the matrix. An analytical method was developed for these elements by quadrupole ICP-MS. A matrix effect was observed in the honey matrix for the quantification of As, Bi and Dy, and in the pollen matrix for Bi, Cd, Ce, Gd, La, Pb and Sc. The quality of the method was considered satisfactory on the basis of the recovery rate of each element in the spiked solutions: honey matrix (91.6-103.9%) and pollen matrix (94.1-115.6%). The quantification limits of the method ranged between 0.00041 and 10.3 μg L⁻¹ for honey and 0.00041-0.095 μg L⁻¹ for pollen. The results demonstrate that the method is accurate, precise and suitable. Copyright © 2016 Elsevier B.V. All rights reserved.
Laser-induced plasma characterization through self-absorption quantification
NASA Astrophysics Data System (ADS)
Hou, JiaJia; Zhang, Lei; Zhao, Yang; Yan, Xingyu; Ma, Weiguang; Dong, Lei; Yin, Wangbao; Xiao, Liantuan; Jia, Suotang
2018-07-01
A self-absorption quantification method is proposed to quantify the degree of self-absorption of spectral lines, from which plasma characteristics including the electron temperature, elemental concentration ratios, and absolute species number densities can be deduced directly. Since no spectral intensity is involved in the calculation, the analysis results are independent of self-absorption effects, and no additional spectral efficiency calibration is required. To evaluate its practicality, the limitations of application and the precision of this method are also discussed. Experimental results for an aluminum-lithium alloy show that the proposed method is qualified to realize semi-quantitative measurements and fast diagnostics of plasma characteristics.
New approach for the quantification of processed animal proteins in feed using light microscopy.
Veys, P; Baeten, V
2010-07-01
A revision of the European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precise counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions of the correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be straightforward to apply. The results obtained were very close to the expected contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.
Sotelo, Julio; Urbina, Jesús; Valverde, Israel; Mura, Joaquín; Tejos, Cristián; Irarrazaval, Pablo; Andia, Marcelo E; Hurtado, Daniel E; Uribe, Sergio
2018-01-01
We propose a 3D finite-element method for the quantification of vorticity and helicity density from 3D cine phase-contrast (PC) MRI. By using a 3D finite-element method, we seamlessly estimate velocity gradients in 3D. The robustness and convergence were analyzed using a combined Poiseuille and Lamb-Oseen equation. A computational fluid dynamics simulation was used to compare our method with others available in the literature. Additionally, we computed 3D maps for different 3D cine PC-MRI data sets: phantoms without and with coarctation, 18 healthy volunteers, and 3 patients. We found good agreement between our method and the analytical solution of the combined Poiseuille and Lamb-Oseen equation. The computational fluid dynamics results showed that our method outperforms current approaches for estimating vorticity and helicity values. In the in silico model, we observed that for a tetrahedral element of 2 mm characteristic length, we underestimated the vorticity by less than 5% with respect to the analytical solution. In patients, we found higher values of helicity density than in healthy volunteers, associated with vortices in the lumen of the vessels. The proposed method provides entire 3D vorticity and helicity density maps, avoiding the use of reformatted 2D planes from 3D cine PC-MRI. Magn Reson Med 79:541-553, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
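For intuition, vorticity (the curl of the velocity field) and helicity density (its dot product with velocity) can be illustrated with a simple finite-difference calculation on a regular grid; this is only a conceptual stand-in for the 3D finite-element estimator proposed in the paper, with an assumed rigid-rotation velocity field.

```python
# Finite-difference illustration of vorticity and helicity density on a
# regular grid (the paper uses a 3-D finite-element estimator; this is only
# a conceptual stand-in). Velocity field: a rigid rotation about z.
import numpy as np

n = 32
x = y = z = np.linspace(-1.0, 1.0, n)
X, Y, Z = np.meshgrid(x, y, z, indexing="ij")

omega0 = 2.0                       # angular velocity, arbitrary units
u, v, w = -omega0 * Y, omega0 * X, np.zeros_like(X)

def curl(u, v, w, dx):
    du = np.gradient(u, dx, edge_order=2)   # [du/dx, du/dy, du/dz]
    dv = np.gradient(v, dx, edge_order=2)
    dw = np.gradient(w, dx, edge_order=2)
    wx = dw[1] - dv[2]
    wy = du[2] - dw[0]
    wz = dv[0] - du[1]
    return wx, wy, wz

dx = x[1] - x[0]
wx, wy, wz = curl(u, v, w, dx)
helicity_density = u * wx + v * wy + w * wz

print("mean vorticity z-component:", wz.mean())                   # expected ~ 2*omega0
print("max |helicity density|:", np.abs(helicity_density).max())  # ~0 for this flow
```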
Measuring Mass-Based Hygroscopicity of Atmospheric Particles through in situ Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piens, Dominique S.; Kelly, Stephen T.; Harder, Tristan
Quantifying how atmospheric particles interact with water vapor is critical for understanding the effects of aerosols on climate. We present a novel method to measure the mass-based hygroscopicity of particles while characterizing their elemental and carbon functional group compositions. Since mass-based hygroscopicity is insensitive to particle geometry, it is advantageous for probing the hygroscopic behavior of atmospheric particles, which can have irregular morphologies. Combining scanning electron microscopy with energy dispersive X-ray analysis (SEM/EDX), scanning transmission X-ray microscopy (STXM) analysis, and in situ STXM humidification experiments, this method was validated using laboratory-generated, atmospherically relevant particles. Then, the hygroscopicity and elemental composition of 15 complex atmospheric particles were analyzed by leveraging quantification of C, N, and O from STXM, and complementary elemental quantification from SEM/EDX. We found three types of hygroscopic responses, and correlated high hygroscopicity with Na and Cl content. The mixing state of 158 other particles from the sample broadly agreed with that of the humidified particles, indicating the potential to infer atmospheric hygroscopic behavior from a selected subset of particles. These methods offer unique quantitative capabilities to characterize and correlate the hygroscopicity and chemistry of individual submicron atmospheric particles.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Proudnikov, D.; Kirillov, E.; Chumakov, K.
2000-01-01
This paper describes the use of a new technology, hybridization with a micro-array of immobilized oligonucleotides, for detection and quantification of neurovirulent mutants in Oral Poliovirus Vaccine (OPV). We used a micro-array consisting of three-dimensional gel elements containing all possible hexamers (a total of 4096 probes). Hybridization of fluorescently labelled viral cDNA samples with such microchips resulted in a pattern of spots that was registered and quantified by a computer-linked CCD camera, so that the sequence of the original cDNA could be deduced. The method could reliably identify single point mutations, since each of them affected the fluorescence intensity of 12 micro-array elements. Micro-array hybridization of DNA mixtures with varying contents of point mutants demonstrated that the method can detect as little as 10% of revertants in a population of vaccine virus. This new technology should be useful for quality control of live viral vaccines, as well as for other applications requiring identification and quantification of point mutations.
Ortega, Richard; Devès, Guillaume; Carmona, Asunción
2009-01-01
The direct detection of biologically relevant metals in single cells, and of their speciation, is a challenging task that requires sophisticated analytical developments. The aim of this article is to present recent achievements in the field of cellular chemical element imaging and direct speciation analysis using proton and synchrotron radiation X-ray micro- and nano-analysis. Recent improvements in focusing optics for MeV-accelerated particles and keV X-rays allow application to chemical element analysis in subcellular compartments. The imaging and quantification of trace elements in single cells can be obtained using particle-induced X-ray emission (PIXE). The combination of PIXE with backscattering spectrometry and scanning transmission ion microscopy provides high accuracy in the elemental quantification of cellular organelles. On the other hand, synchrotron radiation X-ray fluorescence provides chemical element imaging with less than 100 nm spatial resolution. Moreover, synchrotron radiation offers the unique capability of spatially resolved chemical speciation using micro-X-ray absorption spectroscopy. The potential of these methods in biomedical investigations is illustrated with examples of application in the fields of cellular toxicology and pharmacology, bio-metals, and metal-based nano-particles. PMID:19605403
Hong, Huachang; Cai, Xiang; Shen, Liguo; Li, Renjie; Lin, Hongjun
2017-10-01
Quantification of interfacial interactions between two rough surfaces is one of the most pressing requirements for membrane fouling prediction and control in membrane bioreactors (MBRs). This study first constructed regularly rough membrane and particle surfaces using rigorous mathematical equations. Thereafter, a new method involving the surface element integration (SEI) method, differential geometry and the composite Simpson's rule was proposed to quantify the interfacial interactions between the two constructed rough surfaces. This new method was then applied to investigate interfacial interactions in an MBR, with the surface properties of the membrane and foulants measured experimentally. The feasibility of the new method was verified. It was found that the asperity amplitude and period of the membrane surface exerted profound effects on the total interaction. The new method has broad potential application, especially in guiding membrane surface design for membrane fouling mitigation. Copyright © 2017 Elsevier Ltd. All rights reserved.
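A hedged sketch of combining a per-element interaction with the composite Simpson's rule over a sinusoidally rough patch is shown below; the interaction kernel, amplitudes and length scales are placeholders, not the XDLVO expressions or measured surface properties used in the study.

```python
# Sketch of composite Simpson integration of a per-element interaction over
# a regularly rough surface patch (illustrative decaying kernel, not the
# interaction expressions used in the study).
import numpy as np
from scipy.integrate import simpson

# Regularly rough surface: sinusoidal asperities with amplitude A and period P
A, P = 50e-9, 500e-9                 # m, assumed asperity amplitude and period
D0 = 10e-9                           # m, nominal membrane-foulant separation
decay = 20e-9                        # m, decay length (assumed, for illustration)

n = 129                              # odd number of nodes suits Simpson's rule
xs = np.linspace(0.0, P, n)
ys = np.linspace(0.0, P, n)
X, Y = np.meshgrid(xs, ys, indexing="ij")

# Local separation follows the rough topography (surface element integration:
# each surface element contributes according to its own separation distance)
separation = D0 + A * (1 + np.sin(2 * np.pi * X / P) * np.sin(2 * np.pi * Y / P))
energy_per_area = -np.exp(-separation / decay)      # toy interaction kernel

inner = simpson(energy_per_area, x=ys, axis=1)      # integrate over y
total_energy = simpson(inner, x=xs)                 # then over x
print("total interaction over one period patch:", total_energy)
```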
Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.
Chahrour, Osama; Malone, John
2017-01-01
Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (containing ICP-MS-detectable elements) that form a covalent chemical bond with the protein, creating analogues detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities of, and the different strategies for implementing, ICP-MS in the field of quantitative proteomics. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
Perrin, Stephane; Baranski, Maciej; Froehly, Luc; Albero, Jorge; Passilly, Nicolas; Gorecki, Christophe
2015-11-01
We report a simple method, based on intensity measurements, for the characterization of the wavefront and aberrations produced by micro-optical focusing elements. This method employs the setup presented earlier in [Opt. Express 22, 13202 (2014)] for measurements of the 3D point spread function, on which a basic phase-retrieval algorithm is applied. This combination allows for retrieval of the wavefront generated by the micro-optical element and, in addition, quantification of the optical aberrations through the wavefront decomposition with Zernike polynomials. The optical setup requires only an in-motion imaging system. The technique, adapted for the optimization of micro-optical component fabrication, is demonstrated by characterizing a planoconvex microlens.
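Once a wavefront has been retrieved, its decomposition into Zernike terms is a linear least-squares problem; the sketch below fits a few low-order Noll-normalized terms to a synthetic wavefront. The phase-retrieval step from the measured 3D point spread function is not reproduced, and the coefficients and noise level are assumptions.

```python
# Least-squares decomposition of a wavefront into a few low-order Zernike
# terms (Noll-normalized). The phase-retrieval step that produces the
# wavefront from 3-D PSF intensities is not reproduced here.
import numpy as np

def zernike_basis(rho, theta):
    return np.column_stack([
        np.ones_like(rho),                                   # piston
        2 * rho * np.cos(theta),                             # tilt x
        2 * rho * np.sin(theta),                             # tilt y
        np.sqrt(3) * (2 * rho**2 - 1),                       # defocus
        np.sqrt(6) * rho**2 * np.cos(2 * theta),             # astigmatism 0/90
        np.sqrt(8) * (3 * rho**3 - 2 * rho) * np.cos(theta)  # coma x
    ])

# Sample points on the unit pupil
n = 128
xx, yy = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
mask = xx**2 + yy**2 <= 1.0
rho, theta = np.hypot(xx, yy)[mask], np.arctan2(yy, xx)[mask]

# Hypothetical retrieved wavefront: mostly defocus plus some coma (in waves)
basis = zernike_basis(rho, theta)
wavefront = basis @ np.array([0.0, 0.0, 0.0, 0.25, 0.05, 0.10])
wavefront += 0.01 * np.random.default_rng(5).standard_normal(wavefront.size)

coeffs, *_ = np.linalg.lstsq(basis, wavefront, rcond=None)
print("fitted Zernike coefficients (waves):", np.round(coeffs, 3))
```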
Quantification of multiple elements in dried blood spot samples.
Pedersen, Lise; Andersen-Ranberg, Karen; Hollergaard, Mads; Nybo, Mads
2017-08-01
Dried blood spots (DBS) are a unique matrix that offers advantages over conventional blood collection, making them increasingly popular in large population studies. We here describe the development and validation of a method to determine multiple elements in DBS. Elements were extracted from punches and analyzed using inductively coupled plasma-mass spectrometry (ICP-MS). The method was evaluated with quality controls of defined element concentration and with blood spiked with elements to assess accuracy and imprecision. DBS element concentrations were compared with concentrations in venous blood. Samples with different hematocrit were spotted onto filter paper to assess the hematocrit effect. The established method was precise and accurate for measurement of most elements in DBS. There was a significant but relatively weak correlation between measurements of the elements Mg, K, Fe, Cu, Zn, As and Se in DBS and in venous whole blood. Hematocrit influenced the DBS element measurements, especially for K, Fe and Zn. Trace elements can be measured with high accuracy and low imprecision in DBS, but the contribution of signal from the filter paper influences measurement of some elements present at low concentrations. Simultaneous measurement of K and Fe in DBS extracts may be used to estimate sample hematocrit. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
Quantification of Forecasting and Change-Point Detection Methods for Predictive Maintenance
2015-08-19
… industries to manage the service life of equipment, and also to detect precursors to the failure of components found in nuclear power plants, wind turbines … sensitive to changes related to abnormality. (Contract FA2386-14-1-4096; Grant 14IOA015, AOARD-144096. Subject terms: predictive maintenance, forecasting.)
A multilevel finite element method for Fredholm integral eigenvalue problems
NASA Astrophysics Data System (ADS)
Xie, Hehu; Zhou, Tao
2015-12-01
In this work, we propose a multigrid finite element (MFE) method for solving Fredholm integral eigenvalue problems. The main motivation for such studies is to compute the Karhunen-Loève expansions of random fields, which play an important role in applications of uncertainty quantification. In our MFE framework, solving the eigenvalue problem is converted into a series of integral iterations together with eigenvalue solves on the coarsest mesh. Then, any existing efficient integration scheme can be used for the associated integration process. Error estimates are provided, and the computational complexity is analyzed. It is noteworthy that the total computational work of our method is comparable with a single integration step on the finest mesh. Several numerical experiments are presented to validate the efficiency of the proposed numerical method.
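For orientation, the Fredholm eigenvalue problem behind a Karhunen-Loève expansion can be discretized on a single level with a Nyström/quadrature approach, as sketched below for an exponential covariance kernel; the multilevel finite-element acceleration of the paper is not reproduced, and the correlation length is an assumed example value.

```python
# Single-level Nystrom discretization of the Fredholm eigenvalue problem
# behind a Karhunen-Loeve expansion (exponential covariance on [0, 1]).
# The multigrid finite-element acceleration of the paper is not reproduced.
import numpy as np

n = 200
x, dx = np.linspace(0.0, 1.0, n, retstep=True)
corr_len = 0.2                                    # assumed correlation length

cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# Symmetrized Nystrom system W^(1/2) C W^(1/2) with trapezoidal weights
w = np.full(n, dx); w[0] = w[-1] = dx / 2
sqrt_w = np.sqrt(w)
sym = sqrt_w[:, None] * cov * sqrt_w[None, :]

eigvals, eigvecs = np.linalg.eigh(sym)
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]    # descending order
kl_modes = eigvecs / sqrt_w[:, None]                  # eigenfunctions at nodes

print("first five KL eigenvalues:", np.round(eigvals[:5], 4))
print("captured variance (5 modes):", round(eigvals[:5].sum() / eigvals.sum(), 3))
```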
NASA Astrophysics Data System (ADS)
Kump, P.; Vogel-Mikuš, K.
2018-05-01
Two fundamental-parameter (FP) based models for the quantification of 2D elemental distribution maps of intermediate-thickness biological samples by synchrotron low-energy μ-X-ray fluorescence spectrometry (SR-μ-XRF) are presented and applied to elemental analysis in experiments with monochromatic focused photon beam excitation at two low-energy X-ray fluorescence beamlines: TwinMic, Elettra Sincrotrone Trieste, Italy, and ID21, ESRF, Grenoble, France. The models treat intermediate-thickness biological samples as composed of the measured elements, which are the sources of the measurable spectral lines, and a residual matrix, which affects the measured intensities through absorption. In the first model a fixed residual matrix of the sample is assumed, while in the second model the residual matrix is obtained by iterative refinement of the elemental concentrations and an adjusted residual matrix. The absorption of the incident focused beam in the biological sample at each scanned pixel position, determined from the output of a photodiode or a CCD camera, is applied as a control in the iterative quantification procedure.
Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan
2012-01-01
This article reviews novel quantification concepts in which elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC-ICP-MS) and employed for quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. The first sections present an overview of general aspects of biomolecule quantification, as well as of labelling, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling is highlighted, and analytical as well as biomedical applications are presented. A special focus lies on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field is summarized, and a perspective for future developments, including sophisticated and innovative applications, is given. PMID:23062431
Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.
De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik
2018-01-01
The proton-induced (p,α) reaction is one type of nuclear reaction analysis (NRA) that is especially suitable for light-element quantification. For the lithium quantification presented in this work, accelerated protons with an energy of about 850 keV were used to induce the ⁷Li(p,α)⁴He reaction in standard reference and geological samples such as tourmaline and other Li minerals. It is shown that this technique for lithium quantification allows measurement of concentrations down to below one ppm. The possibility of relating the lithium content to the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, particle-induced X-ray emission (PIXE) was utilized as a complementary IBA technique for simultaneous mapping of elements heavier than sodium.
NASA Astrophysics Data System (ADS)
Wright, K. E.; Popa, K.; Pöml, P.
2018-01-01
Transmutation nuclear fuels contain weight-percent quantities of actinide elements, including Pu, Am and Np. Because of the complex spectra presented by actinide elements in electron probe microanalysis (EPMA), relatively pure actinide element standards are necessary to facilitate overlap correction and accurate quantitation. Synthesis of actinide oxide standards is complicated by their multiple oxidation states, which can result in standards that are inhomogeneous or not stable under atmospheric conditions. Synthesis of PuP4 results in a specimen that exhibits stable oxidation-reduction chemistry and is sufficiently homogeneous to serve as an EPMA standard. This approach shows promise as a method for producing viable actinide standards for microanalysis.
Optimized approaches for quantification of drug transporters in tissues and cells by MRM proteomics.
Prasad, Bhagwat; Unadkat, Jashvant D
2014-07-01
Drug transporter expression in tissues (in vivo) usually differs from that in the cell lines used to measure transporter activity (in vitro). Therefore, quantification of transporter expression in tissues and cell lines is important to develop scaling factors for in vitro-to-in vivo extrapolation (IVIVE) of transporter-mediated drug disposition. Since traditional immunoquantification methods are semiquantitative, targeted proteomics is now emerging as a superior method to quantify proteins, including membrane transporters. This superiority derives from the selectivity, precision, accuracy, and speed of analysis by liquid chromatography tandem mass spectrometry (LC-MS/MS) in multiple reaction monitoring (MRM) mode. Moreover, LC-MS/MS proteomics has broader applicability because it does not require selective antibodies for individual proteins. A number of recent research and review papers discuss the use of LC-MS/MS for transporter quantification. Here, we have compiled from the literature the various elements of MRM proteomics to provide a comprehensive, systematic strategy to quantify drug transporters. This review emphasizes practical aspects and challenges in surrogate peptide selection, peptide qualification, peptide synthesis and characterization, membrane protein isolation, protein digestion, sample preparation, LC-MS/MS parameter optimization, method validation, and sample analysis. In particular, bioinformatic tools used in method development and sample analysis are discussed in detail. Various pre-analytical and analytical sources of variability that should be considered during transporter quantification are highlighted. All these steps are illustrated using P-glycoprotein (P-gp) as a case example. Greater use of quantitative transporter proteomics will lead to a better understanding of the role of drug transporters in drug disposition.
NASA Astrophysics Data System (ADS)
Furger, Markus; Cruz Minguillón, María; Yadav, Varun; Slowik, Jay G.; Hüglin, Christoph; Fröhlich, Roman; Petterson, Krag; Baltensperger, Urs; Prévôt, André S. H.
2017-06-01
The Xact 625 Ambient Metals Monitor was tested during a 3-week field campaign at the rural, traffic-influenced site Härkingen in Switzerland during the summer of 2015. The field campaign encompassed the Swiss National Day fireworks event, providing increased concentrations and unique chemical signatures compared to non-fireworks (or background) periods. The objective was to evaluate the data quality by intercomparison with other independent measurements and to test its applicability for aerosol source quantification. The Xact was configured to measure 24 elements in PM10 with 1 h time resolution. Data quality was evaluated for ten 24 h averages of Xact data by intercomparison with 24 h PM10 filter data analysed by ICP-OES for major elements, ICP-MS for trace elements, and gold amalgamation atomic absorption spectrometry for Hg. Ten elements (S, K, Ca, Ti, Mn, Fe, Cu, Zn, Ba, Pb) showed excellent correlation between the compared methods, with r2 values ≥ 0.95. However, the slopes of the regressions between Xact 625 and ICP data varied from 0.97 to 1.8 (average 1.28), indicating generally higher Xact elemental concentrations than ICP for these elements. Possible reasons for these differences are discussed, but further investigations are needed. For the remaining elements no conclusions could be drawn about their quantification, mainly because of detection limit issues. An indirect intercomparison of hourly values was performed for the fireworks peak, which showed good agreement of total masses when the Xact data were corrected with the regressions from the 24 h intercomparison. The results demonstrate that the high-time-resolution multi-metal characterization capability of the Xact makes it a valuable and practical tool for ambient monitoring.
[Determination of multi-element contents in gypsum by ICP-AES].
Guo, Zhong-bao; Bai, Yong-zhi; Cui, Jin-hua; Mei, Yi-fei; Ma, Zhen-zhu
2014-08-01
The contents of multiple elements in gypsum were determined by ICP-AES. The samples were pretreated by an acid-soluble method or an alkali-fusion method. The acid-soluble method is suitable for the determination of CaO, SO3, Al2O3, Fe2O3, MgO, K2O, Na2O, TiO2, P2O5, MnO, SrO and BaO. The alkali-fusion method is suitable for the determination of CaO, SO3, SiO2, Al2O3, Fe2O3, MgO, TiO2, P2O5, MnO, SrO, BaO and B2O3. Different series of standard solutions were prepared according to the properties and contents of the elements and the solution matrix. The limits of detection and quantification were determined for each element at its optimum analytical spectral line. The recoveries of the two pretreatment methods were from 93% to 110%, except for TiO2, which was 81%-87% when pretreated by the acid-soluble method. All RSDs (n=6) of the tests were 0.70%-3.42%. The accuracies for CaO and SO3 by the ICP-AES method were lower than those of the chemical analysis method, so determination of CaO and SO3 by ICP-AES is only suitable when accuracy requirements are low. The results showed that the method can be used for the determination of multi-element contents in gypsum, with simple operation, fast analysis and reliable results. All of the elements can be analysed using both the acid-soluble and the alkali-fusion methods.
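The calculation of limits of detection and quantification from blank measurements and a calibration slope can be illustrated with a brief sketch; the blank intensities and slope are purely illustrative, and the abstract does not state which convention the authors used, so the common 3σ/10σ rule is assumed here.

```python
# Minimal sketch (assumed 3-sigma / 10-sigma convention): limits of detection and
# quantification from replicate blank measurements and an external calibration slope.
import numpy as np

def lod_loq(blank_signals, slope):
    """LOD = 3*sd(blank)/slope, LOQ = 10*sd(blank)/slope."""
    sd_blank = np.std(blank_signals, ddof=1)
    return 3.0 * sd_blank / slope, 10.0 * sd_blank / slope

blanks = [0.8, 1.1, 0.9, 1.0, 1.2, 0.95]      # illustrative blank intensities
lod, loq = lod_loq(blanks, slope=1.5e3)        # slope in counts per (mg/L), illustrative
print(f"LOD = {lod:.2e} mg/L, LOQ = {loq:.2e} mg/L")
```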
Krystek, Petra; Bäuerlein, Patrick S; Kooij, Pascal J F
2015-03-15
For pharmaceutical applications, the use of inorganic engineered nanoparticles (ENPs) is of growing interest, with silver (Ag) and gold (Au) being the most relevant elements. A few methods have been developed recently, but their validation and application testing were quite limited. Therefore, a routinely applicable multi-element method for the identification of nanoparticles of different sizes below 100 nm and of their elemental composition was developed by applying asymmetric flow field-flow fractionation (AF4) coupled to inductively coupled plasma mass spectrometry (ICPMS). A complete validation model for the quantification of releasable, pharmaceutically relevant inorganic nanoparticles based on Ag and Au is presented for the most relevant aqueous matrices, tap water and domestic waste water. The samples originated from locations in the Netherlands, and it is of great interest to study the unwanted presence of Ag and Au as nanoparticle residues because of possible health and environmental risks. During method development, instability effects were observed for 60 nm and 70 nm Ag ENPs with different capping agents. These effects were studied more closely in relation to matrix effects. Besides the methodological aspects, the analytical results and relevant performance characteristics (e.g. measuring range, limit of detection, repeatability, reproducibility, trueness, and expanded uncertainty of measurement) are determined and discussed. For the chosen aqueous matrices, the performance characteristics are significantly better for Au ENPs than for Ag ENPs; e.g. repeatability and reproducibility are below 10% for all Au ENPs, compared with up to 27% repeatability for the larger Ag ENPs. The method is a promising tool for the simultaneous determination of releasable, pharmaceutically relevant inorganic nanoparticles. Copyright © 2014 Elsevier B.V. All rights reserved.
Malucelli, Emil; Procopio, Alessandra; Fratini, Michela; Gianoncelli, Alessandra; Notargiacomo, Andrea; Merolle, Lucia; Sargenti, Azzurra; Castiglioni, Sara; Cappadone, Concettina; Farruggia, Giovanna; Lombardo, Marco; Lagomarsino, Stefano; Maier, Jeanette A; Iotti, Stefano
2018-01-01
The quantification of elemental concentrations in cells is usually performed by analytical assays on large populations, missing peculiar but important rare cells. The present article compares elemental quantification in single cells and in cell populations for three different cell types, using a new approach for single-cell elemental analysis performed at the sub-micrometer scale that combines X-ray fluorescence microscopy and atomic force microscopy. Attention is focused on the light element Mg, exploiting the opportunity to compare single-cell quantification with cell-population analysis carried out by a highly Mg-selective fluorescent chemosensor. The results show that the single-cell analysis reveals the same Mg differences found in large populations of the different cell strains studied. However, in one of the cell strains, single-cell analysis reveals two cells with an exceptionally high intracellular Mg content compared with the other cells of the same strain. The single-cell analysis allows mapping of Mg and other light elements in whole cells at the sub-micrometer scale. A detailed intensity correlation analysis on the two cells with the highest Mg content reveals that their Mg subcellular localization correlates with oxygen in a different fashion with respect to the other sister cells of the same strain. Graphical abstract: single cells or large population analysis, this is the question!
Baeten; Bruggeman; Paepen; Carchon
2000-03-01
The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.
Linscheid, Michael W
2018-03-30
To understand biological processes, not only reliable identification but also quantification of their constituents plays a pivotal role. This is especially true for the proteome: protein quantification must follow protein identification, since sometimes minute changes in abundance tell the real tale. To obtain quantitative data, many sophisticated strategies using electrospray and MALDI mass spectrometry (MS) have been developed in recent years. All of them have advantages and limitations. Several years ago, we started to work on strategies that are in principle capable of overcoming some of these limits. The fundamental idea is to use elemental signals as a measure of quantity. We began by replacing the radioactive 32P with the "cold" natural 31P to quantify modified nucleotides and phosphorylated peptides and proteins, and later used tagging strategies for the more general quantification of proteins. To do this, we introduced inductively coupled plasma mass spectrometry (ICP-MS) into bioanalytical workflows, allowing not only reliable and sensitive detection but also quantification based on isotope dilution, i.e. absolute measurements using poly-isotopic elements. The detection capability of ICP-MS becomes particularly attractive with heavy metals. The covalently bound protein tags developed in our group are based on the well-known DOTA chelate complex (1,4,7,10-tetraazacyclododecane-N,N',N″,N‴-tetraacetic acid) carrying lanthanoide ions as the metal core. In this review, I outline the development of this mutual assistance between molecular and elemental mass spectrometry and discuss the scope and limitations, particularly for peptide and protein quantification. The lanthanoide tags provide low detection limits and also offer multiplexing capabilities owing to the number of very similar lanthanoides and their isotopes. With isotope dilution comes previously unknown accuracy. Separation techniques such as electrophoresis and HPLC were used with only slightly adapted workflows already in use for quantification in bioanalysis. Imaging mass spectrometry (MSI) with MALDI and laser ablation ICP-MS has complemented the range of applications in recent years. © 2018 Wiley Periodicals, Inc.
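As a brief illustration of the isotope dilution principle referred to above, the sketch below solves the classical two-isotope blend equation for the amount of analyte element; the abundances and amounts are invented for illustration and do not reproduce any workflow from the review.

```python
# Minimal sketch (not the authors' exact workflow): single isotope dilution for a
# poly-isotopic element, solving for the amount of analyte element in the sample
# from the isotope ratio measured in the sample/spike blend.

def isotope_dilution(n_spike, ab_sample, ab_spike, r_blend):
    """
    n_spike   : moles of element added as enriched spike
    ab_sample : (abundance of isotope A, abundance of isotope B) in the sample
    ab_spike  : (abundance of isotope A, abundance of isotope B) in the spike
    r_blend   : measured A/B isotope ratio in the blend
    Returns moles of element originating from the sample.
    """
    aA_x, aB_x = ab_sample
    aA_s, aB_s = ab_spike
    return n_spike * (aA_s - r_blend * aB_s) / (r_blend * aB_x - aA_x)

# Illustrative numbers only (two-isotope element, spike enriched in isotope B)
print(isotope_dilution(n_spike=1.0e-9, ab_sample=(0.75, 0.25),
                       ab_spike=(0.02, 0.98), r_blend=1.2))
```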
Serum protein measurement using a tapered fluorescent fibre-optic evanescent wave-based biosensor
NASA Astrophysics Data System (ADS)
Preejith, P. V.; Lim, C. S.; Chia, T. F.
2006-12-01
A novel method to measure the total serum protein concentration is described in this paper. The method is based on the principles of fibre-optic evanescent wave spectroscopy. The biosensor applies a fluorescent dye-immobilized porous glass coating on a multi-mode optical fibre. The evanescent wave's intensity at the fibre-optic core-cladding interface is used to monitor the protein-induced changes in the sensor element. The sensor offers a rapid, single-step method for quantifying protein concentrations without destroying the sample. This unique sensing method presents a sensitive and accurate platform for the quantification of protein.
Li, Jing; Xie, Jianming; Yu, Jihua; Lv, Jian; Zhang, Junfeng; Wang, Xiaolong; Wang, Cheng; Tang, Chaonan; Zhang, Yingchun; Dawuda, Mohammed Mujitaba; Zhu, Daiqiang; Ma, Guoli
2017-09-27
Carotenoids are considered to be crucial compounds in many fields and, furthermore, a significant factor in pepper leaves under low light and chilling temperatures. However, little of the literature has focused on methods to extract and determine the contents of individual carotenoids in pepper leaves. Therefore, a time-saving and highly sensitive reversed-phase high-performance liquid chromatography method for the separation and quantification of 10 carotenoids was developed, and an optimized procedure for extracting carotenoids from pepper leaves was established for the first time. In the final method, the six xanthophylls eluted at about 9-26 min, whereas the four carotenes showed higher retention times of nearly 28-40 min, which significantly shortened analysis time and improved efficiency. For saponification, we suggest that 8 mL of 20% KOH-methanol solution be added and the reaction performed at 60 °C for 30 min. The solid-to-liquid ratio was 1:8, and the ultrasound-assisted extraction time was 40 min.
Kassler, Alexander; Pittenauer, Ernst; Doerr, Nicole; Allmaier, Guenter
2014-01-15
For the qualification and quantification of antioxidants (aromatic amines and sterically hindered phenols), most of them applied as lubricant additives, two ultrahigh-performance liquid chromatography (UHPLC) electrospray ionization mass spectrometric methods using the positive and negative ion modes have been developed for lubricant design and engineering, thus allowing, e.g., the study of the degradation of lubricants. Based on the different chemical properties of the two groups of antioxidants, two methods offering a fast separation (10 min) without prior derivatization were developed. To meet these requirements, UHPLC was coupled with an LTQ Orbitrap hybrid tandem mass spectrometer with positive and negative ion electrospray ionization for simultaneous detection of spectra from UHPLC-high-resolution (HR)-MS (full scan mode) and UHPLC-low-resolution linear ion trap MS(2) (LITMS(2)), which we term UHPLC/HRMS-LITMS(2). All 20 analytes investigated could be qualified by the UHPLC/HRMS-LITMS(2) approach, consisting of simultaneous UHPLC/HRMS (elemental composition) and UHPLC/LITMS(2) (diagnostic product ions), according to EC guidelines. Quantification was based on the UHPLC/LITMS(2) approach because of its increased sensitivity and selectivity compared with UHPLC/HRMS. Absolute quantification was only feasible for seven analytes with well-specified purity of references, whereas relative quantification was obtainable for another nine antioxidants. All of them showed good standard deviations and repeatability. The combined methods allow qualitative and quantitative determination of a wide variety of antioxidants, including aminic/phenolic compounds applied in lubricant engineering. These data show that the developed methods will be versatile tools for further research on identification and characterization of the thermo-oxidative degradation products of antioxidants in lubricants. Copyright © 2013 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Chen, Z.; Jones, C. M.
2002-05-01
Microchemistry of fish otoliths (fish ear bones) is a very useful tool for monitoring aquatic environments and fish migration. However, determination of the elemental composition of fish otoliths by ICP-MS has been limited either to analysis of dissolved sample solutions or to measurement of a limited number of trace elements by laser ablation (LA)-ICP-MS, owing to low sensitivity, the lack of available calibration standards, and the complexity of polyatomic molecular interferences. In this study, a method was developed for in situ determination of trace elements in fish otoliths by laser ablation with a double-focusing sector-field, ultra-high-sensitivity Finnigan Element 2 ICP-MS, using a solution standard-addition calibration method. Because of the lack of matrix-matched solid calibration standards, sixteen trace elements (Na, Mg, P, Cr, Mn, Fe, Ni, Cu, Rb, Sr, Y, Cd, La, Ba, Pb and U) were determined using solution standard calibration with Ca as an internal standard. Flexibility, easy preparation and stable signals are the advantages of using solution calibration standards. In order to resolve polyatomic molecular interferences, medium resolution (M/ΔM > 4000) was used for some elements (Na, Mg, P, Cr, Mn, Fe, Ni, and Cu). Both external calibration and standard-addition quantification strategies are compared and discussed. Precision, accuracy, and limits of detection are presented.
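A minimal sketch of the internal-standard idea described above (normalizing analyte counts to Ca and converting through an assumed Ca content of the otolith matrix) is given below; all intensities, concentrations and the assumed Ca values are illustrative, not the authors' calibration data.

```python
# Minimal sketch (assumptions noted in comments): internal-standard calibration for
# LA-ICP-MS, normalizing analyte counts to a Ca isotope and converting the calibrated
# ratio to concentration using an assumed Ca content of the aragonitic otolith.
import numpy as np

# Solution standards: known Sr concentrations (ug/g) at a fixed Ca level
std_conc_sr = np.array([0.0, 1.0, 5.0, 10.0])           # illustrative
std_ratio   = np.array([0.001, 0.021, 0.101, 0.200])    # I(Sr)/I(Ca), illustrative
slope, intercept = np.polyfit(std_conc_sr, std_ratio, 1)

# Otolith measurement
sample_ratio = 0.085        # I(Sr)/I(Ca) from ablation, illustrative
ca_otolith   = 38.0e4       # ug/g Ca assumed for a CaCO3 matrix
ca_standard  = 10.0e4       # ug/g Ca in the solution standards, illustrative
sr_otolith = (sample_ratio - intercept) / slope * (ca_otolith / ca_standard)
print(f"Sr in otolith ~ {sr_otolith:.1f} ug/g (illustrative)")
```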
Quantitative elemental imaging of heterogeneous catalysts using laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Trichard, F.; Sorbier, L.; Moncayo, S.; Blouët, Y.; Lienemann, C.-P.; Motto-Ros, V.
2017-07-01
Currently, the use of catalysis is widespread in almost all industrial processes; it improves productivity, synthesis yields and waste treatment, and decreases energy costs. Increasingly stringent requirements, in terms of reaction selectivity and environmental standards, impose progressively greater accuracy and control of operations. Meanwhile, the development of characterization techniques has been challenging, and the techniques often require highly complex equipment. In this paper, we demonstrate a novel elemental approach for performing quantitative, spatially resolved analysis with ppm-scale quantification limits and μm-scale resolution. This approach, based on laser-induced breakdown spectroscopy (LIBS), is distinguished by its simplicity, all-optical design, and speed of operation. This work analyzes palladium-based porous alumina catalysts, which are commonly used in the selective hydrogenation process, using the LIBS method. We report an exhaustive study of the quantification capability of LIBS and its ability to perform imaging measurements over a large dynamic range, typically from a few ppm to wt%. These results offer new insight into the use of LIBS-based imaging in industry and pave the way for innumerable applications.
Chakraborty, Somsubhra; Weindorf, David C; Li, Bin; Ali Aldabaa, Abdalsamad Abdalsatar; Ghosh, Rakesh Kumar; Paul, Sathi; Nasim Ali, Md
2015-05-01
Using 108 petroleum-contaminated soil samples, this pilot study proposed a new analytical approach combining visible near-infrared diffuse reflectance spectroscopy (VisNIR DRS) and portable X-ray fluorescence spectrometry (PXRF) for rapid and improved quantification of soil petroleum contamination. Results indicated that an advanced fused model, in which VisNIR DRS spectra-based penalized spline regression (PSR) was used to predict total petroleum hydrocarbon and PXRF elemental data-based random forest regression was used to model the PSR residuals, outperformed (R(2)=0.78, residual prediction deviation (RPD)=2.19) all other models tested, even producing better generalization than VisNIR DRS alone (RPDs of 1.64, 1.86, and 1.96 for random forest, penalized spline regression, and partial least squares regression, respectively). Additionally, unsupervised principal component analysis using the PXRF+VisNIR DRS system qualitatively separated contaminated soils from control samples. Fusion of PXRF elemental data and VisNIR derivative spectra produced an optimized model for total petroleum hydrocarbon quantification in soils. Copyright © 2015 Elsevier B.V. All rights reserved.
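The residual-fusion idea can be sketched as follows; Ridge regression stands in for the penalized spline regression step, and all data are synthetic, so this is only a schematic of the modelling strategy rather than the authors' implementation.

```python
# Minimal sketch of the residual-fusion idea (not the authors' exact models): a first
# model on VisNIR spectra predicts total petroleum hydrocarbon (TPH), and a random
# forest on PXRF elemental data models the residuals of that first model.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
X_visnir = rng.normal(size=(108, 200))   # 108 samples x 200 spectral variables (synthetic)
X_pxrf   = rng.normal(size=(108, 15))    # 15 PXRF elements (synthetic)
y_tph    = rng.normal(size=108)          # TPH values (synthetic)

base = Ridge(alpha=1.0).fit(X_visnir, y_tph)            # stand-in for PSR on spectra
resid = y_tph - base.predict(X_visnir)
rf = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_pxrf, resid)

y_hat = base.predict(X_visnir) + rf.predict(X_pxrf)     # fused prediction
print(f"Fused R^2 on training data: {1 - np.var(y_tph - y_hat)/np.var(y_tph):.2f}")
```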
New methods for image collection and analysis in scanning Auger microscopy
NASA Technical Reports Server (NTRS)
Browning, R.
1985-01-01
While scanning Auger micrographs are used extensively for illustrating the stoichiometry of complex surfaces and for indicating areas of interest for fine-point Auger spectroscopy, there are many problems in the quantification and analysis of Auger images. These problems include multiple contrast mechanisms and the lack of meaningful relationships with other Auger data. Collection of multielemental Auger images allows some new approaches to image analysis and presentation. Information about the distribution and quantity of elemental combinations at a surface is retrievable, and particular combinations of elements, such as alloy phases, can be imaged. Results from the precipitate-hardened alloy Al-2124 illustrate multispectral Auger imaging.
Determination of alloy content from plume spectral measurements
NASA Technical Reports Server (NTRS)
Madzsar, George C.
1991-01-01
The mathematical derivation of a method to determine the identities and amounts of alloys present in a flame in which numerous alloys may be present is described. The method is applicable if the total number of elemental species from all alloys that may be in the flame is greater than or equal to the total number of alloys. Arranging the atomic spectral line emission equations for the elemental species as a series of simultaneous equations enables solution for the identities and amounts of the alloys present in the flame. This technique is intended for identification and quantification of alloy content in the plume of a rocket engine. Spectroscopic measurements reveal the atomic species entrained in the plume. Identification of eroding alloys may lead to the identification of the eroding component.
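A minimal sketch of the linear-algebra step described above is given below; the alloy compositions and measured elemental amounts are hypothetical, and a least-squares solution is used since the system may be overdetermined.

```python
# Minimal sketch (illustrative numbers): if each alloy has a known elemental make-up,
# the measured atomic-line intensities (converted to elemental amounts) form a set of
# simultaneous linear equations that can be solved for the amount of each alloy.
import numpy as np

# Rows = elemental species observed in the plume, columns = candidate alloys.
# Entries are the mass fraction of each element in each alloy (hypothetical values).
F = np.array([[0.70, 0.00],    # Fe in alloy A, alloy B
              [0.18, 0.10],    # Cr
              [0.08, 0.65],    # Ni
              [0.04, 0.25]])   # Cu
measured = np.array([0.35, 0.12, 0.37, 0.15])   # elemental amounts from line emission

alloy_amounts, *_ = np.linalg.lstsq(F, measured, rcond=None)
print(dict(zip(["alloy A", "alloy B"], alloy_amounts.round(3))))
```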
Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J
2017-07-01
A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a developed multi-level stochastic upscaling process that propagates the quantified uncertainties at strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from the micro-CT scan images of lattice structures which are fabricated by selective laser melting. The upscaling method facilitates the process of determining homogenized strut properties to reduce the computational cost of the detailed simulation model for the scaffold. Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that can minimize the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method used at the strut level and lattice structure level. In comparison with physical compression test results, the proposed methodology of linking the uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.
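As an illustration of the BIC-based selection of parametric distributions mentioned above, the sketch below fits several candidate distributions to synthetic strut-property data; the candidate set and the data are assumptions, not the authors' models.

```python
# Minimal sketch (not the authors' implementation): choosing a parametric distribution
# for a strut-level property (e.g. an effective elastic modulus) by the Bayesian
# Information Criterion, BIC = k*ln(n) - 2*ln(L).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
data = rng.lognormal(mean=0.0, sigma=0.2, size=200)    # synthetic strut moduli

candidates = {"normal": stats.norm, "lognormal": stats.lognorm, "weibull": stats.weibull_min}
for name, dist in candidates.items():
    params = dist.fit(data)                            # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(data, *params))
    bic = len(params) * np.log(len(data)) - 2.0 * loglik
    print(f"{name:10s} BIC = {bic:.1f}")               # lower BIC = preferred model
```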
Bouby, M; Geckeis, H; Geyer, F W
2008-12-01
A straightforward quantification method is presented for the application of asymmetric flow field-flow fractionation (AsFlFFF) combined with inductively coupled plasma mass spectrometry (ICPMS) to the characterization of colloid-borne metal ions and nanoparticles. The reproducibility of the size calibration and the recovery of elements are examined. Channel flow fluctuations are observed, notably after initiation of the fractionation procedure. Their impact on quantification is accounted for by using (103)Rh as an internal reference. Intensity ratios of the various elements relative to Rh are calculated for each data point. These ratios turned out to be independent of the metal concentration and of the total sample solution flow introduced into the nebulizer within a range of 0.4-1.2 mL min(-1). The method is applied to study the interaction of Eu, U(VI) and Th with a mixture of humic acid and clay colloids and to the characterization of synthetic nanoparticles, namely CdSe/ZnS-MAA (mercaptoacetic acid) core/shell-coated quantum dots (QDs). Information is given not only on inorganic element composition but also on the effective hydrodynamic size under relevant conditions. Detection limits (DLs) are estimated for Ca, Al, Fe, the lanthanide Ce and the natural actinides Th and U in colloid-containing groundwater. For a standard cross-flow nebulizer, the estimated values are 7 x 10(3), 20, 3 x 10(2), 0.1, 0.1 and 7 x 10(-2) microg L(-1), respectively. DLs for Zn and Cd in QD characterization are 28 and 11 microg L(-1), respectively.
NASA Technical Reports Server (NTRS)
Dekorvin, Andre
1992-01-01
The Dempster-Shafer theory of evidence is applied to a multiattribute decision making problem whereby the decision maker (DM) must compromise with available alternatives, none of which exactly satisfies his ideal. The decision mechanism is constrained by the uncertainty inherent in the determination of the relative importance of each attribute element and the classification of existing alternatives. The classification of alternatives is addressed through expert evaluation of the degree to which each element is contained in each available alternative. The relative importance of each attribute element is determined through pairwise comparisons of the elements by the decision maker and implementation of a ratio scale quantification method. Then the 'belief' and 'plausibility' that an alternative will satisfy the decision maker's ideal are calculated and combined to rank order the available alternatives. Application to the problem of selecting computer software is given.
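A minimal sketch of the belief and plausibility calculations for one alternative is shown below; the mass assignments over subsets of attribute elements are hypothetical.

```python
# Minimal sketch: belief and plausibility for a single alternative, given a basic
# probability assignment (mass function) over subsets of attribute elements.
# The masses below are hypothetical.

def belief(mass, A):
    """Sum of masses of all subsets contained in A."""
    return sum(m for B, m in mass.items() if B <= A)

def plausibility(mass, A):
    """Sum of masses of all subsets intersecting A."""
    return sum(m for B, m in mass.items() if B & A)

mass = {frozenset({"a"}): 0.4,
        frozenset({"b"}): 0.2,
        frozenset({"a", "b"}): 0.3,
        frozenset({"a", "b", "c"}): 0.1}

A = frozenset({"a", "b"})
print(belief(mass, A), plausibility(mass, A))   # 0.9 and 1.0 for these masses
```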
The quantification of pattern is a key element of landscape analyses. One aspect of this quantification of particular importance to landscape ecologists regards the classification of continuous variables to produce categorical variables such as land-cover type or elevation strat...
2012-03-19
THREE EXTREMITY ARMOR SYSTEMS: DETERMINATION OF PHYSIOLOGICAL, BIOMECHANICAL, AND PHYSICAL PERFORMANCE EFFECTS AND QUANTIFICATION OF BODY AREA COVERAGE (NATICK/TR-12/014)
NASA Astrophysics Data System (ADS)
Udeigwe, T. K.; Young, J.; Kandakji, T.; Weindorf, D. C.; Mahmoud, M. A.; Stietiya, M. H.
2015-04-01
This study extends the application of the portable X-ray fluorescence (PXRF) spectrometry to the examination of elements in semi-arid urban landscapes of the Southern High Plains (SHP) of the United States, focusing on golf courses. The complex environmental challenges of this region and the unique management practices at golf course facilities could lead to differences in concentration and in the chemistry of elements between managed (irrigated) and non-managed (non-irrigated) portions of these facilities. Soil samples were collected at depths of 0-10, 10-20, and 20-30 cm from managed and non-managed areas of seven different facilities in the city of Lubbock, Texas, and analyzed for a suite of soil properties. Total elemental quantification was conducted using a PXRF spectrometer. Findings mostly indicated no significant differences in the concentration of examined elements between the managed and non-managed areas of the facilities. However, strong positive relationships (R = 0.82-0.91, p < 0.001) were observed among elements (e.g., Fe on the one hand and Cr, Mn, Ni, and As on the other; Cu and Zn; As and Cr) and between these elements and soil constituents or properties such as clay, calcium carbonate, organic matter, and pH. The strengths of these relationships were mostly higher in the non-managed areas, suggesting a possible alteration in the chemistry of these elements by anthropogenic influences in the managed areas. Principal component and correlation analyses within the managed areas suggested that As, Cr, Fe, Mn, and Ni could be of lithogenic origin, while Cu, Pb, and Zn could have anthropogenic influences. Only one possible, likely lithogenic, source of the elements was identified within the non-managed areas. As evidenced by the study, the PXRF spectrometer can be a valuable tool for elemental quantification and rapid investigation of elemental interaction and source apportionment in semi-arid climates.
NASA Astrophysics Data System (ADS)
Udeigwe, T. K.; Young, J.; Kandakji, T.; Weindorf, D. C.; Mahmoud, M. A.; Stietiya, M. H.
2015-01-01
This study extends the application of portable X-ray fluorescence (PXRF) spectrometry to the examination of elements in semi-arid urban landscapes of the Southern High Plains (SHP) of the United States (US), focusing on golf courses. The complex environmental challenges of this region and the unique management practices at golf course facilities could lead to differences in the concentration and chemistry of elements between managed (irrigated) and non-managed (non-irrigated) portions of these facilities. Soil samples were collected at depths of 0-10, 10-20, and 20-30 cm from managed and non-managed areas of seven different facilities in the city of Lubbock, Texas, and analyzed for a suite of soil properties. Total elemental quantification was conducted using PXRF. Findings mostly indicated no significant differences in the concentrations of the examined elements between the managed and non-managed areas of the facilities. However, strong positive relationships (R2 = 0.82-0.91, p < 0.001) were observed among elements (e.g. Fe and each of Cr, Mn, Ni, and As; Cu and Zn; As and Cr) and between these elements and soil constituents or properties such as clay, calcium carbonate, organic matter, and pH. The strengths of these relationships were mostly higher in the non-managed areas, suggesting a possible alteration in the chemistry of these elements by anthropogenic influences. Principal component analysis (PCA) and correlation analyses within the managed areas suggested that As, Cr, Fe, Mn, and Ni could be of lithogenic origin, while Cu, Pb, and Zn were attributed to anthropogenic influences. Only one possible, likely lithogenic, source of the elements was identified within the non-managed areas. As evidenced by the study, PXRF can be a valuable tool for elemental quantification and for rapid investigation of elemental interactions and source apportionment in semi-arid climates.
NASA Astrophysics Data System (ADS)
Oxmann, J. F.; Schwendenmann, L.
2014-06-01
Knowledge of calcium phosphate (Ca-P) solubility is crucial for understanding temporal and spatial variations of phosphorus (P) concentrations in water bodies and sedimentary reservoirs. In situ relationships between liquid- and solid-phase levels cannot be fully explained by dissolved analytes alone and need to be verified by determining particular sediment P species. Lack of quantification methods for these species limits the knowledge of the P cycle. To address this issue, we (i) optimized a specifically developed conversion-extraction (CONVEX) method for P species quantification using standard additions, and (ii) simultaneously determined solubilities of Ca-P standards by measuring their pH-dependent contents in the sediment matrix. Ca-P minerals including various carbonate fluorapatite (CFAP) specimens from different localities, fluorapatite (FAP), fish bone apatite, synthetic hydroxylapatite (HAP) and octacalcium phosphate (OCP) were characterized by XRD, Raman, FTIR and elemental analysis. Sediment samples were incubated with and without these reference minerals and then sequentially extracted to quantify Ca-P species by their differential dissolution at pH values between 3 and 8. The quantification of solid-phase phosphates at varying pH revealed solubilities in the following order: OCP > HAP > CFAP (4.5% CO3) > CFAP (3.4% CO3) > CFAP (2.2% CO3) > FAP. Thus, CFAP was less soluble in sediment than HAP, and CFAP solubility increased with carbonate content. Unspiked sediment analyses together with standard addition analyses indicated consistent differential dissolution of natural sediment species vs. added reference species and therefore verified the applicability of the CONVEX method in separately determining the most prevalent Ca-P minerals. We found surprisingly high OCP contents in the coastal sediments analyzed, which supports the hypothesis of apatite formation by an OCP precursor mechanism.
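A generic spike-recovery check of the kind used to verify such standard-addition experiments can be sketched as follows; the numbers are illustrative and this is not the exact CONVEX verification scheme.

```python
# Minimal sketch: spike-recovery check for a standard-addition experiment in which a
# known amount of a Ca-P reference mineral is added to the sediment before extraction.
# All values are illustrative placeholders.

def spike_recovery(measured_spiked, measured_unspiked, amount_added):
    """Recovery (%) of the added reference phase from the spiked sediment."""
    return 100.0 * (measured_spiked - measured_unspiked) / amount_added

print(spike_recovery(measured_spiked=58.4, measured_unspiked=12.1, amount_added=50.0))
```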
NASA Astrophysics Data System (ADS)
Oxmann, J. F.; Schwendenmann, L.
2014-01-01
Knowledge of calcium phosphate (Ca-P) solubility is crucial for understanding temporal and spatial variations of phosphorus (P) concentrations in water bodies and sedimentary reservoirs. In situ relationships between liquid- and solid-phase levels cannot be fully explained by dissolved analytes alone and need to be verified by determination of particular sediment P species. Lack of quantification methods for these species limits the knowledge of the P cycle. To address this issue, we (i) optimized a specifically developed conversion-extraction (CONVEX) method for P species quantification using standard additions; and (ii) simultaneously determined solubilities of Ca-P standards by measuring their pH-dependent contents in the sediment matrix. Ca-P minerals including various carbonate fluorapatite (CFAP) specimens from different localities, fluorapatite (FAP), fish bone apatite, synthetic hydroxylapatite (HAP) and octacalcium phosphate (OCP) were characterized by XRD, Raman, FTIR and elemental analysis. Sediment samples were incubated with and without these reference minerals and then sequentially extracted to quantify Ca-P species by their differential dissolution at pH values between 3 and 8. The quantification of solid-phase phosphates at varying pH revealed solubilities in the following order: OCP > HAP > CFAP (4.5% CO3) > CFAP (3.4% CO3) > CFAP (2.2% CO3) > FAP. Thus, CFAP was less soluble in sediment than HAP, and CFAP solubility increased with carbonate content. Unspiked sediment analyses together with standard addition analyses indicated consistent differential dissolution of natural sediment species vs. added reference species and therefore verified the applicability of the CONVEX method in separately determining the most prevalent Ca-P minerals. We found surprisingly high OCP contents in the analyzed coastal sediments, which supports the hypothesis of apatite formation by an OCP precursor.
Inorganic trace analysis by mass spectrometry
NASA Astrophysics Data System (ADS)
Becker, Johanna Sabine; Dietze, Hans-Joachim
1998-10-01
Mass spectrometric methods for the trace analysis of inorganic materials with their ability to provide a very sensitive multielemental analysis have been established for the determination of trace and ultratrace elements in high-purity materials (metals, semiconductors and insulators), in different technical samples (e.g. alloys, pure chemicals, ceramics, thin films, ion-implanted semiconductors), in environmental samples (waters, soils, biological and medical materials) and geological samples. Whereas such techniques as spark source mass spectrometry (SSMS), laser ionization mass spectrometry (LIMS), laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), glow discharge mass spectrometry (GDMS), secondary ion mass spectrometry (SIMS) and inductively coupled plasma mass spectrometry (ICP-MS) have multielemental capability, other methods such as thermal ionization mass spectrometry (TIMS), accelerator mass spectrometry (AMS) and resonance ionization mass spectrometry (RIMS) have been used for sensitive mono- or oligoelemental ultratrace analysis (and precise determination of isotopic ratios) in solid samples. The limits of detection for chemical elements using these mass spectrometric techniques are in the low ng g -1 concentration range. The quantification of the analytical results of mass spectrometric methods is sometimes difficult due to a lack of matrix-fitted multielement standard reference materials (SRMs) for many solid samples. Therefore, owing to the simple quantification procedure of the aqueous solution, inductively coupled plasma mass spectrometry (ICP-MS) is being increasingly used for the characterization of solid samples after sample dissolution. ICP-MS is often combined with special sample introduction equipment (e.g. flow injection, hydride generation, high performance liquid chromatography (HPLC) or electrothermal vaporization) or an off-line matrix separation and enrichment of trace impurities (especially for characterization of high-purity materials and environmental samples) is used in order to improve the detection limits of trace elements. Furthermore, the determination of chemical elements in the trace and ultratrace concentration range is often difficult and can be disturbed through mass interferences of analyte ions by molecular ions at the same nominal mass. By applying double-focusing sector field mass spectrometry at the required mass resolution—by the mass spectrometric separation of molecular ions from the analyte ions—it is often possible to overcome these interference problems. Commercial instrumental equipment, the capability (detection limits, accuracy, precision) and the analytical application fields of mass spectrometric methods for the determination of trace and ultratrace elements and for surface analysis are discussed.
Armigliato, Aldo; Frabboni, Stefano; Gazzadi, Gian Carlo; Rosa, Rodolfo
2013-02-01
A method for the fabrication of a wedge-shaped thin NiO lamella by focused ion beam is reported. The starting sample is an oxidized, bulk, single-crystalline, <100>-oriented Ni commercial standard. The lamella is employed for the determination, by analytical electron microscopy at 200 kV, of the experimental k(O-Ni) Cliff-Lorimer coefficient (G. Cliff & G.W. Lorimer, J Microsc 103, 203-207, 1975), according to the extrapolation method of Van Cappellen (E. Van Cappellen, Microsc Microstruct Microanal 1, 1-22, 1990). The result thus obtained is compared with the theoretical k(O-Ni) values either implemented in the commercial software for X-ray microanalysis quantification of the scanning transmission electron microscopy/energy-dispersive spectrometry equipment or calculated by the Monte Carlo method. Significant differences among the three values are found. This confirms that, for a reliable quantification of binary alloys containing light elements, the choice of the Cliff-Lorimer coefficients is crucial and experimental values are recommended.
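The Cliff-Lorimer relation underlying this quantification can be illustrated with a short sketch for a binary system; the intensities and the k(O-Ni) value used here are illustrative, not the experimental result of the paper.

```python
# Minimal sketch: Cliff-Lorimer thin-film quantification for a binary system,
# C_O / C_Ni = k(O-Ni) * I_O / I_Ni with C_O + C_Ni = 1 (mass fractions).

def cliff_lorimer_binary(i_a, i_b, k_ab):
    """Return the mass fractions (C_A, C_B) from intensities and the k-factor."""
    ratio = k_ab * i_a / i_b          # C_A / C_B
    c_a = ratio / (1.0 + ratio)
    return c_a, 1.0 - c_a

# Illustrative intensities and an illustrative k(O-Ni) value
c_o, c_ni = cliff_lorimer_binary(i_a=1500.0, i_b=6200.0, k_ab=1.1)
print(f"C_O = {c_o:.3f}, C_Ni = {c_ni:.3f} (mass fractions, illustrative)")
```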
Thyssen, G M; Holtkamp, M; Kaulfürst-Soboll, H; Wehe, C A; Sperling, M; von Schaewen, A; Karst, U
2017-06-21
Laser ablation-inductively coupled plasma-optical emission spectroscopy (LA-ICP-OES) is presented as a valuable tool for elemental bioimaging of alkali and alkaline earth elements in plants. Whereas LA-ICP-OES is commonly used for micro-analysis of solid samples, laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) has become the gold standard for bioimaging. However, especially for easily excitable and ubiquitous elements such as the alkali and alkaline earth elements, LA-ICP-OES holds some advantages regarding simultaneous detection, costs, contamination, and user-friendliness. This is demonstrated by determining the calcium, sodium and potassium distributions in tobacco plant stem and leaf petiole tissues. A quantification of the calcium contents in a concentration range up to 1000 μg g-1 using matrix-matched standards is presented as well. The method is directly compared with an LA-ICP-MS approach by analyzing parallel slices of the same samples.
Dakota Uncertainty Quantification Methods Applied to the CFD code Nek5000
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delchini, Marc-Olivier; Popov, Emilian L.; Pointer, William David
This report presents the state of advancement of a Nuclear Energy Advanced Modeling and Simulation (NEAMS) project to characterize the uncertainty of the computational fluid dynamics (CFD) code Nek5000 using the Dakota package for flows encountered in the nuclear engineering industry. Nek5000 is a high-order spectral element CFD code developed at Argonne National Laboratory for high-resolution spectral-filtered large eddy simulations (LESs) and unsteady Reynolds-averaged Navier-Stokes (URANS) simulations.
Calderón-Celis, Francisco; Sanz-Medel, Alfredo; Encinar, Jorge Ruiz
2018-01-23
We present a novel and highly sensitive ICP-MS approach for the absolute quantification of any important target biomolecule containing P, S, Se, As, Br, and/or I (e.g., proteins and phosphoproteins, metabolites, pesticides, drugs), under the same simple instrumental conditions and without requiring any specific and/or isotopically enriched standard.
Analysis of metal-laden water via portable X-ray fluorescence spectrometry
NASA Astrophysics Data System (ADS)
Pearson, Delaina; Weindorf, David C.; Chakraborty, Somsubhra; Li, Bin; Koch, Jaco; Van Deventer, Piet; de Wet, Jandre; Kusi, Nana Yaw
2018-06-01
A rapid method for in situ elemental composition analysis of metal-laden water would be indispensable for studying polluted water. Current laboratory methods to determine water quality include flame atomic absorption spectrometry (FAAS), atomic absorption spectrophotometry (AAS), electrothermal atomic absorption spectrometry (EAAS), and inductively coupled plasma (ICP) spectroscopy. However, only two field methods, colorimetry and absorptiometry, exist for elemental analysis of water. Portable X-ray fluorescence (PXRF) spectrometry is an effective method for elemental analysis of soil, sediment, and other matrices. However, the accuracy of PXRF is known to be affected when scanning moisture-laden soil samples. This study sought to statistically establish PXRF's predictive ability for various elements in water at different concentrations relative to inductively coupled plasma atomic emission spectroscopy (ICP-AES). A total of 390 metal-laden water samples collected from leaching columns of mine tailings in South Africa were analyzed via PXRF and ICP-AES. The PXRF showed differential effectiveness in elemental quantification. For the collected water samples, the best relationships between ICP and PXRF elemental data were obtained for K and Cu (R2 = 0.92). However, when scanning ICP calibration solutions with elements in isolation, PXRF results indicated near perfect agreement; Ca, K, Fe, Cu and Pb produced an R2 of 0.99, while Zn and Mn produced an R2 of 1.00. The utilization of multiple (stacked) PXRF beams produced stronger correlation with ICP relative to the use of a single beam in isolation. The results of this study demonstrated PXRF's ability to satisfactorily predict the composition of metal-laden water as reported by ICP for several elements. Additionally, this study indicated the need for a "Water Mode" calibration for the PXRF and demonstrated the potential of PXRF for future studies of polluted or contaminated waters.
Li, Yan; Yu, Hua; Zheng, Siqian; Miao, Yang; Yin, Shi; Li, Peng; Bian, Ying
2016-03-22
Rare earth elements (REEs) have seen a steady spread in industrial, agricultural and medical applications. With the aim of establishing a sensitive and reliable indicator for estimating exposure levels to REEs, a simple, accurate and specific ICP-MS method for the simultaneous direct quantification of 15 REEs ((89)Y, (139)La, (140)Ce, (141)Pr, (146)Nd, (147)Sm, (153)Eu, (157)Gd, (159)Tb, (163)Dy, (165)Ho, (166)Er, (169)Tm, (172)Yb and (175)Lu) in human urine has been developed and validated. The method showed good linearity for all REEs in human urine at concentrations ranging from 0.001 to 1.000 μg L(-1) with r² > 0.997. The limits of detection and quantification were in the ranges of 0.009-0.010 μg L(-1) and 0.029-0.037 μg L(-1), respectively; the recoveries of the 15 REEs in spiked samples ranged from 93.3% to 103.0%; the relative percentage differences were less than 6.2% in duplicate samples; and the intra- and inter-day variations of the analysis were less than 1.28% and 0.85%, respectively, for all REEs. The developed method was successfully applied to the determination of the 15 REEs in 31 urine samples obtained from control subjects and from workers engaged in the manufacture of ultrafine particles and nanoparticles containing cerium and lanthanum oxide. The results suggested that only the urinary levels of La (1.234 ± 0.626 μg L(-1)), Ce (1.492 ± 0.995 μg L(-1)), Nd (0.014 ± 0.009 μg L(-1)) and Gd (0.023 ± 0.010 μg L(-1)) among the exposed workers were significantly higher (p < 0.05) than the levels measured in the control subjects. Of these, La and Ce were the primary components and accounted for 88% of the total REEs: lanthanum comprised 27% of the total REEs, while Ce made up the majority at 61%. The remaining elements each accounted for only 1%, with the exception of Dy, which was not detected. Compared with previously published data, the urinary La and Ce levels in both the workers and the control subjects tended to be higher than in previous reports.
Li, Yan; Yu, Hua; Zheng, Siqian; Miao, Yang; Yin, Shi; Li, Peng; Bian, Ying
2016-01-01
Rare earth elements (REEs) have seen a steady spread in industrial, agricultural and medical applications. With the aim of establishing a sensitive and reliable indicator for estimating exposure levels to REEs, a simple, accurate and specific ICP-MS method for the simultaneous direct quantification of 15 REEs (89Y, 139La, 140Ce, 141Pr, 146Nd, 147Sm, 153Eu, 157Gd, 159Tb, 163Dy, 165Ho, 166Er, 169Tm, 172Yb and 175Lu) in human urine has been developed and validated. The method showed good linearity for all REEs in human urine at concentrations ranging from 0.001 to 1.000 μg L−1 with r2 > 0.997. The limits of detection and quantification were in the ranges of 0.009-0.010 μg L−1 and 0.029-0.037 μg L−1, respectively; the recoveries of the 15 REEs in spiked samples ranged from 93.3% to 103.0%; the relative percentage differences were less than 6.2% in duplicate samples; and the intra- and inter-day variations of the analysis were less than 1.28% and 0.85%, respectively, for all REEs. The developed method was successfully applied to the determination of the 15 REEs in 31 urine samples obtained from control subjects and from workers engaged in the manufacture of ultrafine particles and nanoparticles containing cerium and lanthanum oxide. The results suggested that only the urinary levels of La (1.234 ± 0.626 μg L−1), Ce (1.492 ± 0.995 μg L−1), Nd (0.014 ± 0.009 μg L−1) and Gd (0.023 ± 0.010 μg L−1) among the exposed workers were significantly higher (p < 0.05) than the levels measured in the control subjects. Of these, La and Ce were the primary components and accounted for 88% of the total REEs: lanthanum comprised 27% of the total REEs, while Ce made up the majority at 61%. The remaining elements each accounted for only 1%, with the exception of Dy, which was not detected. Compared with previously published data, the urinary La and Ce levels in both the workers and the control subjects tended to be higher than in previous reports. PMID:27011194
NASA Astrophysics Data System (ADS)
Schaumann, Ina; Malzer, Wolfgang; Mantouvalou, Ioanna; Lühl, Lars; Kanngießer, Birgit; Dargel, Rainer; Giese, Ulrich; Vogt, Carla
2009-04-01
Samples with a low-average-Z matrix and minor high-Z elements are best suited for validating quantification with the newly developed method of 3D micro X-ray fluorescence spectroscopy (3D Micro-XRF). In a light matrix, interferences by matrix effects are minimized, so organic polymers are appropriate as a basis for analytes that are more easily detected by X-ray fluorescence spectroscopy. Polymer layer systems were assembled from single layers of ethylene-propylene-diene rubber (EPDM) filled with varying concentrations of silica and zinc oxide as inorganic additives. Layer thicknesses were in the range of 30-150 μm. Before the analysis with 3D Micro-XRF, all layers were characterized by scanning micro-XRF with regard to filler dispersion, by infrared microscopy and light microscopy to determine the layer thicknesses, and by ICP-OES to verify the concentrations of the X-ray-sensitive elements in the layers. The results obtained for the stacked polymer systems demonstrate the validity of the analytical quantification model for the determination of stratified materials by 3D Micro-XRF.
Uncertainty quantification in volumetric Particle Image Velocimetry
NASA Astrophysics Data System (ADS)
Bhattacharya, Sayantan; Charonko, John; Vlachos, Pavlos
2016-11-01
Particle Image Velocimetry (PIV) uncertainty quantification is challenging because of coupled sources of elemental uncertainty and complex data reduction procedures in the measurement chain. Recent developments in this field have led to uncertainty estimation methods for planar PIV. However, no framework exists for three-dimensional volumetric PIV. In volumetric PIV the measurement uncertainty is a function of the reconstructed three-dimensional particle location, which in turn is very sensitive to the accuracy of the calibration mapping function. Furthermore, the iterative correction to the camera mapping function using triangulated particle locations in space (volumetric self-calibration) has its own associated uncertainty due to image noise and ghost particle reconstructions. Here we first quantify the uncertainty in the triangulated particle position, which is a function of the particle detection and mapping function uncertainties. The location uncertainty is then combined with the three-dimensional cross-correlation uncertainty, which is estimated as an extension of the 2D PIV uncertainty framework. Finally, the overall measurement uncertainty is quantified using an uncertainty propagation equation. The framework is tested with both simulated and experimental cases. For the simulated cases, the variation of the estimated uncertainty with the elemental volumetric PIV error sources is also evaluated. The results show reasonable prediction of standard uncertainty with good coverage.
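A minimal sketch of first-order uncertainty propagation, combining elemental uncertainties in quadrature with sensitivity coefficients, is shown below; the sensitivities and uncertainty values are placeholders and this is not the authors' exact propagation equation.

```python
# Minimal sketch: first-order (quadrature) propagation of elemental uncertainties,
# each weighted by its sensitivity coefficient, into an overall measurement uncertainty.
import numpy as np

def propagate(sensitivities, uncertainties):
    s = np.asarray(sensitivities)
    u = np.asarray(uncertainties)
    return float(np.sqrt(np.sum((s * u) ** 2)))

# Illustrative contributions: particle-position uncertainty, mapping-function
# uncertainty, and 3D cross-correlation uncertainty (all in pixels, unit sensitivities)
print(propagate([1.0, 1.0, 1.0], [0.08, 0.05, 0.12]))
```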
k0-INAA for determining chemical elements in bird feathers
NASA Astrophysics Data System (ADS)
França, Elvis J.; Fernandes, Elisabete A. N.; Fonseca, Felipe Y.; Antunes, Alexsander Z.; Bardini Junior, Claudiney; Bacchi, Márcio A.; Rodrigues, Vanessa S.; Cavalca, Isabel P. O.
2010-10-01
The k0-method of instrumental neutron activation analysis (k0-INAA) was employed for determining chemical elements in bird feathers. A collection was obtained covering several bird species from wet ecosystems in diverse regions of Brazil. For comparison, feathers were actively sampled in a riparian forest along the Marins Stream, Piracicaba, São Paulo State, using mist nets specific for capturing birds. Biological certified reference materials were used to assess the quality of the analytical procedure. Quantification of chemical elements was performed using the k0-INAA Quantu software. Sixteen chemical elements, including macro- and micronutrients and trace elements, were quantified in feathers, with analytical uncertainties varying from 2% to 40% depending on the chemical element mass fraction. Results indicated high mass fractions of Br (max=7.9 mg kg-1), Co (max=0.47 mg kg-1), Cr (max=68 mg kg-1), Hg (max=2.79 mg kg-1), Sb (max=0.20 mg kg-1), Se (max=1.3 mg kg-1) and Zn (max=192 mg kg-1) in bird feathers, probably associated with the degree of pollution of the areas evaluated. To corroborate the use of k0-INAA results in biomonitoring studies of the avian community, different factor analysis methods were used to check chemical element source apportionment and locality clustering based on feather chemical composition.
NASA Astrophysics Data System (ADS)
Samanta, Sudipta; Mukherjee, Sanchita
2017-10-01
Activation of the p53 protein protects the organism from propagation of cells with damaged DNA carrying oncogenic mutations. In normal cells, the activity of p53 is controlled by its interaction with MDM2. The well-understood p53-MDM2 interaction facilitates the design of ligands that could potentially disrupt or prevent the complexation, owing to its emergence as an important objective for cancer therapy. However, thermodynamic quantification of the p53-peptide-induced structural changes of the MDM2 protein remains an area to be explored. This study attempts to quantify the conformational free energy and entropy costs of this complex formation from histograms of dihedral angles generated from molecular dynamics simulations. Residue-specific quantification illustrates that hydrophobic residues of the protein contribute the most to the conformational thermodynamic changes. Thermodynamic quantification of the structural changes of the protein reveals that p53 binding provides a source of inter-element cooperativity among the protein's secondary structural elements, whereby the most affected structural elements (α2 and α4), located at the binding site of the protein, affect faraway structural elements (β1 and Loop1) of the protein. The communication perhaps involves formation of a water-mediated hydrogen-bonded network. Further, we infer that in the inhibitory F19A mutation of p53, although Phe19 is important in the recognition process, it has a less prominent contribution to the stability of the complex. Collectively, this study provides a vivid microscopic understanding of the interactions within the protein complex along with exploring mutation sites, which will contribute further to engineering the protein's function and binding affinity.
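The entropy cost estimated from dihedral-angle histograms can be illustrated with a standard histogram estimator; the trajectories below are synthetic and the binning choice is an assumption, not the authors' protocol.

```python
# Minimal sketch: conformational entropy of one dihedral angle from its probability
# histogram, S = -R * sum(p * ln p); the bound-minus-free difference gives the cost.
import numpy as np

R = 8.314  # J mol^-1 K^-1

def dihedral_entropy(angles_deg, nbins=36):
    counts, _ = np.histogram(angles_deg, bins=nbins, range=(-180.0, 180.0))
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins before taking the log
    return -R * np.sum(p * np.log(p))

free  = np.random.default_rng(2).uniform(-180, 180, 5000)   # synthetic "free" dihedrals
bound = np.random.default_rng(3).normal(60, 15, 5000)        # synthetic "bound" dihedrals
print(f"dS (bound - free) = {dihedral_entropy(bound) - dihedral_entropy(free):.1f} J/mol/K")
```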
NASA Astrophysics Data System (ADS)
Zhang, Xing; Chen, Beibei; He, Man; Zhang, Yiwen; Xiao, Guangyang; Hu, Bin
2015-04-01
The absolute quantification of glycoproteins in complex biological samples is a challenge and of great significance. Herein, 4-mercaptophenylboronic acid functionalized magnetic beads were prepared to selectively capture glycoproteins, while antibody-conjugated gold and silver nanoparticles were synthesized as element tags to label two different glycoproteins. Based on that, a new approach of magnetic immunoassay-inductively coupled plasma mass spectrometry (ICP-MS) was established for simultaneous quantitative analysis of glycoproteins. Taking the biomarkers alpha-fetoprotein (AFP) and carcinoembryonic antigen (CEA) as two model glycoproteins, experimental parameters involved in the immunoassay procedure were carefully optimized and the analytical performance of the proposed method was evaluated. The limits of detection (LODs) for AFP and CEA were 0.086 μg L-1 and 0.054 μg L-1, with relative standard deviations (RSDs, n = 7, c = 5 μg L-1) of 6.5% and 6.2% for AFP and CEA, respectively. The linear range for both AFP and CEA was 0.2-50 μg L-1. To validate the applicability of the proposed method, human serum samples were analyzed, and the obtained results were in good agreement with those obtained by the clinical chemiluminescence immunoassay. The developed method exhibited good selectivity and sensitivity for the simultaneous determination of AFP and CEA, and extended the applicability of metal nanoparticle tags based on ICP-MS methodology in multiple glycoprotein quantifications.
2017-02-02
Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) … provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods … a consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy - Virus Quantification (STEM-VQ), which simplifies …
Kwasniewski, Misha T; Allison, Rachel B; Wilcox, Wayne F; Sacks, Gavin L
2011-10-03
Rapid, inexpensive, and convenient methods for quantifying elemental sulfur (S(0)) with low or sub-μg g(-1) limits of detection would be useful for a range of applications where S(0) can act as a precursor for noxious off-aromas, e.g., S(0) in pesticide residues on winegrapes or as a contaminant in drywall. However, existing quantification methods rely on toxic reagents, expensive and cumbersome equipment, or demonstrate poor selectivity. We have developed and optimized an inexpensive, rapid method (∼15 min per sample) for quantifying S(0) in complex matrices. Following dispersion of the sample in PEG-400 and buffering, S(0) is quantitatively reduced to H(2)S in situ by dithiothreitol and simultaneously quantified by commercially available colorimetric H(2)S detection tubes. By employing multiple tubes, the method demonstrated linearity from 0.03 to 100 μg S(0) g(-1) for a 5 g sample (R(2)=0.994, mean CV=6.4%), and the methodological detection limit was 0.01 μg S(0) g(-1). Interferences from sulfite or sulfate were not observed. Mean recovery of an S(0)-containing sulfur fungicide in grape macerate was 84.7% with a mean CV of 10.4%. Mean recovery of S(0) in a colloidal sulfur preparation from a drywall matrix was 106.6% with a mean CV of 6.9%. Comparable methodological detection limits, sensitivity, and recoveries were achieved in grape juice, grape macerate and with 1 g drywall samples, indicating that the methodology should be robust across a range of complex matrices. Copyright © 2011 Elsevier B.V. All rights reserved.
Loukotková, Lucie; VonTungeln, Linda S; Vanlandingham, Michelle; da Costa, Gonçalo Gamboa
2018-01-01
According to the World Health Organization, the consumption of tobacco products is the single largest cause of preventable deaths in the world, exceeding the total aggregated number of deaths caused by diseases such as AIDS, tuberculosis, and malaria. An important element in the evaluation of the health risks associated with the consumption of tobacco products is the assessment of the internal exposure to the tobacco constituents responsible for their addictive (e.g. nicotine) and carcinogenic (e.g. N-nitrosamines such as NNN and NNK) properties. However, the assessment of the serum levels of these compounds is often challenging from an analytical standpoint, in particular when limited sample volumes are available and low detection limits are required. Currently available analytical methods often rely on complex multi-step sample preparation procedures, which are prone to low analyte recoveries and ex vivo contamination due to the ubiquitous nature of these compounds as background contaminants. In order to circumvent these problems, we report a facile and highly sensitive method for the simultaneous quantification of nicotine, cotinine, NNN, and NNK in serum samples. The method relies on a simple "one pot" liquid-liquid extraction procedure and isotope dilution ultra-high pressure hydrophilic interaction liquid chromatography (UPLC-HILIC) coupled with tandem mass spectrometry. The method requires only 10 μL of serum and presents limits of quantification of 0.02 nmol (3000 pg/mL) for nicotine, 0.6 pmol (100 pg/mL) for cotinine, 0.05 pmol (10 pg/mL) for NNK, and 0.06 pmol (10 pg/mL) for NNN, making it appropriate for pharmacokinetic evaluations. Published by Elsevier B.V.
QACD: A method for the quantitative assessment of compositional distribution in geologic materials
NASA Astrophysics Data System (ADS)
Loocke, M. P.; Lissenberg, J. C. J.; MacLeod, C. J.
2017-12-01
In order to fully understand the petrogenetic history of a rock, it is critical to obtain a thorough characterization of the chemical and textural relationships of its mineral constituents. Element mapping combines microanalytical techniques that allow the analysis of major and minor elements at high spatial resolution (e.g., electron microbeam analysis) with 2D mapping of samples, in order to provide unprecedented detail regarding the growth histories and compositional distributions of minerals within a sample. We present a method for the acquisition and processing of large-area X-ray element maps obtained with an energy-dispersive X-ray spectrometer (EDS) to produce a quantitative assessment of compositional distribution (QACD) of mineral populations within geologic materials. By optimizing the conditions at which the EDS X-ray element maps are acquired, we are able to obtain full thin section quantitative element maps for most major elements in relatively short amounts of time. Such maps can be used not only to accurately identify all phases and calculate mineral modes for a sample (e.g., a petrographic thin section), but also, critically, to enable a complete quantitative assessment of their compositions. The QACD method has been incorporated into a python-based, easy-to-use graphical user interface (GUI) called Quack. The Quack software facilitates the generation of mineral modes, element and molar ratio maps, and the quantification of full-sample compositional distributions. The open-source nature of the Quack software provides a versatile platform which can be easily adapted and modified to suit the needs of the user.
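A minimal sketch of the kind of post-processing such quantified element maps enable is given below. It is not the Quack implementation; the phase names, threshold values and synthetic maps are assumptions made only to illustrate how pixel-wise compositional criteria can yield a phase map and mineral modes.

```python
import numpy as np

def classify_phases(si, mg, fe, ca):
    """Assign each map pixel to a phase using illustrative wt.% thresholds.

    si, mg, fe, ca: 2-D arrays of quantified element concentrations (wt.%).
    Returns an integer phase map and a dict of phase labels. Later rules
    overwrite earlier ones where criteria overlap.
    """
    labels = {0: "unclassified", 1: "olivine-like", 2: "plagioclase-like", 3: "clinopyroxene-like"}
    phase = np.zeros(si.shape, dtype=int)
    phase[(mg > 20) & (fe > 5)] = 1                 # Mg-Fe silicate
    phase[(ca > 5) & (si > 20) & (mg < 5)] = 2      # Ca-Al silicate
    phase[(ca > 10) & (mg > 8)] = 3                 # Ca-Mg silicate
    return phase, labels

def mineral_modes(phase, labels):
    """Modes as area fractions of all map pixels (including unclassified)."""
    return {name: float(np.mean(phase == code)) for code, name in labels.items()}

# Example with synthetic 100 x 100 maps (random values, illustration only)
rng = np.random.default_rng(0)
si, mg, fe, ca = (rng.uniform(0, 30, (100, 100)) for _ in range(4))
phase, labels = classify_phases(si, mg, fe, ca)
print(mineral_modes(phase, labels))
```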
Quantitative ion beam analysis of M-C-O systems: application to an oxidized uranium carbide sample
NASA Astrophysics Data System (ADS)
Martin, G.; Raveu, G.; Garcia, P.; Carlot, G.; Khodja, H.; Vickridge, I.; Barthe, M. F.; Sauvage, T.
2014-04-01
A large variety of materials contain both carbon and oxygen atoms, in particular oxidized carbides, carbon alloys (e.g., ZrC, UC, steels), and oxycarbide compounds (SiCO glasses, TiCO, etc.). Here a new ion beam analysis methodology is described which enables quantification of the elemental composition and the oxygen concentration profile over a few microns. It is based on two procedures. The first, relative to the experimental configuration, relies on a specific detection setup which is original in that it enables the separation of the carbon and oxygen NRA signals. The second concerns the data analysis procedure, i.e. the method for deriving the elemental composition from the particle energy spectrum. It is a generic algorithm and is here successfully applied to characterize an oxidized uranium carbide sample, developed as a potential fuel for generation IV nuclear reactors. Furthermore, a micro-beam was used to simultaneously determine the local elemental composition and oxygen concentration profiles over the first microns below the sample surface. This method is suited to the determination of the composition of MxCyOz compounds with a sensitivity for elemental atomic concentrations of around 1000 ppm.
Quantification of Self Pollution from Two Diesel School Buses using Three Independent Methods.
Liu, L-J Sally; Phuleria, Harish C; Webber, Whitney; Davey, Mark; Lawson, Douglas R; Ireson, Robert G; Zielinska, Barbara; Ondov, John M; Weaver, Christopher S; Lapin, Charles A; Easter, Michael; Hesterberg, Thomas W; Larson, Timothy
2010-09-01
We monitored two Seattle school buses to quantify the buses' self pollution using the dual tracers (DT), lead vehicle (LV), and chemical mass balance (CMB) methods. Each bus drove along a residential route simulating stops, with windows closed or open. Particulate matter (PM) and its constituents were monitored in the bus and from a LV. We collected source samples from the tailpipe and crankcase emissions using an on-board dilution tunnel. Concentrations of PM1, ultrafine particle counts, elemental and organic carbon (EC/OC) were higher on the bus than the LV. The DT method estimated that the tailpipe and the crankcase emissions contributed 1.1 and 6.8 μg/m³ of PM2.5 inside the bus, respectively, with significantly higher crankcase self pollution (SP) when windows were closed. Approximately two-thirds of in-cabin PM2.5 originated from background sources. Using the LV approach, SP estimates from the EC and the active personal DataRAM (pDR) measurements correlated well with the DT estimates for tailpipe and crankcase emissions, respectively, although both measurements need further calibration for accurate quantification. CMB results overestimated SP from the DT method but confirmed crankcase emissions as the major SP source. We confirmed buses' SP using three independent methods and quantified crankcase emissions as the dominant contributor.
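For readers unfamiliar with the chemical mass balance idea used above, the sketch below shows the bare arithmetic: measured species concentrations are modelled as a linear combination of source profiles and solved for non-negative source contributions. The species list, profile values and measured concentrations are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy.optimize import nnls

# Rows: measured species; columns: source profiles (mass fraction of each
# species per unit PM2.5 emitted by that source). Values are illustrative only.
species = ["EC", "OC", "hopanes", "Zn"]
profiles = np.array([
    [0.45, 0.10, 0.02],     # EC fraction in tailpipe, crankcase, background
    [0.35, 0.70, 0.30],     # OC
    [0.001, 0.004, 0.0001], # hopanes
    [0.002, 0.010, 0.001],  # Zn
])
measured = np.array([0.9, 2.8, 0.01, 0.02])  # in-cabin concentrations, ug/m3

# Non-negative least squares gives each source's PM2.5 contribution
contrib, residual = nnls(profiles, measured)
for name, c in zip(["tailpipe", "crankcase", "background"], contrib):
    print(f"{name}: {c:.2f} ug/m3")
```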
NASA Astrophysics Data System (ADS)
de Souza, Roseli M.; Mathias, Bárbara M.; da Silveira, Carmem Lúcia P.; Aucélio, Ricardo Q.
2005-06-01
The quantitative evaluation of trace elements in foodstuffs is of considerable interest due to the potential toxicity of many elements, and because the presence of some metallic species might affect the overall quality (flavor and stability) of these products. In the present work, an inductively coupled plasma optical emission spectrometric method has been developed for the determination of six elements (Cd, Co, Cr, Cu, Ni and Mn) in olive oil, soy oil, margarine and butter. Organic samples (oils and fats) were stabilized using propan-1-ol and water, which enabled long-time sample dispersion in the solution. This simple sample preparation procedure, together with an efficient sample introduction strategy (using a Meinhard K3 nebulizer and a twister cyclonic spray chamber), facilitated the overall analytical procedure, allowing quantification using calibration curves prepared with inorganic standards. Internal standardization (Sc) was used for correction of matrix effects and signal fluctuations. Good sensitivities with limits of detection in the ng g⁻¹ range were achieved for all six elements. These sensitivities were appropriate for the intended application. The method was tested through the analysis of laboratory-fortified samples with good recoveries (between 91.3% and 105.5%).
A Data Matrix Method for Improving the Quantification of Element Percentages of SEM/EDX Analysis
NASA Technical Reports Server (NTRS)
Lane, John
2009-01-01
A simple 2D M × N matrix involving sample preparation enables the microanalyst to peer below the noise floor of element percentages reported by SEM/EDX (scanning electron microscopy/energy dispersive X-ray) analysis, thus yielding more meaningful data. Using the example of a 2 × 3 sample set, there are M = 2 concentration levels of the original mix under test: 10 percent ilmenite (90 percent silica) and 20 percent ilmenite (80 percent silica). For each of these M samples, N = 3 separate SEM/EDX samples were drawn. In this test, ilmenite is the constituent of interest. By plotting the linear trend of the M samples' known concentrations versus the average of the N samples, a much higher resolution of elemental analysis can be performed. The resulting trend also shows how the noise is affecting the data, and at what point (at smaller concentrations) it becomes impractical to try to extract any further useful data.
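The trend-fitting step described above reduces to a simple linear regression of the N-sample means against the M known concentrations, as in the hedged sketch below; the EDX readings are invented numbers chosen only to illustrate the idea of recovering a calibration from data near the noise floor.

```python
import numpy as np

# M = 2 known ilmenite concentrations (wt.%), N = 3 EDX readings per mix.
# The EDX values are invented for illustration; individual readings sit near
# the noise floor, but the trend of the N-sample means against the known
# concentrations recovers a usable calibration.
known = np.array([10.0, 20.0])
edx_readings = np.array([
    [0.8, 1.3, 1.1],   # three EDX results for the 10% mix
    [2.1, 2.4, 1.9],   # three EDX results for the 20% mix
])

means = edx_readings.mean(axis=1)
slope, intercept = np.polyfit(known, means, 1)
print(f"trend: EDX = {slope:.3f} * known + {intercept:.3f}")

# Invert the trend to estimate the true concentration of a new sample
new_mean = 1.6
estimate = (new_mean - intercept) / slope
print(f"estimated concentration: {estimate:.1f} wt.%")
```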
Hunsche, Mauricio; Noga, Georg
2009-12-01
In the present study the principle of energy dispersive X-ray microanalysis (EDX), i.e. the detection of elements based on their characteristic X-rays, was used to localise and quantify organic and inorganic pesticides on enzymatically isolated fruit cuticles. Pesticides could be discriminated from the plant surface because of their distinctive elemental composition. Findings confirm the close relation between net intensity (NI) and area covered by the active ingredient (AI area). Using wide and narrow concentration ranges of glyphosate and glufosinate, respectively, results showed that quantification of AI requires the selection of appropriate regression equations while considering NI, peak-to-background (P/B) ratio, and AI area. The use of selected internal standards (ISs) such as Ca(NO3)2 improved the accuracy of the quantification slightly but led to the formation of particular, non-typical microstructured deposits. The suitability of SEM-EDX as a general technique to quantify pesticides was evaluated additionally on 14 agrochemicals applied at diluted or regular concentration. Among the pesticides tested, spatial localisation and quantification of AI amount could be done for inorganic copper and sulfur as well as for the organic agrochemicals glyphosate, glufosinate, bromoxynil and mancozeb. © 2009 Society of Chemical Industry.
Uncertainty Quantification in Aeroelasticity
NASA Astrophysics Data System (ADS)
Beran, Philip; Stanford, Bret; Schrock, Christopher
2017-01-01
Physical interactions between a fluid and structure, potentially manifested as self-sustained or divergent oscillations, can be sensitive to many parameters whose values are uncertain. Of interest here are aircraft aeroelastic interactions, which must be accounted for in aircraft certification and design. Deterministic prediction of these aeroelastic behaviors can be difficult owing to physical and computational complexity. New challenges are introduced when physical parameters and elements of the modeling process are uncertain. By viewing aeroelasticity through a nondeterministic prism, where key quantities are assumed stochastic, one may gain insights into how to reduce system uncertainty, increase system robustness, and maintain aeroelastic safety. This article reviews uncertainty quantification in aeroelasticity using traditional analytical techniques not reliant on computational fluid dynamics; compares and contrasts this work with emerging methods based on computational fluid dynamics, which target richer physics; and reviews the state of the art in aeroelastic optimization under uncertainty. Barriers to continued progress, for example, the so-called curse of dimensionality, are discussed.
NASA Astrophysics Data System (ADS)
Gramaccioni, Chiara; Yang, Yang; Procopio, Alessandra; Pacureanu, Alexandra; Bohic, Sylvain; Malucelli, Emil; Iotti, Stefano; Farruggia, Giovanna; Bukreeva, Inna; Notargiacomo, Andrea; Fratini, Michela; Valenti, Piera; Rosa, Luigi; Berlutti, Francesca; Cloetens, Peter; Lagomarsino, Stefano
2018-01-01
We present here a correlative X-ray microscopy approach for quantitative single cell imaging of molar concentrations. By combining the elemental content provided by X-ray fluorescence microscopy and the morphology information extracted from X-ray phase nanotomography, we determine the intracellular molarity distributions. This correlative method was demonstrated on a freeze-dried human phagocytic cell to obtain the absolute elemental concentration maps of K, P, and Fe. The cell morphology results showed a very good agreement with atomic-force microscopy measurements. This work opens the way for non-destructive single cell chemical analysis down to the sub-cellular level using exclusively synchrotron radiation techniques. It will be of high interest in the case where it is difficult to access the morphology using atomic-force microscopy, for example, on frozen-hydrated cells or tissues.
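A plausible way to combine the two modalities into molar concentrations, assuming the XRF maps are expressed as areal mass densities and the tomography provides a local thickness, is sketched below; the unit choices and numbers are illustrative assumptions, not the authors' processing chain.

```python
# Minimal sketch: combine an XRF areal mass density map with a thickness map
# from phase tomography to obtain an elemental molarity map. Units assumed
# here (ng/cm2 for areal density, um for thickness) are choices for the
# example only.
import numpy as np

MOLAR_MASS = {"K": 39.10, "P": 30.97, "Fe": 55.85}  # g/mol

def molarity_map(areal_density_ng_cm2, thickness_um, element):
    """mol/L per pixel: (mass per area / path length) / molar mass."""
    mass_g_cm2 = areal_density_ng_cm2 * 1e-9
    thickness_cm = thickness_um * 1e-4
    conc_g_cm3 = np.divide(mass_g_cm2, thickness_cm,
                           out=np.zeros_like(mass_g_cm2),
                           where=thickness_cm > 0)
    return conc_g_cm3 / MOLAR_MASS[element] * 1000.0  # g/cm3 -> mol/L

k_map = molarity_map(np.array([[120.0, 80.0]]), np.array([[2.0, 1.5]]), "K")
print(k_map)  # mol/L for each pixel
```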
NASA Astrophysics Data System (ADS)
Nilsson, E. J. C.; Pallon, J.; Przybylowicz, W. J.; Wang, Y. D.; Jönsson, K. I.
2014-08-01
Although heavy on labor and equipment, and thus not often applied, cryoanalysis of frozen hydrated biological specimens can provide information that better reflects the living state of the organism than analysis in the freeze-dried state. In this paper we report a study in which the cryoanalysis facility with cryosectioning capabilities at the Materials Research Department, iThemba LABS, South Africa, was employed to evaluate the usefulness of combining three ion beam analytical methods (μPIXE, RBS and STIM) to analyze a biological target for which a better elemental compositional description is needed - the tardigrade. Imaging as well as quantification results are of interest. In a previous study, the element composition and redistribution of elements in the desiccated and active states of two tardigrade species were investigated. This study included analysis of both whole and sectioned tardigrades, and the aim was to analyze each specimen twice: first frozen hydrated and later freeze-dried. The combination of the three analytical techniques proved useful: elements from C to Rb in the tardigrades could be determined, and certain differences in the distribution of elements between the frozen hydrated and the freeze-dried states were observed. RBS on frozen hydrated specimens provided knowledge of matrix elements.
den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M
2017-03-01
Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although ¹H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen; quantification was consistent with ¹H NMR, but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Sivasubramaniam, Kiruba
This thesis makes advances in three dimensional finite element analysis of electrical machines and the quantification of their parameters and performance. The principal objectives of the thesis are: (1) the development of a stable and accurate method of nonlinear three-dimensional field computation and application to electrical machinery and devices; and (2) improvement in the accuracy of determination of performance parameters, particularly forces and torque computed from finite elements. Contributions are made in two general areas: a more efficient formulation for three dimensional finite element analysis which saves time and improves accuracy, and new post-processing techniques to calculate flux density values from a given finite element solution. A novel three-dimensional magnetostatic solution based on a modified scalar potential method is implemented. This method has significant advantages over the traditional total scalar, reduced scalar or vector potential methods. The new method is applied to a 3D geometry of an iron core inductor and a permanent magnet motor. The results obtained are compared with those obtained from traditional methods, in terms of accuracy and speed of computation. A technique which has been observed to improve force computation in two dimensional analysis using a local solution of Laplace's equation in the airgap of machines is investigated and a similar method is implemented in the three dimensional analysis of electromagnetic devices. A new integral formulation to improve force calculation from a smoother flux-density profile is also explored and implemented. Comparisons are made and conclusions drawn as to how much improvement is obtained and at what cost. This thesis also demonstrates the use of finite element analysis to analyze torque ripples due to rotor eccentricity in permanent magnet BLDC motors. A new method for analyzing torque harmonics based on data obtained from a time stepping finite element analysis of the machine is explored and implemented.
The use of discontinuities and functional groups to assess relative resilience in complex systems
Allen, Craig R.; Gunderson, Lance; Johnson, A.R.
2005-01-01
It is evident when the resilience of a system has been exceeded and the system qualitatively changed. However, it is not clear how to measure resilience in a system prior to the demonstration that the capacity for resilient response has been exceeded. We argue that self-organizing human and natural systems are structured by a relatively small set of processes operating across scales in time and space. These structuring processes should generate a discontinuous distribution of structures and frequencies, where discontinuities mark the transition from one scale to another. Resilience is not driven by the identity of elements of a system, but rather by the functions those elements provide, and their distribution within and across scales. A self-organizing system that is resilient should maintain patterns of function within and across scales despite the turnover of specific elements (for example, species, cities). However, the loss of functions, or a decrease in functional representation at certain scales will decrease system resilience. It follows that some distributions of function should be more resilient than others. We propose that the determination of discontinuities, and the quantification of function both within and across scales, produce relative measures of resilience in ecological and other systems. We describe a set of methods to assess the relative resilience of a system based upon the determination of discontinuities and the quantification of the distribution of functions in relation to those discontinuities. © 2005 Springer Science+Business Media, Inc.
Nabi, Nesrine; Chaouachi, Maher; Zellama, Mohamed Salem; Ben Hafsa, Ahmed; Mrabet, Besma; Saïd, Khaled; Fathia, Harzallah Skhiri
2016-04-01
The question asked in the present work was how to differentiate between contamination of field samples with Agrobacterium and GM plants that contain sequences derived from this bacterium, in order to avoid false positives in the detection and quantification of GMOs. For this, a new set of primers and corresponding TaqMan Minor Groove Binder (MGB) probes were designed to target Agrobacterium sp. using the tumor-morphology-shooty gene (TMS1). Final standard curves were calculated for each pathogen by plotting the threshold cycle value against the bacterial number (log(colony forming units) per milliliter) via linear regression. The method designed was highly specific and sensitive, with a detection limit of 10 CFU/ml. No significant cross-reaction was observed. Results from this study showed that TaqMan real-time PCR is potentially an effective method for the rapid and reliable quantification of Agrobacterium sp. in GMO-containing and non-GMO samples. Copyright © 2015 Elsevier Ltd. All rights reserved.
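The standard-curve step can be illustrated with a short sketch: regress the threshold cycle against log10 of the bacterial load, derive the amplification efficiency, and invert the fit for unknowns. The Ct values below are invented for the example and do not come from the paper.

```python
import numpy as np

# Illustrative Ct values measured on a ten-fold dilution series (invented data)
log_cfu = np.array([1, 2, 3, 4, 5, 6], dtype=float)   # log10(CFU/mL)
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7, 16.4])

slope, intercept = np.polyfit(log_cfu, ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                # ideal is ~1.0 (100%)
print(f"Ct = {slope:.2f} * log10(CFU/mL) + {intercept:.2f}")
print(f"amplification efficiency: {efficiency:.0%}")

# Back-calculate the bacterial load of an unknown sample from its Ct
ct_unknown = 27.5
print(f"estimated load: {10 ** ((ct_unknown - intercept) / slope):.2e} CFU/mL")
```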
Lapin, Guilherme Abbud Franco; Hochman, Bernardo; Nishioka, Michele Akemi; Maximino, Jessica Ruivo; Chadi, Gerson; Ferreira, Lydia Masako
2015-06-01
To describe and standardize a protocol that overcomes the technical limitations of Western blot (WB) analysis in the quantification of the neuropeptides substance P (SP) and calcitonin gene-related peptide (CGRP) following nociceptive stimuli in rat skin. Male Wistar rats (Rattus norvegicus albinus) weighing 250 to 350 g were used in this study. Elements of WB analysis were adapted by using specific manipulation of samples, repeated cycles of freezing and thawing, more thorough maceration, and a more potent homogenizer; increasing lytic reagents; promoting greater inhibition of protease activity; and using polyvinylidene fluoride membranes as the transfer medium for skin-specific protein. Other changes were also made to adapt the WB analysis to a rat model. The work was carried out at a university research center, using WB analysis adapted to a rat model. This research design has proven effective in collecting and preparing skin samples to quantify SP and CGRP using WB analysis in rat skin. This study described a research design that uses WB analysis as a reproducible, technically accessible, and cost-effective method for the quantification of SP and CGRP in rat skin that overcomes technical biases.
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, Bradley M.; Stuckelberger, Michael; Jeffries, April
2017-01-01
The study of a multilayered and multicomponent system by spatially resolved X-ray fluorescence microscopy poses unique challenges in achieving accurate quantification of elemental distributions. This is particularly true for the quantification of materials with high X-ray attenuation coefficients, depth-dependent composition variations and thickness variations. A widely applicable procedure for use after spectrum fitting and quantification is described. This procedure corrects the elemental distribution from the measured fluorescence signal, taking into account attenuation of the incident beam and generated fluorescence from multiple layers, and accounts for sample thickness variations. Deriving from Beer–Lambert's law, formulae are presented in a general integral form and numerically applicable framework. Here, the procedure is applied using experimental data from a solar cell with a Cu(In,Ga)Se2 absorber layer, measured at two separate synchrotron beamlines with varied measurement geometries. This example shows the importance of these corrections in real material systems, which can change the interpretation of the measured distributions dramatically.
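For a single uniform layer, the Beer-Lambert correction reduces to a closed-form average attenuation factor, as in the sketch below; this is a simplified single-layer case with illustrative coefficients, not the multi-layer formulae of the paper.

```python
import numpy as np

def attenuation_factor(mu_in, mu_out, rho, thickness_cm,
                       theta_in_deg, theta_out_deg):
    """Average self-attenuation of fluorescence generated uniformly in one layer.

    mu_in, mu_out: mass attenuation coefficients (cm2/g) at the incident and
    fluorescence energies; rho: layer density (g/cm3). Returns a factor in
    (0, 1]; dividing the measured signal by it recovers the unattenuated signal.
    """
    a = rho * (mu_in / np.sin(np.radians(theta_in_deg))
               + mu_out / np.sin(np.radians(theta_out_deg)))
    x = a * thickness_cm
    return (1.0 - np.exp(-x)) / x if x > 0 else 1.0

# Illustrative numbers for a ~2 um absorber layer (not beamline parameters)
factor = attenuation_factor(mu_in=50.0, mu_out=200.0, rho=5.7,
                            thickness_cm=2e-4,
                            theta_in_deg=90.0, theta_out_deg=15.0)
corrected = 1.0e3 / factor   # measured counts divided by the attenuation factor
print(f"attenuation factor: {factor:.3f}, corrected counts: {corrected:.0f}")
```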
López-Heras, Isabel; Madrid, Yolanda; Cámara, Carmen
2014-06-01
In this work, we proposed an analytical approach based on asymmetrical flow field-flow fractionation combined with inductively coupled plasma mass spectrometry (AsFlFFF-ICP-MS) for the characterization and quantification of rutile titanium dioxide nanoparticles (TiO2NPs) in cosmetic and food products. AsFlFFF-ICP-MS separation of TiO2NPs was performed using 0.2% (w/v) SDS, 6% (v/v) methanol at pH 8.7 as the carrier solution. Two problems were addressed during TiO2NPs analysis by AsFlFFF-ICP-MS: size distribution determination and element quantification of the NPs. Two approaches were used for size determination: size calibration using polystyrene latex standards of known sizes and transmission electron microscopy (TEM). A method based on focused sonication for preparing NP dispersions, followed by an on-line external calibration strategy based on AsFlFFF-ICP-MS using rutile TiO2NPs as standards, is presented here for the first time. The developed method suppressed non-specific interactions between NPs and the membrane, and overcame the possibly erroneous results obtained when quantification is performed using ionic Ti solutions. The applicability of the quantification method was tested on cosmetic products (moisturizing cream). Regarding validation, at the 95% confidence level, no significant differences were detected between the titanium concentrations in the moisturizing cream determined after sample mineralization (3865±139 mg Ti/kg sample), by FIA-ICP-MS analysis after NP extraction (3770±24 mg Ti/kg sample), and after using the optimized on-line calibration approach (3699±145 mg Ti/kg sample). Despite the high Ti content found in the studied food products (sugar glass and coffee cream), TiO2NPs were not detected. Copyright © 2014 Elsevier B.V. All rights reserved.
Uncertainty quantification applied to the radiological characterization of radioactive waste.
Zaffora, B; Magistris, M; Saporta, G; Chevalier, J-P
2017-09-01
This paper describes the process adopted at the European Organization for Nuclear Research (CERN) to quantify uncertainties affecting the characterization of very-low-level radioactive waste. Radioactive waste is a by-product of the operation of high-energy particle accelerators. Radioactive waste must be characterized to ensure its safe disposal in final repositories. Characterizing radioactive waste means establishing the list of radionuclides together with their activities. The estimated activity levels are compared to the limits given by the national authority of the waste disposal. The quantification of the uncertainty affecting the concentration of the radionuclides is therefore essential to estimate the acceptability of the waste in the final repository but also to control the sorting, volume reduction and packaging phases of the characterization process. The characterization method consists of estimating the activity of produced radionuclides either by experimental methods or statistical approaches. The uncertainties are estimated using classical statistical methods and uncertainty propagation. A mixed multivariate random vector is built to generate random input parameters for the activity calculations. The random vector is a robust tool to account for the unknown radiological history of legacy waste. This analytical technique is also particularly useful to generate random chemical compositions of materials when the trace element concentrations are not available or cannot be measured. The methodology was validated using a waste population of legacy copper activated at CERN. The methodology introduced here represents a first approach for the uncertainty quantification (UQ) of the characterization process of waste produced at particle accelerators. Copyright © 2017 Elsevier Ltd. All rights reserved.
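The propagation step can be pictured with a minimal Monte Carlo sketch: draw random input parameters (here a trace cobalt concentration and two scaling factors, all with invented distributions), compute the resulting activity for each draw, and summarize the spread. This is only an illustration of the approach, not CERN's activation model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000

# Illustrative inputs (not CERN values): cobalt trace concentration in copper
# (log-normal, ppm), an activation scaling factor and a measurement factor.
co_ppm = rng.lognormal(mean=np.log(10.0), sigma=0.5, size=N)
activation = rng.normal(loc=1.0, scale=0.15, size=N)   # relative units
gamma_meas = rng.normal(loc=1.0, scale=0.05, size=N)

# Co-60 activity per kg of waste, in arbitrary units proportional to the inputs
activity = co_ppm * activation * gamma_meas

lo, med, hi = np.percentile(activity, [2.5, 50, 97.5])
print(f"median: {med:.1f}, 95% interval: [{lo:.1f}, {hi:.1f}]")
```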
Jann, Johann-Christoph; Nowak, Daniel; Nolte, Florian; Fey, Stephanie; Nowak, Verena; Obländer, Julia; Pressler, Jovita; Palme, Iris; Xanthopoulos, Christina; Fabarius, Alice; Platzbecker, Uwe; Giagounidis, Aristoteles; Götze, Katharina; Letsch, Anne; Haase, Detlef; Schlenk, Richard; Bug, Gesine; Lübbert, Michael; Ganser, Arnold; Germing, Ulrich; Haferlach, Claudia; Hofmann, Wolf-Karsten; Mossner, Maximilian
2017-01-01
Background Cytogenetic aberrations such as deletion of chromosome 5q (del(5q)) represent key elements in routine clinical diagnostics of haematological malignancies. Currently established methods such as metaphase cytogenetics, FISH or array-based approaches have limitations due to their dependency on viable cells, high costs or semi-quantitative nature. Importantly, they cannot be used on low abundance DNA. We therefore aimed to establish a robust and quantitative technique that overcomes these shortcomings. Methods For precise determination of del(5q) cell fractions, we developed an inexpensive multiplex-PCR assay requiring only nanograms of DNA that simultaneously measures allelic imbalances of 12 independent short tandem repeat markers. Results Application of this method to n=1142 samples from n=260 individuals revealed strong intermarker concordance (R²=0.77–0.97) and reproducibility (mean SD: 1.7%). Notably, the assay showed accurate quantification via standard curve assessment (R²>0.99) and high concordance with paired FISH measurements (R²=0.92) even with subnanogram amounts of DNA. Moreover, cytogenetic response was reliably confirmed in del(5q) patients with myelodysplastic syndromes treated with lenalidomide. While the assay demonstrated good diagnostic accuracy in receiver operating characteristic analysis (area under the curve: 0.97), we further observed robust correlation between bone marrow and peripheral blood samples (R²=0.79), suggesting its potential suitability for less-invasive clonal monitoring. Conclusions In conclusion, we present an adaptable tool for quantification of chromosomal aberrations, particularly in problematic samples, which should be easily applicable to further tumour entities. PMID:28600436
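One simple way the allelic imbalance of an informative STR marker can be translated into a del(5q) cell fraction, under a one-copy-loss model, is sketched below; the peak areas are invented, and the paper's exact calibration (via standard curves) may differ.

```python
import numpy as np

def del5q_fraction(peak_deleted, peak_retained):
    """Cell fraction carrying the deletion under a simple one-copy-loss model.

    If a fraction f of cells has lost one allele, the deleted allele contributes
    (1 - f) relative to the retained allele, so f = 1 - ratio.
    """
    ratio = peak_deleted / peak_retained
    return 1.0 - min(ratio, 1.0)

# Peak areas of heterozygous 5q STR markers (invented values)
markers = [(820.0, 1510.0), (640.0, 1180.0), (910.0, 1705.0)]
fractions = [del5q_fraction(d, r) for d, r in markers]
print(f"del(5q) clone size: {np.mean(fractions):.1%} "
      f"(SD {np.std(fractions, ddof=1):.1%} across {len(markers)} markers)")
```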
NASA Astrophysics Data System (ADS)
Nunes, Teresa; Mirante, Fátima; Almeida, Elza; Pio, Casimiro
2010-05-01
Atmospheric carbon consists of organic carbon (OC, including various organic compounds), elemental carbon (EC, or black carbon [BC]/soot, a non-volatile, light-absorbing carbon), and a small quantity of carbonate carbon. Thermal/optical methods (TOM) have been widely used for quantifying total carbon (TC), OC, and EC in ambient and source particulate samples. Unfortunately, the different thermal evolution protocols in use can result in a wide variation of the elemental carbon-to-total carbon ratio. Temperature evolution in thermal carbon analysis is critical to the allocation of carbon fractions. Another critical point in OC and EC quantification by TOM is the interference of carbonate carbon (CC) that can be present in particulate samples, mainly in the coarse fraction of atmospheric aerosol. One of the methods used to minimize this interference consists of pre-treating the sample with acid to eliminate CC prior to thermal analysis (Chow et al., 2001; Pio et al., 1994). In Europe, there is currently no standard procedure for determining the carbonaceous aerosol fraction, which implies that data from different laboratories at various sites are of unknown accuracy and cannot be considered comparable. In the framework of the EU project EUSAAR, a comprehensive study has been carried out to identify the causes of differences in the EC measured using different thermal evolution protocols. From this study an optimised protocol, the EUSAAR-2 protocol, was defined (Cavali et al., 2009). During the last two decades thousands of aerosol samples have been collected on quartz filters at urban, industrial, rural and background sites, as well as from forest fire plumes and from biomass burning in a domestic closed stove. These samples were analysed for OC and EC by a TOM similar to that in use in the IMPROVE network (Pio et al., 2007). More recently we reduced the number of steps in the thermal evolution protocol, without significant repercussions for the OC/EC quantifications. In order to evaluate the possibility of continuing to use the historical data set for trend analysis, we performed an inter-comparison between our method and an adaptation of the EUSAAR-2 protocol, taking into account that this latter protocol will possibly be recommended for analysing carbonaceous aerosols at European sites. In this inter-comparison we tested different types of samples (PM2.5, PM2.5-10, PM10) covering a wide range of carbon loadings, with and without acidification pre-treatment. For a subset of samples, five replicates of each were analysed by each method for statistical purposes. The inter-comparison study revealed that when the sample analyses were performed under similar room conditions, the two thermo-optical methods give similar results for TC, OC and EC, without significant differences at the 95% confidence level. The correlation between the methods (DAO and EUSAAR-2) is weaker for EC than for TC and OC, although it still shows a correlation coefficient above 0.95 with a slope close to one. For samples analysed in different periods, room temperature seems to have a significant effect on OC quantification. Sample pre-treatment with HCl fumigation tends to decrease the TC quantification, mainly because the more volatile organic fraction is released during the first heating step.
For a set of 20 domestic biomass burning samples analysed by the DAO method we observed an average decrease in TC quantification of 3.7% relative to non-acidified samples, even though this decrease was accompanied by an average increase in the less volatile organic fraction. The indirect measurement of carbonate carbon, usually a minor carbon component of the carbonaceous aerosol, based on the difference between the TC measured by TOM on acidified and non-acidified samples, is not robust, considering the biases affecting its quantification. The present study shows that the two thermo-optical temperature programs used for OC and EC quantification give similar results, and if the EUSAAR-2 protocol is adopted in the future, the past measurements of carbonaceous fractions can be used for trend analysis. However, this study demonstrates that temperature control during post-sampling handling is a critical point in OC and TC quantification that must be addressed in the new European protocol. References: Cavali et al., 2009, AMTD, 2, 2321-2345. Chow et al., 2001, Aerosol Sci. Technol., 34, 23-34. Pio et al., 1994, Proceedings of the Sixth European Symposium on Physico-Chemical Behavior of Atmospheric Pollutants, Report EUR 15609/2 EN, pp. 706-711. Pio et al., 2007, J. Geophys. Res., 112, D23S02. Acknowledgement: This work was funded by the Portuguese Science Foundation through the projects POCI/AMB/60267/2004 and PTDC/AMB/65706/2006 (BIOEMI). F. Mirante acknowledges the PhD grant SFRH/BD/45473/2008.
NASA Astrophysics Data System (ADS)
Mota, Mariana F. B.; Gama, Ednilton M.; Rodrigues, Gabrielle de C.; Rodrigues, Guilherme D.; Nascentes, Clésia C.; Costa, Letícia M.
2018-01-01
In this work, a dilute-and-shoot method was developed for Ca, P, S and Zn determination in new and used lubricating oil samples by total reflection X-ray fluorescence (TXRF). The oil samples were diluted with organic solvents followed by addition of yttrium as internal standard, and the TXRF measurements were performed after solvent evaporation. The method was optimized using an interlaboratory reference material. The experimental parameters evaluated were sample volume (50 or 100 μL), measurement time (250 or 500 s) and volume deposited on the quartz glass sample carrier (5 or 10 μL). All of them were evaluated and optimized using xylene, kerosene and hexane. Analytical figures of merit (accuracy, precision, limits of detection and quantification) were used to evaluate the performance of the analytical method for all solvents. The recovery rates varied from 99 to 111% and the relative standard deviation remained between 1.7% and 10% (n = 8). For all elements, the results obtained by applying the new method were in agreement with the certified value. After the validation step, the method was applied for Ca, P, S and Zn quantification in eight new and four used lubricating oil samples, for all solvents. The concentration of the elements in the samples varied in the ranges of 1620-3711 mg L⁻¹ for Ca, 704-1277 mg L⁻¹ for P, 2027-9147 mg L⁻¹ for S, and 898-1593 mg L⁻¹ for Zn. The association of TXRF with a dilute-and-shoot sample preparation strategy was efficient for Ca, P, S and Zn determination in lubricating oils, providing accurate results. Additionally, the time required for analysis is short, the reagent volumes are low, minimizing waste generation, and the technique does not require calibration curves.
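Quantification against the yttrium internal standard follows the usual TXRF relation of intensity ratios scaled by relative sensitivities; the sketch below shows that relation with invented intensities and sensitivity factors, not values from this work.

```python
# Minimal sketch of internal-standard quantification as commonly applied in
# TXRF: the analyte concentration follows from the ratio of its net peak
# intensity to that of the yttrium internal standard, scaled by relative
# sensitivities. Intensities and sensitivities below are invented numbers.
def txrf_concentration(n_analyte, n_is, s_analyte, s_is, c_is_mg_l):
    """C_analyte = C_IS * (N_analyte / N_IS) * (S_IS / S_analyte)."""
    return c_is_mg_l * (n_analyte / n_is) * (s_is / s_analyte)

c_zn = txrf_concentration(n_analyte=48200, n_is=15400,
                          s_analyte=1.35, s_is=1.00, c_is_mg_l=50.0)
print(f"Zn: {c_zn:.0f} mg/L in the diluted sample")
```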
Bergamaschi, B.A.; Baston, D.S.; Crepeau, K.L.; Kuivila, K.M.
1999-01-01
An analytical method useful for the quantification of a range of pesticides and pesticide degradation products associated with suspended sediments was developed by testing a variety of extraction and cleanup schemes. The final extraction and cleanup methods chosen for use are suitable for the quantification of the listed pesticides using gas chromatography-ion trap mass spectrometry and the removal of interfering coextractable organic material found in suspended sediments. Methylene chloride extraction followed by Florisil cleanup proved most effective for separation of coextractives from the pesticide analytes. Removal of elemental sulfur was accomplished with tetrabutylammonium hydrogen sulfite. The suitability of the method for the analysis of a variety of pesticides was evaluated, and the method detection limits (MDLs) were determined (0.1-6.0 ng/g dry weight of sediment) for 21 compounds. Recovery of pesticides dried onto natural sediments averaged 63%. Analysis of duplicate San Joaquin River suspended-sediment samples demonstrated the utility of the method for environmental samples with variability between replicate analyses lower than between environmental samples. Eight of 21 pesticides measured were observed at concentrations ranging from the MDL to more than 80 ng/g dry weight of sediment and exhibited significant temporal variability. Sediment-associated pesticides, therefore, may contribute to the transport of pesticides through aquatic systems and should be studied separately from dissolved pesticides.
Laursen, Kristoffer; Adamsen, Christina E; Laursen, Jens; Olsen, Karsten; Møller, Jens K S
2008-03-01
Zinc-protoporphyrin (Zn-pp), which has been identified as the major pigment in certain dry-cured meat products, was extracted with acetone/water (75%) and isolated from the following meat products: Parma ham, Iberian ham and dry-cured ham with added nitrite. The quantification of Zn-pp by electron absorption, fluorescence and X-ray fluorescence (XRF) spectroscopy was compared (concentration range used: [Zn-pp] = 0.8-9.7 μM). All three hams were found to contain Zn-pp, and the results show no significant difference among the content of Zn-pp quantified by fluorescence, absorbance and X-ray fluorescence spectroscopy for Parma ham and Iberian ham. All three methods can be used for quantification of Zn-pp in acetone/water extracts of different ham types if the content is higher than 1.0 ppm. For dry-cured ham with added nitrite, XRF was not applicable due to the low content of Zn-pp (< 0.1 ppm). In addition, XRF spectroscopy provides further information regarding other trace elements and can therefore be advantageous in this aspect. This study also focused on XRF determination of Fe in the extracts and, as no detectable Fe was found in the three types of ham extracts investigated (limit of detection: Fe ⩽ 1.8 ppm), it allows the conclusion that iron-containing pigments, e.g., heme, do not contribute to the noticeable red colour observed in some of the extracts.
NASA Astrophysics Data System (ADS)
Atlas, Z. D.; Pasek, M. A.; Sampson, J.
2014-12-01
Phosphorus is a geologically important element, making up approximately 0.12% of the Earth's crust. It is commonly found as relatively insoluble apatite, and this causes phosphorus to be a limiting nutrient in biologic processes. Despite this, phosphorus is a key element in DNA, RNA and other cellular materials. Recent work suggests that reduced phosphorus played a substantial role in the development of life on the early Earth. Reduced phosphorus is considerably more soluble than oxidized phosphorus, and reduced phosphorus may continue to play a role in biologic productivity. This study examines a new methodology for quantification of reduced phosphorus species separated by coupled HPLC-ICP-MS. We show that phosphorus species (P¹⁺, P³⁺ and P⁵⁺) are cleanly separated in the HPLC and, coupled with the ICP-MS reaction cell (using O2 gas), elemental P is effectively converted to P-O, producing lower background and flatter baseline chromatography. Results suggest very low detection limits (0.05 mM) for P species analyzed as P-O at m/z = 47. Additionally, this technique has the potential to speciate at least 5 other metastable forms of phosphorus. We verified this method on numerous materials, from leached Archean rocks to suburban retention pond waters, and many samples show small but detectable levels of reduced phosphorus. These data highlight a significant role of redox processing of phosphorus throughout the history of the Earth, with the reduced oxidation state phosphorus compounds, phosphite and hypophosphite, potentially acting as significant constituents in the anaerobic environment.
Risk and benefit of diffraction in Energy Dispersive X-ray fluorescence mapping
NASA Astrophysics Data System (ADS)
Nikonow, Wilhelm; Rammlmair, Dieter
2016-11-01
Energy dispersive X-ray fluorescence mapping (μ-EDXRF) is a fast and non-destructive method for chemical quantification and is therefore used in many scientific fields. The combination of spatial and chemical information is highly valuable for understanding geological processes. Problems occur with crystalline samples due to diffraction, which appears according to Bragg's law, depending on the energy of the X-ray beam, the incident angle and the crystal parameters. In the spectra these diffraction peaks can overlap with element peaks, suggesting higher element concentrations. The aim of this study is to investigate the effect of diffraction, the possibility of diffraction removal and potential geoscientific applications for X-ray mapping. In this work the μ-EDXRF M4 Tornado from Bruker was operated with a Rh tube and a polychromatic beam, with two SDD detectors mounted at ±90° to the tube. Because of the polychromatic beam, the Bragg condition is fulfilled for several mineral lattice planes. Since diffraction depends on the angle, it is shown that a novel correction approach can be applied by measuring from two different angles and calculating the minimum spectrum of the two detectors, thereby gaining a better limit of quantification for this method. Furthermore, it is possible to use the diffraction information for separation of differently oriented crystallites within a monomineralic aggregate and to obtain parameters such as the particle size distribution of the sample, as is done by thin-section image analysis in cross-polarized light. Only with μ-EDXRF can this be done on larger samples without preparation of thin sections.
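The two-detector correction amounts to taking the channel-wise minimum of the two simultaneously acquired spectra, since a Bragg reflection is unlikely to fulfil the diffraction condition for both geometries at the same energy. The sketch below demonstrates this on synthetic spectra; peak positions and intensities are invented.

```python
import numpy as np

# Two spectra of the same spot recorded by the detectors mounted at +/-90 deg
# to the tube. Fluorescence peaks appear in both, whereas a diffraction peak
# rarely satisfies the Bragg condition for both geometries at the same energy,
# so the channel-wise minimum suppresses it. The spectra here are synthetic.
rng = np.random.default_rng(1)
channels = np.arange(1024)
fluorescence = 200.0 * np.exp(-0.5 * ((channels - 300) / 6.0) ** 2)
diffraction = 500.0 * np.exp(-0.5 * ((channels - 310) / 4.0) ** 2)

spectrum_det1 = fluorescence + diffraction + rng.poisson(5, channels.size)  # sees the Bragg peak
spectrum_det2 = fluorescence + rng.poisson(5, channels.size)                # does not

corrected = np.minimum(spectrum_det1, spectrum_det2)   # diffraction peak removed
print(int(spectrum_det1[310]), int(corrected[310]))    # before vs after at the artefact
```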
Sancey, Lucie; Motto-Ros, Vincent; Kotb, Shady; Wang, Xiaochun; Lux, François; Panczer, Gérard; Yu, Jin; Tillement, Olivier
2014-01-01
Emission spectroscopy of laser-induced plasma was applied to elemental analysis of biological samples. Laser-induced breakdown spectroscopy (LIBS) performed on thin sections of rodent tissues: kidneys and tumor, allows the detection of inorganic elements such as (i) Na, Ca, Cu, Mg, P, and Fe, naturally present in the body and (ii) Si and Gd, detected after the injection of gadolinium-based nanoparticles. The animals were euthanized 1 to 24 hr after intravenous injection of particles. A two-dimensional scan of the sample, performed using a motorized micrometric 3D-stage, allowed the infrared laser beam exploring the surface with a lateral resolution less than 100 μm. Quantitative chemical images of Gd element inside the organ were obtained with sub-mM sensitivity. LIBS offers a simple and robust method to study the distribution of inorganic materials without any specific labeling. Moreover, the compatibility of the setup with standard optical microscopy emphasizes its potential to provide multiple images of the same biological tissue with different types of response: elemental, molecular, or cellular. PMID:24962015
Hernández González, Carolina; Cabezas, Alberto J Quejido; Díaz, Marta Fernández
2005-11-15
A 100-fold preconcentration procedure based on the separation of rare-earth elements (REEs) from water samples with an extraction chromatographic column has been developed. The separation of REEs from matrix elements (mainly Fe, alkaline and alkaline-earth elements) in water samples was performed by loading the samples, previously acidified to pH 2.0 with HNO3, onto a 2 ml column preconditioned with 20 ml of 0.01 M HNO3. Subsequently, REEs were quantitatively eluted with 20 ml of 7 M HNO3. This solution was evaporated to dryness and the final residue was dissolved in 10 ml of 2% HNO3 containing 1 μg l⁻¹ of cesium used as internal standard. The solution was directly analysed by inductively coupled plasma mass spectrometry (ICP-MS), using ultrasonic nebulization, obtaining quantification limits ranging from 0.05 to 0.10 ng l⁻¹. The proposed method has been applied to granitic waters running through fracture fillings coated by iron and manganese oxy-hydroxides in the area of the Ratones (Cáceres, Spain) old uranium mine.
Chemical Diversity along the Traverse of the Rover Spirit at Gusev Crater
NASA Technical Reports Server (NTRS)
Gellert, R.; Brueckner, J.; Clark, B. C.; Dreibus, G.; d'Uston, C.; Economou, T.; Klingelhoefer, G.; Lugmair, G.; Ming, D. W.; Morris, R. V.;
2006-01-01
The Alpha Particle X-ray Spectrometer (APXS) is part of the in situ payload of the Mars Exploration Rovers. It has determined the chemical composition of soils and rocks along the nearly 6 km long traverse of the rover Spirit. The measuring method - a combination of PIXE and XRF using Cm-244 sources - allowed the unambiguous identification of elemental compositions with high precision. Besides sample triage and quantification of salt-forming elements as indicators of aqueous alteration, the APXS also delivered important constraints to the mineralogy instruments (i.e., Mössbauer (MB), MiniTES, Pancam) on minerals and rock types. The mineralogy instruments, on the other hand, provided constraints on the minerals used for APXS normative calculations and, e.g., allowed the attribution of S to sulfate instead of sulfide or elemental sulfur. This abstract gives an updated overview of the data obtained up to our current rover position on sol 720 at the eastern base of the Columbia Hills. We will emphasize elemental correlations that imply the presence of certain minerals that cannot be identified by the MER mineralogy instruments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Zhou; Adams, Rachel M; Chourey, Karuna
2012-01-01
A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
Determination of elemental composition of shale rocks by laser induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Sanghapi, Hervé K.; Jain, Jinesh; Bol'shakov, Alexander; Lopano, Christina; McIntyre, Dustin; Russo, Richard
2016-08-01
In this study laser induced breakdown spectroscopy (LIBS) is used for elemental characterization of outcrop samples from the Marcellus Shale. Powdered samples were pressed to form pellets and used for LIBS analysis. Partial least squares regression (PLS-R) and univariate calibration curves were used for quantification of analytes. The matrix effect is substantially reduced using the partial least squares calibration method. Predicted results with LIBS are compared to ICP-OES results for Si, Al, Ti, Mg, and Ca. As for C, its results are compared to those obtained by a carbon analyzer. Relative errors of the LIBS measurements are in the range of 1.7 to 12.6%. The limits of detection (LODs) obtained for Si, Al, Ti, Mg and Ca are 60.9, 33.0, 15.6, 4.2 and 0.03 ppm, respectively. An LOD of 0.4 wt.% was obtained for carbon. This study shows that the LIBS method can provide a rapid analysis of shale samples and can potentially benefit depleted gas shale carbon storage research.
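A PLS calibration of the kind described can be set up in a few lines with scikit-learn, as in the hedged sketch below; the spectra and reference concentrations are synthetic stand-ins, and the number of latent variables (5) is an arbitrary choice for the example.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Synthetic stand-in for a LIBS calibration: X holds spectra (rows = pelletised
# standards, columns = intensity channels), y the reference Si concentrations
# from ICP-OES. Real spectra and reference values would replace these.
rng = np.random.default_rng(0)
y = rng.uniform(10, 35, size=40)                      # wt.% Si
X = np.outer(y, rng.random(500)) + rng.normal(0, 0.5, (40, 500))

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
rmse = np.sqrt(np.mean((y_cv - y) ** 2))
print(f"cross-validated RMSE: {rmse:.2f} wt.%")

pls.fit(X, y)
print("predicted Si for the first standard:", pls.predict(X[:1]).ravel()[0])
```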
2012-01-01
Background Large-scale use of tobacco causes many health problems in humans. Various formulations of tobacco are extensively used by people, particularly in the developing world. Besides several toxic tobacco constituents, some metals and metalloids are also believed to pose health risks. This paper describes inductively coupled plasma mass spectrometric (ICP-MS) quantification of some important metals and metalloids in various brands of smoked, sniffed, dipped and chewed tobacco products. Results A microwave-assisted digestion method was used for sample preparation. The method was validated by analyzing a certified reference material. The percentage relative standard deviation (% R.S.D.) between recovered and certified values was < 5.8. The linearity of the calibration curve for each metal was 1 > r > 0.999. Improved limits of detection (LODs) were in the range of ng/L for all elements. Fe, Al and Mn were found at the highest concentrations in all types of tobacco products, while Zn, Cu, Ni and Cr were below an average concentration of 40 μg/g, and Pb, Co, As, Se and Cd were below 5 μg/g. All elements, apart from Pb, were higher in concentration in dipping tobacco than in the other tobacco products. Generally, the order of elemental concentrations in the different tobacco products can be expressed as chewing < smoked < sniffing < dipping, although smoked and sniffing tobacco interchange positions in the case of Mn, Cu, Se and Cd. Multivariate statistical analyses were also performed to evaluate the correlations and variations among tobacco products. Conclusions The present study highlights the quantification of some important metals and metalloids in a wide spectrum of tobacco formulations. The outcome of this study would be beneficial for health authorities and individuals. PMID:22709464
NASA Astrophysics Data System (ADS)
Saha, Abhijit; Deb, S. B.; Nagar, B. K.; Saxena, M. K.
An analytical methodology was developed for the precise quantification of ten trace rare earth elements (REEs), namely La, Ce, Pr, Nd, Sm, Eu, Tb, Dy, Ho, and Tm, in gadolinium aluminate (GdAlO3) employing an ultrasonic nebulizer (USN)-desolvating device based inductively coupled plasma mass spectrometry (ICP-MS). A microwave digestion procedure was optimized for digesting 100 mg of the refractory oxide using a mixture of sulphuric acid (H2SO4), phosphoric acid (H3PO4) and water (H2O) with 1400 W power, 10 min ramp and 60 min hold time. A USN-desolvating sample introduction system was employed to enhance analyte sensitivities by minimizing their oxide ion formation in the plasma. Studies on the effect of various matrix concentrations on the analyte intensities revealed that precise quantification of the analytes was possible with a matrix level of 250 mg L⁻¹. The possibility of using indium as an internal standard was explored and applied to correct for matrix effects and variation in analyte sensitivity under plasma operating conditions. Individual oxide ion formation yields were determined in a matrix-matched solution and employed for correcting polyatomic interferences of light REE (LREE) oxide ions on the intensities of middle and heavy rare earth elements (MREEs and HREEs). Recoveries of ≥ 90% were achieved for the analytes employing the standard addition technique. Three real samples were analyzed for traces of REEs by the proposed method and cross-validated for Eu and Nd by isotope dilution mass spectrometry (IDMS). The results show no significant difference in the values at the 95% confidence level. The expanded uncertainty (coverage factor 1σ) in the determination of trace REEs in the samples was found to be between 3 and 8%. The instrument detection limits (IDLs) and the method detection limits (MDLs) for the ten REEs lie in the ranges 1-5 ng L⁻¹ and 7-64 μg kg⁻¹, respectively.
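The oxide-interference correction mentioned above can be written as a one-line subtraction, shown in the sketch below for the classic 143Nd16O+ overlap on 159Tb+; the intensities and the NdO+/Nd+ yield are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of an oxide-interference correction: the signal of a heavy
# REE isotope is corrected by subtracting the contribution of a light-REE
# oxide ion falling on the same mass, using the oxide formation yield measured
# in a matrix-matched solution. All numbers below are illustrative.
def correct_oxide_interference(i_measured, i_parent, oxide_yield):
    """I_corrected = I_measured - yield(M-O+) * I(M+)."""
    return i_measured - oxide_yield * i_parent

# Example: 143Nd16O+ (mass 159) overlaps 159Tb+
i_tb159 = 12500.0        # counts/s at m/z 159
i_nd143 = 480000.0       # counts/s of the parent Nd isotope
nd_oxide_yield = 0.012   # NdO+/Nd+ measured in a matrix-matched solution
print(correct_oxide_interference(i_tb159, i_nd143, nd_oxide_yield))
```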
García-Florentino, Cristina; Maguregui, Maite; Romera-Fernández, Miriam; Queralt, Ignasi; Margui, Eva; Madariaga, Juan Manuel
2018-05-01
Wavelength dispersive X-ray fluorescence (WD-XRF) spectrometry has been widely used for elemental quantification of mortars and cements. In this kind of instrument, samples are usually prepared as pellets or fused beads and the whole volume of sample is measured at once. In this work, the usefulness of a dual energy dispersive X-ray fluorescence (ED-XRF) spectrometer, working at two lateral resolutions (1 mm and 25 μm) for macro- and microanalysis respectively, to develop quantitative methods for the elemental characterization of mortars and concretes is demonstrated. A crucial step before developing any quantitative method with this kind of spectrometer is to verify the homogeneity of the standards at these two lateral resolutions. This new ED-XRF quantitative method also demonstrated the importance of matrix effects on the accuracy of the results, making it necessary to use Certified Reference Materials as standards. The results obtained with the ED-XRF quantitative method were compared with those obtained with two WD-XRF quantitative methods employing two different sample preparation strategies (pellets and fused beads). The selected ED-XRF and both WD-XRF quantitative methods were applied to the analysis of real mortars. The accuracy of the ED-XRF results turned out to be similar to that achieved by WD-XRF, except for the lightest elements (Na and Mg). The results described in this work prove that μ-ED-XRF spectrometers can be used not only for acquiring high-resolution elemental distribution maps, but also to perform accurate quantitative studies, avoiding the use of more sophisticated WD-XRF systems or the acid extraction/alkaline fusion required as a destructive pretreatment in inductively coupled plasma mass spectrometry based procedures.
The first survey of airborne trace elements at airport using moss bag technique.
Vuković, Gordana; Urošević, Mira Aničić; Škrivanj, Sandra; Vergel, Konstantin; Tomašević, Milica; Popović, Aleksandar
2017-06-01
Air traffic is an important mode of mobility worldwide, and many ongoing discussions concern the impacts of air transportation on local air quality. In this study, the moss Sphagnum girgensohnii was used for the first time to assess trace element content at an international airport. The moss bags were exposed during the summer of 2013 at four sampling sites at the 'Nikola Tesla' airport (Belgrade, Serbia): the runway (two sites), the auxiliary runway and the parking lot. According to the relative accumulation factor (RAF) and the limit of quantification of the moss bag technique (LOQT), the most abundant elements in the samples were Zn, Na, Cr, V, Cu and Fe. A comparison between the element concentrations at the airport and the corresponding values in different land use classes (urban central, suburban, industrial and green zones) across the city of Belgrade did not indicate that air traffic and the associated activities contribute significantly to trace element air pollution. This study highlights easy-to-operate and robust (bio)monitoring with moss bags as a suitable method for assessing air quality in various microenvironments where the positioning of reference instrumental devices is restricted.
A mechanism for proven technology foresight for emerging fast reactor designs and concepts
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anuar, Nuraslinda, E-mail: nuraslinda@uniten.edu.my; Muhamad Pauzi, Anas, E-mail: anas@uniten.edu.my
The assessment of the viability of emerging nuclear fast reactor designs and concepts requires a combination of foresight methods. A mechanism that allows comparison and quantification of the possibility of a design becoming a proven technology in the future, β, for existing fast reactor designs and concepts is proposed as one such quantitative foresight method. The methodology starts with the identification, at the national or regional level, of the factors that would affect β. The factors are then categorized into several groups of economic, social and technology elements. Each of the elements is proposed to be mathematically modelled before all of the elemental models are combined. Once the overall β model is obtained, β_min is determined as a benchmark for acceptance as a candidate design or concept. The β values for all available designs and concepts are then determined and compared with β_min, resulting in a list of candidate designs whose β value is larger than β_min; a toy version of this screening step is sketched below. The proposed methodology can also be applied to purposes other than technological foresight.
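Because the abstract only outlines how the elemental models are combined into β and screened against β_min, the following toy sketch assumes a simple weighted-sum form with made-up weights, scores and threshold; it illustrates the screening step only, not the paper's actual model.

```python
# Toy sketch of the screening step described above: combine elemental scores
# into an overall beta and keep designs with beta >= beta_min. The additive
# weighted form and all numbers are hypothetical; the paper proposes that each
# element be modelled mathematically before combination.

weights = {"economic": 0.4, "social": 0.25, "technology": 0.35}  # assumed weights

designs = {
    "design_A": {"economic": 0.7, "social": 0.6, "technology": 0.8},
    "design_B": {"economic": 0.5, "social": 0.4, "technology": 0.6},
}

beta_min = 0.6  # benchmark acceptance threshold (placeholder)

def beta(scores):
    return sum(weights[k] * scores[k] for k in weights)

candidates = {name: beta(s) for name, s in designs.items() if beta(s) >= beta_min}
print(candidates)
```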
National data elements for the clinical management of acute coronary syndromes.
Chew, Derek P B; Allan, Roger M; Aroney, Constantine N; Sheerin, Noella J
2005-05-02
Patients with acute coronary syndromes represent a clinically diverse group and their care remains heterogeneous. These patients account for a significant burden of morbidity and mortality in Australia. Optimal patient outcomes depend on rapid diagnosis, accurate risk stratification and the effective implementation of proven therapies, as advocated by clinical guidelines. The challenge is in effectively applying evidence in clinical practice. Objectivity and standardised quantification of clinical practice are essential in understanding the evidence-practice gap. Observational registries are key to understanding the link between evidence-based medicine, clinical practice and patient outcome. Data elements for monitoring clinical management of patients with acute coronary syndromes have been adapted from internationally accepted definitions and incorporated into the National Health Data Dictionary, the national standard for health data definitions in Australia. Widespread use of these data elements will assist in the local development of "quality-of-care" initiatives and performance indicators, facilitate collaboration in cardiovascular outcomes research, and aid in the development of electronic data collection methods.
Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil
2012-01-01
Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for the detection and quantification of various chemical moieties. This paper describes the use of FT-IR spectroscopy for the quantification of total lactones present in Inula racemosa and Andrographis paniculata, with the objective of validating the FT-IR method for this purpose. Dried and powdered I. racemosa roots and A. paniculata plant material were extracted with ethanol and dried to remove the ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR method was validated and compared with a known spectrophotometric method for the quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method gave results comparable to those of a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification were 1 µg and 10 µg respectively for isoalantolactone, and 1.5 µg and 15 µg respectively for andrographolide. Recoveries were over 98%, with good intra- and interday repeatability (RSD ≤ 2%). The FT-IR method proved linear, accurate, precise and specific, with low limits of detection and quantification, for the estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR method is readily applicable to the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Salary, Mohammad Mahdi; Mosallaei, Hossein
2015-06-01
Interactions between the plasmons of noble metal nanoparticles and non-absorbing biomolecules form the basis of plasmonic sensors, which have received much attention. Studying these interactions can help to exploit the full potential of plasmonic sensors in the quantification and analysis of biomolecules. Here, a quasi-static continuum model is adopted for this purpose. We present a boundary-element method for computing the optical response of plasmonic particles to molecular binding events by solving the Poisson equation. The model represents biomolecules by their molecular surfaces, thus accurately accounting for the influence of exact binding conformations, as well as of structural differences between proteins, on the response of plasmonic nanoparticles. The linear systems arising in the method are solved iteratively with a Krylov generalized minimum residual algorithm, and acceleration is achieved by applying a precorrected fast Fourier transform technique. We apply the developed method to investigate interactions of biotinylated gold nanoparticles (a nanosphere and a nanorod) with four different types of biotin-binding proteins. The interactions are studied at both the ensemble and the single-molecule level. Computational results demonstrate the ability of the presented model to analyze realistic nanoparticle-biomolecule configurations. The method can support a wide variety of applications, including protein structure studies, monitoring of structural and conformational transitions, and quantification of protein concentrations. In addition, it is suitable for the design and optimization of nano-plasmonic sensors.
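The iterative-solution step mentioned above can be illustrated in isolation. The sketch below solves a small stand-in linear system with SciPy's GMRES through a matrix-free LinearOperator; the boundary-element assembly and the precorrected-FFT acceleration of the matrix-vector product are not reproduced, and the system itself is synthetic.

```python
# Minimal sketch of the iterative-solve step only: a small, well-conditioned
# stand-in system is solved with GMRES. The real method assembles a dense
# boundary-element operator and accelerates matrix-vector products with a
# precorrected FFT, which is not reproduced here.
import numpy as np
from scipy.sparse.linalg import gmres, LinearOperator

rng = np.random.default_rng(0)
n = 200
A = np.eye(n) + 0.01 * rng.standard_normal((n, n))  # diagonally dominant stand-in
b = rng.standard_normal(n)

# wrap the matrix-vector product; in a pFFT-accelerated BEM this is where the
# fast far-field evaluation would go
op = LinearOperator((n, n), matvec=lambda v: A @ v)

x, info = gmres(op, b)            # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b))
```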
Grate, Jay W; Gonzalez, Jhanis J; O'Hara, Matthew J; Kellogg, Cynthia M; Morrison, Samuel S; Koppenaal, David W; Chan, George C-Y; Mao, Xianglei; Zorba, Vassilia; Russo, Richard E
2017-09-08
Solid sampling and analysis methods, such as laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS), are challenged by matrix effects and calibration difficulties. Matrix-matched standards for external calibration are seldom available, and it is difficult to distribute spikes evenly into a solid matrix as internal standards. While isotopic ratios of the same element can be measured to high precision, matrix-dependent effects in the sampling and analysis process frustrate accurate quantification and elemental ratio determinations. Here we introduce a potentially general solid matrix transformation approach entailing chemical reactions in molten ammonium bifluoride (ABF) salt that enables the introduction of spikes as tracers or internal standards. Proof-of-principle experiments show that the decomposition of uranium ore in sealed PFA fluoropolymer vials at 230 °C yields, after cooling, new solids suitable for direct solid sampling by LA. When spikes are included in the molten salt reaction, subsequent LA-ICP-MS sampling at several spots indicates that the spikes are evenly distributed and that the U-235 tracer dramatically improves reproducibility in U-238 analysis. Precision improved from 17% relative standard deviation for U-238 signals to 0.1% for the ratio of sample U-238 to spiked U-235, an improvement of over two orders of magnitude. These results introduce the concept of solid matrix transformation (SMT) using ABF and provide proof of principle for a new method of incorporating internal standards into a solid for LA-ICP-MS. This new approach, SMT-LA-ICP-MS, provides opportunities to improve calibration and quantification in solids-based analysis. Looking forward, tracer addition to transformed solids opens up LA-based methods to analytical methodologies such as standard addition, isotope dilution, preparation of matrix-matched solid standards, external calibration, and monitoring of instrument drift against external calibration standards.
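The reproducibility gain from ratioing to an evenly distributed spike can be illustrated with synthetic numbers: a common ablation-yield factor multiplies both isotope signals and cancels in the ratio. The sketch below shows that effect only; the signal levels and noise terms are invented, not data from the study.

```python
# Illustration of why ratioing to an evenly distributed spike suppresses
# shot-to-shot laser-ablation variability. All numbers are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_spots = 20
ablation_yield = rng.normal(1.0, 0.17, n_spots)      # ~17% spot-to-spot variation

u238 = 1.0e6 * ablation_yield * rng.normal(1.0, 0.01, n_spots)  # sample signal
u235 = 2.0e5 * ablation_yield * rng.normal(1.0, 0.01, n_spots)  # spiked tracer signal

rsd = lambda x: 100 * np.std(x, ddof=1) / np.mean(x)
print(f"RSD of U-238 signal: {rsd(u238):.1f}%")
print(f"RSD of U-238/U-235 ratio: {rsd(u238 / u235):.2f}%")  # the yield term cancels
```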
Stochastic Analysis and Design of Heterogeneous Microstructural Materials System
NASA Astrophysics Data System (ADS)
Xu, Hongyi
Advanced materials systems refer to new materials that are composed of multiple traditional constituents but have complex microstructure morphologies, which lead to properties superior to those of conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, materials informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted from 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity. The properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Materials informatics is studied to efficiently reduce the dimension of the microstructure design space. This dissertation develops a machine-learning-based methodology to identify the key microstructure descriptors that strongly influence the properties of interest. In uncertainty quantification, a comparative study of data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification. Two new goodness-of-fit metrics are developed to provide quantitative measurements of random process models' accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling and design methodologies and techniques to accelerate the process of analyzing and designing new microstructural materials systems.
Evaluation of Downstream Regulatory Element Antagonistic Modulator Gene in Human Multinodular Goiter
Shinzato, Amanda; Lerario, Antonio M.; Lin, Chin J.; Danilovic, Debora S.; Marui, Suemi; Trarbach, Ericka B.
2015-01-01
Background: DREAM (Downstream Regulatory Element Antagonistic Modulator) is a neuronal calcium sensor that was suggested to modulate TSH receptor activity and whose overexpression provokes an enlargement of the thyroid gland in transgenic mice. The aim of this study was to investigate somatic mutations and DREAM gene expression in human multinodular goiter (MNG). Material/Methods: DNA and RNA samples were obtained from hyperplastic thyroid glands of 60 patients (54 females) with benign MNG. DREAM mutations were evaluated by PCR and direct automatic sequencing, whereas relative quantification of mRNA was performed by real-time PCR. Over- and under-expression were defined as a 2-fold increase and decrease in comparison to normal thyroid tissue, respectively. RQ M: relative quantification mean; SD: standard deviation. Results: DREAM expression was detected in all nodules evaluated. DREAM mRNA was overexpressed in 31.7% of MNG (RQ M=6.26; SD=5.08), whereas 53.3% and 15% had either normal expression (RQ M=1.16; SD=0.46) or underexpression (RQ M=0.30; SD=0.10), respectively. Regarding the DREAM mutation analysis, only previously described intronic polymorphisms were observed. Conclusions: We report DREAM gene expression in the hyperplastic thyroid gland of MNG patients. However, DREAM expression did not vary significantly, and was somewhat underexpressed in most patients, suggesting that DREAM upregulation does not significantly affect nodular development in human goiter. PMID:26319784
Trace Elemental Imaging of Rare Earth Elements Discriminates Tissues at Microscale in Flat Fossils
Gueriau, Pierre; Mocuta, Cristian; Dutheil, Didier B.; Cohen, Serge X.; Thiaudière, Dominique; Charbonnier, Sylvain; Clément, Gaël; Bertrand, Loïc
2014-01-01
The interpretation of flattened fossils remains a major challenge because compression of their complex anatomies during fossilization makes critical anatomical features invisible or hardly discernible. Key features are often hidden under well-preserved, decay-prone tissues or an unpreparable sedimentary matrix. A method offering access to such anatomical features is of paramount interest for resolving taxonomic affinities and for studying fossils with the least possible invasive preparation. Unfortunately, X-ray micro-computed tomography, widely used for visualizing hidden or internal structures of a broad range of fossils, is generally inapplicable to flattened specimens owing to the very high differential absorbance in distinct directions. Here we show that synchrotron X-ray fluorescence spectral raster-scanning, coupled to spectral decomposition or a much faster Kullback-Leibler divergence-based statistical analysis, provides microscale visualization of tissues. We imaged exceptionally well-preserved fossils from the Late Cretaceous without any prior delicate preparation. The contrasting elemental distributions greatly improved the discrimination of skeletal material from both the sedimentary matrix and fossilized soft tissues. Aside from the content of alkaline earth elements and phosphorus, a critical parameter for tissue discrimination is the distinct amounts of rare earth elements. Local quantification of rare earths may open new avenues not only for fossil description but also for paleoenvironmental and taphonomic studies. PMID:24489809
Determination of uranium and thorium using gamma spectrometry: a pilot study
NASA Astrophysics Data System (ADS)
Olivares, D. M. M.; Koch, E. S.; Guevara, M. V. M.; Velasco, F. G.
2018-03-01
This paper presents the results of a pilot experiment aimed at standardizing procedures at the CPqCTR/UESC Gamma Spectrometry Laboratory (LEG) for the quantification of natural radioactive elements in solid environmental samples. The concentrations of 238U, 232Th and 40K in two sediment matrices from the Caetité region were determined using the absolute method, with uncertainties of about 5%. The results were obtained by gamma spectrometry with a high-resolution p-type HPGe detector. Finally, the absorbed dose, the radium equivalent activity and the annual effective dose were calculated; a sketch of these closure calculations follows.
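Since the abstract names the closure quantities but not the expressions used, the sketch below assumes the commonly cited UNSCEAR-style formulas for radium equivalent activity, outdoor absorbed dose rate and annual effective dose; the activity concentrations are placeholders, not values from the study.

```python
# Hedged sketch of the closure calculations named above, using commonly cited
# UNSCEAR-style coefficients (an assumption; the paper's own expressions are
# not given in the abstract). Activities in Bq/kg are placeholders.
a_ra, a_th, a_k = 30.0, 40.0, 400.0   # 226Ra (238U series), 232Th, 40K

ra_eq = a_ra + 1.43 * a_th + 0.077 * a_k                 # radium equivalent activity, Bq/kg
dose_rate = 0.462 * a_ra + 0.604 * a_th + 0.0417 * a_k   # outdoor absorbed dose rate, nGy/h

# annual effective dose (mSv/y): 8760 h/y, 0.2 outdoor occupancy, 0.7 Sv/Gy conversion
annual_effective = dose_rate * 8760 * 0.2 * 0.7 * 1e-6

print(f"Ra_eq = {ra_eq:.1f} Bq/kg, D = {dose_rate:.1f} nGy/h, E = {annual_effective:.3f} mSv/y")
```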
Study on the early warning mechanism for the security of blast furnace hearths
NASA Astrophysics Data System (ADS)
Zhao, Hong-bo; Huo, Shou-feng; Cheng, Shu-sen
2013-04-01
The campaign life of blast furnace (BF) hearths has become the limiting factor for the safety and high-efficiency production of modern BFs. However, the early warning mechanism for hearth security has not been clearly established. In this article, based on heat transfer calculations and on heat flux and erosion monitoring, the features of heat flux and erosion were analyzed and compared among different types of hearths. The primary detecting elements, mathematical models, evaluation standards, and warning methods were discussed. A novel early warning mechanism with three-level quantitative standards was proposed for BF hearth security.
Platelet kinetics and biodistribution in canine endotoxemia
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sostman, H.D.; Zoghbi, S.S.; Smith, G.J.
Kinetics and magnitudes of changes in indium-labeled platelet biodistribution were studied in dogs given E. coli endotoxin. Marked, reversible, dose-dependent shifts of platelets from blood to lung and apparently irreversible shifts to liver were demonstrated. These were contemporaneous with alterations in blood gases and in pulmonary and systemic hemodynamics. Morphologic studies revealed atelectasis, sequestration of leukocytes and platelets in the lungs, and mild interstitial pulmonary edema. This study provides in vivo quantification of labeled platelet response to a specific stimulus, and illustrates a method that could be applied to more extensive study of blood element participation in acute lung injury.
Recent application of quantification II in Japanese medical research.
Suzuki, T; Kudo, A
1979-01-01
Hayashi's Quantification II is a method of multivariate discriminant analysis for handling attribute data as predictor variables. It is very useful in medical research for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on a multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles representing recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to clarifying how satisfied the researchers were with the findings provided by the method. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587
Trace analysis of high-purity graphite by LA-ICP-MS.
Pickhardt, C; Becker, J S
2001-07-01
Laser-ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has been established as a very efficient and sensitive technique for the direct analysis of solids. In this work the capability of LA-ICP-MS was investigated for the determination of trace elements in high-purity graphite. Synthetic laboratory standards with a graphite matrix were prepared for quantification of the analytical results. Doped trace elements, at a concentration of 0.5 microg g(-1) in a laboratory standard, were determined with an accuracy of ±1% to ±7% and a relative standard deviation (RSD) of 2-13%. Solution-based calibration was also used for the quantitative analysis of high-purity graphite. Such calibration was found to yield analytical results for trace-element determination in graphite with accuracy similar to that obtained by using synthetic laboratory standards for quantification. Results from the quantitative determination of trace impurities in a real reactor-graphite sample, using both quantification approaches, were in good agreement. Detection limits for all elements of interest were in the low ng g(-1) concentration range. A tenfold improvement in detection limits was achieved for analysis of high-purity graphite by LA-ICP-MS under wet plasma conditions, because of the lower background signal and increased element sensitivity.
Rodak, Bruna Wurr; Freitas, Douglas Siqueira; Bamberg, Soraya Marx; Carneiro, Marco Aurélio Carbone; Guilherme, Luiz Roberto Guimarães
2017-01-01
The symbiosis between legumes, arbuscular mycorrhizal (AM) fungi, and N2-fixing bacteria (NFB) provides mutual nutritional gains. However, assessing the nutritional status of the microorganisms is a difficult task, and a methodology that could assess this status in situ would assist in managing these organisms in agriculture. This study used X-ray microanalysis to quantify and locate mineral elements in the structures formed in a tripartite symbiosis. Lima bean (Phaseolus lunatus L. Walp) was cultivated in pots under greenhouse conditions, to which AM fungal isolates (Glomus macrocarpum and Acaulospora colombiana) and NFB (Bradyrhizobium japonicum) inocula were added. Uninoculated control plants were also included. Symbionts were evaluated at the onset of flowering. Quantification of the mineral elements in the symbiotic components was performed using energy dispersive X-ray spectroscopy (EDX), and scanning electron microscopy (SEM) was used to identify structures. EDX analysis detected 13 elements, the most abundant being N, Ca, and Se, which occurred in all tissues; Fe was found in roots, Ni and Al in the epidermis, and P and Mo in nodules. Elemental quantification in fungal structures was not possible. The distribution of elements was related to their symbiotic function. X-ray microanalysis can be efficiently applied to nutritional diagnosis in tripartite symbiosis. Copyright © 2016 Elsevier B.V. All rights reserved.
Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.
Mujika, Iñigo
2017-04-01
Training quantification is fundamental for evaluating an endurance athlete's responses to training loads, ensuring an adequate stress/recovery balance, and determining the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Commonly used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, the rating of perceived exertion stands out because of its wide use. Concurrent assessment of the various quantification methods allows researchers and practitioners to evaluate the stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
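Among the RPE-based internal-load measures mentioned above, one widely used choice is the session-RPE load (session RPE multiplied by duration in minutes). The sketch below illustrates that single metric with invented sessions; the review surveys many methods and does not prescribe this particular one.

```python
# Minimal sketch of one widely used RPE-based internal-load metric
# (session-RPE load = RPE x duration in minutes). Treated here as an
# illustrative choice; sessions and values are invented.
sessions = [
    {"day": "Mon", "rpe": 6, "minutes": 90},
    {"day": "Wed", "rpe": 8, "minutes": 60},
    {"day": "Sat", "rpe": 4, "minutes": 120},
]

loads = [s["rpe"] * s["minutes"] for s in sessions]   # arbitrary units per session
weekly_load = sum(loads)

print(loads, weekly_load)
```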
Pineda, Gina M; Montgomery, Anne H; Thompson, Robyn; Indest, Brooke; Carroll, Marion; Sinha, Sudhir K
2014-11-01
There is a constant need in forensic casework laboratories for improved ways to increase the first-pass success rate of forensic samples. Recent advances in mini-STR analysis, SNP, and Alu marker systems have made it possible to analyze highly compromised samples, yet few tools are available that can simultaneously provide an assessment of quantity, inhibition, and degradation in a sample prior to genotyping. Currently there are several different approaches used for fluorescence-based quantification assays, which provide a measure of quantity and inhibition. However, a system that can also assess the extent of degradation in a forensic sample will be a useful tool for DNA analysts. Possessing this information prior to genotyping allows an analyst to make better-informed downstream decisions for the successful typing of a forensic sample without unnecessarily consuming DNA extract. Real-time PCR provides a reliable method for determining the amount and quality of amplifiable DNA in a biological sample. Alu elements are Short Interspersed Elements (SINEs), approximately 300 bp insertions distributed throughout the human genome in large copy number. The use of an internal primer to amplify a segment of an Alu element allows for human specificity as well as high sensitivity compared with a single-copy target. The advantage of an Alu system is the presence of a large number (>1000) of fixed insertions in every human genome, which minimizes the individual-specific variation possible when using a multi-copy target quantification system. This study utilizes two independent retrotransposon genomic targets to quantify an 80 bp "short" DNA fragment and a 207 bp "long" DNA fragment in a degraded DNA sample in the multiplex system InnoQuant™. The ratio of the two quantitation values provides a "Degradation Index", a qualitative measure of a sample's extent of degradation. The Degradation Index was found to be predictive of the observed loss of STR markers and alleles as degradation increases. Use of a synthetic target as an internal positive control (IPC) provides an additional assessment of the presence of PCR inhibitors in the test sample. In conclusion, a DNA-based qualitative/quantitative/inhibition assessment system that accurately predicts the status of a biological sample will be a valuable tool for deciding which DNA test kit to utilize and how much target DNA to use when processing compromised forensic samples for DNA testing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
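The Degradation Index logic can be sketched generically: each target quantity is read off its own qPCR standard curve and the short/long ratio is reported. The slopes, intercepts and Cq values below are hypothetical placeholders, not InnoQuant kit parameters.

```python
# Hedged sketch of deriving a degradation index from two qPCR targets:
# quantities are back-calculated from target-specific standard curves
# (Cq = slope*log10(Q) + intercept) and the short/long ratio is reported.
# Slopes, intercepts and Cq values are hypothetical, not kit values.

def quantity(cq, slope, intercept):
    """Back-calculate input DNA (ng/uL) from a standard curve."""
    return 10 ** ((cq - intercept) / slope)

short_q = quantity(cq=28.1, slope=-3.32, intercept=20.0)   # 80 bp target
long_q = quantity(cq=31.5, slope=-3.35, intercept=21.0)    # 207 bp target

degradation_index = short_q / long_q   # ~1 for intact DNA, >>1 for degraded DNA
print(round(short_q, 4), round(long_q, 4), round(degradation_index, 1))
```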
NASA Astrophysics Data System (ADS)
Newbury, Dale E.; Ritchie, Nicholas W. M.
2012-06-01
Scanning electron microscopy with energy dispersive x-ray spectrometry (SEM/EDS) is a powerful and flexible elemental analysis method that can identify and quantify elements with atomic numbers ≥ 4 (Be) present as major constituents (concentration C > 0.1 mass fraction, or 10 weight percent), minor constituents (0.01 ≤ C ≤ 0.1), and trace constituents (C < 0.01, with a minimum detectable limit of ~0.0005-0.001 under routine measurement conditions, a level which is analyte and matrix dependent). SEM/EDS can select specimen volumes with linear dimensions from ~500 nm to 5 μm depending on composition (masses ranging from ~10 pg to 100 pg) and can provide compositional maps that depict lateral elemental distributions. Despite the maturity of SEM/EDS, which has a history of more than 40 years, and the sophistication of modern analytical software, the method is vulnerable to serious shortcomings that can lead to incorrect elemental identifications and quantification errors that significantly exceed reasonable expectations. This paper describes shortcomings in peak identification procedures, limitations on the accuracy of quantitative analysis due to specimen topography or failures in physical models for matrix corrections, and quantitative artifacts encountered in x-ray elemental mapping. Effective solutions to these problems are based on understanding the causes and then establishing appropriate measurement science protocols. NIST DTSA II and Lispix are open source analytical software, available free at www.nist.gov, that can aid the analyst in overcoming significant limitations to SEM/EDS.
Osborn, Sarah; Zulian, Patrick; Benson, Thomas; ...
2018-01-30
This work describes a domain embedding technique between two nonmatching meshes used for generating realizations of spatially correlated random fields with applications to large-scale sampling-based uncertainty quantification. The goal is to apply the multilevel Monte Carlo (MLMC) method for the quantification of output uncertainties of PDEs with random input coefficients on general and unstructured computational domains. We propose a highly scalable, hierarchical sampling method to generate realizations of a Gaussian random field on a given unstructured mesh by solving a reaction–diffusion PDE with a stochastic right-hand side. The stochastic PDE is discretized using the mixed finite element method on an embedded domain with a structured mesh, and then the solution is projected onto the unstructured mesh. This work describes implementation details on how to efficiently transfer data from the structured and unstructured meshes at coarse levels, assuming that this can be done efficiently on the finest level. We investigate the efficiency and parallel scalability of the technique for the scalable generation of Gaussian random fields in three dimensions. An application of the MLMC method is presented for quantifying uncertainties of subsurface flow problems. Here, we demonstrate the scalability of the sampling method with nonmatching mesh embedding, coupled with a parallel forward model problem solver, for large-scale 3D MLMC simulations with up to 1.9·10⁹ unknowns.
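The multilevel Monte Carlo estimator underlying the application above can be written as a telescoping sum of level differences, each averaged with its own sample size. The sketch below uses a trivial stand-in forward model rather than the paper's PDE solver, simply to show the structure of the estimator.

```python
# Generic multilevel Monte Carlo estimator sketch (not the paper's PDE solver):
# the expectation of a fine-level quantity of interest is written as a
# telescoping sum of level differences, each averaged with its own sample size.
import numpy as np

rng = np.random.default_rng(0)

def sample_qoi(level, xi):
    """Stand-in forward model: accuracy improves geometrically with level."""
    return np.sin(xi) + 2.0 ** (-level) * rng.standard_normal()

def mlmc_estimate(n_samples_per_level):
    total = 0.0
    for level, n in enumerate(n_samples_per_level):
        xi = rng.standard_normal(n)              # shared random inputs for this level
        fine = np.array([sample_qoi(level, x) for x in xi])
        if level == 0:
            diff = fine
        else:
            coarse = np.array([sample_qoi(level - 1, x) for x in xi])
            diff = fine - coarse                 # level difference on the same xi
        total += diff.mean()
    return total

print(mlmc_estimate([4000, 1000, 250]))          # decreasing samples on finer levels
```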
Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).
Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd
2011-01-01
The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.
Analysis of coke beverages by total-reflection X-ray fluorescence
NASA Astrophysics Data System (ADS)
Fernández-Ruiz, Ramón; von Bohlen, Alex; Friedrich K, E. Josue; Redrejo, M. J.
2018-07-01
The influence of the organic content, the sample preparation process, and the morphology of the depositions for two types of Coke beverage, traditional and light Coke, has been investigated by means of total-reflection X-ray fluorescence (TXRF) spectrometry. Strong distortions of the nominal concentration values, up to 128% for P, were detected in the analysis of traditional Coke by different preparation methods. These differences correlate with the X-ray absorption edge energies of the elements analyzed, being more pronounced for the lighter elements. The influence of the organic content (mainly sugar) was evaluated by comparing the TXRF results for traditional and light Coke. Three sample preparation methods were evaluated: direct TXRF analysis of the sample with only the addition of an internal standard, TXRF analysis after open-vessel acid digestion, and TXRF analysis after high-pressure, high-temperature microwave-assisted acid digestion. Strong correlations were detected between the quantitative results, the methods of preparation, and the energies of the X-ray absorption edges of the quantified elements. In this way, a decay behavior of the concentration differences between preparation methods with the energy of the X-ray absorption edge of each element was observed. The observed behavior was modeled with exponential decay functions, obtaining R² correlation coefficients from 0.989 to 0.992. The strong absorption effect observed, and even possible matrix effects, can be explained by the inherent high organic content of the evaluated samples and also by the morphology and average thickness of the TXRF depositions observed. As the main conclusion of this work, the analysis of light elements by TXRF in samples with high organic content, i.e. medical, biological, food or any other organic matrices, should be undertaken with care. In any case, direct analysis is not recommended, and a previous microwave-assisted acid digestion, or similar, is mandatory for correct elemental quantification by TXRF.
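The decay-versus-edge-energy modelling described above amounts to a three-parameter exponential fit plus an R² computation. The sketch below shows that fit on synthetic placeholder data (approximate K-edge energies and invented percentage differences), not on the values reported in the study.

```python
# Sketch of fitting an exponential decay to concentration differences versus
# absorption-edge energy and reporting R^2, as described above. Data points
# are synthetic placeholders, not values from the study.
import numpy as np
from scipy.optimize import curve_fit

edge_energy = np.array([2.15, 2.47, 3.61, 4.04, 4.97, 7.11])   # approx. K-edge energies, keV
rel_diff = np.array([128.0, 75.0, 30.0, 22.0, 12.0, 5.0])      # % difference between methods

def decay(x, a, b, c):
    return a * np.exp(-b * x) + c

popt, _ = curve_fit(decay, edge_energy, rel_diff, p0=(500.0, 0.7, 0.0))

residuals = rel_diff - decay(edge_energy, *popt)
r2 = 1 - np.sum(residuals**2) / np.sum((rel_diff - rel_diff.mean())**2)
print(popt, round(r2, 3))
```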
IDHEAS – A NEW APPROACH FOR HUMAN RELIABILITY ANALYSIS
DOE Office of Scientific and Technical Information (OSTI.GOV)
G. W. Parry; J. A. Forester; V. N. Dang
2013-09-01
This paper describes a method, IDHEAS (Integrated Decision-Tree Human Event Analysis System), that has been developed jointly by the US NRC and EPRI as an improved approach to Human Reliability Analysis (HRA) that is based on an understanding of the cognitive mechanisms and performance influencing factors (PIFs) that affect operator responses. The paper describes the various elements of the method, namely the performance of a detailed cognitive task analysis that is documented in a crew response tree (CRT), the development of the associated time-line to identify the critical tasks, i.e. those whose failure results in a human failure event (HFE), and an approach to quantification that is based on explanations of why the HFE might occur.
Probabilistic structural analysis methods of hot engine structures
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Hopkins, D. A.
1989-01-01
Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade temperature, pressure, and torque of the space shuttle main engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; and (3) evaluation of the failure probability. Collectively, the results demonstrate that the structural durability of hot engine structural components can be effectively evaluated in a formal probabilistic/reliability framework.
NASA Astrophysics Data System (ADS)
Passos, Tassia R. G.; Artur, Adriana G.; Nóbrega, Gabriel N.; Otero, Xosé L.; Ferreira, Tiago O.
2016-06-01
The performance of the Walkley-Black wet oxidation chemical method for soil organic carbon (SOC) determination in coastal wetland soils (mangroves, coastal lagoons, and hypersaline tidal flats) was evaluated in the state of Ceará along the semiarid coast of Brazil, assessing pyrite oxidation and its effects on soil C stock (SCS) quantification. SOC determined by the chemical oxidation method (CWB) was compared to that assessed by means of a standard elemental analyzer (CEA) for surficial samples (<30 cm depth) from the three wetland settings. The pyrite fraction was quantified in various steps of the chemical oxidation method, evaluating the effects of pyrite oxidation. Regardless of the method used, and consistent with site-specific physicochemical conditions, higher pyrite and SOC contents were recorded in the mangroves, whereas lower values were found in the other settings. CWB values were higher than CEA values. Significant differences in SCS calculations based on CWB and CEA were recorded for the coastal lagoons and hypersaline tidal flats. Nevertheless, the CWB and CEA values were strongly correlated, indicating that the wet oxidation chemical method can be used in such settings. In contrast, the absence of correlation for the mangroves provides evidence of the inadequacy of this method for these soils. Air drying and oxidation decrease the pyrite content, with larger effects rooted in oxidation. Thus, the wet oxidation chemical method is not recommended for mangrove soils, but seems appropriate for SOC/SCS quantification in hypersaline tidal flat and coastal lagoon soils characterized by lower pyrite contents.
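Soil C stock comparisons of the kind reported above typically combine the SOC concentration with bulk density and layer thickness. The sketch below uses the standard relation with placeholder values, since the abstract does not give bulk densities or exact depth increments beyond the <30 cm surficial interval.

```python
# Common soil-C-stock relation used for calculations like the SCS comparison
# above: SCS (kg C m-2) = SOC fraction x bulk density (g cm-3) x depth (cm) x 10.
# Bulk density and SOC values here are placeholders, not data from the study.
def soil_c_stock(soc_percent, bulk_density_g_cm3, depth_cm):
    return (soc_percent / 100.0) * bulk_density_g_cm3 * depth_cm * 10.0  # kg C m-2

# hypothetical surficial layer (0-30 cm) in two contrasting settings
print(soil_c_stock(soc_percent=4.0, bulk_density_g_cm3=0.8, depth_cm=30))   # mangrove-like
print(soil_c_stock(soc_percent=0.6, bulk_density_g_cm3=1.4, depth_cm=30))   # tidal-flat-like
```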
DOE Office of Scientific and Technical Information (OSTI.GOV)
Part, Florian; Zecha, Gudrun; Causon, Tim
Highlights: • First review on detection of nanomaterials in complex waste samples. • Focus on nanoparticles in solid, liquid and gaseous waste samples. • Summary of current applicable methods for nanowaste detection and characterisation. • Limitations and challenges of characterisation of nanoparticles in waste. - Abstract: Engineered nanomaterials (ENMs) are already extensively used in diverse consumer products. Along the life cycle of a nano-enabled product, ENMs can be released and subsequently accumulate in the environment. Material flow models also indicate that a variety of ENMs may accumulate in waste streams. Therefore, a new type of waste, so-called nanowaste, is generated when end-of-life ENMs and nano-enabled products are disposed of. In terms of the precautionary principle, environmental monitoring of end-of-life ENMs is crucial to allow assessment of the potential impact of nanowaste on our ecosystem. Trace analysis and quantification of nanoparticulate species is very challenging because of the variety of ENM types that are used in products and the low concentrations of nanowaste expected in complex environmental media. In the framework of this paper, challenges in nanowaste characterisation and appropriate analytical techniques which can be applied to nanowaste analysis are summarised. Recent case studies focussing on the characterisation of ENMs in waste streams are discussed. Most studies aim to investigate the fate of nanowaste during incineration, particularly considering aerosol measurements, whereas detailed studies focusing on the potential release of nanowaste during waste recycling processes are currently not available. In terms of suitable analytical methods, separation techniques coupled to spectrometry-based methods are promising tools to detect nanowaste and determine particle size distribution in liquid waste samples. Standardised leaching protocols can be applied to generate soluble fractions stemming from solid wastes, while micro- and ultrafiltration can be used to enrich nanoparticulate species. Imaging techniques combined with X-ray-based methods are powerful tools for determining particle size and morphology and for screening elemental composition. However, quantification of nanowaste is currently hampered by the difficulty of differentiating engineered from naturally occurring nanoparticles. A promising approach to face these challenges in nanowaste characterisation might be the application of nanotracers with unique optical properties or elemental or isotopic fingerprints. At present, there is also a need to develop and standardise analytical protocols regarding nanowaste sampling, separation and quantification. In general, more experimental studies are needed to examine the fate and transport of ENMs in waste streams and to deduce transfer coefficients, so that reliable material flow models can be developed.
Nanoparticle size detection limits by single particle ICP-MS for 40 elements.
Lee, Sungyun; Bi, Xiangyu; Reed, Robert B; Ranville, James F; Herckes, Pierre; Westerhoff, Paul
2014-09-02
The quantification and characterization of natural, engineered, and incidental nano- to micro-size particles are beneficial for assessing a nanomaterial's performance in manufacturing, its fate and transport in the environment, and its potential risk to human health. Single particle inductively coupled plasma mass spectrometry (spICP-MS) can sensitively quantify the amount and size distribution of metallic nanoparticles suspended in aqueous matrices. To accurately obtain the nanoparticle size distribution, it is critical to know the size detection limit (denoted Dmin) of spICP-MS for a wide range of elements, beyond the few assessed to date, that have been or will be synthesized into engineered nanoparticles. Herein a method is described to estimate the size detection limit using spICP-MS, which is then applied to nanoparticles composed of 40 different elements. For the few elements with detectable sizes available in the literature, the calculated Dmin values correspond well with those reports. Assuming each nanoparticle sample is composed of one element, Dmin values vary substantially among the 40 elements: Ta, U, Ir, Rh, Th, Ce, and Hf showed the lowest Dmin values, ≤10 nm; Bi, W, In, Pb, Pt, Ag, Au, Tl, Pd, Y, Ru, Cd, and Sb had Dmin in the range of 11-20 nm; Dmin values of Co, Sr, Sn, Zr, Ba, Te, Mo, Ni, V, Cu, Cr, Mg, Zn, Fe, Al, Li, and Ti lay at 21-80 nm; and Se, Ca, and Si showed high Dmin values, greater than 200 nm. A range of parameters that influence Dmin, such as instrument sensitivity, nanoparticle density, and background noise, is examined. When the background noise is low, instrument sensitivity and nanoparticle density largely determine Dmin. Approaches for reducing Dmin, e.g., collision cell technology (CCT) and analyte isotope selection, are also discussed. To validate the Dmin estimation approach, size distributions for three engineered nanoparticle samples were obtained using spICP-MS; the observed minimum detectable sizes are consistent with the calculated Dmin values. Overall, this work identifies the elements and nanoparticles to which current spICP-MS approaches can be applied, in order to enable quantification of very small nanoparticles at low concentrations in aqueous media.
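The Dmin logic can be sketched as two steps: a background-based signal threshold is converted to a minimum detectable per-particle mass via the element sensitivity, and that mass is converted to an equivalent spherical diameter using the particle density and analyte mass fraction. All instrument numbers below are hypothetical; the paper derives its Dmin values from measured sensitivities, densities and backgrounds for each element.

```python
# Hedged sketch of the Dmin logic: convert a background-based signal threshold
# into a minimum detectable particle mass, then into an equivalent spherical
# diameter. All instrument numbers are hypothetical placeholders.
import math

bg_mean, bg_sd = 2.0, 1.5          # background counts per dwell time
sensitivity = 2.5e7                # counts per nanogram of analyte detected (assumed)
density_g_cm3 = 19.3               # e.g., ~19.3 g/cm^3 for gold
density = density_g_cm3 * 1e-12    # convert to ng/nm^3 (1 g/cm^3 = 1e-12 ng/nm^3)
mass_fraction = 1.0                # analyte mass fraction in the particle

threshold_counts = bg_mean + 3 * bg_sd           # simple 3-sigma detection criterion
m_min = threshold_counts / sensitivity           # minimum detectable mass, ng

d_min = (6 * m_min / (math.pi * density * mass_fraction)) ** (1 / 3)   # nm
print(round(d_min, 1), "nm")
```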
Analysis of nanoparticles with an optical sensor based on carbon nanotubes
NASA Astrophysics Data System (ADS)
Stäb, J.; Furin, D.; Fechner, P.; Proll, G.; Soriano-Dotor, L. M.; Ruiz-Palomero, C.; Valcárcel, M.; Gauglitz, G.
2017-05-01
Nanomaterials play an important role in science and in everyday products because of their varied and specific properties; engineered nanoparticles (ENPs) in particular have shown beneficial properties for a wide range of applications in consumables (e.g. cosmetics, drinks, food and food packaging). Silver nanoparticles, for instance, are hidden in meat packaging materials or in deodorants, owing to the antibacterial effect of silver, which makes it highly applicable in consumer products. However, ENPs are under permanent discussion because of their unforeseen hazards and unknown disposition in living organisms and the environment. So far, there is a lack of methods that allow the fast and effective characterization and quantification of such nanoparticles in complex matrices (e.g. creams, fruit juice), since matrix components can impede a specific detection of the analyte. The objective of project INSTANT was to address this topic and, as a first step, to devise a method to detect nanoparticles. Therefore, a sensor system with an upstream sample preparation step was developed for the characterization and quantification of specific nanoparticles in complex matrices, using a label-free optical sensor array in combination with novel recognition elements. The promising optical technology iRIfS (imaging reflectometric interference sensor) was used for this purpose. Functionalized carbon nanotubes can be used effectively as a recognition element: owing to their excellent electronic, mechanical and chemical properties, CNTs have already been used as a sorbent material for extracting ENPs from complex matrices by filtration. After successful immobilization of CNTs on, e.g., microscope glass slides, the detection of stabilized silver nanoparticles extracted by a sample preparation unit was performed using the iRIfS technology.
Quantifying construction and demolition waste: An analytical review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin
2014-09-15
Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C and D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
2014-01-01
Background: Various computer-based methods exist for the detection and quantification of protein spots in two-dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results: In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses the fitting of two-dimensional Gaussian function curves for the extraction of data from two-dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data from urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our method showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion: Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and can be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
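The function-fitting idea evaluated above can be illustrated generically: fit a two-dimensional Gaussian to an image patch and report the analytic volume 2π·A·σx·σy instead of a pixel sum. The sketch below is a plain SciPy illustration on synthetic data, not the authors' MATLAB compound-fitting algorithm.

```python
# Sketch of the function-fitting approach to spot quantification: fit a 2D
# Gaussian to a patch of the gel image and report the analytic volume
# 2*pi*A*sigma_x*sigma_y instead of summing pixel intensities. Synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sx, sy, offset):
    x, y = coords
    return (amp * np.exp(-((x - x0) ** 2 / (2 * sx**2) + (y - y0) ** 2 / (2 * sy**2)))
            + offset).ravel()

# synthetic spot on a 40x40 pixel patch with additive noise
x, y = np.meshgrid(np.arange(40), np.arange(40))
rng = np.random.default_rng(0)
patch = gauss2d((x, y), 900, 20, 18, 3.5, 4.5, 50).reshape(40, 40) + rng.normal(0, 10, (40, 40))

p0 = (patch.max() - patch.min(), 20, 20, 5, 5, patch.min())
popt, _ = curve_fit(gauss2d, (x, y), patch.ravel(), p0=p0)

amp, _, _, sx, sy, _ = popt
volume = 2 * np.pi * amp * abs(sx) * abs(sy)     # analytic spot volume
print(round(volume, 1))
```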
Aeras: A next generation global atmosphere model
Spotz, William F.; Smith, Thomas M.; Demeshko, Irina P.; ...
2015-06-01
Sandia National Laboratories is developing a new global atmosphere model named Aeras that is performance portable and supports the quantification of uncertainties. These next-generation capabilities are enabled by building Aeras on top of Albany, a code base that supports the rapid development of scientific application codes while leveraging Sandia's foundational mathematics and computer science packages in Trilinos and Dakota. Embedded uncertainty quantification (UQ) is an original design capability of Albany, and performance portability is a recent upgrade. Other required features, such as shell-type elements, spectral elements, efficient explicit and semi-implicit time-stepping, transient sensitivity analysis, and concurrent ensembles, were not components of Albany as the project began, and have been (or are being) added by the Aeras team. We present early UQ and performance portability results for the shallow water equations.
NASA Astrophysics Data System (ADS)
Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier
2015-03-01
With the advances in PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.
How to quantify conduits in wood?
Scholz, Alexander; Klepsch, Matthias; Karimi, Zohreh; Jansen, Steven
2013-01-01
Vessels and tracheids represent the most important xylem cells with respect to long-distance water transport in plants. Wood anatomical studies frequently provide several quantitative details of these cells, such as vessel diameter, vessel density, vessel element length, and tracheid length, while important information on the three-dimensional structure of the hydraulic network is not considered. This paper aims to provide an overview of various techniques, although there is no standard protocol for quantifying conduits because of high anatomical variation and the wide range of techniques available. Despite recent progress in image analysis programs and automated methods for measuring cell dimensions, density, and spatial distribution, measuring various characters remains time-consuming and tedious. Quantification of vessels and tracheids is not only important for better understanding the functional adaptations of tracheary elements to environmental parameters, but will also be essential for linking wood anatomy with other fields such as wood development, xylem physiology, palaeobotany, and dendrochronology.
Attomole quantitation of protein separations with accelerator mass spectrometry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogel, J S; Grant, P G; Buccholz, B A
2000-12-15
Quantification of specific proteins depends on separation by chromatography or electrophoresis followed by chemical detection schemes such as staining and fluorophore adhesion. Chemical exchange of short-lived isotopes, particularly sulfur, is also prevalent despite the inconveniences of counting radioactivity. Physical methods based on isotopic and elemental analyses offer highly sensitive protein quantitation that has linear response over wide dynamic ranges and is independent of protein conformation. Accelerator mass spectrometry quantifies long-lived isotopes such as 14C to sub-attomole sensitivity. We quantified protein interactions with small molecules such as toxins, vitamins, and natural biochemicals at precisions of 1-5%. Micro-proton-induced X-ray emission quantifies elemental abundances in separated metalloprotein samples to nanogram amounts and is capable of quantifying phosphorylated loci in gels. Accelerator-based quantitation is a possible tool for quantifying the translation of the genome into the proteome.
RNA-Skim: a rapid method for RNA-Seq quantification at transcript level
Zhang, Zhaojun; Wang, Wei
2014-01-01
Motivation: The RNA-Seq technique has been demonstrated to be a revolutionary means for exploring transcriptomes because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to the microarray technique in gene expression studies, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
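A hedged sketch of the sig-mer idea follows: given transcript clusters, keep only the k-mers that occur in exactly one cluster. This illustrates the concept only; RNA-Skim's actual sig-mer selection, counting and abundance estimation are far more elaborate, and the toy sequences below are invented.

```python
from collections import defaultdict

def kmers(seq, k):
    """All k-mers of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def find_sig_mers(clusters, k):
    """Return, per cluster, the k-mers that occur in no other cluster.

    `clusters` maps a cluster id to a list of transcript sequences.  This
    mirrors the sig-mer notion only conceptually.
    """
    cluster_kmers = {cid: set().union(*(kmers(t, k) for t in seqs))
                     for cid, seqs in clusters.items()}
    sig = {}
    for cid, kms in cluster_kmers.items():
        others = set().union(*(v for c, v in cluster_kmers.items() if c != cid))
        sig[cid] = kms - others
    return sig

# Toy example with two clusters of short "transcripts".
clusters = {"c1": ["ACGTACGTGA", "ACGTACGTTT"],
            "c2": ["TTGCAAGCTT", "GCAAGCTTAA"]}
print({cid: sorted(s) for cid, s in find_sig_mers(clusters, k=5).items()})
```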
Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.
Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian
2017-04-01
Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques revealed a distinct relation of measured collagen and utilized quantification method as well as section thickness. Besides electron microscopy-stereology, which was precise and sensitive, light microscopy-stereology and automated image analysis proved to be appropriate for collagen quantification. Moreover, consideration of collagen localization might be important in revealing minor fibrotic changes. Copyright © 2017 the American Physiological Society.
Sakan, Sanja; Popović, Aleksandar; Škrivanj, Sandra; Sakan, Nenad; Đorđević, Dragana
2016-11-01
Metals in sediments are present in different chemical forms which affect their ability to transfer. The objective of this body of work was to compare different extraction methods for the bioavailability evaluation of some elements, such as Ba, Cd, Co, Cr, Cu, Fe, K, Mg, Mn, Ni, Pb, V and Zn from Serbian river sediments. A bioavailability risk assessment index (BRAI) was used for the quantification of heavy metal bioavailability in the sediments. Actual and potential element availability was assessed by single extractions with mild (CaCl2 and CH3COONH4) and acidic (CH3COOH) extractants and complexing agents (EDTA). Aqua regia extraction was used for the determination of the pseudo-total element content in river sediments. In different single extraction tests, higher extraction of Cd, Cu, Zn and Pb was observed than for the other elements. The results of the single extraction tests revealed that there is a considerable chance of metal leaching from the sediments assessed in this study. When the BRAI was applied, the results showed a high risk of heavy metal bioavailability in Serbian river sediments.
A phase quantification method based on EBSD data for a continuously cooled microalloyed steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, H.; Wynne, B.P.; Palmiere, E.J., E-mail: e.j
2017-01-15
Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain averaged misorientation angles, aspect ratios, high angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that this EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. Highlights: • A phase quantification method based on EBSD data in the unit of grains was proposed. • The critical grain area above which GAM angles are valid parameters was obtained. • Grain size and grain boundary misorientation were used to identify acicular ferrite. • High cooling rates deteriorate the accuracy of this EBSD-based method.
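To make the grain-based classification concrete, below is a minimal rule-based sketch that assigns grains to ferrite constituents from EBSD-derived parameters and accumulates area-weighted phase fractions. The thresholds and decision order are illustrative placeholders, not the criteria developed in the paper.

```python
from dataclasses import dataclass

@dataclass
class Grain:
    gam_deg: float        # grain-averaged misorientation angle (degrees)
    aspect_ratio: float   # major/minor axis of a fitted ellipse
    hagb_fraction: float  # fraction of boundary with misorientation > 15 degrees
    area_um2: float       # grain area in square micrometres

def classify_grain(g: Grain) -> str:
    """Assign a grain to a ferrite constituent from EBSD-derived parameters.

    The decision order and threshold values below are illustrative placeholders,
    not the published identification criteria.
    """
    if g.gam_deg > 1.0 and g.aspect_ratio > 3.0:
        return "bainitic ferrite"
    if g.gam_deg > 1.0:
        return "acicular ferrite"
    return "polygonal/quasi-polygonal ferrite"

def phase_fractions(grains):
    """Area-weighted phase fractions over a list of classified grains."""
    total = sum(g.area_um2 for g in grains)
    fractions = {}
    for g in grains:
        label = classify_grain(g)
        fractions[label] = fractions.get(label, 0.0) + g.area_um2 / total
    return fractions

grains = [Grain(0.4, 1.5, 0.8, 120.0), Grain(1.6, 4.2, 0.3, 60.0), Grain(1.3, 1.8, 0.5, 45.0)]
print(phase_fractions(grains))
```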
Application of Hyphenated Techniques in Speciation Analysis of Arsenic, Antimony, and Thallium
Michalski, Rajmund; Szopa, Sebastian; Jabłońska, Magdalena; Łyko, Aleksandra
2012-01-01
Due to the fact that metals and metalloids have a strong impact on the environment, the methods of their determination and speciation have received special attention in recent years. Arsenic, antimony, and thallium are important examples of such toxic elements. Their speciation is especially important in the environmental and biomedical fields because of their toxicity, bioavailability, and reactivity. Recently, speciation analytics has been playing a unique role in the studies of biogeochemical cycles of chemical compounds, determination of toxicity and ecotoxicity of selected elements, quality control of food products, control of medicines and pharmaceutical products, technological process control, research on the impact of technological installation on the environment, examination of occupational exposure, and clinical analysis. Conventional methods are usually labor intensive, time consuming, and susceptible to interferences. The hyphenated techniques, in which separation method is coupled with multidimensional detectors, have become useful alternatives. The main advantages of those techniques consist in extremely low detection and quantification limits, insignificant interference, influence as well as high precision and repeatability of the determinations. In view of their importance, the present work overviews and discusses different hyphenated techniques used for arsenic, antimony, and thallium species analysis, in different clinical, environmental and food matrices. PMID:22654649
Brix, Kristina; Hein, Christina; Sander, Jonas Michael; Kautenburger, Ralf
2017-05-15
The determination of iodine, a main fission product of stored HLW in a disposal (especially the isotopes I-129 and I-131), besides its distribution as a natural ingredient of many different products like milk, food and seawater, is a matter of particular interest. The simultaneous ICP-MS determination of iodine as iodide together with other elements (especially higher-valent metal ions) relevant for HLW is analytically very problematic. A reliable ICP-MS quantification of iodide must be performed at neutral or alkaline conditions, in contrast to the analysis of metal ions, which are determined in acidic pH ranges. Herein, we present a method to solve this problem by changing the iodine speciation, resulting in an ICP-MS determination of iodide as iodate. The oxidation from iodide to iodate with sodium hypochlorite at room temperature is a fast and convenient method with flexible reaction time, from one hour up to three days, thus eliminating the disadvantages of quantifying iodine species via ICP-MS. In the analysed concentration range of iodine (0.1-100 µg L⁻¹) we obtain near-quantitative recovery rates for iodine between 91% and 102% as well as relatively low RSD values (0.3-4.0%). As an additional result, it is possible to measure different other element species in parallel together with the generated iodate, even high-valent metals (europium and uranium besides caesium), at recovery rates in the same order of magnitude (93-104%). In addition, the oxidation process operates above pH 7, thus offering a wide pH range for sample preparation. Even analytes in complex matrices, like 5 M saline (NaCl) solution or artificial cement pore water (ACW), can be quantified with this robust sample preparation method. Copyright © 2017 Elsevier B.V. All rights reserved.
Cycling transport safety quantification
NASA Astrophysics Data System (ADS)
Drbohlav, Jiri; Kocourek, Josef
2018-05-01
Dynamic interest in cycling transport brings the necessity to design safe cycling infrastructure. In the last few years, several norms with safety elements have been designed and suggested for cycling infrastructure, but these have not been fully examined. The main parameter of a suitable and fully functional transport infrastructure is the evaluation of its safety. Common evaluation of transport infrastructure safety is based on accident statistics. These statistics are suitable for motor vehicle transport but unsuitable for cycling transport. For cycling infrastructure, safety evaluation based on traffic conflict monitoring is more suitable. The results of this method are fast, based on real traffic situations, and can be applied to any traffic situation.
Direct quantification of long-term rock nitrogen inputs to temperate forest ecosystems.
Morford, Scott L; Houlton, Benjamin Z; Dahlgren, Randy A
2016-01-01
Sedimentary and metasedimentary rocks contain large reservoirs of fixed nitrogen (N), but questions remain over the importance of rock N weathering inputs in terrestrial ecosystems. Here we provide direct evidence for rock N weathering (i.e., loss of N from rock) in three temperate forest sites residing on a N-rich parent material (820-1050 mg N kg(-1); mica schist) in the Klamath Mountains (northern California and southern Oregon), USA. Our method combines a mass balance model of element addition/depletion with a procedure for quantifying fixed N in rock minerals, enabling quantification of rock N inputs to bioavailable reservoirs in soil and regolith. Across all sites, ~37% to 48% of the initial bedrock N content has undergone long-term weathering in the soil. Combined with regional denudation estimates (sum of physical + chemical erosion), these weathering fractions translate to 1.6-10.7 kg x ha(-1) x yr(-1) of rock N input to these forest ecosystems. These N input fluxes are substantial in light of estimates for atmospheric sources in these sites (4.5-7.0 kg x ha(-1) x yr(-1)). In addition, N depletion from rock minerals was greater than that of sodium, suggesting active biologically mediated weathering of growth-limiting nutrients compared to nonessential elements. These results point to regional tectonics, biologically mediated weathering effects, and rock N chemistry in shaping the magnitude of rock N inputs to the forest ecosystems examined.
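The mass balance model of element addition/depletion is commonly written as an open-system mass transfer coefficient (tau) referenced to an immobile index element. The sketch below computes tau for N and combines the resulting depletion fraction with a denudation rate in a simplified back-of-the-envelope form; the numbers and the choice of index element are assumptions, not the paper's full model.

```python
def tau(c_mobile_w, c_immobile_w, c_mobile_p, c_immobile_p):
    """Open-system mass transfer coefficient (tau) for a mobile element.

    tau = (C_j,w / C_j,p) * (C_i,p / C_i,w) - 1, where j is the mobile element
    (here N), i an immobile index element (e.g. Ti or Zr), w the weathered
    material and p the parent rock.  tau = -0.4 means 40 % of the element has
    been lost relative to the parent rock.
    """
    return (c_mobile_w / c_mobile_p) * (c_immobile_p / c_immobile_w) - 1.0

def rock_n_flux_kg_ha_yr(frac_n_lost, n_rock_mg_kg, denudation_kg_m2_yr):
    """Illustrative combination of the N depletion fraction with a denudation
    rate to yield a long-term rock N input flux (kg N ha^-1 yr^-1).

    This is a simplified back-of-the-envelope form, not the full model of the
    paper: flux = fraction of rock N released * rock N content * denudation.
    """
    n_rock_kg_per_kg = n_rock_mg_kg * 1e-6
    flux_kg_m2_yr = frac_n_lost * n_rock_kg_per_kg * denudation_kg_m2_yr
    return flux_kg_m2_yr * 1e4  # convert m^-2 to ha^-1

# Illustrative numbers only.
t = tau(c_mobile_w=550.0, c_immobile_w=0.60, c_mobile_p=900.0, c_immobile_p=0.45)
print(f"tau_N = {t:.2f}  (fraction lost = {-t:.2f})")
print(f"rock N input ~ {rock_n_flux_kg_ha_yr(-t, 900.0, 0.5):.1f} kg ha^-1 yr^-1")
```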
NASA Technical Reports Server (NTRS)
West, Jeff; Westra, Doug; Lin, Jeff; Tucker, Kevin
2006-01-01
A robust rocket engine combustor design and development process must include tools which can accurately predict the multi-dimensional thermal environments imposed on solid surfaces by the hot combustion products. Currently, empirical methods used in the design process are typically one dimensional and do not adequately account for the heat flux rise rate in the near-injector region of the chamber. Computational Fluid Dynamics holds promise to meet the design tool requirement, but requires accuracy quantification, or validation, before it can be confidently applied in the design process. This effort presents the beginning of such a validation process for the Loci-CHEM CFD code. The model problem examined here is a gaseous oxygen (GO2)/gaseous hydrogen (GH2) shear coaxial single element injector operating at a chamber pressure of 5.42 MPa. The GO2/GH2 propellant combination in this geometry represents one of the simplest rocket model problems and is thus foundational to subsequent validation efforts for more complex injectors. Multiple steady state solutions have been produced with Loci-CHEM employing different hybrid grids and two-equation turbulence models. Iterative convergence for each solution is demonstrated via mass conservation, flow variable monitoring at discrete flow field locations as a function of solution iteration, and overall residual performance. A baseline hybrid grid was used and then locally refined to demonstrate grid convergence. Solutions were also obtained with three variations of the k-omega turbulence model.
Takeno, Shinya; Bamba, Takeshi; Nakazawa, Yoshihisa; Fukusaki, Eiichiro; Okazawa, Atsushi; Kobayashi, Akio
2008-04-01
Commercial development of trans-1,4-polyisoprene from Eucommia ulmoides Oliver (EU-rubber) requires specific knowledge on selection of high-rubber-content lines and establishment of agronomic cultivation methods for achieving maximum EU-rubber yield. The development can be facilitated by high-throughput and highly sensitive analytical techniques for EU-rubber extraction and quantification. In this paper, we described an efficient EU-rubber extraction method, and validated that the accuracy was equivalent to that of the conventional Soxhlet extraction method. We also described a highly sensitive quantification method for EU-rubber by Fourier transform infrared spectroscopy (FT-IR) and pyrolysis-gas chromatography/mass spectrometry (PyGC/MS). We successfully applied the extraction/quantification method for study of seasonal changes in EU-rubber content and molecular weight distribution.
Characterization and quantification of biochar alkalinity.
Fidel, Rivka B; Laird, David A; Thompson, Michael L; Lawrinenko, Michael
2017-01-01
Lack of knowledge regarding the nature of biochar alkalis has hindered understanding of pH-sensitive biochar-soil interactions. Here we investigate the nature of biochar alkalinity and present a cohesive suite of methods for its quantification. Biochars produced from cellulose, corn stover and wood feedstocks had significant low-pKa organic structural (0.03-0.34 meq g⁻¹), other organic (0-0.92 meq g⁻¹), carbonate (0.02-1.5 meq g⁻¹), and other inorganic (0-0.26 meq g⁻¹) alkalinities. All four categories of biochar alkalinity contributed to total biochar alkalinity and are therefore relevant to pH-sensitive soil processes. Total biochar alkalinity was strongly correlated with base cation concentration, but biochar alkalinity was not a simple function of elemental composition, soluble ash, fixed carbon, or volatile matter content. More research is needed to characterize soluble biochar alkalis other than carbonates and to establish predictive relationships among biochar production parameters and the composition of biochar alkalis. Copyright © 2016 Elsevier Ltd. All rights reserved.
Apetrei, Irina Mirela; Apetrei, Constantin
2016-03-24
This work describes the development and optimization studies of a novel biosensor employed in the detection and quantification of histamine in freshwater fish samples. The proposed biosensor is based on a carbon screen-printed electrode modified with diamine oxidase, graphene and platinum nanoparticles, which detects the hydrogen peroxide formed by the chemical process biocatalysed by the enzyme diamine oxidase, which is immobilized onto the nanostructured surface of the receptor element. The amperometric measurements with the biosensor have been performed in buffer solution of pH 7.4, applying an optimal low potential of +0.4 V. The novel biosensor shows high sensitivity (0.0631 μA/μM), a low detection limit (2.54 × 10(-8) M) and a broad linear domain from 0.1 to 300 μM. The applicability to complex natural samples and the analytical parameters of this enzyme sensor have been assessed through the quantification of histamine in freshwater fish. An excellent correlation between results achieved with the developed biosensor and results found with the standard method has been achieved for all freshwater fish samples.
NASA Astrophysics Data System (ADS)
Marguí, E.; Queralt, I.; García-Ruiz, E.; García-González, E.; Rello, L.; Resano, M.
2018-01-01
Home-based collection protocols for clinical specimens are actively pursued as a means of improving life quality of patients. In this sense, dried blood spots (DBS) are proposed as a non-invasive and even self-administered alternative to sampling whole venous blood. This contribution explores the potential of energy dispersive X-ray fluorescence spectrometry for the simultaneous and direct determination of some major (S, Cl, K, Na), minor (P, Fe) and trace (Ca, Cu, Zn) elements in blood, after its deposition onto clinical filter papers, thus giving rise to DBS. For quantification purposes the best strategy was to use matrix-matched blood samples of known analyte concentrations. The accuracy and precision of the method were evaluated by analysis of a blood reference material (Seronorm™ trace elements whole blood L3). Quantitative results were obtained for the determination of P, S, Cl, K and Fe, and limits of detection for these elements were adequate, taking into account their typical concentrations in real blood samples. Determination of Na, Ca, Cu and Zn was hampered by the occurrence of high sample support (Na, Ca) and instrumental blanks (Cu, Zn). Therefore, the quantitative determination of these elements at the levels expected in blood samples was not feasible. The methodology developed was applied to the analysis of several blood samples and the results obtained were compared with those reported by standard techniques. Overall, the performance of the method developed is promising and it could be used to determine the aforementioned elements in blood samples in a simple, fast and economic way. Furthermore, its non-destructive nature enables further analyses by means of complementary techniques to be carried out.
Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.
Gerold, Chase T; Bakker, Eric; Henry, Charles S
2018-04-03
In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.
Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis
2017-03-01
A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and applying chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint from the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It has to be highlighted that, with the newly proposed analytical method, the chromatographic analysis takes only eight minutes; the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
Quantification of chemical elements in blood of patients affected by multiple sclerosis.
Forte, Giovanni; Visconti, Andrea; Santucci, Simone; Ghazaryan, Anna; Figà-Talamanca, Lorenzo; Cannoni, Stefania; Bocca, Beatrice; Pino, Anna; Violante, Nicola; Alimonti, Alessandro; Salvetti, Marco; Ristori, Giovanni
2005-01-01
Although some studies suggested a link between exposure to trace elements and development of multiple sclerosis (MS), clear information on their role in the aetiology of MS is still lacking. In this study the concentrations of Al, Ba, Be, Bi, Ca, Cd, Co, Cr, Cu, Fe, Hg, Li, Mg, Mn, Mo, Ni, Pb, Sb, Si, Sn, Sr, Tl, V, W, Zn and Zr were determined in the blood of 60 patients with MS and 60 controls. Quantifications were performed by inductively coupled plasma (ICP) atomic emission spectrometry and sector field ICP mass spectrometry. When the two groups were compared, an increased level of Co, Cu and Ni and a decrement of Be, Fe, Hg, Mg, Mo, Pb and Zn in blood of patients were observed. In addition, the discriminant analysis pointed out that Cu, Be, Hg, Co and Mo were able to discriminate between MS patients and controls (92.5% of cases correctly classified).
Jann, Johann-Christoph; Nowak, Daniel; Nolte, Florian; Fey, Stephanie; Nowak, Verena; Obländer, Julia; Pressler, Jovita; Palme, Iris; Xanthopoulos, Christina; Fabarius, Alice; Platzbecker, Uwe; Giagounidis, Aristoteles; Götze, Katharina; Letsch, Anne; Haase, Detlef; Schlenk, Richard; Bug, Gesine; Lübbert, Michael; Ganser, Arnold; Germing, Ulrich; Haferlach, Claudia; Hofmann, Wolf-Karsten; Mossner, Maximilian
2017-09-01
Cytogenetic aberrations such as deletion of chromosome 5q (del(5q)) represent key elements in routine clinical diagnostics of haematological malignancies. Currently established methods such as metaphase cytogenetics, FISH or array-based approaches have limitations due to their dependency on viable cells, high costs or semi-quantitative nature. Importantly, they cannot be used on low abundance DNA. We therefore aimed to establish a robust and quantitative technique that overcomes these shortcomings. For precise determination of del(5q) cell fractions, we developed an inexpensive multiplex-PCR assay requiring only nanograms of DNA that simultaneously measures allelic imbalances of 12 independent short tandem repeat markers. Application of this method to n=1142 samples from n=260 individuals revealed strong intermarker concordance (R²=0.77-0.97) and reproducibility (mean SD: 1.7%). Notably, the assay showed accurate quantification via standard curve assessment (R²>0.99) and high concordance with paired FISH measurements (R²=0.92) even with subnanogram amounts of DNA. Moreover, cytogenetic response was reliably confirmed in del(5q) patients with myelodysplastic syndromes treated with lenalidomide. While the assay demonstrated good diagnostic accuracy in receiver operating characteristic analysis (area under the curve: 0.97), we further observed robust correlation between bone marrow and peripheral blood samples (R²=0.79), suggesting its potential suitability for less-invasive clonal monitoring. In conclusion, we present an adaptable tool for quantification of chromosomal aberrations, particularly in problematic samples, which should be easily applicable to further tumour entities. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
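For orientation only, a crude way to turn an allelic imbalance at a heterozygous STR marker inside the deleted interval into a del(5q) cell fraction is sketched below, assuming the minor peak is the allele on the deleted arm, so its relative abundance equals one minus the clone fraction. The published assay instead calibrates against standard curves across 12 markers, so this is a simplification, and the peak areas are invented.

```python
import statistics

def del5q_fraction_from_ratio(minor_peak, major_peak):
    """Crude clone-size estimate from one heterozygous STR marker.

    Assumes the minor peak corresponds to the allele lying on the deleted
    chromosome 5q arm, so its relative abundance is (1 - f) where f is the
    fraction of cells carrying the deletion.  Real assays calibrate this with
    standard curves rather than using the raw ratio.
    """
    ratio = minor_peak / major_peak
    return max(0.0, min(1.0, 1.0 - ratio))

def combined_estimate(marker_peak_pairs):
    """Median across informative markers, as a robust summary."""
    return statistics.median(del5q_fraction_from_ratio(lo, hi)
                             for lo, hi in marker_peak_pairs)

# Illustrative peak areas (minor, major) for a few informative STR markers.
markers = [(4200.0, 10100.0), (3900.0, 9800.0), (4500.0, 10500.0)]
print(f"estimated del(5q) cell fraction ~ {combined_estimate(markers):.2f}")
```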
NASA Astrophysics Data System (ADS)
Elhag, Mohamed; Boteva, Silvena
2017-12-01
Quantification of geomorphometric features is the keystone concern of the current study. The quantification was based on a statistical approach in terms of multivariate analysis of local topographic features. The implemented algorithm utilizes the Digital Elevation Model (DEM) to categorize and extract the geomorphometric features embedded in the topographic dataset. The morphological settings were evaluated on the central pixel of a 3x3 pre-defined convolution kernel under the eight-directional pour point model (D8) of the azimuth viewpoints. An unsupervised classification algorithm, the Iterative Self-Organizing Data Analysis Technique (ISODATA), was applied to the ASTER GDEM within the boundary of the designated study area to distinguish 10 morphometric classes. The morphometric classes expressed spatial distribution variation in the study area. The adopted methodology successfully captures the spatial distribution of the geomorphometric features under investigation. The results allow the delineated geomorphometric elements to be superimposed over a given remote sensing image for further analysis. A robust relationship between different Land Cover types and the geomorphological elements was established in the context of the study area. The domination and the relative association of different Land Cover types with their corresponding geomorphological elements were demonstrated.
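The central-pixel evaluation under the D8 pour point model can be illustrated with a short sketch that returns the steepest-descent neighbour of a 3x3 elevation window, weighting diagonal drops by the square root of two as is conventional for D8. It illustrates the flow-direction step only, not the ISODATA classification, and the window values are invented.

```python
import math
from typing import Optional
import numpy as np

# Eight neighbour offsets (row, col) of the centre cell, in a fixed D8 order.
D8_OFFSETS = [(-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1)]

def d8_direction(window: np.ndarray) -> Optional[int]:
    """Index (0-7) of the steepest-descent neighbour of the centre cell of a
    3x3 elevation window, or None for a pit/flat cell.

    Diagonal drops are divided by sqrt(2) to account for the longer plan
    distance, which is the usual D8 convention.
    """
    centre = window[1, 1]
    best_slope, best_dir = 0.0, None
    for idx, (dr, dc) in enumerate(D8_OFFSETS):
        drop = centre - window[1 + dr, 1 + dc]
        slope = drop / (math.sqrt(2.0) if dr and dc else 1.0)
        if slope > best_slope:
            best_slope, best_dir = slope, idx
    return best_dir

dem_window = np.array([[105.0, 104.0, 103.0],
                       [104.0, 102.0, 100.0],
                       [103.0, 101.0,  99.0]])
print("steepest-descent neighbour index:", d8_direction(dem_window))
```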
NASA Astrophysics Data System (ADS)
Duan, Fajie; Fu, Xiao; Jiang, Jiajia; Huang, Tingting; Ma, Ling; Zhang, Cong
2018-05-01
In this work, an automatic variable selection method for quantitative analysis of soil samples using laser-induced breakdown spectroscopy (LIBS) is proposed, which is based on full spectrum correction (FSC) and modified iterative predictor weighting-partial least squares (mIPW-PLS). The method features automatic selection without manual intervention. To illustrate the feasibility and effectiveness of the method, a comparison with the genetic algorithm (GA) and the successive projections algorithm (SPA) for the detection of different elements (copper, barium and chromium) in soil was implemented. The experimental results showed that all three methods could accomplish variable selection effectively, among which FSC-mIPW-PLS required a significantly shorter computation time (approximately 12 s for 40,000 initial variables) than the others. Moreover, improved quantification models were obtained with the variable selection approaches. The root mean square errors of prediction (RMSEP) of models utilizing the new method were 27.47 (copper), 37.15 (barium) and 39.70 (chromium) mg/kg, showing prediction performance comparable to GA and SPA.
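Model comparison in the abstract rests on the RMSEP of PLS models built on selected variables. A minimal sketch of that evaluation step, using scikit-learn's PLSRegression on synthetic stand-in spectra, is shown below; it does not implement FSC or mIPW-PLS, and all data are simulated.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for LIBS spectra: 200 samples x 500 intensity variables,
# with the analyte concentration driving a couple of "emission lines".
X = rng.normal(size=(200, 500))
true_conc = rng.uniform(10, 100, size=200)
X[:, 40] += 0.05 * true_conc
X[:, 310] += 0.03 * true_conc

X_cal, X_val, y_cal, y_val = train_test_split(X, true_conc, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=5)
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_val).ravel()

# Root mean square error of prediction on the independent validation set.
rmsep = np.sqrt(np.mean((y_val - y_pred) ** 2))
print(f"RMSEP = {rmsep:.2f} (same units as the reference concentrations)")
```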
Quantitative analysis of trace element concentrations in some gem-quality diamonds
NASA Astrophysics Data System (ADS)
McNeill, J.; Pearson, D. G.; Klein-Ben David, O.; Nowell, G. M.; Ottley, C. J.; Chinn, I.
2009-09-01
The geochemical signature of diamond-forming fluids can be used to unravel diamond-forming processes and is of potential use in the detection of so-called 'conflict' diamonds. While fluid-rich fibrous diamonds can be analyzed by a variety of techniques, very few data have been published for fluid-poor, gem-quality diamonds because of their very low impurity levels. Here we present a new ICPMS-based (ICPMS: inductively coupled plasma mass spectrometry) method for the analysis of trace element concentrations within fluid-poor, gem-quality diamonds. The method employs a closed-system laser ablation cell. Diamonds are ablated and the products trapped for later pre-concentration into solutions that are analyzed by sector-field ICPMS. We show that our limits of quantification for a wide range of elements are at the sub-pg to low pg level. The method is applied to a suite of 10 diamonds from the Cullinan Mine (previously known as Premier), South Africa, along with other diamonds from Siberia (Mir and Udachnaya) and Venezuela. The concentrations of a wide range of elements for all the samples (expressed by weight in the solid) are very low, with rare earth elements along with Y, Nb, Cs ranging from 0.01 to 2 ppb. Large ion lithophile elements (LILE) such as Rb and Ba vary from 1 to 30 ppb. Ti ranges from ppb levels up to 2 ppm. From the combined, currently small data set we observe two kinds of diamond-forming fluids within gem diamonds. One group has enrichments in LILE over Nb, whereas a second group has normalized LILE abundances more similar to those of Nb. These two groups bear some similarity to different groups of fluid-rich diamonds, providing some supporting evidence of a link between the parental fluids for both fluid-inclusion-rich and gem diamonds.
Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li
2010-07-01
The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expression is a result of true biological or methodical variations. MATERIALS & METHODS: Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We have observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods, using identical samples. Greater variations in protein concentration readings were observed over time and in samples with higher concentrations with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on western blot. We show for the first time that methodical variations observed in these protein assay techniques can potentially translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider methodical approaches to protein quantification in techniques that report quantitative differences.
Comparison of Calibration of Sensors Used for the Quantification of Nuclear Energy Rate Deposition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brun, J.; Reynard-Carette, C.; Tarchalski, M.
The present work deals with a collaborative program called GAMMA-MAJOR 'Development and qualification of a deterministic scheme for the evaluation of GAMMA heating in MTR reactors with exploitation as example MARIA reactor and Jules Horowitz Reactor' between the National Centre for Nuclear Research of Poland, the French Atomic Energy and Alternative Energies Commission and Aix Marseille University. One of the main objectives of this program is to optimize nuclear heating quantification through calculations validated against experimental measurements of radiation energy deposition carried out in irradiation reactors. The quantification of nuclear heating is a key datum, especially for the thermal and mechanical design and sizing of irradiation experimental devices under specific irradiation conditions and locations. The determination of this quantity is usually performed by differential calorimeters and gamma thermometers such as those used in the experimental multi-sensor device called CARMEN 'Calorimetrie en Reacteur et Mesures des Emissions Nucleaires'. In the framework of the GAMMA-MAJOR program, a new calorimeter was designed for the quantification of nuclear energy deposition. It corresponds to a single-cell calorimeter and is called KAROLINA. This calorimeter was recently tested during an irradiation campaign inside the MARIA reactor in Poland. This new single-cell calorimeter differs from previous CALMOS- or CARMEN-type differential calorimeters in three main respects: its geometry, its preliminary out-of-pile calibration, and its in-pile measurement method. The differential calorimeter, which is made of two identical cells containing heaters, has a calibration method based on the use of steady thermal states reached by simulating the nuclear energy deposition into the calorimeter sample by the Joule effect; whereas the single-cell calorimeter, which has no heater, is calibrated by using the transient thermal response of the sensor (heating and cooling steps). The paper will concern these two kinds of calorimetric sensors. It will focus in particular on studies of their out-of-pile calibrations. Firstly, the characteristics of the sensor designs will be detailed (such as geometry, dimension, material sample, assembly, instrumentation). Then the out-of-pile calibration methods will be described. Furthermore, numerical results obtained from 2D axisymmetric thermal simulations (Finite Element Method, CAST3M) and experimental results will be presented for each sensor. A comparison of the two different thermal sensor behaviours will be made. To conclude, a discussion of the advantages and drawbacks of each sensor will be given, especially regarding measurement methods. (authors)
Rauniyar, Navin
2015-01-01
The parallel reaction monitoring (PRM) assay has emerged as an alternative method of targeted quantification. The PRM assay is performed in a high resolution and high mass accuracy mode on a mass spectrometer. This review presents the features that make PRM a highly specific and selective method for targeted quantification using quadrupole-Orbitrap hybrid instruments. In addition, this review discusses the label-based and label-free methods of quantification that can be performed with the targeted approach. PMID:26633379
NASA Astrophysics Data System (ADS)
Restaino, Stephen M.; White, Ian M.
2017-03-01
Surface-enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single- and multi-analyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a comparatively low financial and spatial footprint relative to common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNAs commonly exist at relatively low concentrations, amplification methods (e.g. PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.
Haberl, Jasmin; Koralewska, Ralf; Schlumberger, Stefan; Schuster, Michael
2018-05-01
The elemental composition of fly ash from six waste-to-energy (WTE) plants in Germany and two WTE plants in Switzerland was analyzed. Samples were taken daily over a period of one month and mixed to a composite sample for each German plant. From two Swiss plants, two and three of these composite samples, respectively, were collected for different months in order to assess temporal differences between these months. In total, 61 elements, including rare earth elements, were analyzed using ICP-OES and ICP-MS. The analysis method was validated for 44 elements either by reference materials (BCR 176R and NIST 1633c) or by analysis with both methods. Good recoveries, mostly within ±10%, and good agreement between the two methods were achieved. As long as no additives from flue gas cleaning were mixed with the fly ash, quite similar element contents were observed among all of the different incinerators. For most elements, the variations between the different months within the two Swiss plants were lower than the differences between various plants. Main components in particular show low variations between different months. To get a more detailed insight into temporal fluctuations within the mentioned Swiss plants, the concentrations of Zn, Pb, Cu, Cd, Sb, and Sn are presented over a period of three years (Jan. 2015 - Oct. 2017). The concentration profiles are based on weekly composite samples (consisting of samples taken daily) analyzed by the routine control of these plants using ED-XRF. The standard deviations of the average concentrations were around 20% over the three years for the elements considered. The fluctuations were comparable at both plants. Due to the relatively low temporal concentration fluctuations observed within the plants, fly ash would be a continuous and constant source of secondary raw materials. Besides Zn, Pb, Cu, and Cd, which are already recovered on an industrial scale, Sb, Sn, and Bi also show high potential as secondary raw materials due to the high concentrations of these elements in fly ash. Copyright © 2018 Elsevier Ltd. All rights reserved.
Beeston, Michael Philip; Glass, Hylke Jan; van Elteren, Johannes Teun; Slejkovec, Zdenka
2007-09-19
A new method has been developed to analyse the mobility of elements within soils employing counter-current flow soil contacting in a fluidised bed (FB) column. This method alleviates the problem of irreproducible peaks suffered by state-of-the-art micro-column techniques as a result of particle compaction. Reproducible extraction profiles are produced through the leaching of soil with a linear gradient from 0.05 mol L(-1) ammonium sulphate to 0.11 mol L(-1) acetic acid using a high pressure liquid chromatography (HPLC) quaternary pump, and the continuous monitoring of the elements in the leachate with inductively coupled plasma mass spectrometry (ICP-MS). Quantification of the procedure is achieved with an external flow injection (FI) calibration method. Flow rate and FB column length were investigated as parameters critical to the efficiency of the extraction methodology. It was found that an increase in the column length from 10 to 20 cm using a flow rate of 0.15 mL min(-1) produced the same increase in extracted elemental concentration as an increase in flow rate from 0.15 to 0.30 mL min(-1). In both examples, the increase in the concentration of elements leached from the soil may be ascribed to the increase in the concentration gradient between the solid and liquid. The exhaustive nature of the technique defines the maximum leachable concentration within the operationally defined leaching parameters of the exchangeable phase, providing a more accurate assessment of the risk associated with the elements in the soil for the phase posing the greatest risk to the environment. The multi-elemental, high-sensitivity nature of the on-line detector provides an accurate determination of the associations present between the elements in the soil, and the identification of multiple phases within the exchangeable phase through the presence of multiple peaks in the extraction profiles. Through deconvolution of these extraction profiles, the concentration corresponding to each identified peak can be determined.
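Deconvolution of overlapping extraction-profile peaks can be sketched as a multi-Gaussian least-squares fit, with each fitted peak area standing for the amount released in one phase. The example below uses scipy.optimize.curve_fit on a synthetic two-peak profile; it illustrates the deconvolution step only and is not the authors' procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amplitude, centre, width):
    return amplitude * np.exp(-((t - centre) ** 2) / (2.0 * width ** 2))

def two_gaussians(t, a1, c1, w1, a2, c2, w2):
    return gaussian(t, a1, c1, w1) + gaussian(t, a2, c2, w2)

# Synthetic extraction profile (signal vs. leaching time) with two overlapping
# release events, standing in for an ICP-MS leachate trace.
t = np.linspace(0, 60, 300)
rng = np.random.default_rng(1)
signal = two_gaussians(t, 1.0, 18, 4, 0.6, 32, 6) + rng.normal(0, 0.02, t.size)

p0 = [1.0, 15, 5, 0.5, 35, 5]  # rough initial guesses for the fit
popt, _ = curve_fit(two_gaussians, t, signal, p0=p0)

# Area of each fitted component ~ amount released in that phase (after the
# external FI calibration converts signal area to concentration).
for i in range(2):
    a, c, w = popt[3 * i: 3 * i + 3]
    area = a * w * np.sqrt(2.0 * np.pi)
    print(f"peak {i + 1}: centre = {c:.1f} min, area = {area:.2f} (signal units)")
```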
Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won
2015-01-01
Background: Maltol, as a type of phenolic compounds, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification coupled with a liquid-liquid extraction (LLE) method was developed and validated in terms of linearity, precision, and accuracy. An HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of the maltol exhibited good linearity (R2 = 1.00). The limit of detection value of maltol was 0.26 μg/mL, and the limit of quantification value was 0.79 μg/mL. The relative standard deviations (RSDs) of the data of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD value of 0.21–1.65%. The developed method was applied successfully to quantify the maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method was useful for the quantification of maltol in various ginseng products. PMID:26246746
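The validation figures quoted above (LOD, LOQ, RSD, recovery) follow standard definitions. A minimal sketch is given below, assuming the common ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S with σ the residual standard deviation of the calibration line and S its slope; the calibration data are illustrative, not the paper's.

```python
import numpy as np

def calibration_stats(conc, response):
    """Slope, intercept and residual standard deviation of a linear calibration."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * np.asarray(conc) + intercept)
    sigma = np.std(residuals, ddof=2)  # two fitted parameters
    return slope, intercept, sigma

def lod_loq(slope, sigma):
    """ICH-style limits: LOD = 3.3*sigma/S, LOQ = 10*sigma/S."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

def recovery_percent(measured, spiked):
    return 100.0 * measured / spiked

def rsd_percent(values):
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Illustrative maltol calibration data (µg/mL vs. peak area), not the paper's.
conc = np.array([1, 2, 5, 10, 20, 50], dtype=float)
area = np.array([10.2, 20.5, 50.9, 101.8, 203.9, 509.0])
slope, intercept, sigma = calibration_stats(conc, area)
lod, loq = lod_loq(slope, sigma)
print(f"LOD = {lod:.2f} µg/mL, LOQ = {loq:.2f} µg/mL")
print(f"intra-day RSD = {rsd_percent([50.4, 50.9, 51.1, 50.6]):.2f} %")
print(f"recovery = {recovery_percent(measured=9.85, spiked=10.0):.1f} %")
```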
Artifacts Quantification of Metal Implants in MRI
NASA Astrophysics Data System (ADS)
Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.
2017-11-01
The presence of materials with different magnetic properties, such as metal implants, causes local distortion of the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. The artifact is then quantified in terms of its extent, by an automated cross entropy thresholding method, as an image area percentage. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman's rho = 0.62 and 0.802 in the case of titanium and stainless steel implants, respectively). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
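A minimal sketch of the general recipe (gradient magnitude followed by minimum cross entropy thresholding and an area percentage) is given below using scikit-image's sobel and threshold_li functions on a synthetic slice; it mirrors the described steps only loosely and is not the published implementation.

```python
import numpy as np
from skimage.filters import sobel, threshold_li

def artifact_area_percentage(image: np.ndarray) -> float:
    """Quantify abrupt signal alterations as an image-area percentage.

    The gradient magnitude highlights signal voids / pile-ups around the
    implant; Li's minimum cross entropy threshold then separates artifact
    from background without a manually chosen cut-off.
    """
    gradient = sobel(image.astype(float))
    threshold = threshold_li(gradient)
    artifact_mask = gradient > threshold
    return 100.0 * artifact_mask.mean()

# Synthetic "MR slice": smooth background with a sharp-edged signal void.
rng = np.random.default_rng(2)
slice_ = rng.normal(1.0, 0.02, size=(128, 128))
slice_[50:78, 50:78] = 0.05  # crude stand-in for a susceptibility void
print(f"artifact extent ~ {artifact_area_percentage(slice_):.1f} % of image area")
```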
Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations
Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.
2013-01-01
Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
Quantification by SEM-EDS in uncoated non-conducting samples
NASA Astrophysics Data System (ADS)
Galván Josa, V.; Castellano, G.; Bertolino, S. R.
2013-07-01
An approach to perform elemental quantitative analysis in a conventional scanning electron microscope with an energy dispersive spectrometer has been developed for non-conductive samples in which the conductive coating should be avoided. Charge accumulation effects, which basically decrease the energy of the primary beam, were taken into account by means of the Duane-Hunt limit. This value represents the maximum energy of the continuum X-ray spectrum, and is related to the effective energy of the incident electron beam. To validate the results obtained by this procedure, a non-conductive sample of known composition was quantified without conductive coating. Complementarily, changes in the X-ray spectrum due to charge accumulation effects were studied by Monte Carlo simulations, comparing relative characteristic intensities as a function of the incident energy. This methodology is exemplified here to obtain the chemical composition of white and reddish archaeological pigments belonging to the Ambato style of "Aguada" culture (Catamarca, Argentina 500-1100 AD). The results obtained in this work show that the quantification procedure taking into account the Duane-Hunt limit is suitable for this kind of samples. This approach may be recommended for the quantification of samples for which coating is not desirable, such as ancient artwork, forensic or archaeological samples, or when the coating element is also present in the sample.
Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo
2013-09-17
We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
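A hedged sketch of how the 70% guideline and the ratio-based calibration could be applied in practice is shown below; the definition of matrix suppression as the relative drop of the matrix ion signal, and all calibration numbers, are assumptions for illustration.

```python
def matrix_suppression_percent(matrix_ion_with_analyte, matrix_ion_alone):
    """Suppression of the matrix ion signal caused by analyte/contaminants.

    Defined here (as an assumption) as the relative drop of the matrix ion
    abundance compared with a matrix-only spectrum.
    """
    return 100.0 * (1.0 - matrix_ion_with_analyte / matrix_ion_alone)

def quantify_from_ratio(peptide_ion, matrix_ion, slope, intercept):
    """Concentration from a calibration of peptide-to-matrix ion abundance ratio."""
    return (peptide_ion / matrix_ion - intercept) / slope

suppression = matrix_suppression_percent(matrix_ion_with_analyte=2.4e5,
                                          matrix_ion_alone=6.0e5)
print(f"matrix suppression = {suppression:.0f} %")
if suppression > 70.0:
    print("above the 70 % guideline: quantification considered unreliable")
else:
    # Illustrative calibration parameters (ratio = slope*conc + intercept).
    conc = quantify_from_ratio(peptide_ion=1.8e4, matrix_ion=2.4e5,
                               slope=0.015, intercept=0.002)
    print(f"estimated peptide concentration ~ {conc:.1f} µM")
```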
Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen
2017-01-01
Meat products often consist of meat from multiple animal species, and food product adulteration and inaccurate labeling can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.
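Droplet digital PCR converts positive-droplet counts into copy concentrations through Poisson statistics, λ = −ln(1 − p) with p the positive fraction. The sketch below applies this generic ddPCR arithmetic to a beef/pork duplex readout; the droplet volume and counts are assumed values, not taken from the study.

```python
import math

def copies_per_microlitre(positive, total, droplet_volume_nl=0.85):
    """Target concentration from droplet counts via the Poisson correction.

    lambda = -ln(1 - p) is the mean copies per droplet, with p the fraction of
    positive droplets; dividing by the droplet volume gives copies/µL.  The
    0.85 nL droplet volume is a typical value, assumed here rather than taken
    from the paper.
    """
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / (droplet_volume_nl * 1e-3)  # nL -> µL

beef = copies_per_microlitre(positive=3200, total=15000)
pork = copies_per_microlitre(positive=900, total=15000)
print(f"beef: {beef:.0f} copies/µL, pork: {pork:.0f} copies/µL")
print(f"beef share of detected copies ~ {100.0 * beef / (beef + pork):.1f} %")
```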
de Kinkelder, R; van der Veen, R L P; Verbaak, F D; Faber, D J; van Leeuwen, T G; Berendschot, T T J M
2011-01-01
Purpose: Accurate assessment of the amount of macular pigment (MPOD) is necessary to investigate the role of carotenoids and their assumed protective functions. High repeatability and reliability are important to monitor patients in studies investigating the influence of diet and supplements on MPOD. We evaluated the Macuscope (Macuvision Europe Ltd., Lapworth, Solihull, UK), a recently introduced device for measuring MPOD using the technique of heterochromatic flicker photometry (HFP). We determined agreement with another HFP device (QuantifEye; MPS 9000 series: Tinsley Precision Instruments Ltd., Croydon, Essex, UK) and a fundus reflectance method. Methods: The right eyes of 23 healthy subjects (mean age 33.9±15.1 years) were measured. We determined agreement with QuantifEye and correlation with a fundus reflectance method. Repeatability of QuantifEye was assessed in 20 other healthy subjects (mean age 32.1±7.3 years). Repeatability was also compared with measurements by a fundus reflectance method in 10 subjects. Results: We found low agreement between test and retest measurements with Macuscope. The average difference and the limits of agreement were −0.041±0.32. We found high agreement between test and retest measurements of QuantifEye (−0.02±0.18) and the fundus reflectance method (−0.04±0.18). MPOD data obtained by Macuscope and QuantifEye showed poor agreement: −0.017±0.44. For Macuscope and the fundus reflectance method, the correlation coefficient was r=0.05 (P=0.83). A significant correlation of r=0.87 (P<0.001) was found between QuantifEye and the fundus reflectance method. Conclusions: Because repeatability of Macuscope measurements was low (i.e., wide limits of agreement) and MPOD values correlated poorly with the fundus reflectance method, and agreed poorly with QuantifEye, the tested Macuscope protocol seems less suitable for studying MPOD. PMID:21057522
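Agreement in the study is reported as a mean difference with limits of agreement, i.e. a Bland-Altman analysis. A minimal sketch of that computation on paired MPOD readings is given below; the readings are invented for illustration.

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Mean difference and 95 % limits of agreement between paired measurements."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    spread = 1.96 * diff.std(ddof=1)
    return bias, bias - spread, bias + spread

# Illustrative paired MPOD readings (test vs. retest), not the study data.
test = [0.35, 0.48, 0.22, 0.51, 0.40, 0.30]
retest = [0.33, 0.50, 0.25, 0.47, 0.42, 0.28]
bias, low, high = bland_altman(test, retest)
print(f"mean difference = {bias:+.3f}, limits of agreement = [{low:+.3f}, {high:+.3f}]")
```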
Elucidating rhizosphere processes by mass spectrometry - A review.
Rugova, Ariana; Puschenreiter, Markus; Koellensperger, Gunda; Hann, Stephan
2017-03-01
The presented review discusses state-of-the-art mass spectrometric methods which have been developed and applied for the investigation of chemical processes in the soil-root interface, the so-called rhizosphere. The physical and chemical characteristics of rhizosphere soil are to a great extent influenced by a complex mixture of compounds released from plant roots, i.e. root exudates, which have a high impact on nutrient and trace element dynamics in the soil-root interface as well as on microbial activities and soil physico-chemical characteristics. Chemical characterization as well as accurate quantification of the compounds present in the rhizosphere is a major prerequisite for a better understanding of rhizosphere processes and requires the development and application of advanced sampling procedures in combination with highly selective and sensitive analytical techniques. During the last years, targeted and non-targeted mass spectrometry-based methods have emerged, and their combination with specific separation methods for various elements and compounds of a wide polarity range has been successfully applied in several studies. With this review we critically discuss the work that has been conducted within the last decade in the context of rhizosphere research and elemental or molecular mass spectrometry, emphasizing different separation techniques such as GC, LC and CE. Moreover, selected applications such as metal detoxification or nutrient acquisition are discussed with regard to the mass spectrometric techniques applied in studies of root exudates in plant-bacteria interactions. Additionally, isotope probing, a more recent mass spectrometry-based application, is highlighted. Copyright © 2017 Elsevier B.V. All rights reserved.
Rodríguez, Pablo Fernández; Marchante-Gayón, Juan Manuel; Sanz-Medel, Alfredo
2006-01-15
Ultrasonic slurry sampling electrothermal vaporisation inductively coupled plasma mass spectrometry (USS-ETV-ICP-MS) was applied to the elemental analysis of silicate based minerals, such as talc or quartz, without any pre-treatment except the grinding of the sample. The electrothermal vaporisation device consists of a tungsten coil connected to a home-made power supply. The voltage program, carrier gas flow rate and sonication time were optimised in order to obtain the best sensitivity for elements determined. The relationship between the amount of sample in the slurry and the signal intensity was also evaluated. Unfortunately, in all cases, quantification had to be carried out by the standard additions method owing to the strong matrix interferences. The global precision of the proposed method was always better than 12%. The limits of detection, calculated as three times the standard deviation of the blank value divided by the slope of the calibration curve, were between 0.5 ng/g for As and 3.5 ng/g for Ba. The method was validated by comparing the concentrations found for Cu, Mn, Cr, V, Li, Pb, Sn, Mg, U, Ba, Sr, Zn, Sb, Rb and Ce using the proposed methodology with those obtained by conventional nebulisation ICP-MS after acid digestion of the samples in a microwave oven. The concentration range in the solid samples was between 0.2 microg/g for Cr and 60 microg/g for Ba. All results were statistically in agreement with those found by conventional nebulisation.
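The abstract specifies quantification by the standard additions method and a detection limit defined as three times the standard deviation of the blank divided by the calibration slope. The sketch below illustrates both calculations with invented numbers; none of the values are from the paper.

```python
import numpy as np

# Hypothetical USS-ETV-ICP-MS standard additions data (illustration only).
added = np.array([0.0, 2.0, 4.0, 8.0])              # ng/g of analyte added to slurry aliquots
signal = np.array([530.0, 905.0, 1280.0, 2030.0])    # instrument response

slope, intercept = np.polyfit(added, signal, 1)
c_sample = intercept / slope                          # magnitude of the x-intercept
print(f"analyte in the slurry: {c_sample:.2f} ng/g")

# Detection limit as defined in the abstract: 3 x SD(blank) / calibration slope.
blank = np.array([12.0, 15.0, 11.0, 14.0, 13.0, 12.0, 16.0, 13.0, 12.0, 14.0])
print(f"LOD: {3 * blank.std(ddof=1) / slope:.3f} ng/g")
```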
Fluorescent quantification of melanin.
Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur
2016-11-01
Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.
2015-01-01
Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788
NASA Astrophysics Data System (ADS)
de Oliveira Souza, Sidnei; da Costa, Silvânio Silvério Lopes; Santos, Dayane Melo; dos Santos Pinto, Jéssica; Garcia, Carlos Alexandre Borges; Alves, José do Patrocínio Hora; Araujo, Rennan Geovanny Oliveira
2014-06-01
An analytical method for simultaneous determination of macronutrients (Ca, Mg, Na and P), micronutrients (Cu, Fe, Mn and Zn) and trace elements (Al, As, Cd, Pb and V) in mineral fertilizers was optimized. A two-level full factorial design was applied to evaluate the optimal proportions of reagents used in the sample digestion on a hot plate. A Doehlert design for two variables was used to evaluate the operating conditions of the inductively coupled plasma optical emission spectrometer in order to accomplish the simultaneous determination of the analyte concentrations. The limits of quantification (LOQs) ranged from 2.0 mg kg⁻¹ for Mn to 77.3 mg kg⁻¹ for P. The accuracy and precision of the proposed method were evaluated by analysis of standard reference materials (SRMs) of Western phosphate rock (NIST 694), Florida phosphate rock (NIST 120C) and Trace elements in multi-nutrient fertilizer (NIST 695), considered to be adequate for simultaneous determination. Twenty-one samples of mineral fertilizers collected in Sergipe State, Brazil, were analyzed. For all samples, the As, Ca, Cd and Pb concentrations were below the LOQ values of the analytical method. For As, Cd and Pb the obtained LOQ values were below the maximum limit allowed by the Brazilian Ministry of Agriculture, Livestock and Food Supply (Ministério da Agricultura, Pecuária e Abastecimento - MAPA). The optimized method presented good accuracy and was effectively applied to quantitative simultaneous determination of the analytes in mineral fertilizers by inductively coupled plasma optical emission spectrometry (ICP OES).
Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin
2012-08-01
A facile proteomic quantification method, fluorescent labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, the FLAQ method relies on chromatography-based quantification in combination with MS for identification. Multidimensional liquid chromatography (MDLC) with high-accuracy laser-induced fluorescence (LIF) detection and a tandem MS system were employed for FLAQ. Several requirements should be met for fluorescent labeling in MS identification: labeling completeness, minimal side-reactions, simple MS spectra, and no extra tandem MS fragmentation for structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was finally chosen to label proteins on all cysteine residues. The fluorescent dye was compatible with the trypsin digestion process and MALDI MS identification. Quantitative labeling was achieved by optimization of the reaction conditions. A synthesized peptide and the model proteins BSA (35 cysteines) and OVA (five cysteines) were used to verify the completeness of labeling. Proteins were separated by MDLC and quantified based on fluorescence intensities, followed by MS identification. High accuracy (RSD < 1.58%) and wide quantification linearity (1-10⁵) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. A subset of proteins in the human liver proteome was quantified to demonstrate FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Bonta, Maximilian; Török, Szilvia; Hegedus, Balazs; Döme, Balazs; Limbeck, Andreas
2017-03-01
Laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS) is one of the most commonly applied methods for lateral trace element distribution analysis in medical studies. Many improvements of the technique regarding quantification and achievable lateral resolution have been made in recent years. Nevertheless, sample preparation is also of major importance and the optimal sample preparation strategy still has not been defined. While a number of sample pre-treatment strategies are established in conventional histology, little is known about the effect of these approaches on the lateral distributions of elements and/or their quantities in tissues. The technique of formalin fixation and paraffin embedding (FFPE) has emerged as the gold standard in tissue preparation. However, its potential use for elemental distribution studies is questionable due to the large number of sample preparation steps. In this work, LA-ICP-MS was used to examine the applicability of the FFPE sample preparation approach for elemental distribution studies. Qualitative elemental distributions as well as quantitative concentrations in cryo-cut tissues and FFPE samples were compared. Results showed that some metals (especially Na and K) are severely affected by the FFPE process, whereas others (e.g., Mn, Ni) are less influenced. Based on these results, a general recommendation can be given: FFPE samples are completely unsuitable for the analysis of alkali metals. When analyzing transition metals, FFPE samples can give comparable results to snap-frozen tissues. Graphical abstract Sample preparation strategies for biological tissues are compared with regard to the elemental distributions and average trace element concentrations.
Development and application of a novel metric to assess effectiveness of biomedical data
Bloom, Gregory C; Eschrich, Steven; Hang, Gang; Schabath, Matthew B; Bhansali, Neera; Hoerter, Andrew M; Morgan, Scott; Fenstermacher, David A
2013-01-01
Objective Design a metric to assess the comparative effectiveness of biomedical data elements within a study that incorporates their statistical relatedness to a given outcome variable as well as a measurement of the quality of their underlying data. Materials and methods The cohort consisted of 874 patients with adenocarcinoma of the lung, each with 47 clinical data elements. The p value for each element was calculated using the Cox proportional hazard univariable regression model with overall survival as the endpoint. An attribute or A-score was calculated by quantification of an element's four quality attributes; Completeness, Comprehensiveness, Consistency and Overall-cost. An effectiveness or E-score was obtained by calculating the conditional probabilities of the p-value and A-score within the given data set with their product equaling the effectiveness score (E-score). Results The E-score metric provided information about the utility of an element beyond an outcome-related p value ranking. E-scores for elements age-at-diagnosis, gender and tobacco-use showed utility above what their respective p values alone would indicate due to their relative ease of acquisition, that is, higher A-scores. Conversely, elements surgery-site, histologic-type and pathological-TNM stage were down-ranked in comparison to their p values based on lower A-scores caused by significantly higher acquisition costs. Conclusions A novel metric termed E-score was developed which incorporates standard statistics with data quality metrics and was tested on elements from a large lung cohort. Results show that an element's underlying data quality is an important consideration in addition to p value correlation to outcome when determining the element's clinical or research utility in a study. PMID:23975264
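The abstract defines the E-score only loosely, as the product of two conditional probabilities derived from an element's outcome-related p value and its quality (A) score. The sketch below shows one plausible reading using empirical within-cohort probabilities; the scoring function, names, and numbers are assumptions for illustration and are not the authors' implementation.

```python
import numpy as np

# Hypothetical per-element Cox p-values and quality (A) scores (illustration only).
p_values = np.array([0.001, 0.04, 0.20, 0.0005, 0.35])
a_scores = np.array([0.90, 0.60, 0.85, 0.40, 0.95])

def empirical_prob(values, smaller_is_better=False):
    """Empirical probability weight of each value within the cohort (rank / n)."""
    ranks = values.argsort().argsort() + 1            # 1 = smallest value
    prob = ranks / len(values)
    return 1.0 - prob + 1.0 / len(values) if smaller_is_better else prob

# One plausible reading: product of the two empirical probabilities.
e_scores = empirical_prob(p_values, smaller_is_better=True) * empirical_prob(a_scores)
print(np.round(e_scores, 3))
```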
A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.
John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp. Limnol. Oceanogr. Methods. 19 p. (ERL, GB 1192).
We have developed a simple, mild extraction procedure using methanol which, when...
Unified framework for information integration based on information geometry
Oizumi, Masafumi; Amari, Shun-ichi
2016-01-01
Assessment of causal influences is a ubiquitous and important subject across diverse research fields. Drawn from consciousness studies, integrated information is a measure that defines integration as the degree of causal influences among elements. Whereas pairwise causal influences between elements can be quantified with existing methods, quantifying multiple influences among many elements poses two major mathematical difficulties. First, overestimation occurs due to interdependence among influences if each influence is separately quantified in a part-based manner and then simply summed over. Second, it is difficult to isolate causal influences while avoiding noncausal confounding influences. To resolve these difficulties, we propose a theoretical framework based on information geometry for the quantification of multiple causal influences with a holistic approach. We derive a measure of integrated information, which is geometrically interpreted as the divergence between the actual probability distribution of a system and an approximated probability distribution where causal influences among elements are statistically disconnected. This framework provides intuitive geometric interpretations harmonizing various information theoretic measures in a unified manner, including mutual information, transfer entropy, stochastic interaction, and integrated information, each of which is characterized by how causal influences are disconnected. In addition to the mathematical assessment of consciousness, our framework should help to analyze causal relationships in complex systems in a complete and hierarchical manner. PMID:27930289
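The core quantity described here is a divergence between the actual probability distribution of a system and an approximation in which influences among elements are statistically disconnected. The toy sketch below computes that divergence for two binary elements by disconnecting them into the product of their marginals (in this static case the result reduces to the mutual information); the numbers are invented, and the paper's full measure disconnects causal influences across time rather than this simple co-occurrence structure.

```python
import numpy as np

# Toy joint distribution of two binary elements X and Y (rows: x, cols: y).
p_joint = np.array([[0.40, 0.10],
                    [0.10, 0.40]])

# "Disconnected" approximation: product of the marginals, i.e. influences removed.
p_x = p_joint.sum(axis=1, keepdims=True)
p_y = p_joint.sum(axis=0, keepdims=True)
p_disconnected = p_x * p_y

# Divergence between the actual and the disconnected distribution.
kl = np.sum(p_joint * np.log2(p_joint / p_disconnected))
print(f"D_KL(actual || disconnected) = {kl:.3f} bits")   # equals the mutual information here
```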
NASA Astrophysics Data System (ADS)
Guerra, M.; Ferreira, C.; Carvalho, M. L.; Santos, J. P.; Pessanha, S.
2016-08-01
Over the years, the presence of mercury in amalgam fillings has raised some safety concerns. Amalgam is one of the most commonly used tooth fillings and contains approximately 50% elemental mercury and 50% other metals, mostly silver, tin and copper. Amalgam can release small amounts of mercury vapor over time, and patients can absorb these vapors by inhaling or ingesting them. In this study, 10 human teeth treated with dental amalgam were analyzed using energy dispersive X-ray fluorescence (EDXRF) to study the diffusion of its constituents, Ag, Cu, Sn and Hg. The EDXRF setup used employs a polycapillary lens to focus the radiation to a spot of about 25 μm, allowing mapping of the elemental distribution in the samples. Quantification was performed using the built-in software based on the Fundamental Parameters method for bulk samples, considering a hydroxyapatite matrix. The teeth were cut longitudinally and each slice was scanned from the surface enamel to the inner region (dentin and pulp cavity). Mercury concentration profiles show high levels of this element close to the amalgam region, decreasing significantly in the dentin, and increasing again up to 40,000 μg·g⁻¹ in the cavity where the pulp used to exist when the tooth was vital.
Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H
2013-08-01
Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
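The key idea reported here is that droplet fluorescence correlates with amplicon size, so a single ddPCR run can return both a size estimate and an absolute concentration, the latter via the usual Poisson correction of the positive-droplet fraction. The sketch below illustrates both calculations with invented numbers; the calibration values and the nominal droplet volume are assumptions, not data from the paper.

```python
import numpy as np

# Hypothetical calibration: mean positive-droplet fluorescence vs amplicon size.
size_bp = np.array([150., 300., 450., 600.])
fluor = np.array([9200., 7800., 6700., 5900.])       # arbitrary units
a, b = np.polyfit(fluor, size_bp, 1)                  # simple linear map, illustration only

# Poisson-corrected concentration from the fraction of positive droplets.
n_total, n_positive = 15000, 4200
droplet_volume_nl = 0.85                              # assumed nominal droplet volume
copies_per_ul = -np.log(1 - n_positive / n_total) / (droplet_volume_nl * 1e-3)

est_size = a * 8300. + b                              # size estimate for a library at 8300 a.u.
print(f"{copies_per_ul:.0f} copies/uL, estimated amplicon size ~{est_size:.0f} bp")
```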
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with a better dynamic range.
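freeQuant's exact algorithm is not given in the abstract; the sketch below shows a generic spectral-count scheme in the same spirit, distributing shared-peptide counts in proportion to each isoform's unique evidence and then normalizing counts by sequence length (an NSAF-style abundance). All names and numbers are invented for illustration.

```python
# Generic spectral-count quantification sketch (not freeQuant's actual algorithm).
counts_unique = {"ISO1": 100, "ISO2": 20, "P3": 8}   # spectra matched by unique peptides
lengths = {"ISO1": 550, "ISO2": 480, "P3": 410}      # protein sequence lengths (residues)
shared = {("ISO1", "ISO2"): 30}                      # spectra matched by shared peptides

# Distribute shared-peptide spectra in proportion to each isoform's unique evidence.
counts = dict(counts_unique)
for group, n_shared in shared.items():
    total_unique = sum(counts_unique[p] for p in group)
    for p in group:
        counts[p] += n_shared * counts_unique[p] / total_unique

# Length-normalised, sample-normalised abundance (NSAF-style).
saf = {p: counts[p] / lengths[p] for p in counts}
total = sum(saf.values())
nsaf = {p: saf[p] / total for p in saf}
print({p: round(v, 3) for p, v in nsaf.items()})
```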
Turner, Clare E; Russell, Bruce R; Gant, Nicholas
2015-11-01
Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6%±8.6). Results suggest that software packages that employ the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V
2015-12-01
Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Characterization of Nanopipettes.
Perry, David; Momotenko, Dmitry; Lazenby, Robert A; Kang, Minkyung; Unwin, Patrick R
2016-05-17
Nanopipettes are widely used in electrochemical and analytical techniques as tools for sizing, sequencing, sensing, delivery, and imaging. For all of these applications, the response of a nanopipette is strongly affected by its geometry and surface chemistry. As the size of nanopipettes becomes smaller, precise geometric characterization is increasingly important, especially if nanopipette probes are to be used for quantitative studies and analysis. This contribution highlights the combination of data from voltage-scanning ion conductivity experiments, transmission electron microscopy and finite element method simulations to fully characterize nanopipette geometry and surface charge characteristics, with an accuracy not achievable using existing approaches. Indeed, it is shown that presently used methods for characterization can lead to highly erroneous information on nanopipettes. The new approach to characterization further facilitates high-level quantification of the behavior of nanopipettes in electrochemical systems, as demonstrated herein for a scanning ion conductance microscope setup.
Quantification of mixed chimerism by real time PCR on whole blood-impregnated FTA cards.
Pezzoli, N; Silvy, M; Woronko, A; Le Treut, T; Lévy-Mozziconacci, A; Reviron, D; Gabert, J; Picard, C
2007-09-01
This study investigated quantification of chimerism in sex-mismatched transplantations by quantitative real-time PCR (RQ-PCR) using FTA paper for blood sampling. First, we demonstrate that the quantification of DNA from EDTA-blood deposited on FTA cards is accurate and reproducible. Secondly, we show that the fraction of recipient cells detected by RQ-PCR was concordant between the FTA method and the salting-out method, the reference DNA extraction method. Furthermore, the sensitivity of detection of recipient cells is similar with the two methods. Our results show that this innovative method can be used for mixed chimerism (MC) assessment by RQ-PCR.
Biniarz, Piotr; Łukaszewicz, Marcin
2017-06-01
The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.
2015-01-01
Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MSE quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MSE quantification method using the open source software Skyline. PMID:25552291
Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun
2015-01-21
Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.
Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier
2018-06-01
Since its first description, Western blot has been widely used in molecular biology labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The quantification step of a Western blot is critical for obtaining accurate and reproducible results. Given the technical knowledge required for densitometry analysis and the limited resources often available, standard office scanners are frequently used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that can be used when resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
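The abstract does not spell out its background subtraction scheme; the sketch below shows only the generic idea of densitometric band quantification with a locally estimated background (strips above and below the band), using synthetic pixel data. It is not the authors' ImageJ procedure.

```python
import numpy as np

# Synthetic 8-bit-like scan of one lane (rows x cols); illustration only.
rng = np.random.default_rng(0)
lane = 230 - rng.normal(0, 2, size=(200, 60))        # bright film background
lane[90:110, 10:50] -= 120                            # darker band

band = lane[90:110, 10:50]
# Local background estimated from strips just above and below the band.
background = np.concatenate([lane[70:85, 10:50], lane[115:130, 10:50]]).mean()

# Use inverted intensity relative to the local background as a simple density proxy.
band_volume = (background - band).clip(min=0).sum()
print(f"background-subtracted band volume: {band_volume:.0f}")
```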
NASA Astrophysics Data System (ADS)
Mallah, Muhammad Ali; Sherazi, Syed Tufail Hussain; Bhanger, Muhammad Iqbal; Mahesar, Sarfaraz Ahmed; Bajeer, Muhammad Ashraf
2015-04-01
A transmission FTIR spectroscopic method was developed for direct, inexpensive and fast quantification of paracetamol content in solid pharmaceutical formulations. In this method the paracetamol content is analyzed directly without solvent extraction. KBr pellets were prepared for the acquisition of FTIR spectra in transmission mode. Two chemometric models, simple Beer's law and partial least squares, were employed over the spectral region of 1800-1000 cm⁻¹ for quantification of paracetamol content, with a regression coefficient (R²) of 0.999. The limits of detection and quantification using FTIR spectroscopy were 0.005 mg g⁻¹ and 0.018 mg g⁻¹, respectively. An interference study was also performed to check the effect of the excipients; there was no significant interference from the sample matrix. The results clearly demonstrate the sensitivity of the transmission FTIR spectroscopic method for pharmaceutical analysis. This method is green in the sense that it does not require large volumes of hazardous solvents or long run times and avoids prior sample preparation.
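A minimal sketch of the two chemometric routes named above, univariate Beer's-law calibration and partial least squares over a spectral window, using simulated spectra. The band position, concentration range, and noise level are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
wn = np.linspace(1800, 1000, 400)                     # wavenumber axis, cm^-1
band = np.exp(-((wn - 1515) / 20) ** 2)               # a paracetamol-like band (assumed)
conc = np.linspace(0.1, 2.0, 12)                      # mg/g in the KBr pellet (assumed)
spectra = conc[:, None] * band + rng.normal(0, 0.005, (12, 400))

# Route 1: Beer's law - absorbance at the band maximum is linear in concentration.
peak_abs = spectra[:, np.argmin(np.abs(wn - 1515))]
slope, intercept = np.polyfit(conc, peak_abs, 1)

# Route 2: partial least squares over the whole 1800-1000 cm^-1 window.
pls = PLSRegression(n_components=2).fit(spectra, conc)
r2 = np.corrcoef(conc, pls.predict(spectra).ravel())[0, 1] ** 2
print(f"Beer's law slope = {slope:.3f} per mg/g, PLS R^2 = {r2:.4f}")
```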
Gimeno, Pascal; Maggio, Annie-Françoise; Bousquet, Claudine; Quoirez, Audrey; Civade, Corinne; Bonnet, Pierre-Antoine
2012-08-31
Esters of phthalic acid, more commonly named phthalates, may be present in cosmetic products as ingredients or contaminants. Their presence as contaminants can be due to the manufacturing process, to raw materials used or to the migration of phthalates from packaging when plastic (polyvinyl chloride, PVC) is used. Eight phthalates (DBP, DEHP, BBP, DMEP, DnPP, DiPP, DPP, and DiBP), classified H360 or H361, are forbidden in cosmetics according to the European regulation on cosmetics 1223/2009. A GC/MS method was developed for the assay of 12 phthalates in cosmetics, including the 8 regulated phthalates. Analyses are carried out on a GC/MS system in electron impact ionization (EI) mode. The separation of phthalates is obtained on a cross-linked 5%-phenyl/95%-dimethylpolysiloxane capillary column, 30 m × 0.25 mm (i.d.) × 0.25 μm film thickness, using a temperature gradient. Phthalate quantification is performed by external calibration using an internal standard. Validation elements obtained on standard solutions highlight satisfactory system conformity (resolution > 1.5), a common quantification limit of 0.25 ng injected, acceptable linearity between 0.5 μg mL⁻¹ and 5.0 μg mL⁻¹, and precision and accuracy in agreement with in-house specifications. Cosmetic samples ready for analytical injection are analyzed after a dilution in ethanol, whereas more complex cosmetic matrices, like milks and creams, are assayed after a liquid/liquid extraction using tert-butyl methyl ether (TBME). Depending on the type of cosmetics analyzed, the common limits of quantification for the 12 phthalates were set at 0.5 or 2.5 μg g⁻¹. All samples were assayed using the analytical approach described in the ISO 12787 international standard "Cosmetics-Analytical methods-Validation criteria for analytical results using chromatographic techniques". This analytical protocol is particularly well suited when it is not possible to prepare reconstituted sample matrices. Copyright © 2012 Elsevier B.V. All rights reserved.
Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.
Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A
2017-04-01
Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require a comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified, but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 Cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for the human plasma, and it entailed a single method for sample preparation, enabling quick processing of the samples followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple method for the sample preparation followed by an LC-MS method with a short run time. Therefore, this analytical method provides a useful method for both clinical and research purposes.
Probabilistic simulation of uncertainties in thermal structures
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Shiao, Michael
1990-01-01
Development of probabilistic structural analysis methods for hot structures is a major activity at Lewis Research Center. It consists of five program elements: (1) probabilistic loads; (2) probabilistic finite element analysis; (3) probabilistic material behavior; (4) assessment of reliability and risk; and (5) probabilistic structural performance evaluation. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) blade temperature, pressure, and torque of the Space Shuttle Main Engine (SSME); (2) the evaluation of the cumulative distribution function for various structural response variables based on assumed uncertainties in primitive structural variables; (3) evaluation of the failure probability; (4) reliability and risk-cost assessment, and (5) an outline of an emerging approach for eventual hot structures certification. Collectively, the results demonstrate that the structural durability/reliability of hot structural components can be effectively evaluated in a formal probabilistic framework. In addition, the approach can be readily extended to computationally simulate certification of hot structures for aerospace environments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salinger, Andrew; Phipps, Eric; Ostien, Jakob
2016-01-13
The Albany code is a general-purpose finite element code for solving partial differential equations (PDEs). Albany is a research code that demonstrates how a PDE code can be built by interfacing many of the open-source software libraries that are released under Sandia's Trilinos project. Part of the mission of Albany is to be a testbed for new Trilinos libraries, to refine their methods, usability, and interfaces. Albany includes hooks to optimization and uncertainty quantification algorithms, including those in Trilinos as well as those in the Dakota toolkit. Because of this, Albany is a desirable starting point for new code development efforts that wish to make heavy use of Trilinos. Albany is both a framework and the host for specific finite element applications. These applications have project names, and can be controlled by configuration options when the code is compiled, but are all developed and released as part of the single Albany code base; these include the LCM, QCAD, FELIX, Aeras, and ATO applications.
Sanati Nezhad, Amir; Naghavi, Mahsa; Packirisamy, Muthukumaran; Bhat, Rama; Geitmann, Anja
2013-01-01
Tip-growing cells have the unique property of invading living tissues and abiotic growth matrices. To do so, they exert significant penetrative forces. In plant and fungal cells, these forces are generated by the hydrostatic turgor pressure. Using the TipChip, a microfluidic lab-on-a-chip device developed for tip-growing cells, we tested the ability of pollen tubes, the fastest-growing plant cells, to exert penetrative forces. The tubes were guided to grow through microscopic gaps made of elastic polydimethylsiloxane material. Based on the deformation of the gaps, the force exerted by the elongating tubes to permit passage was determined using finite element methods. The data revealed that increasing mechanical impedance was met by the pollen tubes through modulation of the cell wall compliance and, thus, a change in the force acting on the obstacle. Tubes that successfully passed a narrow gap frequently burst, raising questions about the sperm discharge mechanism in flowering plants. PMID:23630253
NASA Astrophysics Data System (ADS)
Schwieters, Timo; Evertz, Marco; Mense, Maximilian; Winter, Martin; Nowak, Sascha
2017-07-01
In this work we present a new LA-ICP-MS method to quantitatively determine the lithium content in aged graphite electrodes of a lithium-ion battery (LIB) by performing total depth profiling. Matrix-matched solid external standards are prepared using a solid doping approach to avoid elemental fractionation effects during the measurement. The results are compared and matched to the established ICP-OES technique for bulk quantification after microwave-assisted acid digestion. The method is applied to aged graphite electrodes in order to determine the lithium immobilization (= "Li loss") in the solid electrolyte interphase (SEI) after the first formation cycle. For this, different samples, including a reference sample, are created to obtain varying thicknesses of the SEI covering the electrode particles. By applying defined charging voltages, an initial lithiation process is performed to obtain specific graphite intercalation compounds (GICs, with target stoichiometries of LiC30, LiC18, LiC12 and LiC6). Afterwards, the graphite electrode is completely discharged to obtain samples without mobile, and thus active, lithium in its lattice. By subtracting the amount of lithium that originates from residues of the LiPF6-containing electrolyte, the amount of lithium bound in the SEI can be determined.
1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.
Dagnino, Denise; Schripsema, Jan
2005-08-01
A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins priced in the range of a million dollars per gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
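Quantification by 1H NMR typically relates the analyte's signal integral to that of an internal standard of known mass, correcting for the number of protons per signal and the molar masses. A minimal sketch is given below; the choice of standard, the integrals, and the masses are assumptions for illustration, not values from the paper.

```python
# qNMR quantification sketch (illustrative values, not from the paper).
I_a, N_a, M_a = 0.25, 1, 165.2        # analyte: integral, protons per signal, g/mol (anatoxin-a)
I_s, N_s, M_s, m_s_mg = 3.45, 3, 194.2, 1.20  # assumed internal standard (caffeine-like) and its mass

m_a_mg = m_s_mg * (I_a / I_s) * (N_s / N_a) * (M_a / M_s)
print(f"anatoxin-a in the weighed standard solution: {m_a_mg * 1000:.0f} ug")
```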
Jiang, Jun; Feng, Liang; Li, Jie; Sun, E; Ding, Shu-Min; Jia, Xiao-Bin
2014-04-10
Suet oil (SO) has been used commonly for food and medicine preparation. The determination of its elemental composition has become an important challenge for human safety and health owing to its possible content of heavy metals or other elements. In this study, ultrawave single reaction chamber microwave digestion (Ultrawave) and inductively coupled plasma-mass spectrometry (ICP-MS) analysis were performed to determine 14 elements (Pb, As, Hg, Cd, Fe, Cu, Mn, Ti, Ni, V, Sr, Na, K and Ca) in SO samples. Furthermore, the multielemental content of 18 SO samples, which represented three different sources in China (Qinghai, Anhui and Jiangsu), was evaluated and compared. The optimal ultrawave digestion conditions, namely, the optimal time (35 min), temperature (210 °C) and pressure (90 bar), were screened by Box-Behnken design (BBD). Eighteen samples were successfully classified into three groups by principal component analysis (PCA) according to the contents of the 14 elements. The results showed that all SO samples were rich in elements, but with significant differences corresponding to different origins. The outliers and the majority of SO samples could be discriminated by PCA according to the multielemental content profile. The results highlighted that the element distribution was associated with the origins of the SO samples. The proposed ultrawave digestion system was quite efficient and convenient, which could be mainly attributed to its high pressure and high throughput in the sample digestion procedure. Our established method could be useful for the quality control and standardization of elements in SO samples and products.
Wu, Chung-Che; Burger, Marcel; Günther, Detlef; Shen, Chuan-Chou; Hattendorf, Bodo
2018-08-14
This work presents a high-sensitivity approach to quantify ultra-trace concentrations of rare earth elements (REEs) in speleothem carbonates using open-cell laser ablation-sector field-inductively coupled plasma mass spectrometry (open-cell LA-SF-ICPMS). Specifically, open-cell LA in combination with a gas exchange device enabled sampling of large-scale carbonate specimens in an ambient environment. The use of a "jet" vacuum interface and the addition of small amounts of N2 gas allowed for a 20-40-fold sensitivity enhancement compared to the conventional interface configuration. Mass load effects, quantification capabilities and detection power were investigated in analyses of reference materials using various combinations of spot sizes and laser repetition rates. With a 160 μm diameter circular laser spot and a 10 Hz ablation frequency, limits of detection were in the low- or sub-ng g⁻¹ range for REEs. Little dependence of Ca-normalized sensitivity factors on the amount of material introduced into the plasma was observed. Relative deviations of quantified concentrations from USGS MACS-3 preferred values were smaller than 12%. The analytical approach enabled the determination of REE concentration profiles at the single-digit ng g⁻¹ level. Application to a 15-cm stalagmite piece collected from East Timor revealed at least two abrupt elevations in light rare earth elements (LREEs) within a scanning distance of 8 mm. These anomaly regions extended over a distance of ≈200 μm and showed LREE abundances elevated by at least one order of magnitude. This high-resolution open-cell LA-SF-ICPMS method has the potential to be applied in micro-domain analyses of other natural carbonates, such as travertine, tufa, and flowstones. This is promising for a better understanding in the earth and environmental sciences. Copyright © 2018 Elsevier B.V. All rights reserved.
Use of shape-preserving interpolation methods in surface modeling
NASA Technical Reports Server (NTRS)
Fritsch, F. N.
1984-01-01
In many large-scale scientific computations, it is necessary to use surface models based on information provided at only a finite number of points (rather than determined everywhere via an analytic formula). As an example, an equation of state (EOS) table may provide values of pressure as a function of temperature and density for a particular material. These values, while known quite accurately, are typically known only on a rectangular (but generally quite nonuniform) mesh in (T,d)-space. Thus interpolation methods are necessary to completely determine the EOS surface. The most primitive EOS interpolation scheme is bilinear interpolation. This has the advantages of depending only on local information, so that changes in data remote from a mesh element have no effect on the surface over the element, and of preserving shape information, such as monotonicity. Most scientific calculations, however, require greater smoothness. Standard higher-order interpolation schemes, such as Coons patches or bicubic splines, while providing the requisite smoothness, tend to produce surfaces that are not physically reasonable. This means that the interpolant may have bumps or wiggles that are not supported by the data. The mathematical quantification of ideas such as physically reasonable and visually pleasing is examined.
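The contrast drawn above, between smooth interpolants that introduce unsupported wiggles and shape-preserving ones that respect monotonicity, is easy to reproduce with standard library routines. The sketch below compares an unconstrained cubic spline with a monotone piecewise cubic interpolant (PCHIP) on step-like tabular data; the data are invented, not an actual EOS table.

```python
import numpy as np
from scipy.interpolate import CubicSpline, PchipInterpolator

# Invented step-like table, e.g. a physical quantity with a sharp transition.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([1.00, 1.01, 1.02, 10.00, 10.01, 10.02])

xs = np.linspace(1.0, 6.0, 500)
smooth = CubicSpline(x, y)(xs)           # C2-smooth, but free to overshoot near the jump
shape = PchipInterpolator(x, y)(xs)      # shape-preserving: monotone data stay monotone

print("cubic spline monotone:", bool(np.all(np.diff(smooth) >= 0)))   # typically False here
print("PCHIP monotone:       ", bool(np.all(np.diff(shape) >= 0)))    # True
```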
Effect of membrane filtration artifacts on dissolved trace element concentrations
Horowitz, Arthur J.; Elrick, Kent A.; Colberg, Mark R.
1992-01-01
Among environmental scientists, the current and almost universally accepted definition of dissolved constituents is an operational one; only those materials which pass through a 0.45-μm membrane filter are considered to be dissolved. Detailed laboratory and field studies on Fe and Al indicate that a number of factors associated with filtration, other than just pore size, can substantially alter 'dissolved' trace element concentrations; these include: filter type, filter diameter, filtration method, volume of sample processed, suspended sediment concentration, suspended sediment grain-size distribution, concentration of colloids and colloidally associated trace elements and concentration of organic matter. As such, reported filtered-water concentrations employing the same pore size filter may not be equal. Filtration artifacts may lead to the production of chemical data that indicate seasonal or annual 'dissolved' chemical trends which do not reflect actual environmental conditions. Further, the development of worldwide averages for various dissolved chemical constituents, the quantification of geochemical cycles, and the determination of short- or long-term environmental chemical trends may be subject to substantial errors, due to filtration artifacts, when data from the same or multiple sources are combined. Finally, filtration effects could have a substantial impact on various regulatory requirements.
The effect of membrane filtration artifacts on dissolved trace element concentrations
Horowitz, A.J.; Elrick, K.A.; Colberg, M.R.
1992-01-01
Among environmental scientists, the current and almost universally accepted definition of dissolved constituents is an operational one: only those materials which pass through a 0.45-μm membrane filter are considered to be dissolved. Detailed laboratory and field studies on Fe and Al indicate that a number of factors associated with filtration, other than just pore size, can substantially alter 'dissolved' trace element concentrations; these include: filter type, filter diameter, filtration method, volume of sample processed, suspended sediment concentration, suspended sediment grain-size distribution, concentration of colloids and colloidally-associated trace elements and concentration of organic matter. As such, reported filtered-water concentrations employing the same pore size filter may not be equal. Filtration artifacts may lead to the production of chemical data that indicate seasonal or annual 'dissolved' chemical trends which do not reflect actual environmental conditions. Further, the development of worldwide averages for various dissolved chemical constituents, the quantification of geochemical cycles, and the determination of short- or long-term environmental chemical trends may be subject to substantial errors, due to filtration artifacts, when data from the same or multiple sources are combined. Finally, filtration effects could have a substantial impact on various regulatory requirements.
Organophilic clays as a tracer to determine Erosion processes
NASA Astrophysics Data System (ADS)
Mentler, A.; Strauss, P.; Schomakers, J.; Hann, S.; Köllensberger, G.; Ottner, F.
2009-04-01
In recent years the use of new tracing techniques to measure soil erosion has gained attention. Besides long-established isotopic methods, the use of rare earth elements has been reported. We wanted to contribute to the efforts to obtain better methods for determining surface soil movement and tested a novel method using organophilic clays as a tracer for erosion-related studies. So far, tests to extract organophilic clays from soil have been performed successfully using an industrially produced organophilic bentonite (Tixogel TVZ, Süd-Chemie) treated with quaternary ammonium surfactants. A liquid extraction method with barium ions (Ba2+) and methanol was used to extract the n-alkyl ammonium compounds from the intercrystalline layers of the modified bentonite. To increase extraction efficiency, an ultrasound device was used (UW 2200 Bandelin, 10,000 cycles per second, vibration amplitude 54 µm, sonication time of one minute). This procedure led to a recovery rate of about 85% for the organophilic bentonite, which was clearly superior to alternative extraction methods such as acetonitrile in different mixing ratios. Quantification of the extracted surfactants was performed via high-performance liquid chromatography-mass spectrometry (HPLC-MS, Agilent 1200 SL HPLC and 6220 time-of-flight MS). The mass spectra of this industrially produced organophilic clay mineral showed four different molecular masses (M+H+ of 304.30, 332.33, 360.36 and 388.39). The four substances could be separated by HPLC (20 x 2 mm Zorbax C18 reversed-phase column, 0.5 mL/min isocratic flow with 90% acetonitrile and 0.1% formic acid in water, run time of 7 minutes). The linear working range of the method was 5 to 1000 µg/L, with a limit of quantification of 1 µg/L n-alkyl ammonium compound. All four compounds of the Tixogel were extracted with identical extraction efficiencies and are hence suitable for accurate quantification procedures. The next steps in developing the methodology are the application of the organophilic clays in an indoor rainfall simulation experiment at a small scale of 2 m². So far the methodology has been tested only for one particular soil. Future tests will be performed to see whether the chosen methodology needs soil-specific treatment when applied to more soils of different textural composition.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... DEPARTMENT OF AGRICULTURE [Docket Number: USDA-2013-0003] Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From Agriculture and Forestry Practices AGENCY: Office of the... of Agriculture (USDA) has prepared a report containing methods for quantifying entity-scale...
[DNA quantification of blood samples pre-treated with pyramidon].
Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan
2014-06-01
To study DNA quantification and STR typing of samples pre-treated with pyramidon. Blood samples from ten unrelated individuals were anticoagulated in EDTA and blood stains were made on filter paper. The samples were divided into six experimental groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed by PCR-STR fluorescent technology. For a given extraction method, the amount of sample DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, the DNA yields of the different extraction methods differed significantly. Complete sixteen-locus DNA typing was obtained for 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction is the best method for DNA extraction and STR profiling.
Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS
Chitranshi, Priyanka; da Costa, Gonçalo Gamboa
2016-01-01
We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
Constellation Program Lessons Learned in the Quantification and Use of Aerodynamic Uncertainty
NASA Technical Reports Server (NTRS)
Walker, Eric L.; Hemsch, Michael J.; Pinier, Jeremy T.; Bibb, Karen L.; Chan, David T.; Hanke, Jeremy L.
2011-01-01
The NASA Constellation Program has worked for the past five years to develop a replacement for the current Space Transportation System. Of the elements that form the Constellation Program, only two require databases that define aerodynamic environments and their respective uncertainty: the Ares launch vehicles and the Orion crew and launch abort vehicles. Teams were established within the Ares and Orion projects to provide representative aerodynamic models including both baseline values and quantified uncertainties. A technical team was also formed within the Constellation Program to facilitate integration among the project elements. This paper is a summary of the collective experience of the three teams working with the quantification and use of uncertainty in aerodynamic environments: the Ares and Orion project teams as well as the Constellation integration team. Not all of the lessons learned discussed in this paper could be applied during the course of the program, but they are included in the hope of benefiting future projects.
USDA-ARS?s Scientific Manuscript database
High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...
Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?
Ershadi, Saba; Shayanfar, Ali
2018-03-22
The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
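The comparison described above hinges on the fact that LOD and LOQ can be estimated in several non-equivalent ways, for example from the residual standard deviation of the calibration curve or from the standard deviation of blank measurements, and neither need coincide with an LLOQ established per FDA criteria. The sketch below contrasts two common estimates on invented calibration data.

```python
import numpy as np

# Invented calibration data for a luminescence-type assay (illustration only).
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])             # ug/mL
signal = np.array([3.1, 55.0, 108.2, 214.9, 431.0, 852.7])

slope, intercept = np.polyfit(conc, signal, 1)
resid_sd = np.sqrt(((signal - (slope * conc + intercept)) ** 2).sum() / (len(conc) - 2))

lod_cal = 3.3 * resid_sd / slope      # from calibration residuals (ICH-style)
loq_cal = 10.0 * resid_sd / slope
sd_blank = 1.2                         # assumed SD of replicate blank measurements
lod_blank = 3.0 * sd_blank / slope     # from blank variability

print(f"LOD(calibration) = {lod_cal:.3f}, LOQ(calibration) = {loq_cal:.3f}, "
      f"LOD(blank) = {lod_blank:.3f} ug/mL")
```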
Microwave digestion for the quantification of inorganic elements in coal and coal ash using ICP-OES.
Low, Fiona; Zhang, Lian
2012-11-15
In this paper, microwave digestion conditions have been optimised to achieve complete recoveries of the ash-forming inorganic elements in coal and coal combustion fly ash for analysis by inductively coupled plasma optical emission spectroscopy (ICP-OES). The elements analysed include six major elements (Al, Ca, Fe, K, Mg and Na) and twelve trace elements (As, Ba, Be, Co, Cr, Cu, Li, Mn, Ni, Pb, Sr and V). Seven reference samples were tested, including two standard coal references, SRM1632c and SARM19, their corresponding high-temperature ashes (HTAs), and three coal fly ash references, SRM1633c, SRM2690 and BCR38. The recoveries of individual elements in these samples were examined intensively as a function of the amount of hydrofluoric acid (HF, 0-2.0 ml), microwave power (900 W vs. 1200 W) and sample mass (0.05 g vs. 0.1 g). It was confirmed that the recoveries of the individual elements varied significantly with the microwave digestion conditions, the element and the sample properties. For the coal references and their HTAs, the use of HF can be ruled out for most of the elements, except K associated with feldspar, Pb and V. In particular, the recovery of Pb in coal is highly sample-specific and thus unpredictable. The majority of elements in the fly ash references require the use of 0.1-0.2 ml HF for complete recovery. Al in fly ash is the only element which gave incomplete recoveries throughout, suggesting the use of a complementary technique for its quantification. As (arsenic) proved to be the only element unaffected by sample type and digestion conditions, achieving complete recoveries in all cases. Regarding the power parameter, using a higher power such as 1200 W is critical and proved decisive for the recovery of certain elements, especially in fly ash. Halving the sample mass from 0.1 g to 0.05 g was found to have no significant effect. Copyright © 2012 Elsevier B.V. All rights reserved.
Extended generalized recurrence plot quantification of complex circular patterns
NASA Astrophysics Data System (ADS)
Riedl, Maik; Marwan, Norbert; Kurths, Jürgen
2017-03-01
The generalized recurrence plot is a modern tool for quantification of complex spatial patterns. Its application spans the analysis of trabecular bone structures, Turing patterns, turbulent spatial plankton patterns, and fractals. Determinism is a central measure in this framework, quantifying the level of regularity of spatial structures. We show by basic examples of fully regular patterns of different symmetries that this measure underestimates the orderliness of circular patterns resulting from rotational symmetries. We overcome this crucial problem by checking additional structural elements of the generalized recurrence plot, which is demonstrated with the examples. Furthermore, we show the potential of the extended quantity of determinism by applying it to more irregular circular patterns which are generated by the complex Ginzburg-Landau equation and which can often be observed in real spatially extended dynamical systems. In this way, we are able to reconstruct the main separations of the system's parameter space by analyzing single snapshots of the real part only, in contrast to the use of the original quantity. This ability of the proposed method also promises an improved description of other systems with complicated spatio-temporal dynamics typically occurring in fluid dynamics, climatology, biology, ecology, social sciences, etc.
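The determinism measure discussed above counts the fraction of recurrence points that lie on diagonal line structures. The sketch below illustrates that idea only for an ordinary one-dimensional recurrence plot, which is a simplification of the generalized (spatial) recurrence plot treated in the paper; the threshold, minimum line length and test signal are arbitrary choices:

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Binary recurrence matrix R[i, j] = 1 if |x_i - x_j| < eps (1-D analogue)."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def determinism(R, lmin=2):
    """Fraction of recurrence points that form diagonal lines of length >= lmin.
    Note: the main diagonal (line of identity) is kept here for simplicity,
    whereas RQA implementations usually exclude it."""
    n = R.shape[0]
    in_lines = 0
    for k in range(-(n - 1), n):          # scan every diagonal
        diag = np.diagonal(R, offset=k)
        run = 0
        for v in list(diag) + [0]:        # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    total = R.sum()
    return in_lines / total if total else 0.0

x = np.sin(np.linspace(0, 8 * np.pi, 200))      # regular signal -> DET close to 1
R = recurrence_matrix(x, eps=0.1)
print(f"DET = {determinism(R):.3f}")
```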
Suhr, Anna Catharina; Vogeser, Michael; Grimm, Stefanie H
2016-05-30
For quotable quantitative analysis of endogenous analytes in complex biological samples by isotope dilution LC-MS/MS, the creation of appropriate calibrators is a challenge, since analyte-free authentic material is in general not available. Thus, surrogate matrices are often used to prepare calibrators and controls. However, currently employed validation protocols do not include specific experiments to verify the suitability of a surrogate matrix calibration for quantification of authentic matrix samples. The aim of the study was the development of a novel validation experiment to test whether surrogate matrix based calibrators enable correct quantification of authentic matrix samples. The key element of the novel validation experiment is the inversion of nonlabelled analytes and their stable isotope labelled (SIL) counterparts in respect to their functions, i.e. SIL compound is the analyte and nonlabelled substance is employed as internal standard. As a consequence, both surrogate and authentic matrix are analyte-free regarding SIL analytes, which allows a comparison of both matrices. We called this approach Isotope Inversion Experiment. As figure of merit we defined the accuracy of inverse quality controls in authentic matrix quantified by means of a surrogate matrix calibration curve. As a proof-of-concept application a LC-MS/MS assay addressing six corticosteroids (cortisol, cortisone, corticosterone, 11-deoxycortisol, 11-deoxycorticosterone, and 17-OH-progesterone) was chosen. The integration of the Isotope Inversion Experiment in the validation protocol for the steroid assay was successfully realized. The accuracy results of the inverse quality controls were all in all very satisfying. As a consequence the suitability of a surrogate matrix calibration for quantification of the targeted steroids in human serum as authentic matrix could be successfully demonstrated. The Isotope Inversion Experiment fills a gap in the validation process for LC-MS/MS assays quantifying endogenous analytes. We consider it a valuable and convenient tool to evaluate the correct quantification of authentic matrix samples based on a calibration curve in surrogate matrix. Copyright © 2016 Elsevier B.V. All rights reserved.
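The figure of merit defined above is the accuracy of inverse quality controls back-calculated from a surrogate-matrix calibration curve. A minimal sketch of that calculation, with purely illustrative response ratios and concentrations rather than data from the assay, could look as follows:

```python
import numpy as np

# Calibrators: SIL analyte spiked into surrogate matrix, nonlabelled compound as IS.
# Response ratio = area(SIL analyte) / area(nonlabelled IS). Values are illustrative.
cal_conc  = np.array([1.0, 5.0, 10.0, 50.0, 100.0])      # ng/mL of SIL analyte
cal_ratio = np.array([0.021, 0.103, 0.205, 1.020, 2.050])

slope, intercept = np.polyfit(cal_conc, cal_ratio, 1)

# "Inverse" QCs: SIL analyte spiked at known levels into authentic matrix.
qc_nominal = np.array([3.0, 30.0, 80.0])
qc_ratio   = np.array([0.064, 0.610, 1.660])

qc_backcalc = (qc_ratio - intercept) / slope
accuracy = 100.0 * qc_backcalc / qc_nominal
for nom, acc in zip(qc_nominal, accuracy):
    print(f"QC {nom:5.1f} ng/mL -> accuracy {acc:5.1f} %")
```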
Taylor, Jonathan Christopher; Fenner, John Wesley
2017-11-29
Semi-quantification methods are well established in the clinic for assisted reporting of (123I) ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. The machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; (3) striatal binding ratios from the putamen and caudate. The semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) minimum of age-matched controls; (2) mean minus 1/1.5/2 standard deviations from age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 for local data and between 0.95 and 0.97 for PPMI data. Classification performance was lower for the local database than for the research database for both the semi-quantitative and the machine learning algorithms. However, for both databases, the machine learning methods generated equal or higher mean accuracies (with lower variance) than any of the semi-quantification approaches. The gain in performance from using machine learning algorithms compared with semi-quantification was relatively small and may be insufficient, when considered in isolation, to offer significant advantages in the clinical context.
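For orientation, a simplified version of the machine-learning arm described above can be sketched with a support vector classifier and repeated stratified cross-validation. The features and labels below are synthetic placeholders (not PPMI or local clinical data), and, unlike the study, the hyperparameters are not tuned in a nested loop:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Placeholder feature matrix: striatal binding ratios for left/right putamen and caudate.
# A real study would use PPMI-derived SBRs or voxel intensities; these are synthetic.
n = 200
labels = rng.integers(0, 2, n)                       # 0 = control, 1 = Parkinsonian
sbrs = rng.normal(2.5, 0.3, (n, 4)) - labels[:, None] * rng.normal(0.8, 0.2, (n, 4))

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
scores = cross_val_score(clf, sbrs, labels, cv=cv, scoring="accuracy")
print(f"mean accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```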
Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J
2016-08-01
Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residual values. The PHA in these cultures ranged from low to medium concentration (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass with PHA concentration ranging from low to high (0-93% of dried biomass content), the PLS method was most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range, for predicting PHA content in biomass. Currently no universal method exists for processing FTIR data for polyhydroxyalkanoate (PHA) quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification of medium range in pure culture. However, in our study we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.
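A hedged sketch of the PLS calibration idea, using synthetic spectra in place of real baseline-corrected FTIR data and GC reference values, might look like this (the band position, number of latent variables and noise level are arbitrary assumptions):

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)

# Synthetic stand-in for FTIR spectra: 60 samples x 400 wavenumber points.
# A real calibration would use baseline-corrected absorbance spectra and
# GC-measured PHA contents (wt% of dried biomass) as the reference values.
n_samples, n_points = 60, 400
pha_wt = rng.uniform(0, 90, n_samples)                      # reference PHA content
band = np.exp(-0.5 * ((np.arange(n_points) - 150) / 8.0) ** 2)
spectra = pha_wt[:, None] * band + rng.normal(0, 0.5, (n_samples, n_points))

pls = PLSRegression(n_components=5)
pred = cross_val_predict(pls, spectra, pha_wt, cv=10).ravel()
rmsecv = np.sqrt(np.mean((pred - pha_wt) ** 2))
print(f"RMSECV = {rmsecv:.2f} wt%")
```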
NASA Astrophysics Data System (ADS)
Delbeck, Sven; Küpper, Lukas; Heise, Herbert M.
2018-02-01
Spectroscopic analysis of different biofluids and body-fluid-like media has been realized by using tapered flat silver halide fiber elements as infrared biosensors. Optical stability and biocompatibility testing of the sensor elements have been performed with in-vitro samples under representative physiological conditions. After improving the reproducibility of manufacturing the sensor elements, the incoupling of radiation and the general handling, including characterization of their chemical composition, the fiber sensors were further optimized for the experiments. Stability tests in physiological solutions as well as porcine blood have shown that the best results for biospectroscopic applications are obtained in the mid-IR fingerprint region, with the most stable behaviour as analyzed by the single-beam spectra. Despite several contrary reports, the silver halide material tested is toxic to cell lines chosen from the DIN standard specification for biocompatibility testing. The spectral changes as well as the results based on the DIN standard showed that pretreatment of the fibers is unavoidable to prevent direct contact between cells or human tissue and the silver halide material. Further applications of tapered flat silver halide fibers for the quantification of analytes in body fluids have also been tested by ensheathing the fiber-optic sensor element with a dialysis membrane. With the successfully produced prototype, results on diffusion rates and the performance of a membrane-ensheathed fiber probe have been obtained. An in-vitro monitoring fiber sensor was developed aiming at the implantation of a microdialysis system for the analytical quantification of biomolecules such as glucose, lactate and others.
Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke
2012-01-01
This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton decoupled 13C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal transmitted through an optical fiber and inductively coupled into a transmit/receive coil represents a reliable reference standard for in vivo 1H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton decoupled in vivo 13C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification are 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.
Alves, L P S; Almeida, A T; Cruz, L M; Pedrosa, F O; de Souza, E M; Chubatsu, L S; Müller-Santos, M; Valdameri, G
2017-01-16
The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2=0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots.
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
NASA Astrophysics Data System (ADS)
Sitko, Rafał
2008-11-01
Knowledge of X-ray tube spectral distribution is necessary in theoretical methods of matrix correction, i.e. in both fundamental parameter (FP) methods and theoretical influence coefficient algorithms. Thus, the influence of X-ray tube distribution on the accuracy of the analysis of thin films and bulk samples is presented. The calculations are performed using experimental X-ray tube spectra taken from the literature and theoretical X-ray tube spectra evaluated by three different algorithms proposed by Pella et al. (X-Ray Spectrom. 14 (1985) 125-135), Ebel (X-Ray Spectrom. 28 (1999) 255-266), and Finkelshtein and Pavlova (X-Ray Spectrom. 28 (1999) 27-32). In this study, Fe-Cr-Ni system is selected as an example and the calculations are performed for X-ray tubes commonly applied in X-ray fluorescence analysis (XRF), i.e., Cr, Mo, Rh and W. The influence of X-ray tube spectra on FP analysis is evaluated when quantification is performed using various types of calibration samples. FP analysis of bulk samples is performed using pure-element bulk standards and multielement bulk standards similar to the analyzed material, whereas for FP analysis of thin films, the bulk and thin pure-element standards are used. For the evaluation of the influence of X-ray tube spectra on XRF analysis performed by theoretical influence coefficient methods, two algorithms for bulk samples are selected, i.e. Claisse-Quintin (Can. Spectrosc. 12 (1967) 129-134) and COLA algorithms (G.R. Lachance, Paper Presented at the International Conference on Industrial Inorganic Elemental Analysis, Metz, France, June 3, 1981) and two algorithms (constant and linear coefficients) for thin films recently proposed by Sitko (X-Ray Spectrom. 37 (2008) 265-272).
Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist
NASA Astrophysics Data System (ADS)
Tummala, Sudhakar; Dam, Erik B.
2010-03-01
Fully automatic imaging biomarkers may allow quantification of patho-physiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method used on tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.
Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.
Hawkins, Steve F C; Guest, Paul C
2018-01-01
The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study such as medical disorders resulting from nutritional programming disturbances.
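The qPCR quantification itself follows the usual standard-curve logic: Ct values of a dilution series are regressed against log concentration and the unknown library is back-calculated. A minimal sketch with illustrative numbers (not values from the protocol or any specific kit) is shown below:

```python
import numpy as np

# Standard dilution series (known library concentrations in pM) and measured Ct values.
# Numbers are illustrative placeholders, not from a specific kit protocol.
std_conc = np.array([20.0, 2.0, 0.2, 0.02])
std_ct   = np.array([12.1, 15.5, 18.9, 22.3])

slope, intercept = np.polyfit(np.log10(std_conc), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 means 100 % amplification efficiency

# Unknown NGS library, measured in a diluted aliquot.
sample_ct = 16.8
dilution_factor = 1000
conc = 10 ** ((sample_ct - intercept) / slope) * dilution_factor
print(f"slope = {slope:.2f}, efficiency = {100*efficiency:.0f} %, library ~ {conc:.1f} pM")
```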
NASA Astrophysics Data System (ADS)
Akram, Muhammad Farooq Bin
The management of technology portfolios is an important element of aerospace system design. New technologies are often applied to new product designs to ensure their competitiveness at the time they are introduced to market. The future performance of yet-to-be-designed components is inherently uncertain, necessitating subject matter expert knowledge, statistical methods and financial forecasting. Estimates of the appropriate parameter settings often come from disciplinary experts, who may disagree with each other because of varying experience and background. Due to the inherently uncertain nature of expert elicitation in the technology valuation process, appropriate uncertainty quantification and propagation are critical. The uncertainty in defining the impact of an input on the performance parameters of a system makes it difficult to use traditional probability theory. Often the available information is not enough to assign appropriate probability distributions to uncertain inputs. Another problem faced during technology elicitation pertains to technology interactions in a portfolio. When multiple technologies are applied simultaneously to a system, their cumulative impact is often non-linear. Current methods assume that technologies are either incompatible or linearly independent. It is observed that, when knowledge about the problem is lacking, epistemic uncertainty is the most suitable representation of the process. It reduces the number of assumptions during the elicitation process, when experts would otherwise be forced to assign probability distributions to their opinions without sufficient knowledge. Epistemic uncertainty can be quantified by many techniques. In the present research it is proposed that interval analysis and the Dempster-Shafer theory of evidence are better suited for quantification of epistemic uncertainty in the technology valuation process. The proposed technique seeks to offset some of the problems faced when using deterministic or traditional probabilistic approaches for uncertainty propagation. Non-linear behavior in technology interactions is captured through expert-elicitation-based technology synergy matrices (TSMs). The proposed TSMs increase the fidelity of current technology forecasting methods by including higher-order technology interactions. A combined-cycle power generation system was selected as a large-scale test case for quantification of epistemic uncertainty. A detailed multidisciplinary modeling and simulation environment was adopted for this problem. Results have shown that the evidence-theory-based technique provides more insight into the uncertainties arising from incomplete information or lack of knowledge than deterministic or probability theory methods. Margin analysis was also carried out for both techniques. A detailed description of TSMs and their usage in conjunction with technology impact matrices and technology compatibility matrices is discussed. Various combination methods are also proposed for higher-order interactions, which can be applied according to expert opinion or historical data. The introduction of the technology synergy matrix enabled capture of higher-order technology interactions and an improvement in the predicted system performance.
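As a small illustration of the interval-analysis idea mentioned above, expert-elicited bounds can be propagated through a toy combination model. The additive structure, the synergy factor and all numbers below are hypothetical and are not taken from the thesis's technology synergy matrices:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi, self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

# Expert-elicited bounds on the fractional performance impact of two technologies
# (hypothetical numbers). A synergy factor widens/narrows the combined effect.
t1 = Interval(0.02, 0.05)          # technology 1: 2-5 % improvement
t2 = Interval(0.01, 0.04)          # technology 2: 1-4 % improvement
synergy = Interval(0.9, 1.1)       # elicited interaction factor from a synergy matrix

combined = (t1 + t2) * synergy     # simple additive model scaled by the synergy interval
print(f"combined impact lies in [{combined.lo:.3f}, {combined.hi:.3f}]")
```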
DOT National Transportation Integrated Search
2016-02-01
In this study, a computational approach for conducting durability analysis of bridges using detailed finite element models is developed. The underlying approach adopted is based on the hypothesis that the two main factors affecting the life of a brid...
DOT National Transportation Integrated Search
2009-04-01
The main goal of this study identified by NJDOT can be defined as the quantification of the effects of management treatments on roadway operations and safety on urban collectors with access. Since urban collector road runs through highly d...
A conceptually and computationally simple method for the definition, display, quantification, and comparison of the shapes of three-dimensional mathematical molecular models is presented. Molecular or solvent-accessible volume and surface area can also be calculated. Algorithms, ...
Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi
2008-07-23
A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of the Roundup Ready soybean (RRS) was developed. This system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor plasmid for the detection and quantification of genetically modified soy, RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), both carrying a 21 bp oligonucleotide insertion in their sequences. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism content.
Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen
2011-11-09
A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol determination, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.
Reiter, Rolf; Wetzel, Martin; Hamesch, Karim; Strnad, Pavel; Asbach, Patrick; Haas, Matthias; Siegmund, Britta; Trautwein, Christian; Hamm, Bernd; Klatt, Dieter; Braun, Jürgen; Sack, Ingolf; Tzschätzsch, Heiko
2018-01-01
Although it has been known for decades that patients with alpha1-antitrypsin deficiency (AATD) have an increased risk of cirrhosis and hepatocellular carcinoma, limited data exist on non-invasive imaging-based methods for assessing liver fibrosis such as magnetic resonance elastography (MRE) and acoustic radiation force impulse (ARFI) quantification, and no data exist on 2D-shear wave elastography (2D-SWE). Therefore, the purpose of this study is to evaluate and compare the applicability of different elastography methods for the assessment of AATD-related liver fibrosis. Fifteen clinically asymptomatic AATD patients (11 homozygous PiZZ, 4 heterozygous PiMZ) and 16 matched healthy volunteers were examined using MRE and ARFI quantification. Additionally, patients were examined with 2D-SWE. A high correlation is evident for the shear wave speed (SWS) determined with different elastography methods in AATD patients: 2D-SWE/MRE, ARFI quantification/2D-SWE, and ARFI quantification/MRE (R = 0.8587, 0.7425, and 0.6914, respectively; P≤0.0089). Four AATD patients with pathologically increased SWS were consistently identified with all three methods-MRE, ARFI quantification, and 2D-SWE. The high correlation and consistent identification of patients with pathologically increased SWS using MRE, ARFI quantification, and 2D-SWE suggest that elastography has the potential to become a suitable imaging tool for the assessment of AATD-related liver fibrosis. These promising results provide motivation for further investigation of non-invasive assessment of AATD-related liver fibrosis using elastography.
Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki
2016-10-01
Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
Lamb Wave Damage Quantification Using GA-Based LS-SVM.
Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong
2017-06-12
Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
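A rough sketch of the GA-plus-kernel-model idea is given below. It substitutes scikit-learn's KernelRidge (closely related to, but not identical with, an LS-SVM) and a very small genetic algorithm over two hyperparameters, and it runs on synthetic damage-sensitive features rather than the coupon or lap-joint data:

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)

# Synthetic damage-sensitive features (normalized amplitude, phase change,
# correlation coefficient) and crack sizes in mm; placeholders, not test data.
X = rng.uniform(0, 1, (80, 3))
y = 5.0 * X[:, 0] + 3.0 * X[:, 1] - 2.0 * X[:, 2] + rng.normal(0, 0.2, 80)

def fitness(params):
    alpha, gamma = params
    model = KernelRidge(kernel="rbf", alpha=alpha, gamma=gamma)
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

# Minimal genetic algorithm over log10(alpha) and log10(gamma).
pop = rng.uniform([-4, -3], [1, 2], (20, 2))
for _ in range(15):
    scores = np.array([fitness(10.0 ** ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]                   # keep the fittest half
    children = parents[rng.integers(0, 10, (10, 2)), [0, 1]]  # uniform crossover
    children += rng.normal(0, 0.2, children.shape)            # mutation
    pop = np.vstack([parents, children])

best = 10.0 ** pop[np.argmax([fitness(10.0 ** ind) for ind in pop])]
print(f"best alpha = {best[0]:.4g}, best gamma = {best[1]:.4g}")
```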
Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.
Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P
2012-08-01
The solid state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with 1:1 w/w mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R(2) = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R(2) = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
SBIR Phase I final report, Sensor for direct, rapid and complete elemental analysis of coal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Chunyi
This Final Report is the result of the DOE SBIR Phase I assistance agreement No. DE-FOA-0001619 awarded to Applied Spectra, Inc. During the nine-month Phase I effort, we successfully demonstrated the ability to quantify rare-earth elements (REE) in coal using LIBS (Laser Induced Breakdown Spectroscopy) along with other elements of interest such as silicon (Si), aluminum (Al), magnesium (Mg), calcium (Ca), potassium (K), titanium (Ti) and iron (Fe). In addition to elemental quantification, eighteen different coal types could be classified with 100% certainty using their LIBS spectrum. High-resolution LA-ICP-MS surface mapping showed a correlation between REE and other prevalent elements such as aluminum, silicon, and titanium.
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.
[Detection of recombinant-DNA in foods from stacked genetically modified plants].
Sorokina, E Iu; Chernyshova, O N
2012-01-01
A quantitative real-time multiplex polymerase chain reaction method was applied to the detection and quantification of MON863 and MON810 in the stacked genetically modified maize MON810 x MON863. The limit of detection was approximately 0.1%. The accuracy of the quantification, measured as bias from the accepted value, and the relative repeatability standard deviation, which measures the intra-laboratory variability, were within 25% at each GM level. A method verification demonstrated that the MON863 and MON810 methods can be equally applied to quantification of the respective events in stacked MON810 x MON863.
Source separation on hyperspectral cube applied to dermatology
NASA Astrophysics Data System (ADS)
Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.
2010-03-01
This paper proposes a method of quantification of the components underlying the human skin that are supposed to be responsible for the effective reflectance spectrum of the skin over the visible wavelength range. The method is based on independent component analysis, assuming that the epidermal melanin and the dermal haemoglobin absorbance spectra are independent of each other. The method extracts the source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is corrected using a polynomial fit and the quantifications associated with it are re-estimated. The results produce feasible quantifications of each source component in the examined skin patch.
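The source-separation step can be illustrated with an off-the-shelf ICA implementation. The sketch below uses synthetic melanin- and haemoglobin-like absorbance curves and random mixing coefficients, and it omits the Beer-Lambert transformation and the polynomial correction described in the paper:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)

# Two hypothetical chromophore absorbance spectra over the visible range
# (smooth placeholders for melanin- and haemoglobin-like components).
wl = np.linspace(450, 700, 251)
melanin = np.exp(-(wl - 450) / 120.0)                       # monotonically decreasing
haemoglobin = np.exp(-0.5 * ((wl - 560) / 15.0) ** 2)       # band around 560 nm

# Simulated skin-patch absorbance spectra = random mixtures of the two sources.
mixing = rng.uniform(0.2, 1.0, (100, 2))                    # per-patch quantities
observations = mixing @ np.vstack([melanin, haemoglobin])
observations += rng.normal(0, 0.01, observations.shape)

ica = FastICA(n_components=2, random_state=0)
sources = ica.fit_transform(observations.T)                 # estimated source spectra
quantities = ica.mixing_                                    # per-observation loadings
print(sources.shape, quantities.shape)                      # (251, 2) and (100, 2)
```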
CCQM Pilot Study CCQM-P140: Quantitative surface analysis of multi-element alloy films
NASA Astrophysics Data System (ADS)
Kim, Kyung Joong; Jang, Jong Shik; Kim, An Soon; Suh, Jung Ki; Chung, Yong-Duck; Hodoroaba, Vasile-Dan; Wirth, Thomas; Unger, Wolfgang; Kang, Hee Jae; Popov, Oleg; Popov, Inna; Kuselman, Ilya; Lee, Yeon Hee; Sykes, David E.; Wang, Meiling; Wang, Hai; Ogiwara, Toshiya; Nishio, Mitsuaki; Tanuma, Shigeo; Simons, David; Szakal, Christopher; Osborn, William; Terauchi, Shinya; Ito, Mika; Kurokawa, Akira; Fujimoto, Toshiyuki; Jordaan, Werner; Jeong, Chil Seong; Havelund, Rasmus; Spencer, Steve; Shard, Alex; Streeck, Cornelia; Beckhoff, Burkhard; Eicke, Axel; Terborg, Ralf
2015-01-01
A pilot study for a quantitative surface analysis of multi-element alloy films has been performed by the Surface Analysis Working Group (SAWG) of the Consultative Committee for Amount of Substance (CCQM). The aim of this pilot study is to evaluate a protocol for a key comparison to demonstrate the equivalence of measures by National Metrology Institutes (NMIs) and Designated Institutes (DIs) for the mole fractions of multi-element alloy films. A Cu(In,Ga)Se2 (CIGS) film with non-uniform depth distribution was chosen as a representative multi-element alloy film. The mole fractions of the reference and the test CIGS films were certified by isotope dilution inductively coupled plasma mass spectrometry. A total number counting (TNC) method was used to determine the signal intensities of the constituent elements acquired in SIMS, XPS and AES depth profiling. The TNC method is comparable with the certification process because the certified mole fractions are the average values of the films. The mole fractions of the CIGS films were measured by Secondary Ion Mass Spectrometry (SIMS), Auger Electron Spectroscopy (AES), X-ray Photoelectron Spectroscopy (XPS), X-Ray Fluorescence (XRF) Analysis and Electron Probe Micro Analysis (EPMA) with Energy Dispersive X-ray Spectrometry (EDX). Fifteen laboratories from eight NMIs, one DI, and six non-NMIs participated in this pilot study. The average mole fractions of the reported data showed relative standard deviations from 5.5% to 6.8% and average relative expanded uncertainties in the range from 4.52% to 4.86% for the four test CIGS specimens. These values are smaller than those in the key comparison CCQM-K67 for the measurement of mole fractions of Fe-Ni alloy films. As one result, it can be stated that SIMS, XPS and AES protocols relying on quantification of CIGS films using the TNC method are mature enough to be used in a CCQM key comparison.
NASA Astrophysics Data System (ADS)
Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun
2015-01-01
Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N,N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (triazine ester), are cost-effective because of their synthetic simplicity, and have increased throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods are validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
A new scenario-based approach to damage detection using operational modal parameter estimates
NASA Astrophysics Data System (ADS)
Hansen, J. B.; Brincker, R.; López-Aenlle, M.; Overgaard, C. F.; Kloborg, K.
2017-09-01
In this paper a vibration-based damage localization and quantification method, based on natural frequencies and mode shapes, is presented. The proposed technique is inspired by a damage assessment methodology based solely on the sensitivity of mass-normalized, experimentally determined mode shapes. The present method differs by being based on modal data extracted by means of Operational Modal Analysis (OMA), combined with a reasonable Finite Element (FE) representation of the test structure, and implemented in a scenario-based framework. Besides a review of the basic methodology, this paper addresses fundamental theoretical as well as practical considerations which are crucial to the applicability of a given vibration-based damage assessment configuration. Lastly, the technique is demonstrated on an experimental test case using automated OMA. Both the numerical study and the experimental test case presented in this paper are restricted to perturbations involving mass change.
Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel
2018-05-01
Quantification of selenated amino-acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino-acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of selenium/sulfur substitution rate in Met. Moreover the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, low limit of quantification and wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.
Establishment of Kansei Database and Application to Design for Consensus Building
NASA Astrophysics Data System (ADS)
Yasuda, Keiichi; Shiraki, Wataru
Reflecting the recent social background in which the importance of bridge landscape design is recognized and a new business style of citizen-involved infrastructure development has started, there is a growing need for design that reflects the aesthetic feeling of actual users. In this research, a focus has been placed on the Kansei engineering technique, in which users' needs are reflected in product development. A questionnaire survey was conducted with bridge engineers, who are most intensively involved in design work, and with students as actual users. The results were analyzed by factor analysis and Hayashi's quantification method (category I). A tool required at consensus-building occasions has been created that changes design elements and displays the accompanying difference in evaluation using the Kansei database.
Noninvasive imaging of bone microarchitecture
Patsch, Janina M.; Burghardt, Andrew J.; Kazakia, Galateia; Majumdar, Sharmila
2015-01-01
The noninvasive quantification of peripheral compartment-specific bone microarchitecture is feasible with high-resolution peripheral quantitative computed tomography (HR-pQCT) and high-resolution magnetic resonance imaging (HR-MRI). In addition to classic morphometric indices, both techniques provide a suitable basis for virtual biomechanical testing using finite element (FE) analyses. Methodical limitations, morphometric parameter definition, and motion artifacts have to be considered to achieve optimal data interpretation from imaging studies. With increasing availability of in vivo high-resolution bone imaging techniques, special emphasis should be put on quality control including multicenter, cross-site validations. Importantly, conclusions from interventional studies investigating the effects of antiosteoporotic drugs on bone microarchitecture should be drawn with care, ideally involving imaging scientists, translational researchers, and clinicians. PMID:22172043
Uncertainty quantification and propagation in nuclear density functional theory
Schunck, N.; McDonnell, J. D.; Higdon, D.; ...
2015-12-23
Nuclear density functional theory (DFT) is one of the main theoretical tools used to study the properties of heavy and superheavy elements, or to describe the structure of nuclei far from stability. While on-going efforts seek to better root nuclear DFT in the theory of nuclear forces, energy functionals remain semi-phenomenological constructions that depend on a set of parameters adjusted to experimental data in finite nuclei. In this study, we review recent efforts to quantify the related uncertainties, and propagate them to model predictions. In particular, we cover the topics of parameter estimation for inverse problems, statistical analysis of model uncertainties and Bayesian inference methods. Illustrative examples are taken from the literature.
Pyschik, Marcelina; Klein-Hitpaß, Marcel; Girod, Sabrina; Winter, Martin; Nowak, Sascha
2017-02-01
In this study, an optimized method using capillary electrophoresis (CE) with a direct contactless conductivity detector (C4D) is presented for a new application field: the quantification of fluoride in commonly used lithium ion battery (LIB) electrolytes based on LiPF6 in organic carbonate solvents, and in ionic liquids (ILs) after contact with Li metal. The method development for finding the right buffer and suitable CE conditions for the quantification of fluoride was investigated. The results for the concentration of fluoride in different LIB electrolyte samples were compared to the results from an ion-selective electrode (ISE). The relative standard deviations (RSDs) and recovery rates for fluoride were obtained with very high accuracy for both methods. The results for the fluoride concentration in the LIB electrolytes were in very good agreement for both methods. In addition, the limit of detection (LOD) and limit of quantification (LOQ) values were determined for the CE method. The CE method was also applied to the quantification of fluoride in ILs. In the fresh IL sample, the concentration of fluoride was below the LOD. Another sample of the IL mixed with Li metal was investigated as well. It was possible to quantify the fluoride concentration in this sample. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
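The core of the Poisson-based analysis is the relation lambda = -ln(p_negative) between the fraction of negative replicate reactions and the mean number of integrated targets per reaction. A minimal sketch, with an illustrative reaction volume and a simplified Wald-type confidence interval rather than the exact interval used in the paper, is given below:

```python
import math

def poisson_quantify(n_total, n_negative, volume_per_rxn_ul, z=1.96):
    """Estimate targets per reaction (and per uL) from the fraction of negative
    replicate PCRs, assuming Poisson-distributed template molecules."""
    p_neg = n_negative / n_total
    lam = -math.log(p_neg)                       # mean targets per reaction
    # Wald-type interval on p_neg propagated through -ln(p); a simplified CI choice.
    se = math.sqrt(p_neg * (1 - p_neg) / n_total)
    lam_lo = -math.log(min(p_neg + z * se, 1.0 - 1e-9))
    lam_hi = -math.log(max(p_neg - z * se, 1e-9))
    return lam / volume_per_rxn_ul, lam_lo / volume_per_rxn_ul, lam_hi / volume_per_rxn_ul

# Example: 42 replicate Alu-HIV PCRs, 15 negative, 5 uL of diluted DNA per reaction.
est, lo, hi = poisson_quantify(42, 15, 5.0)
print(f"{est:.3f} integrated copies/uL (95% CI {lo:.3f}-{hi:.3f})")
```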
Uncertainty Quantification in Alchemical Free Energy Methods.
Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V
2018-06-12
Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation-an ensemble of independent MD simulations-which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
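The ensemble approach advocated above amounts to treating each independent MD replica as one draw and summarizing the spread across replicas. A minimal sketch with synthetic per-replica free energies (not results from any of the cited methods) is:

```python
import numpy as np

rng = np.random.default_rng(4)

# Free energy estimates (kcal/mol) from an ensemble of independent MD replicas;
# synthetic values standing in for, e.g., per-replica TI or FEP results.
replica_dg = rng.normal(-7.8, 0.6, 25)

mean_dg = replica_dg.mean()
sem = replica_dg.std(ddof=1) / np.sqrt(replica_dg.size)

# Non-parametric bootstrap over replicas for a 95 % confidence interval.
boot = np.array([rng.choice(replica_dg, replica_dg.size, replace=True).mean()
                 for _ in range(10_000)])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
print(f"dG = {mean_dg:.2f} +/- {sem:.2f} kcal/mol (95% CI {ci_lo:.2f} to {ci_hi:.2f})")
```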
Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J
2011-04-05
A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for the development were derived from the HPLC conditions of a validated method. These starting conditions were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by adapting it to a gradient method. The obtained method was fully validated based on its measurement uncertainty (accuracy profile) and robustness tests. A UHPLC method was obtained for the identification and quantification of folic acid in pharmaceutical preparations, which will cut analysis times and solvent consumption. Copyright © 2010 Elsevier B.V. All rights reserved.
Rakesh Minocha; P. Thangavel; Om Parkash Dhankher; Stephanie Long
2008-01-01
The HPLC method presented here for the quantification of metal-binding thiols is considerably shorter than most previously published methods. It is a sensitive and highly reproducible method that separates monobromobimane tagged monothiols (cysteine, glutathione, γ-glutamylcysteine) along with polythiols (PC2, PC3...
Roger, B; Fernandez, X; Jeannot, V; Chahboun, J
2010-01-01
The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones, which are gradually formed by oxidative degradation of iridals during rhizome ageing. The objective was to develop an alternative method allowing irone quantification in iris rhizomes using HS-SPME-GC. The development of the HS-SPME-GC method was achieved using results obtained from a conventional method, i.e. solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed HS-SPME-GC method is as accurate and reproducible as the conventional SLE method. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can thus be achieved using HS-SPME-GC, and the method can be used for quality control of iris rhizomes. It offers the advantage of combining extraction and analysis in an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.
Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L
2017-02-01
The aim of this manuscript was to study the application of a new method of protein quantification in Candida antarctica lipase B commercial solutions. Error sources associated with the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. The magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS). CALB was then adsorbed on the modified support. The proposed novel protein quantification method included the determination of sulfur (from protein in the CALB solution) by means of atomic emission with inductively coupled plasma (AE-ICP). Four different protocols were applied, combining AE-ICP and classical Bradford assays, besides carbon, hydrogen and nitrogen (CHN) analysis. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as standard ranged from 400 to 1200% when protein in CALB solution was quantified. These errors were calculated considering as "true protein content values" the results of the amount of immobilized protein obtained with the improved method. The optimum quantification procedure involved the combination of the Bradford method, ICP and CHN analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
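For the sulfur-based route, the conversion from measured sulfur to protein concentration is simple stoichiometry: c_protein = c_S * M_protein / (n_S * M_S), where n_S is the number of sulfur atoms (Cys plus Met) per protein molecule. The sketch below uses illustrative values for the molar mass and sulfur count, not the actual CALB figures:

```python
# Protein concentration from total sulfur, assuming all sulfur comes from the
# protein's cysteine and methionine residues (impurity sulfur already subtracted).
M_S = 32.06                      # g/mol, molar mass of sulfur

def protein_from_sulfur(c_sulfur_mg_l, n_s_per_molecule, m_protein_g_mol):
    """c_protein = c_S * M_protein / (n_S * M_S); concentrations in mg/L."""
    return c_sulfur_mg_l * m_protein_g_mol / (n_s_per_molecule * M_S)

# Illustrative values only: a 33 kDa protein with 9 sulfur-containing residues
# and 4.4 mg/L of protein-bound sulfur measured by ICP-based analysis.
print(f"{protein_from_sulfur(4.4, 9, 33_000):.0f} mg/L protein")
```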
Newbury, Dale E; Ritchie, Nicholas W M
2015-10-01
A scanning electron microscope with a silicon drift detector energy-dispersive X-ray spectrometer (SEM/SDD-EDS) was used to analyze materials containing the low atomic number elements B, C, N, O, and F achieving a high degree of accuracy. Nearly all results fell well within an uncertainty envelope of ±5% relative (where relative uncertainty (%)=[(measured-ideal)/ideal]×100%). Quantification was performed with the standards-based "k-ratio" method with matrix corrections calculated based on the Pouchou and Pichoir expression for the ionization depth distribution function, as implemented in the NIST DTSA-II EDS software platform. The analytical strategy that was followed involved collection of high count (>2.5 million counts from 100 eV to the incident beam energy) spectra measured with a conservative input count rate that restricted the deadtime to ~10% to minimize coincidence effects. Standards employed included pure elements and simple compounds. A 10 keV beam was employed to excite the K- and L-shell X-rays of intermediate and high atomic number elements with excitation energies above 3 keV, e.g., the Fe K-family, while a 5 keV beam was used for analyses of elements with excitation energies below 3 keV, e.g., the Mo L-family.
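In its simplest form, the k-ratio workflow described above measures the intensity ratio between the unknown and a standard and then iterates a matrix correction. The sketch below is only a minimal illustration of that loop, not the DTSA-II implementation; the matrix_correction function, its arguments and the intensities are placeholders standing in for a real Pouchou-and-Pichoir (XPP) phi(rho z) model.

```python
def matrix_correction(conc_estimate, element):
    """Placeholder for a real phi(rho z) matrix correction (e.g. XPP).
    Returns the factor relating the k-ratio to the mass fraction."""
    return 1.0  # toy example: assume no matrix correction


def k_ratio_quantify(intensities_unknown, intensities_standard, std_mass_fractions,
                     n_iter=10):
    """Iteratively convert measured k-ratios into normalized mass fractions."""
    elements = list(intensities_unknown)
    k = {el: intensities_unknown[el] / intensities_standard[el] for el in elements}
    conc = {el: k[el] * std_mass_fractions[el] for el in elements}  # first estimate
    for _ in range(n_iter):
        conc = {el: k[el] * std_mass_fractions[el] * matrix_correction(conc, el)
                for el in elements}
    total = sum(conc.values())
    return {el: c / total for el, c in conc.items()}


# invented count data for a two-element example with pure-element standards
print(k_ratio_quantify({"Fe": 1200.0, "O": 800.0},
                       {"Fe": 1500.0, "O": 1000.0},
                       {"Fe": 1.0, "O": 1.0}))
```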
USDA-ARS?s Scientific Manuscript database
Arbuscular mycorrhizal fungi (AMF) are well-known plant symbionts which provide enhanced phosphorus uptake as well as other benefits to their host plants. Quantification of mycorrhizal biomass and root colonization has traditionally been performed by root staining and microscopic examination methods...
The quantification of solute concentrations in laboratory aquifer models has been largely limited to the use of sampling ports, from which samples are collected for external analysis. One of the drawbacks to this method is that the act of sampling may disturb plume dynamics and ...
21 CFR 530.24 - Procedure for announcing analytical methods for drug residue quantification.
Code of Federal Regulations, 2011 CFR
2011-04-01
..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) ANIMAL DRUGS, FEEDS, AND RELATED PRODUCTS EXTRALABEL DRUG USE IN ANIMALS Specific Provisions Relating to Extralabel Use of Animal and Human Drugs in Food-Producing Animals § 530.24 Procedure for announcing analytical methods for drug residue quantification. (a...
NASA Astrophysics Data System (ADS)
Nischkauer, Winfried; Vanhaecke, Frank; Bernacchi, Sébastien; Herwig, Christoph; Limbeck, Andreas
2014-11-01
Nebulising liquid samples and using the aerosol thus obtained for further analysis is the standard method in many current analytical techniques, also with inductively coupled plasma (ICP)-based devices. With such a set-up, quantification via external calibration is usually straightforward for samples with aqueous or close-to-aqueous matrix composition. However, there is a variety of more complex samples. Such samples can be found in medical, biological, technological and industrial contexts and can range from body fluids, like blood or urine, to fuel additives or fermentation broths. Specialized nebulizer systems or careful digestion and dilution are required to tackle such demanding sample matrices. One alternative approach is to convert the liquid into a dried solid and to use laser ablation for sample introduction. Up to now, this approach required the application of internal standards or matrix-adjusted calibration due to matrix effects. In this contribution, we show a way to circumvent these matrix effects while using simple external calibration for quantification. The principle of representative sampling that we propose uses radial line-scans across the dried residue. This compensates for the centro-symmetric inhomogeneities typically observed in dried spots. The effectiveness of the proposed sampling strategy is exemplified via the determination of phosphorus in biochemical fermentation media; nevertheless, the presented measurement protocol is postulated to be universally applicable. Detection limits using laser ablation-ICP-optical emission spectrometry were on the order of 40 μg mL-1 with a reproducibility of 10% relative standard deviation (n = 4, concentration = 10 times the quantification limit). The reported sensitivity is fit-for-purpose in the biochemical context described here, but could be improved using ICP-mass spectrometry, should future analytical tasks require it. Trueness of the proposed method was investigated by cross-validation with conventional liquid measurements, and by analyzing IAEA-153 reference material (Trace Elements in Milk Powder); good agreement with the certified value for phosphorus was obtained.
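External calibration with a linear fit and a 3σ-type detection limit, as invoked above, can be sketched as follows; the standard concentrations, signals and blank readings are invented for illustration, and the 40 μg mL-1 figure quoted in the abstract is not reproduced here.

```python
import numpy as np

# hypothetical external calibration standards for P (concentration in ug/mL)
conc = np.array([0.0, 50.0, 100.0, 200.0, 400.0])
signal = np.array([0.8, 52.1, 104.3, 209.5, 398.7])   # emission intensity (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)          # linear external calibration

blank = np.array([0.7, 0.9, 0.8, 1.0, 0.6, 0.8])        # repeated blank readings
lod = 3 * blank.std(ddof=1) / slope                      # 3-sigma detection limit
loq = 10 * blank.std(ddof=1) / slope                     # 10-sigma quantification limit

sample_signal = 150.0
sample_conc = (sample_signal - intercept) / slope
print(f"LOD ~ {lod:.2f} ug/mL, LOQ ~ {loq:.2f} ug/mL, sample ~ {sample_conc:.1f} ug/mL")
```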
Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.
2013-01-01
A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
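A much-simplified reading of that model, with peptide intensities treated as Poisson-like counts plus a constant noise term and peptide-to-protein assignment probabilities used as weights, might look like the sketch below; this is not the published algorithm, and all intensities, probabilities and the noise level are invented.

```python
import numpy as np

# invented peptide intensities for one protein in conditions A and B, plus the
# prior probability that each peptide truly belongs to this protein
intensity_a = np.array([12000.0, 8500.0, 300.0])
intensity_b = np.array([24500.0, 16800.0, 310.0])
assign_prob = np.array([0.99, 0.95, 0.40])
noise = 200.0  # additive noise level, e.g. estimated during normalisation

# variance ~ Poisson counts plus noise; weight = assignment prob / variance
var = intensity_a + intensity_b + 2 * noise**2
weights = assign_prob / var

log_ratio = np.log2(intensity_b / intensity_a)
protein_log2_fc = np.sum(weights * log_ratio) / np.sum(weights)
print(f"weighted protein log2 fold change ~ {protein_log2_fc:.2f}")
```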
Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios
2011-01-01
Background Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808
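In the same spirit as a qPCR standard curve, the SNP frequency in a pool could be back-calculated from real-time LCR quantification cycles measured on plasmid standards of known mutant fraction. The sketch below assumes a log-linear relation between Cq and mutant fraction; the values are purely illustrative and the paper's actual calibration model is not reproduced here.

```python
import numpy as np

# hypothetical plasmid standards: mutant allele fraction vs. real-time LCR Cq
mutant_fraction = np.array([0.001, 0.01, 0.05, 0.20, 0.50])
cq = np.array([33.8, 30.4, 28.1, 26.0, 24.7])

# log-linear standard curve: Cq = slope * log10(fraction) + intercept
slope, intercept = np.polyfit(np.log10(mutant_fraction), cq, 1)

def fraction_from_cq(measured_cq):
    """Back-calculate the mutant fraction of an unknown pool from its Cq."""
    return 10 ** ((measured_cq - intercept) / slope)

print(f"sample with Cq = 29.0 -> mutant fraction ~ {fraction_from_cq(29.0):.3f}")
```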
Lao, Yexing; Yang, Cuiping; Zou, Wei; Gan, Manquan; Chen, Ping; Su, Weiwei
2012-05-01
The cryptand Kryptofix 2.2.2 is used extensively as a phase-transfer reagent in the preparation of [18F]fluoride-labelled radiopharmaceuticals. However, it has considerable acute toxicity. The aim of this study was to develop and validate a method for rapid (within 1 min), specific and sensitive quantification of Kryptofix 2.2.2 at trace levels. Chromatographic separations were carried out by rapid-resolution liquid chromatography (Agilent ZORBAX SB-C18 rapid-resolution column, 2.1 × 30 mm, 3.5 μm). Tandem mass spectra were acquired using a triple quadrupole mass spectrometer equipped with an electrospray ionization interface. Quantitative mass spectrometric analysis was conducted in positive ion mode and multiple reaction monitoring mode for the m/z 377.3 → 114.1 transition for Kryptofix 2.2.2. The external standard method was used for quantification. The method met the precision and efficiency requirements for PET radiopharmaceuticals, providing satisfactory results for specificity, matrix effect, stability, linearity (0.5-100 ng/ml, r(2)=0.9975), precision (coefficient of variation < 5%), accuracy (relative error < ± 3%), sensitivity (lower limit of quantification=0.5 ng) and detection time (<1 min). Fluorodeoxyglucose (n=6) was analysed, and the Kryptofix 2.2.2 content was found to be well below the maximum permissible levels approved by the US Food and Drug Administration. The developed method has a short analysis time (<1 min) and high sensitivity (lower limit of quantification=0.5 ng/ml) and can be successfully applied to rapid quantification of Kryptofix 2.2.2 at trace levels in fluorodeoxyglucose. This method could also be applied to other [18F]fluorine-labelled radiopharmaceuticals that use Kryptofix 2.2.2 as a phase-transfer reagent.
NASA Astrophysics Data System (ADS)
Zhang, Guoxia; Li, Qing; Zhu, Yan; Wang, Zheng
2018-07-01
An additional quantification strategy using a desolvating nebulizer system (DNS) for solution-based calibration was developed. For quantitative analysis, laser ablation (LA) and DNS-generated aerosols were coupled using a "Y" connector and introduced into the inductively coupled plasma (ICP). These aerosols were also observed by scanning electron microscopy following collection on a silicon chip. Internal standards (108Ag, 64Cu, 89Y) were used to correct for the different aerosol transport efficiencies between the DNS and LA. The correlation coefficients of the calibration curves for all elements ranged from 0.9986 to 0.9999. Standard reference materials (NIST 610-616 and GBW08407-08411) were used to demonstrate the accuracy and precision of the method. The results were in good agreement with certified values, and the relative standard deviation (RSD) of most elements was <3%. The limits of detection (LODs) for 50Cr, 55Mn, 59Co, 60Ni, 66Zn, 89Y, 110Cd, 139La, 140Ce, 146Nd, 147Sm, 157Gd, 163Dy, 166Er, and 208Pb were 23, 3, 3, 19, 31, 4, 12, 0.4, 0.9, 0.1, 0.2, 2, 0.3, 0.4, and 21 ng/g, respectively, which were significantly better than those obtained by other methods. Further, this approach was applied for the analysis of multiple elements in biological tissues, and the results were in good agreement with those obtained using solution-based inductively coupled plasma-mass spectrometry (ICP-MS).
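The internal-standard correction described above amounts to calibrating on analyte-to-internal-standard signal ratios rather than on raw intensities, so that differences in aerosol transport efficiency between the DNS and LA channels cancel out. A minimal sketch with invented count rates and 89Y as the internal standard follows.

```python
import numpy as np

# invented calibration data: certified concentration (ug/g) and measured counts
conc_std = np.array([10.0, 50.0, 100.0, 250.0])       # e.g. Pb in reference glasses
counts_analyte = np.array([2100.0, 10300.0, 20500.0, 51800.0])
counts_internal = np.array([9800.0, 10050.0, 9900.0, 10100.0])  # 89Y internal std

ratio = counts_analyte / counts_internal               # corrects transport efficiency
slope, intercept = np.polyfit(conc_std, ratio, 1)      # ratio-based calibration line

# unknown sample: analyte and internal-standard count rates
sample_ratio = 31500.0 / 9950.0
sample_conc = (sample_ratio - intercept) / slope
print(f"Pb in sample ~ {sample_conc:.1f} ug/g")
```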
Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy
NASA Astrophysics Data System (ADS)
Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.
2018-02-01
In this study, a collaborative contest focused on LIBS data processing has been conducted in an original way since the participants did not share the same samples to be analyzed on their own LIBS experiments but a set of LIBS spectra obtained from one single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. Then, a parametric study was conducted to investigate the influence of each step during the data processing. This study was based on several analytical figures of merit such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). Then, it was possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, baseline modeling as well as the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure with the aim of optimizing the methodological steps toward the standardization of LIBS.
NASA Astrophysics Data System (ADS)
Greunz, Theresia; Duchaczek, Hubert; Sagl, Raffaela; Duchoslav, Jiri; Steinberger, Roland; Strauß, Bernhard; Stifter, David
2017-02-01
Cr(VI) is known for its corrosion-inhibitive properties and is, despite legal regulations, still a potential candidate to be added to thin (1-3 μm) protective coatings applied on, e.g., electrical steel as used for transformers. However, Cr(VI) is harmful to the environment and to human health, so its reliable quantification is of decisive interest. Commonly, alkaline extraction with photometric endpoint detection of Cr(VI) is used for such material systems. However, this procedure requires accurate knowledge of sample parameters such as dry film thickness and coating density, which are occasionally associated with significant experimental errors. We present a comprehensive study of a coating system with a defined Cr(VI) pigment concentration applied on electrical steel. X-ray photoelectron spectroscopy (XPS) was employed to resolve the elemental chromium concentration and the chemical state. Because XPS is extremely surface sensitive (<10 nm) and the lowest commonly achievable lateral resolution is several times larger than the coating thickness (∼2 μm), bulk analysis was achieved with XPS line scans on extended wedge-shaped tapers through the coating. For that purpose, a special sample preparation step performed on an ultra-microtome was required prior to analysis. Since a temperature increase leads to a reduction of Cr(VI), we extended our method to samples that were subjected to different curing temperatures. We show that our proposed approach allows determination of the elemental and Cr(VI) concentration and distribution inside the coating.
Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges
NASA Technical Reports Server (NTRS)
Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam
2014-01-01
As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements into a real-time simulation capability, generating techniques for uncertainty modeling that draw data from multiple modeling sources, and providing a unified database model that includes nominal plus increments for each flight condition. This paper presents status of testing in the BR&T water tunnel and analysis of the resulting data and efforts to characterize these data using alternative modeling methods. Program challenges and issues are also presented.
Abraham, Jerrold L.; Chandra, Subhash; Agrawal, Anoop
2014-01-01
Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease (CBD) from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample-types with main matrix of aluminum (aluminum cans from soda, beer, carbonated water, and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry (SIMS) instrument, CAMECA IMS 3f SIMS ion microscope. The beryllium content of shrapnel (~100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (~25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in X-Y-and Z dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) for understanding if trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (~0.01 ppb). These observations indicate that beryllium present in aluminum matrix was either present in an immobile form or its mobilization into the food product was prevented by a polymer coating on the inside of cans, a practice used in food industry to prevent contamination of food products. The lack of such coating in retained shrapnel fragments renders their surface a possible source of contamination for long-term exposure of tissues and fluids and induction of disease, as characterized in a recent study. Methodological developments reported here can be extended to studies of beryllium in electronics devices and components. PMID:25146877
Abraham, J L; Chandra, S; Agrawal, A
2014-11-01
Recently, a report raised the possibility of shrapnel-induced chronic beryllium disease from long-term exposure to the surface of retained aluminum shrapnel fragments in the body. Since the shrapnel fragments contained trace beryllium, methodological developments were needed for beryllium quantification and to study its spatial distribution in relation to other matrix elements, such as aluminum and iron, in metallurgic samples. In this work, we developed methodology for quantification of trace beryllium in samples of shrapnel fragments and other metallurgic sample-types with main matrix of aluminum (aluminum cans from soda, beer, carbonated water and aluminum foil). Sample preparation procedures were developed for dissolving beryllium for its quantification with the fluorescence detection method for homogenized measurements. The spatial distribution of trace beryllium on the sample surface and in 3D was imaged with a dynamic secondary ion mass spectrometry instrument, CAMECA IMS 3f secondary ion mass spectrometry ion microscope. The beryllium content of shrapnel (∼100 ppb) was the same as the trace quantities of beryllium found in aluminum cans. The beryllium content of aluminum foil (∼25 ppb) was significantly lower than cans. SIMS imaging analysis revealed beryllium to be distributed in the form of low micron-sized particles and clusters distributed randomly in X-Y- and Z dimensions, and often in association with iron, in the main aluminum matrix of cans. These observations indicate a plausible formation of Be-Fe or Al-Be alloy in the matrix of cans. Further observations were made on fluids (carbonated water) for understanding if trace beryllium in cans leached out and contaminated the food product. A direct comparison of carbonated water in aluminum cans and plastic bottles revealed that beryllium was below the detection limits of the fluorescence detection method (∼0.01 ppb). These observations indicate that beryllium present in aluminum matrix was either present in an immobile form or its mobilization into the food product was prevented by a polymer coating on the inside of cans, a practice used in food industry to prevent contamination of food products. The lack of such coating in retained shrapnel fragments renders their surface a possible source of contamination for long-term exposure of tissues and fluids and induction of disease, as characterized in a recent study. Methodological developments reported here can be extended to studies of beryllium in electronics devices and components. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.
NASA Astrophysics Data System (ADS)
Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen
2017-06-01
This paper presents a Bayesian approach using the Metropolis-Hastings Markov chain Monte Carlo algorithm and applies this method to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of the daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. The Bayesian MCMC method may therefore be more favorable for uncertainty analysis and risk management.
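A bare-bones random-walk Metropolis-Hastings sampler of the kind referred to above is sketched below for a toy one-parameter flow model; the lognormal likelihood, flat prior and synthetic flow data are invented and stand in for the watershed model actually used.

```python
import numpy as np

rng = np.random.default_rng(0)
flows = rng.lognormal(mean=3.0, sigma=0.5, size=200)   # synthetic daily flows

def log_posterior(mu):
    # lognormal likelihood with known sigma = 0.5 and a flat prior on mu
    return -0.5 * np.sum((np.log(flows) - mu) ** 2 / 0.5**2)

samples, mu = [], 2.0
for _ in range(5000):
    prop = mu + rng.normal(scale=0.1)                  # random-walk proposal
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(mu):
        mu = prop                                      # accept the proposal
    samples.append(mu)

post = np.array(samples[1000:])                        # discard burn-in
print(f"posterior mean {post.mean():.3f}, 95% credible interval "
      f"({np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f})")
```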
Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification
ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE
2017-01-01
Background and aims Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non redundant plasma proteins in a single run analysis. The dynamic range covered was 10^5, and 86% were classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793
Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.
2014-01-01
Quantitative mass spectrometry has become central to the field of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation has been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. These findings suggest that evaluation of the absolute ion abundance for any given transition is necessary in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3 year period, suggesting that these values should be assessed as close as possible to the time at which data is collected for quantification. PMID:25154770
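The relative-abundance check described above reduces to comparing each transition's share of the total product-ion signal against a reference ratio, once the absolute abundance is high enough for the ratio to be stable. The sketch below is generic; the product ions, reference ratios, tolerance and abundance floor are all invented.

```python
# invented product-ion abundances for one peptide in a sample and the RA values
# expected from a pure standard
sample_ions = {"y4": 4.1e4, "y5": 2.6e4, "y6": 1.2e4}
reference_ra = {"y4": 0.53, "y5": 0.32, "y6": 0.15}

total = sum(sample_ions.values())
min_total = 1.0e4     # absolute-abundance floor below which RA is unreliable
tolerance = 0.10      # allowed absolute deviation of RA from the reference

if total < min_total:
    print("total product-ion abundance too low for an RA assessment")
else:
    for ion, counts in sample_ions.items():
        ra = counts / total
        ok = abs(ra - reference_ra[ion]) <= tolerance
        status = "ok" if ok else "possible interference"
        print(f"{ion}: RA = {ra:.2f} (reference {reference_ra[ion]:.2f}) -> {status}")
```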
NASA Astrophysics Data System (ADS)
Toman, Blaza; Nelson, Michael A.; Bedner, Mary
2017-06-01
Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess accuracy of the measurement procedure and repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model which is intrinsically rigorous, thus making it a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 (MC) methods augmented with random effects meta analysis yields similar results to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography mass spectrometric detection using isotope dilution (LC-IDMS).
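A Monte Carlo propagation in the spirit of GUM Supplement 1 simply pushes draws from the input distributions through the measurement equation. The toy example below uses a single-point calibration equation, c_sample = (A_sample / A_cal) * c_cal, with invented means and standard uncertainties; it is not the LC-UV or LC-IDMS model of the article, nor a hierarchical Bayesian analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# invented inputs: peak areas and calibrant concentration with standard uncertainties
area_sample = rng.normal(1.52e5, 1.2e3, n)
area_cal    = rng.normal(1.48e5, 1.1e3, n)
conc_cal    = rng.normal(25.00, 0.05, n)          # ug/g, certified calibrant

conc_sample = area_sample / area_cal * conc_cal   # measurement equation

print(f"c = {conc_sample.mean():.3f} ug/g, "
      f"u(c) = {conc_sample.std(ddof=1):.3f} ug/g, "
      f"95% interval ({np.percentile(conc_sample, 2.5):.3f}, "
      f"{np.percentile(conc_sample, 97.5):.3f})")
```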
2014-04-01
TR-14-33: A Posteriori Error Analysis and Uncertainty Quantification for Adaptive Multiscale Operator Decomposition Methods for Multiphysics Problems. Donald Estep and Michael ... April 2014. HDTRA1-09-1-0036. Approved for public release, distribution is unlimited. Related publication: Barrier methods for critical exponent problems in geometric analysis and mathematical physics, J. Erway and M. Holst, submitted for publication.
Review of analytical methods for the quantification of iodine in complex matrices.
Shelor, C Phillip; Dasgupta, Purnendu K
2011-09-19
Iodine is an essential element of human nutrition. Nearly a third of the global population has insufficient iodine intake and is at risk of developing Iodine Deficiency Disorders (IDD). Most countries have iodine supplementation and monitoring programs. Urinary iodide (UI) is the biomarker used for epidemiological studies; only a few methods are currently used routinely for analysis. These methods either require expensive instrumentation with qualified personnel (inductively coupled plasma-mass spectrometry, instrumental nuclear activation analysis) or oxidative sample digestion to remove potential interferences prior to analysis by a kinetic colorimetric method originally introduced by Sandell and Kolthoff ~75 years ago. The Sandell-Kolthoff (S-K) method is based on the catalytic effect of iodide on the reaction between Ce(4+) and As(3+). No available technique fully fits the needs of developing countries; research into inexpensive reliable methods and instrumentation are needed. There have been multiple reviews of methods used for epidemiological studies and specific techniques. However, a general review of iodine determination on a wide-ranging set of complex matrices is not available. While this review is not comprehensive, we cover the principal developments since the original development of the S-K method. Copyright © 2011 Elsevier B.V. All rights reserved.
Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon
2018-03-01
Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system identifies the renal tissue structures through knowledge-based rules employing colour space transformations and structural feature extraction from the images; in particular, glomerulus identification is based on multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnoses. A 40-image ground truth dataset was manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists demonstrated an average error of 9 percentage points between the automated system's quantification and the pathologists' visual evaluation. Experiments investigating the variability in pathologists, involving samples from 70 kidney patients, also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The accuracy of the proposed quantification system has been validated with the ground truth dataset and compared against the pathologists' quantification results. 
It has been shown that the correlation between different pathologists' estimates of interstitial fibrosis area improved significantly, demonstrating the effectiveness of the quantification system as a diagnostic aide. Copyright © 2017 Elsevier B.V. All rights reserved.
Naveen, P.; Lingaraju, H. B.; Prasad, K. Shyam
2017-01-01
Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used as a traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase at a flow rate of 1.5 ml/min. The separation was carried out at 26°C on a Kinetex XB-C18 column as stationary phase, with detection at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. For linearity, a correlation coefficient greater than 0.999 indicated good fitting of the calibration curve. The intra- and inter-day precision showed <1% relative standard deviation of peak area, indicating high reliability and reproducibility of the method. The recovery values at three spiking levels (50%, 100%, and 150%) were 100.47, 100.89, and 100.99, respectively, and standard deviations <1% show high accuracy of the method. In robustness testing, the results remained unaffected by small variations in the analytical parameters. Liquid chromatography–mass spectrometry analysis confirmed the presence of mangiferin with an m/z value of 421. The developed HPLC assay is simple, rapid, and reliable for the determination of mangiferin from M. indica. SUMMARY The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica. The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification and robustness by International Conference on Harmonisation guidelines. This study showed that the developed HPLC assay is simple, rapid and reliable for the quantification of mangiferin from M. indica. Abbreviations Used: M. indica: Mangifera indica, RP-HPLC: Reversed-phase high-performance liquid chromatography, m/z: Mass to charge ratio, ICH: International Conference on Harmonisation, % RSD: Percentage of relative standard deviation, ppm: Parts per million, LOD: Limit of detection, LOQ: Limit of quantification. PMID:28539748
NASA Astrophysics Data System (ADS)
Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.
2017-11-01
This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
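The Itô-SDE sampler referred to above is, in its simplest overdamped-Langevin form, dX = ∇log π(X) dt + √2 dW, which an Euler-type discretization turns into a practical (if unadjusted) sampler. The sketch below uses an explicit Euler-Maruyama step on a toy Gaussian posterior, rather than the implicit Euler scheme and the engineering posterior of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def grad_log_post(x):
    # toy posterior: standard bivariate Gaussian, so grad log pi(x) = -x
    return -x

dt, n_steps = 0.01, 50_000
x = np.zeros(2)
samples = np.empty((n_steps, 2))
for k in range(n_steps):
    # Euler-Maruyama step of dX = grad log pi(X) dt + sqrt(2) dW
    x = x + grad_log_post(x) * dt + np.sqrt(2 * dt) * rng.normal(size=2)
    samples[k] = x

kept = samples[10_000:]            # discard the initial transient
print("sample mean", kept.mean(axis=0))
print("sample covariance\n", np.cov(kept.T))
```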
NASA Astrophysics Data System (ADS)
Zivkovic, Sanja; Momcilovic, Milos; Staicu, Angela; Mutic, Jelena; Trtica, Milan; Savovic, Jelena
2017-02-01
The aim of this study was to develop a simple laser induced breakdown spectroscopy (LIBS) method for quantitative elemental analysis of powdered biological materials based on laboratory-prepared calibration samples. The analysis was done using ungated single-pulse LIBS in ambient air at atmospheric pressure. A transversely excited atmospheric pressure (TEA) CO2 laser was used as the energy source for plasma generation on the samples. The material used for the analysis was the blue-green alga Spirulina, widely used in the food and pharmaceutical industries and also in a few biotechnological applications. To demonstrate the analytical potential of this particular LIBS system, the obtained spectra were compared to spectra obtained using a commercial LIBS system based on a pulsed Nd:YAG laser. A single sample of known concentration was used to estimate detection limits for Ba, Ca, Fe, Mg, Mn, Si and Sr and to compare the detection power of these two LIBS systems. TEA CO2 laser-based LIBS was also applied for quantitative analysis of the elements in powdered Spirulina samples. Analytical curves for Ba, Fe, Mg, Mn and Sr were constructed using laboratory-produced matrix-matched calibration samples. Inductively coupled plasma optical emission spectroscopy (ICP-OES) was used as the reference technique for elemental quantification, and reasonably good agreement between ICP and LIBS data was obtained. The results confirm that, with respect to its sensitivity and precision, TEA CO2 laser-based LIBS can be successfully applied for quantitative analysis of macro- and micro-elements in algal samples. The fact that nearly all classes of materials can be prepared as powders implies that the proposed method could be easily extended to quantitative analysis of different kinds of materials: organic, biological or inorganic.
Fariñas, Juan C; Rucandio, Isabel; Pomares-Alfonso, Mario S; Villanueva-Tagle, Margarita E; Larrea, María T
2016-07-01
An Inductively Coupled Plasma Optical Emission Spectrometry method for the simultaneous determination of Al, Ca, Cu, Fe, In, Mn, Ni, Si, Sr, Y, Zn, Zr and rare earth elements (La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, and Lu) in magnesium alloys, including the new rare earth element-alloyed magnesium, has been developed. Robust conditions have been established as a nebulizer argon flow rate of 0.5 mL min(-1) and an RF incident power of 1500 W, under which matrix effects were significantly reduced, to around 10%. Three acid digestion procedures were performed at 110°C: in closed PFA vessels heated in an oven, in closed TFM vessels heated in a microwave furnace, and in open polypropylene tubes with reflux caps heated in a graphite block. All three digestion procedures are suitable for bringing the magnesium alloy samples into solution. From the most sensitive lines, one analytical line with no or low spectral interference has been selected for each element. Mg, Rh and Sc have been studied as internal standards. Among them, Rh was selected as the best, using the Rh I 343.488 nm or Rh II 249.078 nm line according to the analytical line. The trueness and precision have been established by using the Certified Reference Material BCS 316, as well as by means of recovery studies. Quantification limits were between 0.1 and 9 mg kg(-1), for Lu and Pr respectively, in a 2 g L(-1) magnesium matrix solution. The method developed has been applied to the commercial alloys AM60, AZ80, ZK30, AJ62, WE54 and AE44. Copyright © 2016 Elsevier B.V. All rights reserved.
Celi, Simona; Berti, Sergio
2014-10-01
Optical coherence tomography (OCT) is a catheter-based medical imaging technique that produces cross-sectional images of blood vessels. This technique is particularly useful for studying coronary atherosclerosis. In this paper, we present a new framework that allows a segmentation and quantification of OCT images of coronary arteries to define the plaque type and stenosis grading. These analyses are usually carried out on-line on the OCT-workstation where measuring is mainly operator-dependent and mouse-based. The aim of this program is to simplify and improve the processing of OCT images for morphometric investigations and to present a fast procedure to obtain 3D geometrical models that can also be used for external purposes such as for finite element simulations. The main phases of our toolbox are the lumen segmentation and the identification of the main tissues in the artery wall. We validated the proposed method with identification and segmentation manually performed by expert OCT readers. The method was evaluated on ten datasets from clinical routine and the validation was performed on 210 images randomly extracted from the pullbacks. Our results show that automated segmentation of the vessel and of the tissue components are possible off-line with a precision that is comparable to manual segmentation for the tissue component and to the proprietary-OCT-console for the lumen segmentation. Several OCT sections have been processed to provide clinical outcome. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Schiavi, Federica; Bolfan-Casanova, Nathalie
2017-04-01
The amount and distribution of volatiles (water, carbon dioxide, …) in magmas represent key parameters for understanding magma processes and dynamics within volcanic plumbing systems. Micro-Raman spectroscopy is an excellent technique for accurate determination of volatile contents in magmas, as it combines several advantages. The technique is non-destructive and requires minimal sample preparation before analysis. Its high lateral and in-depth spatial resolution is crucial for the study of small objects and samples that are chemically and texturally heterogeneous at the small (micron) scale. Moreover, the high confocality allows analysis of sample regions not exposed at the surface, as well as 3D mapping. We present a universal calibration of Raman spectroscopy for quantification of volatiles in silicate glasses. The proposed method is based on internal calibration, i.e., on the correlation between the glass water content and the ratio between the areas of the water and silicate Raman bands. Synthetic glasses with variable major element compositions (basaltic, andesitic, rhyolitic, dacitic, …) bearing different H2O (up to 7 wt%) and CO2 contents are used as standard glasses. Natural silicate glasses, mainly in the form of melt inclusions, are used to test the performance of the proposed method. In addition to quantification of volatiles in glass, in bubble-bearing melt inclusions we perform micro-Raman investigation of the gas-bearing bubbles for accurate determination of total volatile contents in melt inclusions.
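The internal calibration described above is essentially a regression of glass water content on the water-band to silicate-band area ratio. A minimal sketch with invented standard-glass data follows; the band areas and calibration coefficients are not those of the study.

```python
import numpy as np

# hypothetical standard glasses: known H2O (wt%) and measured Raman band areas
h2o_wt = np.array([0.5, 1.2, 2.4, 4.0, 6.8])
area_water = np.array([1.1e4, 2.7e4, 5.3e4, 8.8e4, 1.50e5])
area_silicate = np.array([2.0e5, 2.1e5, 2.0e5, 1.9e5, 2.0e5])

ratio = area_water / area_silicate
slope, intercept = np.polyfit(ratio, h2o_wt, 1)      # internal calibration line

# unknown melt-inclusion glass: measured band areas
unknown_ratio = 6.1e4 / 1.95e5
print(f"H2O ~ {slope * unknown_ratio + intercept:.2f} wt%")
```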
Ginder-Vogel, Matthew; Landrot, Gautier; Fischel, Jason S.; Sparks, Donald L.
2009-01-01
Quantification of the initial rates of environmental reactions at the mineral/water interface is a fundamental prerequisite to determining reaction mechanisms and contaminant transport modeling and predicting environmental risk. Until recently, experimental techniques with adequate time resolution and elemental sensitivity to measure initial rates of the wide variety of environmental reactions were quite limited. Techniques such as electron paramagnetic resonance and Fourier transform infrared spectroscopies suffer from limited elemental specificity and poor sensitivity to inorganic elements, respectively. Ex situ analysis of batch and stirred-flow systems provides high elemental sensitivity; however, their time resolution is inadequate to characterize rapid environmental reactions. Here we apply quick-scanning x-ray absorption spectroscopy (Q-XAS), at sub-second time-scales, to measure the initial oxidation rate of As(III) to As(V) by hydrous manganese(IV) oxide. Using Q-XAS, As(III) and As(V) concentrations were determined every 0.98 s in batch reactions. The initial apparent As(III) depletion rate constants (t < 30 s) measured with Q-XAS are nearly twice as large as rate constants measured with traditional analytical techniques. Our results demonstrate the importance of developing analytical techniques capable of analyzing environmental reactions on the same time scale as they occur. Given the high sensitivity, elemental specificity, and time resolution of Q-XAS, it has many potential applications. They could include measuring not only redox reactions but also dissolution/precipitation reactions, such as the formation and/or reductive dissolution of Fe(III) (hydr)oxides, solid-phase transformations (i.e., formation of layered-double hydroxide minerals), or almost any other reaction occurring in aqueous media that can be measured using x-ray absorption spectroscopy. PMID:19805269
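Extracting an initial apparent depletion rate constant from such sub-second concentration series is, in the simplest pseudo-first-order reading, a linear fit of ln[As(III)] versus time over the early points. The sketch below uses synthetic data, not the Q-XAS measurements.

```python
import numpy as np

rng = np.random.default_rng(3)
t = np.arange(0, 30, 0.98)                      # s, ~one spectrum every 0.98 s
k_true = 0.05                                   # 1/s, synthetic rate constant
as3 = 100.0 * np.exp(-k_true * t) * (1 + rng.normal(0, 0.01, t.size))  # uM

# pseudo-first-order kinetics: ln C = ln C0 - k t, so -slope of ln C vs t is k_app
slope, intercept = np.polyfit(t, np.log(as3), 1)
print(f"apparent initial rate constant k ~ {-slope:.3f} 1/s")
```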
Quantification of fungicides in snow-melt runoff from turf: A comparison of four extraction methods
USDA-ARS?s Scientific Manuscript database
A variety of pesticides are used to control diverse stressors to turf. These pesticides have a wide range in physical and chemical properties. The objective of this project was to develop an extraction and analysis method for quantification of chlorothalonil and PCNB (pentachloronitrobenzene), two p...
Chai, Liuying; Zhang, Jianwei; Zhang, Lili; Chen, Tongsheng
2015-03-01
Spectral measurement of fluorescence resonance energy transfer (FRET), spFRET, is a widely used FRET quantification method in living cells today. We set up a spectrometer-microscope platform that consists of a miniature fiber optic spectrometer and a widefield fluorescence microscope for the spectral measurement of absolute FRET efficiency (E) and acceptor-to-donor concentration ratio (R(C)) in single living cells. The microscope was used for guiding cells and the spectra were simultaneously detected by the miniature fiber optic spectrometer. Moreover, our platform has independent excitation and emission controllers, so different excitations can share the same emission channel. In addition, we developed a modified spectral FRET quantification method (mlux-FRET) for the multiple donors and multiple acceptors FRET construct (mD∼nA) sample, and we also developed a spectra-based 2-channel acceptor-sensitized FRET quantification method (spE-FRET). We implemented these modified FRET quantification methods on our platform to measure the absolute E and R(C) values of tandem constructs with different acceptor/donor stoichiometries in single living Huh-7 cells.
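A generic three-channel sensitized-emission calculation of FRET efficiency, of the broad family to which spectral methods such as spE-FRET belong, is sketched below; the crosstalk factors, G factor and intensities are invented, and the actual correction scheme of the paper (including mlux-FRET for mD∼nA constructs) is not reproduced.

```python
# invented intensities from one cell in three channels:
# donor excitation/donor emission, donor excitation/acceptor emission,
# acceptor excitation/acceptor emission
I_DD, I_DA, I_AA = 820.0, 900.0, 1500.0

d = 0.45   # donor emission bleed-through into the acceptor channel (assumed)
a = 0.12   # acceptor cross-excitation at the donor excitation line (assumed)
G = 2.8    # instrument G factor linking sensitised emission to donor quenching

Fc = I_DA - d * I_DD - a * I_AA          # crosstalk-corrected sensitised emission
E = Fc / (Fc + G * I_DD)                 # apparent FRET efficiency
print(f"E ~ {E:.2f}")
```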
Model Update of a Micro Air Vehicle (MAV) Flexible Wing Frame with Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.; Waszak, Martin R.; Morgan, Benjamin G.
2004-01-01
This paper describes a procedure to update parameters in the finite element model of a Micro Air Vehicle (MAV) to improve displacement predictions under aerodynamic loads. Because of fabrication, material, and geometric uncertainties, a statistical approach combined with Multidisciplinary Design Optimization (MDO) is used to modify key model parameters. Static test data collected using photogrammetry are used to correlate with model predictions. Results show significant improvements in model predictions after parameters are updated; however, computed probability values indicate low confidence in the updated values and/or model structure errors. Lessons learned in the areas of wing design, test procedures, modeling approaches with geometric nonlinearities, and uncertainty quantification are all documented.
Le, Huy Q.; Molloi, Sabee
2011-01-01
Purpose: To experimentally investigate whether a computed tomography (CT) system based on CdZnTe (CZT) detectors in conjunction with a least-squares parameter estimation technique can be used to decompose four different materials. Methods: The material decomposition process was divided into a segmentation task and a quantification task. A least-squares minimization algorithm was used to decompose materials with five measurements of the energy dependent linear attenuation coefficients. A small field-of-view energy discriminating CT system was built. The CT system consisted of an x-ray tube, a rotational stage, and an array of CZT detectors. The CZT array was composed of 64 pixels, each of which is 0.8×0.8×3 mm. Images were acquired at 80 kVp in fluoroscopic mode at 50 ms per frame. The detector resolved the x-ray spectrum into energy bins of 22–32, 33–39, 40–46, 47–56, and 57–80 keV. Four phantoms were constructed from polymethylmethacrylate (PMMA), polyethylene, polyoxymethylene, hydroxyapatite, and iodine. Three phantoms were composed of three materials with embedded hydroxyapatite (50, 150, 250, and 350 mg∕ml) and iodine (4, 8, 12, and 16 mg∕ml) contrast elements. One phantom was composed of four materials with embedded hydroxyapatite (150 and 350 mg∕ml) and iodine (8 and 16 mg∕ml). Calibrations consisted of PMMA phantoms with either hydroxyapatite (100, 200, 300, 400, and 500 mg∕ml) or iodine (5, 15, 25, 35, and 45 mg∕ml) embedded. Filtered backprojection and a ramp filter were used to reconstruct images from each energy bin. Material segmentation and quantification were performed and compared between different phantoms. Results: All phantoms were decomposed accurately, but some voxels in the base material regions were incorrectly identified. Average quantification errors of hydroxyapatite∕iodine were 9.26∕7.13%, 7.73∕5.58%, and 12.93∕8.23% for the three-material PMMA, polyethylene, and polyoxymethylene phantoms, respectively. The average errors for the four-material phantom were 15.62% and 2.76% for hydroxyapatite and iodine, respectively. Conclusions: The calibrated least-squares minimization technique of decomposition performed well in breast imaging tasks with an energy resolving detector. This method can provide material basis images containing concentrations of the relevant materials that can potentially be valuable in the diagnostic process. PMID:21361191
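The least-squares decomposition step amounts to solving, per voxel, an overdetermined linear system μ(E_b) = Σ_m c_m μ_m(E_b) over the five energy bins. The sketch below uses invented basis attenuation values and a fabricated voxel measurement rather than the calibrated coefficients of the paper.

```python
import numpy as np

# invented effective linear attenuation coefficients (1/cm) of the basis materials
# in the five energy bins (rows: bins 22-32 ... 57-80 keV; columns: PMMA,
# hydroxyapatite, iodine). Iodine jumps in the 33-39 keV bin, just above its
# K-edge (~33 keV).
basis = np.array([
    [0.28, 1.90, 4.50],
    [0.25, 1.20, 6.50],
    [0.23, 0.85, 4.80],
    [0.22, 0.60, 3.20],
    [0.21, 0.40, 1.90],
])

# measured per-bin attenuation of one reconstructed voxel (invented)
measured = np.array([0.68, 0.53, 0.43, 0.35, 0.29])

# least-squares estimate of the (relative) amount of each basis material
coeffs, *_ = np.linalg.lstsq(basis, measured, rcond=None)
for name, c in zip(["PMMA", "hydroxyapatite", "iodine"], coeffs):
    print(f"{name}: {c:.3f}")
```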
NASA Astrophysics Data System (ADS)
Bieda, Bogusław; Grzesik, Katarzyna
2017-11-01
The study proposes a stochastic approach based on Monte Carlo (MC) simulation for the life cycle assessment (LCA) method, limited to a life cycle inventory (LCI) study of rare earth element (REE) recovery from secondary-material production processes, applied to the New Krankberg Mine in Sweden. The MC method is recognized as an important tool in science and can be considered the most effective quantification approach for uncertainties. The use of a stochastic approach characterizes the uncertainties better than a deterministic method. Uncertainty of data can be expressed through a definition of the probability distribution of those data (e.g. through standard deviation or variance). The data used in this study are obtained from: (i) site-specific measured or calculated data, (ii) values based on literature, (iii) the ecoinvent process "rare earth concentrate, 70% REO, from bastnäsite, at beneficiation". Environmental emissions (e.g., particulates, uranium-238, thorium-232), energy and REE (La, Ce, Nd, Pr, Sm, Dy, Eu, Tb, Y, Sc, Yb, Lu, Tm, Gd) have been inventoried. The study is based on a reference case for the year 2016. The combination of MC analysis with sensitivity analysis is the best solution for quantifying the uncertainty in the LCI/LCA. The reliability of LCA results may be uncertain to a certain degree, but this uncertainty can be assessed with the help of the MC method.
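As a rough illustration of how such a stochastic LCI propagation can be set up, the sketch below draws each inventory parameter from an assumed probability distribution and reports summary statistics of an aggregate indicator; the parameter names, distributions, and weights are hypothetical and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # number of Monte Carlo trials

# Hypothetical inventory parameters (per kg REE concentrate), each described
# by a probability distribution instead of a single deterministic value.
electricity_kwh = rng.normal(loc=3.2, scale=0.4, size=n)               # measured, ~normal
particulates_g  = rng.lognormal(mean=np.log(1.5), sigma=0.3, size=n)   # literature, skewed
thorium232_mg   = rng.triangular(left=0.8, mode=1.0, right=1.5, size=n)

# Propagate to a simple aggregate indicator (illustrative weighting only).
indicator = 0.5 * electricity_kwh + 0.3 * particulates_g + 0.2 * thorium232_mg

print(f"mean = {indicator.mean():.3f}, std = {indicator.std():.3f}")
print("2.5th / 97.5th percentiles:", np.percentile(indicator, [2.5, 97.5]))
```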
Liu, Ruijuan; Wang, Mengmeng; Ding, Li
2014-10-01
Menadione (VK3), an essential fat-soluble naphthoquinone, plays very important physiological and pathological roles, but its detection and quantification are challenging. Herein, a new method was developed for quantification of VK3 in human plasma by liquid chromatography-tandem mass spectrometry (LC-MS/MS) after derivatization with 3-mercaptopropionic acid via a Michael addition reaction. The derivative was identified by its mass spectra, and the derivatization conditions were optimized by considering different parameters. The method demonstrated high sensitivity, with a low limit of quantification of 0.03 ng mL(-1) for VK3, about 33-fold better than that for direct analysis of the underivatized compound. The method also had good precision and reproducibility. It was applied in the determination of basal VK3 in human plasma and in a clinical pharmacokinetic study of menadiol sodium diphosphate. This is the first report of an LC-MS/MS method for the quantification of VK3, and it will provide an important strategy for further research on VK3 and menadione analogs. Copyright © 2014 Elsevier B.V. All rights reserved.
HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.
Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil
2017-04-01
Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.
Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina
2006-01-01
Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to evaluate the quality and performance of different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, since it is shown that in the area of food and feed testing it is impossible to define a single matrix with fixed characteristics, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
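Since quantification relies on a standard curve and on comparable PCR efficiencies between sample and reference material, a compact way to see the effect is to compute the efficiency from the slope of a Cq versus log-copy-number regression (E = 10^(-1/slope) - 1). The sketch below uses illustrative dilution-series numbers, not data from the study.

```python
import numpy as np

# Cq values measured for a dilution series of the reference material
# (copies per reaction) -- illustrative numbers, not the study's data.
copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])
cq     = np.array([18.1, 21.5, 24.9, 28.4, 31.8])

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0          # 1.0 corresponds to 100 % efficiency
print(f"slope = {slope:.3f}, PCR efficiency = {efficiency * 100:.1f} %")

# Quantify an unknown sample from its Cq using the same standard curve.
cq_unknown = 26.2
copies_unknown = 10 ** ((cq_unknown - intercept) / slope)
print(f"estimated copies in unknown: {copies_unknown:.0f}")
```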
Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin
2017-09-01
Planar perfusion scintigraphy with 99m Tc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99m Tc-macroaggregated albumin and 99m Tc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods ( P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe. There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
Jacchia, Sara; Nardini, Elena; Savini, Christian; Petrillo, Mauro; Angers-Loustau, Alexandre; Shim, Jung-Hyun; Trijatmiko, Kurniawan; Kreysa, Joachim; Mazzara, Marco
2015-02-18
In this study, we developed, optimized, and in-house validated a real-time PCR method for the event-specific detection and quantification of Golden Rice 2, a genetically modified rice with provitamin A in the grain. We optimized and evaluated the performance of the taxon (targeting rice Phospholipase D α2 gene)- and event (targeting the 3' insert-to-plant DNA junction)-specific assays that compose the method as independent modules, using haploid genome equivalents as unit of measurement. We verified the specificity of the two real-time PCR assays and determined their dynamic range, limit of quantification, limit of detection, and robustness. We also confirmed that the taxon-specific DNA sequence is present in single copy in the rice genome and verified its stability of amplification across 132 rice varieties. A relative quantification experiment evidenced the correct performance of the two assays when used in combination.
Accurate proteome-wide protein quantification from high-resolution 15N mass spectra
2011-01-01
In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234
Jiang, Tingting; Dai, Yongmei; Miao, Miao; Zhang, Yue; Song, Chenglin; Wang, Zhixu
2015-07-01
To evaluate the usefulness and efficiency of a novel dietary method among urban pregnant women. Sixty-one pregnant women were recruited from the ward and provided with a meal accurately weighed before cooking. The meal was photographed from three different angles before and after eating. The subjects were also interviewed by the investigators for a 24 h dietary recall. Food weighing, image quantification and 24 h dietary recall were conducted by investigators from three different groups, and the results were kept isolated from each other. Food consumption was analyzed on the basis of classification and total summation. Nutrient intake from the meal was calculated for each subject. The data obtained from the dietary recall and the image quantification were compared with the actual values. Correlation and regression analyses were carried out between the weight method and image quantification as well as dietary recall. A total of twenty-three kinds of food, including rice, vegetables, fish, meats and soy bean curd, were included in the experimental meal for the study. Compared with data from 24 h dietary recall (r = 0.413, P < 0.05), food weights estimated by image quantification (r = 0.778, P < 0.05, n = 308) were more strongly correlated with the weighed data and showed a more concentrated linear distribution. The absolute difference between image quantification and the weight method over all foods was 77.23 ± 56.02 (P < 0.05, n = 61), which was much smaller than the difference (172.77 ± 115.18) between 24 h recall and the weight method. Values of almost all nutrients, including energy, protein, fat, carbohydrate, vitamin A, vitamin C, calcium, iron and zinc, calculated from the food weights obtained by image quantification were closer to the weighed data than those from 24 h dietary recall (P < 0.01). Bland-Altman analysis showed that the majority of the nutrient intake measurements were scattered along the mean difference line and close to the line of equality (difference = 0). The plots show fairly good agreement between estimated and actual food consumption, indicating that the differences (including the outliers) were random and did not exhibit any systematic bias, being consistent over different levels of mean food amount. In addition, the questionnaire showed that fifty-six pregnant women considered the image quantification less time-consuming and burdensome than 24 h recall, and fifty-eight of them would like to use image quantification to monitor their dietary status. The novel instant-photography (image quantification) method for dietary assessment is more effective than conventional 24 h dietary recall and can obtain food intake values close to weighed data.
Beyond the Great Wall: gold of the silk roads and the first empire of the steppes.
Radtke, Martin; Reiche, Ina; Reinholz, Uwe; Riesemeier, Heinrich; Guerra, Maria F
2013-02-05
Fingerprinting ancient gold work requires the use of nondestructive techniques with high spatial resolution (down to 25 μm) and good detection limits (micrograms per gram level). In this work experimental setups and protocols for synchrotron radiation induced X-ray fluorescence (SRXRF) at the BAMline of the Berlin electron storage ring company for synchrotron radiation (BESSY) in Berlin for the measurement of characteristic trace elements of gold are compared considering the difficulties, shown in previous works, connected to the quantification of Pt. The best experimental conditions and calculation methods were achieved by using an excitation energy of 11.58 keV, a silicon drift chamber detector (SDD) detector, and pure element reference standards. A detection limit of 3 μg/g has been reached. This newly developed method was successfully applied to provenancing the Xiongnu gold from the Gol Mod necropolis, excavated under the aegis of the United Nations Educational, Scientific and Cultural Organization (UNESCO). The composition of the base alloys and the presence of Pt and Sn showed that, contrary to what is expected, the gold foils from the first powerful empire of the steppes along the Great Wall were produced with alluvial gold from local placer deposits located in Zaamar, Boroo, and in the Selenga River.
Quantification of taurine in energy drinks using ¹H NMR.
Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike
2014-05-01
The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is related to a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by ¹H NMR as an alternative to existing methods of quantification. Variation of pH values revealed a distinct, well-separated taurine signal in the ¹H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R² > 0.9999; linearity verified by Mandel's fitting test at a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed by both ¹H NMR and LC-UV/vis. The deviation between ¹H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and was at worst 10.4%. Due to the high agreement with the LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), ¹H NMR measurement presents a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.
Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie
2012-07-06
Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.
NASA Astrophysics Data System (ADS)
Margheri, Luca; Sagaut, Pierre
2016-11-01
To significantly increase the contribution of numerical computational fluid dynamics (CFD) simulation for risk assessment and decision making, it is important to quantitatively measure the impact of uncertainties to assess the reliability and robustness of the results. As unsteady high-fidelity CFD simulations are becoming the standard for industrial applications, reducing the number of required samples to perform sensitivity (SA) and uncertainty quantification (UQ) analysis is an actual engineering challenge. The novel approach presented in this paper is based on an efficient hybridization between the anchored-ANOVA and the POD/Kriging methods, which have already been used in CFD-UQ realistic applications, and the definition of best practices to achieve global accuracy. The anchored-ANOVA method is used to efficiently reduce the UQ dimension space, while the POD/Kriging is used to smooth and interpolate each anchored-ANOVA term. The main advantages of the proposed method are illustrated through four applications with increasing complexity, most of them based on Large-Eddy Simulation as a high-fidelity CFD tool: the turbulent channel flow, the flow around an isolated bluff-body, a pedestrian wind comfort study in a full scale urban area and an application to toxic gas dispersion in a full scale city area. The proposed c-APK method (anchored-ANOVA-POD/Kriging) inherits the advantages of each key element: interpolation through POD/Kriging precludes the use of quadrature schemes therefore allowing for a more flexible sampling strategy while the ANOVA decomposition allows for a better domain exploration. A comparison of the three methods is given for each application. In addition, the importance of adding flexibility to the control parameters and the choice of the quantity of interest (QoI) are discussed. As a result, global accuracy can be achieved with a reasonable number of samples allowing computationally expensive CFD-UQ analysis.
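As a minimal illustration of the surrogate idea underlying the POD/Kriging step, the sketch below fits a Kriging (Gaussian-process) model to a handful of evaluations of an expensive quantity of interest and then performs cheap Monte Carlo on the surrogate; the stand-in function, design sizes, and kernel are assumptions and do not reproduce the c-APK method itself.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Stand-in for an expensive "high-fidelity CFD" quantity of interest (QoI).
def expensive_qoi(x):
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# A small design of experiments in a 2-D uncertain-parameter space.
x_train = rng.uniform(-1, 1, size=(25, 2))
y_train = expensive_qoi(x_train)

# Kriging (Gaussian process) surrogate of the QoI.
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=0.5),
                              normalize_y=True)
gp.fit(x_train, y_train)

# Cheap Monte Carlo on the surrogate to estimate output statistics.
x_mc = rng.uniform(-1, 1, size=(50_000, 2))
y_mc = gp.predict(x_mc)
print(f"surrogate QoI mean = {y_mc.mean():.3f}, std = {y_mc.std():.3f}")
```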
Gibby, Jacob T; Njeru, Dennis K; Cvetko, Steve T; Heiny, Eric L; Creer, Andrew R; Gibby, Wendell A
We correlate and evaluate the accuracy of accepted anthropometric methods of percent body fat (%BF) quantification, namely hydrostatic weighing (HW) and air displacement plethysmography (ADP), against 2 automatic adipose tissue quantification methods using computed tomography (CT). Twenty volunteer subjects (14 men, 6 women) received head-to-toe CT scans. Hydrostatic weighing and ADP were obtained from 17 and 12 subjects, respectively. The CT data underwent conversion using 2 separate algorithms, namely the Schneider method and the Beam method, to convert Hounsfield units to their respective tissue densities. The overall mass and %BF of both methods were compared with HW and ADP. When comparing ADP to CT data using the Schneider method and Beam method, correlations were r = 0.9806 and 0.9804, respectively. Paired t tests indicated there were no statistically significant biases. Additionally, observed average differences in %BF between ADP and the Schneider method and the Beam method were 0.38% and 0.77%, respectively. The %BF measured from ADP, the Schneider method, and the Beam method all had significantly higher mean differences when compared with HW (3.05%, 2.32%, and 1.94%, respectively). We have shown that total body mass correlates remarkably well with both the Schneider method and Beam method of mass quantification. Furthermore, %BF calculated with the Schneider method and Beam method CT algorithms correlates remarkably well with ADP. The application of these CT algorithms has utility in further research to accurately stratify risk factors associated with periorgan, visceral, and subcutaneous adipose tissue, and has the potential for significant clinical application.
NASA Astrophysics Data System (ADS)
Hönicke, Philipp; Krämer, Markus; Lühl, Lars; Andrianov, Konstantin; Beckhoff, Burkhard; Dietsch, Rainer; Holz, Thomas; Kanngießer, Birgit; Weißbach, Danny; Wilhein, Thomas
2018-07-01
With the advent of both modern X-ray fluorescence (XRF) methods and improved analytical reliability requirements the demand for suitable reference samples has increased. Especially in nanotechnology with the very low areal mass depositions, quantification becomes considerably more difficult. However, the availability of suited reference samples is drastically lower than the demand. Physical vapor deposition techniques have been enhanced significantly in the last decade driven by the need for extremely precise film parameters in multilayer production. We have applied those techniques for the development of layer-like reference samples with mass depositions in the ng-range and well below for Ca, Cu, Pb, Mo, Pd, Pb, La, Fe and Ni. Numerous other elements would also be possible. Several types of reference samples were fabricated: multi-elemental layer and extremely low (sub-monolayer) samples for various applications in XRF and total-reflection XRF analysis. Those samples were characterized and compared at three different synchrotron radiation beamlines at the BESSY II electron storage ring employing the reference-free XRF approach based on physically calibrated instrumentation. In addition, the homogeneity of the multi-elemental coatings was checked at the P04 beamline at DESY. The measurements demonstrate the high precision achieved in the manufacturing process as well as the versatility of application fields for the presented reference samples.
Spalla, S; Baffi, C; Barbante, C; Turetta, C; Turretta, C; Cozzi, G; Beone, G M; Bettinelli, M
2009-10-30
In recent years identification of the geographical origin of food has grown more important as consumers have become interested in knowing the provenance of the food that they purchase and eat. Certification schemes and labels have thus been developed to protect consumers and genuine producers from the improper use of popular brand names or renowned geographical origins. As the tomato is one of the major components of what is considered to be the healthy Mediterranean diet, it is important to be able to determine the geographical origin of tomatoes and tomato-based products such as tomato sauce. The aim of this work is to develop an analytical method to determine rare earth elements (REE) for the control of the geographic origin of tomatoes. The content of REE in tomato plant samples collected from an agricultural area in Piacenza, Italy, was determined, using four different digestion procedures with and without HF. Microwave dissolution with HNO3 + H2O2 proved to be the most suitable digestion procedure. Inductively coupled plasma quadrupole mass spectrometry (ICPQMS) and inductively coupled plasma sector field mass spectrometry (ICPSFMS) instruments, both coupled with a desolvation system, were used to determine the REE in tomato plants in two different laboratories. A matched calibration curve method was used for the quantification of the analytes. The method detection limits (MDLs) ranged from 0.03 ng g(-1) for Ho, Tm, and Lu to 2 ng g(-1) for La and Ce. The precision, in terms of relative standard deviation on six replicates, was good, with values ranging, on average, from 6.0% for LREE (light rare earth elements) to 16.5% for HREE (heavy rare earth elements). These detection limits allowed the determination of the very low concentrations of REE present in tomato berries. For the concentrations of REE in tomato plants, the following trend was observed: roots > leaves > stems > berries. Copyright 2009 John Wiley & Sons, Ltd.
Dong, Tao; Yu, Liang; Gao, Difeng; Yu, Xiaochen; Miao, Chao; Zheng, Yubin; Lian, Jieni; Li, Tingting; Chen, Shulin
2015-12-01
Accurate determination of fatty acid contents is routinely required in microalgal and yeast biofuel studies. A method of rapid in situ fatty acid methyl ester (FAME) derivatization directly from wet fresh microalgal and yeast biomass was developed in this study. This method does not require prior solvent extraction or dehydration. FAMEs were prepared with a sequential alkaline hydrolysis (15 min at 85 °C) and acidic esterification (15 min at 85 °C) process. The resulting FAMEs were extracted into n-hexane and analyzed using gas chromatography. The effects of each processing parameter (temperature, reaction time, and water content) upon the lipids quantification in the alkaline hydrolysis step were evaluated with a full factorial design. This method could tolerate water content up to 20% (v/v) in total reaction volume, which equaled up to 1.2 mL of water in biomass slurry (with 0.05-25 mg of fatty acid). There were no significant differences in FAME quantification (p>0.05) between the standard AOAC 991.39 method and the proposed wet in situ FAME preparation method. This fatty acid quantification method is applicable to fresh wet biomass of a wide range of microalgae and yeast species.
NASA Astrophysics Data System (ADS)
Tang, Qixiang; Owusu Twumasi, Jones; Hu, Jie; Wang, Xingwei; Yu, Tzuyang
2018-03-01
Structural steel members have become integral components in the construction of civil engineering infrastructure such as bridges, stadiums, and shopping centers due to the versatility of steel. Owing to the uniqueness of the design and construction of steel structures, rigorous non-destructive evaluation techniques are needed during construction and operation to prevent the loss of human lives and property. This research investigates the application of photoacoustic fiber optic transducers (FOT) for detecting surface rust on a steel rod. Surface ultrasonic wave propagation in intact and corroded steel rods was simulated using the finite element method (FEM). Radial displacements were collected and the short-time Fourier transform (STFT) was applied to obtain the spectrogram. It was found that the presence of surface rust between the FOT and the receiver can be detected in both the time and frequency domains. In addition, the spectrogram can be used to locate and quantify surface rust. Furthermore, a surface rust detection algorithm utilizing the FOT has been proposed for detection, location and quantification of the surface rust.
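A minimal sketch of the STFT-based processing described above is given below, using SciPy on a synthetic two-burst signal standing in for the simulated radial displacements; the sampling rate, window length, and band limits are assumptions, not values from the study.

```python
import numpy as np
from scipy.signal import stft

fs = 5e6                      # assumed sampling rate of the displacement record, Hz
t = np.arange(0, 200e-6, 1 / fs)

# Synthetic stand-in signal: a 1 MHz tone burst plus a weaker, delayed echo
# such as one scattered by a corroded patch.
signal = (np.sin(2 * np.pi * 1e6 * t) * np.exp(-((t - 20e-6) / 5e-6) ** 2)
          + 0.3 * np.sin(2 * np.pi * 1e6 * t) * np.exp(-((t - 80e-6) / 5e-6) ** 2))

f, tt, Zxx = stft(signal, fs=fs, nperseg=256, noverlap=192)
spectrogram = np.abs(Zxx)

# Crude arrival picker: time of maximum energy in the 0.8-1.2 MHz band.
band = (f > 0.8e6) & (f < 1.2e6)
arrivals = spectrogram[band].sum(axis=0)
print("strongest arrival at t =", tt[np.argmax(arrivals)], "s")
```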
[Progress in stable isotope labeled quantitative proteomics methods].
Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui
2013-06-01
Quantitative proteomics is an important research field in the post-genomics era. There are two strategies for proteome quantification: label-free methods and stable isotope labeling methods, the latter of which have become the most important strategy for quantitative proteomics at present. In the past few years, a number of quantitative methods have been developed, which support the rapid development of biological research. In this work, we discuss progress in stable isotope labeling methods for quantitative proteomics, including relative and absolute quantification, and then give our opinions on the outlook for proteome quantification methods.
Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDr), respectively. The determined biases and the RSDr values were less than 30 and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.
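As a hedged illustration of how such a conversion factor is typically applied, the helper below assumes the common convention in which Cf is the event/taxon copy-number ratio measured for the pure GM line (here the reported 1.24), so the sample's copy-number ratio divided by Cf gives an approximate weight-based GMO percentage; the copy numbers are illustrative, not data from the study.

```python
def gmo_percent(event_copies: float, taxon_copies: float, cf: float) -> float:
    """Convert a copy-number ratio to an approximate GMO weight percentage.

    Assumes the common convention in which Cf is the event/taxon copy-number
    ratio measured for the pure GM line, so dividing the sample's ratio by Cf
    yields a weight-based fraction.
    """
    return (event_copies / taxon_copies) / cf * 100.0

# Illustrative numbers only.
print(f"GMO content: {gmo_percent(310, 50_000, 1.24):.2f} %")
```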
Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2013-01-01
A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not MON89788. We then designed a screening method comprised of the combination of the quantification of P35S and the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMO.
Wang, Hongrui; Wang, Cheng; Wang, Ying; ...
2017-04-05
This paper presents a Bayesian approach using the Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies this method to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of the daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a relatively narrower credible interval than the MLE confidence interval and thus a more precise estimation by using the related information from regional gage stations. As a result, the Bayesian MCMC method might be more favorable in uncertainty analysis and risk management.
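A minimal random-walk Metropolis-Hastings sketch is shown below for a toy lognormal flow model with synthetic data; it illustrates only the accept/reject mechanics and credible-interval extraction, not the authors' hydrological model, priors, or data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observed" daily flow rates (stand-in data), modelled as lognormal.
data = rng.lognormal(mean=2.0, sigma=0.5, size=200)

def log_posterior(theta):
    """Log posterior (up to a constant) for (mu, log_sigma) of a lognormal model with flat priors."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)
    return np.sum(-np.log(data * sigma) - (np.log(data) - mu) ** 2 / (2 * sigma ** 2))

def metropolis_hastings(n_iter=20_000, step=0.05):
    theta = np.array([0.0, 0.0])
    lp = log_posterior(theta)
    chain = []
    for _ in range(n_iter):
        proposal = theta + rng.normal(scale=step, size=2)   # symmetric random walk
        lp_new = log_posterior(proposal)
        if np.log(rng.uniform()) < lp_new - lp:             # accept with prob min(1, ratio)
            theta, lp = proposal, lp_new
        chain.append(theta.copy())
    return np.array(chain)

chain = metropolis_hastings()[5_000:]        # discard burn-in
print("posterior mean of mu:", chain[:, 0].mean())
print("95% credible interval:", np.percentile(chain[:, 0], [2.5, 97.5]))
```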
NASA Astrophysics Data System (ADS)
García-Florentino, Cristina; Maguregui, Maite; Marguí, Eva; Torrent, Laura; Queralt, Ignasi; Madariaga, Juan Manuel
2018-05-01
In this work, a Total Reflection X-ray Fluorescence (TXRF) spectrometry based quantitative methodology is proposed for elemental characterization of liquid extracts and solids belonging to old building materials and their degradation products from an early-20th-century building of high historic and cultural value in Getxo (Basque Country, North of Spain). This quantification strategy can be considered a faster methodology compared to traditional Energy or Wavelength Dispersive X-ray Fluorescence (ED-XRF and WD-XRF) spectrometry based methodologies or other techniques such as Inductively Coupled Plasma Mass Spectrometry (ICP-MS). In particular, two kinds of liquid extracts were analysed: (i) water-soluble extracts from different mortars and (ii) acid extracts from mortars, black crusts, and calcium carbonate formations. In order to avoid the acid extraction step for the materials and their degradation products, direct TXRF measurement of powdered solid suspensions in water was also studied. With this aim, different parameters such as the deposition volume and the measuring time were studied for each kind of sample. Depending on the quantified element, the limits of detection achieved with the TXRF quantitative methodologies for liquid extracts and solids were around 0.01-1.2 and 2-200 mg/L, respectively. The quantification of K, Ca, Ti, Mn, Fe, Zn, Rb, Sr, Sn and Pb in the liquid extracts proved to be a faster alternative to other more classic quantification techniques (i.e. ICP-MS), accurate enough to obtain information about the composition of the acid-soluble part of the materials and their degradation products. Regarding the solid samples measured as suspensions, it was quite difficult to obtain stable and reproducible suspensions, which affected the accuracy of the results. To cope with this problem, correction factors based on the quantitative results obtained using ED-XRF were calculated to improve the accuracy of the TXRF results.
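TXRF quantification of liquid deposits commonly relies on an internal standard and tabulated relative sensitivities; the generic relation below is a textbook-style sketch offered for context, not the calibration actually used in the study, and the intensities, sensitivities, and spike concentration are illustrative.

```python
def txrf_concentration(n_analyte, n_is, s_analyte, s_is, c_is):
    """Generic TXRF internal-standard relation (a common textbook form):

        C_analyte = C_IS * (N_analyte / N_IS) * (S_IS / S_analyte)

    where N are net peak intensities, S are relative instrumental
    sensitivities, and C_IS is the spiked internal-standard concentration.
    """
    return c_is * (n_analyte / n_is) * (s_is / s_analyte)

# Illustrative values: Pb quantified against a Ga internal standard spiked at 10 mg/L.
pb = txrf_concentration(n_analyte=1.8e4, n_is=5.2e4, s_analyte=1.15, s_is=1.0, c_is=10.0)
print(f"Pb: {pb:.2f} mg/L")
```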
Hagendorfer, Harald; Kaegi, Ralf; Traber, Jacqueline; Mertens, Stijn F L; Scherrers, Roger; Ludwig, Christian; Ulrich, Andrea
2011-11-14
In this work we discuss the method development, applicability and limitations of an asymmetric flow field flow fractionation (A4F) system in combination with a multi-detector setup consisting of UV/vis, light scattering, and inductively coupled plasma mass spectrometry (ICPMS). The overall aim was to obtain a size-dependent, element-specific, and quantitative method appropriate for the characterization of metallic engineered nanoparticle (ENP) dispersions. Thus, systematic investigations of crucial method parameters were performed by employing well characterized Au nanoparticles (Au-NPs) as a defined model system. For good separation performance, the A4F flow, membrane, and carrier conditions were optimized. To obtain reliable size information, the use of laser light scattering based detectors was evaluated; an online dynamic light scattering (DLS) detector showed good results for the investigated Au-NPs up to a size of 80 nm in hydrodynamic diameter. To accommodate the large sensitivity differences of the various detectors, as well as to guarantee long-term stability and minimum contamination of the mass spectrometer, a split-flow concept for coupling ICPMS was evaluated. To test for reliable quantification, the ICPMS signal response of ionic Au standards was compared to that of Au-NPs. Using proper stabilization with surfactants, no difference was observed for concentrations of 1-50 μg Au L(-1) in the size range from 5 to 80 nm for citrate-stabilized dispersions. However, studies using different A4F channel membranes showed unspecific particle-membrane interactions resulting in retention time shifts and unspecific loss of nanoparticles, depending on the Au-NP system as well as the membrane batch and type. Thus, reliable quantification and discrimination of ionic and particulate species was performed using ICPMS in combination with ultracentrifugation instead of direct quantification with the A4F multi-detector setup. Figures of merit were obtained by comparing the results from the multi-detector approach outlined above with results from batch-DLS and transmission electron microscopy (TEM). Furthermore, validation performed with certified NIST Au-NPs showed excellent agreement. The developed methods show potential for the characterization of other commonly used and important metallic engineered nanoparticles. Copyright © 2011 Elsevier B.V. All rights reserved.
Lowering the quantification limit of the QubitTM RNA HS assay using RNA spike-in.
Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev
2015-05-06
RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry and even the much more sensitive fluorescence-based RNA quantification assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked-in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as RNA-specificity in this range, and compared to those of RiboGreen(®), another sensitive fluorescence-based RNA quantification assay. We then applied Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL while maintaining high specificity to RNA. This enabled quantification of RNA with original concentration as low as 55.6 pg/μL compared to 250 pg/μL for the standard assay and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
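The reading-increase arithmetic is simple enough to capture in a one-line helper; the sketch below illustrates the subtraction over the spike-in baseline, with hypothetical names and an optional dilution factor that are not part of the assay's official workflow.

```python
def rna_concentration(reading_with_spike, spike_baseline, dilution_factor=1.0):
    """Estimate a trace RNA concentration as the reading increase over the
    spike-in baseline (both in pg/uL of the assay tube), optionally scaled
    by a sample dilution factor.  Illustrative helper only; names and the
    dilution handling are assumptions.
    """
    return (reading_with_spike - spike_baseline) * dilution_factor

# Example: 25 pg/uL spike-in baseline, 31 pg/uL read after adding the sample.
print(f"{rna_concentration(31.0, 25.0):.1f} pg/uL above baseline")
```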
Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios
2011-01-19
Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food.
Pocock, Tessa; Król, Marianna; Huner, Norman P A
2004-01-01
Chlorophylls and carotenoids are functionally important pigment molecules in photosynthetic organisms. Methods for the determination of chlorophylls a and b, beta-carotene, neoxanthin, and the pigments that are involved in photoprotective cycles such as the xanthophylls are discussed. These cycles involve the reversible de-epoxidation of violaxanthin into antheraxanthin and zeaxanthin, as well as the reversible de-epoxidation of lutein-5,6-epoxide into lutein. This chapter describes pigment extraction procedures from higher plants and green algae. Methods for the determination and quantification using high-performance liquid chromatography (HPLC) are described as well as methods for the separation and purification of pigments for use as standards using thin-layer chromatography (TLC). In addition, several spectrophotometric methods for the quantification of chlorophylls a and b are described.
Metering error quantification under voltage and current waveform distortion
NASA Astrophysics Data System (ADS)
Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran
2017-09-01
With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion results in metering errors in smart meters. Because of the negative effects on metering accuracy and fairness, the combined energy metering error is an important subject of study. In this paper, after comparing theoretical metering values with actual recorded values under different meter modes for linear and nonlinear loads, a method for quantifying the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a method for quantifying the metering accuracy error is also proposed. By analyzing the mode error and the accuracy error, a comprehensive error analysis method is presented which is suitable for renewable energy and nonlinear loads. The proposed method has been verified by simulation.
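One common way to expose a metering mode error under distortion is to compare the active power registered from the fundamental alone with the wide-band active power summed over harmonics; the sketch below does exactly that with illustrative harmonic voltages, currents, and phase angles, and is not the paper's specific error model.

```python
import numpy as np

# Harmonic orders, RMS voltages (V), RMS currents (A) and phase angles (rad)
# for one distorted operating point -- illustrative values only.
orders = np.array([1, 3, 5, 7])
v_rms  = np.array([230.0, 9.2, 6.9, 3.5])
i_rms  = np.array([10.0, 2.1, 1.4, 0.6])
phi    = np.array([0.10, 0.80, 1.10, 1.30])

p_harmonic = v_rms * i_rms * np.cos(phi)   # active power carried by each harmonic
p_total = p_harmonic.sum()                 # wide-band metering reference
p_fundamental = p_harmonic[0]              # fundamental-only metering mode

mode_error = (p_fundamental - p_total) / p_total * 100.0
print(f"total P = {p_total:.1f} W, fundamental-only P = {p_fundamental:.1f} W")
print(f"relative metering mode error = {mode_error:.2f} %")
```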
A simple and fast method for extraction and quantification of cryptophyte phycoerythrin.
Thoisen, Christina; Hansen, Benni Winding; Nielsen, Søren Laurentius
2017-01-01
The microalgal pigment phycoerythrin (PE) is of commercial interest as a natural colorant in food and cosmetics, as well as a fluoroprobe for laboratory analysis. Several methods for extraction and quantification of PE are available, but they typically require various extraction buffers, repeated freeze-thaw cycles and liquid nitrogen, making the extraction procedures more complicated. A simple method for extraction of PE from cryptophytes is described using standard laboratory materials and equipment. The cryptophyte cells on the filters were disrupted at -80 °C, phosphate buffer was added for extraction at 4 °C, and the extract was measured by absorbance. The cryptophyte Rhodomonas salina was used as a model organism. • Simple method for extraction and quantification of phycoerythrin from cryptophytes. • Minimal usage of equipment and chemicals, and low labor costs. • Applicable for industrial and biological purposes.
Pre-concentration technique for reduction in "Analytical instrument requirement and analysis"
NASA Astrophysics Data System (ADS)
Pal, Sangita; Singha, Mousumi; Meena, Sher Singh
2018-04-01
Dependence on analytical instruments for the methodical detection of known and unknown effluents poses a serious hindrance to qualification and quantification. Analytical instruments such as elemental analyzers, ICP-MS, ICP-AES, EDXRF, ion chromatography and electro-analytical instruments are not only expensive but also time-consuming and require maintenance and replacement of damaged essential parts, which is a serious concern. Moreover, for field studies and instant detection, installation of these instruments is not convenient at every location. Therefore, a technique based on pre-concentration of metal ions, especially for lean streams, is elaborated and justified. Chelation/sequestration is the key to the immobilization technique, which is simple, user-friendly, highly effective, inexpensive, time-efficient and easy to carry (10 g - 20 g vial) to the experimental field/site, as has been demonstrated.
Fiber-Optic Defect and Damage Locator System for Wind Turbine Blades
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr. Vahid Sotoudeh; Dr. Richard J. Black; Dr. Behzad Moslehi
2010-10-30
IFOS in collaboration with Auburn University demonstrated the feasibility of a Fiber Bragg Grating (FBG) integrated sensor system capable of providing real time in-situ defect detection, localization and quantification of damage. In addition, the system is capable of validating wind turbine blade structural models, using recent advances in non-contact, non-destructive dynamic testing of composite structures. This new generation method makes it possible to analyze wind turbine blades not only non-destructively, but also without physically contacting or implanting intrusive electrical elements and transducers into the structure. Phase I successfully demonstrated the feasibility of the technology with the construction of a 1.5 kHz sensor interrogator and preliminary instrumentation and testing of both composite material coupons and a wind turbine blade.
Menéndez-Miranda, Mario; Encinar, Jorge Ruiz; Costa-Fernández, José M; Sanz-Medel, Alfredo
2015-11-27
Hyphenation of asymmetric flow field-flow fractionation (AF4) to on-line elemental detection (inductively coupled plasma-mass spectrometry, ICP-MS) is proposed as a powerful diagnostic tool for quantum dot bioconjugation studies. In particular, the conjugation effectiveness between a "model" monoclonal IgG antibody (Ab) and CdSe/ZnS core-shell Quantum Dots (QDs), surface-coated with an amphiphilic polymer, has been monitored here by this hybrid AF4-ICP-MS technique. Experimental conditions were optimized to achieve proper separation of the sought bioconjugates from any excess free reagents (QDs and antibodies) employed during the bioconjugation. The composition and pH of the carrier were found to be critical parameters for ensuring efficient separation while maintaining high species recovery from the AF4 channel. An ICP-MS equipped with a triple quadrupole was selected as the elemental detector to enable sensitive and reliable simultaneous quantification of the elemental constituents, including sulfur, of the nanoparticulate species and the antibody. The hyphenated technique provided nanoparticle size-based separation, elemental detection, and composition analysis capabilities that turned out to be instrumental for investigating the Ab-QD bioconjugation process in depth. Moreover, the analytical strategy proposed here allowed us not only to clearly identify the bioconjugation reaction products but also to quantify the nanoparticle:antibody bioconjugation efficiency. This is a key issue for the future development of analytical and bioanalytical photoluminescent QD applications. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Takahashi, Tomoko; Thornton, Blair
2017-12-01
This paper reviews methods to compensate for matrix effects and self-absorption during quantitative analysis of compositions of solids measured using Laser Induced Breakdown Spectroscopy (LIBS) and their applications to in-situ analysis. Methods to reduce matrix and self-absorption effects on calibration curves are first introduced. The conditions where calibration curves are applicable to quantification of compositions of solid samples and their limitations are discussed. While calibration-free LIBS (CF-LIBS), which corrects matrix effects theoretically based on the Boltzmann distribution law and Saha equation, has been applied in a number of studies, requirements need to be satisfied for the calculation of chemical compositions to be valid. Also, peaks of all elements contained in the target need to be detected, which is a bottleneck for in-situ analysis of unknown materials. Multivariate analysis techniques are gaining momentum in LIBS analysis. Among the available techniques, principal component regression (PCR) analysis and partial least squares (PLS) regression analysis, which can extract related information to compositions from all spectral data, are widely established methods and have been applied to various fields including in-situ applications in air and for planetary explorations. Artificial neural networks (ANNs), where non-linear effects can be modelled, have also been investigated as a quantitative method and their applications are introduced. The ability to make quantitative estimates based on LIBS signals is seen as a key element for the technique to gain wider acceptance as an analytical method, especially in in-situ applications. In order to accelerate this process, it is recommended that the accuracy should be described using common figures of merit which express the overall normalised accuracy, such as the normalised root mean square errors (NRMSEs), when comparing the accuracy obtained from different setups and analytical methods.
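As a minimal example of the multivariate route described above, the sketch below trains a PLS regression on synthetic "spectra" and reports a normalised RMSE per element; the data generation, component count, and error-metric details are assumptions for illustration only.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)

# Synthetic stand-in data: 120 "spectra" of 500 channels whose intensities
# depend (noisily) on two element concentrations.
concentrations = rng.uniform(0, 10, size=(120, 2))
basis = rng.normal(size=(2, 500))
spectra = concentrations @ basis + rng.normal(scale=0.5, size=(120, 500))

x_tr, x_te, y_tr, y_te = train_test_split(spectra, concentrations, random_state=0)

pls = PLSRegression(n_components=5)
pls.fit(x_tr, y_tr)
y_pred = pls.predict(x_te)

# Normalised RMSE per element, one common figure of merit mentioned above.
nrmse = np.sqrt(np.mean((y_pred - y_te) ** 2, axis=0)) / (y_te.max(axis=0) - y_te.min(axis=0))
print("NRMSE per element:", nrmse)
```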
Witte, Anna Kristina; Fister, Susanne; Mester, Patrick; Schoder, Dagmar; Rossmanith, Peter
2016-11-01
Fast and reliable pathogen detection is an important issue for human health. Since conventional microbiological methods are rather slow, there is growing interest in detection and quantification using molecular methods. The droplet digital polymerase chain reaction (ddPCR) is a relatively new PCR method for absolute and accurate quantification without external standards. Using the Listeria monocytogenes specific prfA assay, we focused on the questions of whether the assay was directly transferable to ddPCR and whether ddPCR was suitable for samples derived from heterogeneous matrices, such as foodstuffs that often included inhibitors and a non-target bacterial background flora. Although the prfA assay showed suboptimal cluster formation, use of ddPCR for quantification of L. monocytogenes from pure bacterial cultures, artificially contaminated cheese, and naturally contaminated foodstuff was satisfactory over a relatively broad dynamic range. Moreover, results demonstrated the outstanding detection limit of one copy. However, while poorer DNA quality, such as resulting from longer storage, can impair ddPCR, internal amplification control (IAC) of prfA by ddPCR, that is integrated in the genome of L. monocytogenes ΔprfA, showed even slightly better quantification over a broader dynamic range. Graphical Abstract Evaluating the absolute quantification potential of ddPCR targeting Listeria monocytogenes prfA.
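Absolute quantification in ddPCR rests on Poisson statistics over the droplet counts; the helper below shows that standard calculation (λ = -ln(1 - p) copies per droplet), with the droplet volume treated as an assumed instrument constant rather than a value reported in the study.

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Absolute target concentration from droplet counts via Poisson statistics.

    lambda = -ln(1 - p) copies per droplet, with p the positive-droplet
    fraction; dividing by the droplet volume gives copies per microlitre.
    The droplet volume is an assumed instrument constant.
    """
    p = positive / total
    lam = -math.log(1.0 - p)                  # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)   # copies per uL of reaction mix

print(f"{ddpcr_copies_per_ul(positive=2400, total=15000):.0f} copies/uL")
```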
Gaubert, Alexandra; Jeudy, Jérémy; Rougemont, Blandine; Bordes, Claire; Lemoine, Jérôme; Casabianca, Hervé; Salvador, Arnaud
2016-07-01
In a stricter legislative context, greener detergent formulations are being developed. In this way, synthetic surfactants are frequently replaced by bio-sourced surfactants and/or used at lower concentrations in combination with enzymes. In this paper, an LC-MS/MS method was developed for the identification and quantification of enzymes in laundry detergents. Prior to the LC-MS/MS analyses, a specific sample preparation protocol was developed because of the matrix complexity (high surfactant percentages). Then, for each enzyme family mainly used in detergent formulations (protease, amylase, cellulase, and lipase), specific peptides were identified on a high-resolution platform. An LC-MS/MS method was then developed in selected reaction monitoring (SRM) MS mode for the light and corresponding heavy peptides. The method was linear over the peptide concentration ranges 25-1000 ng/mL for protease, lipase, and cellulase; 50-1000 ng/mL for amylase; and 5-1000 ng/mL for cellulase in both water and laundry detergent matrices. The application of the developed analytical strategy to real commercial laundry detergents enabled enzyme identification and absolute quantification. For the first time, identification and absolute quantification of enzymes in laundry detergent were realized by LC-MS/MS in a single run. Graphical Abstract Identification and quantification of enzymes by LC-MS/MS.
Paul B. Alaback; Duncan C. Lutes
1997-01-01
Methods for the quantification of coarse woody debris volume and the description of spatial patterning were studied in the Tenderfoot Creek Experimental Forest, Montana. The line transect method was found to be an accurate, unbiased estimator of down debris volume (>10 cm diameter) on 1/4 hectare fixed-area plots, when perpendicular lines were used. The Fischer...
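For context, the line-intersect (Van Wagner) estimator referred to above has a convenient closed form; the sketch below applies it with illustrative piece diameters and transect length, and is a generic textbook formulation rather than the study's exact protocol.

```python
import math

def cwd_volume_m3_per_ha(diameters_cm, transect_length_m):
    """Line-intersect (Van Wagner) estimator of downed coarse woody debris volume.

    V = pi^2 * sum(d_i^2) / (8 * L), which conveniently gives m^3/ha when the
    piece diameters d_i are in cm and the total transect length L is in m.
    """
    return math.pi ** 2 * sum(d ** 2 for d in diameters_cm) / (8.0 * transect_length_m)

# Pieces >10 cm in diameter crossed along 100 m of transects (illustrative data).
print(f"{cwd_volume_m3_per_ha([12, 18, 25, 11, 30], 100.0):.1f} m^3/ha")
```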
Recent advances on aptamer-based biosensors to detection of platelet-derived growth factor.
Razmi, Nasrin; Baradaran, Behzad; Hejazi, Maryam; Hasanzadeh, Mohammad; Mosafer, Jafar; Mokhtarzadeh, Ahad; de la Guardia, Miguel
2018-08-15
Platelet-derived growth factor (PDGF-BB), a significant serum cytokine, is an important protein biomarker for the diagnosis and recognition of cancer and is directly involved in the progression of various cell transformations, including tumor growth and development. Fibrosis and atherosclerosis are other serious diseases with which PDGF-BB is associated. Generally, the expression level of PDGF-BB increases in life-threatening human tumors, serving as an indicator of tumor angiogenesis. Thus, identification and quantification of PDGF-BB in biomedical fields are particularly important. Affinity chromatography, immunohistochemical methods and enzyme-linked immunosorbent assay (ELISA), the conventional methods for PDGF-BB detection, require costly and complicated instrumentation, take too much time, and offer insufficient sensitivity and selectivity, which restricts their usage in real applications. Hence, it is essential to design and build enhanced systems and platforms for the recognition and quantification of protein biomarkers. In the past few years, biosensors, especially aptasensors, have received noticeable attention for the detection of PDGF-BB owing to their high sensitivity, selectivity, accuracy, fast response, and low cost, and the role and importance of developing aptasensors in cancer diagnosis is undeniable. In this review, optical and electrochemical aptasensors, which have been applied by many researchers for PDGF-BB cancer biomarker detection, are presented and their merits and demerits explained and compared. Efforts related to the design and development of aptamer-based biosensors using nanoparticles for sensitive and selective detection of PDGF-BB are reviewed, considering the importance of aptamers as recognition elements, and the principles, applications, and recent improvements and developments of aptamer-based optical and electrochemical methods. In addition, commercial biosensors and future perspectives for rapid and on-site detection of PDGF-BB are summarized. Copyright © 2018 Elsevier B.V. All rights reserved.
Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md.; Singh Rawat, Ajay Kumar; Srivastava, Sharad
2017-01-01
Background: Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to its high content of colchicine alkaloids. Objective: This study aimed to develop an easy, cheap, precise, and accurate validated high-performance thin-layer chromatographic (HPTLC) method for simultaneous quantification of the bioactive alkaloids colchicine and gloriosine in G. superba L. and to identify elite chemotype(s) from the Sikkim Himalayas (India). Methods: The HPTLC method was developed using a mobile phase of chloroform:acetone:diethylamine (5:4:1) at a λmax of 350 nm. Results: Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) varies from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Linearity of the method was obtained in the concentration range of 100-400 ng/spot of the markers, exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with recoveries of 97.79% ± 3.86% and 100.02% ± 0.01%, respectively. The limits of detection and quantification were 6.245 and 18.926 ng for colchicine and 8.024 and 24.316 ng for gloriosine, respectively. Two germplasms, NBG-27 and NBG-26, were found to be elite chemotypes for both markers. Conclusion: The developed method is validated in terms of accuracy, recovery, and precision as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant for exploring chemotypic variability in metabolite content for commercial and medicinal purposes. PMID:29142436
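For readers unfamiliar with how such detection and quantification limits are commonly derived, the sketch below applies the widely used ICH-style formulas LOD = 3.3σ/S and LOQ = 10σ/S to a hypothetical HPTLC calibration; the peak areas are invented and this is not necessarily the authors' exact computation.

```python
# Illustrative LOD/LOQ calculation from a calibration line (hypothetical data):
# sigma is the residual standard deviation of the fit, S is the slope.
import numpy as np

amount_ng = np.array([100, 200, 300, 400], dtype=float)      # ng/spot, as in the reported range
peak_area = np.array([1020, 2050, 3010, 4080], dtype=float)  # hypothetical densitometric responses

slope, intercept = np.polyfit(amount_ng, peak_area, 1)
residuals = peak_area - (slope * amount_ng + intercept)
sigma = residuals.std(ddof=2)        # residual standard deviation of the regression

lod = 3.3 * sigma / slope            # limit of detection
loq = 10.0 * sigma / slope           # limit of quantification
print(f"LOD ~ {lod:.1f} ng, LOQ ~ {loq:.1f} ng")
```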
Bihan, Kevin; Sauzay, Chloé; Goldwirt, Lauriane; Charbonnier-Beaupel, Fanny; Hulot, Jean-Sebastien; Funck-Brentano, Christian; Zahr, Noël
2015-02-01
Vemurafenib (Zelboraf) is a new tyrosine kinase inhibitor that selectively targets the activated BRAF V600E mutant and is indicated for the treatment of advanced BRAF mutation-positive melanoma. We developed a simple method for vemurafenib quantification using liquid chromatography-tandem mass spectrometry. A stability study of vemurafenib in human plasma was also performed. (13)C(6)-vemurafenib was used as the internal standard. A single-step protein precipitation was used for plasma sample preparation. Chromatography was performed on an Acquity UPLC system (Waters) with chromatographic separation on an Acquity UPLC BEH C18 column (2.1 × 50 mm, 1.7-µm particle size; Waters). Quantification was performed using multiple reaction monitoring of the following transitions: m/z 488.2 → 381.0 for vemurafenib and m/z 494.2 → 387.0 for the internal standard. The method was linear over the range from 1.0 to 100.0 mcg/mL. The lower limit of quantification was 0.1 mcg/mL for vemurafenib in plasma. Vemurafenib remained stable for 1 month at all levels tested, whether stored at room temperature (20 °C), +4 °C, or -20 °C. The method was used successfully to perform a plasma pharmacokinetic study of vemurafenib in a patient after oral administration at steady state. This liquid chromatography-tandem mass spectrometry method for vemurafenib quantification in human plasma is simple, rapid, specific, sensitive, accurate, precise, and reliable.
Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer
NASA Astrophysics Data System (ADS)
Schulte, Horst
2016-09-01
A new method for quantifying uncertain models for robust wind turbine control using sliding-mode techniques is presented, with the objective of improving active load mitigation. The approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term that establishes and maintains motion on a so-called sliding surface. The injection signal is evaluated directly to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four-degree-of-freedom model of the NREL 5 MW reference turbine containing uncertainties.
Mapping the seafloor, with end users in mind
NASA Astrophysics Data System (ADS)
Lecours, V.
2017-12-01
In the last 25 years, as more seafloor data and user-friendly analysis tools have become available, the amount and diversity of applications making use of such data have considerably increased. While limitations in the utility of the data caused by the data collection and processing methods may be quite apparent to experts, such limitations may be less obvious to users with different backgrounds and expertise. For instance, it has been acknowledged many times in the literature that seafloor data are often treated as true representations of the seafloor rather than as models. This lack of understanding brings hidden dangers to unsuspecting end users misusing data, which may result in misleading outcomes or conclusions for applications such as marine geomorphology, marine habitat mapping, marine conservation, and management of marine resources. In this paper, I identify common practices of both data producers and users that can prevent a proper use of seafloor data. Using seafloor data from a variety of locations and sources, I demonstrate how the choice of sounding interpolator, elements of data quality, scale alterations, and backscatter representation can impact applications. I show how these elements propagate throughout analyses and directly influence outcomes, sometimes in predictable ways (e.g., in marine geomorphology) and sometimes in unpredictable ways (e.g., in marine habitat mapping). Regardless of the final use of seafloor data, better and more transparent error and uncertainty quantification and representation should be implemented at the data collection, processing, and analysis levels. Complete metadata should always be documented, with elements related to data provenance, survey, scale, error and uncertainty quantification, and any other information relevant to further use of seafloor data, in order to create a community of users aware of data quality and limitations. As the number of applications using seafloor data increases, some of the fundamental issues associated with the nature of the data are not being addressed quickly or broadly enough, increasing the risk of misuse. There is a need to reunite data collectors and users to fulfill the potential of seafloor data for different applications.
NASA Astrophysics Data System (ADS)
Buongiorno, J.; Lloyd, K. G.; Shumaker, A.; Schippers, A.; Webster, G.; Weightman, A.; Turner, S.
2015-12-01
Nearly 75% of the Earth's surface is covered by marine sediment that is home to an estimated 2.9 × 10^29 microbial cells. A substantial impediment to understanding the abundance and distribution of cells within marine sediment is the lack of a consistent and reliable method for their taxon-specific quantification. Catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH) provides taxon-specific enumeration, but this process requires passing a large enzyme through cell membranes, decreasing its precision relative to general cell counts using a small DNA stain. In 2015, Yamaguchi et al. developed FISH with hybridization chain reaction (FISH-HCR) as an in situ whole-cell detection method for environmental microorganisms. FISH-HCR amplifies the fluorescent signal, as does CARD-FISH, but it allows for milder cell permeabilization methods that might prevent yield loss. To compare FISH-HCR to CARD-FISH, we examined bacterial and archaeal cell counts within two sediment cores, Lille Belt (~78 meters deep) and Landsort Deep (90 meters deep), which were retrieved from the Baltic Sea Basin during IODP Expedition 347. Preliminary analysis shows that CARD-FISH counts are below the quantification limit for most depths across both cores. By contrast, quantification of cells was possible with FISH-HCR at all examined depths. When quantification with CARD-FISH was above the limit of detection, counts with FISH-HCR were up to 11-fold higher for Bacteria and 3-fold higher for Archaea from the same sediment sample. Further, FISH-HCR counts closely follow the trends of the shipboard counts, indicating that FISH-HCR may better reflect cellular abundance within marine sediment than other quantification methods, including qPCR. Using FISH-HCR, we found that archaeal cell counts were on average greater than bacterial cell counts, but within the same order of magnitude.
Eriksen, Jane N; Madsen, Pia L; Dragsted, Lars O; Arrigoni, Eva
2017-02-01
An improved UHPLC-DAD-based method was developed and validated for quantification of major carotenoids present in spinach, serum, chylomicrons, and feces. Separation was achieved with gradient elution within 12.5 min for six dietary carotenoids and the internal standard, echinenone. The proposed method provides, for all standard components, resolution > 1.1, linearity covering the target range (R > 0.99), LOQ < 0.035 mg/L, and intraday and interday RSDs < 2 and 10%, respectively. Suitability of the method was tested on biological matrices. Method precision (RSD%) for carotenoid quantification in serum, chylomicrons, and feces was below 10% for intra- and interday analysis, except for lycopene. Method accuracy was consistent with mean recoveries ranging from 78.8 to 96.9% and from 57.2 to 96.9% for all carotenoids, except for lycopene, in serum and feces, respectively. Additionally, an interlaboratory validation study on spinach at two institutions showed no significant differences in lutein or β-carotene content, when evaluated on four occasions.
Nahar, Limon Khatun; Cordero, Rosa Elena; Nutt, David; Lingford-Hughes, Anne; Turton, Samuel; Durant, Claire; Wilson, Sue; Paterson, Sue
2016-01-01
Abstract A highly sensitive and fully validated method was developed for the quantification of baclofen in human plasma. After adjusting the pH of the plasma samples using a phosphate buffer solution (pH 4), baclofen was purified using mixed mode (C8/cation exchange) solid-phase extraction (SPE) cartridges. Endogenous water-soluble compounds and lipids were removed from the cartridges before the samples were eluted and concentrated. The samples were analyzed using triple-quadrupole liquid chromatography–tandem mass spectrometry (LC–MS-MS) with triggered dynamic multiple reaction monitoring mode for simultaneous quantification and confirmation. The assay was linear from 25 to 1,000 ng/mL (r2 > 0.999; n = 6). Intraday (n = 6) and interday (n = 15) imprecisions (% relative standard deviation) were <5%, and the average recovery was 30%. The limit of detection of the method was 5 ng/mL, and the limit of quantification was 25 ng/mL. Plasma samples from healthy male volunteers (n = 9, median age: 22) given two single oral doses of baclofen (10 and 60 mg) on nonconsecutive days were analyzed to demonstrate method applicability. PMID:26538544
USDA-ARS?s Scientific Manuscript database
The semi-metallic mineral Se, a naturally-occurring trace element, is primarily found as selenate originating from sedimentary and shale rock formations, e.g., in the western side of the San Joaquin Valley of central California (WSJV). Because selenate-Se is water soluble, bioavailable and biomagnif...
Arashida, Naoko; Nishimoto, Rumi; Harada, Masashi; Shimbo, Kazutaka; Yamada, Naoyuki
2017-02-15
Amino acids and their related metabolites play important roles in various physiological processes and have consequently become biomarkers for diseases. However, accurate quantification methods have only been established for major compounds, such as amino acids and a limited number of target metabolites. We previously reported a highly sensitive high-throughput method for the simultaneous quantification of amines using 3-aminopyridyl-N-succinimidyl carbamate as a derivatization reagent combined with liquid chromatography-tandem mass spectrometry (LC-MS/MS). Herein, we report the successful development of a practical and accurate LC-MS/MS method to analyze low concentrations of 40 physiological amines in 19 min. Thirty-five of these amines showed good linearity, limits of quantification, accuracy, precision, and recovery characteristics in plasma, with scheduled selected reaction monitoring acquisitions. Plasma samples from 10 healthy volunteers were evaluated using our newly developed method. The results revealed that 27 amines were detected in one of the samples, and that 24 of these compounds could be quantified. Notably, this new method successfully quantified metabolites with high accuracy across three orders of magnitude, with lowest and highest averaged concentrations of 31.7 nM (for spermine) and 18.3 μM (for α-aminobutyric acid), respectively. Copyright © 2016 Elsevier B.V. All rights reserved.
Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana
2017-10-25
In this paper we show the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics: piperacillin-tazobactam, meropenem, linezolid and ceftazidime in 10 μL human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of impact of hematocrit (HCT) on accuracy, reproducibility, recovery and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37 °C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise and reproducible. In contrast to DBS, it allows an accurate quantification without any HCT influence. It has been applied to samples derived from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and consequently therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.
GMO quantification: valuable experience and insights for the future.
Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana
2014-10-01
Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.
Determination of fluorine by total reflection X-ray fluorescence spectrometry
NASA Astrophysics Data System (ADS)
Tarsoly, G.; Óvári, M.; Záray, Gy.
2010-04-01
There is growing interest in the determination of low-Z elements, i.e. carbon to phosphorus, in various samples. Total reflection X-ray fluorescence spectrometry (TXRF) is already established as a suitable trace-element analytical method with low sample demand and quite good quantification limits. Recently, the determinable element range was extended down to Z = 6 (carbon). In this study, the analytical performance of total reflection X-ray fluorescence spectrometry for the determination of fluorine was investigated using a spectrometer equipped with a Cr-anode X-ray tube, a multilayer monochromator, a vacuum chamber, and a silicon drift detector (SDD) with an ultra-thin window. The detection limit for fluorine was found to be 5 mg L-1 (equivalent to 10 ng absolute) in aqueous matrix. The linear range of the fluorine determination is between 15 and 500 mg L-1; within this range the precision is below 10%. The matrix effects of the other halogens (chlorine, bromine and iodine) and of sulfate were also investigated. It was established that the upper allowed concentration limit of these interfering species is 100, 200, 50 and 100 mg L-1 for Cl, Br, I and sulfate, respectively. Moreover, the role of pre-siliconization of the quartz carrier plate was investigated. It was found that the presence of the silicone results in poorer analytical performance, which can be explained by the thicker sample residue and stronger self-absorption of the fluorescent radiation.
Bostijn, N; Hellings, M; Van Der Veen, M; Vervaet, C; De Beer, T
2018-07-12
Ultraviolet (UV) spectroscopy was evaluated as an innovative process analytical technology (PAT) tool for the in-line and real-time quantitative determination of low-dosed active pharmaceutical ingredients (APIs) in a semi-solid (gel) and a liquid (suspension) pharmaceutical formulation during their batch production processes. The performance of this new PAT tool (i.e., UV spectroscopy) was compared with an already more established PAT method based on Raman spectroscopy. In-line UV measurements were carried out with an immersion probe, while for the Raman measurements a non-contact PhAT probe was used. For both studied formulations, an in-line API quantification model was developed and validated per spectroscopic technique. The known API concentrations (Y) were correlated with the corresponding in-line collected preprocessed spectra (X) through partial least squares (PLS) regression. Each developed quantification method was validated by calculating the accuracy profile on the basis of the validation experiments. Furthermore, the measurement uncertainty was determined from the data generated for the accuracy profiles. From the accuracy profiles of the UV- and Raman-based quantification methods for the gel, it was concluded that at the target API concentration of 2% (w/w), 95 out of 100 future routine measurements given by the Raman method will not deviate more than 10% (relative error) from the true API concentration, whereas for the UV method the acceptance limits of 10% were exceeded. For the liquid formulation, the Raman method was not able to quantify the API in the low-dosed suspension (0.09% (w/w) API). In contrast, the in-line UV method was able to adequately quantify the API in the suspension. This study demonstrated that UV spectroscopy can be adopted as a novel in-line PAT technique for low-dose quantification purposes in pharmaceutical processes. Importantly, neither of the two spectroscopic techniques was superior to the other for both formulations: the Raman method was more accurate in quantifying the API in the gel (2% (w/w) API), while the UV method performed better for API quantification in the suspension (0.09% (w/w) API). Copyright © 2018 Elsevier B.V. All rights reserved.
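To make the X-Y correlation step concrete, here is a minimal sketch of a PLS calibration of the kind described above, using simulated spectra in place of the in-line UV or Raman data; the concentration range, noise level and number of latent variables are assumptions, not values from the study.

```python
# Sketch of a PLS calibration: regress API concentration (Y) on preprocessed spectra (X).
# The spectra below are simulated stand-ins for in-line probe data.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 40, 200
concentration = rng.uniform(0.05, 3.0, n_samples)  # % (w/w), assumed range
pure_component = np.exp(-0.5 * ((np.arange(n_wavelengths) - 80) / 15) ** 2)  # synthetic band
spectra = np.outer(concentration, pure_component) \
          + 0.01 * rng.standard_normal((n_samples, n_wavelengths))

pls = PLSRegression(n_components=3)
predicted = cross_val_predict(pls, spectra, concentration, cv=5).ravel()
rmsecv = np.sqrt(np.mean((predicted - concentration) ** 2))
print(f"RMSECV ~ {rmsecv:.3f} % (w/w)")
```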
Lautié, Emmanuelle; Rasse, Catherine; Rozet, Eric; Mourgues, Claire; Vanhelleputte, Jean-Paul; Quetin-Leclercq, Joëlle
2013-02-01
The aim of this study was to determine whether fast microwave-assisted extraction could be an alternative to conventional Soxhlet extraction for the quantification of rotenone in yam bean seeds by SPE and HPLC-UV. For this purpose, an experimental design was used to determine the optimal conditions of the microwave extraction. The quantification results for three accessions from two different species of yam bean seeds were then compared for the two kinds of extraction. A microwave extraction of 11 min at 55°C using methanol/dichloromethane (50:50) extracted rotenone as efficiently as, or more efficiently than, the 8-h Soxhlet extraction method and was less sensitive to moisture content. The selectivity, precision, trueness, accuracy, and limit of quantification of the method with microwave extraction were also demonstrated. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.
2010-01-01
Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin-fiber quantification. Patients with all Philadelphia negative MPN subtypes had higher trabecular volume than controls (p ≤0.0015). Results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis highly correlated with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification for osteosclerosis and fibrosis. PMID:20122729
Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.
Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio
2018-01-01
Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification which competes with classical immunoaffinity techniques. It provides a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics and particularly for non-model species since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop a MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database underrepresented species Eriobotrya japonica Lindl.
Watkins, Preston S; Castellon, Benjamin T; Tseng, Chiyen; Wright, Moncie V; Matson, Cole W; Cobb, George P
2018-04-13
A consistent analytical method incorporating sulfuric acid (H2SO4) digestion and ICP-MS quantification has been developed for TiO2 quantification in biotic and abiotic environmentally relevant matrices. Sample digestion in H2SO4 at 110 °C provided consistent results without using hydrofluoric acid or microwave digestion. Analysis of seven replicate samples for four matrices on each of 3 days produced Ti recoveries of 97% ± 2.5%, 91% ± 4.0%, 94% ± 1.8%, and 73% ± 2.6% (mean ± standard deviation) from water, fish tissue, periphyton, and sediment, respectively. The method demonstrated consistent performance in the analysis of water collected over a 1-month period.
Evaluating life-safety risk of fieldwork at New Zealand's active volcanoes
NASA Astrophysics Data System (ADS)
Deligne, Natalia; Jolly, Gill; Taig, Tony; Webb, Terry
2014-05-01
Volcano observatories monitor active or potentially active volcanoes. Although the number and scope of remote monitoring instruments and methods continue to grow, in-person field data collection is still required for comprehensive monitoring. Fieldwork anywhere, and especially in mountainous areas, contains an element of risk. However, on volcanoes with signs of unrest, there is an additional risk of volcanic activity escalating while on site, with potentially lethal consequences. As an employer, a volcano observatory is morally and sometimes legally obligated to take reasonable measures to ensure staff safety and to minimise occupational risk. Here we present how GNS Science evaluates life-safety risk for volcanologists engaged in fieldwork on New Zealand volcanoes with signs of volcanic unrest. Our method includes several key elements: (1) an expert elicitation of how likely an eruption is within a given time frame; (2) quantification, based on historical data where possible, of the likelihood of exposure to near-vent processes, ballistics, or surge at various distances from the vent for a small, moderate, or large eruption; and (3) an estimate of the fatality rate given exposure to these volcanic hazards. The final product quantifies hourly fatality risk at various distances from a volcanic vent; various thresholds of risk (for example, zones with more than 10^-5 hourly fatality risk) trigger different levels of required approval to undertake work. Although an element of risk will always be present when conducting fieldwork on potentially active volcanoes, this is a first step towards providing objective guidance for go/no-go decisions for volcanic monitoring.
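To make the structure of the calculation explicit, the sketch below multiplies the three elements listed above into an hourly fatality risk and compares it with a 10^-5 approval threshold; all probabilities are invented for illustration and are not GNS Science values.

```python
# Hedged sketch of the risk arithmetic outlined in the abstract: hourly fatality risk
# = P(eruption in the next hour) * P(lethal exposure at the work site | eruption)
#   * P(fatality | exposure). Numbers are hypothetical.
def hourly_fatality_risk(p_eruption_per_hour: float,
                         p_exposure_given_eruption: float,
                         p_fatality_given_exposure: float) -> float:
    return p_eruption_per_hour * p_exposure_given_eruption * p_fatality_given_exposure

risk = hourly_fatality_risk(1e-3, 0.05, 0.5)  # assumed values for a near-vent site
print(f"Hourly fatality risk ~ {risk:.1e}")
print("Higher-level approval required" if risk > 1e-5 else "Routine approval")
```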
NASA Astrophysics Data System (ADS)
Zhu, Wei; Lin, Che-Jen; Wang, Xun; Sommar, Jonas; Fu, Xuewu; Feng, Xinbin
2016-04-01
Reliable quantification of air-surface fluxes of elemental Hg vapor (Hg0) is crucial for understanding mercury (Hg) global biogeochemical cycles. There have been extensive measurements and modeling efforts devoted to estimating the exchange fluxes between the atmosphere and various surfaces (e.g., soil, canopies, water, snow, etc.) in the past three decades. However, large uncertainties remain due to the complexity of Hg0 bidirectional exchange, limitations of flux quantification techniques and challenges in model parameterization. In this study, we provide a critical review on the state of science in the atmosphere-surface exchange of Hg0. Specifically, the advancement of flux quantification techniques, mechanisms in driving the air-surface Hg exchange and modeling efforts are presented. Due to the semi-volatile nature of Hg0 and redox transformation of Hg in environmental media, Hg deposition and evasion are influenced by multiple environmental variables including seasonality, vegetative coverage and its life cycle, temperature, light, moisture, atmospheric turbulence and the presence of reactants (e.g., O3, radicals, etc.). However, the effects of these processes on flux have not been fundamentally and quantitatively determined, which limits the accuracy of flux modeling. We compile an up-to-date global observational flux database and discuss the implication of flux data on the global Hg budget. Mean Hg0 fluxes obtained by micrometeorological measurements do not appear to be significantly greater than the fluxes measured by dynamic flux chamber methods over unpolluted surfaces (p = 0.16, one-tailed, Mann-Whitney U test). The spatiotemporal coverage of existing Hg0 flux measurements is highly heterogeneous with large data gaps existing in multiple continents (Africa, South Asia, Middle East, South America and Australia). The magnitude of the evasion flux is strongly enhanced by human activities, particularly at contaminated sites. Hg0 flux observations in East Asia are comparatively larger in magnitude than the rest of the world, suggesting substantial re-emission of previously deposited mercury from anthropogenic sources. The Hg0 exchange over pristine surfaces (e.g., background soil and water) and vegetation needs better constraints for global analyses of the atmospheric Hg budget. The existing knowledge gap and the associated research needs for future measurements and modeling efforts for the air-surface exchange of Hg0 are discussed.
Model Uncertainty Quantification Methods In Data Assimilation
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging, from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of data assimilation methods in high-dimensional complex geophysical systems is an active area of research, where there exist many opportunities to enhance existing methodologies. One of the central challenges is model uncertainty quantification; the outcome of any data assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real-world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
Allevi, Pietro; Femia, Eti Alessandra; Costa, Maria Letizia; Cazzola, Roberta; Anastasia, Mario
2008-11-28
The present report describes a method for the quantification of N-acetyl- and N-glycolylneuraminic acids without any derivatization, using their (13)C(3)-isotopologues as internal standards and a C(18) reversed-phase column modified by decylboronic acid, which allows for the first time a complete chromatographic separation of the two analytes. The method is based on high-performance liquid chromatography coupled with electrospray ion-trap mass spectrometry. The limit of quantification of the method is 0.1 mg/L (2.0 ng on column) for both analytes. The calibration curves are linear for both sialic acids over the range 0.1-80 mg/L (2.0-1600 ng on column) with a correlation coefficient greater than 0.997. The proposed method was applied to the quantitative determination of sialic acids released from fetuin as a model glycoprotein.
NASA Technical Reports Server (NTRS)
Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo
2015-01-01
Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.
Becerra, Valentina; Odermatt, Jürgen
2013-02-01
This article analyses the interferences in the quantification of traces of bisphenol S in paper by the direct analytical method "analytical pyrolysis gas chromatography mass spectrometry" (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are analysed simultaneously with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples which include bisphenol S derivative compounds. As free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of tetramethylammonium hydroxide (TMAH) under pyrolytic conditions. In order to avoid this formation, trimethylsulphonium hydroxide (TMSH) is introduced as the derivatisation reagent. Different parameters were optimised in the development of the quantification method with TMSH. The quantification method based on TMSH thermochemolysis has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.
Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single-variate analysis (i.e., Beer's law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of the nitric acid concentration.
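As a toy illustration (not the authors' data) of why the single-variate route struggles here, the snippet below applies Beer's law with a molar absorptivity that depends on nitric acid concentration; the epsilon values are invented, and the point is only that the same absorbance maps to different apparent Pu(IV) concentrations unless the acid strength is known.

```python
# Toy Beer's law example: c = A / (epsilon * l), where epsilon shifts with [HNO3].
# Epsilon values are invented for illustration only.
PATH_LENGTH_CM = 1.0
EPSILON_BY_ACID = {1.0: 32.0, 4.0: 41.0, 8.0: 55.0}  # L/(mol*cm) at hypothetical [HNO3] (M)

def beer_lambert_conc(absorbance: float, acid_molarity: float) -> float:
    """Apparent Pu(IV) concentration (mol/L) under a single-variate calibration."""
    return absorbance / (EPSILON_BY_ACID[acid_molarity] * PATH_LENGTH_CM)

absorbance = 0.41
for acid in EPSILON_BY_ACID:
    conc_mM = beer_lambert_conc(absorbance, acid) * 1000
    print(f"[HNO3] = {acid} M -> apparent [Pu(IV)] = {conc_mM:.1f} mM")
```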
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana
2015-08-18
Presence of genetically modified organisms (GMOs) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labelling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve-based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite widespread use of initial PCR-based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection but not quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays for quantification of all 12 EU-authorized GM maize lines (as of April 1, 2015) were developed. Because of the high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. Thus, it was concluded that the multiplex ddPCR assays could be applied for routine quantification of the 12 EU-authorized GM maize lines. In case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.
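The transgene/endogene ratio quoted above is obtained, in ddPCR generally, from Poisson-corrected positive-droplet counts; the sketch below shows that arithmetic with invented droplet counts and an assumed droplet volume, and is not the authors' validated assay.

```python
# Standard ddPCR arithmetic: Poisson correction of positive-droplet counts gives
# copies per droplet, and GMO content is the transgene/endogene copy ratio.
import math

DROPLET_VOLUME_NL = 0.85  # typical ddPCR droplet volume; assumed here

def copies_per_ul(positive: int, total: int) -> float:
    lam = -math.log(1.0 - positive / total)  # mean copies per droplet (Poisson)
    return lam / (DROPLET_VOLUME_NL * 1e-3)  # convert to copies per microliter

transgene = copies_per_ul(positive=1200, total=15000)
endogene = copies_per_ul(positive=9000, total=15000)
print(f"GMO content ~ {100 * transgene / endogene:.2f} % (copy/copy)")
```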
Neutron-Encoded Protein Quantification by Peptide Carbamylation
NASA Astrophysics Data System (ADS)
Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.
2014-01-01
We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.
Cools, Katherine; Terry, Leon A
2012-07-15
Glucosinolates are β-thioglycosides which are found naturally in Cruciferae including the genus Brassica. When enzymatically hydrolysed, glucosinolates yield isothiocyanates and give a pungent taste. Both glucosinolates and isothiocyanates have been linked with anticancer activity as well as antifungal and antibacterial properties and therefore the quantification of these compounds is scientifically important. A wide range of literature exists on glucosinolates, however the extraction and quantification procedures differ greatly resulting in discrepancies between studies. The aim of this study was therefore to compare the most popular extraction procedures to identify the most efficacious method and whether each extraction can also be used for the quantification of total isothiocyanates. Four extraction techniques were compared for the quantification of sinigrin from mustard cv. Centennial (Brassica juncea L.) seed; boiling water, boiling 50% (v/v) aqueous acetonitrile, boiling 100% methanol and 70% (v/v) aqueous methanol at 70 °C. Prior to injection into the HPLC, the extractions which involved solvents (acetonitrile or methanol) were freeze-dried and resuspended in water. To identify whether the same extract could be used to measure total isothiocyanates, a dichloromethane extraction was carried out on the sinigrin extracts. For the quantification of sinigrin alone, boiling 50% (v/v) acetonitrile was found to be the most efficacious extraction solvent of the four tested yielding 15% more sinigrin than the water extraction. However, the removal of the acetonitrile by freeze-drying had a negative impact on the isothiocyanate content. Quantification of both sinigrin and total isothiocyanates was possible when the sinigrin was extracted using boiling water. Two columns were compared for the quantification of sinigrin revealing the Zorbax Eclipse to be the best column using this particular method. Copyright © 2012 Elsevier B.V. All rights reserved.
Monitoring of metallic contaminants in energy drinks using ICP-MS.
Kilic, Serpil; Cengiz, Mehmet Fatih; Kilic, Murat
2018-03-09
In this study, an improved method was validated for the determination of metallic contaminants (arsenic (As), chromium (Cr), cadmium (Cd), lead (Pb), iron (Fe), nickel (Ni), copper (Cu), manganese (Mn), and antimony (Sb)) in energy drinks using inductively coupled plasma mass spectrometry (ICP-MS). The validation procedure covered linearity, repeatability, recovery, and the limits of detection and quantification. In addition, to verify the trueness of the method, the laboratory participated in an interlaboratory proficiency test for heavy metals in soft drinks organized by the LGC (Laboratory of the Government Chemist) Standard. The validated method was used to determine metallic contaminants in commercial energy drink samples. Concentrations of As, Cr, Cd, Pb, Fe, Ni, Cu, Mn, and Sb in the samples were found in the ranges of 0.76-6.73, 13.25-100.96, 0.16-2.11, 9.33-28.96, 334.77-937.12, 35.98-303.97, 23.67-60.48, 5.45-489.93, and 0.01-0.42 μg L-1, respectively. The results were compared with the provisional guideline or parametric values of the elements for drinking waters set by the WHO (World Health Organization) and EC (European Commission). As, Cd, Cu, and Sb did not exceed the WHO and EC provisional guideline or parametric values. However, the other elements (Cr, Pb, Fe, Ni, and Mn) were found to be higher than their relevant limits at various levels.
2016-04-01
Quantification of VX Nerve Agent in Various Food Matrices by Solid-Phase Extraction Ultra-Performance Liquid Chromatography... food matrices. The mixed-mode cation exchange (MCX) sorbent and Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) methods were used for...
Targeted methods for quantitative analysis of protein glycosylation
Goldman, Radoslav; Sanda, Miloslav
2018-01-01
Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218
Lee, Min-Jeong; Seo, Da-Young; Lee, Hea-Eun; Wang, In-Chun; Kim, Woo-Sik; Jeong, Myung-Yung; Choi, Guang J
2011-01-17
Along with the risk-based approach, process analytical technology (PAT) has emerged as one of the key elements to fully implement QbD (quality-by-design). Near-infrared (NIR) spectroscopy has been extensively applied as an in-line/on-line analytical tool in biomedical and chemical industries. In this study, the film thickness on pharmaceutical pellets was examined for quantification using in-line NIR spectroscopy during a fluid-bed coating process. A precise monitoring of coating thickness and its prediction with a suitable control strategy is crucial to the quality assurance of solid dosage forms including dissolution characteristics. Pellets of a test formulation were manufactured and coated in a fluid-bed by spraying a hydroxypropyl methylcellulose (HPMC) coating solution. NIR spectra were acquired via a fiber-optic probe during the coating process, followed by multivariate analysis utilizing partial least squares (PLS) calibration models. The actual coating thickness of pellets was measured by two separate methods, confocal laser scanning microscopy (CLSM) and laser diffraction particle size analysis (LD-PSA). Both characterization methods gave superb correlation results, and all determination coefficient (R(2)) values exceeded 0.995. In addition, a prediction coating experiment for 70min demonstrated that the end-point can be accurately designated via NIR in-line monitoring with appropriate calibration models. In conclusion, our approach combining in-line NIR monitoring with CLSM and LD-PSA can be applied as an effective PAT tool for fluid-bed pellet coating processes. Copyright © 2010 Elsevier B.V. All rights reserved.
Szakács, Zoltán; Mészáros, Tamás; de Jonge, Marien I; Gyurcsányi, Róbert E
2018-05-30
Detection and counting of single virus particles in liquid samples are largely limited to narrow size distribution of viruses and purified formulations. To address these limitations, here we propose a calibration-free method that enables concurrently the selective recognition, counting and sizing of virus particles as demonstrated through the detection of human respiratory syncytial virus (RSV), an enveloped virus with a broad size distribution, in throat swab samples. RSV viruses were selectively labeled through their attachment glycoproteins (G) with fluorescent aptamers, which further enabled their identification, sizing and counting at the single particle level by fluorescent nanoparticle tracking analysis. The proposed approach seems to be generally applicable to virus detection and quantification. Moreover, it could be successfully applied to detect single RSV particles in swab samples of diagnostic relevance. Since the selective recognition is associated with the sizing of each detected particle, this method enables to discriminate viral elements linked to the virus as well as various virus forms and associations.
Novel approach in k0-NAA for highly concentrated REE Samples.
Abdollahi Neisiani, M; Latifi, M; Chaouki, J; Chilian, C
2018-04-01
This paper presents a new approach to k0-NAA for accurate quantification with short turnaround analysis times for rare earth elements (REEs) in high-content mineral matrices. REE k0 and Q0 values, spectral interferences and nuclear interferences were experimentally evaluated and improved with Alfa Aesar Specpure Plasma Standard 1000 mg kg-1 mono-rare-earth solutions. The new iterative gamma-ray self-attenuation and neutron self-shielding methods were investigated with powder standards prepared from 100 mg of 99.9% Alfa Aesar mono-rare-earth oxide diluted with silica oxide. The overall performance of the new k0-NAA method for REEs was validated using a certified reference material (CRM) from the Canadian Certified Reference Materials Project (REE-2) with REE content ranging from 7.2 mg kg-1 for Yb to 9610 mg kg-1 for Ce. The REE concentrations were determined with uncertainty below 7% (at the 95% confidence level) and showed good consistency with the certified CRM concentrations. Copyright © 2017 Elsevier B.V. All rights reserved.
Selective Detection of Peptide-Oligonucleotide Heteroconjugates Utilizing Capillary HPLC-ICPMS
NASA Astrophysics Data System (ADS)
Catron, Brittany; Caruso, Joseph A.; Limbach, Patrick A.
2012-06-01
A method for the selective detection and quantification of peptide:oligonucleotide heteroconjugates, such as those generated by protein:nucleic acid cross-links, using capillary reversed-phase high performance liquid chromatography (cap-RPHPLC) coupled with inductively coupled plasma mass spectrometry detection (ICPMS) is described. The selective detection of phosphorus as 31P+, the only natural isotope, in peptide-oligonucleotide heteroconjugates is enabled by the elemental detection capabilities of the ICPMS. Mobile phase conditions that allow separation of heteroconjugates while maintaining ICPMS compatibility were investigated. We found that trifluoroacetic acid (TFA) mobile phases, used in conventional peptide separations, and hexafluoroisopropanol/triethylamine (HFIP/TEA) mobile phases, used in conventional oligonucleotide separations, both are compatible with ICPMS and enable heteroconjugate separation. The TFA-based separations yielded limits of detection (LOD) of ~40 ppb phosphorus, which is nearly seven times lower than the LOD for HFIP/TEA-based separations. Using the TFA mobile phase, 1-2 pmol of a model heteroconjugate were routinely separated and detected by this optimized capLC-ICPMS method.
Markiewicz-Keszycka, Maria; Casado-Gavalda, Maria P; Cama-Moncunill, Xavier; Cama-Moncunill, Raquel; Dixit, Yash; Cullen, Patrick J; Sullivan, Carl
2018-04-01
Gluten-free (GF) diets are prone to mineral deficiency, thus effective monitoring of the elemental composition of GF products is important to ensure a balanced micronutrient diet. The objective of this study was to test the potential of laser-induced breakdown spectroscopy (LIBS) analysis combined with chemometrics for at-line monitoring of the ash, potassium and magnesium content of GF flours: tapioca, potato, maize, buckwheat, brown rice and a GF flour mixture. Concentrations of ash, potassium and magnesium were determined with reference methods and LIBS. PCA showed the potential for discriminating the six GF flours. For the quantification analysis, PLSR models were developed; R2(cal) values were 0.99 for magnesium and potassium and 0.97 for ash. The study revealed that LIBS combined with chemometrics is a convenient method for quantifying concentrations of ash, potassium and magnesium and shows potential for classifying different types of flours. Copyright © 2017 Elsevier Ltd. All rights reserved.
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2014-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of the genetically modified (GM) maize event MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. Trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggest that the limit of quantitation of the method is 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162.
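As a hedged illustration of how such a conversion factor is typically applied (the copy numbers here are invented), the measured transgene/endogene copy ratio of a sample is divided by the instrument-specific Cf to express the result as a GMO percentage:

```python
# Typical use of a conversion factor in event-specific real-time PCR quantification.
# Copy numbers below are hypothetical, not data from the study.
CF_ABI7900 = 0.697  # conversion factor reported for the ABI7900 in the abstract

def gmo_percent(transgene_copies: float, endogene_copies: float, cf: float) -> float:
    """GMO amount (%) from measured copy numbers and the conversion factor."""
    return (transgene_copies / endogene_copies) / cf * 100.0

print(f"GMO content ~ {gmo_percent(3.4e3, 9.8e5, CF_ABI7900):.2f} %")
```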
Image-guided spatial localization of heterogeneous compartments for magnetic resonance
An, Li; Shen, Jun
2015-01-01
Purpose: Image-guided SPectral Localization Achieved by Sensitivity Heterogeneity (SPLASH) allows rapid measurement of signals from irregularly shaped anatomical compartments without using phase-encoding gradients. Here, the authors propose a novel method to address the issue of heterogeneous signal distribution within the localized compartments. Methods: Each compartment was subdivided into multiple subcompartments and their spectra were solved for by Tikhonov regularization to enforce smoothness within each compartment. The spectrum of a given compartment was generated by combining the spectra of the subcompartments of that compartment. The proposed method was first tested using Monte Carlo simulations and then applied to reconstructing in vivo spectra from irregularly shaped ischemic stroke and normal tissue compartments. Results: Monte Carlo simulations demonstrate that the proposed regularized SPLASH method significantly reduces localization and metabolite quantification errors. In vivo results show that the intracompartment regularization results in ~40% reduction of error in metabolite quantification. Conclusions: The proposed method significantly reduces localization errors and metabolite quantification errors caused by intracompartment heterogeneous signal distribution. PMID:26328977
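Below is a minimal numerical sketch of a Tikhonov-regularized inversion of the sort described above, assuming a linear sensitivity-encoding model Y = A·S plus noise and a first-difference smoothness penalty across sub-compartments; the matrix sizes, sensitivities and regularization weight are arbitrary placeholders and do not reproduce the SPLASH implementation.

```python
# Tikhonov-regularized least squares: argmin_S ||A S - Y||^2 + lam^2 ||L S||^2
import numpy as np

rng = np.random.default_rng(1)
n_coils, n_subcomp, n_points = 8, 6, 128
A = rng.uniform(0.1, 1.0, (n_coils, n_subcomp))            # coil sensitivity profiles (assumed)
true_spectra = rng.standard_normal((n_subcomp, n_points))  # stand-in sub-compartment spectra
Y = A @ true_spectra + 0.05 * rng.standard_normal((n_coils, n_points))

# First-difference operator penalizing differences between neighbouring sub-compartments
L = np.diff(np.eye(n_subcomp), axis=0)
lam = 0.5  # regularization weight; would be tuned in practice

# Closed-form normal-equation solution of the regularized problem
S_hat = np.linalg.solve(A.T @ A + lam**2 * (L.T @ L), A.T @ Y)
print("Relative reconstruction error:",
      np.linalg.norm(S_hat - true_spectra) / np.linalg.norm(true_spectra))
```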
Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.
Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei
2017-09-01
Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Comparison of ICP-OES and MP-AES in determining soil nutrients by the Mehlich 3 method
NASA Astrophysics Data System (ADS)
Tonutare, Tonu; Penu, Priit; Krebstein, Kadri; Rodima, Ako; Kolli, Raimo; Shanskiy, Merrit
2014-05-01
Accurate, routine testing of nutrients in soil samples is critical to understanding potential soil fertility. Different factors must be taken into account when selecting the best analytical technique for soil laboratory analysis. Several techniques can provide an adequate detection range for the same analyte; in such cases the choice of technique will depend on factors such as sample throughput, required infrastructure, ease of use, chemicals used, need for a gas supply, and operating costs. The Mehlich 3 extraction method is widely used for the determination of plant-available nutrient element contents in agricultural soils. Depending on the laboratory, ICP and AAS techniques are used for the determination of Ca, K, and Mg from the soil extract, and some laboratories also use flame photometry for K. Extracted P is determined by ICP or Vis spectrometry. The excellent sensitivity and wide working range for all extracted elements make ICP a nearly ideal method, so long as the sample throughput is big enough to justify the initial capital outlay. Another advantage of ICP techniques is their multiplex character (simultaneous acquisition of all wavelengths). Depending on the element, detection limits are in the range 0.1-1000 μg/L. For smaller laboratories with low sample-throughput requirements, the use of AAS is more common. Flame AAS is a fast, relatively cheap and easy technique for the analysis of elements. Its disadvantages are single-element analysis and the use of a flammable gas such as C2H2 and, for some elements, N2O as the oxidant gas. Detection limits of elements for AAS lie between 1 and 1000 μg/L. MP-AES offers a unique alternative to both AAS and ICP-OES techniques in terms of detection power and speed of analysis. MP-AES is a quite new, simple and relatively inexpensive multielemental technique which uses a self-sustained atmospheric-pressure microwave plasma (MP) with nitrogen gas generated by a nitrogen generator. Therefore there is no need for argon or flammable (C2H2) gases or cylinder handling, and the running costs of the equipment are low. Detection limits of elements for MP-AES lie between those of AAS and ICP-OES. The objective of this study was to compare the results of soil analysis using two multielemental analytical methods, ICP-OES and MP-AES. In the experiment, different soil types with various textures, organic matter contents and pH were used. Soil samples of Albeluvisols, Leptosols, Cambisols, Regosols and Histosols were used for the study. The plant-available nutrients were estimated by Mehlich 3 extraction. The ICP-OES analyses were performed at the Estonian Agricultural Research Centre and the MP-AES analyses at the Department of Soil Science and Agrochemistry of the Estonian University of Life Sciences. The detection limits and limits of quantification of Ca, K, Mg and P in the extracts are calculated and reported.
Monjure, C. J.; Tatum, C. D.; Panganiban, A. T.; Arainga, M.; Traina-Dorge, V.; Marx, P. A.; Didier, E. S.
2014-01-01
Introduction: Quantification of plasma viral load (PVL) is used to monitor disease progression in SIV-infected macaques. This study was aimed at optimizing the performance characteristics of the quantitative PCR (qPCR) PVL assay. Methods: The PVL quantification procedure was optimized by inclusion of an exogenous control, Hepatitis C Virus armored RNA (aRNA), a plasma concentration step, extended digestion with proteinase K, and a second RNA elution step. Efficiency of viral RNA (vRNA) extraction was compared using several commercial vRNA extraction kits. Various parameters of the qPCR targeting the gag region of SIVmac239 and SIVsmE660 and the LTR region of SIVagmSAB were also optimized. Results: Modifications of the SIV PVL qPCR procedure increased vRNA recovery, reduced inhibition and improved analytical sensitivity. The PVL values determined by this SIV PVL qPCR correlated with quantification results for SIV RNA in the same samples obtained using the "industry standard" method of branched-DNA (bDNA) signal amplification. Conclusions: Quantification of SIV genomic RNA in plasma of rhesus macaques using this optimized SIV PVL qPCR is equivalent to the bDNA signal amplification method, less costly and more versatile. Use of heterologous aRNA as an internal control is useful for optimizing the performance characteristics of PVL qPCRs. PMID:24266615
Lowe, Ross H.; Karschner, Erin L.; Schwilke, Eugene W.; Barnes, Allan J.; Huestis, Marilyn A.
2009-01-01
A two-dimensional (2D) gas chromatography/electron impact-mass spectrometry (GC/EI-MS) method for simultaneous quantification of Δ9-tetrahydrocannabinol (THC), 11-hydroxy-Δ9-tetrahydrocannabinol (11-OH-THC), and 11-nor-Δ9-tetrahydrocannabinol-9-carboxylic acid (THCCOOH) in human plasma was developed and validated. The method employs 2D capillary GC and cryofocusing for enhanced resolution and sensitivity. THC, 11-OH-THC, and THCCOOH were extracted by precipitation with acetonitrile followed by solid-phase extraction. GC separation of trimethylsilyl derivatives of analytes was accomplished with two capillary columns in series coupled via a pneumatic Deans switch system. Detection and quantification were accomplished with a bench-top single quadrupole mass spectrometer operated in electron impact-selected ion monitoring mode. Limits of quantification (LOQ) were 0.125, 0.25 and 0.125 ng/mL for THC, 11-OH-THC, and THCCOOH, respectively. Accuracy ranged from 86.0 to 113.0% for all analytes. Intra- and inter-assay precision, as percent relative standard deviation, was less than 14.1% for THC, 11-OH-THC, and THCCOOH. The method was successfully applied to quantification of THC and its 11-OH-THC and THCCOOH metabolites in plasma specimens following controlled administration of THC. PMID:17640656
Quantification of DNA using the luminescent oxygen channeling assay.
Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S
2000-09-01
Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI(TM)) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.
Dapic, Irena; Kobetic, Renata; Brkljacic, Lidija; Kezic, Sanja; Jakasa, Ivone
2018-02-01
The free fatty acids (FFAs) are one of the major components of the lipids in the stratum corneum (SC), the uppermost layer of the skin. Relative composition of FFAs has been proposed as a biomarker of the skin barrier status in patients with atopic dermatitis (AD). Here, we developed an LC-ESI-MS/MS method for simultaneous quantification of a range of FFAs with long and very long chain length in the SC collected by adhesive tape (D-Squame). The method, based on derivatization with 2-bromo-1-methylpyridinium iodide and 3-carbinol-1-methylpyridinium iodide, allowed highly sensitive detection and quantification of FFAs using multiple reaction monitoring. For the quantification, we applied a surrogate analyte approach and internal standardization using isotope labeled derivatives of FFAs. Adhesive tapes showed the presence of several FFAs, which are also present in the SC, a problem encountered in previous studies. Therefore, the levels of FFAs in the SC were corrected using C12:0, which was present on the adhesive tape, but not detected in the SC. The method was applied to SC samples from patients with atopic dermatitis and healthy subjects. Quantification using multiple reaction monitoring allowed sufficient sensitivity to analyze FFAs of chain lengths C16-C28 in the SC collected on only one tape strip. Copyright © 2017 John Wiley & Sons, Ltd.
Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard
2016-01-01
Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high priority fragment ions these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e. the SWATH Gold Standard), indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the area of the Top3-5 NOFIs produces similar coefficients of variation as compared to the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80 against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
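As an illustration of the ranking idea described above (not the published NOFI implementation), the sketch below scores fragment ions in a small feature space with a robust distance and demotes outliers; the four feature columns and the toy numbers are assumptions made for the example.

```python
import numpy as np

def rank_fragment_ions(features):
    """Rank fragment ions so that likely-interfered ones get low priority.

    features: (n_fragments, n_features) array, e.g. columns for retention-time
    shift, peak-width deviation, intensity ratio vs. the library, and correlation
    with the other fragment traces (an illustrative choice of four dimensions).
    Returns fragment indices sorted from most to least trustworthy.
    """
    med = np.median(features, axis=0)
    mad = np.median(np.abs(features - med), axis=0) + 1e-12
    z = (features - med) / (1.4826 * mad)          # robust z-scores per dimension
    outlier_score = np.sqrt((z ** 2).sum(axis=1))  # distance from the bulk of fragments
    return np.argsort(outlier_score)               # small score = high priority

# toy example: 6 fragment ions, 4 features each; fragment index 2 is interfered
feats = np.array([[0.1, 0.0, 1.0, 0.95],
                  [0.2, 0.1, 1.1, 0.93],
                  [2.5, 1.8, 3.0, 0.40],   # outlier
                  [0.0, 0.1, 0.9, 0.97],
                  [0.1, 0.2, 1.0, 0.96],
                  [0.3, 0.1, 1.2, 0.94]])
priority = rank_fragment_ions(feats)
top3 = priority[:3]   # quantify on the top-ranked, non-interfered fragments
print(top3)
```

Quantification would then sum the areas of the top-ranked fragments, in the spirit of the Top3-5 NOFI result reported above.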
A method to characterize the roughness of 2-D line features: recrystallization boundaries.
Sun, J; Zhang, Y B; Dahl, A B; Conradsen, K; Juul Jensen, D
2017-03-01
A method is presented, which allows quantification of the roughness of nonplanar boundaries of objects for which the neutral plane is not known. The method provides quantitative descriptions of both the local and global characteristics. How the method can be used to estimate the sizes of rough features and local curvatures is also presented. The potential of the method is illustrated by quantification of the roughness of two recrystallization boundaries in a pure Al specimen characterized by scanning electron microscopy. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
Quantifying construction and demolition waste: an analytical review.
Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen
2014-09-01
Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
de Andrade, Camila Kulek; de Brito, Patrícia Micaella Klack; Dos Anjos, Vanessa Egéa; Quináia, Sueli Pércio
2018-02-01
A slurry sampling electrothermal atomic absorption spectrometric method is proposed for the determination of trace elements such as Cu, Cr, Cd and Pb in yogurt. The main factors affecting the slurry preparation were optimized: nature and concentration of acid solution and sonication time. The analytical method was validated in-house by calibration, linearity, limits of detection and quantification, precision and accuracy tests, obtaining satisfactory results in all cases. The proposed method was applied for the determination of Cd, Cr, Cu and Pb in some Brazilian yogurt samples. For these samples, the concentrations ranged from 2.5±0.2 to 12.4±0.2 ng g(-1); 34±3 to 899±7 ng g(-1); <8.3 to 12±1 ng g(-1); and <35.4 to 210±16 ng g(-1) for Cd, Cu, Cr and Pb, respectively. The daily intake of Cd, Cu, Cr and Pb via consumption of these samples was estimated. Copyright © 2017 Elsevier Ltd. All rights reserved.
Chevolleau, S; Noguer-Meireles, M-H; Jouanin, I; Naud, N; Pierre, F; Gueraud, F; Debrauwer, L
2018-04-15
Diets rich in red or processed meat have been shown to be associated with an elevated risk of colorectal cancer (CRC). One major hypothesis involves dietary heme iron, which induces lipid peroxidation. The quantification of the resulting reactive aldehydes (e.g. HNE and HHE) in the colon lumen is therefore of great concern, since these compounds are known for their cytotoxic and genotoxic properties. A UHPLC-ESI-MS/MS method has been developed and validated for HNE and HHE quantification in rat faeces. Samples were derivatised using a brominated reagent (BBHA) in the presence of pre-synthesized deuterated internal standards (HNE-d11/HHE-d5), extracted by solid phase extraction, and then analysed by LC-positive ESI-MS/MS (MRM) on a TSQ Vantage mass spectrometer. The use of BBHA allowed the efficient stabilisation of the unstable and reactive hydroxy-alkenals HNE and HHE. The MRM method allowed selective detection of HNE and HHE on the basis of characteristic transitions monitored from both the 79 and 81 bromine isotopic peaks. This method was validated according to the European Medicines Agency (EMEA) guidelines, by determining selectivity, sensitivity, linearity, carry-over effect, recovery, matrix effect, repeatability, trueness and intermediate precision. The performance of the method enabled the quantification of HNE and HHE at concentrations of 0.10-0.15 μM in faecal water. Results are presented for the application of the method to the quantification of HNE and HHE in faecal waters obtained from rats fed diets with various fatty acid compositions, corresponding to different pro-oxidative features. Copyright © 2018 Elsevier B.V. All rights reserved.
Gil, Jeovanis; Cabrales, Ania; Reyes, Osvaldo; Morera, Vivian; Betancourt, Lázaro; Sánchez, Aniel; García, Gerardo; Moya, Galina; Padrón, Gabriel; Besada, Vladimir; González, Luis Javier
2012-02-23
Growth hormone-releasing peptide 6 (GHRP-6, His-(DTrp)-Ala-Trp-(DPhe)-Lys-NH₂, MW=872.44 Da) is a potent growth hormone secretagogue that exhibits a cytoprotective effect, maintaining tissue viability during acute ischemia/reperfusion episodes in different organs like small bowel, liver and kidneys. In the present work a quantitative method to analyze GHRP-6 in human plasma was developed and fully validated following FDA guidelines. The method uses an internal standard (IS) of GHRP-6 with ¹³C-labeled alanine for quantification. Sample processing includes a precipitation step with cold acetone to remove the most abundant plasma proteins, recovering the GHRP-6 peptide with a high yield. Quantification was achieved by LC-MS in positive full scan mode in a Q-Tof mass spectrometer. The sensitivity of the method was evaluated, establishing the lower limit of quantification at 5 ng/mL and a range for the calibration curve from 5 ng/mL to 50 ng/mL. A dilution integrity test was performed to analyze samples at higher concentrations of GHRP-6. The validation process involved five calibration curves and the analysis of quality control samples to determine accuracy and precision. The calibration curves showed R² higher than 0.988. The stability of the analyte and its internal standard (IS) was demonstrated under all conditions the samples would experience during real sample analysis. This method was applied to the quantification of GHRP-6 in plasma from nine healthy volunteers participating in a phase I clinical trial. Copyright © 2011 Elsevier B.V. All rights reserved.
Zhu, Haitao; Nie, Binbin; Liu, Hua; Guo, Hua; Demachi, Kazuyuki; Sekino, Masaki; Shan, Baoci
2016-05-01
Phase map cross-correlation detection and quantification can produce highlighted signal at superparamagnetic iron oxide nanoparticles and distinguish them from other hypointensities. The method quantifies susceptibility change by performing least squares analysis between a theoretically generated magnetic field template and an experimentally scanned phase image. Because characteristic phase recognition requires the removal of phase wrap and phase background, the additional steps of phase unwrapping and filtering increase the chance of computing error and enlarge the inconsistency among algorithms. To solve this problem, a phase gradient cross-correlation and quantification method was developed that recognizes the characteristic phase gradient pattern instead of the phase image, because the phase gradient operation inherently includes unwrapping and filtering functions. However, few studies have mentioned the detectable limit of currently used phase gradient calculation algorithms. This limit may lead to an underestimation of large magnetic susceptibility changes caused by high-concentration iron accumulation. In this study, mathematical derivation gives the value of the maximum detectable phase gradient calculated by the differential chain algorithm in both the spatial and Fourier domains. To break through this limit, a modified quantification method is proposed that uses unwrapped forward differentiation for phase gradient generation. The method enlarges the detectable range of phase gradient measurement and avoids the underestimation of magnetic susceptibility. Simulation and phantom experiments were used to quantitatively compare the different methods. For in vivo application, MRI scanning was performed on nude mice implanted with iron-labeled human cancer cells. The results validate the limit of detectable phase gradient and the consequent susceptibility underestimation. They also demonstrate the advantage of unwrapped forward differentiation over differential chain algorithms for susceptibility quantification at high-concentration iron accumulation. Copyright © 2015 Elsevier Inc. All rights reserved.
Wang, Tao; Liu, Tingting; Wang, Zejian; Tian, Xiwei; Yang, Yi; Guo, Meijin; Chu, Ju; Zhuang, Yingping
2016-05-01
Rapid, real-time lipid determination can provide valuable information for process regulation and optimization in algal lipid mass production. In this study, a rapid, accurate and precise quantification method for in vivo cellular lipids of Chlorella protothecoides using low field nuclear magnetic resonance (LF-NMR) was newly developed. LF-NMR was extremely sensitive to the algal lipids, with limits of detection (LOD) of 0.0026 g and 0.32 g/L in dry lipid samples and algal broth, respectively, and limits of quantification (LOQ) of 0.0093 g and 1.18 g/L. Moreover, the LF-NMR signal was specifically proportional to the cellular lipids of C. protothecoides; thus, superior regression curves were obtained over a wide detection range, from 0.02 to 0.42 g for dry lipids and from 1.12 to 8.97 g L(-1) of lipid concentration for in vivo lipid quantification, all with R(2) higher than 0.99, irrespective of variations in lipid content and fatty acid profile. The accuracy of this novel method was further verified by comparing the lipid quantification results to those obtained by GC-MS. The relative standard deviations (RSD) of the LF-NMR results were smaller than 2%, demonstrating the precision of the method. Finally, this method was successfully used for on-line lipid monitoring during algal lipid fermentation processes, making it possible to better understand the lipid accumulation mechanism and to implement dynamic bioprocess control. Copyright © 2016 Elsevier B.V. All rights reserved.
Elmer-Dixon, Margaret M; Bowler, Bruce E
2018-05-19
A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input into a Matlab program to calculate the total vesicle concentration and the concentrations of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.
Mota, Maria Fernanda S; Souza, Marcella F; Bon, Elba P S; Rodrigues, Marcoaurelio A; Freitas, Suely Pereira
2018-05-24
The use of colorimetric methods for protein quantification in microalgae is hindered by their elevated amounts of membrane-embedded intracellular proteins. In this work, the protein content of three species of microalgae was determined by the Lowry method after the cells were dried, ball-milled, and treated with the detergent sodium dodecyl sulfate (SDS). Results demonstrated that the association of milling and SDS treatment resulted in a 3- to 7-fold increase in protein quantification. Milling promoted microalgal disaggregation and cell wall disruption enabling access of the SDS detergent to the microalgal intracellular membrane proteins and their efficient solubilization and quantification. © 2018 Phycological Society of America.
Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias
2007-01-10
The performances of the GC-MS determination of suspected allergens in fragrance concentrates have been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion or once every 144 analyses for three ions in common.
Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md; Singh Rawat, Ajay Kumar; Srivastava, Sharad
2017-10-01
Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to its high content of colchicine alkaloids. This study aimed to develop an easy, cheap, precise, and accurate validated high-performance thin-layer chromatography (HPTLC) method for simultaneous quantification of the bioactive alkaloids colchicine and gloriosine in G. superba L. and to identify its elite chemotype(s) from the Sikkim Himalayas (India). The HPTLC method was developed using a mobile phase of chloroform:acetone:diethylamine (5:4:1) at λmax of 350 nm. Five germplasms were collected from the targeted region, and on morpho-anatomical inspection no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) ranges from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Linearity of the method was obtained in the concentration range of 100-400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recoveries of 97.79% ± 3.86% and 100.023% ± 0.01%, respectively. The limits of detection and quantification were 6.245 and 18.926 ng (colchicine) and 8.024 and 24.316 ng (gloriosine), respectively. Two germplasms, NBG-27 and NBG-26, were found to be elite chemotypes for both markers. The developed method is validated in terms of accuracy, recovery, and precision as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant for exploring chemotypic variability in metabolite content for commercial and medicinal purposes.
Neiens, Patrick; De Simone, Angela; Ramershoven, Anna; Höfner, Georg; Allmendinger, Lars; Wanner, Klaus T
2018-03-03
MS Binding Assays represent a label-free alternative to radioligand binding assays. In this study, we present an LC-ESI-MS/MS method for the quantification of (R,R)-4-(2-benzhydryloxyethyl)-1-(4-fluorobenzyl)piperidin-3-ol [(R,R)-D-84, (R,R)-1], (S,S)-reboxetine [(S,S)-2], and (S)-citalopram [(S)-3] employed as highly selective nonlabeled reporter ligands in MS Binding Assays addressing the dopamine [DAT, (R,R)-D-84], norepinephrine [NET, (S,S)-reboxetine] and serotonin transporter [SERT, (S)-citalopram], respectively. The developed LC-ESI-MS/MS method uses a pentafluorphenyl stationary phase in combination with a mobile phase composed of acetonitrile and ammonium formate buffer for chromatography and a triple quadrupole mass spectrometer in the multiple reaction monitoring mode for mass spectrometric detection. Quantification is based on deuterated derivatives of all three analytes serving as internal standards. The established LC-ESI-MS/MS method enables fast, robust, selective and highly sensitive quantification of all three reporter ligands in a single chromatographic run. The method was validated according to the Center for Drug Evaluation and Research (CDER) guideline for bioanalytical method validation regarding selectivity, accuracy, precision, calibration curve and sensitivity. Finally, filtration-based MS Binding Assays were performed for all three monoamine transporters based on this LC-ESI-MS/MS quantification method as read out. The affinities determined in saturation experiments for (R,R)-D-84 toward hDAT, for (S,S)-reboxetine toward hNET, and for (S)-citalopram toward hSERT, respectively, were in good accordance with results from literature, clearly demonstrating that the established MS Binding Assays have the potential to be an efficient alternative to radioligand binding assays widely used for this purpose so far. Copyright © 2018 John Wiley & Sons, Ltd.
Daniel J. Miller; Kelly M. Burnett
2008-01-01
Debris flows are important geomorphic agents in mountainous terrains that shape channel environments and add a dynamic element to sediment supply and channel disturbance. Identification of channels susceptible to debris-flow inputs of sediment and organic debris, and quantification of the likelihood and magnitude of those inputs, are key tasks for characterizing...
NASA Astrophysics Data System (ADS)
He, Jingjing; Wang, Dengjiang; Zhang, Weifang
2015-03-01
This study presents an experimental and modeling study of damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
Martins, Ayrton F; Frank, Carla da S; Altissimo, Joseline; de Oliveira, Júlia A; da Silva, Daiane S; Reichert, Jaqueline F; Souza, Darliana M
2017-08-24
Statins are classified as being amongst the most prescribed agents for treating hypercholesterolaemia and preventing vascular diseases. In this study, a rapid and effective liquid chromatography method, assisted by diode array detection, was designed and validated for the simultaneous quantification of atorvastatin (ATO) and simvastatin (SIM) in hospital effluent samples. The solid phase extraction (SPE) of the analytes was optimized regarding sorbent material and pH, and the dispersive liquid-liquid microextraction (DLLME), in terms of pH, ionic strength, type and volume of extractor/dispersor solvents. The performance of both extraction procedures was evaluated in terms of linearity, quantification limits, accuracy (recovery %), precision and matrix effects for each analyte. The methods proved to be linear in the concentration range considered; the quantification limits were 0.45 µg L -1 for ATO and 0.75 µg L -1 for SIM; the matrix effect was almost absent in both methods and the average recoveries remained between 81.5-90.0%; and the RSD values were <20%. The validated methods were applied to the quantification of the statins in real samples of hospital effluent; the concentrations ranged from 18.8 µg L -1 to 35.3 µg L -1 for ATO, and from 30.3 µg L -1 to 38.5 µg L -1 for SIM. Since the calculated risk quotient was ≤192, the occurrence of ATO and SIM in hospital effluent poses a potential serious risk to human health and the aquatic ecosystem.
Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip
2007-08-01
The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on assessment of biofilm production by staphylococci, gained both by direct experience as well as by analysis of methods for assaying biofilm production. The obtained results should simplify quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.
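For a concrete picture of how such microtiter readings are commonly turned into categories, the sketch below applies the cut-off scheme usually associated with this protocol (ODc derived from negative controls, then thresholds at ODc, 2×ODc and 4×ODc); the exact cut-off definition and the example values are assumptions for illustration, not a transcription of the published protocol.

```python
import numpy as np

def classify_biofilm(od_samples, od_negative_controls):
    """Classify biofilm production from microtiter-plate OD readings.

    ODc is taken as the mean OD of the negative controls plus three standard
    deviations; strains are then graded as non/weak/moderate/strong producers
    using the ODc, 2*ODc and 4*ODc thresholds.
    """
    odc = np.mean(od_negative_controls) + 3 * np.std(od_negative_controls, ddof=1)
    od = np.mean(od_samples)
    if od <= odc:
        return "non-producer"
    elif od <= 2 * odc:
        return "weak producer"
    elif od <= 4 * odc:
        return "moderate producer"
    return "strong producer"

# example: triplicate wells for one strain and eight negative-control wells
print(classify_biofilm([0.41, 0.38, 0.44],
                       [0.09, 0.10, 0.08, 0.11, 0.09, 0.10, 0.12, 0.09]))
```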
On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification
Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.
2014-01-01
Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantifications in phantom and ex vivo acquisitions. PMID:24123362
NASA Astrophysics Data System (ADS)
Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène
2015-11-01
Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole-metabolite quantification approach, correlation spectroscopy quantification performance is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable the computation of the lower bound for error variance - generally known as the Cramér-Rao bounds (CRBs), a standard of precision - on the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of two per unit of acquisition time compared to JPRESS. A rapid analysis could suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
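For reference, the Cramér-Rao bound invoked here takes the following standard form for a parametric signal model with independent Gaussian noise; the notation is generic rather than specific to the sequences discussed above.

```latex
% Fisher information of a sampled signal model s_n(\theta) with
% independent Gaussian noise of variance \sigma^2, and the resulting bound:
I_{jk}(\boldsymbol{\theta}) \;=\; \frac{1}{\sigma^{2}}
  \sum_{n} \frac{\partial s_{n}(\boldsymbol{\theta})}{\partial \theta_{j}}\,
           \frac{\partial s_{n}(\boldsymbol{\theta})}{\partial \theta_{k}},
\qquad
\operatorname{Var}\!\bigl(\hat{\theta}_{j}\bigr) \;\ge\;
  \bigl[\, I(\boldsymbol{\theta})^{-1} \bigr]_{jj} \;=\; \mathrm{CRB}(\theta_{j}).
```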
Ott, Stephan J; Musfeldt, Meike; Ullmann, Uwe; Hampe, Jochen; Schreiber, Stefan
2004-06-01
The composition of the human intestinal flora is important for the health status of the host. The global composition and the presence of specific pathogens are relevant to the effects of the flora. Therefore, accurate quantification of all major bacterial populations of the enteric flora is needed. A TaqMan real-time PCR-based method for the quantification of 20 dominant bacterial species and groups of the intestinal flora has been established on the basis of 16S ribosomal DNA taxonomy. A PCR with conserved primers was used for all reactions. In each real-time PCR, a universal probe for quantification of total bacteria and a specific probe for the species in question were included. PCR with conserved primers and the universal probe for total bacteria allowed relative and absolute quantification. Minor groove binder probes increased the sensitivity of the assays 10- to 100-fold. The method was evaluated by cross-reaction experiments and quantification of bacteria in complex clinical samples from healthy patients. A sensitivity of 10(1) to 10(3) bacterial cells per sample was achieved. No significant cross-reaction was observed. The real-time PCR assays presented may facilitate understanding of the intestinal bacterial flora through a normalized global estimation of the major contributing species.
Sánchez-Guijo, Alberto; Oji, Vinzenz; Hartmann, Michaela F.; Traupe, Heiko; Wudy, Stefan A.
2015-01-01
Steroids are primarily present in human fluids in their sulfated forms. Profiling of these compounds is important from both diagnostic and physiological points of view. Here, we present a novel method for the quantification of 11 intact steroid sulfates in human serum by LC-MS/MS. The compounds analyzed in our method, some of which are quantified for the first time in blood, include cholesterol sulfate, pregnenolone sulfate, 17-hydroxy-pregnenolone sulfate, 16-α-hydroxy-dehydroepiandrosterone sulfate, dehydroepiandrosterone sulfate, androstenediol sulfate, androsterone sulfate, epiandrosterone sulfate, testosterone sulfate, epitestosterone sulfate, and dihydrotestosterone sulfate. The assay was conceived to quantify sulfated steroids in a broad range of concentrations, requiring only 300 μl of serum. The method has been validated and its performance was studied at three quality controls, selected for each compound according to its physiological concentration. The assay showed good linearity (R2 > 0.99) and recovery for all the compounds, with limits of quantification ranging between 1 and 80 ng/ml. Averaged intra-day and between-day precisions (coefficient of variation) and accuracies (relative errors) were below 10%. The method has been successfully applied to study the sulfated steroidome in diseases such as steroid sulfatase deficiency, proving its diagnostic value. This is, to our best knowledge, the most comprehensive method available for the quantification of sulfated steroids in human blood. PMID:26239050
Leveraging transcript quantification for fast computation of alternative splicing profiles.
Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo
2015-09-01
Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA accuracy is comparable and sometimes superior to standard methods using simulated as well as real RNA-sequencing data compared with experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
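The quantity SUPPA reports for each event, the percent spliced in (PSI), reduces to a ratio of transcript abundances; the sketch below shows that computation under the assumption that transcript-level TPM estimates are already available (the transcript identifiers are invented for the example).

```python
def event_psi(tpm, inclusion_transcripts, total_transcripts):
    """PSI of one splicing event from transcript-level TPM estimates.

    PSI = (sum of TPMs of transcripts containing the inclusion form)
          / (sum of TPMs of all transcripts defining the event).
    Returns None when the event is not expressed.
    """
    inc = sum(tpm.get(t, 0.0) for t in inclusion_transcripts)
    tot = sum(tpm.get(t, 0.0) for t in total_transcripts)
    return inc / tot if tot > 0 else None

# hypothetical transcript quantifications (TPM) for one gene
tpm = {"ENST_A_inclusion": 12.0, "ENST_B_skipping": 4.0}
psi = event_psi(tpm, ["ENST_A_inclusion"],
                ["ENST_A_inclusion", "ENST_B_skipping"])
print(round(psi, 3))  # 0.75
```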
miR-MaGiC improves quantification accuracy for small RNA-seq.
Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina
2018-05-15
Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space to "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more optimal quantification to a more meaningful unit (i.e., miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I
2017-01-20
There are numerous applications of quantitative PCR for both diagnostic and basic research. As in many other techniques the basis of quantification is that comparisons are made between different (unknown and known or reference) specimens of the same entity. When the aim is to compare real quantities of different species in samples, one cannot escape their separate precise absolute quantification. We have established a simple and reliable method for this purpose (Ct shift method) which combines the absolute and the relative approach. It requires a plasmid standard containing both sequences of amplicons to be compared (e.g. the target of interest and the endogenous control). It can serve as a reference sample with equal copies of templates for both targets. Using the ΔΔCt formula we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied for transposon gene copy measurements, as well as for comparison of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of results even without the need for real reference samples can contribute to the universality of the method and comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
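A minimal sketch of the Ct shift arithmetic is given below; the assumption of 100% amplification efficiency (a factor of 2 per cycle) and the example Ct values are illustrative rather than taken from the study.

```python
def template_ratio(ct_target_sample, ct_control_sample,
                   ct_target_plasmid, ct_control_plasmid, efficiency=2.0):
    """Ratio of target to control template copies in an unknown sample.

    The plasmid standard carries both amplicon sequences in equal copy number,
    so its Ct difference calibrates out amplicon-specific offsets; the remaining
    (delta-delta Ct) shift reflects the copy ratio in the unknown sample.
    """
    ddct = (ct_target_sample - ct_control_sample) \
           - (ct_target_plasmid - ct_control_plasmid)
    return efficiency ** (-ddct)

# example: hypothetical Ct values for a transposon copy-number measurement
ratio = template_ratio(24.1, 23.1, 20.5, 20.5)
print(round(ratio, 2))  # 0.5 -> target present at half the copy number of the control
```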
Methods to Detect Nitric Oxide and its Metabolites in Biological Samples
Bryan, Nathan S.; Grisham, Matthew B.
2007-01-01
Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes including regulation of blood pressure, immune response and neural communication. Therefore its accurate detection and quantification are critical to understanding health and disease. Due to the extremely short physiological half-life of this gaseous free radical, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The methods described in this review are not an exhaustive or comprehensive discussion of all methods available for the detection of NO but rather a description of the most commonly used and practical methods which allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129
MRI-based methods for quantification of the cerebral metabolic rate of oxygen
Rodgers, Zachary B; Detre, John A
2016-01-01
The brain depends almost entirely on oxidative metabolism to meet its significant energy requirements. As such, the cerebral metabolic rate of oxygen (CMRO2) represents a key measure of brain function. Quantification of CMRO2 has helped elucidate brain functional physiology and holds potential as a clinical tool for evaluating neurological disorders including stroke, brain tumors, Alzheimer’s disease, and obstructive sleep apnea. In recent years, a variety of magnetic resonance imaging (MRI)-based CMRO2 quantification methods have emerged. Unlike positron emission tomography – the current “gold standard” for measurement and mapping of CMRO2 – MRI is non-invasive, relatively inexpensive, and ubiquitously available in modern medical centers. All MRI-based CMRO2 methods are based on modeling the effect of paramagnetic deoxyhemoglobin on the magnetic resonance signal. The various methods can be classified in terms of the MRI contrast mechanism used to quantify CMRO2: T2*, T2′, T2, or magnetic susceptibility. This review article provides an overview of MRI-based CMRO2 quantification techniques. After a brief historical discussion motivating the need for improved CMRO2 methodology, current state-of-the-art MRI-based methods are critically appraised in terms of their respective tradeoffs between spatial resolution, temporal resolution, and robustness, all of critical importance given the spatially heterogeneous and temporally dynamic nature of brain energy requirements. PMID:27089912
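All of these MRI approaches ultimately rest on the Fick principle, which in standard notation reads as follows (the symbols are generic and not specific to any one of the reviewed methods):

```latex
% Fick principle: oxygen consumption equals delivery times extraction.
\mathrm{CMRO_2} \;=\; \mathrm{CBF}\,\bigl(C_a\mathrm{O_2} - C_v\mathrm{O_2}\bigr)
\;=\; \mathrm{CBF} \times \mathrm{OEF} \times C_a\mathrm{O_2},
```

where CBF is cerebral blood flow, OEF the oxygen extraction fraction, and CaO2 and CvO2 the arterial and venous oxygen contents, respectively.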
Jensen, Jacob S; Egebo, Max; Meyer, Anne S
2008-05-28
Accomplishment of fast tannin measurements is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions which would allow quantification of tannins from the spectra using partial least-squares regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm(-1)), which therefore were concluded to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) as compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
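A minimal sketch of the final modeling step, restricting the spectra to the two wavenumber regions highlighted above and fitting a partial least-squares regression, is shown below; the spectra, reference values and component count are placeholders rather than the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
wavenumbers = np.arange(4000, 700, -4.0)        # illustrative FT-MIR grid, cm-1
X = rng.normal(size=(60, wavenumbers.size))     # placeholder wine spectra
y = rng.normal(800, 150, size=60)               # placeholder tannin, mg CE/L

# keep only the two regions identified as important for tannin quantification
mask = ((wavenumbers <= 1485) & (wavenumbers >= 1425)) | \
       ((wavenumbers <= 1060) & (wavenumbers >= 995))
X_sel = X[:, mask]

pls = PLSRegression(n_components=6)             # component count is a placeholder
y_cv = cross_val_predict(pls, X_sel, y, cv=10).ravel()
rmsecv = np.sqrt(np.mean((y_cv - y) ** 2))
print(f"RMSECV: {rmsecv:.0f} mg CE/L")
```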
Barricklow, Jason; Ryder, Tim F; Furlong, Michael T
2009-08-01
During LC-MS/MS quantification of a small molecule in human urine samples from a clinical study, an unexpected peak was observed to nearly co-elute with the analyte of interest in many study samples. Improved chromatographic resolution revealed the presence of at least 3 non-analyte peaks, which were identified as cysteine metabolites and N-acetyl (mercapturic acid) derivatives thereof. These metabolites produced artifact responses in the parent compound MRM channel due to decomposition in the ionization source of the mass spectrometer. Quantitative comparison of the analyte concentrations in study samples using the original chromatographic method and the improved chromatographic separation method demonstrated that the original method substantially over-estimated the analyte concentration in many cases. The substitution of electrospray ionization (ESI) for atmospheric pressure chemical ionization (APCI) nearly eliminated the source instability of these metabolites, which would have mitigated their interference in the quantification of the analyte, even without chromatographic separation. These results 1) demonstrate the potential for thiol metabolite interferences during the quantification of small molecules in pharmacokinetic samples, and 2) underscore the need to carefully evaluate LC-MS/MS methods for molecules that can undergo metabolism to thiol adducts to ensure that they are not susceptible to such interferences during quantification.
Quantifying errors without random sampling.
Phillips, Carl V; LaPole, Luwanna M
2003-06-12
All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
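The Monte Carlo approach advocated here can be sketched in a few lines; the toy model (reported cases scaled by an underreporting multiplier and divided by population) and the input distributions are invented for illustration and are not the published foodborne-illness calculation.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000  # number of Monte Carlo draws

# each uncertain, non-sampled input gets a distribution expressing its uncertainty
reported_cases = rng.normal(1.2e6, 0.1e6, n)      # surveillance count (hypothetical)
underreporting = rng.triangular(5, 20, 40, n)     # multiplier from expert judgment
population = rng.normal(2.9e8, 0.02e8, n)

incidence_per_1000 = reported_cases * underreporting / population * 1000

lo, med, hi = np.percentile(incidence_per_1000, [2.5, 50, 97.5])
print(f"annual incidence per 1000: {med:.0f} "
      f"(95% uncertainty interval {lo:.0f}-{hi:.0f})")
```

Reporting the interval rather than a single point estimate is precisely the kind of honest representation of systematic uncertainty the abstract argues for.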
Nahar, Limon Khatun; Cordero, Rosa Elena; Nutt, David; Lingford-Hughes, Anne; Turton, Samuel; Durant, Claire; Wilson, Sue; Paterson, Sue
2016-03-01
A highly sensitive and fully validated method was developed for the quantification of baclofen in human plasma. After adjusting the pH of the plasma samples using a phosphate buffer solution (pH 4), baclofen was purified using mixed mode (C8/cation exchange) solid-phase extraction (SPE) cartridges. Endogenous water-soluble compounds and lipids were removed from the cartridges before the samples were eluted and concentrated. The samples were analyzed using triple-quadrupole liquid chromatography-tandem mass spectrometry (LC-MS-MS) with triggered dynamic multiple reaction monitoring mode for simultaneous quantification and confirmation. The assay was linear from 25 to 1,000 ng/mL (r(2) > 0.999; n = 6). Intraday (n = 6) and interday (n = 15) imprecisions (% relative standard deviation) were <5%, and the average recovery was 30%. The limit of detection of the method was 5 ng/mL, and the limit of quantification was 25 ng/mL. Plasma samples from healthy male volunteers (n = 9, median age: 22) given two single oral doses of baclofen (10 and 60 mg) on nonconsecutive days were analyzed to demonstrate method applicability. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products.
2014-01-01
Background To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. Results To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5′-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. Conclusions The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products. PMID:24884946
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
NASA Astrophysics Data System (ADS)
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.
2018-03-01
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
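As an illustration of the global sensitivity analysis step (not the authors' implementation), first-order Sobol indices can be estimated with a simple pick-freeze scheme applied to a cheap surrogate of the quantity of interest; the surrogate function and parameter count below are invented for the example.

```python
import numpy as np

def first_order_sobol(model, n_params, n_samples=20_000, seed=0):
    """Estimate first-order Sobol indices with a pick-freeze scheme.

    model: callable mapping an (n, n_params) array of inputs in [0, 1] to (n,) outputs.
    Returns S, where S[i] approximates the fraction of output variance
    explained by parameter i alone.
    """
    rng = np.random.default_rng(seed)
    A = rng.random((n_samples, n_params))
    B = rng.random((n_samples, n_params))
    fA, fB = model(A), model(B)
    var = np.var(np.concatenate([fA, fB]))
    S = np.empty(n_params)
    for i in range(n_params):
        ABi = B.copy()
        ABi[:, i] = A[:, i]          # "freeze" parameter i at the A values
        S[i] = np.mean(fA * (model(ABi) - fB)) / var
    return S

# toy stand-in for an expensive flow quantity of interest (e.g. a combustor metric)
def surrogate(x):
    return 4 * x[:, 0] + 2 * x[:, 1] ** 2 + 0.3 * x[:, 2] + 0.05 * x[:, 3:].sum(axis=1)

print(np.round(first_order_sobol(surrogate, n_params=8), 3))
```

Parameters with indices near zero are candidates for fixing at nominal values, which is how sensitivity analysis reduces the stochastic dimension before the expensive uncertainty propagation.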
Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François
2015-08-18
The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Nowadays, analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to direct measurement and protein characterization ability. We developed here an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of the three potential biological warfare agents, ricin, staphylococcal enterotoxin B, and epsilon toxin, in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early in the sample. Lower limits of quantification were determined at or close to 1 ng·mL(-1). The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability with no evidence of toxin degradation in milk in a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.
Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...
2018-02-09
The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
Detection and Quantification of Human Fecal Pollution with Real-Time PCR
ABSTRACT Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described ...
Porra, Luke; Swan, Hans; Ho, Chien
2015-08-01
Introduction: Acoustic Radiation Force Impulse (ARFI) Quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of the present study was that an increase in applied transducer force would result in an increase in measured SWVs. Methods: ARFI quantification within the left lobe of the liver was performed in a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Significant pairwise comparisons using Bonferroni correction for multiple comparisons showed that with an increase in applied transducer force, there was a decrease in SWVs. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Di, Zichao; Chen, Si; Hong, Young Pyo
X-ray fluorescence tomography is based on the detection of fluorescence x-ray photons produced following x-ray absorption while a specimen is rotated; it provides information on the 3D distribution of selected elements within a sample. One limitation in the quality of sample recovery is the separation of elemental signals due to the finite energy resolution of the detector. Another limitation is the effect of self-absorption, which can lead to inaccurate results with dense samples. To recover a higher quality elemental map, we combine x-ray fluorescence detection with a second data modality: conventional x-ray transmission tomography using absorption. By using these combined signals in a nonlinear optimization-based approach, we demonstrate the benefit of our algorithm on real experimental data and obtain an improved quantitative reconstruction of the spatial distribution of dominant elements in the sample. Furthermore, compared with single-modality inversion based on x-ray fluorescence alone, this joint inversion approach reduces ill-posedness and should result in improved elemental quantification and better correction of self-absorption.
Joint reconstruction of x-ray fluorescence and transmission tomography
Di, Zichao Wendy; Chen, Si; Hong, Young Pyo; Jacobsen, Chris; Leyffer, Sven; Wild, Stefan M.
2017-01-01
X-ray fluorescence tomography is based on the detection of fluorescence x-ray photons produced following x-ray absorption while a specimen is rotated; it provides information on the 3D distribution of selected elements within a sample. One limitation in the quality of sample recovery is the separation of elemental signals due to the finite energy resolution of the detector. Another limitation is the effect of self-absorption, which can lead to inaccurate results with dense samples. To recover a higher quality elemental map, we combine x-ray fluorescence detection with a second data modality: conventional x-ray transmission tomography using absorption. By using these combined signals in a nonlinear optimization-based approach, we demonstrate the benefit of our algorithm on real experimental data and obtain an improved quantitative reconstruction of the spatial distribution of dominant elements in the sample. Compared with single-modality inversion based on x-ray fluorescence alone, this joint inversion approach reduces ill-posedness and should result in improved elemental quantification and better correction of self-absorption. PMID:28788848
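The joint-inversion idea above can be sketched as a single nonlinear least-squares problem that stacks the misfits of both modalities. This is only an illustrative skeleton: the forward operators below are random placeholder matrices, not physical fluorescence or transmission models, and the weighting parameter `w` is a hypothetical knob.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)

n_pix, n_fluo, n_trans = 50, 40, 40
x_true = np.abs(rng.normal(size=n_pix))      # unknown elemental map (flattened), nonnegative
A_fluo = rng.random((n_fluo, n_pix))         # placeholder fluorescence forward model
A_trans = rng.random((n_trans, n_pix))       # placeholder transmission forward model
y_fluo = A_fluo @ x_true + 0.05 * rng.normal(size=n_fluo)
y_trans = A_trans @ x_true + 0.05 * rng.normal(size=n_trans)

def residuals(x, w=1.0):
    # stacked residuals of both modalities; w balances their relative influence
    return np.concatenate([A_fluo @ x - y_fluo, w * (A_trans @ x - y_trans)])

sol = least_squares(residuals, x0=np.ones(n_pix), bounds=(0, np.inf))
print("relative reconstruction error:",
      np.linalg.norm(sol.x - x_true) / np.linalg.norm(x_true))
```

Using both data sets in one objective is what reduces ill-posedness relative to fitting the fluorescence data alone.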
DOE Office of Scientific and Technical Information (OSTI.GOV)
Haslam, J J; Wall, M A; Johnson, D L
We have measured and modeled the change in electrical resistivity due to partial transformation to the martensitic α′-phase in a δ-phase Pu-Ga matrix. The primary objective is to relate the change in resistance, measured with a 4-probe technique during the transformation, to the volume fraction of the α′ phase created in the microstructure. Analysis by finite element methods suggests that considerable differences in the resistivity may be anticipated depending on the orientational and morphological configurations of the α′ particles. Finite element analysis of the computed resistance of an assembly of lenticular shaped particles indicates that series resistor or parallel resistor approximations are inaccurate and can lead to an underestimation of the predicted amount of α′ in the sample by 15% or more. Comparison of the resistivity of a simulated network of partially transformed grains or portions of grains suggests that a correction to the measured resistivity allows quantification of the amount of α′ phase in the microstructure with minimal consideration of how the α′ morphology may evolve. It is found that the average of the series and parallel resistor approximations provides the most accurate relationship between the measured resistivity and the amount of α′ phase. The methods described here are applicable to any evolving two-phase microstructure in which the resistance difference between the two phases is measurable.
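The series and parallel mixing rules mentioned above, and their average, can be written out directly and inverted for the volume fraction. A minimal sketch follows; the phase resistivities and the measured value are placeholder numbers for illustration, not measured Pu data.

```python
from scipy.optimize import brentq

# placeholder phase resistivities (arbitrary units), not actual Pu values
rho_delta = 100.0   # untransformed delta-phase matrix
rho_alpha = 150.0   # martensitic alpha'-phase

def rho_series(f):    # phases stacked in series along the current path
    return f * rho_alpha + (1 - f) * rho_delta

def rho_parallel(f):  # phases arranged in parallel with the current path
    return 1.0 / (f / rho_alpha + (1 - f) / rho_delta)

def rho_avg(f):       # average of the two bounds, as suggested in the abstract
    return 0.5 * (rho_series(f) + rho_parallel(f))

rho_measured = 110.0  # hypothetical 4-probe measurement
f_est = brentq(lambda f: rho_avg(f) - rho_measured, 0.0, 1.0)
print(f"estimated alpha' volume fraction: {f_est:.3f}")
```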
Khalilzadeh, Balal; Shadjou, Nasrin; Afsharan, Hadi; Eskandani, Morteza; Nozad Charoudeh, Hojjatollah; Rashidi, Mohammad-Reza
2016-01-01
Introduction: Growing demands for ultrasensitive biosensing have led to the development of numerous signal amplification strategies. In this report, a novel electrochemiluminescence (ECL) method was developed for the detection and determination of caspase-3 activity based on reduced graphene oxide sheets decorated by gold nanoparticles as the signal amplification element and horseradish peroxidase enzyme (HRP) as the ECL intensity enhancing agent. Methods: The ECL intensity of the luminol was improved by using streptavidin-coated magnetic beads and HRP in the presence of hydrogen peroxide. The cleavage behavior of caspase-3 was characterized by cyclic voltammetry (CV) and electrochemical impedance spectroscopy (EIS) techniques using a biotinylated peptide (DEVD-containing peptide) which was coated on reduced graphene oxide decorated with gold nanoparticles. The surface modification of graphene oxide was successfully confirmed by FTIR, UV-vis and X-ray spectroscopy. Results: The ECL-based biosensor showed a linear dynamic range (LDR) of 0.5-100 femtomolar (fM) and a lower limit of quantification (LLOQ) of 0.5 fM. Finally, the performance of the engineered peptide-based biosensor was validated in the A549 cell line as a real sample. Conclusion: The prepared peptide-based biosensor could be considered an excellent candidate for early detection of apoptosis, cell turnover, and cancer-related diseases. PMID:27853677
Esquinas, Pedro L; Uribe, Carlos F; Gonzalez, M; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna
2017-07-20
The main applications of 188Re in radionuclide therapies include trans-arterial liver radioembolization and palliation of painful bone metastases. In order to optimize 188Re therapies, the accurate determination of the radiation dose delivered to tumors and organs at risk is required. Single photon emission computed tomography (SPECT) can be used to perform such dosimetry calculations. However, the accuracy of dosimetry estimates strongly depends on the accuracy of activity quantification in 188Re images. In this study, we performed a series of phantom experiments aiming to investigate the accuracy of activity quantification for 188Re SPECT using high-energy and medium-energy collimators. Objects of different shapes and sizes were scanned in Air, non-radioactive water (Cold-water) and water with activity (Hot-water). The ordered subset expectation maximization algorithm with clinically available corrections (CT-based attenuation, triple-energy window (TEW) scatter, and resolution recovery) was used. For high activities, dead-time corrections were applied. The accuracy of activity quantification was evaluated using the ratio of the reconstructed activity in each object to that object's true activity. Each object's activity was determined with three segmentation methods: a 1% fixed threshold (for cold background), a 40% fixed threshold and a CT-based segmentation. Additionally, the activity recovered in the entire phantom, as well as the average activity concentration of the phantom background, were compared to their true values. Finally, Monte-Carlo simulations of a commercial γ-camera were performed to investigate the accuracy of the TEW method. Good quantification accuracy (errors <10%) was achieved for the entire phantom, the hot-background activity concentration, and for objects in cold background segmented with a 1% threshold. However, the accuracy of activity quantification for objects segmented with the 40% threshold or CT-based methods decreased (errors >15%), mostly due to partial-volume effects. The Monte-Carlo simulations confirmed that the TEW scatter correction applied to 188Re, although practical, yields only approximate estimates of the true scatter.
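The triple-energy-window (TEW) scatter estimate referred to above has a simple closed form: the scatter inside the photopeak window is approximated from counts in two narrow windows flanking it. The sketch below uses the standard Ogawa-type formula; the counts and window widths are made-up placeholders, not acquisition settings from the study.

```python
def tew_scatter(c_lower, c_upper, w_lower, w_upper, w_peak):
    """Ogawa-style triple-energy-window scatter estimate for the photopeak window."""
    return 0.5 * (c_lower / w_lower + c_upper / w_upper) * w_peak

# hypothetical counts and window widths (keV) around the 155 keV photopeak of 188Re
c_peak, c_lower, c_upper = 12000.0, 1500.0, 900.0
w_peak, w_lower, w_upper = 31.0, 6.0, 6.0   # placeholder window widths

scatter = tew_scatter(c_lower, c_upper, w_lower, w_upper, w_peak)
primary = max(c_peak - scatter, 0.0)
print(f"scatter estimate: {scatter:.0f} counts, primary estimate: {primary:.0f} counts")
```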
Hendriks, Lyndsey; Gundlach-Graham, Alexander; Günther, Detlef
2018-04-25
Due to the rapid development of nanotechnologies, engineered nanomaterials (ENMs) and nanoparticles (ENPs) are becoming a part of everyday life: nanotechnologies are quickly migrating from laboratory benches to store shelves and industrial processes. As the use of ENPs continues to expand, their release into the environment is unavoidable; however, understanding the mechanisms and degree of ENP release is only possible through direct detection of these nanospecies in relevant matrices and at realistic concentrations. Key analytical requirements for quantitative detection of ENPs include high sensitivity to detect small particles at low total mass concentrations and the need to separate signals of ENPs from a background of dissolved elemental species and natural nanoparticles (NNPs). To this end, an emerging method called single-particle inductively coupled plasma mass spectrometry (sp-ICPMS) has demonstrated great potential for the characterization of inorganic nanoparticles (NPs) at environmentally relevant concentrations. Here, we comment on the capabilities of modern sp-ICPMS analysis with particular focus on the measurement possibilities offered by ICP-time-of-flight mass spectrometry (ICP-TOFMS). ICP-TOFMS delivers complete elemental mass spectra for individual NPs, which allows for high-throughput, untargeted quantitative analysis of dispersed NPs in natural matrices. Moreover, the multi-element detection capabilities of ICP-TOFMS enable new NP-analysis strategies, including online calibration via microdroplets for accurate NP mass quantification and matrix compensation.
Direct quantification of rare earth doped titania nanoparticles in individual human cells
NASA Astrophysics Data System (ADS)
Jeynes, J. C. G.; Jeynes, C.; Palitsin, V.; Townley, H. E.
2016-07-01
There are many possible biomedical applications for titania nanoparticles (NPs) doped with rare earth elements (REEs), from dose enhancement and diagnostic imaging in radiotherapy, to biosensing. However, there are concerns that the NPs could disintegrate in the body thus releasing toxic REE ions to undesired locations. As a first step, we investigate how accurately the Ti/REE ratio from the NPs can be measured inside human cells. A quantitative analysis of whole, unsectioned, individual human cells was performed using proton microprobe elemental microscopy. This method is unique in being able to quantitatively analyse all the elements in an unsectioned individual cell with micron resolution, while also scanning large fields of view. We compared the Ti/REE signal inside cells to NPs that were outside the cells, non-specifically absorbed onto the polypropylene substrate. We show that the REE signal in individual cells co-localises with the titanium signal, indicating that the NPs have remained intact. Within the uncertainty of the measurement, there is no difference between the Ti/REE ratio inside and outside the cells. Interestingly, we also show that there is considerable variation in the uptake of the NPs from cell-to-cell, by a factor of more than 10. We conclude that the NPs enter the cells and remain intact. The large heterogeneity in NP concentrations from cell-to-cell should be considered if they are to be used therapeutically.
Kikuchi, Hiroyuki; Tsutsumi, Tomoaki; Matsuda, Rieko
2012-01-01
A method for the quantification of histamine in fish and fish products using tandem solid-phase extraction and fluorescence derivatization with fluorescamine was previously developed. In this study, we improved this analytical method to develop an official test method for the quantification of histamine in fish and fish products, and performed a single-laboratory study to validate it. Recovery tests were carried out on fish fillet (Thunnus obesus) spiked at 25 and 50 µg/g, and on two fish products (fish sauce and salted and dried whole big-eye sardine) spiked at 50 and 100 µg/g. The recoveries of histamine from the three samples tested were 88.8-99.6% with good repeatability (1.3-2.1%) and reproducibility (2.1-4.7%). Therefore, this method is acceptable for the quantification of histamine in fish and fish products. Moreover, surveillance of the histamine content in food on the market was conducted using this method, and high levels of histamine were detected in some fish products.
Hidau, Mahendra Kumar; Kolluru, Srikanth; Palakurthi, Srinath
2018-02-01
A sensitive and selective RP-HPLC method has been developed and validated for the quantification of a highly potent poly ADP ribose polymerase inhibitor talazoparib (TZP) in rat plasma. Chromatographic separation was performed with an isocratic elution method. Absorbance for TZP was measured with a UV detector (SPD-20A UV-vis) at a λ max of 227 nm. Protein precipitation was used to extract the drug from plasma samples using methanol-acetonitrile (65:35) as the precipitating solvent. The method proved to be sensitive and reproducible over a 100-2000 ng/mL linearity range with a lower limit of quantification (LLOQ) of 100 ng/mL. TZP recovery was found to be >85%. Following analytical method development and validation, the method was successfully employed to determine the plasma protein binding of TZP. TZP has a high level of protein binding in rat plasma (95.76 ± 0.38%) as determined by the dialysis method. Copyright © 2017 John Wiley & Sons, Ltd.
Begou, O; Kontou, A; Raikos, N; Sarafidis, K; Roilides, E; Papadoyannis, I N; Gika, H G
2017-03-15
The development and validation of an ultra-high pressure liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was performed with the aim of quantifying plasma teicoplanin concentrations in neonates. Pharmacokinetic data on teicoplanin in the neonatal population are very limited; therefore, a sensitive and reliable method for the determination of all isoforms of teicoplanin, applicable to a low sample volume, is of real importance. Teicoplanin main components were extracted by a simple acetonitrile precipitation step and analysed on a C18 chromatographic column by a triple quadrupole MS with electrospray ionization. The method provides quantitative data over a linear range of 25-6400 ng/mL with an LOD of 8.5 ng/mL and an LOQ of 25 ng/mL for total teicoplanin. The method was applied to plasma samples from neonates to support pharmacokinetic data and proved to be a reliable and fast method for the quantification of teicoplanin concentration levels in the plasma of infants during therapy in the Intensive Care Unit. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Muñiz, Rocío; Lobo, Lara; Németh, Katalin; Péter, László; Pereiro, Rosario
2017-09-01
There is still a lack of approaches for quantitative depth-profiling when dealing with glow discharges (GD) coupled to mass spectrometric detection. The purpose of this work is to develop quantification procedures using pulsed GD (PGD) - time of flight mass spectrometry. In particular, research was focused towards the depth profile analysis of Cu/NiCu nanolayers and multilayers electrodeposited on Si wafers. PGDs are characterized by three different regions due to the temporal application of power: prepeak, plateau and afterglow. This last region is the most sensitive and so it is convenient for quantitative analysis of minor components; however, major elements are often saturated, even at 30 W of applied radiofrequency power for these particular samples. For such cases, we have investigated two strategies based on a multimatrix calibration procedure: (i) using the afterglow region for all the sample components except for the major element (Cu) that was analyzed in the plateau, and (ii) using the afterglow region for all the elements measuring the ArCu signal instead of Cu. Seven homogeneous certified reference materials containing Si, Cr, Fe, Co, Ni and Cu have been used for quantification. Quantitative depth profiles obtained with these two strategies for samples containing 3 or 6 multilayers (of a few tens of nanometers each layer) were in agreement with the expected values, both in terms of thickness and composition of the layers.
Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2014-01-01
Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with a focus on food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
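The Dice similarity score used above to validate the fat segmentations is straightforward to compute from binary masks. A minimal sketch with toy masks standing in for the automated and semimanual delineations:

```python
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice similarity coefficient between two binary masks (0 = no overlap, 1 = identical)."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# toy 2D masks standing in for automated vs. semimanual fat delineations
auto = np.zeros((10, 10), dtype=bool)
auto[2:8, 2:8] = True
manual = np.zeros((10, 10), dtype=bool)
manual[3:8, 2:9] = True

print(f"Dice = {dice(auto, manual):.3f}")
```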
Expert judgement and uncertainty quantification for climate change
NASA Astrophysics Data System (ADS)
Oppenheimer, Michael; Little, Christopher M.; Cooke, Roger M.
2016-05-01
Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.
Zhao, Ming; Huang, Run; Peng, Leilei
2012-11-19
Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium.
Zhao, Ming; Huang, Run; Peng, Leilei
2012-01-01
Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time-resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity of the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535
Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir
2018-05-01
In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials to be used as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform. However, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification" (2011), Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The digital PCR methods performed as well as or better than the qPCR methods. The optimized ddPCR methods confirm their suitability for GMO determination in food and feed.
Rashed-Ul Islam, S M; Jahan, Munira; Tabassum, Shahina
2015-01-01
Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected from 81% samples by one-step PCR method with median HBV DNA viral load (VL) of 7.50 × 10³ IU/ml. In contrast, 72% samples were detected by the two-step PCR system with median HBV DNA of 3.71 × 10³ IU/ml. The one-step method showed strong linear correlation with two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement at Bland-Altman plot, with a mean difference of 0.61 log10 IU/ml and limits of agreement of -1.82 to 3.03 log10 IU/ml. The intra-assay and interassay coefficients of variation (CV%) of plasma samples (4-7 log10 IU/ml) for the one-step PCR method ranged between 0.33 to 0.59 and 0.28 to 0.48 respectively, thus demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource limited setting. Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15.
Jahan, Munira; Tabassum, Shahina
2015-01-01
Virological monitoring is the best predictor for the management of chronic hepatitis B virus (HBV) infections. Consequently, it is important to use the most efficient, rapid and cost-effective testing systems for HBV DNA quantification. The present study compared the performance characteristics of a one-step HBV polymerase chain reaction (PCR) vs the two-step HBV PCR method for quantification of HBV DNA from clinical samples. A total of 100 samples consisting of 85 randomly selected samples from patients with chronic hepatitis B (CHB) and 15 samples from apparently healthy individuals were enrolled in this study. Of the 85 CHB clinical samples tested, HBV DNA was detected from 81% samples by one-step PCR method with median HBV DNA viral load (VL) of 7.50 × 10³ IU/ml. In contrast, 72% samples were detected by the two-step PCR system with median HBV DNA of 3.71 × 10³ IU/ml. The one-step method showed strong linear correlation with two-step PCR method (r = 0.89; p < 0.0001). Both methods showed good agreement at Bland-Altman plot, with a mean difference of 0.61 log10 IU/ml and limits of agreement of -1.82 to 3.03 log10 IU/ml. The intra-assay and interassay coefficients of variation (CV%) of plasma samples (4-7 log10 IU/ml) for the one-step PCR method ranged between 0.33 to 0.59 and 0.28 to 0.48 respectively, thus demonstrating a high level of concordance between the two methods. Moreover, elimination of the DNA extraction step in the one-step PCR kit allowed time-efficient and significant labor and cost savings for the quantification of HBV DNA in a resource limited setting. How to cite this article Rashed-Ul Islam SM, Jahan M, Tabassum S. Evaluation of a Rapid One-step Real-time PCR Method as a High-throughput Screening for Quantification of Hepatitis B Virus DNA in a Resource-limited Setting. Euroasian J Hepato-Gastroenterol 2015;5(1):11-15. PMID:29201678
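The Bland-Altman agreement statistics reported above (mean difference and limits of agreement on log10 viral loads) follow from paired measurements in a few lines. The arrays below are illustrative values, not the study's data.

```python
import numpy as np

# hypothetical paired log10 HBV DNA loads (IU/ml) from one-step and two-step assays
one_step = np.array([4.2, 5.1, 6.3, 3.9, 7.0, 5.6])
two_step = np.array([4.0, 4.6, 6.1, 3.5, 6.2, 5.3])

diff = one_step - two_step
mean_diff = diff.mean()
sd_diff = diff.std(ddof=1)
loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)

print(f"mean difference: {mean_diff:.2f} log10 IU/ml")
print(f"95% limits of agreement: {loa[0]:.2f} to {loa[1]:.2f} log10 IU/ml")
```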
NASA Astrophysics Data System (ADS)
Rudmin, Daniel
Ionic polymer-metal composites (IPMCs) are some of the most well-known electro-active polymers, owing to the large deformation they produce from a relatively low voltage source. IPMCs have been acknowledged as potential candidates for biomedical applications such as cardiac catheters and surgical probes; however, there is still no mass manufacturing of IPMCs. This study intends to provide a theoretical framework that could be used to design practical-purpose IPMCs depending on the end user's interest. The study begins by investigating methodologies used to quantify the physical actuation of an IPMC in 3-dimensional space. This is approached in two separate ways, both of which utilize the finite element method. The first approach utilizes the finite element method to describe the dynamic response of a segmented IPMC actuator, constructing each element manually with a local coordinate system. Each system undergoes a rigid body motion along the element, and deformation of the element is expressed in the local coordinate frame. The physical phenomena in this system are simplified by utilizing a lumped RC model for the electro-mechanical coupling in the IPMC dynamics. The second study investigates 3D modeling of a rod-shaped IPMC actuator by explicitly coupling electrostatics, transport phenomena, and solid mechanics. This portion of the research briefly discusses the mathematical background that more accurately quantifies the physical phenomena. The 3-dimensional actuation is again solved explicitly with the finite element method, and the numerical solution is carried out in the software package COMSOL MULTIPHYSICS. This simulation allows explicit geometric rendering as well as more explicit quantification of physical quantities such as concentration, electric field, and deflection. The final study conducts design optimization on the COMSOL simulation in order to provide conceptual motivation for future designs. Applying a multiphysics analysis to three-dimensional cylinder- and tube-type IPMCs provides physically accurate results for time-dependent end-effector displacement given a voltage source. Simulations are conducted with the finite element method and are validated against empirical evidence. An in-depth understanding of the physical coupling provides optimal design parameters that cannot be obtained from a standard electro-mechanical coupling alone. These parameters are varied in order to determine optimal designs for end-effector displacement, maximum force, and improved mobility with a limited voltage magnitude. Design alterations are made to the electrode patterns to provide greater mobility, to the electrode size for efficient bending, and to the Nafion diameter for improved force. The results of this study provide optimal design parameters of the IPMC for different applications.
DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR
Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...
Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick
2014-01-01
Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)) as they become part of energy policy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and endanger the conversion into electricity by power plants, resulting in a requirement for a high level of purification. Assessing treatment efficiency is still difficult. No consensus has been reached to provide standardized sampling and quantification of VOSiCs in gases because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). With this objective, this paper presents and discusses the optimization of a patented method consisting in VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES. PMID:25379538
Round robin test on quantification of amyloid-β 1-42 in cerebrospinal fluid by mass spectrometry.
Pannee, Josef; Gobom, Johan; Shaw, Leslie M; Korecka, Magdalena; Chambers, Erin E; Lame, Mary; Jenkins, Rand; Mylott, William; Carrillo, Maria C; Zegers, Ingrid; Zetterberg, Henrik; Blennow, Kaj; Portelius, Erik
2016-01-01
Cerebrospinal fluid (CSF) amyloid-β 1-42 (Aβ42) is an important biomarker for Alzheimer's disease, both in diagnostics and to monitor disease-modifying therapies. However, there is a great need for standardization of methods used for quantification. To overcome problems associated with immunoassays, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has emerged as a critical orthogonal alternative. We compared results for CSF Aβ42 quantification in a round robin study performed in four laboratories using similar sample preparation methods and LC-MS instrumentation. The LC-MS results showed excellent correlation between laboratories (r² > 0.98), high analytical precision, and good correlation with enzyme-linked immunosorbent assay (r² > 0.85). The use of a common reference sample further decreased interlaboratory variation. Our results indicate that LC-MS is suitable for absolute quantification of Aβ42 in CSF and highlight the importance of developing a certified reference material. Copyright © 2016 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.
Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S
2016-03-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
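Quantification with a qPCR assay like the one above is typically done by running unknowns against a standard curve of known template quantities. A minimal sketch follows; the Cq values, curve parameters, and the label "genomic equivalents" are made-up placeholders, not parameters of the published assay.

```python
import numpy as np

# hypothetical standard curve: log10(template quantity) vs. measured Cq
log10_qty = np.array([5, 4, 3, 2, 1], dtype=float)
cq_std = np.array([18.1, 21.5, 24.9, 28.2, 31.6])

slope, intercept = np.polyfit(log10_qty, cq_std, 1)    # Cq = slope * log10(qty) + intercept
efficiency = 10 ** (-1.0 / slope) - 1.0                # ideal slope ~ -3.32 gives ~100% efficiency

def quantify(cq):
    """Back-calculate template quantity from a sample's Cq using the fitted curve."""
    return 10 ** ((cq - intercept) / slope)

print(f"slope {slope:.2f}, amplification efficiency {efficiency:.1%}")
print(f"unknown at Cq 26.0 corresponds to about {quantify(26.0):.0f} genomic equivalents")
```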
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
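LFQbench itself is an R package; as a rough Python sketch of the underlying idea only, accuracy can be scored as the deviation of observed log-ratios from the expected ratio of the hybrid proteome, and precision as their spread. The intensities and the expected 2:1 ratio below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)

# hypothetical protein intensities of one species in two hybrid-proteome samples (A and B)
expected_log2_ratio = 1.0                 # e.g. the species is spiked 2:1 between A and B
intensity_B = rng.lognormal(mean=10, sigma=1, size=500)
intensity_A = intensity_B * 2 ** (expected_log2_ratio + rng.normal(0, 0.3, size=500))

log2_ratios = np.log2(intensity_A / intensity_B)
accuracy_bias = np.median(log2_ratios) - expected_log2_ratio   # deviation from expected ratio
precision_sd = np.std(log2_ratios, ddof=1)                     # spread of the observed ratios

print(f"median deviation from expected log2 ratio: {accuracy_bias:+.3f}")
print(f"standard deviation of log2 ratios: {precision_sd:.3f}")
```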
Chottier, Claire; Chatain, Vincent; Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick
2014-01-01
Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)) as they become part of energy policy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and endanger the conversion into electricity by power plants, resulting in a requirement for a high level of purification. Assessing treatment efficiency is still difficult. No consensus has been reached to provide standardized sampling and quantification of VOSiCs in gases because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). With this objective, this paper presents and discusses the optimization of a patented method consisting in VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES.
Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David
2016-01-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.
Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.
Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew
2018-02-01
Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, the new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The obtained results suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
de Albuquerque, Carlos Diego L; Sobral-Filho, Regivaldo G; Poppi, Ronei J; Brolo, Alexandre G
2018-01-16
Single molecule surface-enhanced Raman spectroscopy (SM-SERS) has the potential to revolutionize quantitative analysis at ultralow concentrations (less than 1 nM). However, there are no established protocols to generalize the application of this technique in analytical chemistry. Here, a protocol for quantification at ultralow concentrations using SM-SERS is proposed. The approach aims to take advantage of the stochastic nature of the single-molecule regime to achieve lower limits of quantification (LOQ). Two emerging contaminants commonly found in aquatic environments, enrofloxacin (ENRO) and ciprofloxacin (CIPRO), were chosen as nonresonant molecular probes. The methodology involves a multivariate curve resolution fitting known as non-negative matrix factorization with an alternating least-squares algorithm (NMF-ALS) to resolve spectral overlaps. The key element of the quantification is to realize that, under SM-SERS conditions, the Raman intensity generated by a molecule adsorbed on a "hotspot" can be digitalized. Therefore, the number of SERS event counts (rather than SERS intensities) was shown to be proportional to the solution concentration. This allowed the determination of both ENRO and CIPRO with high accuracy and precision even in the ultralow concentration regime. The LOQ for both ENRO and CIPRO was 2.8 pM. The digital SERS protocol suggested here is a roadmap for the implementation of SM-SERS as a routine tool for quantification at ultralow concentrations.
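The "digital" counting idea above, scoring each acquisition as event/no-event and regressing event counts on concentration, can be sketched as follows. Everything here is synthetic: the detection threshold, the toy occupation probability, and the spectra are placeholders, not the published protocol's parameters.

```python
import numpy as np

rng = np.random.default_rng(4)

def count_events(peak_intensities, threshold):
    """Digitize acquisitions: an event is any acquisition whose peak exceeds the threshold."""
    return int(np.sum(peak_intensities > threshold))

# synthetic experiment: at higher concentration, more acquisitions catch a molecule in a hotspot
concentrations_pM = np.array([2.5, 5.0, 10.0, 20.0, 40.0])
n_acq, threshold = 2000, 50.0
counts = []
for c in concentrations_pM:
    occupied = rng.random(n_acq) < c / 100.0              # toy occupation probability ~ concentration
    peak = np.where(occupied,
                    rng.normal(120, 20, n_acq),            # SERS event intensity
                    rng.normal(10, 5, n_acq))              # background-only acquisition
    counts.append(count_events(peak, threshold))

slope, intercept = np.polyfit(concentrations_pM, counts, 1)
print("event counts per concentration:", counts)
print(f"calibration: counts ~ {slope:.1f} * c(pM) + {intercept:.1f}")
```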
NASA Astrophysics Data System (ADS)
Hirt, Ulrike; Mewes, Melanie; Meyer, Burghard C.
The structure of a landscape is highly relevant for research and planning (such as fulfilling the requirements of the Water Framework Directive - WFD - and for implementation of comprehensive catchment planning). There is a high potential for restoration of linear landscape elements in most European landscapes. By implementing the WFD in Germany, the restoration of linear landscape elements could be a valuable measure, for example to reduce nutrient input into rivers. Despite this importance of landscape structures for water and nutrients fluxes, biodiversity and the appearance of a landscape, specific studies of the linear elements are rare for larger catchment areas. Existing studies are limited because they either use remote sensing data, which does not adequately differentiate all types of linear landscape elements, or they focus only on a specific type of linear element. To address these limitations, we developed a framework allowing comprehensive quantification of linear landscape elements for catchment areas, using publicly available biotope type data. We analysed the dependence of landscape structures on natural regions and regional soil characteristics. Three data sets (differing in biotopes, soil parameters and natural regions) were generated for the catchment area of the middle Mulde River (2700 km²) in Germany, using overlay processes in geographic information systems (GIS), followed by statistical evaluation. The linear landscape components of the total catchment area are divided into roads (55%), flowing water (21%), tree rows (14%), avenues (5%), and hedges (2%). The occurrence of these landscape components varies regionally among natural units and different soil regions. For example, the mixed deciduous stands (3.5 m/ha) are far more frequent in foothills (6 m/ha) than in hill country (0.9 m/ha). In contrast, fruit trees are more frequent in hill country (5.2 m/ha) than in the cooler foothills (0.5 m/ha). Some 70% of avenues, and 40% of tree rows, are discontinuous; in contrast, only 20% of hedges are discontinuous. Using our innovative framework, comprehensive information about landscape elements can now be obtained for regional applications. This approach can be applied to other regions and is highly relevant for landscape planning, erosion control, protection of waters and preservation of biotopes and species.
Phylogenetic Quantification of Intra-tumour Heterogeneity
Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian
2014-01-01
Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184
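MEDICC's full model phases major and minor alleles and computes distances with transducers; as a loose illustration only, a much-simplified event count between two copy-number profiles can be obtained by treating contiguous +1 and -1 segmental events on gains and losses separately. This gives an upper bound under that simplification, not the MEDICC distance, and the profiles below are hypothetical.

```python
def simple_event_count(profile_a, profile_b):
    """Count contiguous +1/-1 segmental events needed to turn profile_a into profile_b,
    handling gains and losses independently (a simplification of true event distances)."""
    diff = [b - a for a, b in zip(profile_a, profile_b)]
    gains = [max(d, 0) for d in diff]
    losses = [max(-d, 0) for d in diff]

    def min_segment_ops(v):
        # classic result: ops = first value + sum of positive increments between neighbours
        ops, prev = 0, 0
        for x in v:
            ops += max(x - prev, 0)
            prev = x
        return ops

    return min_segment_ops(gains) + min_segment_ops(losses)

ancestor = [2, 2, 2, 2, 2, 2]          # diploid reference
sample   = [3, 3, 1, 1, 2, 2]          # hypothetical tumour copy-number profile
print(simple_event_count(ancestor, sample))   # -> 2 (one segmental gain, one segmental loss)
```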
Quan, Phenix-Lan; Sauzade, Martin
2018-01-01
Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
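The Poisson relationship at the heart of dPCR quantification: with a fraction p of positive partitions, the mean copies per partition is lambda = -ln(1 - p), and the concentration follows from the partition volume. A minimal sketch with an approximate confidence interval; the counts and the nominal droplet volume are illustrative values.

```python
import math

def dpcr_concentration(n_positive, n_total, partition_volume_nl):
    """Absolute target concentration (copies/uL) from digital PCR partition counts."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)                       # mean copies per partition (Poisson)
    vol_ul = partition_volume_nl * 1e-3            # nL -> uL
    conc = lam / vol_ul
    # approximate 95% CI on lambda via the delta method on the binomial proportion p
    se_lam = math.sqrt(p / (n_total * (1.0 - p)))
    lo, hi = lam - 1.96 * se_lam, lam + 1.96 * se_lam
    return conc, (lo / vol_ul, hi / vol_ul)

conc, ci = dpcr_concentration(n_positive=4500, n_total=20000, partition_volume_nl=0.85)
print(f"{conc:.1f} copies/uL (approx. 95% CI {ci[0]:.1f}-{ci[1]:.1f})")
```

No calibration curve enters this calculation, which is why dPCR is described above as calibration-free.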
NASA Astrophysics Data System (ADS)
Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie
2014-12-01
To solve the multicollinearity issue and unequal contribution of vascular parameters for the quantification of angiogenesis, we developed a quantification evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemic model mice. Taking vascular volume as the ground truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junction, connectivity density, segment number and vascular length, which indicated that these were the key vascular parameters for the quantification of angiogenesis. The proposed evaluation method was compared with both ABTs applied directly to the nine vascular parameters and with Pearson correlation, and the results were consistent. In contrast to the ABTs directly using the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all the key vascular parameters were assembled into the sparse PCs with the highest relative importance.
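A rough Python analogue of the pipeline just described: build sparse principal components from the vascular parameters, fit a boosted-tree regressor against vascular volume, and read off the component importances together with their sparse loadings. The data are synthetic, and scikit-learn's SparsePCA and GradientBoostingRegressor stand in for the study's sparse PCA and aggregated boosted trees.

```python
import numpy as np
from sklearn.decomposition import SparsePCA
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# synthetic stand-in: 60 animals x 9 correlated vascular parameters
n = 60
base = rng.normal(size=(n, 3))
X = np.hstack([base + 0.2 * rng.normal(size=(n, 3)) for _ in range(3)])   # 9 collinear columns
vascular_volume = 2.0 * base[:, 0] + 1.0 * base[:, 1] + 0.3 * rng.normal(size=n)

X_std = StandardScaler().fit_transform(X)
spca = SparsePCA(n_components=4, alpha=0.5, random_state=0)
scores = spca.fit_transform(X_std)                  # sparse PCs mitigate multicollinearity

gbt = GradientBoostingRegressor(random_state=0).fit(scores, vascular_volume)
for i, (imp, loading) in enumerate(zip(gbt.feature_importances_, spca.components_)):
    members = np.flatnonzero(np.abs(loading) > 1e-8)
    print(f"sparse PC{i + 1}: importance {imp:.2f}, loads on parameters {members.tolist()}")
```

Parameters that load on high-importance sparse components would be read as the key contributors, mirroring the selection logic in the abstract.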
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and a cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and group similar audit event sequences based on the cluster analysis. It is shown in simulation experiments that our model can improve cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
Methods for detection of GMOs in food and feed.
Marmiroli, Nelson; Maestri, Elena; Gullì, Mariolina; Malcevschi, Alessio; Peano, Clelia; Bordoni, Roberta; De Bellis, Gianluca
2008-10-01
This paper reviews aspects relevant to detection and quantification of genetically modified (GM) material within the feed/food chain. The GM crop regulatory framework at the international level is evaluated with reference to traceability and labelling. Current analytical methods for the detection, identification, and quantification of transgenic DNA in food and feed are reviewed. These methods include quantitative real-time PCR, multiplex PCR, and multiplex real-time PCR. Particular attention is paid to methods able to identify multiple GM events in a single reaction and to the development of microdevices and microsensors, though they have not been fully validated for application.
Developing and Implementing the Data Mining Algorithms in RAVEN
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Ramazan Sonat; Maljovec, Daniel Patrick; Alfonsi, Andrea
The RAVEN code is becoming a comprehensive tool to perform probabilistic risk assessment, uncertainty quantification, and verification and validation. The RAVEN code is being developed to support many programs and to provide a set of methodologies and algorithms for advanced analysis. Scientific computer codes can generate enormous amounts of data. To post-process and analyze such data might, in some cases, take longer than the initial software runtime. Data mining algorithms/methods help in recognizing and understanding patterns in the data, and thus discover knowledge in databases. The methodologies used in dynamic probabilistic risk assessment or in uncertainty and error quantification analysis couple system/physics codes with simulation controller codes, such as RAVEN. RAVEN introduces both deterministic and stochastic elements into the simulation while the system/physics code models the dynamics deterministically. A typical analysis is performed by sampling values of a set of parameters. A major challenge in using dynamic probabilistic risk assessment or uncertainty and error quantification analysis for a complex system is to analyze the large number of scenarios generated. Data mining techniques are typically used to better organize and understand data, i.e. recognizing patterns in the data. This report focuses on the development and implementation of Application Programming Interfaces (APIs) for different data mining algorithms, and the application of these algorithms to different databases.
Gibson, Juliet F; Huang, Jing; Liu, Kristina J; Carlson, Kacie R; Foss, Francine; Choi, Jaehyuk; Edelson, Richard; Hussong, Jerry W.; Mohl, Ramsey; Hill, Sally; Girardi, Sally
2016-01-01
Background: Accurate quantification of malignant cells in the peripheral blood of patients with cutaneous T cell lymphoma (CTCL) is important for early detection, prognosis, and monitoring disease burden. Objective: Determine the spectrum of current clinical practices; critically evaluate elements of the current ISCL B1 and B2 staging criteria; and assess the potential role of TCR-Vβ analysis by flow cytometry. Methods: We assessed current clinical practices by survey, and performed a retrospective analysis of 161 patients evaluated at Yale (2011-2014) to compare the sensitivity, specificity, PPV, and NPV of parameters for ISCL B2 staging. Results: There was heterogeneity in clinical practices among institutions. ISCL B1 criteria did not capture five Yale cohort patients with immunophenotypic abnormalities who later progressed. TCR-Vβ testing was more specific than PCR and aided diagnosis in detecting clonality, but was of limited benefit in quantification of tumor burden. Limitations: Because of limited follow-up involving a single center, further investigation will be necessary to conclude whether our proposed diagnostic algorithm is of general clinical benefit. Conclusion: We propose further study of “modified B1 criteria”: CD4/CD8 ratio ≥5, %CD4+/CD26- ≥ 20%, %CD4+/CD7- ≥ 20%, with evidence of clonality. TCR-Vβ testing should be considered in future diagnostic and staging algorithms. PMID:26874819
NASA Astrophysics Data System (ADS)
Heffernan, Julieanne; Biedermann, Eric; Mayes, Alexander; Livings, Richard; Jauriqui, Leanne; Goodlet, Brent; Aldrin, John C.; Mazdiyasni, Siamack
2018-04-01
Process Compensated Resonant Testing (PCRT) is a full-body nondestructive testing (NDT) method that measures the resonance frequencies of a part and correlates them to the part's material and/or damage state. PCRT testing is used in the automotive, aerospace, and power generation industries via automated PASS/FAIL inspections to distinguish parts with nominal process variation from those with the defect(s) of interest. Traditional PCRT tests are created through the statistical analysis of populations of "good" and "bad" parts. However, gathering a statistically significant number of parts can be costly and time-consuming, and the availability of defective parts may be limited. This work uses virtual databases of good and bad parts to create two targeted PCRT inspections for single crystal (SX) nickel-based superalloy turbine blades. Using finite element (FE) models, populations were modeled to include variations in geometric dimensions, material properties, crystallographic orientation, and creep damage. Model results were verified by comparing the frequency variation in the modeled populations with the measured frequency variations of several physical blade populations. Additionally, creep modeling results were verified through the experimental evaluation of coupon geometries. A virtual database of resonance spectra was created from the model data. The virtual database was used to create PCRT inspections to detect crystallographic defects and creep strain. Quantification of creep strain values using the PCRT inspection results was also demonstrated.
Juránková, Jana; Opsteegh, Marieke; Neumayerová, Helena; Kovařčík, Kamil; Frencová, Anita; Baláž, Vojtěch; Volf, Jiří; Koudela, Břetislav
2013-03-31
Undercooked meat containing tissue cysts is one of the most common sources of Toxoplasma gondii infection in humans. Goats are very susceptible to clinical toxoplasmosis, and especially kids are common food animals, thereby representing a risk for human infection. A sequence-specific magnetic capture method was used for isolation of T. gondii DNA from tissue samples from experimentally infected goat kids, and real-time PCR for the 529 bp repeat element allowed quantification of T. gondii DNA. The contamination level in different types of tissue and in two groups of goats euthanized 30 and 90 dpi was compared. The highest concentration of T. gondii DNA in both groups of goats was found in lung tissue, but only the higher parasite count in lung tissue compared to other organs in group A (euthanized 30 dpi) was statistically significant. T. gondii concentrations were higher in liver and dorsal muscle samples from goats euthanized 90 dpi than in goats euthanized at 30 dpi, while the T. gondii concentration in hearts decreased. This study describes for the first time the distribution of T. gondii parasites in post-weaned goat kids. New information about T. gondii predilection sites in goats and about the progression of infection between 30 and 90 dpi was obtained. Copyright © 2012 Elsevier B.V. All rights reserved.
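For readers unfamiliar with absolute real-time PCR quantification, the sketch below shows the generic standard-curve calculation (Cq versus log10 copy number). The Cq values are invented for illustration and are not data from this study.

import numpy as np

log10_copies = np.array([6, 5, 4, 3, 2], dtype=float)      # standard dilutions
cq_standards = np.array([18.1, 21.5, 24.9, 28.2, 31.6])     # measured Cq values

slope, intercept = np.polyfit(log10_copies, cq_standards, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                      # ~1.0 means 100% efficiency

def copies_from_cq(cq):
    """Invert the calibration line to estimate copies per reaction."""
    return 10 ** ((cq - intercept) / slope)

print(f"slope = {slope:.2f}, amplification efficiency = {efficiency:.0%}")
print(f"unknown with Cq 26.3 ≈ {copies_from_cq(26.3):.2e} copies per reaction")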
2017-01-01
We introduce several new resilience metrics for quantifying the resilience of critical material supply chains to disruptions and validate these metrics using the 2010 rare earth element (REE) crisis as a case study. Our method is a novel application of Event Sequence Analysis, supplemented with interviews of actors across the entire supply chain. We discuss resilience mechanisms in quantitative terms (time lags, response speeds, and maximum magnitudes) and in light of cultural differences between Japanese and European corporate practice. This quantification is crucial if resilience is ever to be taken into account in criticality assessments, and it is a step toward determining supply and demand elasticities in the REE supply chain. We find that the REE system showed resilience mainly through substitution and increased non-Chinese primary production, with a distinct role for stockpiling. Overall, annual substitution rates reached 10% of total demand. Non-Chinese primary production ramped up at a speed of 4% of total market volume per year. The compound effect of these mechanisms was that recovery from the 2010 disruption took two years. The supply disruption did not nudge the system toward an appreciable degree of recycling. This finding has important implications for the circular economy concept, indicating that quite a long period of sustained material constraints will be necessary for a production-consumption system to naturally evolve toward a circular configuration. PMID:28257181
Kamal, Abid; Khan, Washim; Ahmad, Sayeed; Ahmad, F. J.; Saleem, Kishwar
2015-01-01
Objective: The aim of the present study was to develop simple, accurate and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) and high-performance thin-layer chromatography (HPTLC) methods for the quantification of khellin present in the seeds of Ammi visnaga. Materials and Methods: RP-HPLC analysis was performed on a C18 column with methanol:water (75:25, v/v) as the mobile phase. The HPTLC method involved densitometric evaluation of khellin after resolving it on a silica gel plate using ethyl acetate:toluene:formic acid (5.5:4.0:0.5, v/v/v) as the mobile phase. Results: The developed HPLC and HPTLC methods were validated for precision (interday, intraday and intersystem), robustness, accuracy, limit of detection and limit of quantification. The relationship between the concentration of standard solutions and the peak response was linear in both the HPLC and HPTLC methods, over the concentration range of 10–80 μg/mL in HPLC and 25–1,000 ng/spot in HPTLC for khellin. The relative standard deviation values for method precision were 0.63–1.97% in HPLC and 0.62–2.05% in HPTLC for khellin. Accuracy of the methods was checked by recovery studies conducted at three different concentration levels, and the average percentage recovery was found to be 100.53% in HPLC and 100.08% in HPTLC for khellin. Conclusions: The developed HPLC and HPTLC methods for the quantification of khellin were found to be simple, precise, specific, sensitive and accurate, and can be used for routine analysis and quality control of A. visnaga and several formulations containing it as an ingredient. PMID:26681890
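The validation figures above follow from standard calibration statistics. The sketch below illustrates the generic calculation with the ICH conventions LOD = 3.3·σ/S and LOQ = 10·σ/S, using invented peak areas rather than the khellin data.

import numpy as np

conc = np.array([10, 20, 40, 60, 80], dtype=float)          # standards, ug/mL
area = np.array([152, 298, 610, 905, 1198], dtype=float)    # detector response (invented)

slope, intercept = np.polyfit(conc, area, 1)
residuals = area - (slope * conc + intercept)
sigma = residuals.std(ddof=2)          # residual SD of the regression (n - 2 dof)

lod = 3.3 * sigma / slope              # ICH limit of detection
loq = 10.0 * sigma / slope             # ICH limit of quantification
r = np.corrcoef(conc, area)[0, 1]

print(f"slope = {slope:.2f}, r = {r:.4f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")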
Oberbach, Andreas; Schlichting, Nadine; Neuhaus, Jochen; Kullnick, Yvonne; Lehmann, Stefanie; Heinrich, Marco; Dietrich, Arne; Mohr, Friedrich Wilhelm; von Bergen, Martin; Baumann, Sven
2014-12-05
Multiple reaction monitoring (MRM)-based mass spectrometric quantification of peptides and their corresponding proteins has been successfully applied for biomarker validation in serum. The option of multiplexing offers the chance to analyze various proteins in parallel, which is especially important in obesity research. Here, biomarkers that reflect multiple comorbidities and allow monitoring of therapy outcomes are required. Besides being suitable for serum protein quantification, established MRM assays are also feasible for analysis of the tissues secreting the markers of interest. Surprisingly, studies comparing MRM data sets with established methods are rare, and therefore the biological and clinical value of most analytes remains questionable. An MRM method using nano-UPLC-MS/MS for the quantification of obesity-related surrogate markers for several comorbidities in serum, plasma, and visceral and subcutaneous adipose tissue was established. Proteotypic peptides for complement C3, adiponectin, angiotensinogen, and plasma retinol binding protein (RBP4) were quantified using isotope dilution analysis and compared to the standard ELISA method. MRM method variabilities were mainly below 10%. The comparison with other MS-based approaches showed a good correlation. However, large differences in absolute quantification for complement C3 and adiponectin were obtained compared to ELISA, while less marked differences were observed for angiotensinogen and RBP4. The MRM assay was verified in obesity, first to discriminate the lean from the obese phenotype and second to monitor excessive weight loss after gastric bypass surgery in a seven-month follow-up. The presented MRM assay was able to discriminate the obese phenotype from the lean one and to monitor weight-loss-related changes of surrogate markers. However, inclusion of additional biomarkers was necessary to interpret the MRM data on obesity phenotype properly. In summary, the development of disease-related MRMs should include a step of matching the MRM data with clinically approved standard methods and defining reference values in well-sized, representative age-, gender-, and disease-matched cohorts.
Muratovic, Aida Zuberovic; Hagström, Thomas; Rosén, Johan; Granelli, Kristina; Hellenäs, Karl-Erik
2015-09-11
A method that uses mass spectrometry (MS) for identification and quantification of protein toxins, staphylococcal enterotoxins A and B (SEA and SEB), in milk and shrimp is described. The analysis was performed using a tryptic peptide from each of the toxins as the target analyte, together with the corresponding (13)C-labeled synthetic internal standard peptide. The performance of the method was evaluated by analyzing spiked samples in the quantification range 2.5-30 ng/g (R² = 0.92-0.99). The limit of quantification (LOQ) in milk and the limit of detection (LOD) in shrimp were 2.5 ng/g for both the SEA and SEB toxins. The in-house reproducibility (RSD) was 8%-30% and 5%-41% at different concentrations for milk and shrimp, respectively. The method was compared to the ELISA method used at the EU-RL (France) for milk samples spiked with SEA at low levels, in the quantification range of 2.5 to 5 ng/g. The comparison showed good agreement between the two methods: 2.9 (MS)/1.8 (ELISA) and 3.6 (MS)/3.8 (ELISA) ng/g. The major advantage of the developed method is that it allows direct confirmation of the molecular identity and quantitative analysis of SEA and SEB at low nanogram levels using a label- and antibody-free approach. Therefore, this method is an important step in the development of alternatives to the immunoassay tests currently used for staphylococcal enterotoxin analysis.
Collagen Quantification in Tissue Specimens.
Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I
2017-01-01
Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.
A quantitative witness for Greenberger-Horne-Zeilinger entanglement.
Eltschka, Christopher; Siewert, Jens
2012-01-01
Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
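For reference, the two-qubit limit mentioned above is Wootters' concurrence, a standard result (quoted here for context, not the authors' three-qubit witness itself):

\[
C(\rho) = \max\{0,\; \lambda_1 - \lambda_2 - \lambda_3 - \lambda_4\},
\qquad
\tilde{\rho} = (\sigma_y \otimes \sigma_y)\,\rho^{*}\,(\sigma_y \otimes \sigma_y),
\]

where the \(\lambda_i\) are the square roots, in decreasing order, of the eigenvalues of \(\rho\tilde{\rho}\), and \(\rho^{*}\) denotes complex conjugation in the standard basis.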
Deng, Yong; Luo, Zhaoyang; Jiang, Xu; Xie, Wenhao; Luo, Qingming
2015-07-01
We propose a method based on a decoupled fluorescence Monte Carlo model for constructing fluorescence Jacobians to enable accurate quantification of fluorescence targets within turbid media. The effectiveness of the proposed method is validated using two cylindrical phantoms enclosing fluorescent targets within homogeneous and heterogeneous background media. The results demonstrate that our method can recover relative concentrations of the fluorescent targets with higher accuracy than the perturbation fluorescence Monte Carlo method. This suggests that our method is suitable for quantitative fluorescence diffuse optical tomography, especially for in vivo imaging of fluorophore targets for diagnosis of different diseases and abnormalities.
Funderburg, Rebecca; Arevalo, Ricardo; Locmelis, Marek; Adachi, Tomoko
2017-11-01
Laser ablation ICP-MS enables streamlined, high-sensitivity measurements of rare earth element (REE) abundances in geological materials. However, many REE isotope mass stations are plagued by isobaric interferences, particularly from diatomic oxides and argides. In this study, we compare REE abundances quantitated from mass spectra collected with low-resolution (m/Δm = 300 at 5% peak height) and medium-resolution (m/Δm = 2500) mass discrimination. A wide array of geological samples was analyzed, including USGS and NIST glasses ranging from mafic to felsic in composition, with NIST 610 employed as the bracketing calibrating reference material. The medium-resolution REE analyses are shown to be significantly more accurate and precise (at the 95% confidence level) than low-resolution analyses, particularly in samples characterized by low (<μg/g levels) REE abundances. A list of preferred mass stations that are least susceptible to isobaric interferences is reported. These findings impact the reliability of REE abundances derived from LA-ICP-MS methods, particularly those relying on mass analyzers that do not offer tuneable mass-resolution and/or collision cell technologies that can reduce oxide and/or argide formation.
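To put the quoted resolution settings in context, the resolving power of a mass analyzer and the value needed to separate two isobaric species of masses m1 and m2 are (standard definitions, with the numerical example below purely illustrative):

\[
R = \frac{m}{\Delta m}, \qquad R_{\text{needed}} \gtrsim \frac{m}{\lvert m_1 - m_2 \rvert}.
\]

For instance, two interferents near m = 150 u that differ by 0.06 u call for R ≈ 150/0.06 = 2500, i.e. the medium-resolution setting above, whereas R = 300 can only separate species about 150/300 = 0.5 u apart at that mass.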
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krakowiak, Konrad J.; Wilson, William; James, Simon
2015-01-15
A novel approach for the chemo-mechanical characterization of cement-based materials is presented, which combines the classical grid indentation technique with elemental mapping by scanning electron microscopy-energy dispersive X-ray spectrometry (SEM-EDS). It is illustrated through application to an oil-well cement system with siliceous filler. The characteristic X-rays of major elements (silicon, calcium and aluminum) are measured over the indentation region and mapped back on the indentation points. Measured intensities together with indentation hardness and modulus are considered in a clustering analysis within the framework of Finite Mixture Models with Gaussian component density function. The method is able to successfully isolate the calcium-silicate-hydrate gel at the indentation scale from its mixtures with other products of cement hydration and anhydrous phases; thus providing a convenient means to link mechanical response to the calcium-to-silicon ratio quantified independently via X-ray wavelength dispersive spectroscopy. A discussion of uncertainty quantification of the estimated chemo-mechanical properties and phase volume fractions, as well as the effect of chemical observables on phase assessment is also included.
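A minimal sketch, with invented data, of the finite-mixture clustering idea described above: indentation modulus and hardness are combined with elemental X-ray intensities and grouped with a Gaussian mixture model. scikit-learn is assumed to be available; this is not the authors' implementation.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(2)
# Columns: indentation modulus (GPa), hardness (GPa), Ca, Si, Al X-ray counts (all invented)
hydrate = rng.normal([30, 1.0, 800, 600, 50], [4, 0.2, 60, 50, 10], size=(300, 5))
portlandite = rng.normal([40, 1.4, 1500, 80, 30], [5, 0.2, 90, 20, 10], size=(100, 5))
filler = rng.normal([95, 9.0, 40, 1600, 20], [8, 1.0, 15, 80, 8], size=(80, 5))
data = np.vstack([hydrate, portlandite, filler])

X = StandardScaler().fit_transform(data)
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(X)
labels = gmm.predict(X)

for k in range(3):
    frac = np.mean(labels == k)
    print(f"phase {k}: surface fraction ≈ {frac:.2f}, "
          f"mean modulus ≈ {data[labels == k, 0].mean():.0f} GPa")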
Arellano, Cécile; Allal, Ben; Goubaa, Anwar; Roché, Henri; Chatelut, Etienne
2014-11-01
A selective and accurate analytical method is needed to quantify tamoxifen and its phase I metabolites in a prospective clinical protocol, for evaluation of the pharmacokinetic parameters of tamoxifen and its metabolites in adjuvant treatment of breast cancer. The selectivity of the analytical method is a fundamental criterion for allowing the main active (Z)-isomer metabolites to be quantified separately from the (Z)'-isomers. A UPLC-MS/MS method was developed and validated for the quantification of (Z)-tamoxifen, (Z)-endoxifen, (E)-endoxifen, Z'-endoxifen, (Z)'-endoxifen, (Z)-4-hydroxytamoxifen, (Z)-4'-hydroxytamoxifen, N-desmethyl tamoxifen, and tamoxifen-N-oxide. The validation range was set between 0.5 ng/mL and 125 ng/mL for the 4-hydroxytamoxifen and endoxifen isomers, and between 12.5 ng/mL and 300 ng/mL for tamoxifen, N-desmethyl tamoxifen and tamoxifen-N-oxide. The method was applied to patient plasma samples. Copyright © 2014 Elsevier B.V. All rights reserved.
Multiple products monitoring as a robust approach for peptide quantification.
Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee
2009-07-01
Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because the determination of quantifiable product ions using either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous process of ion selection. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarities between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves the sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
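The sketch below illustrates one plausible form of such a score; the weighting, the log-intensity term, and the spectra are invented for illustration and may differ from the published MpM scoring system.

import numpy as np

def spectral_similarity(query, reference):
    """Normalized dot product (cosine similarity) over a shared m/z grid."""
    q = query / (np.linalg.norm(query) + 1e-12)
    r = reference / (np.linalg.norm(reference) + 1e-12)
    return float(np.dot(q, r))

def mpm_score(query, reference, intensity_weight=0.3):
    """Combine absolute product-ion intensity with spectral similarity."""
    total_intensity = query.sum()
    sim = spectral_similarity(query, reference)
    # log-scaled intensity term keeps abundant, well-matched spectra on top
    return intensity_weight * np.log10(total_intensity + 1.0) + (1 - intensity_weight) * sim

reference = np.array([0.0, 120.0, 530.0, 80.0, 900.0, 60.0])   # library spectrum (invented)
query_hit = np.array([5.0, 100.0, 480.0, 70.0, 850.0, 40.0])   # target present
query_bg = np.array([300.0, 20.0, 10.0, 400.0, 15.0, 250.0])   # co-eluting background

print(mpm_score(query_hit, reference), mpm_score(query_bg, reference))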
Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F
2017-11-01
Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate to determine optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.
Jia, Xin; Fontaine, Benjamin M.; Strobel, Fred; Weinert, Emily E.
2014-01-01
A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs) using LC-MS/MS, including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems has been developed. Problems such as matrix effects from complex biological samples have been addressed and the protocol optimized accordingly. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMP levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways. PMID:25513747
Quantification of cardiolipin by liquid chromatography-electrospray ionization mass spectrometry.
Garrett, Teresa A; Kordestani, Reza; Raetz, Christian R H
2007-01-01
Cardiolipin (CL), a tetra-acylated glycerophospholipid composed of two phosphatidyl moieties linked by a bridging glycerol, plays an important role in mitochondrial function in eukaryotic cells. Alterations to the content and acylation state of CL cause mitochondrial dysfunction and may be associated with pathologies such as ischemia, hypothyroidism, aging, and heart failure. The structure of CL is very complex because of microheterogeneity among its four acyl chains. Here we have developed a method for the quantification of CL molecular species by liquid chromatography-electrospray ionization mass spectrometry. We quantify the [M-2H](2-) ion of a CL of a given molecular formula and identify the CLs by their total number of carbons and unsaturations in the acyl chains. This method, developed using mouse macrophage RAW 264.7 tumor cells, is broadly applicable to other cell lines, tissues, bacteria and yeast. Furthermore, this method could be used for the quantification of lyso-CLs and bis-lyso-CLs.
Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico
2017-10-13
The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine protein analysis, which achieved separation of skimmed-milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on the ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum amount of detectable fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.
Le Corre, Mathieu; Carey, Susan
2007-11-01
Since the publication of [Gelman, R., & Gallistel, C. R. (1978). The child's understanding of number. Cambridge, MA: Harvard University Press.] seminal work on the development of verbal counting as a representation of number, the nature of the ontogenetic sources of the verbal counting principles has been intensely debated. The present experiments explore proposals according to which the verbal counting principles are acquired by mapping numerals in the count list onto systems of numerical representation for which there is evidence in infancy, namely, analog magnitudes, parallel individuation, and set-based quantification. By asking 3- and 4-year-olds to estimate the number of elements in sets without counting, we investigate whether the numerals that are assigned cardinal meaning as part of the acquisition process display the signatures of what we call "enriched parallel individuation" (which combines properties of parallel individuation and of set-based quantification) or analog magnitudes. Two experiments demonstrate that while "one" to "four" are mapped onto core representations of small sets prior to the acquisition of the counting principles, numerals beyond "four" are only mapped onto analog magnitudes about six months after the acquisition of the counting principles. Moreover, we show that children's numerical estimates of sets from 1 to 4 elements fail to show the signature of numeral use based on analog magnitudes - namely, scalar variability. We conclude that, while representations of small sets provided by parallel individuation, enriched by the resources of set-based quantification are recruited in the acquisition process to provide the first numerical meanings for "one" to "four", analog magnitudes play no role in this process.
Hynstova, Veronika; Sterbova, Dagmar; Klejdus, Borivoj; Hedbavny, Josef; Huska, Dalibor; Adam, Vojtech
2018-01-30
In this study, 14 commercial products (dietary supplements) containing the alga Chlorella vulgaris and the cyanobacterium Spirulina platensis, originating from China and Japan, were analysed. A UV-vis spectrophotometric method was applied for rapid determination of chlorophylls, carotenoids and pheophytins, the latter as degradation products of chlorophylls. High Performance Thin-Layer Chromatography (HPTLC) was used for effective separation of these compounds, and Atomic Absorption Spectrometry for determination of heavy metals as indicators of environmental pollution. Based on the results obtained from UV-vis spectrophotometric determination of photosynthetic pigments (chlorophylls and carotenoids), it was confirmed that Chlorella vulgaris contains more of all these pigments than the cyanobacterium Spirulina platensis. The compound with the fastest mobility identified in Chlorella vulgaris and Spirulina platensis using the HPTLC method was β-carotene. Spectral analysis and the standard calibration curve method were used for identification and quantification of the separated substances on the Thin-Layer Chromatographic plate. Quantification of copper (Cu²⁺, at 324.7 nm) and zinc (Zn²⁺, at 213.9 nm) was performed using Flame Atomic Absorption Spectrometry with air-acetylene flame atomization; quantification of cadmium (Cd²⁺, at 228.8 nm), nickel (Ni²⁺, at 232.0 nm) and lead (Pb²⁺, at 283.3 nm) by Electrothermal Graphite Furnace Atomic Absorption Spectrometry; and quantification of mercury (Hg²⁺, at 254 nm) by Cold Vapour Atomic Absorption Spectrometry. Copyright © 2017 Elsevier B.V. All rights reserved.
Qian, Yiyun; Zhu, Zhenhua; Duan, Jin-Ao; Guo, Sheng; Shang, Erxin; Tao, Jinhua; Su, Shulan; Guo, Jianming
2017-01-15
A highly sensitive method using ultra-high-pressure liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap-MS) has been developed and validated for the simultaneous identification and quantification of ginkgolic acids and semi-quantification of their metabolites in rat plasma. For the five selected ginkgolic acids, the method showed good linearity (r>0.9991), good intra- and inter-day precision (RSD<15%), and good accuracy (RE from -10.33% to 4.92%). Extraction recoveries, matrix effects and stabilities for rat plasma samples were within the required limits. The validated method was successfully applied to investigate the pharmacokinetics of the five ginkgolic acids in rat plasma after oral administration to three dosage groups (900 mg/kg, 300 mg/kg and 100 mg/kg). Meanwhile, six metabolites of GA (15:1) and GA (17:1) were identified by comparison of MS data with reported values. The results of validation in terms of linear ranges, precision and stability were established for semi-quantification of the metabolites. The curves of relative changes of these metabolites during the metabolic process were constructed by plotting the peak area ratios of the metabolites to salicylic acid (internal standard, IS). Double peaks were observed in all three dose groups. Different types of metabolites and different dosages of each metabolite both resulted in different Tmax values. Copyright © 2016 Elsevier B.V. All rights reserved.
Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc
2014-02-07
We present here a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex-matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by means of hollow fiber liquid phase microextraction (HF-LPME) using 1-octanol as the acceptor phase, which provides an enrichment (preconcentration) factor of 800. On the other hand, we have developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly (13)C1-labeled compounds. To this end, the minimally labeled (13)C1-4-(3,6-dimethyl-3-heptyl)-phenol and (13)C1-t-octylphenol isomers were synthesized; these coelute with the natural compounds and allow compensation of the matrix effect. The quantification was carried out using isotope pattern deconvolution (IPD), which makes it possible to obtain the concentration of both compounds without the need to build any calibration graph, reducing the total analysis time. The combination of both extraction and determination techniques has allowed, for the first time, validation of an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng mL(-1) and recoveries within 97-109%. Due to the low cost and total time consumption of HF-LPME, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.
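Isotope pattern deconvolution generalizes the basic isotope dilution balance to the full measured pattern via least squares. The two-isotope form below (own notation, for illustration only) shows why no calibration graph is required: with n_s mol of spike of known isotopic abundances added, a single measured ratio fixes the analyte amount n_x.

\[
R_m = \frac{n_x a_1^{x} + n_s a_1^{s}}{n_x a_2^{x} + n_s a_2^{s}}
\quad\Longrightarrow\quad
n_x = n_s\,\frac{a_1^{s} - R_m a_2^{s}}{R_m a_2^{x} - a_1^{x}},
\]

where a_1 and a_2 are the abundances of the two monitored isotopologues in the natural (x) and labeled (s) compound, and R_m is the ratio measured in the blend.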
Tutorial examples for uncertainty quantification methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Bord, Sarah
2015-08-01
This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
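A sketch in the spirit of the window heat-transfer example described above (not the actual Sandia tutorial or its software input): simple Monte Carlo forward propagation of parameter uncertainty through Q = U·A·ΔT, with all values and distributions assumed for illustration.

import numpy as np

rng = np.random.default_rng(3)
n = 100_000
U = rng.normal(2.8, 0.3, n)         # overall heat transfer coefficient, W/(m^2 K), assumed uncertain
A = 1.5                             # window area, m^2 (treated as known)
dT = rng.normal(20.0, 2.0, n)       # indoor-outdoor temperature difference, K, assumed uncertain

Q = U * A * dT                      # heat loss through the window, W

print(f"mean Q = {Q.mean():.1f} W, std = {Q.std():.1f} W")
print(f"95% interval = [{np.percentile(Q, 2.5):.1f}, {np.percentile(Q, 97.5):.1f}] W")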
NASA Astrophysics Data System (ADS)
Köhler, Reinhard
2014-12-01
We have long been used to the domination of qualitative methods in modern linguistics. Indeed, qualitative methods have advantages such as ease of use and wide applicability to many types of linguistic phenomena. However, this shall not overshadow the fact that a great part of human language is amenable to quantification. Moreover, qualitative methods may lead to over-simplification by employing the rigid yes/no scale. When variability and vagueness of human language must be taken into account, qualitative methods will prove inadequate and give way to quantitative methods [1, p. 11]. In addition to such advantages as exactness and precision, quantitative concepts and methods make it possible to find laws of human language which are just like those in natural sciences. These laws are fundamental elements of linguistic theories in the spirit of the philosophy of science [2,3]. Theorization effort of this type is what quantitative linguistics [1,4,5] is devoted to. The review of Cong and Liu [6] has provided an informative and insightful survey of linguistic complex networks as a young field of quantitative linguistics, including the basic concepts and measures, the major lines of research with linguistic motivation, and suggestions for future research.
Sensitive Detection Using Microfluidics and Nonlinear Amplification
2011-07-22
Report by Rustem F. Ismagilov (Grant N00014-08-1-0936). Related publication: "Quantification of Nucleic Acids via Simultaneous Chemical Initiation of Recombinase Polymerase Amplification Reactions on SlipChip", 2011, 83, 3533. The approach quantifies nucleic acid concentrations by combining controlled chemical autocatalytic amplification and stochastic confinement of small particles with microfluidic devices.
McAda, D.P.
1996-01-01
The Albuquerque Basin in central New Mexico covers an area of about 3,060 square miles. Ground water from the Santa Fe Group aquifer system of the Albuquerque Basin is the principal source of water for municipal, domestic, commercial, and industrial uses in the Albuquerque area, an area of about 410 square miles. Ground-water withdrawal in the basin has increased from about 97,000 acre-feet in 1970 to about 171,000 acre-feet in 1994. About 92 percent of the 1994 total was withdrawn in the Albuquerque area. Management of ground water in the Albuquerque Basin is related to the surface water in the Rio Grande. Because the aquifer system is hydraulically connected to the Rio Grande and water in the river is fully appropriated, the ability to reliably estimate the effects of ground-water withdrawals on flow in the river is important. This report describes the components of the Rio Grande/Santa Fe Group aquifer system in the Albuquerque area and the data availability and data and interpretation needs relating to those components, and presents a plan of study to quantify the hydrologic relations between the Rio Grande and the Santa Fe Group aquifer system. The information needs related to the components of the river/aquifer system are prioritized. Information that is necessary to improve the understanding or quantification of a component in the river/aquifer system is prioritized as essential. Information that could add additional understanding of the system, but would not be necessary to improve the quantification of the system, is prioritized as useful. The study elements are prioritized in the same manner as the information needs; study elements designed to provide information considered necessary to improve the quantification of the system are prioritized as essential, and those designed to provide information that would add additional understanding of the system, but would not be necessary to improve the quantification of the system, are prioritized as useful.
Prado, Marta; Boix, Ana; von Holst, Christoph
2012-07-01
The development of DNA-based methods for the identification and quantification of fish in food and feed samples is frequently focused on a specific fish species and/or on the detection of mitochondrial DNA of fish origin. However, a quantitative method for the most common fish species used by the food and feed industry is needed for official control purposes, and such a method should rely on the use of a single-copy nuclear DNA target owing to its more stable copy number in different tissues. In this article, we report on the development of a real-time PCR method based on the use of a nuclear gene as a target for the simultaneous detection of fish DNA from different species and on the evaluation of its quantification potential. The method was tested in 22 different fish species, including those most commonly used by the food and feed industry, and in negative control samples, which included 15 animal species and nine feed ingredients. The results show that the method reported here complies with the requirements concerning specificity and with the criteria required for real-time PCR methods with high sensitivity.
Multi-fidelity methods for uncertainty quantification in transport problems
NASA Astrophysics Data System (ADS)
Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.
2016-12-01
We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, a re-scaled Multi Level Monte Carlo (rMLMC) method. The rMLMC method is based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods and discuss the advantages of each approach.
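For context, the standard MLMC identity that underlies such multi-fidelity estimators is shown below (standard form; the specific re-scaling used by rMLMC is not reproduced here). The expectation on the finest level L is written as a telescoping sum, and each correction term is estimated with its own, mostly cheap, set of samples:

\[
\mathbb{E}[P_L] = \mathbb{E}[P_0] + \sum_{\ell=1}^{L} \mathbb{E}[P_\ell - P_{\ell-1}],
\qquad
\hat{Y} = \sum_{\ell=0}^{L} \frac{1}{N_\ell} \sum_{i=1}^{N_\ell} \bigl(P_\ell^{(i)} - P_{\ell-1}^{(i)}\bigr), \quad P_{-1} \equiv 0,
\]

with many samples N_0 on the coarse, low-fidelity level and only a few on the expensive, high-fidelity levels.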
NASA Astrophysics Data System (ADS)
Sánchez, Héctor Jorge; Valentinuzzi, María Cecilia; Grenón, Miram; Abraham, José
2008-12-01
Osteoporosis is a disease characterized by low bone mass and microarchitectural deterioration of bone tissue, leading to bone fragility and an increased susceptibility to fractures; the early stage of decreased bone density is called osteopenia. More than 200 million people are affected, and about 50% of post-menopausal women are expected to develop the disease. Osteoporosis, osteopenia and periodontal disease have several risk factors in common, with hyperthyroidism and smoking habits being the most important ones. There is scarce information in the literature about the association between periodontal disease and osteoporosis and/or osteopenia. Some works suggest that osteoporotic women are susceptible to a higher loss of periodontal insertion, alveolar bone, and teeth. Thirty adult post-menopausal women were studied; some of them were healthy (control group) and the rest of them were undergoing some stage of osteoporosis or osteopenia. All the subjects were healthy non-smokers, had no dental implants, and had a community periodontal index higher than 1 (CPI > 1). Samples of saliva and gingival crevice fluid were extracted with calibrated micro-capillaries and deposited on Si reflectors. Known amounts of Ga were added to the samples in order to act as an internal standard for quantification by the total reflection X-ray fluorescence technique. Experimental concentrations of several elements (P, S, Cl, K, Ca, Cr, Fe, Ni, Cu, and Zn) were determined. The concentration of some elements in saliva showed different behavior compared to gingival crevice fluid. Some critical elements of bone composition, such as Ca and Zn, present very distinguishable behavior. Improvements in the statistics are required for a better assessment of a routine method and to establish some correlation with periodontal disease. TXRF seems to be a promising method to evaluate the evolution of osteoporosis.
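The internal-standard quantification referred to above follows the usual TXRF relation (own notation, quoted for context rather than from this study): for element i,

\[
C_i = \frac{N_i}{N_{\mathrm{Ga}}} \cdot \frac{S_{\mathrm{Ga}}}{S_i} \cdot C_{\mathrm{Ga}},
\]

where N denotes the net peak intensity, S the relative elemental sensitivity, and C_Ga the known concentration of the added gallium internal standard.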
Determination of Metal Elements in Wine Using Laser-Induced Breakdown Spectroscopy (LIBS).
Bocková, Jana; Tian, Ye; Yin, Hualiang; Delepine-Gilon, Nicole; Chen, Yanping; Veis, Pavel; Yu, Jin
2017-08-01
We developed a method for sensitive elemental analysis of wines using laser-induced breakdown spectroscopy (LIBS). In order to overcome the inefficiency of direct ablation of bulk wine (an organic liquid), a thin layer of wine residue was prepared on a metallic target according to an appropriate heating procedure applied to an amount of liquid wine dropped on the target surface. The resulting ensemble was then ablated. Such a sample preparation procedure used a very small volume of 2 mL of wine and took only 30 min without reagent or solvent. The results show the detection of tens of metal and non-metal elements, including majors (Na, Mg, K, Ca), minors, and traces (Li, B, Si, P, Ti, Mn, Fe, Cu, Zn, Rb, Sr, Ba, and Pb), in wines purchased from local supermarkets and from different production places in France. Commercially available wines were then spiked with certified standard solutions of Ti and Fe. Three series of laboratory reference samples were thus prepared using three different wines (a red wine and a white wine from the same production region and a red wine from another production region) with concentrations of Ti and Fe in the range of 1-40 mg/L. Calibration graphs established with the spiked samples allowed extraction of the figures of merit of the method for wine analysis, such as the coefficient of determination (R²) and the limits of detection and quantification (LOD and LOQ). The calibration curves built with the three wines were then compared. We studied the residual matrix effect between these wines in the determination of the concentrations of Ti and Fe.
Cheng, Dongwan; Zheng, Li; Hou, Junjie; Wang, Jifeng; Xue, Peng; Yang, Fuquan; Xu, Tao
2015-01-01
The absolute quantification of target proteins in proteomics involves stable isotope dilution coupled with multiple reaction monitoring mass spectrometry (SID-MRM-MS). The successful preparation of stable isotope-labeled internal standard peptides is an important prerequisite for SID-MRM absolute quantification methods. Dimethyl labeling has been widely used in relative quantitative proteomics; it is fast, simple, reliable, cost-effective, and applicable to any protein sample, making it an ideal candidate method for the preparation of stable isotope-labeled internal standards. MRM mass spectrometry offers high sensitivity, specificity, and throughput, and can quantify multiple proteins simultaneously, including low-abundance proteins in precious samples such as pancreatic islets. In this study, a new method for the absolute quantification of three proteases involved in insulin maturation, namely PC1/3, PC2 and CPE, was developed by coupling a stable isotope dimethyl labeling strategy for internal standard peptide preparation with SID-MRM-MS quantitative technology. This method offers a new and effective approach for deeper understanding of the functional status of pancreatic β cells and of pathogenesis in diabetes.
Quantification of alginate by aggregation induced by calcium ions and fluorescent polycations.
Zheng, Hewen; Korendovych, Ivan V; Luk, Yan-Yeung
2016-01-01
For quantification of polysaccharides, including heparins and alginates, the commonly used carbazole assay involves hydrolysis of the polysaccharide to form a mixture of UV-active dye conjugate products. Here, we describe two efficient detection and quantification methods that make use of the negative charges of the alginate polymer and do not involve degradation of the targeted polysaccharide. The first method utilizes calcium ions to induce formation of hydrogel-like aggregates with alginate polymer; the aggregates can be quantified readily by staining with a crystal violet dye. This method does not require purification of alginate from the culture medium and can measure the large amount of alginate that is produced by a mucoid Pseudomonas aeruginosa culture. The second method employs polycations tethering a fluorescent dye to form suspension aggregates with the alginate polyanion. Encasing the fluorescent dye in the aggregates provides an increased scattering intensity with a sensitivity comparable to that of the conventional carbazole assay. Both approaches provide efficient methods for monitoring alginate production by mucoid P. aeruginosa. Copyright © 2015 Elsevier Inc. All rights reserved.
Lorenz, Dominic; Erasmy, Nicole; Akil, Youssef; Saake, Bodo
2016-04-20
A new method for the chemical characterization of xylans is presented, to overcome the difficulties in quantification of 4-O-methyl-α-D-glucuronic acid (meGlcA). In this regard, the hydrolysis behavior of xylans from beech and birch wood was investigated to obtain the optimum conditions for hydrolysis using sulfuric acid. Due to varying linkage strengths and degradation, no general method for complete hydrolysis can be designed. Therefore, partial hydrolysis was applied, yielding monosaccharides and small meGlcA-containing oligosaccharides. For a new method by HPAEC-UV/VIS, these samples were reductively aminated with 2-aminobenzoic acid. By quantification of monosaccharides and oligosaccharides, as well as comparison with borate-HPAEC and (13)C NMR spectroscopy, we found that the concentrations of meGlcA are significantly underestimated by conventional methods. The detected concentrations are 85.4% (beech) and 76.3% (birch) higher with the new procedure. Furthermore, the quantified concentrations of xylose were 9.3% (beech) and 6.5% (birch) higher when the unhydrolyzed oligosaccharides were considered as well. Copyright © 2015 Elsevier Ltd. All rights reserved.
Srivastava, Nishi; Srivastava, Amit; Srivastava, Sharad; Rawat, Ajay Kumar Singh; Khan, Abdul Rahman
2016-03-01
A rapid, sensitive, selective and robust quantitative densitometric high-performance thin-layer chromatographic method was developed and validated for the separation and quantification of syringic acid (SYA) and kaempferol (KML) in the hydrolyzed extracts of Bergenia ciliata and Bergenia stracheyi. The separation was performed on silica gel 60F254 high-performance thin-layer chromatography plates using toluene:ethyl acetate:formic acid (5:4:1, v/v/v) as the mobile phase. The quantification of SYA and KML was carried out using a densitometric reflection/absorption mode at 290 nm. Dense spots of SYA and KML appeared on the developed plate at retention factor values of 0.61 ± 0.02 and 0.70 ± 0.01, respectively. A precise and accurate quantification was performed using linear regression analysis by plotting the peak area vs. concentration over 100-600 ng/band (correlation coefficient: r = 0.997, regression coefficient: R(2) = 0.996) for SYA and 100-600 ng/band (correlation coefficient: r = 0.995, regression coefficient: R(2) = 0.991) for KML. The developed method was validated in terms of accuracy, recovery and inter- and intraday precision as per International Conference on Harmonisation guidelines. The limits of detection and quantification were determined as 91.63 and 277.67 ng for SYA and 142.26 and 431.09 ng for KML, respectively. The statistical data analysis showed that the method is reproducible and selective for the estimation of SYA and KML in extracts of B. ciliata and B. stracheyi. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Clais, S; Boulet, G; Van Kerckhoven, M; Lanckacker, E; Delputte, P; Maes, L; Cos, P
2015-01-01
The viable plate count (VPC) is considered as the reference method for bacterial enumeration in periodontal microbiology but shows some important limitations for anaerobic bacteria. As anaerobes such as Porphyromonas gingivalis are difficult to culture, VPC becomes time-consuming and less sensitive. Hence, efficient normalization of experimental data to bacterial cell count requires alternative rapid and reliable quantification methods. This study compared the performance of VPC with that of turbidity measurement and real-time PCR (qPCR) in an experimental context using highly concentrated bacterial suspensions. Our TaqMan-based qPCR assay for P. gingivalis 16S rRNA proved to be sensitive and specific. Turbidity measurements offer a fast method to assess P. gingivalis growth, but suffer from high variability and a limited dynamic range. VPC was very time-consuming and less repeatable than qPCR. Our study concludes that qPCR provides the most rapid and precise approach for P. gingivalis quantification. Although our data were gathered in a specific research context, we believe that our conclusions on the inferior performance of VPC and turbidity measurements in comparison to qPCR can be extended to other research and clinical settings and even to other difficult-to-culture micro-organisms. Various clinical and research settings require fast and reliable quantification of bacterial suspensions. The viable plate count method (VPC) is generally seen as 'the gold standard' for bacterial enumeration. However, VPC-based quantification of anaerobes such as Porphyromonas gingivalis is time-consuming due to their stringent growth requirements and shows poor repeatability. Comparison of VPC, turbidity measurement and TaqMan-based qPCR demonstrated that qPCR possesses important advantages regarding speed, accuracy and repeatability. © 2014 The Society for Applied Microbiology.
Ermacora, Alessia; Hrnčiřík, Karel
2014-01-01
Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.
Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie
2013-09-06
Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, the total amounts and relative ratios of target proteins/peptides in the four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.